WorldWideScience

Sample records for analysis manual section

  1. H. W. Laboratory manual: 100 Area section

    Energy Technology Data Exchange (ETDEWEB)

    1950-07-01

    The purpose of this manual is to present a Hazard Breakdown of all jobs normally encountered in the laboratory work of the three sections comprising the Analytic Section, Metallurgy and Control Division of the Technical Department. A Hazard Breakdown is a careful analysis of any job in which the source of possible dangers is clearly indicated for each particular step. The analysis is prepared by individuals who are thoroughly familiar with the specific job or procedure. It is felt that if the hazards herein outlined are recognized by the Laboratory personnel and the suggested safety cautions followed, the chance for injury will be minimized and the worker will become generally more safety conscious. The manual, which is prefaced by the general safety rules applying to all the laboratories, is divided into three main sections, one for each of the three sections into which the Laboratories Division is divided. These sections are as follows: Section 1 -- 200 Area Control; Section 2 -- 100 Area Control; Section 3 -- 300 Area Control, Essential Materials, and Methods Improvement.

  2. Manual for subject analysis

    International Nuclear Information System (INIS)

    2002-01-01

    This document is one in a series of publications known as the ETDE/INIS Joint Reference Series and also constitutes a part of the ETDE Procedures Manual. It presents the rules, guidelines and procedures to be adopted by centers submitting input to the International Nuclear Information System (INIS) or the Energy Technology Data Exchange (ETDE). It is a manual for the subject analysis part of input preparation, meaning the selection, subject classification, abstracting and subject indexing of relevant publications, and is to be used in conjunction with the Thesauruses, Subject Categories documents and the documents providing guidelines for the preparation of abstracts. The concept and structure of the new manual are intended to describe in a logical and efficient sequence all the steps comprising the subject analysis of documents to be reported to INIS or ETDE. The manual includes new chapters on preparatory analysis, subject classification, abstracting and subject indexing, as well as rules, guidelines, procedures, examples and a special chapter on guidelines and examples for subject analysis in particular subject fields. (g.t.; a.n.)

  3. User's manual for BECAS. A cross section analysis tool for anisotropic and inhomogeneous beam sections of arbitrary geometry

    Energy Technology Data Exchange (ETDEWEB)

    Blasques, J.P.

    2012-02-15

    The BEam Cross section Analysis Software - BECAS - is a group of Matlab functions used for the analysis of the stiffness and mass properties of beam cross sections. The report presents BECAS' code and user's guide. (LN)

  4. MXS cross-section preprocessor user's manual

    International Nuclear Information System (INIS)

    Parker, F.; Ishikawa, M.; Luck, L.

    1987-03-01

    The MXS preprocessor has been designed to reduce the execution time of programs using isotopic cross-section data and to both reduce the execution time and improve the accuracy of shielding-factor interpolation in the SIMMER-II accident analysis program. MXS is a dual-purpose preprocessing code to: (1) mix isotopes into materials and (2) fit analytic functions to the self-shielding data. The program uses the isotope microscopic neutron cross-section data from the CCCC standard interface file ISOTXS and the isotope Bondarenko self-shielding data from the CCCC standard interface file BRKOXS to generate cross-section and self-shielding data for materials. The materials may be a mixture of several isotopes. The self-shielding data for the materials may be the actual shielding factors or a set of coefficients for functions representing the background dependence of the shielding factors. A set of additional data is given to describe the functions necessary to interpolate the shielding factors over temperature.
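    The first preprocessing step described above, mixing isotopes into materials, amounts to combining isotopic number densities with microscopic cross sections. A minimal sketch of that relationship follows (Python, with entirely hypothetical isotope names, number densities and one-group values; the ISOTXS/BRKOXS file handling of MXS is not shown).

```python
# Illustrative sketch only: mixing isotopic microscopic cross sections (barns)
# into a material macroscopic cross section (1/cm), in the spirit of the
# "mix isotopes into materials" step described above. The isotope names,
# number densities and one-group values are hypothetical, not ISOTXS/BRKOXS data.
BARN_TO_CM2 = 1.0e-24

material = {
    "U238": {"N": 2.2e22, "sigma_capture": 2.7},     # atoms/cm^3, barns
    "O16":  {"N": 4.4e22, "sigma_capture": 1.9e-4},
}

def macroscopic_xs(material, reaction="sigma_capture"):
    """Sum N_i * sigma_i over all isotopes to get Sigma (1/cm)."""
    return sum(iso["N"] * iso[reaction] * BARN_TO_CM2
               for iso in material.values())

print(f"Sigma_capture = {macroscopic_xs(material):.4e} 1/cm")
```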

  5. Department of the Army Cost Analysis Manual

    National Research Council Canada - National Science Library

    1997-01-01

    .... The specific goal of this manual is to help the cost analyst serve the customer. This is done by providing reference material on cost analysis processes, methods, techniques, structures, and definitions...

  6. Microscopic Analysis of Activated Sludge. Training Manual.

    Science.gov (United States)

    Office of Water Program Operations (EPA), Cincinnati, OH. National Training and Operational Technology Center.

    This training manual presents material on the use of a compound microscope to analyze microscopic communities present in wastewater treatment processes for operational control. Course topics include: sampling techniques, sample handling, laboratory analysis, identification of organisms, data interpretation, and use of the compound microscope.…

  7. Manual for wave generation and analysis

    DEFF Research Database (Denmark)

    Jakobsen, Morten Møller

    This Manual is for the included wave generation and analysis software and graphical user interface. The package is made for Matlab and is meant for educational purposes. The code is free to use under the GNU Public License (GPL). It is still in development and should be considered as such. If you...

  8. Radiation protection technician job task analysis manual

    International Nuclear Information System (INIS)

    1990-03-01

    This manual was developed to assist all DOE contractors in the design and conduct of job task analysis (JTA) for the radiation protection technician. Experience throughout the nuclear industry and the DOE system has indicated that the quality and efficiency in conducting a JTA at most sites is greatly enhanced by using a generic task list for the position, and clearly written guidelines on the JTA process. This manual is designed to provide this information for personnel to use in developing and conducting site-specific JTAs. (VC)

  9. Database Changes (Post-Publication). ERIC Processing Manual, Section X.

    Science.gov (United States)

    Brandhorst, Ted, Ed.

    The purpose of this section is to specify the procedure for making changes to the ERIC database after the data involved have been announced in the abstract journals RIE or CIJE. As a matter of general ERIC policy, a document or journal article is not re-announced or re-entered into the database as a new accession for the purpose of accomplishing a…

  10. Manual of program operation for data analysis from radiometer system

    International Nuclear Information System (INIS)

    Silva Mello, L.A.R. da; Migliora, C.G.S.

    1987-12-01

    This manual describes how to use the software to retrieve and analyse data from radiometer systems and raingauges used in the 12 GHz PROPAGATION MEASUREMENTS/CANADA - TELEBRAS COOPERATION PROGRAM. The data retrieval and analysis are being carried out by CETUC, as part of the activities of the project Simulacao de Enlaces Satelite (SES). The software for these tasks has been supplied by the Canadian Research Centre (CRC), together with the measurement equipment. The two following sections describe the use of the data retrieval routines and the data analysis routines of program ATTEN. Also, a quick reference guide for commands that can be used when a microcomputer is locally or remotely connected to a radiometer indoor unit is included as a last section. A more detailed description of these commands, their objectives and the cautions that should be taken when using them can be found in the manual ''12 GHz Propagation Measurements System - Volume 1 - Dual Slope Radiometer and Data Aquisition System'', supplied by Diversitel Communications Inc. (author) [pt

  11. Cost Analysis Sources and Documents Data Base Reference Manual (Update)

    Science.gov (United States)

    1989-06-01

    M: Reference Manual. PRICE H: Training Course Workbook. 11. Use in Cost Analysis. Important source of cost estimates for electronic and mechanical... Nature of Data. Contains many microeconomic time series by month or quarter. 5. Level of Detail. Very detailed. 6. Normalization Processes Required... Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986

  12. Electronic manual of the nuclear characteristics analysis code-set for FBR

    International Nuclear Information System (INIS)

    Makino, Tohru

    2001-03-01

    The Reactor Physics Group, System Engineering Technology Division, O-arai Engineering Center has consolidated the nuclear design database to improve analytical methods and prediction accuracy for large fast breeder cores, such as demonstration or commercial FBRs, building on previous research. Up-to-date information on the usage of the nuclear characteristics analysis code-set was compiled as part of the improvement of the basic design database for FBR cores. The outline of the electronic manual is as follows: (1) The electronic manual includes explanations of the following codes: JOINT (code interface program), SLAROM and CASUP (effective cross section calculation), CITATION-FBR (diffusion analysis), PERKY (perturbative diffusion analysis), SNPERT and SNPERT-3D (perturbative transport analysis), SAGEP and SAGEP-3D (sensitivity coefficient calculation), NSHEX (transport analysis using the nodal method), ABLE (cross section adjustment calculation), and ACCEPT (prediction accuracy evaluation). (2) The electronic manual is written in HTML and PDF formats for easy maintenance, updating and reference through the JNC intranet; users can view the manual pages with a standard web browser without any special setup. (3) Many of the manual pages include links to related pages, and string search is available in both the HTML and PDF documents. (4) Users can download source code, sample input data and shell scripts to carry out each analysis from the download page of each code (JNC internal use only). (5) The usage of the electronic manual and the maintenance/updating process are described in this report, which makes it possible to enrol new codes or new information in the electronic manual. Since modifications and error fixes added to each code after the last consolidation in 1994 have been taken into account, the electronic manual covers the most recent status of the nuclear characteristics analysis code-set. One of other advantages of use

  13. Department of the Army Cost Analysis Manual

    Science.gov (United States)

    2001-05-01

    SECTION I - AUTOMATED COST ESTIMATING INTEGRATED TOOLS (ACEIT)... SECTION II - AUTOMATED... (Management & Comptroller) endorsed the Automated Cost Estimating Integrated Tools (ACEIT) model and, since it is widely used to prepare POEs, CCAs and... CRB IPT (in ACEIT) will be the basis for information contained in the CAB. Any remaining unresolved issues from the IPT process will be raised at the

  14. Some connections for manuals of empirical logic to functional analysis

    International Nuclear Information System (INIS)

    Cook, T.A.

    1981-01-01

    In this informal presentation, the theory of manuals of operations is connected with some familiar concepts in functional analysis; namely, base normed and order unit normed spaces. The purpose of this discussion is to present several general open problems which display the interplay of empirical logic with functional analysis. These are mathematical problems with direct physical interpretation. (orig./HSI)

  15. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
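    As a rough illustration of the "detection sensitivity to the loss of certain sensors or tests" idea, the hedged sketch below uses a toy directed graph in which fault effects propagate to test nodes; it is not the TEAMS/ETA data model, and all node names are hypothetical.

```python
# Minimal sketch (not the TEAMS/ETA data model): a qualitative directed graph
# in which fault effects propagate along edges and are observable wherever a
# test/sensor node is reached. We measure how fault-detection coverage drops
# when a given test is removed. Graph, fault and test names are hypothetical.
from collections import deque

edges = {                      # adjacency list: effect propagation
    "fault_pump":  ["line_A"],
    "fault_valve": ["line_B"],
    "line_A":      ["test_pressure"],
    "line_B":      ["line_A", "test_flow"],
}
faults = ["fault_pump", "fault_valve"]
tests = {"test_pressure", "test_flow"}

def reachable(start):
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in edges.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def coverage(active_tests):
    detected = [f for f in faults if reachable(f) & active_tests]
    return len(detected) / len(faults)

print("baseline coverage:", coverage(tests))
for t in sorted(tests):
    print(f"coverage without {t}:", coverage(tests - {t}))
```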

  16. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    International Nuclear Information System (INIS)

    Suermann, J.F.

    1996-04-01

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits

  17. ECNJEF1. A JEF1 based 219-group neutron cross-section library: User's manual

    International Nuclear Information System (INIS)

    Stad, R.C.L. van der; Gruppelaar, H.

    1992-07-01

    This manual describes the contents of the ECNJEF1 library. The ECNJEF1 library is a JEF1.1 based 219-group AMPX-Master library for reactor calculations with the AMPX/SCALE-system, e.g. the PASC-3 system as implemented at the Netherlands Energy Research Foundation in Petten, Netherlands. The group cross-section data were generated with NJOY and NPTXS/XLACS-2 from the AMPX system. The data on the ECNJEF1 library allows resolved-resonance treatment by NITAWL and/or unresolved resonance self-shielding by BONAMI. These codes are based upon the Nordheim and Bondarenko methods, respectively. (author). 10 refs., 7 tabs

  18. HORECA. Hoger onderwijs reactor elementary core analysis system. User's manual

    International Nuclear Information System (INIS)

    Battum, E. van; Serov, I.V.

    1993-07-01

    HORECA was developed at IRI Delft for quick analysis of power distribution, burnup and safety for the HOR. It can be used for the manual search for a better loading of the reactor. HORECA is based on the Penn State Fuel Management Package and uses the MCRAC code included in this package as a calculation engine. (orig./HP)

  19. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, i.e. the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by stimulus-response methods contains all the methods for data processing for radiotracer experiments.
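    For orientation, the basic quantities that any RTD analysis of an impulse-response tracer curve produces are the normalized E(t) curve and its first moments. A minimal sketch follows; the time grid and detector response are made up, and this is not the software package described above.

```python
# Illustrative only: the basic RTD quantities computed from a measured tracer
# response C(t) -- the normalized E(t) curve, the mean residence time and the
# variance. The time grid and response values are hypothetical.
import numpy as np

t = np.linspace(0.0, 60.0, 121)                 # s
c = t * np.exp(-t / 8.0)                        # hypothetical detector response

area = np.trapz(c, t)
E = c / area                                    # E(t): normalized RTD
mean_residence_time = np.trapz(t * E, t)        # first moment of E(t)
variance = np.trapz((t - mean_residence_time) ** 2 * E, t)

print(f"mean residence time = {mean_residence_time:.2f} s, variance = {variance:.2f} s^2")
```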

  20. User's manual for the Graphical Constituent Loading Analysis System (GCLAS)

    Science.gov (United States)

    Koltun, G.F.; Eberle, Michael; Gray, J.R.; Glysson, G.D.

    2006-01-01

    This manual describes the Graphical Constituent Loading Analysis System (GCLAS), an interactive cross-platform program for computing the mass (load) and average concentration of a constituent that is transported in stream water over a period of time. GCLAS computes loads as a function of an equal-interval streamflow time series and an equal- or unequal-interval time series of constituent concentrations. The constituent-concentration time series may be composed of measured concentrations or a combination of measured and estimated concentrations. GCLAS is not intended for use in situations where concentration data (or an appropriate surrogate) are collected infrequently or where an appreciable number of the concentration values are censored. It is assumed that the constituent-concentration time series used by GCLAS adequately represents the true time-varying concentration. Commonly, measured constituent concentrations are collected at a frequency that is less than ideal (from a load-computation standpoint), so estimated concentrations must be inserted in the time series to better approximate the expected chemograph. GCLAS provides tools to facilitate estimation and entry of instantaneous concentrations for that purpose. Water-quality samples collected for load computation frequently are collected in a single vertical or at a single point in a stream cross section. Several factors, some of which may vary as a function of time and (or) streamflow, can affect whether the sample concentrations are representative of the mean concentration in the cross section. GCLAS provides tools to aid the analyst in assessing whether concentrations in samples collected in a single vertical or at a single point in a stream cross section exhibit systematic bias with respect to the mean concentrations. In cases where bias is evident, the analyst can construct coefficient relations in GCLAS to reduce or eliminate the observed bias. GCLAS can export load and concentration data in formats
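    The underlying load relationship such a program evaluates can be stated simply: the mass load over a period is the time integral of streamflow times concentration, with a unit-conversion factor. The hedged sketch below illustrates that relationship on invented equal-interval data; it is not the GCLAS algorithm itself.

```python
# Hedged sketch of the underlying load relationship (not the GCLAS algorithm):
# mass load over a period is L = k * sum(Q_i * C_i * dt), with k a
# unit-conversion factor. The series below are hypothetical equal-interval data.
import numpy as np

dt_seconds = 900.0                                       # 15-minute interval
q_cfs = np.array([120.0, 150.0, 200.0, 180.0, 160.0])    # streamflow, ft^3/s
c_mgL = np.array([35.0, 42.0, 55.0, 50.0, 45.0])         # concentration, mg/L

K = 28.316846592e-6      # (L per ft^3) * (kg per mg): converts ft^3*mg/L to kg
load_kg = np.sum(q_cfs * c_mgL * dt_seconds) * K
mean_conc = np.sum(q_cfs * c_mgL) / np.sum(q_cfs)        # flow-weighted mean, mg/L

print(f"load = {load_kg:.1f} kg, flow-weighted mean concentration = {mean_conc:.1f} mg/L")
```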

  1. Model for Analysis of Energy Demand (MAED-2). User's manual

    International Nuclear Information System (INIS)

    2007-01-01

    The IAEA has been supporting its Member States in the area of energy planning for sustainable development. Development and dissemination of appropriate methodologies and their computer codes are important parts of this support. This manual has been produced to facilitate the use of the MAED model: Model for Analysis of Energy Demand. The methodology of the MAED model was originally developed by B. Chateau and B. Lapillonne of the Institut Economique et Juridique de l'Energie (IEJE) of the University of Grenoble, France, and was presented as the MEDEE model. Since then, the MEDEE model has been further developed and adapted for modelling various energy demand systems. The IAEA adopted the MEDEE-2 model and incorporated important modifications to make it more suitable for application in developing countries; the result was named the MAED model. The first version of the MAED model was designed for DOS and was later converted to run under Windows. This manual presents the latest version of the MAED model. The most prominent feature of this version is its flexibility in representing the structure of energy consumption. The model now allows country-specific representations of energy consumption patterns using the MAED methodology. The user can now disaggregate energy consumption according to the needs and/or data availability in her/his country. As such, MAED has now become a powerful tool for modelling widely diverse energy consumption patterns. This manual presents the model in detail and provides guidelines for its application.

  2. Model for Analysis of Energy Demand (MAED-2). User's manual

    International Nuclear Information System (INIS)

    2006-01-01

    The IAEA has been supporting its Member States in the area of energy planning for sustainable development. Development and dissemination of appropriate methodologies and their computer codes are important parts of this support. This manual has been produced to facilitate the use of the MAED model: Model for Analysis of Energy Demand. The methodology of the MAED model was originally developed by B. Chateau and B. Lapillonne of the Institut Economique et Juridique de l'Energie (IEJE) of the University of Grenoble, France, and was presented as the MEDEE model. Since then, the MEDEE model has been further developed and adapted for modelling various energy demand systems. The IAEA adopted the MEDEE-2 model and incorporated important modifications to make it more suitable for application in developing countries; the result was named the MAED model. The first version of the MAED model was designed for DOS and was later converted to run under Windows. This manual presents the latest version of the MAED model. The most prominent feature of this version is its flexibility in representing the structure of energy consumption. The model now allows country-specific representations of energy consumption patterns using the MAED methodology. The user can now disaggregate energy consumption according to the needs and/or data availability in her/his country. As such, MAED has now become a powerful tool for modelling widely diverse energy consumption patterns. This manual presents the model in detail and provides guidelines for its application.

  3. Status of reliability in determining SDDR for manual maintenance activities in ITER: Quality assessment of relevant activation cross sections involved

    International Nuclear Information System (INIS)

    Garcia, R.; Garcia, M.; Pampin, R.; Sanz, J.

    2016-01-01

    Highlights: • Feasibility of manual maintenance activities in ITER port cell and port interspace. • Activation of relevant materials and components placed in the current ITER model. • Dominant radionuclides and pathways for shutdown dose rate in ITER. • Quality analysis of typically used EAF and TENDL activation libraries is performed. • EAF performance found to be trustworthy, with slight improvements recommended. - Abstract: This paper assesses the quality of the EAF-2007 and 2010 activation cross sections for relevant reactions in the determination of the Shutdown Dose Rate (SDDR) in the Port Cell (PC) and Port Interspace (PI) areas of ITER. For each relevant ITER material, the dominant radionuclides responsible for the SDDR and their production pathways are listed. This information comes from a review of recent reports and papers on the SDDR in ITER and from our own calculations. A total of 26 relevant pathways are found. The quality of the cross sections involved in these pathways is assessed following the EAF validation procedure, and for those found not to be validated, the latest TENDL library versions have been investigated to check for possible improvements compared to EAF. The use of the EAF libraries is found to be trustworthy and is recommended for the prediction of the SDDR in the ITER PC and PI. However, three cross sections are identified for further improvement: Co59(n,2n)Co58, Cu63(n,g)Cu64 and Cr50(n,g)Cr51.

  4. Status of reliability in determining SDDR for manual maintenance activities in ITER: Quality assessment of relevant activation cross sections involved

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, R., E-mail: rgarciam@ind.uned.es [UNED, Power Engineering Department, C/Juan del Rosal 12, 28040 Madrid (Spain); Garcia, M. [UNED, Power Engineering Department, C/Juan del Rosal 12, 28040 Madrid (Spain); Pampin, R. [F4E, Torres Diagonal Litoral B3, Barcelona (Spain); Sanz, J. [UNED, Power Engineering Department, C/Juan del Rosal 12, 28040 Madrid (Spain)

    2016-11-15

    Highlights: • Feasibility of manual maintenance activities in ITER port cell and port interspace. • Activation of relevant materials and components placed in the current ITER model. • Dominant radionuclides and pathways for shutdown dose rate in ITER. • Quality analysis of typically used EAF and TENDL activation libraries is performed. • EAF performance found to be trustworthy, with slight improvements recommended. - Abstract: This paper assesses the quality of the EAF-2007 and 2010 activation cross sections for relevant reactions in the determination of the Shutdown Dose Rate (SDDR) in the Port Cell (PC) and Port Interspace (PI) areas of ITER. For each relevant ITER material, the dominant radionuclides responsible for the SDDR and their production pathways are listed. This information comes from a review of recent reports and papers on the SDDR in ITER and from our own calculations. A total of 26 relevant pathways are found. The quality of the cross sections involved in these pathways is assessed following the EAF validation procedure, and for those found not to be validated, the latest TENDL library versions have been investigated to check for possible improvements compared to EAF. The use of the EAF libraries is found to be trustworthy and is recommended for the prediction of the SDDR in the ITER PC and PI. However, three cross sections are identified for further improvement: Co59(n,2n)Co58, Cu63(n,g)Cu64 and Cr50(n,g)Cr51.

  5. Manual vs. computer-assisted sperm analysis: can CASA replace manual assessment of human semen in clinical practice?

    Science.gov (United States)

    Talarczyk-Desole, Joanna; Berger, Anna; Taszarek-Hauke, Grażyna; Hauke, Jan; Pawelczyk, Leszek; Jedrzejczak, Piotr

    2017-01-01

    The aim of the study was to check the quality of a computer-assisted sperm analysis (CASA) system in comparison to the reference manual method, as well as the standardization of computer-assisted semen assessment. The study was conducted between January and June 2015 at the Andrology Laboratory of the Division of Infertility and Reproductive Endocrinology, Poznań University of Medical Sciences, Poland. The study group consisted of 230 men who gave sperm samples for the first time in our center as part of an infertility investigation. The samples underwent manual and computer-assisted assessment of concentration, motility and morphology. A total of 184 samples were examined twice: manually, according to the 2010 WHO recommendations, and with CASA, using the program settings provided by the manufacturer. Additionally, 46 samples underwent two manual analyses and two computer-assisted analyses. The p-value of p CASA and manually. In the group of patients where all analyses with each method were performed twice on the same sample, we found no significant differences between both assessments of the same probe, neither in the samples analyzed manually nor with CASA, although the standard deviation was higher in the CASA group. Our results suggest that computer-assisted sperm analysis requires further improvement for a wider application in clinical practice.

  6. User's manual for the Heat Pipe Space Radiator design and analysis Code (HEPSPARC)

    Science.gov (United States)

    Hainley, Donald C.

    1991-01-01

    A heat pipe space radiator code (HEPSPARC) was written for the NASA Lewis Research Center and is used for the design and analysis of a radiator that is constructed from a pumped fluid loop that transfers heat to the evaporative section of heat pipes. This manual is designed to familiarize the user with this new code and to serve as a reference for its use. This manual documents the completed work and is intended to be the first step towards verification of the HEPSPARC code. Details are furnished to provide a description of all the requirements and variables used in the design and analysis of a combined pumped loop/heat pipe radiator system. A description of the subroutines used in the program is furnished for those interested in understanding its detailed workings.

  7. User's manual of JT-60 experimental data analysis system

    International Nuclear Information System (INIS)

    Hirayama, Takashi; Morishima, Soichi; Yoshioka, Yuji

    2010-02-01

    At the Japan Atomic Energy Agency Naka Fusion Institute, many experiments have been conducted using the large tokamak device JT-60, aiming at the realization of a fusion power plant. In order to optimize the JT-60 experiments and to investigate the complex characteristics of plasma, the JT-60 experimental data analysis system was developed and is used for collecting, referring to and analyzing the JT-60 experimental data. The main components of the system are a data analysis server and a database server, for the analysis and accumulation of the experimental data, respectively. Other peripheral devices of the system are magnetic disk units, an NAS (Network Attached Storage) device, and a backup tape drive. This is a user's manual of the JT-60 experimental data analysis system. (author)

  8. FDA (Food and Drug Administration) Compliance Program Guidance Manual (FY 88). Section 4. Medical and radiological devices

    International Nuclear Information System (INIS)

    1988-01-01

    The FDA Compliance Program Guidance Manual provides a system for issuing and filing program plans and instructions directed to Food and Drug Administration Field operations for project implementation. Section IV provides those chapters of the Compliance Program Guidance Manual which pertain to the areas of medical and radiological devices. Some of the areas of coverage include laser and sunlamp standards inspections, compliance testing of various radiation-emitting products such as television receivers and microwave ovens, emergency response planning and policy, premarket approval and device manufacturers inspections, device problem reporting, sterilization of devices, and consumer education programs on medical and radiological devices

  9. Presenteeism, stress resilience, and physical activity in older manual workers: a person-centred analysis.

    Science.gov (United States)

    Thogersen-Ntoumani, Cecilie; Black, Julie; Lindwall, Magnus; Whittaker, Anna; Balanos, George M

    2017-12-01

    This study used a person-centred approach to explore typologies of older manual workers based on presenteeism, stress resilience, and physical activity. Older manual workers (n = 217; 69.1% male; age range 50-77; M age = 57.11 years; SD = 5.62) from a range of UK-based organisations, representing different manual job roles, took part in the study. A cross-sectional survey design was used. Based on the three input variables (presenteeism, stress resilience and physical activity), four distinct profiles were identified using Latent Profile Analysis. One group ('High sport/exercise and well-functioning'; 5.50%) engaged in high levels of sport/exercise and exhibited low levels of stress resilience and all types of presenteeism. Another profile ('Physically burdened'; 9.70%) reported high levels of work and leisure-time physical activity, low stress resilience, as well as high levels of presenteeism due to physical and time demands. A 'Moderately active and functioning' group (46.50%) exhibited moderate levels on all variables. Finally, the fourth profile ('Moderately active with high presenteeism'; 38.20%) reported engaging in moderate levels of physical activity and had relatively high levels of stress resilience, yet also high levels of presenteeism. The profiles differed on work affect and health perceptions, largely in the expected directions. There were no differences between the profiles in socio-demographics. These results highlight complex within-person interactions between presenteeism, stress resilience, and physical activity in older manual workers. The identification of profiles of older manual workers who are at risk of poor health and functioning may inform targeted interventions to help retain them in the workforce for longer.
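    As an illustration of how such profiles can be recovered, the sketch below fits a Gaussian mixture model (a common stand-in for Latent Profile Analysis) to simulated, standardized indicator data and selects the number of profiles by BIC; it does not reproduce the study's analysis or data.

```python
# Illustrative stand-in only: the study used Latent Profile Analysis; a Gaussian
# mixture on standardized indicators is a closely related way to recover such
# profiles. The data here are simulated, not the study's.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# columns: presenteeism, stress resilience, physical activity (arbitrary units)
X = np.vstack([
    rng.normal([8, 2, 9], 1.0, size=(20, 3)),    # a "physically burdened"-like cluster
    rng.normal([4, 5, 5], 1.0, size=(80, 3)),    # a "moderately active"-like cluster
])

Xz = StandardScaler().fit_transform(X)
best = min(
    (GaussianMixture(n_components=k, n_init=5, random_state=0).fit(Xz) for k in range(1, 6)),
    key=lambda m: m.bic(Xz),          # pick the profile count by BIC
)
print("chosen number of profiles:", best.n_components)
print("profile means (z-scores):\n", best.means_.round(2))
```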

  10. The Radiological Safety Analysis Computer Program (RSAC-5) user's manual

    International Nuclear Information System (INIS)

    Wenzel, D.R.

    1994-02-01

    The Radiological Safety Analysis Computer Program (RSAC-5) calculates the consequences of the release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory from either reactor operating history or nuclear criticalities. RSAC-5 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated through the inhalation, immersion, ground surface, and ingestion pathways. RSAC+, a menu-driven companion program to RSAC-5, assists users in creating and running RSAC-5 input files. This user's manual contains the mathematical models and operating instructions for RSAC-5 and RSAC+. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-5 and RSAC+. These programs are designed for users who are familiar with radiological dose assessment methods
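    The decay-and-ingrowth part of such a calculation reduces, for a single parent-daughter pair, to the two-member Bateman solution. A minimal sketch is shown below; it is not the RSAC-5 implementation, and the half-lives and source activity are hypothetical.

```python
# Minimal decay/ingrowth sketch (not the RSAC-5 implementation): activity of a
# parent-daughter pair after a transport delay, using the two-member Bateman
# solution. Half-lives and the initial parent activity are hypothetical.
import math

def bateman_pair(A1_0, t_half_parent, t_half_daughter, t):
    """Return (parent, daughter) activities at time t for a pure parent source."""
    l1 = math.log(2) / t_half_parent
    l2 = math.log(2) / t_half_daughter
    A1 = A1_0 * math.exp(-l1 * t)
    A2 = A1_0 * l2 / (l2 - l1) * (math.exp(-l1 * t) - math.exp(-l2 * t))
    return A1, A2

parent, daughter = bateman_pair(A1_0=1.0e6, t_half_parent=3600.0,
                                t_half_daughter=600.0, t=1800.0)
print(f"parent = {parent:.3e} Bq, daughter = {daughter:.3e} Bq")
```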

  11. Dairy Analytics and Nutrient Analysis (DANA) Prototype System User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Sam Alessi; Dennis Keiser

    2012-10-01

    This document is a user manual for the Dairy Analytics and Nutrient Analysis (DANA) model. DANA provides an analysis of dairy anaerobic digestion technology and allows users to calculate biogas production, co-product valuation, capital costs, expenses, revenue and financial metrics, for user customizable scenarios, dairy and digester types. The model provides results for three anaerobic digester types; Covered Lagoons, Modified Plug Flow, and Complete Mix, and three main energy production technologies; electricity generation, renewable natural gas generation, and compressed natural gas generation. Additional options include different dairy types, bedding types, backend treatment type as well as numerous production, and economic parameters. DANA’s goal is to extend the National Market Value of Anaerobic Digester Products analysis (informa economics, 2012; Innovation Center, 2011) to include a greater and more flexible set of regional digester scenarios and to provide a modular framework for creation of a tool to support farmer and investor needs. Users can set up scenarios from combinations of existing parameters or add new parameters, run the model and view a variety of reports, charts and tables that are automatically produced and delivered over the web interface. DANA is based in the INL’s analysis architecture entitled Generalized Environment for Modeling Systems (GEMS) , which offers extensive collaboration, analysis, and integration opportunities and greatly speeds the ability construct highly scalable web delivered user-oriented decision tools. DANA’s approach uses server-based data processing and web-based user interfaces, rather a client-based spreadsheet approach. This offers a number of benefits over the client-based approach. Server processing and storage can scale up to handle a very large number of scenarios, so that analysis of county, even field level, across the whole U.S., can be performed. Server based databases allow dairy and digester

  12. A prescribed wake rotor inflow and flow field prediction analysis, user's manual and technical approach

    Science.gov (United States)

    Egolf, T. A.; Landgrebe, A. J.

    1982-01-01

    A user's manual is provided which includes the technical approach for the Prescribed Wake Rotor Inflow and Flow Field Prediction Analysis. The analysis is used to provide the rotor wake induced velocities at the rotor blades for use in blade airloads and response analyses and to provide induced velocities at arbitrary field points such as at a tail surface. This analysis calculates the distribution of rotor wake induced velocities based on a prescribed wake model. Section operating conditions are prescribed from blade motion and controls determined by a separate blade response analysis. The analysis represents each blade by a segmented lifting line, and the rotor wake by discrete segmented trailing vortex filaments. Blade loading and circulation distributions are calculated based on blade element strip theory including the local induced velocity predicted by the numerical integration of the Biot-Savart Law applied to the vortex wake model.
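    The numerical Biot-Savart step mentioned above can be illustrated for a single straight vortex segment using the standard closed-form expression for its induced velocity. The sketch below assumes hypothetical segment end points, field point and circulation; a full wake model sums this contribution over all filaments.

```python
# Hedged sketch of the Biot-Savart building block: induced velocity at a field
# point due to one straight vortex segment of circulation Gamma (Katz & Plotkin
# form). All geometry and circulation values are hypothetical.
import numpy as np

def segment_induced_velocity(p, a, b, gamma, core=1e-6):
    """Velocity induced at point p by a straight vortex segment from a to b."""
    r1, r2 = p - a, p - b
    cross = np.cross(r1, r2)
    denom = np.dot(cross, cross)
    if denom < core:                      # point (nearly) on the segment axis
        return np.zeros(3)
    r0 = b - a
    k = np.dot(r0, r1 / np.linalg.norm(r1) - r2 / np.linalg.norm(r2))
    return gamma / (4.0 * np.pi) * cross / denom * k

v = segment_induced_velocity(p=np.array([0.0, 1.0, 0.0]),
                             a=np.array([-1.0, 0.0, 0.0]),
                             b=np.array([1.0, 0.0, 0.0]),
                             gamma=1.0)
print("induced velocity:", v)
```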

  13. Process based analysis of manually controlled drilling processes for bone

    Science.gov (United States)

    Teicher, Uwe; Achour, Anas Ben; Nestler, Andreas; Brosius, Alexander; Lauer, Günter

    2018-05-01

    The machining operation of drilling is part of the standard repertoire for medical applications. This machining cycle, which is usually a multi-stage process, generates the geometric element for the subsequent integration of implants, which are screwed into the bone in later process steps. In addition to the form, shape and position of the generated drill hole, it is also necessary to use a technology that ensures an operation with minimal damage. A surface damaged by excessive mechanical and thermal energy input shows a deterioration in the healing capacity of implants and represents a structure prone to complications from inflammatory reactions. The resulting loads are influenced by the material properties of the bone, the technology used and the tool properties. An important aspect of the process analysis is the fact that machining of bone is in most cases a manual process that depends mainly on the skills of the operator. This includes, among other things, the machining time for the production of a drill hole, since manual drilling is a force-controlled process. Experimental work was carried out on the bone of a porcine mandible in order to investigate the interrelation of the applied loads during drilling. It can be shown that the load application can be subdivided according to the working feed direction. The entire drilling process thus consists of several time domains, which can be divided into the geometry-generating feed motion and a retraction movement of the tool. It has been shown that the removal of the tool from the drill hole has a significant influence on the mechanical load input. This fact is proven in detail by a new evaluation methodology. The causes of this characteristic can also be identified, as well as possible ways of reducing the load input.
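    A simplified stand-in for the time-domain split described above is sketched below: a recorded thrust-force signal is separated into advancing and retraction phases using the sign of the feed velocity, and the load in each phase is compared. The signals are synthetic, and this is not the authors' evaluation methodology.

```python
# Simplified stand-in (not the authors' methodology): split a thrust-force
# signal into feed (advancing) and retraction phases via the sign of the feed
# velocity, then compare the load in each phase. All signals are synthetic.
import numpy as np

t = np.linspace(0.0, 4.0, 4001)                                  # s
feed_position = np.where(t < 3.0, 2.0 * t, 6.0 - 6.0 * (t - 3.0))  # mm: slow feed, faster retract
force = np.where(t < 3.0, 40.0 + 5.0 * np.sin(20 * t), 8.0)        # N, hypothetical

feed_velocity = np.gradient(feed_position, t)
advancing = feed_velocity > 0

print(f"mean force while advancing:  {force[advancing].mean():.1f} N")
print(f"mean force while retracting: {force[~advancing].mean():.1f} N")
print(f"impulse while retracting:    {np.trapz(force[~advancing], t[~advancing]):.2f} N*s")
```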

  14. User's manual for seismic analysis code 'SONATINA-2V'

    International Nuclear Information System (INIS)

    Hanawa, Satoshi; Iyoku, Tatsuo

    2001-08-01

    The seismic analysis code SONATINA-2V has been developed to analyze the behavior of the HTTR core graphite components under seismic excitation. The SONATINA-2V code is a two-dimensional computer program capable of analyzing the vertical arrangement of the HTTR graphite components, such as fuel blocks, replaceable reflector blocks and permanent reflector blocks, as well as their restraint structures. In the analytical model, each block is treated as a rigid body and is restrained by dowel pins which restrict relative horizontal movement but allow vertical and rocking motions between upper and lower blocks. Moreover, the SONATINA-2V code is capable of analyzing the core vibration behavior under simultaneous excitation in the vertical and horizontal directions. The SONATINA-2V code is composed of the main program, a pre-processor for preparing the SONATINA-2V input data, and a post-processor for data processing and for producing graphics from the analytical results. Although the SONATINA-2V code was developed to run on the MSP computer system of the Japan Atomic Energy Research Institute (JAERI), that system was retired as computer technology advanced. The analysis code was therefore improved to run on JAERI's UNIX-based SR8000 computer system. The user's manual for the seismic analysis code SONATINA-2V, including the pre- and post-processors, is given in the present report. (author)

  15. comparative analysis of mechanical and manual modes of traffic

    African Journals Online (AJOL)

    user

    with a difference of (0.999x10^6) between the mechanical and manual, this value shows a downward effect using manual data... Various methods can be adopted for verification such as listed... Maintenance cost is high using mechanical.

  16. Cytogenetic analysis for radiation dose assessment. A manual

    International Nuclear Information System (INIS)

    2001-01-01

    Chromosome aberration analysis is recognized as a valuable dose assessment method which fills a gap in dosimetric technology, particularly when there are difficulties in interpreting the data, in cases where there is reason to believe that persons not wearing dosimeters have been exposed to radiation, in cases of claims for compensation for radiation injuries that are not supported by unequivocal dosimetric evidence, or in cases of exposure over an individual's working lifetime. The IAEA has maintained a long-standing involvement in biological dosimetry commencing in 1978. This has been via a sequence of Co-ordinated Research Programmes (CRPs), the running of Regional Training Courses, the sponsorship of individual training fellowships and the provision of necessary equipment to laboratories in developing Member States. The CRP on the 'Use of Chromosome Aberration Analysis in Radiation Protection' was initiated by the IAEA in 1982. It ended with the publication of IAEA Technical Report Series No. 260, 'Biological Dosimetry: Chromosomal Aberration Analysis for Dose Assessment', in 1986. The overall objective of the CRP (1998-2000) on 'Radiation Dosimetry through Biological Indicators' is to review and standardize the available methods and amend the above-mentioned IAEA publication with current techniques on cytogenetic bioindicators which may be of practical use in biological dosimetry worldwide. An additional objective is to identify promising cytogenetic techniques to provide Member States with up-to-date and generally agreed advice regarding the best focus for research and suggestions for the most suitable techniques for near-future practice in biodosimetry. This activity is in accordance with the International Basic Safety Standards (BSS) published in 1996. To pursue this task the IAEA has conducted a Research Co-ordination Meeting (Budapest, Hungary, June 1998) with the participation of senior scientists of 24 biodosimetry laboratories to discuss
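    The core dose-estimation step in dicentric biodosimetry is the inversion of a linear-quadratic calibration curve, Y = C + alpha*D + beta*D^2, where Y is the observed aberration yield per cell. The sketch below solves that equation for dose using hypothetical calibration coefficients and counts; it is illustrative only and does not use IAEA-recommended values.

```python
# Illustrative calculation only: invert a linear-quadratic dicentric calibration
# curve Y = C + alpha*D + beta*D**2 to estimate dose D from an observed yield.
# Calibration coefficients and counts are hypothetical, not IAEA values.
import math

C, alpha, beta = 0.001, 0.02, 0.06   # yields per cell: background, per Gy, per Gy^2

def dose_from_yield(dicentrics, cells):
    y = dicentrics / cells
    # positive root of beta*D^2 + alpha*D + (C - y) = 0
    disc = alpha**2 - 4.0 * beta * (C - y)
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)

print(f"estimated dose = {dose_from_yield(dicentrics=45, cells=500):.2f} Gy")
```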

  17. Structures manual

    Science.gov (United States)

    2001-01-01

    This manual was written as a guide for use by design personnel in the Vermont Agency of Transportation Structures Section. This manual covers the design responsibilities of the Section. It does not cover other functions that are a part of the Structu...

  18. Wetlands Research Program. Corps of Engineers Wetlands Delineation Manual. Appendix C. Sections 1 and 2. Region 2 - Southeast.

    Science.gov (United States)

    1987-01-01

    WETLANDS RESEARCH PROGRAM TECHNICAL REPORT Y-87-1: Corps of Engineers Wetlands Delineation Manual, Appendix C, Sections 1 and 2 (US Army Corps of Engineers, Washington, DC 20314-1000). Appendix C, Section 1 is the national list of plant species that occur in wetlands, giving each species with its wetland indicator status (e.g., FAC, FACW).

  19. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis

    International Nuclear Information System (INIS)

    2016-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  20. The development of practice manual for LSC based on job analysis in radiation measurement and analysis

    International Nuclear Information System (INIS)

    Shin, W H; Park, T J

    2017-01-01

    Radiation technology is closely related to industrial growth and the creation of employment in Korea. Techniques such as radiation and radioactivity measurement, together with the practical skills needed to achieve a higher level of analysis, are required. In this study, a practice manual for the liquid scintillation counter was developed based on job analysis. The raw data used in the job analysis were collected through online and offline surveys of 420 workers employed in Korea. An importance-priority analysis was performed to define duties and competency units consisting of the knowledge and skills for each task. The refined data were reviewed by experts with hands-on experience of the actual duties on site. Classification was conducted through focus group interviews to derive the duties and competency units. Among radiation measurement and analysis devices, the liquid scintillation counter was selected first because of the high demand for training. The installation status of liquid scintillation counters in Korea was investigated, and the technical specifications and operating procedures of the two main devices were then analyzed and integrated into the practice manual. The duties and competency units were applied to the integrated materials, respectively. To validate its effectiveness, a test curriculum was designed as an advanced course for workers engaged in radiation measurement and analysis. The developed manual is structured to take advantage of this test training. This manual will be a practical handbook that can improve the knowledge and skills of radiation workers in Korea. (paper)

  1. The development of practice manual for LSC based on job analysis in radiation measurement and analysis

    Science.gov (United States)

    Shin, W. H.; Park, T. J.

    2017-06-01

    Radiation technology is closely related to industrial growth and the creation of employment in Korea. Techniques such as radiation and radioactivity measurement, together with the practical skills needed to achieve a higher level of analysis, are required. In this study, a practice manual for the liquid scintillation counter was developed based on job analysis. The raw data used in the job analysis were collected through online and offline surveys of 420 workers employed in Korea. An importance-priority analysis was performed to define duties and competency units consisting of the knowledge and skills for each task. The refined data were reviewed by experts with hands-on experience of the actual duties on site. Classification was conducted through focus group interviews to derive the duties and competency units. Among radiation measurement and analysis devices, the liquid scintillation counter was selected first because of the high demand for training. The installation status of liquid scintillation counters in Korea was investigated, and the technical specifications and operating procedures of the two main devices were then analyzed and integrated into the practice manual. The duties and competency units were applied to the integrated materials, respectively. To validate its effectiveness, a test curriculum was designed as an advanced course for workers engaged in radiation measurement and analysis. The developed manual is structured to take advantage of this test training. This manual will be a practical handbook that can improve the knowledge and skills of radiation workers in Korea.

  2. Wetlands Research Program. Corps of Engineers Wetlands Delineation Manual. Appendix C. Section 1. Region O - California.

    Science.gov (United States)

    1987-01-01

    status is questioned. An X prior to the species name in the scientific name column denotes a hybrid. For purposes of this manual, all species appearing... The remainder of the excerpt lists entries from the regional plant species list (e.g., Kochia, Lactuca, Lantana and Lasthenia species) with their wetland indicator statuses (FAC, FACW).

  3. A Manual for Basic Techniques of Data Analysis and Distribution

    OpenAIRE

    Alvi, Mohsin

    2014-01-01

    This manual is designed to support the basic concepts of statistics and their implications in econometrics; in addition, the interpretation of further statistical techniques is illustrated with examples and graphical methods. It comprises several instances of tests obtained from statistical software such as SPSS, E-views, Stata and the R language, together with their research models and the essentials for running the tests. The manual is built around two elements, fi...

  4. Code development and analysis program. RELAP4/MOD7 (Version 2): user's manual

    International Nuclear Information System (INIS)

    1978-08-01

    This manual describes RELAP4/MOD7 (Version 2), which is the latest version of the RELAP4 LPWR blowdown code. Version 2 is a precursor to the final version of RELAP4/MOD7, which will address LPWR LOCA analysis in integral fashion (i.e., blowdown, refill, and reflood in continuous fashion). This manual describes the new code models and provides application information required to utilize the code. It must be used in conjunction with the RELAP4/MOD5 User's Manual (ANCR-NUREG-1335, dated September 1976), and the RELAP4/MOD6 User's Manual

  5. Depth of manual dismantling analysis: A cost–benefit approach

    Energy Technology Data Exchange (ETDEWEB)

    Achillas, Ch., E-mail: c.achillas@ihu.edu.gr [School of Economics and Business Administration, International Hellenic University, 14th km Thessaloniki-Moudania, 57001 Thermi (Greece); Aidonis, D. [Department of Logistics, Alexander Technological Educational Institute, Branch of Katerini, 60100 Katerini (Greece); Vlachokostas, Ch.; Karagiannidis, A.; Moussiopoulos, N.; Loulos, V. [Laboratory of Heat Transfer and Environmental Engineering, Department of Mechanical Engineering, Aristotle University, Thessaloniki, Box 483, 54124 Thessaloniki (Greece)

    2013-04-15

    Highlights: ► A mathematical modeling tool for OEMs. ► The tool can be used by OEMs, recyclers of electr(on)ic equipment or regulators of WEEE management systems. ► The tool uses cost–benefit analysis to determine the optimal depth of product disassembly. ► The reusable materials and the quantities of metals and plastics recycled can be quantified in an easy-to-comprehend manner. - Abstract: This paper presents a decision support tool for manufacturers and recyclers towards end-of-life strategies for waste electrical and electronic equipment. A mathematical formulation based on the cost–benefit analysis concept is herein analytically described in order to determine the parts and/or components of an obsolete product that should be either non-destructively recovered for reuse or recycled. The framework optimally determines the depth of disassembly for a given product, taking into account economic considerations. On this basis, it embeds all relevant cost elements to be included in the decision-making process, such as recovered materials and (depreciated) parts/components, labor costs, energy consumption, equipment depreciation, quality control and warehousing. This tool can be part of the strategic decision-making process in order to maximize profitability or minimize end-of-life management costs. A case study demonstrating the model's applicability is presented for a product that is typical of electronic products in terms of structure and material composition. Taking into account the market values of the pilot product's components, manual disassembly is proven profitable, with the marginal revenues from recovered reusable materials estimated at 2.93–23.06 €, depending on the level of disassembly.
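    The depth-of-disassembly logic can be illustrated with a toy calculation: each successive disassembly step has a revenue and a cost, and the optimal depth is the step count with the highest cumulative net benefit. The sketch below uses invented step values and is not the paper's formulation.

```python
# Toy sketch of the cost-benefit idea (not the paper's formulation): each
# disassembly step yields revenue from recovered parts/materials and costs
# labour and energy; the optimal depth maximizes cumulative net benefit.
# All step names and values below are invented.
steps = [
    # (revenue EUR, cost EUR) per successive disassembly step
    (6.0, 1.5),   # remove casing
    (9.0, 3.0),   # recover circuit board
    (2.0, 2.5),   # separate small plastics
    (0.5, 2.0),   # full material separation
]

best_depth, best_net, running = 0, 0.0, 0.0
for depth, (revenue, cost) in enumerate(steps, start=1):
    running += revenue - cost
    if running > best_net:
        best_depth, best_net = depth, running

print(f"optimal depth: {best_depth} steps, net benefit {best_net:.2f} EUR")
```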

  6. Depth of manual dismantling analysis: A cost–benefit approach

    International Nuclear Information System (INIS)

    Achillas, Ch.; Aidonis, D.; Vlachokostas, Ch.; Karagiannidis, A.; Moussiopoulos, N.; Loulos, V.

    2013-01-01

    Highlights: ► A mathematical modeling tool for OEMs. ► The tool can be used by OEMs, recyclers of electr(on)ic equipment or regulators of WEEE management systems. ► The tool uses cost–benefit analysis to determine the optimal depth of product disassembly. ► The reusable materials and the quantities of metals and plastics recycled can be quantified in an easy-to-comprehend manner. - Abstract: This paper presents a decision support tool for manufacturers and recyclers towards end-of-life strategies for waste electrical and electronic equipment. A mathematical formulation based on the cost–benefit analysis concept is herein analytically described in order to determine the parts and/or components of an obsolete product that should be either non-destructively recovered for reuse or recycled. The framework optimally determines the depth of disassembly for a given product, taking into account economic considerations. On this basis, it embeds all relevant cost elements to be included in the decision-making process, such as recovered materials and (depreciated) parts/components, labor costs, energy consumption, equipment depreciation, quality control and warehousing. This tool can be part of the strategic decision-making process in order to maximize profitability or minimize end-of-life management costs. A case study demonstrating the model's applicability is presented for a product that is typical of electronic products in terms of structure and material composition. Taking into account the market values of the pilot product's components, manual disassembly is proven profitable, with the marginal revenues from recovered reusable materials estimated at 2.93–23.06 €, depending on the level of disassembly.

  7. A Content Analysis of General Chemistry Laboratory Manuals for Evidence of Higher-Order Cognitive Tasks

    Science.gov (United States)

    Domin, Daniel S.

    1999-01-01

    The science laboratory instructional environment is ideal for fostering the development of problem-solving, manipulative, and higher-order thinking skills: the skills needed by today's learner to compete in an increasingly technology-based society. This paper reports the results of a content analysis of ten general chemistry laboratory manuals. Three experiments from each manual were examined for evidence of higher-order cognitive activities. Analysis was based upon the six major cognitive categories of Bloom's Taxonomy of Educational Objectives: knowledge, comprehension, application, analysis, synthesis, and evaluation. The results of this study show that the overwhelming majority of general chemistry laboratory manuals provide tasks that require the use of only the lower-order cognitive skills: knowledge, comprehension, and application. Only two of the laboratory manuals stood out in having activities that utilized higher-order cognition. I describe the instructional strategies used within these manuals to foster higher-order cognitive development.

  8. Financial Reporting and Cost Analysis Manual for Day Care Centers, Head Start, and Other Programs.

    Science.gov (United States)

    Bedger, Jean E.; And Others

    This manual is designed to provide fundamental directions for systematic financial reporting and cost analysis for the administrators, accountants, bookkeepers, and staff of day care, Project Head Start, and other programs. The major aims of the manual are to induce day care directors to adopt uniform bookkeeping procedures and to analyze costs…

  9. Manuals of food quality control 10. training in mycotoxins analysis

    International Nuclear Information System (INIS)

    1991-01-01

    This manual is designed to cover a course of about three weeks to train food analysts in developing countries. Mycotoxins are described and analytical methods for detecting their presence in food and animal feeds are presented, with especial emphasis on immunoassay and thin-layer chromatographic procedures. 40 figs, 10 tabs

  10. Microscopic Analysis of Plankton, Periphyton, and Activated Sludge. Training Manual.

    Science.gov (United States)

    Environmental Protection Agency, Washington, DC. Office of Water Programs.

    This manual is intended for professional personnel in the fields of water pollution control, limnology, water supply and waste treatment. Primary emphasis is given to practice in the identification and enumeration of microscopic organisms which may be encountered in water and activated sludge. Methods for the chemical and instrumental evaluation…

  11. Interference analysis of fission cross section

    International Nuclear Information System (INIS)

    Toshkov, S.A.; Yaneva, N.B.

    1976-01-01

    The formula for the reaction cross-section, based on the R-matrix formalism and taking into account the interference between two neighbouring resonances assigned the same value of total angular momentum, was used for the analysis of the cross-section of resonance-neutron-induced fission of 230Pu. The experimental resolution and the thermal motion of the target nuclei were accounted for by numerical integration
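
    As an illustration of the kind of expression analysed, a commonly quoted two-level interference form of the fission cross-section for two resonances λ = 1, 2 of the same spin J is sketched below; the abstract does not give the authors' exact parameterisation, so this should be read as a generic R-matrix-style formula rather than their specific one (here \bar{\lambda} is the reduced neutron wavelength, g_J the spin statistical factor, and Γ_{λn}, Γ_{λf}, Γ_λ the neutron, fission and total widths).

```latex
% Illustrative two-level interference form, not necessarily the authors' expression.
\sigma_f(E) \;=\; \pi\,\bar{\lambda}^{2}\, g_J
  \left| \sum_{\lambda=1}^{2}
  \frac{\sqrt{\Gamma_{\lambda n}\,\Gamma_{\lambda f}}}
       {E_\lambda - E - \tfrac{i}{2}\,\Gamma_\lambda} \right|^{2}
```

    The quantity compared with experiment is then this σ_f folded, by numerical integration, with the Doppler-broadening kernel and the instrumental resolution function.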

  12. Nuclear design manual for generation of cross section and heterogeneous formfunction for CASMO-3/MASTER

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Ho; Cho, Byung Oh; Song, Jae Seong; Lee, Chung Chan

    1996-12-01

    A three-dimensional reactor core simulation code, MASTER, has been developed as a part of the ADONIS project in KAERI. CASMO-3 prepares various two-group cross sections for the constituents of a reactor core such as fuel assembly, radial and axial reflectors, control rod and detector for MASTER. This report includes the standard design procedure for generation of two-group cross sections and heterogeneous formfunction by CASMO-3/FORM for MASTER. (author). 16 refs., 16 tabs., 12 figs.

  13. Manual versus Automated Narrative Analysis of Agrammatic Production Patterns: The Northwestern Narrative Language Analysis and Computerized Language Analysis

    Science.gov (United States)

    Hsu, Chien-Ju; Thompson, Cynthia K.

    2018-01-01

    Purpose: The purpose of this study is to compare the outcomes of the manually coded Northwestern Narrative Language Analysis (NNLA) system, which was developed for characterizing agrammatic production patterns, and the automated Computerized Language Analysis (CLAN) system, which has recently been adopted to analyze speech samples of individuals…

  14. Synfuel program analysis. Volume 2: VENVAL users manual

    Science.gov (United States)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    This volume is intended for program analysts and is a user's manual for the VENVAL model. It contains specific explanations of input data requirements and programming procedures for the use of this model. VENVAL is a generalized computer program to aid in the evaluation of prospective private sector production ventures. The program can project interrelated values of installed capacity, production, sales revenue, operating costs, depreciation, investment, debt, earnings, taxes, return on investment, depletion, and cash flow measures. It can also compute related public sector and other external costs and revenues if unit costs are furnished.
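
    As a rough illustration of the kind of interrelated quantities VENVAL projects (revenue, operating cost, depreciation, taxes, cash flow, return measures), the sketch below builds a simple yearly cash-flow series and discounts it; the straight-line depreciation, flat tax rate and all numbers are illustrative assumptions, not VENVAL's actual formulation.

```python
# Illustrative yearly cash-flow projection for a production venture.
# Straight-line depreciation and a flat tax rate are assumptions only.

def project_cash_flows(capex, life, revenue, opex, tax_rate=0.35):
    depreciation = capex / life
    flows = []
    for year in range(1, life + 1):
        taxable = revenue[year - 1] - opex[year - 1] - depreciation
        taxes = max(0.0, tax_rate * taxable)
        # cash flow adds depreciation back, since it is a non-cash charge
        cash_flow = revenue[year - 1] - opex[year - 1] - taxes
        flows.append(cash_flow)
    return flows

def npv(rate, capex, flows):
    return -capex + sum(cf / (1 + rate) ** t for t, cf in enumerate(flows, 1))

flows = project_cash_flows(capex=100.0, life=5,
                           revenue=[60, 65, 70, 70, 70],
                           opex=[30, 31, 32, 32, 32])
print(round(npv(0.10, 100.0, flows), 2))
```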

  15. Thoracic manual therapy is not more effective than placebo thoracic manual therapy in patients with shoulder dysfunctions: A systematic review with meta-analysis.

    Science.gov (United States)

    Bizzarri, Paolo; Buzzatti, Luca; Cattrysse, Erik; Scafoglieri, Aldo

    2018-02-01

    Manual treatments targeting different regions (shoulder, cervical spine, thoracic spine, ribs) have been studied to deal with patients complaining of shoulder pain. Thoracic manual treatments seem able to produce beneficial effects on this group of patients. However, it is not clear whether the patient improvement is a consequence of thoracic manual therapy or a placebo effect. To compare the efficacy of thoracic manual therapy and placebo thoracic manual treatment for patients with shoulder dysfunction. Electronic databases (MEDLINE, CENTRAL, PEDro, CINAHL, WoS, EMBASE, ERIC) were searched through November 2016. Randomized Controlled Trials assessing pain, mobility and function were selected. The Cochrane bias estimation tool was applied. Outcome results were either extracted or computed from raw data. Meta-analysis was performed for outcomes with low heterogeneity. Four studies were included in the review. The methodology of the included studies was generally good except for one study that was rated as high risk of bias. Meta-analysis showed no significant effect for "pain at present" (SMD -0.02; 95% CI: -0.35, 0.32) and "pain during movement" (SMD -0.12; 95% CI: -0.45, 0.21). There is very low to low quality of evidence that a single session of thoracic manual therapy is not more effective than a single session of placebo thoracic manual therapy in patients with shoulder dysfunction at immediate post-treatment. Copyright © 2017 Elsevier Ltd. All rights reserved.
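
    The pooled effects quoted above are inverse-variance weighted standardized mean differences; a minimal sketch of computing a fixed-effect pooled SMD and its 95% CI from per-study SMDs and confidence intervals is given below (the two studies and their values are hypothetical, not the trials included in this review).

```python
import math

# Fixed-effect inverse-variance pooling of standardized mean differences.
# The per-study SMDs and 95% CIs below are hypothetical.

def pool_smd(studies):
    """studies: list of (smd, ci_low, ci_high) with 95% confidence intervals."""
    weights, weighted = [], []
    for smd, lo, hi in studies:
        se = (hi - lo) / (2 * 1.96)          # back out the standard error
        w = 1.0 / se ** 2                    # inverse-variance weight
        weights.append(w)
        weighted.append(w * smd)
    pooled = sum(weighted) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

print(pool_smd([(-0.10, -0.55, 0.35), (0.05, -0.40, 0.50)]))
```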

  16. Sample problem manual for benchmarking of cask analysis codes

    International Nuclear Information System (INIS)

    Glass, R.E.

    1988-02-01

    A series of problems has been defined to evaluate structural and thermal codes. These problems were designed to simulate the hypothetical accident conditions given in Title 10 of the Code of Federal Regulations, Part 71 (10CFR71) while retaining simple geometries. This produced a problem set that exercises the ability of the codes to model pertinent physical phenomena without requiring extensive use of computer resources. The solutions that are presented are consensus solutions based on computer analyses done by both national laboratories and industry in the United States, United Kingdom, France, Italy, Sweden, and Japan. The intent of this manual is to provide code users with a set of standard structural and thermal problems and solutions which can be used to evaluate individual codes. 19 refs., 19 figs., 14 tabs

  17. Micromechanical combined stress analysis: MICSTRAN, a user manual

    Science.gov (United States)

    Naik, R. A.

    1992-01-01

    Composite materials are currently being used in aerospace and other applications. The ability to tailor the composite properties by the appropriate selection of its constituents, the fiber and matrix, is a major advantage of composite materials. The Micromechanical Combined Stress Analysis (MICSTRAN) code provides the materials engineer with a user-friendly personal computer (PC) based tool to calculate overall composite properties given the constituent fiber and matrix properties. To assess the ability of the composite to carry structural loads, the materials engineer also needs to calculate the internal stresses in the composite material. MICSTRAN is a simple tool to calculate such internal stresses with a composite ply under combined thermomechanical loading. It assumes that the fibers have a circular cross-section and are arranged either in a repeating square or diamond array pattern within a ply. It uses a classical elasticity solution technique that has been demonstrated to calculate accurate stress results. Input to the program consists of transversely isotropic fiber properties and isotropic matrix properties such as moduli, Poisson's ratios, coefficients of thermal expansion, and volume fraction. Output consists of overall thermoelastic constants and stresses. Stresses can be computed under the combined action of thermal, transverse, longitudinal, transverse shear, and longitudinal shear loadings. Stress output can be requested along the fiber-matrix interface, the model boundaries, circular arcs, or at user-specified points located anywhere in the model. The MICSTRAN program is Windows compatible and takes advantage of the Microsoft Windows graphical user interface which facilitates multitasking and extends memory access far beyond the limits imposed by the DOS operating system.

  18. The Effect of Aging on Physical Performance Among Elderly Manual Workers: Protocol of a Cross-Sectional Study.

    Science.gov (United States)

    Norheim, Kristoffer Larsen; Hjort Bønløkke, Jakob; Samani, Afshin; Omland, Øyvind; Madeleine, Pascal

    2017-11-22

    In 2012, the Danish Parliament decided to increase retirement age. Unfortunately, elderly people working in a physically demanding environment may be rendered unable to retain the ability to adequately perform the physical requirements of their jobs, due to age-related decreases in physical performance. Therefore, increasing the retirement age may not necessarily lead to the goal of keeping everybody in the labor market for a longer time. To date, our knowledge about the variations in physical performance of the elderly workforce is limited. In this cross-sectional study we seek to investigate the effects of aging on physical performance among elderly manual workers. Approximately 100 Danish manual workers between 50 and 70 years of age will be recruited. The main measurement outcomes include: (1) inflammatory status from blood samples; (2) body composition; (3) lung function; (4) static and dynamic balance; (5) reaction time, precision, and movement variability during a hammering task; (6) handgrip strength, rate of force development, and force tracking; (7) estimated maximal rate of oxygen consumption; and (8) back mobility. Additionally, information regarding working conditions, physical activity levels, and health status will be assessed with a questionnaire. Data collection is expected to take place between autumn 2017 and spring 2018. This study will increase the knowledge regarding variations in physical performance in the elderly workforce and may identify potential workplace hazards. Moreover, this study might shed light on the potentially problematic decision to increase retirement age for all Danish citizens. ©Kristoffer Larsen Norheim, Jakob Hjort Bønløkke, Afshin Samani, Øyvind Omland, Pascal Madeleine. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 22.11.2017.

  19. HASL procedures manual

    International Nuclear Information System (INIS)

    Harley, J.H.

    1977-08-01

    Additions and corrections to the following sections of the HASL Procedures Manual are provided: General, Sampling, Field Measurements; General Analytical Chemistry, Chemical Procedures, Data Section, and Specifications

  20. Users' manual for fault tree analysis code: CUT-TD

    International Nuclear Information System (INIS)

    Watanabe, Norio; Kiyota, Mikio.

    1992-06-01

    The CUT-TD code has been developed to find minimal cut sets for a given fault tree and to calculate the occurrence probability of its top event. This code uses an improved top-down algorithm which enhances the efficiency of deriving minimal cut sets. The processing techniques incorporated into CUT-TD are as follows: (1) Consecutive OR gates or consecutive AND gates can be coalesced into a single gate; as a result, this processing directly produces cut sets for the redefined single gate without developing each constituent gate individually. (2) Independent subtrees are automatically identified and their respective cut sets are found separately to enhance the efficiency of processing. (3) The minimal cut sets for the top event of a fault tree can be obtained by combining the respective minimal cut sets of several gates of the fault tree. (4) The user can reduce the computing time for finding minimal cut sets and control the size and significance of cut sets by specifying a minimum probability cut-off and/or a maximum order cut-off. (5) The user can select events that need not be further developed in the process of obtaining minimal cut sets; this option can reduce the number of minimal cut sets, save computing time and assist the user in reviewing the result. (6) Computing time is monitored by the CUT-TD code so that it can prevent the running job from ending abnormally due to excessive CPU time and can produce an intermediate result; the CUT-TD code has the ability to restart the calculation using this intermediate result. This report provides a users' manual for the CUT-TD code. (author)
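
    To make the top-down idea concrete, the sketch below performs a generic MOCUS-style expansion of a small fault tree into minimal cut sets and then estimates the top-event probability with the rare-event approximation; it is an illustration of the general technique, not CUT-TD's actual algorithm or input format.

```python
# Generic top-down (MOCUS-style) expansion of a fault tree into minimal
# cut sets; illustrative only, not CUT-TD's actual algorithm.

def minimal_cut_sets(gates, top):
    """gates: {gate: ("AND"|"OR", [children])}; names not in gates are basic events."""
    cut_sets = {frozenset([top])}
    while True:
        expanded, done = set(), True
        for cs in cut_sets:
            gate = next((g for g in cs if g in gates), None)
            if gate is None:
                expanded.add(cs)
                continue
            done = False
            kind, children = gates[gate]
            rest = cs - {gate}
            if kind == "AND":
                expanded.add(rest | frozenset(children))
            else:  # OR gate: one candidate cut set per child
                expanded.update(rest | {child} for child in children)
        cut_sets = expanded
        if done:
            break
    # keep only minimal sets (drop proper supersets)
    return [s for s in cut_sets if not any(o < s for o in cut_sets)]

def top_event_probability(cut_sets, p):
    """Rare-event approximation: sum over cut sets of the product of event probabilities."""
    return sum(
        __import__("math").prod(p[event] for event in cs) for cs in cut_sets
    )

gates = {"TOP": ("OR", ["G1", "C"]), "G1": ("AND", ["A", "B"])}
mcs = minimal_cut_sets(gates, "TOP")
print(mcs, top_event_probability(mcs, {"A": 1e-3, "B": 2e-3, "C": 5e-4}))
```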

  1. Military construction program economic analysis manual: Sample economic analyses: Hazardous Waste Remedial Actions Program

    International Nuclear Information System (INIS)

    1987-12-01

    This manual enables the US Air Force to comprehensively and systematically analyze alternative approaches to meeting its military construction requirements. The manual includes step-by-step procedures for completing economic analyses for military construction projects, beginning with determining if an analysis is necessary. Instructions and a checklist of the tasks involved for each step are provided; and examples of calculations and illustrations of completed forms are included. The manual explains the major tasks of an economic analysis, including identifying the problem, selecting realistic alternatives for solving it, formulating appropriate assumptions, determining the costs and benefits of the alternatives, comparing the alternatives, testing the sensitivity of major uncertainties, and ranking the alternatives. Appendixes are included that contain data, indexes, and worksheets to aid in performing the economic analyses. For reference, Volume 2 contains sample economic analyses that illustrate how each form is filled out and that include a complete example of the documentation required

  2. Military construction program economic analysis manual: Text and appendixes: Hazardous Waste Remedial Actions Program

    International Nuclear Information System (INIS)

    1987-12-01

    This manual enables the US Air Force to comprehensively and systematically analyze alternative approaches to meeting its military construction requirements. The manual includes step-by-step procedures for completing economic analyses for military construction projects, beginning with determining if an analysis is necessary. Instructions and a checklist of the tasks involved for each step are provided; and examples of calculations and illustrations of completed forms are included. The manual explains the major tasks of an economic analysis, including identifying the problem, selecting realistic alternatives for solving it, formulating appropriate assumptions, determining the costs and benefits of the alternatives, comparing the alternatives, testing the sensitivity of major uncertainties, and ranking the alternatives. Appendixes are included that contain data, indexes, and worksheets to aid in performing the economic analyses. For reference, Volume 2 contains sample economic analyses that illustrate how each form is filled out and that include a complete example of the documentation required. 6 figs., 12 tabs

  3. Manual Therapy in the Treatment of Idiopathic Scoliosis. Analysis of Current Knowledge.

    Science.gov (United States)

    Czaprowski, Dariusz

    2016-10-28

    Apart from the recommended specific physiotherapy, the treatment of idiopathic scoliosis (IS) also incorporates non-specific manual therapy (NMT). The aim of this paper is to assess the efficacy of NMT (manual therapy, chiropractic, osteopathy) used in the treatment of children and adolescents with IS. The study analysed systematic reviews (Analysis 1) and other recent scientific publications (Analysis 2). Analysis 1 encompassed papers on the use of NMT in patients with IS. Works concerning specific physiotherapy (SP) or bracing (B) and other types of scoliosis were excluded from the analysis. Inclusion criteria for Analysis 2 were: treatment with NMT; subjects aged 10-18 years with IS. The following types of papers were excluded: works analysing NMT combined with SP or B, reports concerning adult patients, analyses of single cases and publications included in Analysis 1. Analysis 1: six systematic reviews contained 6 papers on the efficacy of NMT in the treatment of IS. The results of these studies are contradictory, ranging from Cobb angle reduction to no treatment effects whatsoever. The papers analysed are characterised by poor methodological quality: small group sizes, incomplete descriptions of the study groups, no follow-up and no control groups. Analysis 2: in total, 217 papers were found. None of them met the criteria set for the analysis. 1. Few papers verifying the efficacy of manual therapy, chiropractic and osteopathy in the treatment of idiopathic scoliosis have been published to date. 2. The majority are experimental studies with poor methodology or observational case studies. 3. At present, the efficacy of non-specific manual therapy in the treatment of patients with idiopathic scoliosis cannot be reliably evaluated. 4. It is necessary to conduct further research based on appropriate methods (prospective, randomised, controlled studies) in order to reliably assess the usefulness of non-specific manual therapy in the treatment of idiopathic scoliosis.

  4. GRACE manual

    International Nuclear Information System (INIS)

    Ishikawa, T.; Kawabata, S.; Shimizu, Y.; Kaneko, T.; Kato, K.; Tanaka, H.

    1993-02-01

    This manual is composed of three kinds of material: the theoretical background for calculating the cross section of an elementary process, the usage, and the technical details of the GRACE system. Throughout this manual we take the tree-level process e⁺e⁻ → W⁺W⁻γ as an example, including the e±–scalar-boson interactions. The real FORTRAN source code for this process is attached in the relevant sections, as well as the results of the calculation, which might be a great help for understanding the practical use of the system. (J.P.N.)

  5. Integrating guideline development and implementation: analysis of guideline development manual instructions for generating implementation advice

    Directory of Open Access Journals (Sweden)

    Gagliardi Anna R

    2012-07-01

    Full Text Available Abstract Background Guidelines are important tools that inform healthcare delivery based on best available research evidence. Guideline use is in part based on quality of the guidelines, which includes advice for implementation and has been shown to vary. Others hypothesized this is due to limited instructions in guideline development manuals. The purpose of this study was to examine manual instructions for implementation advice. Methods We used a directed and summative content analysis approach based on an established framework of guideline implementability. Six manuals identified by another research group were examined to enumerate implementability domains and elements. Results Manuals were similar in content but lacked sufficient detail in particular domains. Most frequently this was Accomodation, which includes information that would help guideline users anticipate and/or overcome organizational and system level barriers. In more than one manual, information was also lacking for Communicability, information that would educate patients or facilitate their involvement in shared decision making, and Applicability, or clinical parameters to help clinicians tailor recommendations for individual patients. Discussion Most manuals that direct guideline development lack complete information about incorporating implementation advice. These findings can be used by those who developed the manuals to consider expanding their content in these domains. It can also be used by guideline developers as they plan the content and implementation of their guidelines so that the two are integrated. New approaches for guideline development and implementation may need to be developed. Use of guidelines might be improved if they included implementation advice, but this must be evaluated through ongoing research.

  6. Model for Analysis of the Energy Demand (MAED) users' manual for version MAED-1

    International Nuclear Information System (INIS)

    1986-09-01

    This manual is organized in two major parts. The first part includes eight main sections describing how to use the MAED-1 computer program and the second one consists of five appendices giving some additional information about the program. Concerning the main sections of the manual, Section 1 gives a summary description and some background information about the MAED-1 model. Section 2 extends the description of the MAED-1 model in more detail. Section 3 introduces some concepts, mainly related to the computer requirements imposed by the program, that are used throughout this document. Sections 4 to 7 describe how to execute each of the various programs (or modules) of the MAED-1 package. The description for each module shows the user how to prepare the control and data cards needed to execute the module and how to interpret the printed output produced. Section 8 recapitulates about the use of MAED-1 for carrying out energy and electricity planning studies, describes the several phases normally involved in this type of study and provides the user with practical hints about the most important aspects that need to be verified at each phase while executing the various MAED modules

  7. CONPAS 1.0 (CONtainment Performance Analysis System). User's manual

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Jin, Young Ho

    1996-04-01

    CONPAS (CONtainment Performance Analysis System) is a verified computer code package to integrate the numerical, graphical, and results-operation aspects of Level 2 probabilistic safety assessments (PSA) for nuclear power plants automatically under a PC window environment. Compared with the existing DOS-based computer codes for Level 2 PSA, the most important merit of the window-based computer code is that the user can easily describe and quantify the accident progression models and manipulate the resultant outputs in a variety of ways. As the main logic for accident progression analysis, CONPAS employs the concept of a small containment phenomenological event tree (CPET), helpful for visually tracing out individual accident progressions, and of a large supporting event tree (LSET) for its detailed quantification. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor for construction of several event tree models describing the accident progressions, (2) Computer for quantification of the constructed event trees and graphical display of the resultant outputs, (3) Text Editor for preparation of input decks for quantification and utilization of calculational results, and (4) Mechanistic Code Plotter for utilization of results obtained from severe accident analysis codes. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty analysis, importance analysis, sensitivity analysis and data interpretation; reporting aspects including tabling and graphics; and a user-friendly interface. 10 refs. (Author)

  8. FDA (Food and Drug Administration) compliance program guidance manual and updates (FY 86). Section 4. Medical and radiological devices. Irregular report

    International Nuclear Information System (INIS)

    1986-01-01

    The FDA Compliance Program Guidance Manual provides a system for issuing and filing program plans and instructions directed to Food and Drug Administration Field operations for project implementation. Section IV provides those chapters of the Compliance Program Guidance Manual which pertain to the areas of medical and radiological devices. Some of the areas of coverage include laser and sunlamp standards inspections, compliance testing of various radiation-emitting products such as television receivers and microwave ovens, emergency response planning and policy, premarket approval and device manufacturers inspections, device problem reporting, sterilization of devices, and consumer education programs on medical and radiological devices

  9. Module type plant system dynamics analysis code (MSG-COPD). Code manual

    International Nuclear Information System (INIS)

    Sakai, Takaaki

    2002-11-01

    MSG-COPD is a module-type plant system dynamics analysis code which includes a multi-dimensional thermal-hydraulics calculation module to analyze pool-type fast breeder reactors. Explanations of each module and of the methods for preparing the input data are given in this code manual. (author)

  10. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    Science.gov (United States)

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  11. Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy

    Science.gov (United States)

    Sugiyama, Naruhisa; Shirakawa, Tomohiro

    2017-07-01

    The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that realizes the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.

  12. CORRELATIONS BETWEEN FINDINGS OF OCCLUSAL AND MANUAL ANALYSIS IN TMD-PATIENTS

    Directory of Open Access Journals (Sweden)

    Mariana Dimova

    2016-08-01

    Full Text Available The aim of this study was to investigate and analyze the possible correlations between findings from manual functional analysis and clinical occlusal analysis in TMD-patients. Material and methods: The material of this study comprised 111 TMD-patients selected after visual diagnostics, a brief functional screening according to Ahlers and Jakstatt, intraoral examination and recording of periodontal status. In the period September 2014 - March 2016, all patients were subjected to manual functional analysis and clinical occlusal analysis. Seventeen patients (10 women and 7 men) underwent imaging with cone-beam computed tomography. Results: Many statistically significant correlations were found between the tests of the structural analysis, indicating relationships between the findings. Conclusion: The presence of statistically significant correlations between occlusal relationships, freedom in centric, and the condition of the muscle complex of the masticatory system and the TMJ confirms the relationship between the state of the occlusal components and TMD.

  13. User's manual for the Composite HTGR Analysis Program (CHAP-1)

    International Nuclear Information System (INIS)

    Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.; Wecksung, M.J.; Willcutt, G.J.E. Jr.

    1977-03-01

    CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework

  14. Information to licensees regarding two NRC Inspection Manual sections on resolution of degraded and nonconforming conditions and on operability (Generic Letter 91-18)

    International Nuclear Information System (INIS)

    Partlow, J.G.

    1992-01-01

    The NRC staff has issued two sections to be included in Part 9900, Technical Guidance, of the NRC Inspection Manual. The first is "Resolution of Degraded and Nonconforming Conditions." The second is "Operable/Operability: Ensuring the Functional Capability of a System or Component." Copies of the additions to the NRC Inspection Manual are provided for information only. No specific licensee actions are required. The additions to the NRC Inspection Manual are based upon previously issued guidance. However, because of the complexity involved in operability determinations and the resolution of degraded and nonconforming conditions, there have been differences in application by NRC staff during past inspection activities. Thus, the purpose of publishing this guidance is to ensure consistency in application of this guidance by the NRC. Regional inspection personnel have been briefed on this guidance. The NRC will conduct further training on these topics to ensure uniform staff understanding

  15. STICAP: A linear circuit analysis program with stiff systems capability. Volume 1: Theory manual. [network analysis

    Science.gov (United States)

    Cooke, C. H.

    1975-01-01

    STICAP (Stiff Circuit Analysis Program) is a FORTRAN 4 computer program written for the CDC-6400-6600 computer series and SCOPE 3.0 operating system. It provides the circuit analyst a tool for automatically computing the transient responses and frequency responses of large linear time invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, making simple the task of using the program. Engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. Also, the program structure from a systems programmer's viewpoint is depicted and flow charts and other software documentation are given.
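
    STICAP's own integration algorithms are documented in the theory manual rather than in this summary; as a generic illustration of why implicit methods matter for stiff linear networks, the sketch below integrates the state equations x' = Ax + Bu(t) with backward Euler, which remains stable at step sizes far larger than the smallest time constant. This is not STICAP's actual algorithm, only an assumed, minimal stand-in.

```python
import numpy as np

# Backward Euler on a stiff linear state-space network x' = A x + B u(t).
# Generic illustration only; not STICAP's actual method.

def backward_euler(A, B, u, x0, h, steps):
    n = len(x0)
    M = np.eye(n) - h * A          # solve (I - hA) x_{k+1} = x_k + h B u(t_{k+1})
    xs, x = [np.asarray(x0, dtype=float)], np.asarray(x0, dtype=float)
    for k in range(1, steps + 1):
        rhs = x + h * B @ u(k * h)
        x = np.linalg.solve(M, rhs)
        xs.append(x)
    return np.array(xs)

# Stiff two-state example: time constants of 1 s and 1 ms.
A = np.array([[-1.0, 0.0], [0.0, -1000.0]])
B = np.array([[1.0], [1.0]])
u = lambda t: np.array([1.0])
x = backward_euler(A, B, u, x0=[0.0, 0.0], h=0.1, steps=50)
print(x[-1])   # approaches the DC solution [1.0, 0.001]
```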

  16. Decrease in use of manual vacuum aspiration in postabortion care in Malawi: a cross-sectional study from three public hospitals, 2008-2012.

    Directory of Open Access Journals (Sweden)

    Maria L Odland

    Full Text Available OBJECTIVES: To investigate the use of manual vacuum aspiration in postabortion care in Malawi between 2008-2012. METHODS: A retrospective cross-sectional study was done at the referral hospital Queen Elisabeth Central Hospital, and the two district hospitals of Chiradzulu and Mangochi. The data were collected simultaneously at the three sites from Feb-March 2013. All records available for women admitted to the gynaecological ward from 2008-2012 were reviewed. Women who had undergone surgical uterine evacuation after incomplete abortion were included and the use of manual vacuum aspiration versus sharp curettage was analysed. RESULTS: Altogether, 5121 women were included. One third (34.2%) of first trimester abortions were treated with manual vacuum aspiration, while all others were treated with sharp curettage. There were significant differences between the hospitals and between years. Overall there was an increase in the use of manual vacuum aspiration from 2008 (19.7%) to 2009 (31.0%), with a rapid decline after 2010 (28.5%), ending at only 4.9% in 2012. Conversely there was an increase in use of sharp curettage in all hospitals from 2010 to 2012. CONCLUSION: Use of manual vacuum aspiration as part of the postabortion care in Malawi is rather low, and decreased from 2010 to 2012, while the use of sharp curettage became more frequent. This is in contrast with current international guidelines.

  17. High prevalence of respiratory symptoms among workers in the development section of a manually operated coal mine in a developing country: A cross sectional study

    Directory of Open Access Journals (Sweden)

    Bråtveit Magne

    2007-02-01

    Full Text Available Abstract Background Few studies of miners have been carried out in African countries; most are from South Africa, where the working conditions are assumed to be better than in the rest of Africa. Several studies have focused on respiratory disorders among miners, but development workers responsible for creating underground road ways have not been studied explicitly. This is the first study assessing the associations between exposure to dust and quartz and respiratory symptoms among coal mine workers in a manually operated coal mine in Tanzania, focusing on development workers, as they have the highest exposure to coal dust. Methods A cross-sectional study was carried out among 250 production workers from a coal mine. Interviews were performed using modified standardized questionnaires to elicit information on occupational history, demographics, smoking habits and acute and chronic respiratory symptoms. The relationships between current dust exposure as well as cumulative respirable dust and quartz and symptoms were studied by group comparisons as well as logistic regression. Results Workers from the development group had the highest dust exposure, with arithmetic mean of 10.3 mg/m3 for current respirable dust and 1.268 mg/m3 for quartz. Analogous exposure results for mine workers were 0.66 mg/m3 and 0.03 mg/m3, respectively; and for other development workers were 0.88 mg/m3 and 0.10 mg/m3, respectively. The workers from the development section had significantly higher prevalence of the acute symptoms of dry cough (45.7%), breathlessness (34.8%) and blocked nose (23.9%). In addition, development workers had significantly more chronic symptoms of breathlessness (17.0%) than the mine workers (6.4%) and the other production workers (2.4%). The highest decile of cumulative exposure to respirable dust was significantly associated with cough (OR = 2.91, 95% CI 1.06, 7.97), as were cumulative exposure to quartz and cough (OR = 2.87, 95% CI 1.05, 7.88), compared with

  18. Analysis of manual material handling activity to increase work productivity (Case study: manufacturing company

    Directory of Open Access Journals (Sweden)

    Suryoputro Muhammad Ragil

    2018-01-01

    Full Text Available Manual material handling is one of the work activities that affect the physical condition of workers in the manufacturing industry, so it is necessary to analyse the risks of such activities. The analysis was performed on a worker carrying out manual lifting, both unaided and when using two aids (Automatic Handlift and Manual Handlift). In addition to the ergonomic analysis, a time study and a productivity measurement were carried out to determine the effects of the equipment. The Nordic Body Map (NBM) questionnaire for the worker using the Automatic Handlift showed a 22% decline in the level of musculoskeletal disorders. The REBA method gave a score of 10, which declined to 4 after using this aid. The MPL method showed a decline in the Fc value from 4756.37 N to 1346.56 N. The RWL method showed a decline in the LI (Lifting Index) values at origin and destination from 1.84 and 1.18 to 1.12 and 0.89, respectively. For the worker using the Manual Handlift, the NBM questionnaire showed a 57% decline in the level of musculoskeletal disorders. The REBA score of 8 was reduced to 5. For the MPL method, the Fc value of 4906.99 N was reduced to 2047.88 N. The RWL method showed a decline in the LI values at origin and destination from 1.02 and 0.67 to 0.74 and 0.58. The time study showed a decline in standard time when using the two aids, increasing productivity by 9% for the worker using the Automatic Handlift and by 4% for the worker using the Manual Handlift.
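
    The Lifting Index values quoted above are ratios of the actual load to a recommended weight limit; the sketch below shows the widely used revised NIOSH lifting equation (metric form), which may differ in detail from the RWL formulation applied in this study. The task geometry is hypothetical, and the frequency and coupling multipliers are passed in as inputs because they normally come from look-up tables.

```python
# Sketch of the revised NIOSH lifting equation (metric form). The task
# geometry below is hypothetical; FM and CM normally come from tables.

def recommended_weight_limit(H, V, D, A, FM=1.0, CM=1.0):
    """H: horizontal distance (cm), V: vertical height (cm),
    D: vertical travel distance (cm), A: asymmetry angle (degrees)."""
    LC = 23.0                      # load constant, kg
    HM = min(1.0, 25.0 / H)        # horizontal multiplier
    VM = 1.0 - 0.003 * abs(V - 75.0)
    DM = 0.82 + 4.5 / D
    AM = 1.0 - 0.0032 * A
    return LC * HM * VM * DM * AM * FM * CM

def lifting_index(load_kg, rwl_kg):
    return load_kg / rwl_kg        # LI > 1 indicates elevated risk

rwl = recommended_weight_limit(H=40, V=30, D=60, A=30, FM=0.94, CM=0.95)
print(round(rwl, 2), round(lifting_index(15.0, rwl), 2))
```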

  19. Quantitative X ray analysis system. User's manual and guide to X ray fluorescence technique

    International Nuclear Information System (INIS)

    2009-01-01

    This guide covers the trimmed and re-arranged version 3.6 of the Quantitative X ray Analysis System (QXAS) software package, which includes the most frequently used methods of quantitative analysis. QXAS is a comprehensive quantitative analysis package that has been developed by the IAEA through research and technical contracts. Additional development has also been carried out in the IAEA Laboratories in Seibersdorf, where QXAS was extensively tested. New in this version of the manual are the descriptions of Voigt-profile peak fitting, the backscatter fundamental parameters and emission-transmission methods of chemical composition analysis, an expanded chapter on X ray fluorescence physics, and a completely revised and increased number of practical examples of the utilization of the QXAS software package. The analytical data accompanying this manual were collected in the IAEA Seibersdorf Laboratories in the years 2006/2007

  20. BWR plant dynamic analysis code BWRDYN user's manual

    International Nuclear Information System (INIS)

    Yokobayashi, Masao; Yoshida, Kazuo; Fujiki, Kazuo

    1989-06-01

    The computer code BWRDYN has been developed for thermal-hydraulic analysis of a BWR plant. It can analyze various types of transients caused not only by small disturbances but also by large ones, such as operating mode changes and/or system malfunctions. The main analytical models of the BWRDYN code have been verified against measured data from an actual BWR plant. Furthermore, the installation of a BOP (Balance of Plant) model has made it possible to analyze the effect of the BOP on the reactor system. This report describes the analytical models of the BWRDYN code and provides instructions for its users. (author)

  1. Diagnostic Value of Manual and Computerized Methods of Dental Casts Analysis

    Directory of Open Access Journals (Sweden)

    H. Rahimi

    2009-06-01

    Full Text Available Objective: The aim of this study was to evaluate the validity of computerized and manual methods of dental cast analysis. Materials and Methods: Twenty set-ups of upper and lower casts using artificial teeth corresponding to various malocclusions were created for a diagnostic in vitro study. Tooth size values were measured on the isolated artificial teeth taken out of the set-ups; these results were considered the gold standard for tooth size. Arch width was calculated from the existing set-ups on the dentins. Impressions were taken of the casts with alginate and duplicated with dental stone. The models were measured manually with a digital caliper. Images of the occlusal views of the casts were then taken with a digital camera. Measurements were made on the digital images with the AutoCAD software. The results of the computerized and manual methods were compared with the gold standard. The intraclass correlation coefficient of reliability was used to measure the accuracy of the methods, and the Friedman technique was used to evaluate the significance of differences. Results: Results indicated that all measurements were highly correlated, e.g. gold standard and manual (0.9613-0.9991), gold standard and computerized (0.7118-0.9883), manual and computerized (0.6734-0.9914). Statistically significant differences were present between these methods (P<0.05), but they proved not to be clinically significant. Conclusion: Manual measurement is still the most accurate method when compared to computerized measurements, and the results of measurement by computer should be interpreted with caution.

  2. Integrated dynamic landscape analysis and modeling system (IDLAMS) : installation manual.

    Energy Technology Data Exchange (ETDEWEB)

    Li, Z.; Majerus, K. A.; Sundell, R. C.; Sydelko, P. J.; Vogt, M. C.

    1999-02-24

    The Integrated Dynamic Landscape Analysis and Modeling System (IDLAMS) is a prototype, integrated land management technology developed through a joint effort between Argonne National Laboratory (ANL) and the US Army Corps of Engineers Construction Engineering Research Laboratories (USACERL). Dr. Ronald C. Sundell, Ms. Pamela J. Sydelko, and Ms. Kimberly A. Majerus were the principal investigators (PIs) for this project. Dr. Zhian Li was the primary software developer. Dr. Jeffrey M. Keisler, Mr. Christopher M. Klaus, and Mr. Michael C. Vogt developed the decision analysis component of this project. It was developed with funding support from the Strategic Environmental Research and Development Program (SERDP), a land/environmental stewardship research program with participation from the US Department of Defense (DoD), the US Department of Energy (DOE), and the US Environmental Protection Agency (EPA). IDLAMS predicts land conditions (e.g., vegetation, wildlife habitats, and erosion status) by simulating changes in military land ecosystems for given training intensities and land management practices. It can be used by military land managers to help predict the future ecological condition for a given land use based on land management scenarios of various levels of training intensity. It also can be used as a tool to help land managers compare different land management practices and further determine a set of land management activities and prescriptions that best suit the needs of a specific military installation.

  3. Comparison of urine analysis using manual and sedimentation methods.

    Science.gov (United States)

    Kurup, R; Leich, M

    2012-06-01

    Microscopic examination of urine sediment is an essential part of the evaluation of renal and urinary tract diseases. Traditionally, urine sediments are assessed by microscopic examination of centrifuged urine. However, the current method used by the Georgetown Public Hospital Corporation Medical Laboratory involves uncentrifuged urine. To encourage a high level of care, the results provided to the physician must be accurate and reliable for proper diagnosis. The aim of this study was to determine whether the centrifuged method is more clinically significant than the uncentrifuged method. In this study, the results obtained from the centrifuged and uncentrifuged methods were compared. A total of 167 urine samples were randomly collected and analysed during the period April-May 2010 at the Medical Laboratory, Georgetown Public Hospital Corporation. The urine samples were first analysed microscopically by the uncentrifuged method, and then by the centrifuged method. The results obtained from both methods were recorded in a log book. These results were then entered into a database created in Microsoft Excel and analysed for differences and similarities using this application. Analysis was further done in SPSS software to compare the results using Pearson's correlation. When compared using Pearson's correlation coefficient analysis, both methods showed a good correlation between urinary sediments, with the exception of white blood cells. The centrifuged method had a slightly higher identification rate for all of the parameters. There is substantial agreement between the centrifuged and uncentrifuged methods. However, the uncentrifuged method provides a rapid turnaround time.

  4. An analysis of the process and results of manual geocode correction

    Science.gov (United States)

    McDonald, Yolanda J.; Schwind, Michael; Goldberg, Daniel W.; Lampley, Amanda; Wheeler, Cosette M.

    2018-01-01

    Geocoding is the science and process of assigning geographical coordinates (i.e. latitude, longitude) to a postal address. The quality of the geocode can vary dramatically depending on several variables, including incorrect input address data, missing address components, and spelling mistakes. A dataset with a considerable number of geocoding inaccuracies can potentially result in an imprecise analysis and invalid conclusions. There has been little quantitative analysis of the amount of effort (i.e. time) to perform geocoding correction, and how such correction could improve geocode quality type. This study used a low-cost and easy to implement method to improve geocode quality type of an input database (i.e. addresses to be matched) through the processes of manual geocode intervention, and it assessed the amount of effort to manually correct inaccurate geocodes, reported the resulting match rate improvement between the original and the corrected geocodes, and documented the corresponding spatial shift by geocode quality type resulting from the corrections. Findings demonstrated that manual intervention of geocoding resulted in a 90% improvement of geocode quality type, took 42 hours to process, and the spatial shift ranged from 0.02 to 151,368 m. This study provides evidence to inform research teams considering the application of manual geocoding intervention that it is a low-cost and relatively easy process to execute. PMID:28555477

  5. An analysis of the process and results of manual geocode correction

    Directory of Open Access Journals (Sweden)

    Yolanda J. McDonald

    2017-05-01

    Full Text Available Geocoding is the science and process of assigning geographical coordinates (i.e. latitude, longitude) to a postal address. The quality of the geocode can vary dramatically depending on several variables, including incorrect input address data, missing address components, and spelling mistakes. A dataset with a considerable number of geocoding inaccuracies can potentially result in an imprecise analysis and invalid conclusions. There has been little quantitative analysis of the amount of effort (i.e. time) to perform geocoding correction, and how such correction could improve geocode quality type. This study used a low-cost and easy to implement method to improve geocode quality type of an input database (i.e. addresses to be matched) through the processes of manual geocode intervention, and it assessed the amount of effort to manually correct inaccurate geocodes, reported the resulting match rate improvement between the original and the corrected geocodes, and documented the corresponding spatial shift by geocode quality type resulting from the corrections. Findings demonstrated that manual intervention of geocoding resulted in a 90% improvement of geocode quality type, took 42 hours to process, and the spatial shift ranged from 0.02 to 151,368 m. This study provides evidence to inform research teams considering the application of manual geocoding intervention that it is a low-cost and relatively easy process to execute.
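
    The spatial shifts reported above are distances between the original and the manually corrected coordinates; a minimal sketch of computing such a shift with the haversine great-circle formula is given below (the coordinate pair is hypothetical).

```python
import math

# Great-circle (haversine) distance between an original geocode and its
# manually corrected geocode, in metres. Coordinates below are hypothetical.

def haversine_m(lat1, lon1, lat2, lon2, r=6_371_000.0):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

original = (35.0844, -106.6504)    # geocoder output
corrected = (35.0851, -106.6512)   # after manual correction
print(round(haversine_m(*original, *corrected), 1), "m shift")
```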

  6. User's manual of a support system for human reliability analysis

    International Nuclear Information System (INIS)

    Yokobayashi, Masao; Tamura, Kazuo.

    1995-10-01

    Many kinds of human reliability analysis (HRA) methods have been developed. However, users must be skilled to apply them, and complicated work such as drawing event trees (ET) and calculating uncertainty bounds is also required. Moreover, no single method is complete enough on its own to evaluate human reliability. Therefore, a personal computer (PC) based support system for HRA has been developed to perform HRA practically and efficiently. The system consists of two methods, namely a simple one and a detailed one. The former uses ASEP, a simplified THERP technique, while the latter uses a combined method of OAT and HRA-ET/DeBDA. Users can select a suitable method for their purpose. Human error probability (HEP) data were collected and compiled into a database for use by the support system. This paper describes the outline of the HRA methods, the support functions and the user's guide of the system. (author)

  7. Cleaning capacity promoted by motor-driven or manual instrumentation using ProTaper Universal system: Histological analysis.

    Science.gov (United States)

    da Frota, Matheus Franco; Filho, Idomeo Bonetti; Berbert, Fábio Luiz Camargo Villela; Sponchiado, Emilio Carlos; Marques, André Augusto Franco; Garcia, Lucas da Fonseca Roberti

    2013-01-01

    The aim of this study was to assess the cleaning capacity of the Protaper system using motor-driven or manual instrumentation. Ten mandibular molars were randomly separated into 2 groups (n = 5) according to the type of instrumentation performed, as follows: Group 1 - instrumentation with rotary nickel-titanium (Ni-Ti) files using ProTaper Universal System (Dentsply/Maillefer); and, Group 2 - instrumentation with Ni-Ti hand files using ProTaper Universal (Dentsply-Maillefer). Afterwards, the teeth were sectioned transversely and submitted to histotechnical processing to obtain histological sections for microscopic evaluation. The images were analyzed by the Corel Photo-Paint X5 program (Corel Corporation) using an integration grid superimposed on the image. Statistical analysis (U-Mann-Whitney - P < 0.05) demonstrated that G1 presented higher cleaning capacity when compared to G2. The rotary technique presented better cleaning results in the apical third of the root canal system when compared to the manual technique.

  8. Oscillatory neuronal dynamics associated with manual acupuncture: a magnetoencephalography study using beamforming analysis

    Directory of Open Access Journals (Sweden)

    Aziz eAsghar

    2012-11-01

    Full Text Available Magnetoencephalography (MEG) enables non-invasive recording of neuronal activity, with reconstruction methods providing estimates of underlying brain source locations and oscillatory dynamics from externally recorded neuromagnetic fields. The aim of our study was to use MEG to determine the effect of manual acupuncture on neuronal oscillatory dynamics. A major problem in MEG investigations of manual acupuncture is the absence of onset times for each needle manipulation. Given that beamforming (spatial filtering) analysis is not dependent upon stimulus-driven responses being phase-locked to stimulus onset, we postulated that beamforming could reveal source locations and induced changes in neuronal activity during manual acupuncture. In a beamformer analysis, a two-minute period of manual acupuncture needle manipulation delivered to the ipsilateral right LI-4 (Hegu) acupoint was contrasted with a two-minute baseline period. We considered oscillatory power changes in the theta (4-8 Hz), alpha (8-13 Hz), beta (13-30 Hz) and gamma (30-100 Hz) frequency bands. We found significant decreases in beta band power in the contralateral primary somatosensory cortex and superior frontal gyrus. In the ipsilateral cerebral hemisphere, we found significant power decreases in beta and gamma frequency bands in only the superior frontal gyrus. No significant power modulations were found in theta and alpha bands. Our results indicate that beamforming is a useful analytical tool to reconstruct underlying neuronal activity associated with manual acupuncture. Our main finding was of beta power decreases in primary somatosensory cortex and superior frontal gyrus, which opens up a line of future investigation regarding whether this contributes towards an underlying mechanism of acupuncture.
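
    For readers unfamiliar with the technique, the spatial filter behind this kind of source reconstruction is typically a linearly constrained minimum-variance (LCMV) weight vector; the textbook form is sketched below as an illustration only, since the abstract does not state which beamformer variant was used (C is the sensor covariance matrix and l(r) the lead field of a source at location r).

```latex
% Standard LCMV beamformer weights and source power estimate (illustrative).
\mathbf{w}(\mathbf{r}) \;=\;
  \frac{\mathbf{C}^{-1}\,\mathbf{l}(\mathbf{r})}
       {\mathbf{l}(\mathbf{r})^{\mathsf T}\,\mathbf{C}^{-1}\,\mathbf{l}(\mathbf{r})},
\qquad
\hat{P}(\mathbf{r}) \;=\; \mathbf{w}(\mathbf{r})^{\mathsf T}\,\mathbf{C}\,\mathbf{w}(\mathbf{r})
```

    Contrasting \hat{P} computed from the acupuncture and baseline data segments in a given frequency band yields power changes of the kind reported above, without requiring stimulus-locked onset times.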

  9. A kinetic analysis of manual wheelchair propulsion during start-up on select indoor and outdoor surfaces

    NARCIS (Netherlands)

    Koontz, AM; Cooper, RA; Boninger, ML; Yang, YS; Impink, BG; van der Woude, LHV

    2005-01-01

    The objective of this study was to conduct a kinetic analysis of manual wheelchair propulsion during start-up on select indoor and outdoor surfaces. Eleven manual wheelchairs were fitted with a SMARTWheel and their users were asked to push on a course consisting of high- and low-pile carpet,

  10. A Manual of Style.

    Science.gov (United States)

    Nebraska State Dept. of Education, Lincoln.

    This "Manual of Style" is offered as a guide to assist Nebraska State employees in producing quality written communications and in presenting a consistently professional image of government documents. The manual is not designed to be all-inclusive. Sections of the manual discuss formatting documents, memorandums, letters, mailing…

  11. Transportation Routing Analysis Geographic Information System (TRAGIS) User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, PE

    2003-09-18

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model is used to calculate highway, rail, or waterway routes within the United States. TRAGIS is a client-server application with the user interface and map data files residing on the user's personal computer and the routing engine and network data files on a network server. The user's manual provides documentation on installation and the use of the many features of the model.

  12. Intra-observer reliability and agreement of manual and digital orthodontic model analysis.

    Science.gov (United States)

    Koretsi, Vasiliki; Tingelhoff, Linda; Proff, Peter; Kirschneck, Christian

    2018-01-23

    Digital orthodontic model analysis is gaining acceptance in orthodontics, but its reliability is dependent on the digitalisation hardware and software used. We thus investigated intra-observer reliability and agreement / conformity of a particular digital model analysis work-flow in relation to traditional manual plaster model analysis. Forty-eight plaster casts of the upper/lower dentition were collected. Virtual models were obtained with orthoX®scan (Dentaurum) and analysed with ivoris®analyze3D (Computer konkret). Manual model analyses were done with a dial caliper (0.1 mm). Common parameters were measured on each plaster cast and its virtual counterpart five times each by an experienced observer. We assessed intra-observer reliability within method (ICC), agreement/conformity between methods (Bland-Altman analyses and Lin's concordance correlation), and changing bias (regression analyses). Intra-observer reliability was substantial within each method (ICC ≥ 0.7), except for five manual outcomes (12.8 per cent). Bias between methods was statistically significant, but less than 0.5 mm for 87.2 per cent of the outcomes. In general, larger tooth sizes were measured digitally. Total difference maxilla and mandible had wide limits of agreement (-3.25/6.15 and -2.31/4.57 mm), but bias between methods was mostly smaller than intra-observer variation within each method with substantial conformity of manual and digital measurements in general. No changing bias was detected. Although both work-flows were reliable, the investigated digital work-flow proved to be more reliable and yielded on average larger tooth sizes. Averaged differences between methods were within 0.5 mm for directly measured outcomes but wide ranges are expected for some computed space parameters due to cumulative error. © The Author 2017. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com
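
    The Bland-Altman agreement statistics used in such comparisons reduce to a mean bias and its 95% limits of agreement; a minimal sketch of that computation for paired manual and digital measurements is given below (the example values are hypothetical, not the study's data).

```python
import statistics as st

# Bland-Altman bias and 95% limits of agreement for paired measurements.
# The paired manual/digital values below are hypothetical.

def bland_altman(manual, digital):
    diffs = [d - m for m, d in zip(manual, digital)]
    bias = st.mean(diffs)
    sd = st.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

manual  = [7.1, 6.8, 9.0, 8.4, 7.7]
digital = [7.3, 7.0, 9.1, 8.6, 8.0]
print(bland_altman(manual, digital))
```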

  13. An approach to standardization of urine sediment analysis via suggestion of a common manual protocol.

    Science.gov (United States)

    Ko, Dae-Hyun; Ji, Misuk; Kim, Sollip; Cho, Eun-Jung; Lee, Woochang; Yun, Yeo-Min; Chun, Sail; Min, Won-Ki

    2016-01-01

    The results of urine sediment analysis have been reported semiquantitatively. However, as recent guidelines recommend quantitative reporting of urine sediment, and with the development of automated urine sediment analyzers, there is an increasing need for quantitative analysis of urine sediment. Here, we developed a protocol for urine sediment analysis and quantified the results. Based on questionnaires, various reports, guidelines, and experimental results, we developed a protocol for urine sediment analysis. The results of this new protocol were compared with those obtained with a standardized chamber and an automated sediment analyzer. Reference intervals were also estimated using the new protocol. We developed a protocol with centrifugation at 400 g for 5 min, with an average concentration factor of 30. The correlations between the quantitative results of urine sediment analysis, the standardized chamber, and the automated sediment analyzer were generally good. The conversion factor derived from the new protocol showed a better fit with the results of manual counting than the default conversion factor in the automated sediment analyzer. We developed a protocol for manual urine sediment analysis to report the results quantitatively. This protocol may provide a means for the standardization of urine sediment analysis.
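
    The conversion factor mentioned above links raw chamber counts on the concentrated sediment back to cells per microlitre of native urine; the sketch below shows that arithmetic, where the chamber field volume and the counts are hypothetical and the 30x concentration factor follows the protocol's stated average.

```python
# Convert cells counted in the concentrated sediment back to cells per
# microlitre of native urine. Chamber geometry and counts are hypothetical;
# the 30x concentration factor is the protocol's stated average.

def cells_per_uL_native(cells_counted, fields_counted, field_volume_uL,
                        concentration_factor=30.0):
    cells_per_uL_sediment = cells_counted / (fields_counted * field_volume_uL)
    return cells_per_uL_sediment / concentration_factor

# e.g. 120 cells counted over 10 chamber fields of 0.1 uL each
print(round(cells_per_uL_native(120, 10, 0.1), 1), "cells/uL of native urine")
```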

  14. Comparison of manual & automated analysis methods for corneal endothelial cell density measurements by specular microscopy.

    Science.gov (United States)

    Huang, Jianyan; Maram, Jyotsna; Tepelus, Tudor C; Modak, Cristina; Marion, Ken; Sadda, SriniVas R; Chopra, Vikas; Lee, Olivia L

    2017-08-07

    To determine the reliability of corneal endothelial cell density (ECD) obtained by automated specular microscopy versus that of validated manual methods, and the factors that predict such reliability. Sharp central images from 94 control and 106 glaucomatous eyes were captured with the Konan specular microscope NSP-9900. All images were analyzed by trained graders using Konan CellChek Software, employing the fully- and semi-automated methods as well as the Center Method. Images with low cell count (input cell number <100) and/or guttata were analyzed with both the Center and Flex-Center Methods. ECDs were compared and absolute error was used to assess variation. The effect on ECD of age, cell count, cell size, and cell size variation was evaluated. No significant difference was observed between the Center and Flex-Center Methods in corneas with guttata (p=0.48) or low ECD (p=0.11). No difference (p=0.32) was observed in ECD of normal controls <40 yrs old between the fully-automated method and the manual Center Method. However, in older controls and glaucomatous eyes, ECD was overestimated by the fully-automated method (p=0.034) and the semi-automated method (p=0.025) as compared to the manual method. Our findings show that automated analysis significantly overestimates ECD in eyes with high polymegathism and/or large cell size, compared to the manual method. Therefore, we discourage reliance upon the fully-automated method alone to perform specular microscopy analysis, particularly if an accurate ECD value is imperative. Copyright © 2017. Published by Elsevier España, S.L.U.

  15. Comparative analysis among several cross section sets

    International Nuclear Information System (INIS)

    Caldeira, A.D.

    1983-01-01

    Critical parameters were calculated using one-dimensional multigroup transport theory for several cross section sets. Calculations were performed for water mixtures of uranium metal, plutonium metal and uranium-thorium oxide, and for metallic systems, to determine the critical dimensions of the geometries (sphere and cylinder). For this purpose, the following cross section sets were employed: 1) multigroup cross section sets obtained from the GAMTEC-II code; 2) the HANSEN-ROACH cross section sets; 3) cross section sets from ENDF/B-IV, processed by the NJOY code. Finally, we also calculated the corresponding critical radii using the one-dimensional multigroup transport DTF-IV code. The numerical results agree within a few percent with the critical values reported in the literature (the greatest discrepancy occurred in the critical dimensions of the water mixtures calculated with the values generated by the NJOY code), a very good result in comparison with similar works. (Author) [pt

  16. Solutions manual to accompany An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, James F

    2014-01-01

    A solutions manual to accompany An Introduction to Numerical Methods and Analysis, Second Edition An Introduction to Numerical Methods and Analysis, Second Edition reflects the latest trends in the field, includes new material and revised exercises, and offers a unique emphasis on applications. The author clearly explains how to both construct and evaluate approximations for accuracy and performance, which are key skills in a variety of fields. A wide range of higher-level methods and solutions, including new topics such as the roots of polynomials, sp

  17. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  18. RELAP4/MOD5: a computer program for transient thermal-hydraulic analysis of nuclear reactors and related systems. User's manual. Volume II. Program implementation

    International Nuclear Information System (INIS)

    1976-09-01

    This portion of the RELAP4/MOD5 User's Manual presents the details of setting up and entering the reactor model to be evaluated. The input card format and arrangement are presented in depth, including not only cards for data but also those for editing and restarting. Problem initialization, including pressure distribution and energy balance, is discussed. A section entitled "User Guidelines" is included to provide modeling recommendations, analysis and verification techniques, and guidance on resolving computational difficulties. The section concludes with a discussion of the form and format of the computer output

  19. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91OO8. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
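    To give a flavour of the "gambler's ruin" style of investment risk analysis listed among these tools, here is a hedged Monte Carlo sketch: it estimates the probability that a fixed exploration budget is exhausted before the first success. The budget, well cost, and success probability are invented parameters; this is not the DOE package's algorithm.

```python
# Monte Carlo sketch of "gambler's ruin" risk for a drilling program:
# probability that the budget runs out before the first successful well.
# All parameter values are hypothetical.
import numpy as np

def ruin_probability(budget, well_cost, p_success, n_sims=100_000, seed=1):
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_sims):
        capital = budget
        while capital >= well_cost:
            capital -= well_cost          # drill one well
            if rng.random() < p_success:
                break                     # first success reached, not ruined
        else:
            ruined += 1                   # budget exhausted without a success
    return ruined / n_sims

print(ruin_probability(budget=10e6, well_cost=2e6, p_success=0.2))  # ~0.8**5
```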

  20. Diagnostic performance of 64-section CT using CT gastrography in preoperative T staging of gastric cancer according to 7th edition of AJCC cancer staging manual

    International Nuclear Information System (INIS)

    Kim, Jin Woong; Shin, Sang Soo; Heo, Suk Hee; Lim, Hyo Soon; Jeong, Yong Yeon; Kang, Heoung Keun; Choi, Yoo Duk; Park, Young Kyu; Park, Chang Hwan

    2012-01-01

    To evaluate the accuracy of 64-section multidetector CT with CT gastrography for determining the depth of mural invasion in patients with gastric cancer according to the 7th edition of the AJCC cancer staging manual. A total of 127 patients with gastric cancer and who had undergone both esophago-gastro-duodenoscopy and 64-section CT were included in this study. Two radiologists independently reviewed the preoperative CT images with respect to the detectability and T-staging of the gastric cancers. The sensitivity, specificity, accuracy and overall accuracy of each reviewer for the T staging of gastric cancer were calculated. Overall, gastric cancer was detected in 123 (96.9%) of the 127 cancers on the CT images. Reviewer 1 correctly staged 98 gastric cancers, and reviewer 2 correctly classified 105 gastric cancers. The overall diagnostic accuracy of the T staging was 77.2% (98/127) for reviewer 1 and 82.7% (105/127) for reviewer 2. 64-section CT using CT gastrography showed a reasonable diagnostic performance for determining the T staging in patients with gastric cancer according to the 7th edition of the AJCC cancer staging manual. (orig.)

  1. VIPRE-01: a thermal-hydraulic analysis code for reactor cores. Volume 2. User's manual

    International Nuclear Information System (INIS)

    Cuta, J.M.; Koontz, A.S.; Stewart, C.W.; Montgomery, S.D.

    1983-04-01

    VIPRE (Versatile Internals and Component Program for Reactors; EPRI) has been developed for nuclear power utility thermal-hydraulic analysis applications. It is designed to help evaluate nuclear energy reactor core safety limits including minimum departure from nucleate boiling ratio (MDNBR), critical power ratio (CPR), fuel and clad temperatures, and coolant state in normal operation and assumed accident conditions. This volume (Volume 2: User's Manual) describes the input requirements of VIPRE and its auxiliary programs, SPECSET, ASP and DECCON, and lists the input instructions for each code

  2. Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual

    Science.gov (United States)

    Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.

    1990-01-01

    This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems to demonstrate various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.
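    As a rough companion to the parameter-estimation step mentioned above, the following sketch fits a two-parameter Weibull distribution to fracture-strength data by maximum likelihood. The bisection solver and the strength values are illustrative assumptions; this is not CARES' own estimation routine or data.

```python
# Two-parameter Weibull maximum-likelihood fit to fracture-strength data.
# Strength values are made up for illustration.
import numpy as np

def weibull_mle(strengths, tol=1e-8):
    x = np.asarray(strengths, dtype=float)
    lnx = np.log(x)
    def g(m):                                 # MLE condition g(m) = 0, increasing in m
        xm = x ** m
        return (xm * lnx).sum() / xm.sum() - 1.0 / m - lnx.mean()
    lo, hi = 0.01, 100.0                      # bracket for the Weibull modulus m
    while hi - lo > tol:                      # bisection
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    m = 0.5 * (lo + hi)
    sigma0 = np.mean(x ** m) ** (1.0 / m)     # characteristic strength
    return m, sigma0

strengths = [312, 335, 348, 360, 371, 384, 396, 410, 425, 447]  # MPa, hypothetical
m, s0 = weibull_mle(strengths)
print(f"Weibull modulus m = {m:.2f}, characteristic strength = {s0:.1f} MPa")
```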

  3. Robotic-Assisted Versus Manual Prostatic Arterial Embolization for Benign Prostatic Hyperplasia: A Comparative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bagla, Sandeep, E-mail: sandeep.bagla@gmail.com [Vascular Institute of Virginia, LLC (United States); Smirniotopoulos, John [New York Presbyterian Hospital/Weill Cornell Medical Center (United States); Orlando, Julie C.; Piechowiak, Rachel [Vascular Institute of Virginia, LLC (United States)

    2017-03-15

    Purpose: Prostatic artery embolization (PAE) is a safe and efficacious procedure for benign prostatic hyperplasia (BPH), though it is technically challenging. We present our experience of the technical and clinical outcomes of robotic and manual PAE in patients with BPH. Materials and Methods: IRB-approved retrospective study of 40 consecutive patients 49–81 years old with moderate or severe grade BPH from May 2014 to July 2015: 20 robotic-assisted PAE (group 1), 20 manual PAE (group 2). Robotic-assisted PAE was performed using the Magellan Robotic System. American Urological Association (AUA-SI) score, cost, technical and clinical success, radiation dose, fluoroscopy, and procedure time were reviewed. Statistical analysis was performed within and between each group using the paired t test and one-way analysis of variance, respectively, at 1 and 3 months. Results: There were no significant baseline differences in age and AUA-SI between groups. Technical success was 100% (group 1) and 95% (group 2). One unsuccessful subject from group 2 returned for a successful embolization using robotic assistance. Fluoroscopy and procedural times were similar between groups, with a non-significant lower patient radiation dose in group 1 (30,632.8 mGy/cm² vs 35,890.9, p = 0.269). Disposable cost was significantly different between groups, with robotic-assisted PAE incurring a higher cost (group 1 $4530.2; group 2 $1588.5, p < 0.0001). Clinical improvement was significant in both arms at 3 months: group 1 mean change in AUA-SI of 8.3 (p = 0.006); group 2: 9.6 (p < 0.0001). No minor or major complications occurred. Conclusions: Robotic-assisted PAE offers technical success comparable to manual PAE, with similar clinical improvement at an increased cost.

  4. Robotic-Assisted Versus Manual Prostatic Arterial Embolization for Benign Prostatic Hyperplasia: A Comparative Analysis

    International Nuclear Information System (INIS)

    Bagla, Sandeep; Smirniotopoulos, John; Orlando, Julie C.; Piechowiak, Rachel

    2017-01-01

    Purpose: Prostatic artery embolization (PAE) is a safe and efficacious procedure for benign prostatic hyperplasia (BPH), though it is technically challenging. We present our experience of the technical and clinical outcomes of robotic and manual PAE in patients with BPH. Materials and Methods: IRB-approved retrospective study of 40 consecutive patients 49–81 years old with moderate or severe grade BPH from May 2014 to July 2015: 20 robotic-assisted PAE (group 1), 20 manual PAE (group 2). Robotic-assisted PAE was performed using the Magellan Robotic System. American Urological Association (AUA-SI) score, cost, technical and clinical success, radiation dose, fluoroscopy, and procedure time were reviewed. Statistical analysis was performed within and between each group using the paired t test and one-way analysis of variance, respectively, at 1 and 3 months. Results: There were no significant baseline differences in age and AUA-SI between groups. Technical success was 100% (group 1) and 95% (group 2). One unsuccessful subject from group 2 returned for a successful embolization using robotic assistance. Fluoroscopy and procedural times were similar between groups, with a non-significant lower patient radiation dose in group 1 (30,632.8 mGy/cm² vs 35,890.9, p = 0.269). Disposable cost was significantly different between groups, with robotic-assisted PAE incurring a higher cost (group 1 $4530.2; group 2 $1588.5, p < 0.0001). Clinical improvement was significant in both arms at 3 months: group 1 mean change in AUA-SI of 8.3 (p = 0.006); group 2: 9.6 (p < 0.0001). No minor or major complications occurred. Conclusions: Robotic-assisted PAE offers technical success comparable to manual PAE, with similar clinical improvement at an increased cost.

  5. Operation and management manual of JT-60 experimental data analysis system

    International Nuclear Information System (INIS)

    Hirayama, Takashi; Morishima, Soichi

    2014-03-01

    In the Japan Atomic Energy Agency Naka Fusion Institute, many experiments have been conducted using the large tokamak device JT-60 with the aim of realizing a fusion power plant. In order to optimize the JT-60 experiments and to investigate the complex characteristics of plasma, the JT-60 experimental data analysis system was developed and used for collecting, referencing and analyzing the JT-60 experimental data. The main components of the system are a data analysis server and a database server, for the analysis and accumulation of the experimental data respectively. Other peripheral devices of the system are magnetic disk units, an NAS (Network Attached Storage) device, and a backup tape drive. This is an operation and management manual of the JT-60 experimental data analysis system. (author)

  6. A national cross-sectional study in the Danish wood and furniture industry on working postures and manual materials handling.

    Science.gov (United States)

    Christensen, H; Pedersen, M B; Sjøgaard, G

    1995-04-01

    Musculoskeletal disorders constitute a major problem in the wood and furniture industry and identification of risk factors is urgently needed. Therefore, exposures to different work tasks and variation in the job were recorded based on an observation survey in combination with an interview among 281 employees working in wood working and painting departments. A questionnaire survey confirmed high frequencies of symptoms from the musculoskeletal system: the one-year prevalence of symptoms from the low back was 42% and of symptoms from the neck/shoulder 40%. The exposure was evaluated based on: (1) classification of work tasks, (2) work cycle time, (3) manual materials handling, (4) working postures, and (5) variation in the job. Among the employees, 47% performed feeding or clearing of machines, 35% performed wood working or painting of materials, and 18% performed various other operations. Among the employees, 20% had no variation in their job while 44% had little variation. Manual materials handling of 375 different burdens was observed, which most often occurred during feeding or clearing of machines. The weight of burdens lifted was 0.5-87.0 kg, where 2% had a weight of more than 50 kg. Among the lifting conditions, 30% were evaluated as implying a risk of injury. An additional risk factor was the high total tonnage lifted per day, which was estimated to range from 132 kg to 58,800 kg. Working postures implied a risk of injury due to prolonged forward and lateral flexions of the neck, which was seen most frequently during wood working or painting of materials. These data substantiate the finding that feeding or clearing of machines mainly implies a risk of injury to the low back, whereas wood working or painting of materials mainly implies a risk of injury to the neck and shoulder area. Optimal strategies for job redesign may be worked out by using these data in order to prevent occupational musculoskeletal disorders.

  7. Reliability assessment of a manual-based procedure towards learning curve modeling and fmea analysis

    Directory of Open Access Journals (Sweden)

    Gustavo Rech

    2013-03-01

    Separation procedures in drug Distribution Centers (DC) are manual-based activities prone to failures such as shipping exchanged, expired or broken drugs to the customer. Two interventions seem promising in improving the reliability of the separation procedure: (i) selection and allocation of appropriate operators to the procedure, and (ii) analysis of potential failure modes incurred by the selected operators. This article integrates Learning Curves (LC) and FMEA (Failure Mode and Effect Analysis) with the aim of reducing the occurrence of failures in the manual separation of a drug DC. LC parameters enable generating an index to identify the operators recommended to perform the procedures. The FMEA is then applied to the separation procedure carried out by the selected operators in order to identify failure modes. The traditional FMEA severity index is also deployed into two sub-indexes, related to financial issues and to damage to the company's image, in order to characterize failure severity. When applied to a drug DC, the proposed method significantly reduced the frequency and severity of failures in the separation procedure.
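    For readers unfamiliar with the two ingredients combined in this record, the sketch below shows a Wright-type learning-curve prediction and an FMEA risk priority number whose severity is split into financial and image sub-indexes. The functional forms, the equal weighting of the two severity sub-indexes, and all numeric values are assumptions made for illustration; they are not the article's fitted model.

```python
# Illustrative learning-curve and FMEA building blocks (hypothetical values).
import math

def wright_time(t_first, learning_rate, n):
    """Time for the n-th repetition under Wright's learning-curve model."""
    b = math.log(learning_rate, 2)            # e.g. 0.85 -> b ≈ -0.234
    return t_first * n ** b

def rpn(occurrence, detection, severity_financial, severity_image):
    """FMEA risk priority number with severity deployed into two sub-indexes."""
    severity = 0.5 * (severity_financial + severity_image)   # assumed equal weighting
    return occurrence * detection * severity

print(wright_time(t_first=120.0, learning_rate=0.85, n=50))   # seconds per pick
print(rpn(occurrence=4, detection=6, severity_financial=7, severity_image=3))
```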

  8. Detailed description and user`s manual of high burnup fuel analysis code EXBURN-I

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Saitou, Hiroaki

    1997-11-01

    EXBURN-I has been developed for the analysis of LWR high burnup fuel behavior in normal operation and power transient conditions. In the high burnup region, phenomena occur which are different in quality from those expected for the extension of behaviors in the mid-burnup region. To analyze these phenomena, EXBURN-I has been formed by the incorporation of such new models as pellet thermal conductivity change, burnup-dependent FP gas release rate, and cladding oxide layer growth to the basic structure of low- and mid-burnup fuel analysis code FEMAXI-IV. The present report describes in detail the whole structure of the code, models, and materials properties. Also, it includes a detailed input manual and sample output, etc. (author). 55 refs.

  9. Applications of the BEam Cross section Analysis Software (BECAS)

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir

    2013-01-01

    A newly developed framework is presented for structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used for the generation of beam finite element models which correctly account for effects stemming from material anisotropy and inhomogeneity in cross sections of arbitrary geometry. This type of modelling approach allows for an accurate yet computationally inexpensive representation of a general class of three-dimensional beam-like structures.

  10. Computer analysis of digital sky surveys using citizen science and manual classification

    Science.gov (United States)

    Kuminski, Evan; Shamir, Lior

    2015-01-01

    As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to be able to efficiently analyze these data. An effective way to do this is through manual analysis; however, this may be insufficient considering the extremely vast pipelines of astronomical images generated by the present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual or small groups of scientists can. While citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of the billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot scale to the masses of data collected by digital sky surveys, it is clear that in order to keep up with the growing databases some form of automation of the data analysis will be required, working either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed to analyze some of the morphological features, such as the number of spiral arms, and provided an accuracy of just ~36%.

  11. Ergonomic analysis in the manual transport of loads: a case study in a cement production company

    Directory of Open Access Journals (Sweden)

    Jarbas Rocha Martins

    2017-03-01

    This study aims to analyze the manual transport of 50 kg bags in a cement production company by means of ergonomic work analysis (EWA), identifying environmental factors and loaders’ working conditions. For the EWA, photographic records, videos, and interviews were collected. The analysis covered aspects related to lifting, transportation and handling of cement bags, as well as observations of the loaders’ posture while performing the activity. Both qualitative and quantitative aspects were taken into account, supported by the use of instruments for determining the concentration of agents in the environment. In addition, some management tools, such as the cause and effect diagram, and interviews conducted during the field research were used. It was found that the high rate of absenteeism in the activity is related to the strict control of productivity, the excessive lifting and manual transport of weights, and repetitive trunk flexion movements. The results of the EWA made it possible to propose preventive measures and some changes in the work environment.

  12. Analysis of kinematic, kinetic and electromyographic patterns during root canal preparation with rotary and manual instruments

    Directory of Open Access Journals (Sweden)

    Braulio Pasternak-Júnior

    2012-02-01

    OBJECTIVE: This study assessed the muscular activity during root canal preparation through kinematics, kinetics, and electromyography (EMG). MATERIAL AND METHODS: The operators prepared one canal with RaCe rotary instruments and another with Flexo-files. The kinematics of the major joints was reconstructed using an optoelectronic system and electromyographic responses of the flexor carpi radialis, extensor carpi radialis, brachioradialis, biceps brachii, triceps brachii, middle deltoid, and upper trapezius were recorded. The joint torques of the shoulder, elbow and wrist were calculated using inverse dynamics. In the kinematic analysis, angular movements of the wrist and elbow were classified as low risk factors for work-related musculoskeletal disorders. With respect to the shoulder, the classification was medium-risk. RESULTS: There was no significant difference revealed by the kinetic reports. The EMG results showed that for the middle deltoid and upper trapezius the rotary instrumentation elicited higher values. The flexor carpi radialis and extensor carpi radialis, as well as the brachioradialis showed a higher value with the manual method. CONCLUSION: The muscular recruitment for accomplishment of articular movements for root canal preparation with either the rotary or manual techniques is distinct. Nevertheless, the rotary instrument presented less difficulty in the generation of the joint torque in each articulation, thus, presenting a greater uniformity of joint torques.

  13. Analysis of kinematic, kinetic and electromyographic patterns during root canal preparation with rotary and manual instruments

    Science.gov (United States)

    PASTERNAK-JÚNIOR, Braulio; de SOUSA NETO, Manoel Damião; DIONÍSIO, Valdeci Carlos; PÉCORA, Jesus Djalma; SILVA, Ricardo Gariba

    2012-01-01

    Objective This study assessed the muscular activity during root canal preparation through kinematics, kinetics, and electromyography (EMG). Material and Methods The operators prepared one canal with RaCe rotary instruments and another with Flexo-files. The kinematics of the major joints was reconstructed using an optoelectronic system and electromyographic responses of the flexor carpi radialis, extensor carpi radialis, brachioradialis, biceps brachii, triceps brachii, middle deltoid, and upper trapezius were recorded. The joint torques of the shoulder, elbow and wrist were calculated using inverse dynamics. In the kinematic analysis, angular movements of the wrist and elbow were classified as low risk factors for work-related musculoskeletal disorders. With respect to the shoulder, the classification was medium-risk. Results There was no significant difference revealed by the kinetic reports. The EMG results showed that for the middle deltoid and upper trapezius the rotary instrumentation elicited higher values. The flexor carpi radialis and extensor carpi radialis, as well as the brachioradialis showed a higher value with the manual method. Conclusion The muscular recruitment for accomplishment of articular movements for root canal preparation with either the rotary or manual techniques is distinct. Nevertheless, the rotary instrument presented less difficulty in the generation of the joint torque in each articulation, thus, presenting a greater uniformity of joint torques. PMID:22437679

  14. Gandhi and Mao on manual labour in the school: A retrospective analysis

    Science.gov (United States)

    Zachariah, Mathew; Hoffman, Arlene

    1985-12-01

    Mahatma Gandhi's views on relating the world of formal education to the world of work were developed first in his experimental `Tolstoy Farm' in South Africa. On his return to India, Gandhi insisted that a required manual labour component in the curriculum would help regenerate India's village economy, develop in India's children a deeper understanding of India's cultural roots, motivate children to relate `book learning' to life in society, and destroy invidious caste distinctions. The major proposals and suggestions in Gandhi's writing will be discussed in the context of his hopes for using schooling as an agent of progress in India. Mao Ze-Dong's views, on the other hand, were developed in the context of his Yenan experience in the 1930s, i.e. the decision to consolidate a power base in the interior of China before waging a class war against the landlords and capitalists of China. Mao's views were also, to some extent, rooted in the Chinese reality of stagnant, poverty-stricken rural areas. But, Mao's writings indicate that Marxist hopes to relate theory and practice (as understood in dialectical materialism) and to ensure that everyone participated in mental as well as manual labour in a socialist society had led him to formulate his proposals. Both Gandhi's and Mao's views and proposals have been more or less abandoned in India and China respectively. The similar and dissimilar reasons which led to such a fate are examined in this retrospective analysis.

  15. SECTION 6.2 SURFACE TOPOGRAPHY ANALYSIS

    DEFF Research Database (Denmark)

    Seah, M. P.; De Chiffre, Leonardo

    2005-01-01

    Surface physical analysis, i.e. topography characterisation, encompasses measurement, visualisation, and quantification. This is critical both for component form and for surface finish at macro-, micro- and nano-scales. The principal methods of surface topography measurement are stylus profilometry, optical scanning techniques, and scanning probe microscopy (SPM). These methods, based on acquisition of topography data from point-by-point scans, give quantitative information on heights with respect to position. Based on a different approach, the so-called integral methods produce parameters
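    To make the idea of quantitative height data concrete, here is a small Python sketch that computes two common roughness parameters from a sampled profile. The synthetic profile and the choice of parameters (Ra, Rq) are illustrative assumptions and are not taken from the section being described.

```python
# Compute arithmetic-mean (Ra) and root-mean-square (Rq) roughness
# from a sampled height profile z(x). The profile below is synthetic.
import numpy as np

def roughness(z):
    z = np.asarray(z, dtype=float)
    z = z - z.mean()                 # heights relative to the mean line
    Ra = np.abs(z).mean()            # arithmetic mean roughness
    Rq = np.sqrt((z ** 2).mean())    # root-mean-square roughness
    return Ra, Rq

profile = 0.8 * np.sin(np.linspace(0, 20 * np.pi, 2000))   # synthetic profile, um
print(roughness(profile))
```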

  16. I spy with my little eye: Analysis of airline pilots' gaze patterns in a manual instrument flight scenario.

    Science.gov (United States)

    Haslbeck, Andreas; Zhang, Bo

    2017-09-01

    The aim of this study was to analyze pilots' visual scanning in a manual approach and landing scenario. Manual flying skills suffer from the increasing use of automation, and this decline particularly affects long-haul pilots, who have only a few opportunities to practice these skills. Airline pilots representing different levels of practice (short-haul vs. long-haul) had to perform a manual raw data precision approach while their visual scanning was recorded by an eye-tracking device. The analysis of gaze patterns, which are based on predominant saccades, revealed one main group of saccades among long-haul pilots. In contrast, short-haul pilots showed more balanced scanning using two different groups of saccades. Short-haul pilots generally demonstrated better manual flight performance, and within this group one type of scan pattern was found to facilitate the manual landing task more. Long-haul pilots tend to utilize visual scanning behaviors that are inappropriate for the manual ILS landing task. This lack of skills needs to be addressed by providing specific training and more practice. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Determining the energy performance of manually controlled solar shades: A stochastic model based co-simulation analysis

    International Nuclear Information System (INIS)

    Yao, Jian

    2014-01-01

    Highlights: • Driving factor for adjustment of manually controlled solar shades was determined. • A stochastic model for manual solar shades was constructed using the Markov method. • Co-simulation with Energyplus was carried out in BCVTB. • External shading, even manually controlled, should be used prior to LOW-E windows. • Previous studies on manual solar shades may overestimate energy savings. - Abstract: Solar shading devices play a significant role in reducing building energy consumption and maintaining a comfortable indoor condition. In this paper, a typical office building with internal roller shades in the hot summer and cold winter zone was selected to determine the driving factor of the control behavior of manual solar shades. Solar radiation was determined as the major factor driving solar shading adjustment, based on field measurements and logit analysis, and a stochastic model for manually adjusted solar shades was then constructed using the Markov method. This model was used in BCVTB for further co-simulation with Energyplus to determine the impact of the control behavior of solar shades on energy performance. The results show that manually adjusted solar shades, whether located inside or outside, provide better energy-saving performance than clear-pane windows, while only external shades perform better than regularly used LOW-E windows. Simulation also indicates that using an ideal assumption of solar shade adjustment, as most studies do in building simulation, may lead to an overestimation of energy saving by about 16–30%. There is a need to improve occupants’ actions on shades to more effectively respond to outdoor conditions in order to lower energy consumption, and this improvement can be easily achieved by using simple strategies as a guide to control manual solar shades
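    To illustrate the kind of radiation-driven Markov model described in this record, the following Python sketch simulates a two-state shade (up/down) whose lowering probability follows a logit function of solar radiation. The logit coefficients, the raising probability, and the radiation series are invented for illustration and are not the paper's fitted model.

```python
# Two-state Markov sketch of manual shade operation driven by solar radiation.
import numpy as np

def p_lower(radiation_wm2, a=-4.0, b=0.01):
    """Logit-type probability of lowering the shade in the next time step."""
    return 1.0 / (1.0 + np.exp(-(a + b * radiation_wm2)))

def simulate_shade(radiation_series, p_raise=0.05, seed=0):
    rng = np.random.default_rng(seed)
    state, states = 0, []                    # 0 = shade up, 1 = shade down
    for rad in radiation_series:
        if state == 0 and rng.random() < p_lower(rad):
            state = 1
        elif state == 1 and rng.random() < p_raise:
            state = 0
        states.append(state)
    return states

hourly_radiation = [0, 50, 200, 450, 700, 800, 650, 400, 150, 0]   # W/m2, invented
print(simulate_shade(hourly_radiation))
```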

  18. The effects of work organization on the health of immigrant manual workers: A longitudinal analysis.

    Science.gov (United States)

    Arcury, Thomas A; Chen, Haiying; Mora, Dana C; Walker, Francis O; Cartwright, Michael S; Quandt, Sara A

    2016-01-01

    This analysis uses a longitudinal design to examine the associations between work organization and health outcomes among Latino manual workers. Participants included 247 Latino workers who completed baseline and 1-year follow-up interviews and clinical examinations. Health outcome measures were epicondylitis, rotator cuff syndrome, back pain, and depressive symptoms. Independent measures covered job demand, job control, and job support. Workers commonly experienced rotator cuff syndrome (6.5%), back pain (8.9%), and depressive symptoms (11.2%); fewer experienced epicondylitis (2.4%). Psychological demand was associated with rotator cuff syndrome; awkward position and decision latitude were associated with back pain. Decreased skill variety but increased decision latitude was associated with elevated depressive symptoms. Work context factors are important for health outcomes among vulnerable workers. Further research is needed to expand upon this work, particularly regarding cultural perspectives on job support.

  19. Light water reactor fuel analysis code FEMAXI-IV(Ver.2). Detailed structure and user's manual

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Saitou, Hiroaki.

    1997-11-01

    A light water reactor fuel behavior analysis code, FEMAXI-IV(Ver.2), was developed as an improved version of FEMAXI-IV. Development of FEMAXI-IV was completed in 1992, though a detailed description of the code structure and an input manual had not yet been made available to users. Here, the basic theories and structure, the models and numerical solutions applied to FEMAXI-IV(Ver.2), and the material properties adopted in the code are described in detail. In FEMAXI-IV(Ver.2), programming bugs in the previous FEMAXI-IV were eliminated, the pellet thermal conductivity model was updated, and a model of thermal-stress restraint on FP gas release was incorporated. To facilitate effective and wide-ranging application of the code, the input/output methods of the code are also described in detail, and sample output is included. (author)

  20. MATADOR (Methods for the Analysis of Transport And Deposition Of Radionuclides) code description and User's Manual

    International Nuclear Information System (INIS)

    Avci, H.I.; Raghuram, S.; Baybutt, P.

    1985-04-01

    A new computer code called MATADOR (Methods for the Analysis of Transport And Deposition Of Radionuclides) has been developed to replace the CORRAL-2 computer code which was written for the Reactor Safety Study (WASH-1400). This report is a User's Manual for MATADOR. MATADOR is intended for use in system risk studies to analyze radionuclide transport and deposition in reactor containments. The principal output of the code is information on the timing and magnitude of radionuclide releases to the environment as a result of severely degraded core accidents. MATADOR considers the transport of radionuclides through the containment and their removal by natural deposition and by engineered safety systems such as sprays. It is capable of analyzing the behavior of radionuclides existing either as vapors or aerosols in the containment. The code requires input data on the source terms into the containment, the geometry of the containment, and thermal-hydraulic conditions in the containment

  1. Marketing Research. Instructor's Manual.

    Science.gov (United States)

    Small Business Administration, Washington, DC.

    Prepared for the Administrative Management Course Program, this instructor's manual was developed to serve small-business management needs. The sections of the manual are as follows: (1) Lesson Plan--an outline of material covered, which may be used as a teaching guide, presented in two columns: the presentation, and a step-by-step indication of…

  2. GERTS GQ User's Manual.

    Science.gov (United States)

    Akiba, Y.; And Others

    This user's manual for the simulation program Graphical Evaluation and Review Technique (GERT) GQ contains sections on nodes, branches, program input description and format, and program output, as well as examples. Also included is a programmer's manual which contains information on scheduling, subroutine descriptions, COMMON Variables, and…

  3. User's manual of SECOM2: a computer code for seismic system reliability analysis

    International Nuclear Information System (INIS)

    Uchiyama, Tomoaki; Oikawa, Tetsukuni; Kondo, Masaaki; Tamura, Kazuo

    2002-03-01

    This report is the user's manual of the seismic system reliability analysis code SECOM2 (Seismic Core Melt Frequency Evaluation Code Ver.2), developed at the Japan Atomic Energy Research Institute for systems reliability analysis, which is one of the tasks of seismic probabilistic safety assessment (PSA) of nuclear power plants (NPPs). The SECOM2 code has many functions, such as: calculation of component failure probabilities based on the response factor method; extraction of minimal cut sets (MCSs); calculation of conditional system failure probabilities for given seismic motion levels at the site of an NPP; calculation of accident sequence frequencies and the core damage frequency (CDF) with use of the seismic hazard curve; importance analysis using various indicators; uncertainty analysis; calculation of the CDF taking into account the effect of correlations between the responses and capacities of components; and efficient sensitivity analysis by changing parameters on the responses and capacities of components. These analyses require as inputs the fault tree (FT) representing the occurrence conditions of system failures and core damage, information on the responses and capacities of components, and the seismic hazard curve for the NPP site. This report presents the models and methods applied in the SECOM2 code and how to use those functions. (author)
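    As a toy illustration of the CDF calculation step listed above (combining conditional failure probabilities with a seismic hazard curve), consider the Python sketch below. The ground-motion levels, exceedance frequencies, and conditional core-damage probabilities are invented numbers, and the coarse discretization is a simplification of what a code like SECOM2 actually does.

```python
# Combine conditional core-damage probabilities with a seismic hazard curve
# to obtain an approximate core damage frequency. All numbers are hypothetical.
import numpy as np

accel = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.7, 1.0])               # PGA levels (g)
exceed_freq = np.array([1e-2, 3e-3, 1e-3, 4e-4, 2e-4, 5e-5, 1e-5])  # per year
p_cd = np.array([1e-5, 1e-4, 1e-3, 5e-3, 2e-2, 1e-1, 4e-1])         # conditional CD prob.

# Annual frequency of ground motions falling in each interval between levels,
# with the conditional probability taken at the lower bound of the interval.
occur_freq = -np.diff(exceed_freq, append=0.0)
cdf = float((occur_freq * p_cd).sum())
print(f"core damage frequency ≈ {cdf:.2e} /year")
```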

  4. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E. [Sandia National Labs., Albuquerque, NM (United States); Tills, J. [J. Tills and Associates, Inc., Sandia Park, NM (United States)

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  5. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    International Nuclear Information System (INIS)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E.; Tills, J.

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions

  6. SHEAT for PC. A computer code for probabilistic seismic hazard analysis for personal computer, user's manual

    International Nuclear Information System (INIS)

    Yamada, Hiroyuki; Tsutsumi, Hideaki; Ebisawa, Katsumi; Suzuki, Masahide

    2002-03-01

    The SHEAT code, developed at the Japan Atomic Energy Research Institute, is for probabilistic seismic hazard analysis, which is one of the tasks needed for seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. SHEAT was first developed as a version for large-scale computers. In 2001, a personal computer version was provided to improve the operation efficiency and generality of the code. Earthquake hazard analysis, display, and print functions can be performed through the Graphical User Interface. With the SHEAT for PC code, the seismic hazard, defined as the annual exceedance frequency of earthquake ground motions at various levels of intensity at a given site, is calculated in the following two steps, as is done with the large-scale computer version. One is the modeling of earthquake generation around a site: future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modeled based on historical earthquake records, active fault data and expert judgment. The other is the calculation of the probabilistic seismic hazard at the site: an earthquake ground motion is calculated for each postulated earthquake using an attenuation model that takes its standard deviation into account, and the seismic hazard at the site is then calculated by summing the frequencies of ground motions over all the earthquakes. This document is the user's manual of the SHEAT for PC code. It includes: (1) an outline of the code, covering the overall concept, logical process, code structure, data files used and special characteristics of the code; (2) the functions of the subprograms and the analytical models in them; (3) guidance on input and output data; (4) a sample run result; and (5) an operational manual. (author)
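    A stripped-down illustration of the summation step described in this record is given below: each postulated earthquake contributes its annual rate multiplied by the probability, from a lognormal attenuation model, of exceeding the target ground-motion level. The attenuation relation, its scatter, and the source list are invented for illustration and are not SHEAT's models.

```python
# Toy probabilistic seismic hazard summation: annual frequency of exceeding a
# target PGA, summed over postulated earthquake sources. All inputs are invented.
import math

def median_pga(magnitude, distance_km):
    """Toy attenuation relation returning a median PGA in g."""
    return math.exp(-3.5 + 0.8 * magnitude - 1.1 * math.log(distance_km + 10.0))

def exceedance_frequency(target_pga, sources, sigma_ln=0.6):
    lam = 0.0
    for rate_per_year, magnitude, distance_km in sources:
        med = median_pga(magnitude, distance_km)
        z = (math.log(target_pga) - math.log(med)) / sigma_ln
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))   # P(ln A > ln target)
        lam += rate_per_year * p_exceed
    return lam

sources = [(0.01, 7.0, 30.0), (0.05, 6.0, 15.0), (0.2, 5.5, 50.0)]  # (rate, M, R)
print(exceedance_frequency(0.3, sources))   # annual frequency of PGA > 0.3 g
```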

  7. Comparison of manual and semi-automated delineation of regions of interest for radioligand PET imaging analysis

    International Nuclear Information System (INIS)

    Chow, Tiffany W; Verhoeff, Nicolaas PLG; Takeshita, Shinichiro; Honjo, Kie; Pataky, Christina E; St Jacques, Peggy L; Kusano, Maggie L; Caldwell, Curtis B; Ramirez, Joel; Black, Sandra

    2007-01-01

    As imaging centers produce higher resolution research scans, the number of man-hours required to process regional data has become a major concern. Comparison of automated vs. manual methodology has not been reported for functional imaging. We explored validation of using automation to delineate regions of interest on positron emission tomography (PET) scans. The purpose of this study was to ascertain improvements in image processing time and reproducibility of a semi-automated brain region extraction (SABRE) method over manual delineation of regions of interest (ROIs). We compared 2 sets of partial volume corrected serotonin 1a receptor binding potentials (BPs) resulting from manual vs. semi-automated methods. BPs were obtained from subjects meeting consensus criteria for frontotemporal degeneration and from age- and gender-matched healthy controls. Two trained raters provided each set of data to conduct comparisons of inter-rater mean image processing time, rank order of BPs for 9 PET scans, intra- and inter-rater intraclass correlation coefficients (ICC), repeatability coefficients (RC), percentages of the average parameter value (RM%), and effect sizes of either method. SABRE saved approximately 3 hours of processing time per PET subject over manual delineation (p < .001). Quality of the SABRE BP results was preserved relative to the rank order of subjects by manual methods. Intra- and inter-rater ICC were high (>0.8) for both methods. RC and RM% were lower for the manual method across all ROIs, indicating less intra-rater variance across PET subjects' BPs. SABRE demonstrated significant time savings and no significant difference in reproducibility over manual methods, justifying the use of SABRE in serotonin 1a receptor radioligand PET imaging analysis. This implies that semi-automated ROI delineation is a valid methodology for future PET imaging analysis
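    For readers unfamiliar with the intraclass correlation coefficients reported in this record, the sketch below computes one common form, ICC(3,1), from a subjects-by-raters matrix. The example binding-potential values are hypothetical, and the specific ICC form used in the study is not stated here.

```python
# ICC(3,1) from a subjects-by-raters matrix (two-way mixed, consistency).
import numpy as np

def icc_3_1(x):
    x = np.asarray(x, dtype=float)            # shape (n subjects, k raters)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

ratings = np.array([[1.9, 2.0], [2.4, 2.3], [1.1, 1.2], [3.0, 2.8], [2.2, 2.2]])
print(f"ICC(3,1) = {icc_3_1(ratings):.3f}")
```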

  8. ENDF-6 formats manual. Version of June 1997. Written by the members of the US cross section evaluation working group

    International Nuclear Information System (INIS)

    McLane, V.; Dunford, C.L.; Rose, P.F.

    1997-01-01

    ENDF-6 is the international computer file format for evaluated nuclear data. This document gives a detailed description of the formats and procedures adopted for ENDF-6. It consists of the report BNL-NCS-44945 (Rev. 2/97) (=ENDF-201, Rev. 2/97) with an Interim Revision of June 1997 and a few front pages added by the IAEA Nuclear Data Section. (author)

  9. User's manuals of probabilistic fracture mechanics analysis code for aged piping, PASCAL-SP

    International Nuclear Information System (INIS)

    Itoh, Hiroto; Nishikawa, Hiroyuki; Onizawa, Kunio; Kato, Daisuke; Osakabe, Kazuya

    2010-03-01

    As part of research on material degradation and structural integrity assessment for aged LWR components, a PFM (Probabilistic Fracture Mechanics) analysis code, PASCAL-SP (PFM Analysis of Structural Components in Aging LWR - Stress Corrosion Cracking at Welded Joints of Piping), has been developed. The code evaluates the failure probabilities at welded joints of aged piping by a Monte Carlo method. PASCAL-SP treats stress corrosion cracking (SCC) and fatigue crack growth in piping according to the approaches of NISA and the JSME FFS Code. Development of the code has aimed to improve the accuracy and reliability of the analysis by introducing new methodologies and algorithms that reflect the latest knowledge of SCC assessment and piping fracture criteria. In addition, the accuracy of flaw detection and sizing at in-service inspection and the residual stress distribution were modeled based on experimental data and introduced into PASCAL-SP. The code has been developed for cross-check use by the regulatory body in Japan, and it can also be used for research purposes by researchers in academia and industry. This report provides the user's manual and theoretical background of the code. (author)
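    The following heavily simplified Python sketch conveys the flavour of a Monte Carlo failure-probability estimate for a single weld: sample an initial crack depth and a growth rate, grow the crack over the evaluation period, and count wall penetrations. The distributions, the linear growth law, and the failure criterion are invented and are far simpler than the SCC and fatigue models in PASCAL-SP.

```python
# Toy Monte Carlo estimate of a through-wall failure probability at one weld.
# All distributions and dimensions are hypothetical.
import numpy as np

def failure_probability(years, wall_mm=10.0, n_samples=200_000, seed=7):
    rng = np.random.default_rng(seed)
    a0 = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n_samples)      # initial depth, mm
    da_dt = rng.lognormal(mean=np.log(0.15), sigma=0.7, size=n_samples)  # growth rate, mm/yr
    depth = a0 + da_dt * years
    return float((depth >= wall_mm).mean())

for t in (10, 20, 40):
    print(t, failure_probability(t))
```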

  10. Digital image analysis of Ki67 in hot spots is superior to both manual Ki67 and mitotic counts in breast cancer.

    Science.gov (United States)

    Stålhammar, Gustav; Robertson, Stephanie; Wedlund, Lena; Lippert, Michael; Rantalainen, Mattias; Bergh, Jonas; Hartman, Johan

    2018-05-01

    During pathological examination of breast tumours, proliferative activity is routinely evaluated by a count of mitoses. Adding immunohistochemical stains of Ki67 provides extra prognostic and predictive information. However, the currently used methods for these evaluations suffer from imperfect reproducibility. It is still unclear whether analysis of Ki67 should be performed in hot spots, in the tumour periphery, or as an average of the whole tumour section. The aim of this study was to compare the clinical relevance of mitoses, Ki67 and phosphohistone H3 in two cohorts of primary breast cancer specimens (total n = 294). Both manual and digital image analysis scores were evaluated for sensitivity and specificity for luminal B versus A subtype as defined by PAM50 gene expression assays, for high versus low transcriptomic grade, for axillary lymph node status, and for prognostic value in terms of prediction of overall and relapse-free survival. Digital image analysis of Ki67 outperformed the other markers, especially in hot spots. Tumours with high Ki67 expression and high numbers of phosphohistone H3-positive cells had significantly increased hazard ratios for all-cause mortality within 10 years from diagnosis. Replacing manual mitotic counts with digital image analysis of Ki67 in hot spots increased the differences in overall survival between the highest and lowest histological grades, and added significant prognostic information. Digital image analysis of Ki67 in hot spots is the marker of choice for routine analysis of proliferation in breast cancer. © 2017 John Wiley & Sons Ltd.

  11. STEM mode in the SEM for the analysis of cellular sections prepared by ultramicrotome sectioning

    International Nuclear Information System (INIS)

    Hondow, N; Harrington, J; Brydson, R; Brown, A

    2012-01-01

    The use of the dual imaging capabilities of a scanning electron microscope fitted with a transmitted electron detector is highlighted in the analysis of samples of importance in the field of nanotoxicology. Cellular uptake of nanomaterials is often examined by transmission electron microscopy of thin sections prepared by ultramicrotome sectioning. Examination by SEM allows for the detection of artefacts caused by sample preparation (e.g., nanomaterial pull-out), and the complementary STEM mode permits study of the interaction between nanomaterials and cells. Thin sections of two nanomaterials of importance in nanotoxicology (cadmium selenide quantum dots and single-walled carbon nanotubes) are examined using STEM mode in the SEM.

  12. MS_HistoneDB, a manually curated resource for proteomic analysis of human and mouse histones.

    Science.gov (United States)

    El Kennani, Sara; Adrait, Annie; Shaytan, Alexey K; Khochbin, Saadi; Bruley, Christophe; Panchenko, Anna R; Landsman, David; Pflieger, Delphine; Govin, Jérôme

    2017-01-01

    Histones and histone variants are essential components of the nuclear chromatin. While mass spectrometry has opened a large window to their characterization and functional studies, their identification from proteomic data remains challenging. Indeed, the current interpretation of mass spectrometry data relies on public databases which are either not exhaustive (Swiss-Prot) or contain many redundant entries (UniProtKB or NCBI). Currently, no protein database is ideally suited for the analysis of histones and the complex array of mammalian histone variants. We propose two proteomics-oriented manually curated databases for mouse and human histone variants. We manually curated >1700 gene, transcript and protein entries to produce a non-redundant list of 83 mouse and 85 human histones. These entries were annotated in accordance with the current nomenclature and unified with the "HistoneDB2.0 with Variants" database. This resource is provided in a format that can be directly read by programs used for mass spectrometry data interpretation. In addition, it was used to interpret mass spectrometry data acquired on histones extracted from mouse testis. Several histone variants, which had so far only been inferred by homology or detected at the RNA level, were detected by mass spectrometry, confirming the existence of their protein form. Mouse and human histone entries were collected from different databases and subsequently curated to produce a non-redundant protein-centric resource, MS_HistoneDB. It is dedicated to the proteomic study of histones in mouse and human and will hopefully facilitate the identification and functional study of histone variants.

  13. HASL procedures manual

    International Nuclear Information System (INIS)

    1980-08-01

    Addition and corrections to the following sections of the HASL Procedures Manual are provided: Table of Contents; Bibliography; Fallout Collection Methods; Wet/Dry Fallout Collection; Fluoride in Soil and Sediment; Strontium-90; Natural Series; Alpha Emitters; and Gamma Emitters

  14. A quantitative analysis of two-dimensional manually segmented transrectal ultrasound axial images in planning high dose rate brachytherapy for prostate cancer

    Directory of Open Access Journals (Sweden)

    Dabić-Stanković Kata

    2017-01-01

    Background/Aim. Prostate delineation, pre-planning and catheter implantation procedures in high-dose rate brachytherapy (HDR-BT) are commonly based on manually segmented transrectal ultrasound (TRUS) images of the prostate. The aim of this study was to quantitatively analyze the consistency of prostate capsule delineation, done by a single therapist prior to each HDR-BT fraction, and the changes in the shape of the prostate capsule during HDR-BT, using two-dimensional (2D) TRUS axial images. Methods. A group of 16 patients were treated at the Medical System Belgrade Brachytherapy Department with definitive HDR-BT. The total applied median dose of 52 Gy was divided into four individual fractions, each fraction being delivered 2-3 weeks apart. Real-time axial visualization of the prostate and manual segmentation prior to each fraction were performed using B-K Medical ultrasound. Quantitative analyses of area and shape were applied to the 2D TRUS axial images of the prostate. Area analysis was used to calculate the average value of the cross-sectional area of the prostate image. The parameters of prostate shape, the fractal dimension and the circularity ratio of the prostate capsule contour, were estimated at the maximum axial cross section of the prostate image. Results. The sample group consisted of four phases, each phase being performed prior to the first, second, third and fourth HDR-BT fraction, respectively. Statistical analysis showed that during the HDR-BT fractions there were no significant differences in the average value of the area or in the maximum shape of the prostate capsule. Conclusions. Quantitative analysis of TRUS axial prostate segmented images shows successful capsule delineation in the series of manually segmented TRUS images, with the maximum prostate shape remaining unchanged during the HDR-BT fractions.
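
    As a rough illustration of the shape metrics named in the abstract, the sketch below computes the cross-sectional area and a circularity ratio of a segmented contour given as polygon vertices. The circularity definition used here (4*pi*A/P^2, equal to 1 for a circle) and the example contour are assumptions; the study's exact definitions and the fractal dimension estimation are not reproduced.

        # Minimal sketch (not the study's software): area and circularity ratio of a
        # manually segmented 2D contour given as (x, y) vertices in mm.
        import math

        def polygon_area(pts):
            """Shoelace formula for a closed polygon."""
            n = len(pts)
            s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
                    for i in range(n))
            return abs(s) / 2.0

        def perimeter(pts):
            n = len(pts)
            return sum(math.dist(pts[i], pts[(i + 1) % n]) for i in range(n))

        def circularity(pts):
            # 4*pi*A / P**2: 1.0 for a perfect circle, smaller for irregular contours.
            a, p = polygon_area(pts), perimeter(pts)
            return 4.0 * math.pi * a / p ** 2

        # Hypothetical prostate capsule contour (coarse octagon, units in mm).
        contour = [(0, 20), (14, 14), (20, 0), (14, -14),
                   (0, -20), (-14, -14), (-20, 0), (-14, 14)]
        print(round(polygon_area(contour), 1), round(circularity(contour), 3))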

  15. User's manual for seismic analysis code 'SONATINA-2V'

    Energy Technology Data Exchange (ETDEWEB)

    Hanawa, Satoshi; Iyoku, Tatsuo [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment

    2001-08-01

    The seismic analysis code, SONATINA-2V, has been developed to analyze the behavior of the HTTR core graphite components under seismic excitation. The SONATINA-2V code is a two-dimensional computer program capable of analyzing the vertical arrangement of the HTTR graphite components, such as fuel blocks, replaceable reflector blocks and permanent reflector blocks, as well as their restraint structures. In the analytical model, each block is treated as a rigid body and is restrained by dowel pins which restrict relative horizontal movement but allow vertical and rocking motions between upper and lower blocks. Moreover, the SONATINA-2V code is capable of analyzing the core vibration behavior under simultaneous excitation in the vertical and horizontal directions. The SONATINA-2V code is composed of the main program, a pre-processor for preparing the input data for SONATINA-2V, and a post-processor for data processing and for producing graphics from the analytical results. Although the SONATINA-2V code was developed to run on the MSP computer system of the Japan Atomic Energy Research Institute (JAERI), that computer system was retired in the course of technical progress in computing. Therefore, the analysis code was modified to run on the UNIX machine, the SR8000 computer system, of JAERI. The user's manual for the seismic analysis code SONATINA-2V, including the pre- and post-processors, is given in the present report. (author)

  16. Nuclear power plant control room crew task analysis database: SEEK system. Users manual

    International Nuclear Information System (INIS)

    Burgy, D.; Schroeder, L.

    1984-05-01

    The Crew Task Analysis SEEK Users Manual was prepared for the Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission. It is designed for use with the existing computerized Control Room Crew Task Analysis Database. The SEEK system consists of a PR1ME computer with its associated peripherals and software augmented by General Physics Corporation SEEK database management software. The SEEK software programs provide the Crew Task Database user with rapid access to any number of records desired. The software uses English-like sentences to allow the user to construct logical sorts and outputs of the task data. Given the multiple-associative nature of the database, users can directly access the data at the plant, operating sequence, task or element level - or any combination of these levels. A complete description of the crew task data contained in the database is presented in NUREG/CR-3371, Task Analysis of Nuclear Power Plant Control Room Crews (Volumes 1 and 2)

  17. Analysis of Dynamic Characteristics of Portal Frame with Variable Section

    OpenAIRE

    Hao Jianing

    2016-01-01

    Combined with a portal frame design and using the finite element software ANSYS, finite element models of individual portal rigid frame specimens and of the overall portal rigid frame building are established. The beams and columns of the portal rigid frame have variable cross sections. Through modal analysis, the frequencies and mode shapes of the individual specimens and of the overall finite element model are compared, laying the foundation for further earthquake and wind vibration analysis of the variable cross-section portal rigid frame.

  18. WheelerLab: An interactive program for sequence stratigraphic analysis of seismic sections, outcrops and well sections and the generation of chronostratigraphic sections and dynamic chronostratigraphic sections

    Science.gov (United States)

    Amosu, Adewale; Sun, Yuefeng

    WheelerLab is an interactive program that facilitates the interpretation of stratigraphic data (seismic sections, outcrop data and well sections) within a sequence stratigraphic framework and the subsequent transformation of the data into the chronostratigraphic domain. The transformation enables the identification of significant geological features, particularly erosional and non-depositional features that are not obvious in the original seismic domain. Although there are some software products that contain interactive environments for carrying out chronostratigraphic analysis, none of them are open-source codes. In addition to being open source, WheelerLab adds two important functionalities not present in currently available software: (1) WheelerLab generates a dynamic chronostratigraphic section and (2) WheelerLab enables chronostratigraphic analysis of older seismic data sets that exist only as images and not in the standard seismic file formats; it can also be used for the chronostratigraphic analysis of outcrop images and interpreted well sections. The dynamic chronostratigraphic section sequentially depicts the evolution of the chronostratigraphic chronosomes concurrently with the evolution of identified genetic stratal packages. This facilitates a better communication of the sequence-stratigraphic process. WheelerLab is designed to give the user both interactive and interpretational control over the transformation; this is most useful when determining the correct stratigraphic order for laterally separated genetic stratal packages. The program can also be used to generate synthetic sequence stratigraphic sections for chronostratigraphic analysis.

  19. WheelerLab: An interactive program for sequence stratigraphic analysis of seismic sections, outcrops and well sections and the generation of chronostratigraphic sections and dynamic chronostratigraphic sections

    Directory of Open Access Journals (Sweden)

    Adewale Amosu

    2017-01-01

    WheelerLab is an interactive program that facilitates the interpretation of stratigraphic data (seismic sections, outcrop data and well sections) within a sequence stratigraphic framework and the subsequent transformation of the data into the chronostratigraphic domain. The transformation enables the identification of significant geological features, particularly erosional and non-depositional features that are not obvious in the original seismic domain. Although there are some software products that contain interactive environments for carrying out chronostratigraphic analysis, none of them are open-source codes. In addition to being open source, WheelerLab adds two important functionalities not present in currently available software: (1) WheelerLab generates a dynamic chronostratigraphic section and (2) WheelerLab enables chronostratigraphic analysis of older seismic data sets that exist only as images and not in the standard seismic file formats; it can also be used for the chronostratigraphic analysis of outcrop images and interpreted well sections. The dynamic chronostratigraphic section sequentially depicts the evolution of the chronostratigraphic chronosomes concurrently with the evolution of identified genetic stratal packages. This facilitates a better communication of the sequence-stratigraphic process. WheelerLab is designed to give the user both interactive and interpretational control over the transformation; this is most useful when determining the correct stratigraphic order for laterally separated genetic stratal packages. The program can also be used to generate synthetic sequence stratigraphic sections for chronostratigraphic analysis.

  20. Multimodal manual therapy vs. pharmacological care for management of tension type headache: A meta-analysis of randomized trials.

    Science.gov (United States)

    Mesa-Jiménez, Juan A; Lozano-López, Cristina; Angulo-Díaz-Parreño, Santiago; Rodríguez-Fernández, Ángel L; De-la-Hoz-Aizpurua, Jose L; Fernández-de-Las-Peñas, Cesar

    2015-12-01

    Manual therapies are generally requested by patients with tension type headache. The aim was to compare the efficacy of multimodal manual therapy vs. pharmacological care for the management of tension type headache pain by conducting a meta-analysis of randomized controlled trials. PubMed, MEDLINE, EMBASE, AMED, CINAHL, EBSCO, Cochrane Database of Systematic Reviews, Cochrane Collaboration Trials Register, PEDro and SCOPUS were searched from their inception until June 2014. All randomized controlled trials comparing any manual therapy vs. medication care for treating adults with tension type headache were included. Data were extracted and methodological quality assessed independently by two reviewers. We pooled headache frequency as the main outcome and also intensity and duration. The weighted mean difference between manual therapy and pharmacological care was used to determine effect sizes. Five randomized controlled trials met our inclusion criteria and were included in the meta-analysis. Pooled analyses found that manual therapies were more effective than pharmacological care in reducing frequency (weighted mean difference -0.8036, 95% confidence interval -1.66 to -0.44; three trials), intensity (weighted mean difference -0.5974, 95% confidence interval -0.8875 to -0.3073; five trials) and duration (weighted mean difference -0.5558, 95% confidence interval -0.9124 to -0.1992; three trials) of the headache immediately after treatment. No differences were found at longer follow-up for headache intensity (weighted mean difference -0.3498, 95% confidence interval -1.106 to 0.407; three trials). Compared with pharmacological care, manual therapies showed moderate effectiveness for reducing headache frequency, intensity and duration in tension type headache in the short term, but similar effectiveness at longer follow-up. However, due to the heterogeneity of the interventions, these results should be considered with caution at this stage. © International Headache Society.
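
    For readers unfamiliar with how pooled weighted mean differences such as those above are obtained, the following is a hedged sketch of fixed-effect inverse-variance pooling. The per-trial values are hypothetical; the review's actual data, weighting model and heterogeneity handling may differ.

        # Hedged sketch: fixed-effect inverse-variance pooling of weighted mean
        # differences (WMD) with a 95% confidence interval. Trial values are made up.
        import math

        def pool_wmd(effects, standard_errors):
            """Return pooled WMD and its 95% confidence interval (fixed-effect model)."""
            weights = [1.0 / se ** 2 for se in standard_errors]
            pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
            se_pooled = math.sqrt(1.0 / sum(weights))
            return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

        # Hypothetical per-trial WMDs for headache frequency and their standard errors.
        print(pool_wmd([-0.9, -0.6, -1.1], [0.35, 0.40, 0.50]))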

  1. Closed-loop double-vasopressor automated system vs manual bolus vasopressor to treat hypotension during spinal anaesthesia for caesarean section: a randomised controlled trial.

    Science.gov (United States)

    Sng, B L; Tan, H S; Sia, A T H

    2014-01-01

    Hypotension necessitating vasopressor administration occurs commonly during caesarean section under spinal anaesthesia. We developed a novel vasopressor delivery system that automatically administers phenylephrine or ephedrine based on continuous non-invasive arterial pressure monitoring. A phenylephrine bolus of 50 μg was given at 30-s intervals when systolic blood pressure fell manual boluses of either phenylephrine 100 μg or ephedrine 8 mg, administered at 1-min intervals based on the same thresholds for systolic pressure and heart rate. This randomised, controlled, double-blinded trial involved 213 healthy women who underwent elective caesarean delivery under spinal anaesthesia using 11 mg hyperbaric bupivacaine with 15 μg fentanyl and 100 μg morphine. The automated vasopressor group had better systolic pressure control, with 37/106 (34.9%) having any beat-to-beat systolic pressure reading 120% of baseline, with 8/106 (7.5%) in the automated vasopressor group vs 14/107 (13.1%) in the control group, or total dose of vasopressors. The automated vasopressor group had lower median absolute performance error of 8.5% vs control of 9.8% (p = 0.013), and reduced incidence of nausea (1/106 (0.9%) vs 11/107 (10.3%), p = 0.005). Neonatal umbilical cord pH, umbilical lactate and Apgar scores were similar. Hence, our system afforded better control of maternal blood pressure and reduced nausea with no increase in reactive hypertension when compared with manual boluses. © 2013 The Association of Anaesthetists of Great Britain and Ireland.

  2. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (Spanish Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  3. Integrated Reliability and Risk Analysis System (IRRAS), Version 2.5: Reference manual

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1991-03-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 2.5 and is the subject of this Reference Manual. Version 2.5 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance. 7 refs., 348 figs

  4. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (French Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  5. Kinesiology Workbook and Laboratory Manual.

    Science.gov (United States)

    Harris, Ruth W.

    This manual is written for students in anatomy, kinesiology, or introductory biomechanics courses. The book is divided into two sections, a kinesiology workbook and a laboratory manual. The two sections parallel each other in content and format. Each is divided into three corresponding sections: (1) Anatomical bases for movement description; (2)…

  6. Consumer Decisions. Student Manual.

    Science.gov (United States)

    Florida State Dept. of Education, Tallahassee. Div. of Vocational Education.

    This student manual covers five areas relating to consumer decisions. Titles of the five sections are Consumer Law, Consumer Decision Making, Buying a Car, Convenience Foods, and Books for Preschool Children. Each section may contain some or all of these materials: list of objectives, informative sections, questions on the information and answers,…

  7. Pdap Manual

    DEFF Research Database (Denmark)

    Pedersen, Mads Mølgaard; Larsen, Torben J.

    Pdap, the Python Data Analysis Program, is a program for post-processing, analysis, visualization and presentation of data, e.g. simulation results and measurements. It is intended for, but not limited to, the domain of wind turbines. It combines an intuitive graphical user interface with Python scripting that allows automation and the implementation of custom functions. This manual gives a short introduction to the graphical user interface, describes the mathematical background of some of the functions, describes the scripting API and finally presents a few examples of how to automate analysis via scripting. The newest version, together with more documentation and help on how to use, extend and automate Pdap, can be found at the webpage www.hawc2.dk

  8. A comparative analysis of the cost of using manual and automatic nutrunners with the cost and benefits analysis method

    Directory of Open Access Journals (Sweden)

    Ucok Mulyo Sugeng

    2017-10-01

    In a situation of increasingly fierce competition, every manufacturing company, including PT SZI, should be able to apply production concepts and strategies well and improve product quality. Hence the need to select the right nutrunner as an investment in the assembly process. The assembly process strongly affects the quantity and quality of production in the manufacturing industry: the faster the assembly process, the more units are produced. Likewise with production quality: the better the nutrunner used, the lower the risk of rejected units, which would otherwise incur additional costs. A nutrunner is a tool for tightening bolts and nuts that is powered by compressed air or electricity. The two energy types, referred to here as the automatic and the manual nutrunner, are compared on several aspects to select the more efficient option for the production process. The cost comparison is calculated with the Desoutter Rightway application. After several cost aspects are compared (energy cost, productivity cost, calibration cost, maintenance cost and quality cost), the analysis is continued with cost and benefit analysis to obtain the goals of this research: the payback or break-even period of the nutrunner project, determined with the payback period method, and the value of the benefits, used with the return on investment method to determine whether the project is feasible and acceptable. After the data were analysed using the cost and benefit method, the payback or break-even period for the automatic nutrunner investment was found to be 1.22 years, or 14.63 months. The value of the benefits is 146%, which means the project is acceptable because it provides advantages relative to the total investment cost.
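
    As a simple illustration of the two figures reported above, the sketch below computes a payback period and a return on investment. All numbers are placeholders chosen for the example and are not the paper's data.

        # Illustrative sketch: simple payback period and return on investment (ROI).

        def payback_period_years(investment, annual_net_saving):
            """Years required for cumulative net savings to repay the initial investment."""
            return investment / annual_net_saving

        def roi_percent(total_benefit, total_cost):
            """Return on investment as a percentage of total cost."""
            return 100.0 * (total_benefit - total_cost) / total_cost

        investment = 150_000.0         # hypothetical cost of the automatic nutrunner
        annual_net_saving = 60_000.0   # hypothetical yearly savings (energy, quality, productivity)
        print(round(payback_period_years(investment, annual_net_saving), 2), "years to break even")
        print(round(roi_percent(total_benefit=90_000.0, total_cost=60_000.0), 1), "% ROI")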

  9. Analysis of Dynamic Characteristics of Portal Frame with Variable Section

    Directory of Open Access Journals (Sweden)

    Hao Jianing

    2016-01-01

    Combined with a portal frame design and using the finite element software ANSYS, finite element models of individual portal rigid frame specimens and of the overall portal rigid frame building are established. The beams and columns of the portal rigid frame have variable cross sections. Through modal analysis, the frequencies and mode shapes of the individual specimens and of the overall finite element model are compared, laying the foundation for further earthquake and wind vibration analysis of the variable cross-section portal rigid frame.

  10. User's manual and analysis methodology of probabilistic fracture mechanics analysis code PASCAL Ver.2 for reactor pressure vessel (Contract research)

    International Nuclear Information System (INIS)

    Osakabe, Kazuya; Onizawa, Kunio; Shibata, Katsuyuki; Kato, Daisuke

    2006-09-01

    As a part of the aging structural integrity research for LWR components, the probabilistic fracture mechanics (PFM) analysis code PASCAL (PFM Analysis of Structural Components in Aging LWR) has been developed in JAEA. This code evaluates the conditional probabilities of crack initiation and fracture of a reactor pressure vessel (RPV) under transient conditions such as pressurized thermal shock (PTS). The development of the code has been aimed at improving the accuracy and reliability of the analysis by introducing new analysis methodologies and algorithms that take into account recent developments in fracture mechanics and computer performance. PASCAL Ver.1 has functions of optimized sampling in the stratified Monte Carlo simulation, the elastic-plastic fracture criterion of the R6 method, crack growth analysis models for a semi-elliptical crack, recovery of fracture toughness due to thermal annealing and so on. Since then, under the contract between the Ministry of Economy, Trade and Industry of Japan and JAEA, we have continued to develop and introduce new functions into PASCAL Ver.2, such as the evaluation method for an embedded crack, a K_I database for a semi-elliptical crack considering stress discontinuity at the base/cladding interface, a PTS transient database, and others. A generalized analysis method is proposed on the basis of the development of PASCAL Ver.2 and the results of sensitivity analyses. A graphical user interface (GUI), including the generalized method as default values, has also been developed for PASCAL Ver.2. This report provides the user's manual and theoretical background of PASCAL Ver.2. (author)
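
    To give a feel for the kind of conditional probability a PFM code of this type estimates, the sketch below runs a crude Monte Carlo comparison of a sampled applied stress intensity factor against a sampled fracture toughness. The distributions and parameter values are purely illustrative assumptions and do not represent PASCAL's models or data.

        # Minimal Monte Carlo sketch of a conditional fracture probability estimate.
        # All distributions and numbers are illustrative assumptions only.
        import random

        random.seed(1)

        def conditional_failure_probability(n_samples=100_000):
            failures = 0
            for _ in range(n_samples):
                k_applied = random.lognormvariate(3.4, 0.25)  # sampled applied K_I [MPa*sqrt(m)]
                k_ic = random.weibullvariate(80.0, 4.0)       # sampled fracture toughness K_Ic
                if k_applied > k_ic:
                    failures += 1
            return failures / n_samples

        print(conditional_failure_probability())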

  11. PCR evaluation : considering transition from manual to semi-automated pavement distress collection and analysis.

    Science.gov (United States)

    2013-07-01

    This study is designed to assist the Ohio Department of Transportation (ODOT) in determining : whether transitioning from manual to state-of the-practice semi-automated pavement distress : data collection is feasible and recommended. Statistical and ...

  12. Root Cause Analysis Following an Event at a Nuclear Installation: Reference Manual

    International Nuclear Information System (INIS)

    2015-01-01

    Following an event at a nuclear installation, it is important to determine accurately its root causes so that effective corrective actions can be implemented. As stated in IAEA Safety Standards Series No. SF-1, Fundamental Safety Principles: “Processes must be put in place for the feedback and analysis of operating experience”. If this process is completed effectively, the probability of a similar event occurring is significantly reduced. Guidance on how to establish and implement such a process is given in IAEA Safety Standards Series No. NS-G-2.11, A System for the Feedback of Experience from Events in Nuclear Installations. To cater for the diverse nature of operating experience events, several different root cause analysis (RCA) methodologies and techniques have been developed for effective investigation and analysis. An event here is understood as any unanticipated sequence of occurrences that results in, or potentially results in, consequences to plant operation and safety. RCA is not a topic uniquely relevant to event investigators: knowledge of the concepts enhances the learning characteristics of the whole organization. This knowledge also makes a positive contribution to nuclear safety and helps to foster a culture of preventing event occurrence. This publication allows organizations to deepen their knowledge of these methodologies and techniques and also provides new organizations with a broad overview of the RCA process. It is the outcome of a coordinated effort involving the participation of experts from nuclear organizations, the energy industry and research centres in several Member States. This publication also complements IAEA Services Series No. 10, PROSPER Guidelines: Guidelines for Peer Review and for Plant Self- Assessment of Operational Experience Feedback Process, and is intended to form part of a suite of publications developing the principles set forth in these guidelines. In addition to the information and description of RCA

  13. Root Cause Analysis Following an Event at a Nuclear Installation: Reference Manual. Companion CD

    International Nuclear Information System (INIS)

    2015-01-01

    Following an event at a nuclear installation, it is important to determine accurately its root causes so that effective corrective actions can be implemented. As stated in IAEA Safety Standards Series No. SF-1, Fundamental Safety Principles: “Processes must be put in place for the feedback and analysis of operating experience”. If this process is completed effectively, the probability of a similar event occurring is significantly reduced. Guidance on how to establish and implement such a process is given in IAEA Safety Standards Series No. NS-G-2.11, A System for the Feedback of Experience from Events in Nuclear Installations. To cater for the diverse nature of operating experience events, several different root cause analysis (RCA) methodologies and techniques have been developed for effective investigation and analysis. An event here is understood as any unanticipated sequence of occurrences that results in, or potentially results in, consequences to plant operation and safety. RCA is not a topic uniquely relevant to event investigators: knowledge of the concepts enhances the learning characteristics of the whole organization. This knowledge also makes a positive contribution to nuclear safety and helps to foster a culture of preventing event occurrence. This publication allows organizations to deepen their knowledge of these methodologies and techniques and also provides new organizations with a broad overview of the RCA process. It is the outcome of a coordinated effort involving the participation of experts from nuclear organizations, the energy industry and research centres in several Member States. This publication also complements IAEA Services Series No. 10, PROSPER Guidelines: Guidelines for Peer Review and for Plant Self- Assessment of Operational Experience Feedback Process, and is intended to form part of a suite of publications developing the principles set forth in these guidelines. In addition to the information and description of RCA

  14. SHEAT: a computer code for probabilistic seismic hazard analysis, user's manual

    International Nuclear Information System (INIS)

    Ebisawa, Katsumi; Kondo, Masaaki; Abe, Kiyoharu; Tanaka, Toshiaki; Takani, Michio.

    1994-08-01

    The SHEAT code, developed at the Japan Atomic Energy Research Institute, is for probabilistic seismic hazard analysis, which is one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. Seismic hazard is defined as an annual exceedance frequency of occurrence of earthquake ground motions at various levels of intensity at a given site. With the SHEAT code, seismic hazard is calculated by the following two steps: (1) Modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modelled based on historical earthquake records, active fault data and expert judgement. (2) Calculation of probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model taking into account its standard deviation. Then the seismic hazard at the site is calculated by summing the frequencies of ground motions from all the earthquakes. This document is the user's manual of the SHEAT code. It includes: (1) Outlines of the code, covering the overall concept, logical process, code structure, data files used and special characteristics of the code, (2) Functions of the subprograms and the analytical models in them, (3) Guidance on input and output data, and (4) Sample run results. The code has been widely used at JAERI to analyze seismic hazard at various nuclear power plant sites in Japan. (author)
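
    The two-step calculation described above can be illustrated with a very small sketch: postulated sources with annual rates, a lognormal attenuation relation with a standard deviation, and exceedance frequencies summed over all sources. The attenuation form and all parameter values below are assumptions for illustration only, not those coded in SHEAT.

        # Hedged sketch of an annual exceedance frequency calculation.
        import math
        from statistics import NormalDist

        def ln_median_pga(magnitude, distance_km):
            # Toy attenuation relation (assumption): ln PGA = a + b*M - c*ln(R + 10)
            return -3.5 + 1.0 * magnitude - 1.3 * math.log(distance_km + 10.0)

        def annual_exceedance_frequency(sources, pga_level, sigma_ln=0.6):
            """Sum over sources of rate * P(ground motion > pga_level)."""
            freq = 0.0
            for rate_per_year, magnitude, distance_km in sources:
                mu = ln_median_pga(magnitude, distance_km)
                p_exceed = 1.0 - NormalDist(mu, sigma_ln).cdf(math.log(pga_level))
                freq += rate_per_year * p_exceed
            return freq

        # Hypothetical postulated earthquakes: (annual rate, magnitude, distance to site in km).
        sources = [(0.01, 7.0, 30.0), (0.05, 6.0, 20.0), (0.2, 5.0, 15.0)]
        print(annual_exceedance_frequency(sources, pga_level=0.3))  # PGA level in g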

  15. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users’ Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Bradley J Schrader

    2009-03-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.
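
    The decay and ingrowth bookkeeping mentioned above can be illustrated for the simplest case of a single parent-daughter pair using the Bateman solution. RSAC's actual nuclide chains and solver are far more general; the half-lives and initial activity below are illustrative only.

        # Minimal sketch: parent and ingrown daughter activity after time t (Bateman solution,
        # daughter initially absent). Units of t and the half-lives must match.
        import math

        def parent_daughter_activity(a_parent0, t, half_life_parent, half_life_daughter):
            lp = math.log(2.0) / half_life_parent
            ld = math.log(2.0) / half_life_daughter
            a_parent = a_parent0 * math.exp(-lp * t)
            a_daughter = a_parent0 * ld / (ld - lp) * (math.exp(-lp * t) - math.exp(-ld * t))
            return a_parent, a_daughter

        # Hypothetical: 1.0 Bq of a parent (half-life 8 d) decaying to a daughter (half-life 2 d).
        print(parent_daughter_activity(1.0, t=4.0, half_life_parent=8.0, half_life_daughter=2.0))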

  16. Radiological Safety Analysis Computer (RSAC) Program Version 7.2 Users’ Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Bradley J Schrader

    2010-10-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.2 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.

  17. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users Manual

    International Nuclear Information System (INIS)

    Schrader, Bradley J.

    2009-01-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods

  18. 14 CFR 135.23 - Manual contents.

    Science.gov (United States)

    2010-01-01

    ... AND ON DEMAND OPERATIONS AND RULES GOVERNING PERSONS ON BOARD SUCH AIRCRAFT General § 135.23 Manual... this section, to assist each crewmember and person performing or directly supervising the following job... Analysis establishing runway safety margins at destination airports, taking into account the following...

  19. CRISP instrument manual

    International Nuclear Information System (INIS)

    Bucknall, D.G.; Langridge, Sean

    1997-05-01

    This document is a user manual for CRISP, one of the two neutron reflectometers at ISIS. CRISP is highly automated, allowing precision reproducible measurements. The manual provides detailed instructions for the setting-up and running of the instrument and advice on data analysis. (UK)

  20. Cost-benefit analysis model: A tool for area-wide fruit fly management. Procedures manual

    International Nuclear Information System (INIS)

    Enkerlin, W.; Mumford, J.; Leach, A.

    2007-03-01

    The Generic Fruit Fly Cost-Benefit Analysis Model assists in economic decision making associated with area-wide fruit fly control options. The FRUIT FLY COST-BENEFIT ANALYSIS PROGRAM (available on 1 CD-ROM from the Joint FAO/IAEA Programme of Nuclear Techniques in Food and Agriculture) is an Excel 2000 Windows-based program, for which all standard Windows and Excel conventions apply. The Model is user friendly and thus largely self-explanatory. Nevertheless, it includes a procedures manual that has been prepared to guide the user, and thus should be used together with the software. Please note that the table presenting the pest management options in the Introductory Page of the model is controlled by spin buttons and click boxes. These controls are linked to macros that hide non-relevant tables and boxes. N.B. It is important that the medium level of security is selected from the Tools menu of Excel; to do this, go to Tools|Macros|Security and select Medium. When the file is opened, a form will appear containing three buttons; click on the middle button, 'Enable Macros', so that the macros may be used. Ideally the model should be used as a support tool by working groups aiming at assessing the economic returns of different fruit fly control options (suppression, eradication, containment and prevention). The working group should include professionals in agriculture with experience in area-wide implementation of integrated pest management programmes, an economist or at least someone with basic knowledge in economics, and if relevant, an entomologist with some background in the application of the sterile insect technique (SIT)

  1. UniProtKB/Swiss-Prot, the Manually Annotated Section of the UniProt KnowledgeBase: How to Use the Entry View.

    Science.gov (United States)

    Boutet, Emmanuel; Lieberherr, Damien; Tognolli, Michael; Schneider, Michel; Bansal, Parit; Bridge, Alan J; Poux, Sylvain; Bougueleret, Lydie; Xenarios, Ioannis

    2016-01-01

    The Universal Protein Resource (UniProt, http://www.uniprot.org ) consortium is an initiative of the SIB Swiss Institute of Bioinformatics (SIB), the European Bioinformatics Institute (EBI) and the Protein Information Resource (PIR) to provide the scientific community with a central resource for protein sequences and functional information. The UniProt consortium maintains the UniProt KnowledgeBase (UniProtKB), updated every 4 weeks, and several supplementary databases including the UniProt Reference Clusters (UniRef) and the UniProt Archive (UniParc).The Swiss-Prot section of the UniProt KnowledgeBase (UniProtKB/Swiss-Prot) contains publicly available expertly manually annotated protein sequences obtained from a broad spectrum of organisms. Plant protein entries are produced in the frame of the Plant Proteome Annotation Program (PPAP), with an emphasis on characterized proteins of Arabidopsis thaliana and Oryza sativa. High level annotations provided by UniProtKB/Swiss-Prot are widely used to predict annotation of newly available proteins through automatic pipelines.The purpose of this chapter is to present a guided tour of a UniProtKB/Swiss-Prot entry. We will also present some of the tools and databases that are linked to each entry.

  2. Analysis of femtosecond laser assisted capsulotomy cutting edges and manual capsulorhexis using environmental scanning electron microscopy.

    Science.gov (United States)

    Serrao, Sebastiano; Lombardo, Giuseppe; Desiderio, Giovanni; Buratto, Lucio; Schiano-Lomoriello, Domenico; Pileri, Marco; Lombardo, Marco

    2014-01-01

    Purpose. To investigate the structure and irregularity of the capsulotomy cutting edges created by two femtosecond (FS) laser platforms in comparison with manual continuous circular capsulorhexis (CCC) using environmental scanning electron microscopy (eSEM). Methods. Ten anterior capsulotomies were obtained using two different FS laser cataract platforms (LenSx, n = 5, and Victus, n = 5). In addition, five manual CCC (n = 5) were obtained using a rhexis forceps. The specimens were imaged by eSEM (FEI Quanta 400, OR, USA). Objective metrics, which included the arithmetic mean deviation of the surface (Sa) and the root-mean-square deviation of the surface (Sq), were used to evaluate the irregularity of both the FS laser capsulotomies and the manual CCC cutting edges. Results. Several microirregularities were shown across the FS laser capsulotomy cutting edges. The edges of manually torn capsules were shown, by comparison of Sa and Sq values, to be smoother (P < 0.05) than the FS laser capsulotomy edges. Conclusions. Work is needed to understand whether the FS laser capsulotomy edge microirregularities, not seen in manual CCC, may act as focal points for the concentration of stress that would increase the risk of capsular tear during phacoemulsification as recently reported in the literature.
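
    The two roughness metrics named in the abstract, Sa (arithmetic mean deviation) and Sq (root-mean-square deviation), are straightforward to compute from a set of surface height samples. The sketch below is a generic illustration with hypothetical input values, not the study's image-processing pipeline.

        # Hedged sketch: Sa and Sq computed over surface height samples z(x, y),
        # e.g. extracted from an eSEM image. Input values are hypothetical.
        import math

        def sa_sq(heights):
            """heights: flat list of surface height samples (e.g. in micrometres)."""
            mean = sum(heights) / len(heights)
            deviations = [h - mean for h in heights]
            sa = sum(abs(d) for d in deviations) / len(deviations)       # arithmetic mean deviation
            sq = math.sqrt(sum(d * d for d in deviations) / len(deviations))  # RMS deviation
            return sa, sq

        print(sa_sq([0.12, 0.18, 0.05, 0.22, 0.09, 0.15, 0.11, 0.20]))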

  3. Analysis of Femtosecond Laser Assisted Capsulotomy Cutting Edges and Manual Capsulorhexis Using Environmental Scanning Electron Microscopy

    Directory of Open Access Journals (Sweden)

    Sebastiano Serrao

    2014-01-01

    Purpose. To investigate the structure and irregularity of the capsulotomy cutting edges created by two femtosecond (FS) laser platforms in comparison with manual continuous circular capsulorhexis (CCC) using environmental scanning electron microscopy (eSEM). Methods. Ten anterior capsulotomies were obtained using two different FS laser cataract platforms (LenSx, n=5, and Victus, n=5). In addition, five manual CCC (n=5) were obtained using a rhexis forceps. The specimens were imaged by eSEM (FEI Quanta 400, OR, USA). Objective metrics, which included the arithmetic mean deviation of the surface (Sa) and the root-mean-square deviation of the surface (Sq), were used to evaluate the irregularity of both the FS laser capsulotomies and the manual CCC cutting edges. Results. Several microirregularities were shown across the FS laser capsulotomy cutting edges. The edges of manually torn capsules were shown, by comparison of Sa and Sq values, to be smoother (P<0.05) than the FS laser capsulotomy edges. Conclusions. Work is needed to understand whether the FS laser capsulotomy edge microirregularities, not seen in manual CCC, may act as focal points for the concentration of stress that would increase the risk of capsular tear during phacoemulsification as recently reported in the literature.

  4. Accident Analysis and Barrier Function (AEB) Method. Manual for Incident Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Svenson, Ola [Stockholm Univ. (Sweden). Dept. of Psychology

    2000-02-01

    The Accident Analysis and Barrier Function (AEB) Method models an accident or incident as a series of interactions between human and technical systems. In the sequence of human and technical errors leading to an accident there is, in principle, a possibility of arresting the development between any two successive errors. This can be done by a barrier function which, for example, can stop an operator from making an error. A barrier function can be performed by one or several barrier function systems. To illustrate, a mechanical system, a computer system or another operator can all perform a given barrier function to stop an operator from making an error. The barrier function analysis consists of analysis of suggested improvements, the effectiveness of the improvements, the costs of implementation, the probability of implementation, the cost of maintaining the barrier function, the probability that maintenance will be kept up to standards, and the generalizability of the suggested improvement. The AEB method is similar to the US method called HPES, but differs from it in several ways. To exemplify, the AEB method places more emphasis on technical errors than HPES. In contrast to HPES, which describes a series of events, the AEB method models only errors. This gives a more focused analysis, making it well suited for checking other HPES-type accident analyses. However, the AEB method is a generic and stand-alone method that has been applied in fields other than nuclear power, such as traffic accident analyses.

  5. Accident Analysis and Barrier Function (AEB) Method. Manual for Incident Analysis

    International Nuclear Information System (INIS)

    Svenson, Ola

    2000-02-01

    The Accident Analysis and Barrier Function (AEB) Method models an accident or incident as a series of interactions between human and technical systems. In the sequence of human and technical errors leading to an accident there is, in principle, a possibility of arresting the development between any two successive errors. This can be done by a barrier function which, for example, can stop an operator from making an error. A barrier function can be performed by one or several barrier function systems. To illustrate, a mechanical system, a computer system or another operator can all perform a given barrier function to stop an operator from making an error. The barrier function analysis consists of analysis of suggested improvements, the effectiveness of the improvements, the costs of implementation, the probability of implementation, the cost of maintaining the barrier function, the probability that maintenance will be kept up to standards, and the generalizability of the suggested improvement. The AEB method is similar to the US method called HPES, but differs from it in several ways. To exemplify, the AEB method places more emphasis on technical errors than HPES. In contrast to HPES, which describes a series of events, the AEB method models only errors. This gives a more focused analysis, making it well suited for checking other HPES-type accident analyses. However, the AEB method is a generic and stand-alone method that has been applied in fields other than nuclear power, such as traffic accident analyses

  6. VIPRE-01: a thermal-hydraulic analysis code for reactor cores. Volume 3. Programmer's manual. Final report

    International Nuclear Information System (INIS)

    Stewart, C.W.; Koontz, A.S.; Cuta, J.M.; Montgomery, S.D.

    1983-05-01

    VIPRE (Versatile Internals and Component Program for Reactors; EPRI) has been developed for nuclear power utility thermal-hydraulic analysis applications. It is designed to help evaluate nuclear-reactor-core safety limits including minimum departure from nucleate boiling ratio (MDNBR), critical power ratio (CPR), fuel and clad temperatures, and coolant state in normal operation and assumed accident conditions. This is Volume 3, the Programmer's Manual. It explains the code's structure and the computer interfaces.

  7. International Reactor Physics Handbook Database and Analysis Tool (IDAT) - IDAT user manual

    International Nuclear Information System (INIS)

    2013-01-01

    The IRPhEP Database and Analysis Tool (IDAT) was first released in 2013 and is included on the DVD. This database and corresponding user interface allows easy access to handbook information. Selected information from each configuration was entered into IDAT, such as the measurements performed, benchmark values, calculated values and materials specifications of the benchmark. In many cases this is supplemented with calculated data such as neutron balance data, spectra data, k-eff nuclear data sensitivities, and spatial reaction rate plots. IDAT accomplishes two main objectives: 1. Allow users to search the handbook for experimental configurations that satisfy their input criteria. 2. Allow users to trend results and identify suitable benchmarks experiments for their application. IDAT provides the user with access to several categories of calculated data, including: - 1-group neutron balance data for each configuration with individual isotope contributions in the reactor system. - Flux and other reaction rates spectra in a 299-group energy scheme. Plotting capabilities were implemented into IDAT allowing the user to compare the spectra of selected configurations in the original fine energy structure or on any user-defined broader energy structure. - Sensitivity coefficients (percent changes of k-effective due to elementary change of basic nuclear data) for the major nuclides and nuclear processes in a 238-group energy structure. IDAT is actively being developed. Those approved to access the online version of the handbook will also have access to an online version of IDAT. As May 2013 marks the first release, IDAT may contain data entry errors and omissions. The handbook remains the primary source of reactor physics benchmark data. A copy of IDAT user's manual is attached to this document. A copy of the IRPhE Handbook can be obtained on request at http://www.oecd-nea.org/science/wprs/irphe/irphe-handbook/form.html

  8. Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0

    Science.gov (United States)

    Knox, J. C.

    1996-01-01

    The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A modeling provides a graphical and command-driven interface for the user. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connect the components via flow streams and define their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set to run the simulation based on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluation of the simulation results over time. Additionally, users have the capability to control the simulation and extract information at various times in the simulation (e.g., control equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A version 5.0 software runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.

  9. Event history analysis and the cross-section

    DEFF Research Database (Denmark)

    Keiding, Niels

    2006-01-01

    Examples are given of problems in event history analysis, where several time origins (generating calendar time, age, disease duration, time on study, etc.) are considered simultaneously. The focus is on complex sampling patterns generated around a cross-section. A basic tool is the Lexis diagram.

  10. Seismic analysis of the in-pile test section

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. M.; Park, K. N.; Chi, D. Y.; Park, S. K.; Sim, B. S.; Ahn, S. H.; Lee, C. Y.; Kim, Y. J. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2004-07-01

    This study gives the results of the seismic analysis of the IPS (In Pile Section) with lower bracket support. The results cover the natural frequency and seismic response of the IPS for the SSE and OBE events. An FE (Finite Element) model which includes the two vessels of the IPS and its support structure were analyzed by ABAQUS.

  11. Reliability and Maintainability Model (RAM): User and Maintenance Manual. Part 2; Improved Supportability Analysis

    Science.gov (United States)

    Ebeling, Charles E.

    1996-01-01

    This report documents the procedures for utilizing and maintaining the Reliability & Maintainability Model (RAM) developed by the University of Dayton for the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). The purpose of the grant is to provide support to NASA in establishing operational and support parameters and costs of proposed space systems. As part of this research objective, the model described here was developed. This Manual updates and supersedes the 1995 RAM User and Maintenance Manual. Changes and enhancements from the 1995 version of the model are primarily a result of the addition of more recent aircraft and shuttle R&M data.

  12. Manual Therapy

    OpenAIRE

    Hakgüder, Aral; Kokino, Siranuş

    2002-01-01

    Manual therapy has been used in the treatment of pain and dysfunction of spinal and peripheral joints for more than a hundred years. Manual medicine includes manipulation, mobilization, and postisometric relaxation techniques. The aim of manual therapy is to restore movement restricted by joint blockage, keep postural balance, restore function and maintain optimal body mechanics. Anatomic, biomechanical, and neurophysiological evaluation of the locomotor system is essential for...

  13. Cross-Sectional Analysis of Longitudinal Mediation Processes.

    Science.gov (United States)

    O'Laughlin, Kristine D; Martin, Monica J; Ferrer, Emilio

    2018-01-01

    Statistical mediation analysis can help to identify and explain the mechanisms behind psychological processes. Examining a set of variables for mediation effects is a ubiquitous process in the social sciences literature; however, despite evidence suggesting that cross-sectional data can misrepresent the mediation of longitudinal processes, cross-sectional analyses continue to be used in this manner. Alternative longitudinal mediation models, including those rooted in a structural equation modeling framework (cross-lagged panel, latent growth curve, and latent difference score models) are currently available and may provide a better representation of mediation processes for longitudinal data. The purpose of this paper is twofold: first, we provide a comparison of cross-sectional and longitudinal mediation models; second, we advocate using models to evaluate mediation effects that capture the temporal sequence of the process under study. Two separate empirical examples are presented to illustrate differences in the conclusions drawn from cross-sectional and longitudinal mediation analyses. Findings from these examples yielded substantial differences in interpretations between the cross-sectional and longitudinal mediation models considered here. Based on these observations, researchers should use caution when attempting to use cross-sectional data in place of longitudinal data for mediation analyses.
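
    To make the cross-sectional baseline of the comparison concrete, the following is a minimal product-of-coefficients mediation sketch: regress M on X (path a) and Y on X and M (paths b and c'), then take a*b as the indirect effect. The data are simulated under stated assumptions; the longitudinal models discussed in the paper (cross-lagged panel, latent growth curve, latent difference score) require repeated measures and are not shown here.

        # Minimal cross-sectional mediation sketch (product of coefficients) on simulated data.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        x = rng.normal(size=n)
        m = 0.5 * x + rng.normal(size=n)            # true a = 0.5
        y = 0.4 * m + 0.2 * x + rng.normal(size=n)  # true b = 0.4, direct effect c' = 0.2

        # Path a: M ~ X (intercept plus slope via least squares).
        a = np.linalg.lstsq(np.column_stack([np.ones(n), x]), m, rcond=None)[0][1]
        # Paths c' and b: Y ~ X + M.
        coefs = np.linalg.lstsq(np.column_stack([np.ones(n), x, m]), y, rcond=None)[0]
        c_prime, b = coefs[1], coefs[2]

        print("indirect effect a*b =", round(a * b, 3), "direct effect c' =", round(c_prime, 3))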

  14. Flight dynamics analysis and simulation of heavy lift airships. Volume 2: Technical manual

    Science.gov (United States)

    Ringland, R. F.; Tischler, M. B.; Jex, H. R.; Emmen, R. D.; Ashkenas, I. L.

    1982-01-01

    The mathematical models embodied in the simulation are described in considerable detail and with supporting evidence for the model forms chosen. In addition the trimming and linearization algorithms used in the simulation are described. Appendices to the manual identify reference material for estimating the needed coefficients for the input data and provide example simulation results.

  15. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

    This user's manual contains all the necessary information concerning the use of the SEVERO code. This computer code is related to the statistics of extremes: extreme winds, extreme precipitation and flooding hazard risk analysis. (A.C.A.S.)

  16. COMPARISON OF MANUAL AND SEMIAUTOMATED FUNDUS AUTOFLUORESCENCE ANALYSIS OF MACULAR ATROPHY IN STARGARDT DISEASE PHENOTYPE.

    Science.gov (United States)

    Kuehlewein, Laura; Hariri, Amir H; Ho, Alexander; Dustin, Laurie; Wolfson, Yulia; Strauss, Rupert W; Scholl, Hendrik P N; Sadda, SriniVas R

    2016-06-01

    To evaluate manual and semiautomated grading techniques for assessing decreased fundus autofluorescence (DAF) in patients with Stargardt disease phenotype. Certified reading center graders performed manual and semiautomated (region-finder-based) grading of confocal scanning laser ophthalmoscopy (cSLO) fundus autofluorescence (FAF) images for 41 eyes of 22 patients. Lesion types were defined based on the black level and the sharpness of the border: definite decreased autofluorescence (DDAF) and well- or poorly demarcated questionably decreased autofluorescence (WDQDAF, PDQDAF). Agreement in grading between the two methods and inter- and intra-grader agreement were assessed by kappa coefficients (κ) and intraclass correlation coefficients (ICC). The mean ± standard deviation (SD) area was 3.07 ± 3.02 mm² for DDAF (n = 31), 1.53 ± 1.52 mm² for WDQDAF (n = 9), and 6.94 ± 10.06 mm² for PDQDAF (n = 17). The mean ± SD absolute difference in area between manual and semiautomated grading was 0.26 ± 0.28 mm² for DDAF, 0.20 ± 0.26 mm² for WDQDAF, and 4.05 ± 8.32 mm² for PDQDAF. The ICC (95% confidence interval) for method comparison was 0.992 (0.984-0.996) for DDAF, 0.976 (0.922-0.993) for WDQDAF, and 0.648 (0.306-0.842) for PDQDAF. Inter- and intra-grader agreement in manual and semiautomated quantitative grading was better for DDAF (0.981-0.996) and WDQDAF (0.995-0.999) than for PDQDAF (0.715-0.993). Manual and semiautomated grading methods showed similar levels of reproducibility for assessing areas of decreased autofluorescence in patients with Stargardt disease phenotype. Excellent agreement and reproducibility were observed for well-demarcated lesions.
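
    The method-comparison statistic reported above, the intraclass correlation coefficient, can be reproduced in principle with a two-way random-effects, absolute-agreement, single-measure ICC. The sketch below is illustrative only (the area values are invented) and computes ICC(2,1) for paired manual/semiautomated measurements with NumPy.

```python
import numpy as np

def icc2_1(ratings):
    """Two-way random-effects, absolute-agreement, single-measure ICC(2,1).

    ratings: array of shape (n_subjects, n_methods)."""
    r = np.asarray(ratings, dtype=float)
    n, k = r.shape
    grand = r.mean()
    row_means = r.mean(axis=1)
    col_means = r.mean(axis=0)

    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ((r - grand) ** 2).sum() - ss_rows - ss_cols

    msr = ss_rows / (n - 1)                      # between-subject mean square
    msc = ss_cols / (k - 1)                      # between-method mean square
    mse = ss_err / ((n - 1) * (k - 1))           # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical DDAF areas (mm^2): column 0 = manual, column 1 = semiautomated
areas = np.array([[3.1, 3.0], [1.2, 1.3], [6.8, 6.5], [0.9, 1.0], [4.4, 4.6]])
print(f"ICC(2,1) = {icc2_1(areas):.3f}")
```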

  17. Development of radar cross section analysis system of naval ships

    Directory of Open Access Journals (Sweden)

    Kookhyun Kim

    2012-03-01

    A software system for complex-object scattering analysis, named SYSCOS, has been developed for systematic radar cross section (RCS) analysis and reduction design. The system is based on the high-frequency analysis methods of physical optics, geometrical optics, and the physical theory of diffraction, which are suitable for RCS analysis of electromagnetically large and complex targets such as naval ships. In addition, a direct scattering-center analysis function has been included, which gives a relatively simple and intuitive way to identify problem areas at the design stage compared with conventional image-based approaches. In this paper, the theoretical background and the organization of the SYSCOS system are presented. To verify its accuracy and to demonstrate its applicability, numerical analyses have been carried out for a square plate, a sphere, a cylinder, a weapon system and a virtual naval ship, the results of which have been compared with analytic solutions and with those obtained by other existing software.
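
    SYSCOS itself is not available here, but the flavor of the high-frequency methods it builds on can be shown with the classical physical-optics estimate of the monostatic RCS of a perfectly conducting rectangular plate. The sketch below is a generic textbook formula, not the SYSCOS implementation, and the plate size and frequency are arbitrary.

```python
import numpy as np

def plate_rcs_po(a, b, wavelength, theta):
    """Monostatic RCS (m^2) of a perfectly conducting a-by-b flat plate in the
    physical-optics approximation; theta (rad) is measured from the plate
    normal, in the plane containing side a."""
    k = 2.0 * np.pi / wavelength
    u = k * a * np.sin(theta)
    # np.sinc(x) = sin(pi x)/(pi x), so pass u/pi to obtain sin(u)/u
    pattern = np.sinc(u / np.pi) ** 2 * np.cos(theta) ** 2
    return 4.0 * np.pi * (a * b) ** 2 / wavelength**2 * pattern

theta = np.radians(np.linspace(0.0, 60.0, 7))
sigma = plate_rcs_po(a=1.0, b=1.0, wavelength=0.03, theta=theta)  # 1 m plate at 10 GHz
for t, s in zip(np.degrees(theta), sigma):
    print(f"theta = {t:4.1f} deg   RCS = {10 * np.log10(s):7.1f} dBsm")
```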

  18. Quality Manual

    Science.gov (United States)

    Koch, Michael

    The quality manual is the “heart” of every management system related to quality. Quality assurance in analytical laboratories is most frequently linked with ISO/IEC 17025, which lists the standard requirements for a quality manual. In this chapter, examples are used to demonstrate how these requirements can be met. But there are certainly many other ways to do this.

  19. FINAS. Example manual. 2

    International Nuclear Information System (INIS)

    Iwata, Koji; Tsukimori, Kazuyuki; Ueno, Mutsuo

    2003-12-01

    FINAS is a general purpose structural analysis computer program which was developed by Japan Nuclear Cycle Development Institute for the analysis of static, dynamic and thermal responses of elastic and inelastic structures by the finite element method. This manual contains typical analysis examples that illustrate applications of FINAS to a variety of structural engineering problems. The first part of this manual presents fundamental examples in which numerical solutions by FINAS are compared with some analytical reference solutions, and the second part of this manual presents more complex examples intended for practical application. All the input data images and principal results for each problem are included in this manual for beginners' convenience. All the analyses are performed by using the FINAS Version 13.0. (author)

  20. A quantitative analysis of rotary, ultrasonic and manual techniques to treat proximally flattened root canals

    Directory of Open Access Journals (Sweden)

    Fabiana Soares Grecca

    2007-04-01

    OBJECTIVE: The efficiency of rotary, manual and ultrasonic root canal instrumentation techniques was investigated in proximally flattened root canals. MATERIAL AND METHODS: Forty human mandibular left and right central incisors, lateral incisors and premolars were used. The pulp tissue was removed and the root canals were filled with red dye. Teeth were instrumented using three techniques: (i) the K3 and ProTaper rotary systems; (ii) an ultrasonic crown-down technique; and (iii) a progressive manual technique. Roots were bisected longitudinally in a buccolingual direction. The instrumented canal walls were digitally captured and the images obtained were analyzed using the Sigma Scan software. Canal walls were evaluated for total canal wall area versus the non-instrumented area on which dye remained. RESULTS: No statistically significant difference was found between the instrumentation techniques studied (p<0.05). CONCLUSION: The findings of this study showed that no instrumentation technique was 100% efficient in removing the dye.
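
    The image-analysis step described above amounts to measuring the fraction of the canal-wall area still covered by red dye. The sketch below uses hypothetical thresholds and a synthetic image (it is not the Sigma Scan procedure) to show how such a measurement could be scripted.

```python
import numpy as np

def uninstrumented_fraction(rgb_image, canal_mask):
    """Fraction of the canal-wall area still covered by red dye.

    rgb_image : uint8 array (H, W, 3)
    canal_mask: boolean array (H, W), True inside the canal-wall region
    Thresholds are illustrative; real work would calibrate them per image set."""
    r = rgb_image[..., 0].astype(float)
    g = rgb_image[..., 1].astype(float)
    b = rgb_image[..., 2].astype(float)
    dye = (r > 120) & (r > 1.5 * g) & (r > 1.5 * b)   # crude "red" detector
    dye_area = np.count_nonzero(dye & canal_mask)
    total_area = np.count_nonzero(canal_mask)
    return dye_area / total_area if total_area else float("nan")

# Synthetic example: a 100x100 canal region with 30% of the wall still dyed
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[:30, :, 0] = 200                         # red band = remaining dye
mask = np.ones((100, 100), dtype=bool)
print(f"non-instrumented fraction: {uninstrumented_fraction(img, mask):.2f}")
```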

  1. The new reform in tunisia: the democratic challenge of the manuals analysis and teaching

    OpenAIRE

    Maria Lucenti

    2017-01-01

    Tunisia is affected by many changes, among which the reform of the education system plays a crucial role. The Ministry of Education, together with the trade unions (UGTT) and the Arab Institute of Human Rights, is tracing the outlines of the reform through an experiment in participatory democracy, synthesized in the preparation of a national report, which is analyzed here. The reform aims to change the current educational system profoundly: programs, manuals, the training of teachers, tea...

  2. User's manual of the REFLA-1D/MODE4 reflood thermo-hydrodynamic analysis code

    International Nuclear Information System (INIS)

    Hojo, Tsuneyuki; Iguchi, Tadashi; Okubo, Tsutomu; Murao, Yoshio; Sugimoto, Jun.

    1986-01-01

    The REFLA-1D/MODE4 code has been developed by incorporating a local power effect model and a fuel temperature profile effect model into the REFLA-1D/MODE3 code. The code can calculate the temperature transient of a local rod by considering the radial power profile effect in the core and can simulate the thermal characteristics of the nuclear fuel rod. This manual describes the outline of the incorporated models and the corresponding code modifications, and provides the application information required to utilize the code. (author)

  3. A gender analysis of secondary school physics textbooks and laboratory manuals

    Science.gov (United States)

    Kostas, Nancy Ann

    Secondary school physics textbooks and laboratory manuals were evaluated for gender balance. The textbooks and manuals evaluated were all current editions available at the time of the study, with copyrights of 1988 to 1992. Illustrations, drawings and photographs were judged gender balanced based on the number of men and women, boys and girls shown in both active and passive roles. Illustrations, drawings and photographs were also evaluated by the number of male and female scientists identified by name. The curricular content of the textbooks was analyzed for gender balance by three criteria: the number of named male and female scientists whose accomplishments were described in the text; the number of careers assigned to men and women; and the number of verbal analogies assigned to girls' interests, boys' interests or neutral interests. The laboratory activities in the manuals were categorized as demonstrations, experiments and observations. Three of each of these types of activities from each manual were analyzed for skills and motivating factors important to girls as identified by Potter and Rosser (1992). Data were analyzed by use of descriptive statistics of frequencies, means and chi-square goodness of fit. The .05 level of significance was applied to all analyses, based upon an expected frequency of a 50-50 percentage of men and women and of 4.5 percent women scientists to 95.5 percent men scientists. The findings were as follows. None of the textbooks had a balance of men/women, boys/girls in the illustrations, drawings and photographs. The Hewitt (Scott-Foresman, 1989) textbook was the only textbook with no significant difference. Using the expected frequency for male and female scientists, two textbooks were gender balanced for illustrations, drawings and photographs while all textbooks were gender balanced for described accomplishments of scientists. The Hewitt (Scott Foresman, 1989) textbook had the only gender balanced representation of careers
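
    The chi-square goodness-of-fit tests against a 50-50 split and against the 95.5/4.5 expectation for scientists can be reproduced with SciPy; the counts in the sketch below are hypothetical, not the study's data.

```python
from scipy.stats import chisquare

# Hypothetical counts of people depicted in one textbook's illustrations
observed = [312, 188]                      # [males, females]
total = sum(observed)

# Test 1: gender balance in illustrations (expected 50-50 split)
chi2, p = chisquare(observed, f_exp=[total / 2, total / 2])
print(f"50-50 split:    chi2 = {chi2:.2f}, p = {p:.4f}")

# Test 2: named scientists against the 95.5% / 4.5% expectation used in the study
scientists = [41, 3]                       # [male, female] scientists named
n = sum(scientists)
chi2, p = chisquare(scientists, f_exp=[0.955 * n, 0.045 * n])
print(f"95.5/4.5 split: chi2 = {chi2:.2f}, p = {p:.4f}")
```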

  4. Brain-wide mapping of axonal connections: workflow for automated detection and spatial analysis of labeling in microscopic sections

    Directory of Open Access Journals (Sweden)

    Eszter Agnes ePapp

    2016-04-01

    Axonal tracing techniques are powerful tools for exploring the structural organization of neuronal connections. Tracers such as biotinylated dextran amine (BDA) and Phaseolus vulgaris leucoagglutinin (Pha-L) allow brain-wide mapping of connections through analysis of large series of histological section images. We present a workflow for efficient collection and analysis of tract-tracing datasets with a focus on newly developed modules for image processing and assignment of anatomical location to tracing data. New functionality includes automatic detection of neuronal labeling in large image series, alignment of images to a volumetric brain atlas, and analytical tools for measuring the position and extent of labeling. To evaluate the workflow, we used high-resolution microscopic images from axonal tracing experiments in which different parts of the rat primary somatosensory cortex had been injected with BDA or Pha-L. Parameters from a set of representative images were used to automate detection of labeling in image series covering the entire brain, resulting in binary maps of the distribution of labeling. For high to medium labeling densities, automatic detection was found to provide reliable results when compared to manual analysis, whereas weak labeling required manual curation for optimal detection. To identify brain regions corresponding to labeled areas, section images were aligned to the Waxholm Space (WHS) atlas of the Sprague Dawley rat brain (v2) by custom-angle slicing of the MRI template to match individual sections. Based on the alignment, WHS coordinates were obtained for labeled elements and transformed to stereotaxic coordinates. The new workflow modules increase the efficiency and reliability of labeling detection in large series of images from histological sections, and enable anchoring to anatomical atlases for further spatial analysis and comparison with other data.
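
    A heavily reduced sketch of the labeling-detection step is shown below: a global Otsu threshold followed by removal of small speckle objects, using scikit-image on a synthetic image. The actual workflow tunes its parameters on representative images and is considerably more elaborate; this is not its code.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import remove_small_objects

def detect_labeling(section_image, min_size=50):
    """Toy automatic detection of labeled pixels in one section image:
    global Otsu threshold, then removal of small speckle objects."""
    binary = section_image > threshold_otsu(section_image)
    return remove_small_objects(binary, min_size=min_size)

# Synthetic example: dim background with one bright "labeled" patch plus noise
rng = np.random.default_rng(1)
img = rng.normal(10.0, 2.0, size=(256, 256))
img[100:140, 60:120] += 40.0          # simulated labeled axons
mask = detect_labeling(img)
print(f"labeled pixels detected: {int(mask.sum())}")
```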

  5. Camp Health Aide Manual = Manual para trabajadores de salud.

    Science.gov (United States)

    Robinson, June Grube; And Others

    This bilingual manual serves as a textbook for migrant Camp Health Aides. Camp Health Aides are members of migrant labor camps enlisted to provide information about health and social services to migrant workers and their families. The manual is divided into 12 tabbed sections representing lessons. Teaching notes printed on contrasting paper…

  6. Probe code: a set of programs for processing and analysis of the left ventricular function - User's manual

    International Nuclear Information System (INIS)

    Piva, R.M.V.

    1987-01-01

    The User's Manual of the Probe Code is an addendum to the M.Sc. thesis entitled A Microcomputer System of Nuclear Probe to Check the Left Ventricular Function. The Probe Code is a software package developed for the processing and off-line analysis of left ventricular function curves obtained in vivo. These curves are produced by means of an external scintigraphic probe, collimated and placed over the left ventricle, after intravenous injection of Tc-99m. (author)

  7. Application of automated image analysis reduces the workload of manual screening of sentinel Lymph node biopsies in breast cancer

    DEFF Research Database (Denmark)

    Holten-Rossing, Henrik; Talman, Maj-Lis Møller; Jylling, Anne Marie Bak

    2017-01-01

    axilla. In patients with no clinical signs of metastatic disease in the axilla, a SLN biopsy (SLNB) is performed. Assessment of metastases in the SLNB is done in a conventional microscope by manually observing a metastasis and measuring its size and/or counting the number of tumor cells. This is done...... essentially to categorize the type of metastases as macrometastases, micrometastases or isolated tumor cells, which is used to determine which treatment the breast cancer patient will benefit mostly from. The aim of this study was to evaluate whether digital image analysis can be applied as a screening tool...

  8. Perturbation analysis for Monte Carlo continuous cross section models

    International Nuclear Information System (INIS)

    Kennedy, Chris B.; Abdel-Khalik, Hany S.

    2011-01-01

    Sensitivity analysis, including both its forward and adjoint applications, collectively referred to hereinafter as Perturbation Analysis (PA), is an essential tool to complete Uncertainty Quantification (UQ) and Data Assimilation (DA). PA-assisted UQ and DA have traditionally been carried out for reactor analysis problems using deterministic as opposed to stochastic models for radiation transport. This is because PA requires many model executions to quantify how variations in input data, primarily cross sections, affect variations in a model's responses, e.g. detector readings, flux distribution, multiplication factor, etc. Although stochastic models are often sought for their higher accuracy, their repeated execution is at best computationally expensive and in reality intractable for typical reactor analysis problems involving many input data and output responses. Deterministic methods, however, achieve the computational efficiency needed to carry out the PA analysis by reducing problem dimensionality via various spatial and energy homogenization assumptions. This, however, introduces modeling error components into the PA results which propagate to the following UQ and DA analyses. The introduced errors are problem specific and therefore are expected to limit the applicability of UQ and DA analyses to reactor systems that satisfy the introduced assumptions. This manuscript introduces a new method to complete PA employing a continuous cross section stochastic model and performed in a computationally efficient manner. If successful, the modeling error components introduced by deterministic methods could be eliminated, thereby allowing for wider applicability of DA and UQ results. Two MCNP models demonstrate the application of the new method: a critical Pu sphere (Jezebel) and a Pu fast metal array (the Russian BR-1). The PA is completed for reaction rate densities, reaction rate ratios, and the multiplication factor. (author)

  9. Software-assisted quantitative analysis of small bowel motility compared to manual measurements

    International Nuclear Information System (INIS)

    Bickelhaupt, S.; Froehlich, J.M.; Cattin, R.; Raible, S.; Bouquet, H.; Bill, U.; Patak, M.A.

    2014-01-01

    Aim: To validate a newly developed software prototype that automatically analyses small bowel motility by comparing it directly with manual measurement. Material and methods: Forty-five patients with a clinical indication for small bowel magnetic resonance imaging (MRI) were retrospectively included in this institutional review board-approved study. MRI was performed using a 1.5 T system following a standard MR-enterography protocol. Small bowel motility parameters (contractions per minute, luminal diameter, amplitude) were measured three times each in identical segments using the manual and the semiautomatic software-assisted method. The methods were compared for agreement, repeatability, and the time needed for each measurement. All parameters were compared between the methods. Results: A total of 91 small-bowel segments were analysed. No significant intra-individual difference (p > 0.05) was found for peristaltic frequencies between the methods (mean: 4.14/min manual; 4.22/min software-assisted). Amplitudes (5.14 mm; 5.57 mm) and mean lumen diameters (17.39 mm; 14.68 mm) differed due to systematic differences in the definition of the bowel wall. The mean duration of a single measurement was significantly (p < 0.01) shorter with the software (6.25 min manual; 1.30 min software-assisted). The scattering of repeated measurements was significantly (p < 0.05) lower using the software. Conclusion: The software-assisted method accomplished highly reliable, fast and accurate measurement of small bowel motility. Measurement precision and duration differed significantly between the two methods in favour of the software-assisted technique
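
    The contraction-frequency parameter can be illustrated by peak counting on a lumen-diameter time series. The sketch below (SciPy; synthetic signal, and a contraction definition that is only an assumption, not the prototype's algorithm) counts constrictions whose depth exceeds a minimum amplitude.

```python
import numpy as np
from scipy.signal import find_peaks

def contractions_per_minute(diameter_mm, fs_hz, min_amplitude_mm=1.0):
    """Count bowel contractions in a lumen-diameter time series.

    A contraction is taken here as a local minimum of the diameter whose
    prominence exceeds min_amplitude_mm (an illustrative definition)."""
    # find_peaks detects maxima, so invert the signal to find constrictions
    peaks, _ = find_peaks(-np.asarray(diameter_mm), prominence=min_amplitude_mm)
    duration_min = len(diameter_mm) / fs_hz / 60.0
    return len(peaks) / duration_min

# Synthetic 60 s series sampled at 1 Hz: ~4 contractions/min around a 17 mm lumen
t = np.arange(0, 60, 1.0)
diameter = 17.0 + 3.0 * np.sin(2 * np.pi * (4.0 / 60.0) * t)
print(f"{contractions_per_minute(diameter, fs_hz=1.0):.1f} contractions/min")
```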

  10. SCALE system cross-section validation for criticality safety analysis

    International Nuclear Information System (INIS)

    Hathout, A.M.; Westfall, R.M.; Dodds, H.L. Jr.

    1980-01-01

    The purpose of this study is to test selected data from three cross-section libraries for use in the criticality safety analysis of UO2 fuel rod lattices. The libraries, which are distributed with the SCALE system, are used to analyze potential criticality problems which could arise in the industrial fuel cycle for PWR and BWR reactors. Fuel lattice criticality problems could occur in pool storage, dry storage with accidental moderation, shearing and dissolution of irradiated elements, and in fuel transport and storage due to inadequate packing and shipping cask design. The data were tested by using the SCALE system to analyze 25 recently performed critical experiments

  11. In-vessel source term analysis code TRACER version 2.3. User's manual

    International Nuclear Information System (INIS)

    Toyohara, Daisuke; Ohno, Shuji; Hamada, Hirotsugu; Miyahara, Shinya

    2005-01-01

    A computer code TRACER (Transport Phenomena of Radionuclides for Accident Consequence Evaluation of Reactor) version 2.3 has been developed to evaluate the species and quantities of fission products (FPs) released into the cover gas during a fuel pin failure accident in an LMFBR. TRACER version 2.3 includes the following new or modified models: a) Booth model, a new model for FP release from fuel; b) a modified model for FP transfer from fuel to bubbles or sodium coolant; c) a modified model for bubble dynamics in the coolant. The computational models, input data and output data of TRACER version 2.3 are described in this user's manual. (author)

  12. A comparison between the conventional manual ROI method and an automatic algorithm for semiquantitative analysis of SPECT studies

    International Nuclear Information System (INIS)

    Pagan, L; Novi, B; Guidarelli, G; Tranfaglia, C; Galli, S; Lucchi, G; Fagioli, G

    2011-01-01

    In this study, the performance of a free software package for automatic segmentation of striatal SPECT brain studies (BasGanV2 - www.aimn.it) was compared with a standard manual region of interest (ROI) method. The anthropomorphic Alderson RSD phantom, filled with solutions at different concentrations of 123I-FP-CIT with caudate-putamen to background ratios between 1 and 8.7 and caudate to putamen ratios between 1 and 2, was imaged on a Philips Irix triple-head gamma camera. Images were reconstructed using filtered back-projection and processed with both BasGanV2, which provides normalized striatal uptake values on volumetric anatomical ROIs, and a manual method based on average counts per voxel in ROIs drawn on a three-slice section. Caudate-putamen/background and caudate/putamen ratios obtained with the two methods were compared with the true experimental ratios. Good correlation was found for each method, with BasGanV2 giving the higher correlation (mean R = 0.95 vs. 0.89 for the manual method). BasGanV2 is therefore well suited to the semiquantitative analysis of 123I-FP-CIT SPECT data and has, moreover, the advantage of the availability of a control subjects' database.

  13. The Relationship between Mechanical Hyperalgesia Assessed by Manual Tender Point Examination and Disease Severity in Patients with Chronic Widespread Pain: A Cross-Sectional Study

    DEFF Research Database (Denmark)

    Amris, Kirstine; Wæhrens, Eva Ejlersen; Jespersen, Anders

    2014-01-01

    The clinical utility of tender point (TP) examination in patients reporting chronic widespread pain (CWP) is the subject of contemporary debate. The objective of this study was to assess the relationship between mechanical hyperalgesia assessed by manual TP examination and clinical disease severity...

  14. Exercise, Manual Therapy, and Booster Sessions in Knee Osteoarthritis: Cost-Effectiveness Analysis From a Multicenter Randomized Controlled Trial.

    Science.gov (United States)

    Bove, Allyn M; Smith, Kenneth J; Bise, Christopher G; Fritz, Julie M; Childs, John; Brennan, Gerard P; Abbott, J Haxby; Fitzgerald, G Kelley

    2018-01-01

    Limited information exists regarding the cost-effectiveness of rehabilitation strategies for individuals with knee osteoarthritis (OA). The study objective was to compare the cost-effectiveness of 4 different combinations of exercise, manual therapy, and booster sessions for individuals with knee OA. This economic evaluation involved a cost-effectiveness analysis performed alongside a multicenter randomized controlled trial. The study took place in Pittsburgh, Pennsylvania; Salt Lake City, Utah; and San Antonio, Texas. The study participants were 300 individuals taking part in a randomized controlled trial investigating various physical therapy strategies for knee OA. Participants were randomized into 4 treatment groups: exercise only (EX), exercise plus booster sessions (EX+B), exercise plus manual therapy (EX+MT), and exercise plus manual therapy and booster sessions (EX+MT+B). For the 2-year base case scenario, a Markov model was constructed using the United States societal perspective and a 3% discount rate for costs and quality-adjusted life years (QALYs). Incremental cost-effectiveness ratios were calculated to compare differences in cost per QALY gained among the 4 treatment strategies. In the 2-year analysis, booster strategies (EX+MT+B and EX+B) dominated no-booster strategies, with both lower health care costs and greater effectiveness. EX+MT+B had the lowest total health care costs. EX+B cost $1061 more and gained 0.082 more QALYs than EX+MT+B, for an incremental cost-effectiveness ratio of $12,900/QALY gained. The small number of total knee arthroplasty surgeries received by individuals in this study made the assessment of whether any particular strategy was more successful at delaying or preventing surgery in individuals with knee OA difficult. Spacing exercise-based physical therapy sessions over 12 months using periodic booster sessions was less costly and more effective over 2 years than strategies not containing booster sessions for
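
    The dominance check and the incremental cost-effectiveness ratio (ICER) quoted above follow from simple arithmetic on per-strategy costs and QALYs. In the sketch below the absolute costs and QALYs are placeholders chosen only so that the increments match the reported figures ($1061 more, 0.082 more QALYs, roughly $12,900 per QALY); they are not the trial's actual values.

```python
# Placeholder cost-effectiveness inputs in the style described above.
strategies = {
    "EX":      {"cost": 10500.0, "qaly": 1.300},
    "EX+B":    {"cost":  9800.0, "qaly": 1.420},
    "EX+MT":   {"cost": 10200.0, "qaly": 1.330},
    "EX+MT+B": {"cost":  8739.0, "qaly": 1.338},
}

def icer(a, b):
    """Incremental cost-effectiveness ratio of strategy a versus b ($/QALY)."""
    d_cost = strategies[a]["cost"] - strategies[b]["cost"]
    d_qaly = strategies[a]["qaly"] - strategies[b]["qaly"]
    return d_cost / d_qaly

# A strategy is dominated if another strategy is both cheaper and more effective.
for name, s in strategies.items():
    dominated = any(o["cost"] < s["cost"] and o["qaly"] > s["qaly"]
                    for other, o in strategies.items() if other != name)
    print(f"{name:8s} cost=${s['cost']:8.0f}  QALY={s['qaly']:.3f}  dominated={dominated}")

print(f"ICER of EX+B vs EX+MT+B: ${icer('EX+B', 'EX+MT+B'):,.0f}/QALY gained")
```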

  15. REFLA-1D/MODE 1: a computer program for reflood thermo-hydrodynamic analysis during PWR-LOCA user's manual

    International Nuclear Information System (INIS)

    Murao, Yoshio; Sugimoto, Jun; Okubo, Tsutomu

    1981-01-01

    This manual describes the REFLA-1D/MODE 1 reflood system analysis code. This code can solve the core thermo-hydrodynamics under forced flooding conditions and gravity feed conditions in a system similar to FLECHT-SET phase A. This manual describes the REFLA-1D/MODE 1 models and provides application information required to utilize REFLA-1D/MODE 1. (author)

  16. Idaho Safety Manual.

    Science.gov (United States)

    Idaho State Dept. of Education, Boise. Div. of Vocational Education.

    This manual is intended to help teachers, administrators, and local school boards develop and institute effective safety education as a part of all vocational instruction in the public schools of Idaho. This guide is organized in 13 sections that cover the following topics: introduction to safety education, legislation, levels of responsibility,…

  17. Functional Assessment Inventory Manual.

    Science.gov (United States)

    Crewe, Nancy M.; Athelstan, Gary T.

    This manual, which provides extensive new instructions for administering the Functional Assessment Inventory (FAI), is intended to enable counselors to begin using the inventory without undergoing any special training. The first two sections deal with the need for functional assessment and issues in the development and use of the inventory. The…

  18. NDS EXFOR Manual

    International Nuclear Information System (INIS)

    Lemmel, H.D.

    1979-06-01

    This manual contains the coding rules and formats and NDS internal compilation rules for the exchange format (EXFOR) for the transmission of nuclear reaction data between national and international nuclear data centres, and for the data storage and retrieval system of the IAEA Nuclear Data Section

  19. The new reform in tunisia: the democratic challenge of the manuals analysis and teaching

    Directory of Open Access Journals (Sweden)

    Maria Lucenti

    2017-06-01

    Tunisia is affected by many changes, among which the reform of the education system plays a crucial role. The Ministry of Education, together with the trade unions (UGTT) and the Arab Institute of Human Rights, is tracing the outlines of the reform through an experiment in participatory democracy, synthesized in the preparation of a national report, which is analyzed here. The reform aims to change the current educational system profoundly: programs, manuals, the training of teachers and teaching will all be included. Faced with the current economic situation and the problems affecting the country, from religious extremism to the economic crisis, the solutions proposed by the various players involved are examined. After a brief historical reconstruction of the reforms that have affected the Tunisian school system, the current reform is discussed, focusing especially on school textbooks and on the image of Europe that they carry. Given the openness to the other promoted in the manuals, how can one explain the spread of religious fanaticism among certain groups of young people? How can the school consolidate the emerging democracy? These are the questions the paper attempts to answer, leaving open the possibility of different interpretations.

  20. The Navruz Project: Transboundary Monitoring for Radionuclides and Metals in Central Asia Rivers. Sampling and Analysis Plan and Operational Manual

    International Nuclear Information System (INIS)

    Passell, Howard D.; Barber, David S.; Betsill, J. David; Littlfield, Adriane C.; Mohagheghi, Amir H.; Shanks, Sonoya T.; Yuldashev, Bekhzad; Salikhbaev, Umar; Radyuk, Raisa; Djuraev, Akram; Djuraev, Amwar; Vasilev, Ivan; Tolongutov, Bajgabyl; Valentina, Alekhina; Solodukhin, Vladimir; Pozniak, Victor

    2002-01-01

    The transboundary nature of water resources demands a transboundary approach to their monitoring and management. However, transboundary water projects raise a challenging set of problems related to communication issues, and standardization of sampling, analysis and data management methods. This manual addresses those challenges and provides the information and guidance needed to perform the Navruz Project, a cooperative, transboundary, river monitoring project involving rivers and institutions in Kazakhstan, Kyrgyzstan, Tajikistan, and Uzbekistan facilitated by Sandia National Laboratories in the U.S. The Navruz Project focuses on waterborne radionuclides and metals because of their importance to public health and nuclear materials proliferation concerns in the region. This manual provides guidelines for participants on sample and data collection, field equipment operations and procedures, sample handling, laboratory analysis, and data management. Also included are descriptions of rivers, sampling sites and parameters on which data are collected. Data obtained in this project are shared among all participating countries and the public through an internet web site, and are available for use in further studies and in regional transboundary water resource management efforts. Overall, the project addresses three main goals: to help increase capabilities in Central Asian nations for sustainable water resources management; to provide a scientific basis for supporting nuclear transparency and non-proliferation in the region; and to help reduce the threat of conflict in Central Asia over water resources, proliferation concerns, or other factors.

  1. The Navruz Project: Transboundary Monitoring for Radionuclides and Metals in Central Asia Rivers. Sampling and Analysis Plan and Operational Manual

    Energy Technology Data Exchange (ETDEWEB)

    Passell, Howard D.; Barber, David S.; Betsill, J. David; Littlfield, Adriane C.; Mohagheghi, Amir H.; Shanks, Sonoya T.; Yuldashev, Bekhzad; Salikhbaev, Umar; Radyuk, Raisa; Djuraev, Akram; Djuraev, Amwar; Vasilev, Ivan; Tolongutov, Bajgabyl; Valentina, Alekhina; Solodukhin, Vladimir; Pozniak, Victor

    2002-04-02

    The transboundary nature of water resources demands a transboundary approach to their monitoring and management. However, transboundary water projects raise a challenging set of problems related to communication issues, and standardization of sampling, analysis and data management methods. This manual addresses those challenges and provides the information and guidance needed to perform the Navruz Project, a cooperative, transboundary, river monitoring project involving rivers and institutions in Kazakhstan, Kyrgyzstan, Tajikistan, and Uzbekistan facilitated by Sandia National Laboratories in the U.S. The Navruz Project focuses on waterborne radionuclides and metals because of their importance to public health and nuclear materials proliferation concerns in the region. This manual provides guidelines for participants on sample and data collection, field equipment operations and procedures, sample handling, laboratory analysis, and data management. Also included are descriptions of rivers, sampling sites and parameters on which data are collected. Data obtained in this project are shared among all participating countries and the public through an internet web site, and are available for use in further studies and in regional transboundary water resource management efforts. Overall, the project addresses three main goals: to help increase capabilities in Central Asian nations for sustainable water resources management; to provide a scientific basis for supporting nuclear transparency and non-proliferation in the region; and to help reduce the threat of conflict in Central Asia over water resources, proliferation concerns, or other factors.

  2. Formulation and Analysis of the Quantum Radar Cross Section

    Science.gov (United States)

    Brandsema, Matthew J.

    In radar, the amount of radiation that an object returns to the receiver after being struck by an electromagnetic wave is characterized by the radar cross section, typically denoted by sigma. Many mechanisms affect how much radiation is reflected back in the receiver direction, such as reflectivity, physical contours and dimensions, attenuation properties of the materials, projected cross-sectional area, and so on. All of these characteristics are lumped together in the single value sigma, which has units of m². Stealth aircraft, for example, are designed to minimize their radar cross section and return the smallest amount of radiation possible in the receiver direction. A new concept, quantum radar, has been introduced that uses correlated quantum states of photons, as well as the unique properties of quantum mechanics, to ascertain information about a target at a distance. At the time of writing this dissertation, quantum radar is very much in its infancy. There still exist fundamental questions about the feasibility of its implementation, especially in the microwave spectrum. What has been theoretically determined, however, is that quantum radar has a fundamental advantage over classical radar in terms of resolution and returns in certain regimes. Analogous to the classical radar cross section (CRCS), the concept of the quantum radar cross section (QRCS) has been introduced. This quantity describes how an object looks to a quantum radar by characterizing how a single photon, or a small cluster of photons, scatters off a macroscopic target. Preliminary simulations of the basic quantum radar cross section equation have yielded promising results showing an advantage in sidelobe response in comparison to the classical RCS. This document expands upon this idea by providing insight as to where this advantage originates, developing more rigorous simulation analysis, and greatly expanding upon the theory. The expanded theory presented

  3. MARS CODE MANUAL VOLUME III - Programmer's Manual

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Hwang, Moon Kyu; Jeong, Jae Jun; Kim, Kyung Doo; Bae, Sung Won; Lee, Young Jin; Lee, Won Jae

    2010-02-01

    The Korea Atomic Energy Research Institute (KAERI) conceived and started the development of the MARS code with the main objective of producing a state-of-the-art, realistic thermal-hydraulic systems analysis code with multi-dimensional analysis capability. MARS achieves this objective by very tightly integrating the one-dimensional RELAP5/MOD3 with the multi-dimensional COBRA-TF code. The method of integration of the two codes is based on dynamic link library techniques, and the system pressure equation matrices of both codes are implicitly integrated and solved simultaneously. In addition, the equation of state (EOS) for light water was unified by replacing the EOS of COBRA-TF with that of RELAP5. This programmer's manual provides a complete overview of the code structure and the input/output functions of MARS. In addition, brief descriptions of each subroutine and of the major variables used in MARS are included in this report, so that it will be very useful for code maintenance. The overall structure of the manual is modeled on that of the RELAP5 manual, and as such the layout is very similar to that of RELAP5. This similitude to RELAP5 input is intentional, as this input scheme allows minimum modification between the inputs of RELAP5 and MARS3.1. The MARS3.1 development team would like to express its appreciation to the RELAP5 Development Team and the USNRC for making this manual possible

  4. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with
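
    A core computation that a Level 1 PRA tool of this kind automates is the quantification of minimal cut sets of a fault tree. The sketch below is a generic toy example (invented basic events and probabilities, with the standard minimal-cut-set upper bound); it is not SAPHIRE's algorithm or data.

```python
# Minimal cut set quantification of a toy fault tree.
basic_events = {"pump_A_fails": 3e-3, "pump_B_fails": 3e-3,
                "valve_fails": 1e-4, "operator_error": 5e-3}

# Each minimal cut set is a combination of basic events whose joint failure
# causes the top event (e.g., loss of coolant injection).
cut_sets = [("pump_A_fails", "pump_B_fails"),
            ("valve_fails",),
            ("pump_A_fails", "operator_error")]

def cut_set_prob(cs):
    p = 1.0
    for ev in cs:
        p *= basic_events[ev]
    return p

# Minimal-cut-set upper bound on the top-event probability
p_top = 1.0
for cs in cut_sets:
    p_top *= (1.0 - cut_set_prob(cs))
p_top = 1.0 - p_top

for cs in cut_sets:
    print(f"{' * '.join(cs):35s} {cut_set_prob(cs):.2e}")
print(f"top event probability (upper bound): {p_top:.2e}")
```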

  5. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with

  6. Training manual on the analysis of microsatellite repeats in human DNA for diagnostic applications

    International Nuclear Information System (INIS)

    Ioannou, P.

    1998-01-01

    The recent discovery that simple sequence length polymorphisms (SSLPs), or microsatellites, are highly polymorphic has provided a rich source of genetic markers for the development of high-resolution maps. SSLPs are ideal markers because they are widely distributed throughout eukaryotic genomes and can be efficiently analyzed using the polymerase chain reaction (PCR). The recent development of moderate-resolution maps of both human and mouse genomes built entirely with SSLPs reflects the rapid conversion from manual Southern blot-based markers to semi-automated PCR-amplified markers during the last few years. Furthermore, these markers can also be used as 'sequence-tagged sites' (STS) in physical maps and provide a direct connection between the genetic and physical maps of eukaryotic chromosomes
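
    Once a microsatellite locus has been amplified, genotyping reduces to measuring the repeat length of each allele. The sketch below (hypothetical sequences and motif) shows only the counting step; in practice allele sizes are read from electrophoresis of the PCR products rather than from sequence strings.

```python
import re

def longest_repeat(seq, motif="CA"):
    """Length (in repeat units) of the longest uninterrupted run of `motif`."""
    runs = re.findall(f"(?:{motif})+", seq.upper())
    return max((len(r) // len(motif) for r in runs), default=0)

# Hypothetical alleles of one (CA)n microsatellite locus in two samples
allele_1 = "GATTA" + "CA" * 9 + "GGT"   # (CA)9 within flanking sequence
allele_2 = "GATTA" + "CA" * 7 + "GGT"   # (CA)7
print(longest_repeat(allele_1), longest_repeat(allele_2))
```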

  7. Program user's manual: cryogen system for the analysis for the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    1979-04-01

    The Mirror Fusion Test Facility being designed and constructed at the Lawrence Livermore Laboratory requires a liquid helium liquefaction, storage, distribution, and recovery system and a liquid nitrogen storage and distribution system. To provide a powerful analytical tool to aid in the design evolution of this system through hardware, a thermodynamic fluid flow model was developed. This model allows the Lawrence Livermore Laboratory to verify that the design meets the desired goals and to play 'what if' games during the design evolution; for example, if the helium flow rate in the magnet liquid-helium flow loop is changed, how does this affect the temperature, fluid quality, and pressure? This manual provides all the information required to run all or portions of the program as desired. In addition, the program is constructed in a modular fashion so that changes or modifications can be made easily to keep up with the evolving design

  8. Biosafety Manual

    Energy Technology Data Exchange (ETDEWEB)

    King, Bruce W.

    2010-05-18

    Work with or potential exposure to biological materials in the course of performing research or other work activities at Lawrence Berkeley National Laboratory (LBNL) must be conducted in a safe, ethical, environmentally sound, and compliant manner. Work must be conducted in accordance with established biosafety standards, the principles and functions of Integrated Safety Management (ISM), this Biosafety Manual, Chapter 26 (Biosafety) of the Health and Safety Manual (PUB-3000), and applicable standards and LBNL policies. The purpose of the Biosafety Program is to protect workers, the public, agriculture, and the environment from exposure to biological agents or materials that may cause disease or other detrimental effects in humans, animals, or plants. This manual provides workers; line management; Environment, Health, and Safety (EH&S) Division staff; Institutional Biosafety Committee (IBC) members; and others with a comprehensive overview of biosafety principles, requirements from biosafety standards, and measures needed to control biological risks in work activities and facilities at LBNL.

  9. CALENDF-2010: user manual

    International Nuclear Information System (INIS)

    Sublet, Jean-Christophe; Ribon, Pierre; Coste-Delclaux, Mireille

    2011-09-01

    CALENDF-2010 is a Fortran-95 update of the 1994, 2001 and 2005 code distributions, with emphasis on programming quality and standards as well as physics and usage improvements. Devised to process multigroup cross sections, it relies on the mathematical principles and strength of Gauss quadrature. The following processes can be handled by the code: moment probability table and effective cross-section calculation; pointwise cross-section, probability table and effective cross-section regrouping; probability table condensation; probability table mixing for several isotopes; probability table interpolation; effective cross-section based probability table calculations; probability table calculations from effective cross-sections; cross-section comparison; complete energy pointwise cross-section processing; and thickness-dependent averaged transmission sample calculation. The CALENDF user manual, after listing all principal code functions, describes each of them sequentially and comments on their associated output streams. Installation procedures, test cases and running-time platform comparisons are given in the appendix. (authors)
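
    The Gauss-quadrature idea underlying the code can be illustrated with a flux-weighted effective cross section over one energy group, evaluated by Gauss-Legendre quadrature. This is a deliberately simplified sketch (a made-up resonance and a 1/E weighting), not CALENDF's probability-table machinery.

```python
import numpy as np

def effective_xs(sigma, weight, e_lo, e_hi, n_points=64):
    """Flux-weighted effective cross section over one energy group,
    evaluated with Gauss-Legendre quadrature.

    sigma(E), weight(E): callables; e_lo, e_hi: group boundaries (eV)."""
    x, w = np.polynomial.legendre.leggauss(n_points)
    e = 0.5 * (e_hi - e_lo) * x + 0.5 * (e_hi + e_lo)   # map nodes to [e_lo, e_hi]
    # The interval Jacobian cancels in the numerator/denominator ratio.
    return np.sum(w * sigma(e) * weight(e)) / np.sum(w * weight(e))

# Toy example: a single Lorentzian-shaped resonance averaged with a 1/E weight
sigma = lambda e: 10.0 + 5000.0 / (1.0 + ((e - 6.7) / 0.05) ** 2)
weight = lambda e: 1.0 / e
print(f"effective sigma over 6-8 eV: {effective_xs(sigma, weight, 6.0, 8.0):.1f} b")
```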

  10. MSCT follow-up in malignant lymphoma. Comparison of manual linear measurements with semi-automated lymph node analysis for therapy response classification

    International Nuclear Information System (INIS)

    Wessling, J.; Puesken, M.; Kohlhase, N.; Persigehl, T.; Mesters, R.; Heindel, W.; Buerke, B.; Koch, R.

    2012-01-01

    Purpose: Assessment of semi-automated lymph node analysis compared to manual measurements for therapy response classification of malignant lymphoma in MSCT. Materials and Methods: MSCT scans of 63 malignant lymphoma patients before and after 2 cycles of chemotherapy (307 target lymph nodes) were evaluated. The long axis diameter (LAD), short axis diameter (SAD) and bi-dimensional WHO were determined manually and semi-automatically. The time for manual and semi-automatic segmentation was evaluated. The reference standard response was defined as the mean relative change across all manual and semi-automatic measurements (mean manual/semi-automatic LAD, SAD, semi-automatic volume). Statistical analysis encompassed the t-test and McNemar's test for clustered data. Results: Response classification per lymph node revealed semi-automated volumetry and bi-dimensional WHO to be significantly more accurate than manual linear metric measurements. Response classification per patient based on RECIST revealed more patients to be correctly classified by semi-automatic measurements, e.g. 96.0 %/92.9 % (WHO bi-dimensional/volume) compared to 85.7 %/84.1 % for manual LAD and SAD, respectively (mean reduction in misclassified patients of 9.95 %). Considering the use of correction tools, the time expenditure for lymph node segmentation (29.7 ± 17.4 sec) was the same as with the manual approach (29.1 ± 14.5 sec). Conclusion: Semi-automatically derived 'lymph node volume' and 'bi-dimensional WHO' significantly reduce the number of misclassified patients in the CT follow-up of malignant lymphoma by at least 10 %. However, lymph node volumetry does not outperform bi-dimensional WHO. (orig.)

  11. MSCT follow-up in malignant lymphoma. Comparison of manual linear measurements with semi-automated lymph node analysis for therapy response classification

    Energy Technology Data Exchange (ETDEWEB)

    Wessling, J.; Puesken, M.; Kohlhase, N.; Persigehl, T.; Mesters, R.; Heindel, W.; Buerke, B. [Muenster Univ. (Germany). Dept. of Clinical Radiology; Koch, R. [Muenster Univ. (Germany). Inst. of Biostatistics and Clinical Research

    2012-09-15

    Purpose: Assessment of semi-automated lymph node analysis compared to manual measurements for therapy response classification of malignant lymphoma in MSCT. Materials and Methods: MSCT scans of 63 malignant lymphoma patients before and after 2 cycles of chemotherapy (307 target lymph nodes) were evaluated. The long axis diameter (LAD), short axis diameter (SAD) and bi-dimensional WHO were determined manually and semi-automatically. The time for manual and semi-automatic segmentation was evaluated. The reference standard response was defined as the mean relative change across all manual and semi-automatic measurements (mean manual/semi-automatic LAD, SAD, semi-automatic volume). Statistical analysis encompassed the t-test and McNemar's test for clustered data. Results: Response classification per lymph node revealed semi-automated volumetry and bi-dimensional WHO to be significantly more accurate than manual linear metric measurements. Response classification per patient based on RECIST revealed more patients to be correctly classified by semi-automatic measurements, e.g. 96.0 %/92.9 % (WHO bi-dimensional/volume) compared to 85.7 %/84.1 % for manual LAD and SAD, respectively (mean reduction in misclassified patients of 9.95 %). Considering the use of correction tools, the time expenditure for lymph node segmentation (29.7 ± 17.4 sec) was the same as with the manual approach (29.1 ± 14.5 sec). Conclusion: Semi-automatically derived 'lymph node volume' and 'bi-dimensional WHO' significantly reduce the number of misclassified patients in the CT follow-up of malignant lymphoma by at least 10 %. However, lymph node volumetry does not outperform bi-dimensional WHO. (orig.)
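
    The per-patient response classification referred to above reduces, in essence, to applying percentage-change thresholds to the summed target-lesion measurements. The sketch below uses generic RECIST-style thresholds (-30% for partial response, +20% for progressive disease) and invented measurements; it is not the study's evaluation protocol, but it shows how small measurement differences between methods can flip the response category.

```python
def classify_response(baseline_sum_mm, followup_sum_mm):
    """Simplified RECIST-style response call from summed target-lesion
    measurements before and after therapy (generic sketch only)."""
    change = (followup_sum_mm - baseline_sum_mm) / baseline_sum_mm
    if followup_sum_mm == 0:
        return "CR", change          # complete response
    if change <= -0.30:
        return "PR", change          # partial response
    if change >= 0.20:
        return "PD", change          # progressive disease
    return "SD", change              # stable disease

# Hypothetical patient: manual vs. semi-automatic measurements can disagree
manual = classify_response(baseline_sum_mm=96.0, followup_sum_mm=70.0)     # about -27%
semiauto = classify_response(baseline_sum_mm=98.0, followup_sum_mm=66.0)   # about -33%
print(f"manual: {manual[0]} ({manual[1]:+.0%}), semi-automatic: {semiauto[0]} ({semiauto[1]:+.0%})")
```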

  12. High order effects in cross section sensitivity analysis

    International Nuclear Information System (INIS)

    Greenspan, E.; Karni, Y.; Gilai, D.

    1978-01-01

    Two types of high order effects associated with perturbations in the flux shape are considered: Spectral Fine Structure Effects (SFSE) and non-linearity between changes in performance parameters and data uncertainties. SFSE are investigated in Part I using a simple single-resonance model. Results obtained for each of the resolved and for representative unresolved resonances of 238U in a ZPR-6/7-like environment indicate that SFSE can have a significant contribution to the sensitivity of group constants to resonance parameters. Methods to account for SFSE, both for the propagation of uncertainties and for the adjustment of nuclear data, are discussed. A Second Order Sensitivity Theory (SOST) is presented, and its accuracy relative to that of the first order sensitivity theory and of the direct substitution method is investigated in Part II. The investigation is done for the non-linear problem of the effect of changes in the 297 keV sodium minimum cross section on the transport of neutrons in a deep-penetration problem. It is found that the SOST provides a satisfactory accuracy for cross section uncertainty analysis. For the same degree of accuracy, the SOST can be significantly more efficient than the direct substitution method
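
    The nonlinearity issue in deep-penetration problems can be seen with a one-parameter toy model, uncollided transmission R(Σ) = exp(-Σt). The sketch below (arbitrary slab thickness and cross section) compares exact response changes with first- and second-order Taylor estimates; it illustrates the general idea only and is unrelated to the specific sodium-minimum problem analyzed in the paper.

```python
import numpy as np

# Toy deep-penetration response: uncollided transmission through a slab.
t = 20.0              # slab thickness (cm)
sigma0 = 0.5          # reference total cross section (1/cm)
R = lambda s: np.exp(-s * t)

dR = -t * R(sigma0)            # analytic first derivative  dR/dsigma
d2R = t**2 * R(sigma0)         # analytic second derivative d2R/dsigma2

for rel_change in (0.01, 0.05, 0.10):
    ds = rel_change * sigma0
    exact = R(sigma0 + ds) - R(sigma0)
    first = dR * ds
    second = first + 0.5 * d2R * ds**2
    print(f"d_sigma/sigma = {rel_change:4.0%}: exact {exact:+.3e}, "
          f"1st-order {first:+.3e}, 2nd-order {second:+.3e}")
```

    For small perturbations all three agree; as the perturbation grows, the first-order estimate degrades quickly while the second-order estimate tracks the exact change longer, which is the motivation for a second-order theory in strongly non-linear responses.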

  13. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corporation, Tokyo (Japan)

    2013-10-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed, as an extended version from the former version FEMAXI-6, for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published in the form of another JAEA-Data/Code report. The present manual, which is the very counterpart of this description document, gives detailed explanations of files and operation method of FEMAXI-7 code and its related codes, methods of input/output, sample Input/Output, methods of source code modification, subroutine structure, and internal variables in a specific manner in order to facilitate users to perform fuel analysis by FEMAXI-7. (author)

  14. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2013-10-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed, as an extended version from the former version FEMAXI-6, for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published in the form of another JAEA-Data/Code report. The present manual, which is the very counterpart of this description document, gives detailed explanations of files and operation method of FEMAXI-7 code and its related codes, methods of input/output, sample Input/Output, methods of source code modification, subroutine structure, and internal variables in a specific manner in order to facilitate users to perform fuel analysis by FEMAXI-7. (author)

  15. Multivariate survivorship analysis using two cross-sectional samples.

    Science.gov (United States)

    Hill, M E

    1999-11-01

    As an alternative to survival analysis with longitudinal data, I introduce a method that can be applied when one observes the same cohort in two cross-sectional samples collected at different points in time. The method allows for the estimation of log-probability survivorship models that estimate the influence of multiple time-invariant factors on survival over a time interval separating two samples. This approach can be used whenever the survival process can be adequately conceptualized as an irreversible single-decrement process (e.g., mortality, the transition to first marriage among a cohort of never-married individuals). Using data from the Integrated Public Use Microdata Series (Ruggles and Sobek 1997), I illustrate the multivariate method through an investigation of the effects of race, parity, and educational attainment on the survival of older women in the United States.
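
    In its simplest single-covariate form, the idea is that the cohort's survival probability over the interval is the ratio of the (weighted) cohort counts in the later sample to those in the earlier sample, with covariate effects expressed as differences on the log-probability scale. The sketch below uses invented counts; the paper's multivariate models regress log survival on several time-invariant covariates simultaneously.

```python
import numpy as np

# Two cross-sectional samples of the same birth cohort, ten years apart.
# Counts are hypothetical stand-ins for weighted census counts.
counts_t1 = {"<HS": 5200, "HS": 6100, "College+": 2900}   # earlier sample
counts_t2 = {"<HS": 3900, "HS": 5000, "College+": 2550}   # later sample

print("group      S(10yr)   log S    difference vs <HS")
base = None
for group in counts_t1:
    # interval survivorship = cohort size in the later sample / earlier sample
    s = counts_t2[group] / counts_t1[group]
    log_s = np.log(s)
    if base is None:
        base = log_s
    print(f"{group:9s}  {s:6.3f}  {log_s:+7.3f}   {log_s - base:+7.3f}")
```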

  16. Basic analysis on the load management in consumer section

    Energy Technology Data Exchange (ETDEWEB)

    Tezuka, Tetsuo; Nishikawa, Eiichi

    1988-05-01

    Load management of energy (electric power, gas and oil products) in the consumer sector means moving demand characteristics in desirable directions. The demand characteristics are represented by the energy consumption profile over time and its annual sum. Load management is analyzed here from a practical point of view. Because total thermal demand is fixed to some extent from the standpoint of the overall system, trade-offs occur among the objectives of different industries; model analysis is effective for maintaining quantitative consistency. Changes in consumer attitudes have been observed, as indicated by cogeneration, heat storage technology and automatic energy management by consumers. Techniques for changing the demand characteristics include pricing systems, financial aid for equipment installation, favorable tax provisions, legal revision and marketing. Stable supply and improved consumption are the future tasks. (2 figs, 6 tabs, 28 refs)

  17. EML procedures manual

    International Nuclear Information System (INIS)

    Volchok, H.L.; de Planque, G.

    1982-01-01

    This manual contains the procedures that are used currently by the Environmental Measurements Laboratory of the US Department of Energy. In addition a number of analytical methods from other laboratories have been included. These were tested for reliability at the Battelle, Pacific Northwest Laboratory under contract with the Division of Biomedical and Environmental Research of the AEC. These methods are clearly distinguished. The manual is prepared in loose leaf form to facilitate revision of the procedures and inclusion of additional procedures or data sheets. Anyone receiving the manual through EML should receive this additional material automatically. The contents are as follows: (1) general; (2) sampling; (3) field measurements; (4) general analytical chemistry; (5) chemical procedures; (6) data section; (7) specifications

  18. The Effectiveness of Manual Therapy for Relieving Pain, Stiffness, and Dysfunction in Knee Osteoarthritis: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Xu, Qinguang; Chen, Bei; Wang, Yueyi; Wang, Xuezong; Han, Dapeng; Ding, Daofang; Zheng, Yuxin; Cao, Yuelong; Zhan, Hongsheng; Zhou, Yao

    2017-05-01

    Knee osteoarthritis (KOA) is the most common form of arthritis, leading to pain and disability in seniors and increased health care utilization. Manual therapy is one widely used physical treatment for KOA. To evaluate the effectiveness and adverse events (AEs) of manual therapy compared to other treatments for relieving pain, stiffness, and physical dysfunction in patients with KOA. A systematic review and meta-analysis of manual therapy for KOA. We searched PubMed, EMBASE, the Cochrane Library, and Chinese databases for relevant randomized controlled trials (RCTs) of manual therapy for patients with KOA from inception to October 2015, without language restrictions. RCTs compared manual therapy to placebo or another interventional control with an appropriate description of randomization. Two reviewers independently conducted search-result identification, data extraction, and methodological quality assessment. Methodological quality was assessed with the PEDro scale. Pooled data were expressed as standardized mean differences (SMD) with 95% confidence intervals (CIs) in a random-effects model. Meta-analyses of manual therapy for KOA on pain, stiffness, and physical function were conducted. Fourteen studies involving 841 KOA participants compared to other treatments were included. The methodological quality of most included RCTs was poor; the mean PEDro scale score was 6.6. The meta-analysis results showed that manual therapy had statistically significant effects on relieving pain (SMD = -0.61, 95% CI -0.95 to -0.28; I² = 76%), stiffness (SMD = -0.58, 95% CI -0.95 to -0.21; I² = 81%), improving physical function (SMD = -0.49, 95% CI -0.76 to -0.22; I² = 65%), and total score (SMD = -0.56, 95% CI -0.78 to -0.35; I² = 50%). In the subgroups, however, manual therapy did not show significant improvements in stiffness and physical function when treatment duration was less than 4 weeks. And the long-term information for manual therapy was
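
    The random-effects pooling of standardized mean differences reported above can be reproduced with the DerSimonian-Laird estimator. The per-study effects and variances in the sketch below are invented, not the review's extracted data.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling of per-study effect sizes (e.g., SMDs)
    with the DerSimonian-Laird estimate of between-study variance tau^2."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                  # fixed-effect weights
    fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - fixed) ** 2)             # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                # between-study variance
    w_re = 1.0 / (v + tau2)                      # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Hypothetical per-study SMDs for pain and their variances
smd = [-0.45, -0.90, -0.30, -0.75, -0.55]
var = [0.040, 0.055, 0.030, 0.060, 0.045]
est, ci, i2 = dersimonian_laird(smd, var)
print(f"pooled SMD = {est:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f}), I^2 = {i2:.0f}%")
```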

  19. Cross section homogenization analysis for a simplified Candu reactor

    International Nuclear Information System (INIS)

    Pounders, Justin; Rahnema, Farzad; Mosher, Scott; Serghiuta, Dumitru; Turinsky, Paul; Sarsour, Hisham

    2008-01-01

    The effect of using zero current (infinite medium) boundary conditions to generate bundle homogenized cross sections for a stylized half-core Candu reactor problem is examined. Homogenized cross sections from infinite-medium lattice calculations are compared with cross sections homogenized using the exact flux from the reference core environment. The impact of these cross section differences is quantified by generating nodal diffusion theory solutions with both sets of cross sections. It is shown that the infinite-medium spatial approximation is not negligible, and that ignoring the impact of the heterogeneous core environment on cross section homogenization leads to increased errors, particularly near control elements and the core periphery. (authors)
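
    For orientation, the homogenization being tested is a flux-volume weighting of region cross sections; the infinite-medium approximation simply replaces the reference core flux with the lattice flux. A minimal sketch (the region volumes, fluxes, and cross sections below are illustrative placeholders, not data from the paper):

```python
import numpy as np

def homogenize(sigma, flux, volume):
    """Flux-volume-weighted homogenized cross section:
       sigma_hom = sum_i(sigma_i * phi_i * V_i) / sum_i(phi_i * V_i)."""
    sigma, flux, volume = map(np.asarray, (sigma, flux, volume))
    return np.sum(sigma * flux * volume) / np.sum(flux * volume)

# Hypothetical three-region bundle (fuel, clad, coolant), one energy group
sigma  = np.array([0.30, 0.05, 0.01])    # macroscopic cross sections (1/cm)
volume = np.array([10.0, 2.0, 20.0])     # region volumes (cm^3)
phi_inf  = np.array([1.00, 0.95, 0.90])  # infinite-medium lattice flux (arbitrary units)
phi_core = np.array([0.80, 0.92, 1.05])  # flux from the reference core calculation

print(homogenize(sigma, phi_inf, volume), homogenize(sigma, phi_core, volume))
```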

  20. Comparison between manual and automated analysis for the quantification of carotid wall by using sonography. A validation study with CT

    International Nuclear Information System (INIS)

    Saba, Luca; Montisci, Roberto; Molinari, Filippo; Tallapally, Niranjan; Zeng, Guang; Mallarini, Giorgio; Suri, Jasjit S.

    2012-01-01

    Purpose: The purpose of this paper was to compare manual and automated analysis for the quantification of the carotid wall obtained with sonography, using computed tomography as the validation technique. Material and methods: 21 consecutive patients underwent MDCTA and ultrasound analysis of the carotid arteries (mean age 68 years; age range 59–81 years). The intima-media thickness (IMT) of the 42 carotids was measured with a novel, dedicated automated software analysis (called AtheroEdge™, Biomedical Technologies, Denver, CO, USA) and by four observers who manually calculated the IMT. The carotid artery wall thickness (CAWT) was also quantified in the CT datasets. Bland–Altman statistics were employed to measure the agreement between methods. A Student's t-test was used to test the differences between the IMT values of AtheroEdge™ and those of the human experts. The study obtained IRB approval. Results: The correlations between the automated AtheroEdge™ measurements and those of the human experts were equal to 95.5%, 73.5%, 88.9%, and 81.7%. The IMT coefficient of variation of the human experts was equal to 11.9%. By using a Student's t-test, the differences between the IMT values of AtheroEdge™ and those of the human experts were not found to be statistically significant (p value = 0.02). On comparing AtheroEdge™ (using ultrasound) with CAWT (using CT), the results suggested a very good concordance of 84.96%. Conclusions: The data of this preliminary study indicate that the automated software AtheroEdge™ can measure the IMT of the carotid arteries with precision and that the concordance with CT is optimal.
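
    The Bland–Altman agreement quoted above reduces to the mean paired difference (bias) and its 95% limits of agreement. A minimal sketch with invented IMT values (not the study data):

```python
import numpy as np

def bland_altman(a, b):
    """Return bias and 95% limits of agreement between two paired measurement sets."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired IMT measurements (mm): automated vs. manual
auto   = [0.82, 0.91, 1.05, 0.77, 0.98]
manual = [0.80, 0.95, 1.00, 0.75, 1.02]
print(bland_altman(auto, manual))
```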

  1. SHARP User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Y. Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Shemon, E. R. [Argonne National Lab. (ANL), Argonne, IL (United States); Thomas, J. W. [Argonne National Lab. (ANL), Argonne, IL (United States); Mahadevan, Vijay S. [Argonne National Lab. (ANL), Argonne, IL (United States); Rahaman, Ronald O. [Argonne National Lab. (ANL), Argonne, IL (United States); Solberg, Jerome [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-03-31

    SHARP is an advanced modeling and simulation toolkit for the analysis of nuclear reactors. It comprises several components, including physical modeling tools, tools to integrate the physics codes for multi-physics analyses, and a set of tools to couple the codes within the MOAB framework. Physics modules currently include the neutronics code PROTEUS, the thermal-hydraulics code Nek5000, and the structural mechanics code Diablo. This manual focuses on performing multi-physics calculations with the SHARP ToolKit. Manuals for the three individual physics modules are available with the SHARP distribution to help the user either carry out the primary multi-physics calculation with basic knowledge or perform further advanced development with in-depth knowledge of these codes. This manual provides step-by-step instructions on employing SHARP, including how to download and install the code, how to build the drivers for a test case, how to perform a calculation, and how to visualize the results. Since SHARP has some specific library and environment dependencies, it is highly recommended that the user read this manual prior to installing SHARP. Verification test cases are included to check proper installation of each module. New users should first follow the step-by-step instructions provided for a test problem in this manual to understand the basic procedure of using SHARP before applying it to their own analyses. Both reference output and scripts are provided along with the test cases in order to verify correct installation and execution of the SHARP package. At the end of this manual, detailed instructions are provided on how to create a new test case so that users can perform novel multi-physics calculations with SHARP. Frequently asked questions are listed at the end of this manual to help the user troubleshoot issues.

  2. SHARP User Manual

    International Nuclear Information System (INIS)

    Yu, Y. Q.; Shemon, E. R.; Thomas, J. W.; Mahadevan, Vijay S.; Rahaman, Ronald O.; Solberg, Jerome

    2016-01-01

    SHARP is an advanced modeling and simulation toolkit for the analysis of nuclear reactors. It comprises several components, including physical modeling tools, tools to integrate the physics codes for multi-physics analyses, and a set of tools to couple the codes within the MOAB framework. Physics modules currently include the neutronics code PROTEUS, the thermal-hydraulics code Nek5000, and the structural mechanics code Diablo. This manual focuses on performing multi-physics calculations with the SHARP ToolKit. Manuals for the three individual physics modules are available with the SHARP distribution to help the user either carry out the primary multi-physics calculation with basic knowledge or perform further advanced development with in-depth knowledge of these codes. This manual provides step-by-step instructions on employing SHARP, including how to download and install the code, how to build the drivers for a test case, how to perform a calculation, and how to visualize the results. Since SHARP has some specific library and environment dependencies, it is highly recommended that the user read this manual prior to installing SHARP. Verification test cases are included to check proper installation of each module. New users should first follow the step-by-step instructions provided for a test problem in this manual to understand the basic procedure of using SHARP before applying it to their own analyses. Both reference output and scripts are provided along with the test cases in order to verify correct installation and execution of the SHARP package. At the end of this manual, detailed instructions are provided on how to create a new test case so that users can perform novel multi-physics calculations with SHARP. Frequently asked questions are listed at the end of this manual to help the user troubleshoot issues.

  3. Modulation of electroencephalograph activity by manual acupuncture stimulation in healthy subjects: An autoregressive spectral analysis

    International Nuclear Information System (INIS)

    Yi Guo-Sheng; Wang Jiang; Deng Bin; Wei Xi-Le; Han Chun-Xiao

    2013-01-01

    To investigate whether and how manual acupuncture (MA) modulates brain activities, we design an experiment where acupuncture at acupoint ST36 of the right leg is used to obtain electroencephalograph (EEG) signals in healthy subjects. We adopt the autoregressive (AR) Burg method to estimate the power spectrum of EEG signals and analyze the relative powers in delta (0 Hz–4 Hz), theta (4 Hz–8 Hz), alpha (8 Hz–13 Hz), and beta (13 Hz–30 Hz) bands. Our results show that MA at ST36 can significantly increase the EEG slow wave relative power (delta band) and reduce the fast wave relative powers (alpha and beta bands), while there are no statistical differences in theta band relative power between different acupuncture states. In order to quantify the ratio of slow to fast wave EEG activity, we compute the power ratio index. It is found that the MA can significantly increase the power ratio index, especially in frontal and central lobes. All the results highlight the modulation of brain activities with MA and may provide potential help for the clinical use of acupuncture. The proposed quantitative method of acupuncture signals may be further used to make MA more standardized. (interdisciplinary physics and related areas of science and technology)
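
    As an illustration of the processing chain described (not the authors' code; the synthetic signal, AR model order, and the power-ratio definition used here are assumptions), Burg's method gives an AR spectrum from which relative band powers can be integrated:

```python
import numpy as np

def arburg(x, order):
    """Estimate AR coefficients and driving-noise power with Burg's method."""
    x = np.asarray(x, float)
    n = x.size
    ef, eb = x.copy(), x.copy()           # forward / backward prediction errors
    a = np.array([1.0])
    e = np.dot(x, x) / n                  # prediction error power
    for m in range(order):
        efp, ebp = ef[m + 1:], eb[m:-1]
        k = -2.0 * np.dot(efp, ebp) / (np.dot(efp, efp) + np.dot(ebp, ebp))
        a = np.concatenate([a, [0.0]])
        a = a + k * a[::-1]               # Levinson recursion update
        ef[m + 1:], eb[m + 1:] = efp + k * ebp, ebp + k * efp
        e *= 1.0 - k * k
    return a, e

def ar_psd(a, e, fs, nfreq=512):
    """AR power spectral density evaluated on a frequency grid up to fs/2."""
    freqs = np.linspace(0.0, fs / 2.0, nfreq)
    z = np.exp(-2j * np.pi * np.outer(freqs / fs, np.arange(a.size)))
    return freqs, (e / fs) / np.abs(z @ a) ** 2

def band_power(freqs, psd, lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])

fs = 256.0                                # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 2 * t) + 0.4 * np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)

a, e = arburg(eeg - eeg.mean(), order=10)
freqs, psd = ar_psd(a, e, fs)
bands = {"delta": (0, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
total = band_power(freqs, psd, 0, 30)
rel = {name: band_power(freqs, psd, lo, hi) / total for name, (lo, hi) in bands.items()}
ratio = (rel["delta"] + rel["theta"]) / (rel["alpha"] + rel["beta"])  # assumed power-ratio index
print(rel, ratio)
```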

  4. A user input manual for single fuel rod behaviour analysis code FEMAXI-III

    International Nuclear Information System (INIS)

    Saito, Hiroaki; Yanagisawa, Kazuaki; Fujita, Misao.

    1983-03-01

    The principal objectives of safety-related research on light water reactor fuel rods under normal operating conditions are 1) to assess fuel integrity under steady-state conditions and 2) to generate the initial conditions for hypothetical accidents. These assessments rely principally on a steady-state fuel behaviour computing code that can calculate the fuel conditions that occur in various circumstances. To achieve these objectives, efforts have been made to develop an analytical computer code that calculates in-reactor fuel rod behaviour in a best-estimate manner. The computer code developed for the prediction of the long-term burnup response of a single fuel rod under light water reactor conditions is the third in a series of code versions: FEMAXI-III. The code calculates temperature, rod internal gas pressure, fission gas release, and pellet-cladding interaction related rod deformation as functions of time-dependent fuel rod power and coolant boundary conditions. This document serves as a user input manual for the FEMAXI-III code, which was opened to the public in 1982. A general description of the code input and output is included together with typical examples of input data. A detailed description of the structures, analytical submodels, and solution schemes in the code will be given in a separate document to be published. (author)

  5. Salinas: theory manual.

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, Timothy Francis; Reese, Garth M.; Bhardwaj, Manoj Kumar

    2011-11-01

    Salinas provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of structural systems. This manual describes the theory behind many of the constructs in Salinas. For a more detailed description of how to use Salinas, we refer the reader to Salinas, User's Notes. Many of the constructs in Salinas are pulled directly from published material. Where possible, these materials are referenced herein. However, certain functions in Salinas are specific to our implementation. We try to be far more complete in those areas. The theory manual was developed from several sources including general notes, a programmer notes manual, the user's notes and of course the material in the open literature.

  6. Regulations and Procedures Manual

    Energy Technology Data Exchange (ETDEWEB)

    Young, Lydia J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2011-07-25

    The purpose of the Regulations and Procedures Manual (RPM) is to provide LBNL personnel with a reference to University and Lawrence Berkeley National Laboratory (LBNL or Laboratory) policies and regulations by outlining normal practices and answering most policy questions that arise in the day-to-day operations of Laboratory organizations. Much of the information in this manual has been condensed from detail provided in LBNL procedure manuals, Department of Energy (DOE) directives, and Contract DE-AC02-05CH11231. This manual is not intended, however, to replace any of those documents. RPM sections on personnel apply only to employees who are not represented by unions. Personnel policies pertaining to employees represented by unions may be found in their labor agreements. Questions concerning policy interpretation should be directed to the LBNL organization responsible for the particular policy. A link to the Managers Responsible for RPM Sections is available on the RPM home page. If it is not clear which organization is responsible for a policy, please contact Requirements Manager Lydia Young or the RPM Editor.

  7. Regulations and Procedures Manual

    Energy Technology Data Exchange (ETDEWEB)

    Young, Lydia [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2010-09-30

    The purpose of the Regulations and Procedures Manual (RPM) is to provide Laboratory personnel with a reference to University and Lawrence Berkeley National Laboratory policies and regulations by outlining the normal practices and answering most policy questions that arise in the day-to-day operations of Laboratory departments. Much of the information in this manual has been condensed from detail provided in Laboratory procedure manuals, Department of Energy (DOE) directives, and Contract DE-AC02-05CH11231. This manual is not intended, however, to replace any of those documents. The sections on personnel apply only to employees who are not represented by unions. Personnel policies pertaining to employees represented by unions may be found in their labor agreements. Questions concerning policy interpretation should be directed to the department responsible for the particular policy. A link to the Managers Responsible for RPM Sections is available on the RPM home page. If it is not clear which department should be called, please contact the Associate Laboratory Director of Operations.

  8. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Version 5.0: Data loading manual. Volume 10

    International Nuclear Information System (INIS)

    VanHorn, R.L.; Wolfram, L.M.; Fowler, R.D.; Beck, S.T.; Smith, C.L.

    1995-04-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) suite of programs can be used to organize and standardize, in an electronic format, information from probabilistic risk assessments or individual plant examinations. The Models and Results Database (MAR-D) program of the SAPHIRE suite serves as the repository for probabilistic risk assessment and individual plant examination data and information. This report demonstrates by example the common electronic and manual methods used to load these types of data. It is not a stand-alone document but references documents that contribute information relevant to the data loading process. This document provides a more detailed discussion and instructions for using SAPHIRE 5.0 only when enough information on a specific topic is not provided by another available source.

  9. BEACON/MOD: a computer program for thermal-hydraulic analysis of nuclear reactor containments - user's manual

    International Nuclear Information System (INIS)

    Broadus, C.R.; Doyle, R.J.; James, S.W.; Lime, J.F.; Mings, W.J.

    1980-04-01

    The BEACON code is an advanced containment code designed to perform a best-estimate analysis of the flow of a mixture of air, water, and steam in a nuclear reactor containment system under loss-of-coolant accident conditions. The code can simulate two-component, two-phase fluid flow in complex geometries using a combination of two-dimensional, one-dimensional, and lumped-parameter representations for the various parts of the system. The current version of BEACON, which is designated BEACON/MOD3, contains mass and heat transfer models for wall film and wall conduction. It is suitable for the evaluation of short-term transients in dry-containment systems. This manual describes the models employed in BEACON/MOD3 and specifies code implementation requirements. It provides application information for input data preparation and for output data interpretation.

  10. Light water reactor fuel analysis code FEMAXI-IV(Ver.2). Detailed structure and user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Saitou, Hiroaki

    1997-11-01

    A light water reactor fuel behavior analysis code, FEMAXI-IV(Ver.2), was developed as an improved version of FEMAXI-IV. Development of FEMAXI-IV was completed in 1992, although a detailed description of the code structure and an input manual had not yet been made available to users. Here, the basic theories and structure, the models and numerical solutions applied to FEMAXI-IV(Ver.2), and the material properties adopted in the code are described in detail. In FEMAXI-IV(Ver.2), programming bugs in the previous FEMAXI-IV were eliminated, the pellet thermal conductivity was updated, and a model of thermal-stress restraint on FP gas release was incorporated. To facilitate effective and wide-ranging application of the code, the input/output methods of the code are also described in detail, and sample output is included. (author)

  11. Neutron cross section libraries for analysis of fusion neutronics experiments

    International Nuclear Information System (INIS)

    Kosako, Kazuaki; Oyama, Yukio; Maekawa, Hiroshi; Nakamura, Tomoo

    1988-03-01

    We have prepared two computer code systems that produce neutron cross section libraries for the analysis of fusion neutronics experiments. The first system produces the neutron cross section library in ANISN format, i.e., multi-group constants in group-independent format. This library is obtained by using the multi-group constant processing code system MACS-N and the ANISN-format cross section compiling code CROKAS. The second system produces the continuous-energy cross section library for the MCNP code. This library is obtained with the nuclear data processing system NJOY, which generates pointwise energy cross sections, and the cross section compiling code MACROS for the MCNP library. In this report, we describe the production procedures for both types of cross section libraries and present six libraries with different conditions in ANISN format and one library for the MCNP code. (author)

  12. Design, analysis and modeling of a novel hybrid powertrain system based on hybridized automated manual transmission

    Science.gov (United States)

    Wu, Guang; Dong, Zuomin

    2017-09-01

    Hybrid electric vehicles are widely accepted as a promising short- to mid-term technical solution due to noticeably improved efficiency and lower emissions at competitive costs. In recent years, various hybrid powertrain systems have been proposed and implemented based on different types of conventional transmission. Power-split systems, including the Toyota Hybrid System and the Ford Hybrid System, are well-known examples. However, their relatively low torque capacity, and the drive for alternative and more advanced designs, encouraged other innovative hybrid system designs. In this work, a new type of hybrid powertrain system based on a hybridized automated manual transmission (HAMT) is proposed. By using the concept of a torque gap filler (TGF), this new hybrid powertrain type has the potential to overcome the issue of the torque gap during gearshift. The HAMT design (patent pending) is described in detail, from gear layout and design of gear ratios (EV mode and HEV mode) to torque paths at different gears. As an analytical tool, a multi-body model of a vehicle equipped with this HAMT was built to analyze powertrain dynamics in various steady and transient modes. A gearshift was decomposed and analyzed based on basic modes. Furthermore, a Simulink-SimDriveline hybrid vehicle model was built for the new transmission, driveline, and vehicle modules. A control strategy has also been built to harmonically coordinate the different powertrain components to realize the TGF function. A vehicle launch simulation test has been completed at 30% of accelerator pedal position to reveal details during gearshift. Simulation results showed that this HAMT can eliminate most of the torque gap that has been a persistent issue of the traditional AMT, improving both drivability and performance. This work demonstrated a new type of transmission that features high torque capacity, high efficiency, and improved drivability.

  13. Analysis of a Hybrid Wing Body Center Section Test Article

    Science.gov (United States)

    Wu, Hsi-Yung T.; Shaw, Peter; Przekop, Adam

    2013-01-01

    The hybrid wing body center section test article is an all-composite structure made of crown, floor, keel, bulkhead, and rib panels utilizing the Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) design concept. The primary goal of this test article is to prove that PRSEUS components are capable of carrying combined loads that are representative of a hybrid wing body pressure cabin design regime. This paper summarizes the analytical approach, analysis results, and failure predictions of the test article. A global finite element model of composite panels, metallic fittings, mechanical fasteners, and the Combined Loads Test System (COLTS) test fixture was used to conduct linear structural strength and stability analyses to validate the specimen under the most critical combination of bending and pressure loading conditions found in the hybrid wing body pressure cabin. Local detail analyses were also performed at locations with high stress concentrations, at Tee-cap noodle interfaces with surrounding laminates, and at fastener locations with high bearing/bypass loads. Failure predictions for different composite and metallic failure modes were made, and nonlinear analyses were also performed to study the structural response of the test article under combined bending and pressure loading. This large-scale specimen test will be conducted at the COLTS facility at the NASA Langley Research Center.

  14. Users manual for the FORSS sensitivity and uncertainty analysis code system

    International Nuclear Information System (INIS)

    Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.

    1981-01-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology

  15. Users manual for the FORSS sensitivity and uncertainty analysis code system

    Energy Technology Data Exchange (ETDEWEB)

    Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.

    1981-01-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology.

  16. Metrication manual

    International Nuclear Information System (INIS)

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

    In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual

  17. A manual of analytical methods used at MINTEK

    International Nuclear Information System (INIS)

    Stoch, H.; Dixon, K.

    1983-01-01

    The manual deals with various methods for a wide range of elemental analyses. The methods used include atomic absorption spectroscopy, optical emission spectroscopy, and X-ray fluorescence spectroscopy. The basic characteristics of each method are given and the procedures are recorded step by step. One of the sections deals with methods associated with the recovery of uranium.

  18. WheelerLab: An interactive program for sequence stratigraphic analysis of seismic sections, outcrops and well sections and the generation of chronostratigraphic sections and dynamic chronostratigraphic sections

    OpenAIRE

    Adewale Amosu; Yuefeng Sun

    2017-01-01

    WheelerLab is an interactive program that facilitates the interpretation of stratigraphic data (seismic sections, outcrop data and well sections) within a sequence stratigraphic framework and the subsequent transformation of the data into the chronostratigraphic domain. The transformation enables the identification of significant geological features, particularly erosional and non-depositional features that are not obvious in the original seismic domain. Although there are some software produ...

  19. Effects of exercise and manual therapy on pain associated with hip osteoarthritis: a systematic review and meta-analysis.

    Science.gov (United States)

    Beumer, Lucy; Wong, Jennie; Warden, Stuart J; Kemp, Joanne L; Foster, Paul; Crossley, Kay M

    2016-04-01

    To explore the effects of exercise (water-based or land-based) and/or manual therapies on pain in adults with clinically and/or radiographically diagnosed hip osteoarthritis (OA), a systematic review and meta-analysis was performed, with patient-reported pain assessed using a visual analogue scale (VAS) or the Western Ontario and McMaster Universities Arthritis Index (WOMAC) pain subscale. Data were grouped by follow-up time (0-3 months = short term; 4-12 months = medium term; >12 months = long term), and standardised mean differences (SMD) with 95% CIs were used to establish intervention effect sizes. Study quality was assessed using modified PEDro scores. 19 trials were included. Four studies showed short-term benefits favouring water-based exercise over minimal control using the WOMAC pain subscale (SMD -0.53, 95% CI -0.96 to -0.10). Six studies supported a short-term benefit of land-based exercise compared to minimal control on VAS-assessed pain (SMD -0.49, 95% CI -0.70 to -0.29). There were no medium- (SMD -0.23, 95% CI -0.48 to 0.03) or long-term (SMD -0.22, 95% CI -0.51 to 0.06) benefits of exercise therapy, or benefit of combining exercise therapy with manual therapy (SMD -0.38, 95% CI -0.88 to 0.13), when compared to minimal control. The best available evidence indicates that exercise therapy (whether land-based or water-based) is more effective than minimal control in managing pain associated with hip OA in the short term. Larger high-quality RCTs are needed to establish the effectiveness of exercise and manual therapies in the medium and long term.
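
    The effect sizes above are standardised mean differences. As a minimal sketch of how a single trial's SMD (Hedges' g) and its 95% CI are obtained from group summary statistics (the numbers are invented, not taken from any included trial):

```python
import numpy as np

def smd_hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardised mean difference (Hedges' g) with an approximate 95% CI."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))  # pooled SD
    d = (m1 - m2) / sp
    j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)          # small-sample correction factor
    g = j * d
    se = np.sqrt((n1 + n2) / (n1 * n2) + g**2 / (2.0 * (n1 + n2)))
    return g, (g - 1.96 * se, g + 1.96 * se)

# Hypothetical pain scores (lower = better): exercise group vs. minimal control
print(smd_hedges_g(m1=38.0, sd1=12.0, n1=45, m2=45.0, sd2=14.0, n2=44))
```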

  20. CANAL user's manual

    International Nuclear Information System (INIS)

    Faya, A.; Wolf, L.; Todreas, N.

    1979-11-01

    CANAL is a subchannel computer program for the steady-state and transient thermal-hydraulic analysis of BWR fuel rod bundles. The purpose of this manual is to introduce the user to the mechanics of running the code by providing information about the input data and options.

  1. Laboratory biosafety manual

    Energy Technology Data Exchange (ETDEWEB)

    1983-01-01

    This book is in three sections: basic standards of laboratory design and equipment; procedures for safe laboratory practice; and the selection and use of essential biosafety equipment. The intention is that the guidance given in the book should have a broad basis and international application, and that it should be a source from which manuals applicable to local and special conditions can be usefully derived.

  2. PLANS; a finite element program for nonlinear analysis of structures. Volume 2: User's manual

    Science.gov (United States)

    Pifko, A.; Armen, H., Jr.; Levy, A.; Levine, H.

    1977-01-01

    The PLANS system, rather than being one comprehensive computer program, is a collection of finite element programs used for the nonlinear analysis of structures. This collection of programs evolved and is based on the organizational philosophy in which classes of analyses are treated individually based on the physical problem class to be analyzed. Each of the independent finite element computer programs of PLANS, with an associated element library, can be individually loaded and used to solve the problem class of interest. A number of programs have been developed for material nonlinear behavior alone and for combined geometric and material nonlinear behavior. The usage, capabilities, and element libraries of the current programs include: (1) plastic analysis of built-up structures where bending and membrane effects are significant, (2) three dimensional elastic-plastic analysis, (3) plastic analysis of bodies of revolution, and (4) material and geometric nonlinear analysis of built-up structures.

  3. CONPAS 1.0 (CONtainment Performance Analysis System). User's manual

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Kwang Il; Jin, Young Ho [Korea Atomic Energy Research Institute, Daeduk (Korea, Republic of)

    1996-04-01

    CONPAS (CONtainment Performance Analysis System) is a verified computer code package that integrates the numerical, graphical, and results-operation aspects of Level 2 probabilistic safety assessments (PSA) for nuclear power plants automatically under a PC Windows environment. Compared with the existing DOS-based computer codes for Level 2 PSA, the most important merit of the Windows-based computer code is that users can easily describe and quantify the accident progression models and manipulate the resultant outputs in a variety of ways. As the main logic for accident progression analysis, CONPAS employs the concept of a small containment phenomenological event tree (CPET), helpful for visually tracing out individual accident progressions, and of a large supporting event tree (LSET) for their detailed quantification. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor for construction of several event tree models describing the accident progressions, (2) Computer for quantification of the constructed event trees and graphical display of the resultant outputs, (3) Text Editor for preparation of input decks for quantification and utilization of calculational results, and (4) Mechanistic Code Plotter for utilization of results obtained from severe accident analysis codes. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty analysis, importance analysis, sensitivity analysis, and data interpretation; reporting aspects including tables and graphics; and a user-friendly interface. 10 refs. (Author)

  4. Battery Test Manual For 48 Volt Mild Hybrid Electric Vehicles

    International Nuclear Information System (INIS)

    Walker, Lee Kenneth

    2017-01-01

    This manual details the U.S. Advanced Battery Consortium and U.S. Department of Energy Vehicle Technologies Program goals, test methods, and analysis techniques for a 48 Volt Mild Hybrid Electric Vehicle system. The test methods are outlined starting with characterization tests, followed by life tests. The final section details standardized analysis techniques for 48 V systems that allow for the comparison of different programs that use this manual. An example test plan is included, along with guidance on filling in gap table numbers.

  5. Battery Test Manual For 48 Volt Mild Hybrid Electric Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Lee Kenneth [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-03-01

    This manual details the U.S. Advanced Battery Consortium and U.S. Department of Energy Vehicle Technologies Program goals, test methods, and analysis techniques for a 48 Volt Mild Hybrid Electric Vehicle system. The test methods are outlined starting with characterization tests, followed by life tests. The final section details standardized analysis techniques for 48 V systems that allow for the comparison of different programs that use this manual. An example test plan is included, along with guidance on filling in gap table numbers.

  6. Comparison of characteristics of femtosecond laser-assisted anterior capsulotomy versus manual continuous curvilinear capsulorrhexis: A meta-analysis of 5-year results.

    Science.gov (United States)

    Ali, Muhammad Hassaan; Ullah, Samee; Javaid, Usman; Javaid, Mamoona; Jamal, Samreen; Butt, Nadeem Hafeez

    2017-10-01

    To perform a meta-analysis on the precision and safety of femtosecond laser-assisted anterior capsulotomy versus conventional manual continuous curvilinear capsulorrhexis. This meta-analysis was conducted from February 2010 to November 2014. A literature search on PubMed, Google Scholar, the Excerpta Medica database, and the Cochrane Library was done to identify randomised controlled trials and case-control studies. SPSS 20 was used for data analysis. Of the 10 articles included, 3 (30%) were randomised controlled trials and 7 (70%) were non-randomised controlled trials. The meta-analysis was based on a total of 2,882 eyes. Of them, 1,498 (51.97%) underwent femtosecond laser-assisted capsulotomy and 1,384 (48.02%) underwent manual continuous curvilinear capsulorrhexis. The diameter of the capsulotomy and the rates of anterior capsule tear showed no statistical difference between the femtosecond laser group and the manual capsulorrhexis group (p=0.29 and p=0.68). In terms of circularity of the capsulotomy, the femtosecond laser group had a significant advantage over the manual continuous curvilinear capsulorrhexis group.

  7. SPSS survival manual a step by step guide to data analysis using SPSS

    CERN Document Server

    Pallant, Julie

    2010-01-01

    In this thoroughly revised edition of her bestselling text, now covering up to version 18 of the SPSS software, Julie Pallant guides you through the entire research process, helping you choose the right data analysis technique for your project.

  8. May Day: A computer code to perform uncertainty and sensitivity analysis. Manuals

    International Nuclear Information System (INIS)

    Bolado, R.; Alonso, A.; Moya, J.M.

    1996-07-01

    The computer program May Day was developed by the Polytechnic University of Madrid to carry out uncertainty and sensitivity analysis in the evaluation of radioactive waste storage. (Author)

  9. The root cause of ability and inability to assemble and install components using written manual with or without diagrams among non-native English speakers: Root cause analysis

    Science.gov (United States)

    Shukri, S. Ahmad; Millar, R.; Gratton, G.; Garner, M.; Noh, H. Mohd

    2017-12-01

    Documentation errors and human errors are often claimed to be contributory factors in aircraft maintenance mistakes. This paper highlights the preliminary results of the third phase of a four-phase research project on the communication media utilised in an aircraft maintenance organisation. The second phase looked into the probability of success and failure of 60 subjects in completing a task, while in this third phase the same subjects were interviewed immediately after completing the task using the Root Cause Analysis (RCA) method. It was discovered that the root cause of their inability to finish the task while using only the written manual is the absence of diagrams. However, haste is identified as the root cause for non-completion of the task when both the manual and the diagram are given to the participants. Those who were able to complete the task did so by referring to both the manual and the diagram simultaneously.

  10. Pushover Analysis of Steel Seismic Resistant Frames with Reduced Web Section and Reduced Beam Section Connections

    Directory of Open Access Journals (Sweden)

    Daniel Tomas Naughton

    2017-10-01

    The widespread brittle failure of welded beam-to-column connections caused by the 1994 Northridge and 1995 Kobe earthquakes highlighted the need for retrofitting measures effective in reducing the strength demand imposed on connections under cyclic loading. Researchers presented the reduced beam section (RBS) as a viable option to create a weak zone away from the connection, aiding the prevention of brittle failure at the connection weld. More recently, an alternative connection known as a reduced web section (RWS) has been developed as a potential replacement, and initial studies show ideal performance in terms of rotational capacity and ductility. This study performs a series of non-linear static pushover analyses using a modal load case on three steel moment-resisting frames of 4, 8, and 16 storeys. The frames are studied with three different connection arrangements: fully fixed moment connections, RBSs, and RWSs, in order to compare the differences in capacity curves, inter-storey drifts, and plastic hinge formation. The seismic-resistant connections have been modeled as non-linear hinges in ETABS, and their behavior has been defined by moment-rotation curves presented in recent research studies. The frames are displacement controlled to the maximum displacement anticipated in an earthquake with ground motions having a 2% probability of being exceeded in 50 years. The study concludes that RWSs perform satisfactorily when compared with frames with fully fixed moment connections in terms of providing consistent inter-storey drifts, without drastic changes in drift between adjacent storeys in low- to mid-rise frames and without significantly compromising the overall strength capacity of the frames. The use of RWSs in taller frames causes an increase in inter-storey drifts in the lower storeys, as well as a large reduction in strength capacity (33%). Frames with RWSs behave comparably to frames with RBSs and are deemed a suitable

  11. TMAP/Mod 1: Tritium Migration Analysis Program code description and user's manual

    International Nuclear Information System (INIS)

    Merrill, B.J.; Jones, J.L.; Holland, D.F.

    1986-01-01

    The Tritium Migration Analysis Program (TMAP) has been developed by the Fusion Safety Program of EG and G Idaho, Inc., at the Idaho National Engineering Laboratory (INEL) as a safety analysis code to analyze tritium loss from fusion systems during normal operation and under accident conditions. TMAP is a one-dimensional code that determines tritium movement and inventories in a system of interconnected enclosures and wall structures. In addition, the thermal response of structures is modeled to provide temperature information required for calculations of tritium movement. The program is written in FORTRAN 4 and has been implemented on the National Magnetic Fusion Energy Computing Center (NMFECC)

  12. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0. Volume 5, Systems Analysis and Risk Assessment (SARA) tutorial manual

    International Nuclear Information System (INIS)

    Sattison, M.B.; Russell, K.D.; Skinner, N.L.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs) primarily for nuclear power plants. This volume is the tutorial manual for the Systems Analysis and Risk Assessment (SARA) System Version 5.0, a microcomputer-based system used to analyze the safety issues of a "family" [i.e., a power plant, a manufacturing facility, any facility on which a probabilistic risk assessment (PRA) might be performed]. A series of lessons is provided that guides the user through some basic steps common to most analyses performed with SARA. The example problems presented in the lessons build on one another, and in combination, lead the user through all aspects of SARA sensitivity analysis capabilities

  13. 14 CFR 121.141 - Airplane flight manual.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Airplane flight manual. 121.141 Section 121... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Manual Requirements § 121.141 Airplane flight manual. (a) Each certificate holder shall keep a current approved airplane flight manual for each type of...

  14. 46 CFR 160.176-21 - User manuals.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 6 2010-10-01 2010-10-01 false User manuals. 160.176-21 Section 160.176-21 Shipping...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Inflatable Lifejackets § 160.176-21 User manuals. (a) The manufacturer must develop a user's manual for each model of inflatable lifejacket. The content of the manual...

  15. IAC user manual

    Science.gov (United States)

    Vos, R. G.; Beste, D. L.; Gregg, J.

    1984-01-01

    The User Manual for the Integrated Analysis Capability (IAC) Level 1 system is presented. The IAC system currently supports the thermal, structures, controls and system dynamics technologies, and its development is influenced by the requirements for design/analysis of large space systems. The system has many features which make it applicable to general problems in engineering, and to management of data and software. Information includes basic IAC operation, executive commands, modules, solution paths, data organization and storage, IAC utilities, and module implementation.

  16. STAR (structural test and analysis database for reliable design) Version 7.1. User's manual

    International Nuclear Information System (INIS)

    Hosogai, Hiromi; Kawasaki, Nobuchika; Kasahara, Naoto

    1998-12-01

    STAR provides, in addition to the usual database management functions, two supporting functions for developing strength evaluation methods: an automatic damage calculation function that works with external programs, and an analysis system for prediction accuracy. This report describes the structure of the STAR code and the user information needed to execute it. (K. Itami)

  17. Light water reactor fuel analysis code. FEMAXI-6 (Ver.1). Detailed structure and user's manual

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Saitou, Hiroaki

    2006-02-01

    The light water reactor fuel analysis code FEMAXI-6 is an advanced version produced by integrating the former version FEMAXI-V with numerous functional improvements and extensions. In particular, the FEMAXI-6 code has attained a completely coupled solution of thermal analysis and mechanical analysis, enabling an accurate prediction of pellet-clad gap size and PCMI in high-burnup fuel rods. Also, new models such as pellet-clad bonding and fission gas bubble swelling have been implemented, and the linkage function with a detailed burning analysis code has been enhanced. Furthermore, a number of new material properties and parameters have been introduced. With these advancements, the FEMAXI-6 code has been upgraded to a versatile analytical tool for high-burnup fuel behavior, not only in normal operation but also in anticipated transient conditions. This report describes in detail the design, basic theory and structure, models and numerical method, improvements and extensions, and method of model modification. In order to facilitate effective and wide-ranging application of the code, the formats and methods of input/output of the code are also described, and a sample output in its actual form is included. (author)

  18. The GRASP 3: Graphical Reliability Analysis Simulation Program. Version 3: A users' manual and modelling guide

    Science.gov (United States)

    Phillips, D. T.; Manseur, B.; Foster, J. W.

    1982-01-01

    Alternate definitions of system failure create complex analyses for which analytic solutions are available only in simple, special cases. The GRASP methodology is a computer simulation approach for solving all classes of problems in which both failure and repair events are modeled according to the probability laws of the individual components of the system.

  19. Langley Stability and Transition Analysis Code (LASTRAC) Version 1.2 User Manual

    Science.gov (United States)

    Chang, Chau-Lyan

    2004-01-01

    LASTRAC is a general-purpose, physics-based transition prediction code released by NASA for Laminar Flow Control studies and transition research. The design and development of the LASTRAC code are aimed at providing an engineering tool that is easy to use and yet capable of dealing with a broad range of transition-related issues. It was written from scratch based on state-of-the-art numerical methods for stability analysis and modern software technologies. At low fidelity, it allows users to perform linear stability analysis and N-factor transition correlation for a broad range of flow regimes and configurations by using either linear stability theory or the linear parabolized stability equations (PSE) method. At high fidelity, users may use nonlinear PSE to track finite-amplitude disturbances until the skin friction rises. This document describes the governing equations, numerical methods, code development, a detailed description of input/output parameters, and case studies for the current release of LASTRAC.

  20. Statistical techniques applied to aerial radiometric surveys (STAARS): principal components analysis user's manual

    International Nuclear Information System (INIS)

    Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.

    1981-01-01

    A Principal Components Analysis (PCA) program has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black-and-white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into the geological processes underlying the data may be obtained
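
    As a minimal sketch of the principal components step such a program performs (plain NumPy, not the STAARS code; the simulated radiometric variables are placeholders), the linear combinations are the eigenvectors of the correlation matrix of the standardised variates:

```python
import numpy as np

def pca(data):
    """Return principal-component scores, loadings, and explained-variance ratios."""
    x = (data - data.mean(axis=0)) / data.std(axis=0, ddof=1)  # standardise each variable
    cov = np.cov(x, rowvar=False)                              # correlation matrix of variates
    eigval, eigvec = np.linalg.eigh(cov)
    order = np.argsort(eigval)[::-1]                           # sort components by variance
    eigval, eigvec = eigval[order], eigvec[:, order]
    scores = x @ eigvec
    return scores, eigvec, eigval / eigval.sum()

# Hypothetical aerial radiometric variables (e.g., K, U, Th count rates per record)
rng = np.random.default_rng(0)
data = rng.normal(size=(200, 3)) @ np.array([[1.0, 0.5, 0.2],
                                             [0.0, 1.0, 0.3],
                                             [0.0, 0.0, 1.0]])
scores, loadings, explained = pca(data)
print(explained)
```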

  1. Modeling of human operator dynamics in simple manual control utilizing time series analysis. [tracking (position)

    Science.gov (United States)

    Agarwal, G. C.; Osafo-Charles, F.; Oneill, W. D.; Gottlieb, G. L.

    1982-01-01

    Time series analysis is applied to model human operator dynamics in pursuit and compensatory tracking modes. The normalized residual criterion is used as a one-step analytical tool to encompass the processes of identification, estimation, and diagnostic checking. A parameter constraining technique is introduced to develop more reliable models of human operator dynamics. The human operator is adequately modeled by a second order dynamic system both in pursuit and compensatory tracking modes. In comparing the data sampling rates, 100 msec between samples is adequate and is shown to provide better results than 200 msec sampling. The residual power spectrum and eigenvalue analysis show that the human operator is not a generator of periodic characteristics.
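
    As a rough sketch of the kind of second-order time-series model referred to above (a generic AR(2) least-squares fit, not the authors' identification procedure; the "operator output" below is synthetic):

```python
import numpy as np

def fit_ar2(y):
    """Least-squares fit of y[t] = a1*y[t-1] + a2*y[t-2] + e[t]; returns (a1, a2) and residuals."""
    y = np.asarray(y, float)
    X = np.column_stack([y[1:-1], y[:-2]])     # lag-1 and lag-2 regressors
    target = y[2:]
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ coef
    return coef, resid

# Synthetic stand-in for sampled operator output (second-order dynamics plus noise)
rng = np.random.default_rng(1)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 1.5 * y[t - 1] - 0.7 * y[t - 2] + rng.normal(scale=0.1)

coef, resid = fit_ar2(y)
print(coef, resid.var())       # recovered AR(2) coefficients and residual variance
```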

  2. FORECAST: Regulatory effects cost analysis software manual -- Version 4.1. Revision 1

    International Nuclear Information System (INIS)

    Lopez, B.; Sciacca, F.W.

    1996-07-01

    The FORECAST program was developed to facilitate the preparation of the value-impact portion of NRC regulatory analyses. This PC program integrates the major cost and benefit considerations that may result from a proposed regulatory change. FORECAST automates many of the calculations typically needed in a regulatory analysis and thus reduces the time and labor required to perform these analyses. More importantly, its integrated and consistent treatment of the different value-impact considerations should help assure comprehensiveness, uniformity, and accuracy in the preparation of NRC regulatory analyses. The current FORECAST Version 4.1 has been upgraded from the previous version and now includes an uncertainty package, an automatic cost escalation package, and other improvements. In addition, it now explicitly addresses public health impacts, occupational health impacts, onsite property damage, and government costs. Thus, FORECAST Version 4.1 can treat all attributes normally quantified in a regulatory analysis

  3. Laboratory manual on sample preparation procedures for x-ray micro-analysis

    International Nuclear Information System (INIS)

    1997-01-01

    X-ray microfluorescence is a non-destructive and sensitive method for studying the microscopic distribution of different elements in almost all kinds of samples. Since the beginning of this century, X-rays and electrons have been used for the analysis of many different kinds of material. Techniques which rely on electrons are mainly developed for microscopic studies, and are used in conventional Electron Microscopy (EM) or Scanning Electron Microscopy (SEM), while X-rays are widely used for chemical analysis at the microscopic level. The first chemical analysis by fluorescence spectroscopy using small X-ray beams was conducted in 1928 by Glockner and Schreiber. Since then much work has been devoted to developing different types of optical systems for focusing an X-ray beam, but the efficiency of these systems is still inferior to that of conventional electron optical systems. However, even with a poor optical efficiency, the X-ray microbeam has many advantages compared with electron- or proton-induced X-ray emission methods. These include: the analyses are non-destructive, losses of mass are negligible, and, due to the low thermal loading of X-rays, materials which may be thermally degraded can be analysed; samples can be analysed in air, so no vacuum is required, and specimens with volatile components, such as water in biological samples, can be imaged at normal pressure and temperature; and no charging occurs during analysis, so coating of the sample with a conductive layer is not necessary. With these advantages, simpler sample preparation procedures, including mounting and preservation, can be used.

  4. AITRAC: Augmented Interactive Transient Radiation Analysis by Computer. User's information manual

    International Nuclear Information System (INIS)

    1977-10-01

    AITRAC is a program designed for on-line, interactive, DC, and transient analysis of electronic circuits. The program solves the linear and nonlinear simultaneous equations which characterize the mathematical models used to predict circuit response. The program features 100-external-node/200-branch capability; a conversational, free-format input language; built-in junction, FET, MOS, and switch models; a sparse matrix algorithm with extended-precision H-matrix and T-vector calculations for fast and accurate execution; linear transconductances: beta, GM, MU, ZM; accurate and fast radiation effects analysis; a special interface for user-defined equations; selective control of multiple outputs; graphical outputs in wide and narrow formats; and on-line parameter modification capability. The user describes the problem by entering the circuit topology and part parameters. The program then automatically generates and solves the circuit equations, providing the user with printed or plotted output. The circuit topology and/or part values may then be changed by the user, and a new analysis requested. Circuit descriptions may be saved on disk files for storage and later use. The program contains built-in standard models for resistors, voltage and current sources, capacitors, inductors including mutual couplings, switches, junction diodes and transistors, FETs, and MOS devices. Nonstandard models may be constructed from standard models or by using the special equations interface. Time functions may be described by straight-line segments or by sine, damped sine, and exponential functions. 42 figures, 1 table
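
    For the linear DC case, the "simultaneous equations" mentioned above are just the nodal equations G·v = i. A minimal sketch of that underlying idea (plain NumPy, not AITRAC's input language; the resistive divider is a made-up example):

```python
import numpy as np

# Hypothetical circuit: 10 V source through R1 = 1 kOhm to node 1, R2 = 2 kOhm from node 1 to ground.
# Treating the source plus R1 as a Norton equivalent, the single nodal equation is
# (1/R1 + 1/R2) * v1 = V/R1.
r1, r2, v_src = 1e3, 2e3, 10.0
G = np.array([[1.0 / r1 + 1.0 / r2]])   # nodal conductance matrix
i = np.array([v_src / r1])              # equivalent current injection
v = np.linalg.solve(G, i)
print(v)                                # expected ~6.67 V at node 1 (voltage divider)
```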

  5. Transportation Routing Analysis Geographic Information System (WebTRAGIS) User's Manual

    International Nuclear Information System (INIS)

    Michelhaugh, R.D.

    2000-01-01

    In the early 1980s, Oak Ridge National Laboratory (ORNL) developed two transportation routing models: HIGHWAY, which predicts truck transportation routes, and INTERLINE, which predicts rail transportation routes. Both of these models have been used by the U.S. Department of Energy (DOE) community for a variety of routing needs over the years. One of the primary uses of the models has been to determine population-density information, which is used as input for risk assessment with the RADTRAN model, which is available on the TRANSNET computer system. During the recent years, advances in the development of geographic information systems (GISs) have resulted in increased demands from the user community for a GIS version of the ORNL routing models. In April 1994, the DOE Transportation Management Division (EM-261) held a Baseline Requirements Assessment Session with transportation routing experts and users of the HIGHWAY and INTERLINE models. As a result of the session, the development of a new GIS routing model, Transportation Routing Analysis GIS (TRAGIS), was initiated. TRAGIS is a user-friendly, GIS-based transportation and analysis computer model. The older HIGHWAY and INTERLINE models are useful to calculate routes, but they cannot display a graphic of the calculated route. Consequently, many users have experienced difficulty determining the proper node for facilities and have been confused by or have misinterpreted the text-based listing from the older routing models. Some of the primary reasons for the development of TRAGIS are (a) to improve the ease of selecting locations for routing, (b) to graphically display the calculated route, and (c) to provide for additional geographic analysis of the route

  6. Partial wave analysis for folded differential cross sections

    Science.gov (United States)

    Machacek, J. R.; McEachran, R. P.

    2018-03-01

    The value of modified effective range theory (MERT) and the connection between differential cross sections and phase shifts in low-energy electron scattering has long been recognized. Recent experimental techniques involving magnetically confined beams have introduced the concept of folded differential cross sections (FDCS) where the forward (θ ≤ π/2) and backward scattered (θ ≥ π/2) projectiles are unresolved, that is the value measured at the angle θ is the sum of the signal for particles scattered into the angles θ and π - θ. We have developed an alternative approach to MERT in order to analyse low-energy folded differential cross sections for positrons and electrons. This results in a simplified expression for the FDCS when it is expressed in terms of partial waves and thereby enables one to extract the first few phase shifts from a fit to an experimental FDCS at low energies. Thus, this method predicts forward and backward angle scattering (0 to π) using only experimental FDCS data and can be used to determine the total elastic cross section solely from experimental results at low-energy, which are limited in angular range.
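
    To make the folding relation concrete (a sketch under the definition stated above, not the authors' fitting code; the wavenumber and phase shifts below are arbitrary), the FDCS is the sum of the partial-wave DCS at θ and π − θ, and the same phase shifts give the total elastic cross section:

```python
import numpy as np
from scipy.special import eval_legendre

def scattering_amplitude(theta, k, deltas):
    """f(theta) = (1/k) * sum_l (2l+1) * exp(i*delta_l) * sin(delta_l) * P_l(cos theta)."""
    f = np.zeros_like(theta, dtype=complex)
    for l, d in enumerate(deltas):
        f += (2 * l + 1) * np.exp(1j * d) * np.sin(d) * eval_legendre(l, np.cos(theta))
    return f / k

def folded_dcs(theta, k, deltas):
    """FDCS(theta) = dsigma/dOmega(theta) + dsigma/dOmega(pi - theta)."""
    return (np.abs(scattering_amplitude(theta, k, deltas)) ** 2
            + np.abs(scattering_amplitude(np.pi - theta, k, deltas)) ** 2)

# Arbitrary low-energy example: wavenumber (atomic units) and s-, p-, d-wave phase shifts (rad)
k = 0.3
deltas = [1.2, 0.25, 0.03]
theta = np.linspace(0.0, np.pi / 2, 10)
sigma_total = 4 * np.pi / k**2 * sum((2 * l + 1) * np.sin(d) ** 2 for l, d in enumerate(deltas))
print(folded_dcs(theta, k, deltas), sigma_total)
```

    Fitting the first few phase shifts to a measured FDCS in this form is what allows the full 0-to-π angular distribution and the total elastic cross section to be reconstructed from folded data alone.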

  7. Cross-sectional dependence in panel data analysis

    NARCIS (Netherlands)

    Sarafidis, V.; Wansbeek, T.J.

    2012-01-01

    This article provides an overview of the existing literature on panel data models with error cross-sectional dependence (CSD). We distinguish between weak and strong CSD and link these concepts to the spatial and factor structure approaches. We consider estimation under strong and weak exogeneity of

  8. Commentary: Mediation Analysis, Causal Process, and Cross-Sectional Data

    Science.gov (United States)

    Shrout, Patrick E.

    2011-01-01

    Maxwell, Cole, and Mitchell (2011) extended the work of Maxwell and Cole (2007), which raised important questions about whether mediation analyses based on cross-sectional data can shed light on longitudinal mediation process. The latest article considers longitudinal processes that can only be partially explained by an intervening variable, and…

  9. Elemental composition of paint cross sections by nuclear microprobe analysis

    International Nuclear Information System (INIS)

    Nens, B.; Trocellier, P.; Engelmann, C.; Lahanier, C.

    1982-09-01

    Physico-chemical characterization of the pigments used in artistic paintings gives valuable indications on the age of paintings and sometimes on the geographical origin of the ores. After recalling the principle of the proton microprobe, first results obtained from the non-destructive microanalysis of impurities in white lead in paint cross sections are given [fr

  10. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Quality Assurance Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; R. Nims; K. J. Kvarfordt; C. Wharton

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Version 6 and 7, what constitutes its parts, and limitations of those processes.

  11. SCAP - a Shaped Charge Analysis Program: user's manual for SCAP 1. 0

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, A.C.

    1985-04-01

    The basic modeling and format for a shaped charge analysis program, SCAP, is described. The code models the motion of liner elements due to explosive loading, jet formation, jet breakup and target penetration through application of a series of analytical approximations. The structure of the code is intended to provide flexibility in shaped charge device and target configurations and in modeling techniques. The code is designed for interactive use and produces both printed and plotted output. Examples of code output are given and compared with experimental data. 19 refs., 13 figs.

  12. CATDAT - A program for parametric and nonparametric categorical data analysis user's manual, Version 1.0

    International Nuclear Information System (INIS)

    Peterson, James R.; Haas, Timothy C.; Lee, Danny C.

    2000-01-01

    Natural resource professionals are increasingly required to develop rigorous statistical models that relate environmental data to categorical response data. Recent advances in the statistical and computing sciences have led to the development of sophisticated methods for parametric and nonparametric analysis of data with categorical responses. The statistical software package CATDAT was designed to make some of these relatively new and powerful techniques available to scientists. The CATDAT statistical package includes 4 analytical techniques: generalized logit modeling; binary classification tree; extended K-nearest neighbor classification; and modular neural network
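
    As a flavour of one of the listed techniques, the sketch below implements a plain K-nearest-neighbor classifier for a categorical response; it is a generic illustration with made-up data, not CATDAT's extended KNN.

```python
import numpy as np

def knn_predict(X_train, y_train, X_new, k=5):
    """Classify each row of X_new by majority vote of its k nearest training points."""
    preds = []
    for x in np.atleast_2d(X_new):
        d = np.linalg.norm(X_train - x, axis=1)        # Euclidean distances
        nearest = y_train[np.argsort(d)[:k]]           # labels of the k closest points
        values, counts = np.unique(nearest, return_counts=True)
        preds.append(values[np.argmax(counts)])        # majority category
    return np.array(preds)

# Hypothetical environmental predictors and a categorical response
X = np.random.rand(100, 3)
y = np.random.choice(["present", "absent"], size=100)
print(knn_predict(X, y, np.random.rand(2, 3), k=7))
```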

  13. NDS EXFOR manual

    International Nuclear Information System (INIS)

    Lemmel, H.D.

    1996-01-01

    EXFOR is the agreed exchange format for the transmission of nuclear reaction data between national and international nuclear data centers for the benefit of nuclear data users in all countries. The IAEA Nuclear Data Section uses the EXFOR system not only for the center-to-center data exchange but also as its data storage and retrieval system. This NDS EXFOR MANUAL therefore contains the agreed EXFOR coding rules and format, supplemented by NDS internal compilation rules. The EXFOR system and the EXFOR nuclear data library with several million data records originate from the cooperation of an increasing number of data centers whose names and addresses can be found inside the Manual. Their contributions and cooperative efforts are gratefully acknowledged. (author)

  14. NDS EXFOR manual

    International Nuclear Information System (INIS)

    Lemmel, H.D.

    1985-08-01

    EXFOR is the agreed exchange format for the transmission of nuclear reaction data between national and international nuclear data centers for the benefit of nuclear data users in all countries. The IAEA Nuclear Data Section uses the EXFOR system not only for the center-to-center data exchange but also as its data storage and retrieval system. This NDS EXFOR MANUAL therefore contains the agreed EXFOR coding rules and format, supplemented by NDS internal compilation rules. The EXFOR system and the EXFOR nuclear data library with several million data records originate from the cooperation of an increasing number of data centers whose names and addresses can be found inside the Manual. Their contributions and cooperative efforts are gratefully acknowledged. (author)

  15. The approach to analysis of significance of flaws in ASME section III and section XI

    International Nuclear Information System (INIS)

    Cowan, A.

    1979-01-01

    ASME III Appendix G and ASME XI Appendix A describe linear elastic fracture mechanics methods to assess the significance of defects in thick-walled pressure vessels for nuclear reactor systems. The assessment of fracture toughness, Ksub(Ic), is based upon recommendations made by a Task Group of the USA Pressure Vessel Research Committee and is dependent upon correlations with drop weight and Charpy V-notch data to give a lower bound of fracture toughness Ksub(IR). The methods used in the ASME Appendices are outlined noting that, whereas ASME III Appendix G defines a procedure for obtaining allowable pressure vessel loadings for normal service in the presence of a defect, ASME XI Appendix A defines methods for assessing the significance of defects (found by volumetric inspection) under normal and emergency and faulted conditions. The methods of analysis are discussed with respect to material properties, flaw characterisation, stress analysis and recommended safety factors; a short discussion is given on the applicability of the data and methods to other materials and non-nuclear structures. (author)

  16. Is automatic CPAP titration as effective as manual CPAP titration in OSAHS patients? A meta-analysis.

    Science.gov (United States)

    Gao, Weijie; Jin, Yinghui; Wang, Yan; Sun, Mei; Chen, Baoyuan; Zhou, Ning; Deng, Yuan

    2012-06-01

    It is costly and time-consuming to conduct the standard manual titration to identify an effective pressure before continuous positive airway pressure (CPAP) treatment for obstructive sleep apnea (OSA) patients. Automatic titration is cheaper and more easily available than manual titration. The purpose of this systematic review was to evaluate the effect of automatic titration in identifying a pressure and on the improvement of the apnea/hypopnea index (AHI) and somnolence, the change of sleep quality, and the acceptance and compliance of CPAP treatment, compared with manual titration. A systematic search was made of the PubMed, EMBASE, Cochrane Library, SCI, China Academic Journals Full-text Databases, Chinese Biomedical Literature Database, Chinese Scientific Journals Databases and Chinese Medical Association Journals. Randomized controlled trials comparing automatic titration and manual titration were reviewed. Studies were pooled to yield odds ratios (OR) or mean differences (MD) with 95% confidence intervals (CI). Ten trials involving 849 patients met the inclusion criteria. It is hard to identify a trend in the pressures determined by either automatic or manual titration. Automatic titration can improve the AHI (MD = 0.03/h, 95% CI = -4.48 to 4.53) and the Epworth sleepiness scale (SMD = -0.02, 95% CI = -0.34 to 0.31) as effectively as manual titration. There is no difference in sleep architecture between automatic titration and manual titration. The acceptance of CPAP treatment (OR = 0.96, 95% CI = 0.60 to 1.55) and the compliance with treatment (MD = -0.04, 95% CI = -0.17 to 0.10) after automatic titration are not different from manual titration. Automatic titration is as effective as standard manual titration in improving AHI and somnolence while maintaining sleep quality similar to the standard method. In addition, automatic titration has the same effect on the acceptance and compliance of CPAP treatment as manual titration. With the potential advantage
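
    For readers unfamiliar with the pooling step, the sketch below computes a fixed-effect (inverse-variance) pooled mean difference with a 95% CI; the per-trial numbers are hypothetical and the review may well have used a different (e.g., random-effects) model.

```python
import numpy as np

def pooled_md(md, se):
    """Fixed-effect inverse-variance pooling of per-trial mean differences."""
    md, se = np.asarray(md, float), np.asarray(se, float)
    w = 1.0 / se**2                       # inverse-variance weights
    pooled = np.sum(w * md) / np.sum(w)   # weighted average
    se_pooled = np.sqrt(1.0 / np.sum(w))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

# Hypothetical per-trial AHI mean differences (events/h) and standard errors
print(pooled_md([0.5, -1.2, 0.8], [1.0, 1.5, 2.0]))
```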

  17. Treatment effectiveness and fidelity of manual therapy to the knee: A systematic review and meta-analysis.

    Science.gov (United States)

    Salamh, Paul; Cook, Chad; Reiman, Michael P; Sheets, Charles

    2017-09-01

    Manual therapy (MT) is a commonly used treatment for knee osteoarthritis (OA) but to date only one systematic review has explored its effectiveness. The purpose of the present study was to perform a systematic review and meta-analysis of the literature, to determine the effectiveness and fidelity of studies using MT techniques in individuals with knee OA. Relevant studies were assessed for inclusion. Effectiveness was measured using effect sizes, and methodological bias and treatment fidelity were both explored. Effect sizes were calculated using standardized mean differences (SMD) based on pooled data depending on statistical and clinical heterogeneity, as well as risk of bias. The search captured 2,969 studies; after screening, 12 were included. Four had a low risk of bias and high treatment fidelity. For self-reported function, comparing MT with no treatment resulted in a large effect size (SMD 0.84), as did adding MT to a comparator treatment (SMD 0.78). A significant difference was found for pain when adding MT to a comparator treatment (SMD 0.73). The findings in the present meta-analytical review support the use of MT versus a number of different comparators for improvement in self-reported knee function. Lesser support is present for pain reduction, and no endorsement of functional performance can be made at this time. Copyright © 2016 John Wiley & Sons, Ltd.

  18. BEACON/MOD2A: a computer program for subcompartment analysis of nuclear reactor containment. A user's manual

    International Nuclear Information System (INIS)

    Wells, R.A.

    1979-03-01

    The BEACON code is a Best Estimate Advanced Containment code which is being developed by EG and G, Idaho, Inc., at the Idaho National Engineering Laboratory. The program is designed to perform a best estimate analysis of the flow of a mixture of air, water, and steam in a nuclear reactor containment system under loss-of-coolant accident conditions. The code can simulate two-component, two-phase fluid flow in complex geometries using a combination of two-dimensional, one-dimensional, and lumped-parameter representations for the various parts of the system. The current version of BEACON, which is designated BEACON/MOD2A, contains mass and heat transfer models for wall film and for wall conduction. It is suitable for the evaluation of short term transients in PWR dry containment systems. This manual describes the models employed in BEACON/MOD2A and specifies code implementation requirements. It provides application information for input data preparation and for output data interpretation

  19. Description and user's manual of light water reactor fuel analysis code FEMAXI-IV (Ver.2)

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Saitou, Hiroaki.

    1997-03-01

    FEMAXI-IV is an advanced version of FEMAXI-III, the analysis code of light water reactor fuel behavior, in which various functions and improvements have been incorporated. The present report describes in detail the basic theories and structure, the models and numerical solutions applied, and the material properties adopted in version 2, which is an improved version of the first version of FEMAXI-IV. In FEMAXI-IV (Ver.2), bugs have been fixed, pellet thermal conductivity properties have been updated, and a thermal-stress-induced FP gas release model has been incorporated. In order to facilitate effective and wide-ranging application of the code, the types and methods of input/output of the code are also described, and a sample output in an actual form is included. (author)

  20. User manual of the CATSS system (version 1.0) communication analysis tool for space station

    Science.gov (United States)

    Tsang, C. S.; Su, Y. T.; Lindsey, W. C.

    1983-01-01

    The Communication Analysis Tool for the Space Station (CATSS) is a FORTRAN language software package capable of predicting the communications link performance for the Space Station (SS) communication and tracking (C & T) system. The current version is an interactive software package developed to run on the DEC/VAX computers. CATSS models and evaluates the various C & T links of the SS, including modulation schemes such as Binary-Phase-Shift-Keying (BPSK), BPSK with Direct Sequence Spread Spectrum (PN/BPSK), and M-ary Frequency-Shift-Keying with Frequency Hopping (FH/MFSK). An optical space communication link is also included. CATSS is a C & T system engineering tool used to predict and analyze the system performance for different link environments. Identification of system weaknesses is achieved through evaluation of performance with varying system parameters. System tradeoffs for different values of system parameters are made based on the performance prediction.
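
    As a flavour of the kind of link calculation such a tool automates, the snippet below evaluates the textbook coherent BPSK bit-error rate, P_b = Q(sqrt(2 E_b/N_0)); CATSS itself handles full link budgets and the other modulations listed, so this is only an illustrative fragment.

```python
import numpy as np
from scipy.special import erfc

def bpsk_ber(ebn0_db):
    """Ideal coherent BPSK bit-error rate for a given Eb/N0 in dB."""
    ebn0 = 10.0 ** (ebn0_db / 10.0)       # dB -> linear
    return 0.5 * erfc(np.sqrt(ebn0))      # Q(sqrt(2x)) = 0.5 * erfc(sqrt(x))

for snr in (0, 4, 8, 10):
    print(f"Eb/N0 = {snr:2d} dB -> BER = {bpsk_ber(snr):.2e}")
```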

  1. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    Science.gov (United States)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan

    2010-01-01

    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included the basic shapes such as a flat plate, cylinder, dished head, and sphere. The data were from several KSC tests that were already in the public literature realm as well as data from NIST and other highly respectable sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory and the tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and tool. This document reviews the user instructions for the operation of this system.

  2. GENLPLOT: An interactive program for display and analysis of data: User's manual

    International Nuclear Information System (INIS)

    Sullivan, J.D.; Grisar, C.C.

    1987-08-01

    GENLPLOT is an interactive program written in FORTRAN and running under VAX/VMS that enables technicians, scientists, engineers, and other users to quickly and accurately examine and analyze data. The current version utilizes the GRAPAC4 plot package, reads a standard input file or permits direct data entry, and is optimized for use with data stored in MDS databases. This program has been the principal interactive data analysis tool used on the Tara Tandem Mirror Experiment and on the Constance II Mirror Experiment. The program is menu driven with options selected on command lines distinguished by various prompts. Subsequent changes and additions to the program will be indicated by a version number greater than that appearing in the welcome message and will be documented in the appropriate menu(s)

  3. Radiation densitometry in tree-ring analysis: a review and procedure manual

    Energy Technology Data Exchange (ETDEWEB)

    Parker, M.L.; Taylor, F.G.; Doyle, T.W.; Foster, B.E.; Cooper, C.; West, D.C.

    1985-01-01

    An x-ray densitometry of wood facility is being established by the Environmental Sciences Division, Oak Ridge National Laboratory (ORNL). The objective is to apply tree-ring data to determine whether or not there is a fertilizer effect on tree growth from increased atmospheric carbon dioxide since the beginning of the industrial era. Intra-ring width and density data, including ring mass, will be determined from tree-ring samples collected from sites located throughout the United States and Canada. This report is designed as a guide to assist ORNL scientists in building the x-ray densitometry system. The history and development of x-ray densitometry in tree-ring research is examined and x-ray densitometry is compared with other techniques. Relative wood and tree characteristics are described as are environmental and genetic factors affecting tree growth responses. Methods in x-ray densitometry are examined in detail and the techniques used at four operating laboratories are described. Some ways that dendrochronology has been applied in dating, in wood quality, and environmental studies are presented, and a number of tree-ring studies in Canada are described. An annotated bibliography of radiation densitometry in tree-ring analysis and related subjects is included.

  4. User's manual of a support system for human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yokobayashi, Masao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Tamura, Kazuo

    1995-10-01

    Many kinds of human reliability analysis (HRA) methods have been developed. However, users must be skilled to apply them, and the methods also require complicated work such as drawing event trees (ETs) and calculating uncertainty bounds. Moreover, no single method is complete enough on its own to evaluate human reliability. Therefore, a personal computer (PC) based support system for HRA has been developed to execute HRA practically and efficiently. The system offers two methods, namely a simple method and a detailed one. The former uses ASEP, which is a simplified THERP technique, and a combined method of OAT and HRA-ET/DeBDA is used for the latter. Users can select a suitable method for their purpose. Human error probability (HEP) data were collected and a database of them was built for use with the support system. This paper describes the outline of the HRA methods, the support functions and the user's guide of the system. (author).

  5. MARFA user's manual: Migration analysis of radionuclides in the far field

    International Nuclear Information System (INIS)

    Painter, S.; Mancillas, J.

    2013-12-01

    The computer code Migration Analysis of Radionuclides in the Far Field (MARFA) uses a particle-based Monte Carlo method to simulate the transport of radionuclides in a sparsely fractured geological medium. The algorithm uses non-interacting particles to represent packets of radionuclide mass. These particles are moved through the system according to rules that mimic the underlying physical transport and retention processes. The physical processes represented in MARFA include advection, longitudinal dispersion, Fickian diffusion into an infinite or finite rock matrix, equilibrium sorption, decay, and in-growth. Because the algorithm uses non-interacting particles, the transport and retention processes are limited to those that depend linearly on radionuclide concentration. Multiple non-branching decay chains of arbitrary length are supported, as is full heterogeneity in the transport and retention properties. Two variants of the code are provided. These two versions differ in how particles are routed through the computational domain. In MARFA 3.2.3, transport is assumed to occur along a set of trajectories or pathways that originate at radionuclide source locations. The trajectories are intended to represent the movement of hypothetical, advectively transported groundwater tracers and are typically calculated by pathline tracing in a discrete fracture network flow code. The groundwater speed and retention properties along each pathway may change in time, but the pathway trajectories are fixed. MARFA 3.3.1 allows the transport effects of changing flow directions to be represented by abandoning the fixed pathways and performing node routing within MARFA. (orig.)
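
    A heavily simplified illustration of the particle-based, time-domain idea: each particle accumulates a deterministic advective time along its pathway plus a randomly sampled retention delay per segment, and the ensemble of arrival times yields the breakthrough curve. The exponential delay below is a placeholder assumption, not MARFA's matrix-diffusion retention model.

```python
import numpy as np

rng = np.random.default_rng(42)

def particle_arrival_times(seg_lengths, velocities, n_particles=10_000, mean_delay=5.0):
    """Advective time along fixed pathway segments plus a sampled retention delay per segment."""
    t_adv = np.sum(np.asarray(seg_lengths) / np.asarray(velocities))   # deterministic advection
    delays = rng.exponential(mean_delay, size=(n_particles, len(seg_lengths)))
    return t_adv + delays.sum(axis=1)                                  # one arrival time per particle

times = particle_arrival_times(seg_lengths=[10.0, 25.0, 5.0], velocities=[1.0, 0.5, 2.0])
hist, edges = np.histogram(times, bins=50)                             # breakthrough histogram
print("median arrival time:", np.median(times))
```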

  6. Radiation densitometry in tree-ring analysis: a review and procedure manual

    International Nuclear Information System (INIS)

    Parker, M.L.; Taylor, F.G.; Doyle, T.W.; Foster, B.E.; Cooper, C.; West, D.C.

    1985-01-01

    An x-ray densitometry of wood facility is being established by the Environmental Sciences Division, Oak Ridge National Laboratory (ORNL). The objective is to apply tree-ring data to determine whether or not there is a fertilizer effect on tree growth from increased atmospheric carbon dioxide since the beginning of the industrial era. Intra-ring width and density data, including ring mass, will be determined from tree-ring samples collected from sites located throughout the United States and Canada. This report is designed as a guide to assist ORNL scientists in building the x-ray densitometry system. The history and development of x-ray densitometry in tree-ring research is examined and x-ray densitometry is compared with other techniques. Relative wood and tree characteristics are described as are environmental and genetic factors affecting tree growth responses. Methods in x-ray densitometry are examined in detail and the techniques used at four operating laboratories are described. Some ways that dendrochronology has been applied in dating, in wood quality, and environmental studies are presented, and a number of tree-ring studies in Canada are described. An annotated bibliography of radiation densitometry in tree-ring analysis and related subjects is included

  7. ETAP user's manual

    International Nuclear Information System (INIS)

    Watanabe, Norio; Higuchi, Suminori.

    1990-11-01

    The event tree analysis technique has been used in Probabilistic Safety Assessment for LWRs to delineate various accident scenarios leading to core melt or containment failure and to evaluate their frequencies. This technique often requires manual preparation of event trees with an iterative process and time-consuming work in data handling. For the purpose of reducing manual efforts in event tree analysis, we developed a new software package named ETAP (Event Tree Analysis Supporting Program) for event tree analysis. ETAP is an interactive PC-based program which has the ability to construct, update, document, and quantify event trees. Because of its fast running capability to quantify event trees, use of the ETAP program can make it easy to perform sensitivity studies on a variety of system/containment performance issues. This report provides a user's manual for ETAP, which describes the structure, installation, and use of ETAP. This software runs on NEC/PC-9800 or compatible PCs that have a 640 KB memory and MS-DOS 2.11 or higher. (author)
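
    To illustrate the quantification step that such a tool automates, the sketch below multiplies branch probabilities along each accident sequence of a tiny, hypothetical event tree; ETAP's actual input format and data structures are not reproduced here.

```python
# Each sequence is a list of (heading, branch probability); the sequence frequency is
# the initiating-event frequency times the product of its branch probabilities.
import math

init_freq = 1.0e-2  # hypothetical initiating-event frequency (per year)
sequences = {
    "seq1_ok":        [("scram", 0.999), ("cooling", 0.99)],
    "seq2_core_melt": [("scram", 0.999), ("cooling", 0.01)],
    "seq3_core_melt": [("scram", 0.001)],
}

for name, branches in sequences.items():
    freq = init_freq * math.prod(p for _, p in branches)
    print(f"{name}: {freq:.3e} /yr")
```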

  8. Environmental Impact Research Program. Osprey Nest Platforms. Section 5.1.6, US Army Corps of Engineers Wildlife Resources Management Manual.

    Science.gov (United States)

    1986-07-01

    Treated pole with a hardwood dowel set and glued (construction-drawing fragments; dimensions not fully recoverable). Designs for added strength are described in the section entitled Tripod Support. Pole supports: two designs are suggested for attaching the solid base

  9. MARFA version 3.2.2 user's manual: migration analysis of radionuclides in the far field

    International Nuclear Information System (INIS)

    Painter, Scott; Mancillas, James

    2009-12-01

    The computer code Migration Analysis of Radionuclides in the Far Field (MARFA) uses a particle-based Monte Carlo method to simulate the transport of radionuclides in a sparsely fractured geological medium. Transport in sparsely fractured rock is of interest because this medium may serve as a barrier to migration of radionuclides to the accessible environment. The physical processes represented in MARFA include advection, longitudinal dispersion, Fickian diffusion into an infinite or finite rock matrix, equilibrium sorption, decay, and in-growth. Multiple non-branching decay chains of arbitrary length are supported. This document describes the technical basis and input requirements for MARFA Version 3.2.2. MARFA Version 3.2 included new capabilities to accommodate transient flow velocities and sorption parameters, which are assumed to be piecewise constant in time. Version 3.2.1 was a minor change from Version 3.2 to allow a more convenient input format for sorption information. New capabilities in Version 3.2.2 include an option to specify a non-zero start time for the simulation, an optional input parameter that decreases the amount of retention within a single fracture because of flow channeling, and an alternative method for sampling the radionuclide source. MARFA uses the particle on random streamline segment algorithm /Painter et al. 2006/, a Monte Carlo algorithm combining time-domain random walk methods with pathway stochastic simulation. The algorithm uses non-interacting particles to represent packets of radionuclide mass. These particles are moved through the system according to rules that mimic the underlying physical transport and retention processes. The set of times required for particles to pass through the geological barrier are then used to reconstruct discharge rates (mass or activity basis). Because the algorithm uses non-interacting particles, the transport and retention processes are limited to those that depend linearly on radionuclide

  10. The environmental survey manual

    International Nuclear Information System (INIS)

    1987-08-01

    The purpose of this manual is to provide guidance to the Survey and Sampling and Analysis teams that conduct the one-time Environmental Survey of the major US Department of Energy (DOE) operating facilities. This manual includes a discussion of DOE's policy on environmental issues, a review of statutory guidance as it applies to the Survey, the procedures and protocols to be used by the Survey teams, criteria for the use of the Survey teams in evaluating existing environmental data for the Survey effort, generic technical checklists used in every Survey, health and safety guidelines for the personnel conducting the Survey, including the identification of potential hazards, prescribed protective equipment, and emergency procedures, the required formats for the Survey reports, guidance on identifying environmental problems that need immediate attention by the Operations Office responsible for the particular facility, and procedures and protocols for the conduct of sampling and analysis

  11. Effects of manual threshold setting on image analysis results of a sandstone sample structural characterization by X-ray microtomography

    International Nuclear Information System (INIS)

    Moreira, Anderson C.; Fernandes, Celso P.; Fernandes, Jaquiel S.; Marques, Leonardo C.; Appoloni, Carlos R.; Nagata, Rodrigo

    2009-01-01

    X-ray microtomography is a nondestructive nuclear technique widely applied for the structural characterization of samples. This methodology permits the investigation of the porous phase of materials, without special sample preparation, generating bidimensional images of the irradiated sample. The images are generated by the linear attenuation coefficient mapping of the sample. In order to do a quantitative characterization, the images have to be binarized, separating the porous phase from the material matrix. The choice of the correct threshold in the grey level histogram is an important and discerning procedure for the binary image creation. Slight variations of the threshold level lead to substantial variations in the determination of physical parameters, such as porosity and pore size distribution values. The aim of this work is to evaluate these variations based on manual threshold setting. Employing the Imago image analysis software, four operators determined the porosity and pore size distribution of a sandstone sample by image analysis. The microtomography measurements were accomplished with the following scan conditions: 60 kV, 165 μA, 1 mm Al filter, 0.45 deg step size and 180.0 deg total rotation angle, with 3.8 μm and 11 μm spatial resolutions. The global average porosity values determined by the operators range from 27.8 to 32.4 % for the 3.8 μm spatial resolution and 12.3 to 28.3 % for the 11 μm spatial resolution. Percentage differences among the pore size distributions were also found: for the same pore size range, differences of 5.5 % and 17.1 % were noted for the 3.8 μm and 11 μm spatial resolutions, respectively. (author)
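
    The sensitivity being described originates in the binarization step itself; a minimal sketch of threshold-based porosity estimation on a grey-level image is shown below (synthetic data, not the Imago software).

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(512, 512))     # synthetic grey-level slice

def porosity(img, threshold):
    """Fraction of pixels classified as pore (grey level below threshold)."""
    return np.mean(img < threshold)

for t in (90, 100, 110):                          # small manual threshold shifts
    print(f"threshold {t}: porosity = {porosity(image, t):.3f}")
```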

  12. Nuclear cross section library for oil well logging analysis

    International Nuclear Information System (INIS)

    Kodeli, I.; Kitsos, S.; Aldama, D.L.; Zefran, B.

    2003-01-01

    As part of the IRTMBA (Improved Radiation Transport Modelling for Borehole Applications) Project of the EU Community's 5th Programme, a special purpose multigroup cross section library to be used in the deterministic (as well as Monte Carlo) oil well logging particle transport calculations was prepared. This library is expected to improve the prediction of the neutron and gamma spectra at the detector positions of the logging tool, and their use for the interpretation of the neutron logging measurements was studied. Preparation and testing of this library are described. (author)

  13. Financial bubbles analysis with a cross-sectional estimator

    OpenAIRE

    Abergel, Frederic; Huth, Nicolas; Toke, Ioane Muni

    2009-01-01

    We highlight a very simple statistical tool for the analysis of financial bubbles, which has already been studied in [1]. We provide extensive empirical tests of this statistical tool and investigate analytically its link with stocks correlation structure.

  14. A global analysis of inclusive diffractive cross sections at HERA

    Energy Technology Data Exchange (ETDEWEB)

    Royon, C.; Schoeffel, L. [Service de Physique des Particules, CE-Saclay, F-91191 Gif-sur-Yvette Cedex (France); Sapeta, S. [M. Smoluchowski Institue of Physics Jagellonian University Reymonta 4, 30-059 Krakow (Poland); Peschanski, R. [Service de Physique Theorique, CE-Saclay, F-91191 Gif-sur-Yvette Cedex (France); Sauvan, E. [CPPM, IN2P3-CNRS et Universitie de la Mediterranee, F-13288 Marseille Cedex 09 (France)

    2006-10-15

    We describe the most recent data on the diffractive structure functions from the H1 and ZEUS Collaborations at HERA using four models. First, a Pomeron Structure Function (PSF) model, in which the Pomeron is considered as an object with parton distribution functions. Then, the Bartels Ellis Kowalski Wuesthoff (BEKW) approach is discussed, assuming the simplest perturbative description of the Pomeron using a two-gluon ladder. A third approach, the Bialas Peschanski (BP) model, based on the dipole formalism, is then described. Finally, we discuss the Golec-Biernat-Wuesthoff (GBW) saturation model which takes into account saturation effects. The best description of all available measurements can be achieved with either the PSF based model or the BEKW approach. In particular, the BEKW prediction makes it possible to include the highest β measurements, which are dominated by higher twist effects, and provides an efficient and compact parametrisation of the diffractive cross section. The two other models also give a good description of cross section measurements at small x with a small number of parameters. The comparison of all predictions allows us to identify interesting differences in the behaviour of the effective pomeron intercept and in the shape of the longitudinal component of the diffractive structure functions. In this last part, we present some features that can be discriminated by new experimental measurements, completing the HERA program. (authors)

  15. A global analysis of inclusive diffractive cross sections at HERA

    International Nuclear Information System (INIS)

    Royon, C.; Schoeffel, L.; Sapeta, S.; Peschanski, R.; Sauvan, E.

    2006-10-01

    We describe the most recent data on the diffractive structure functions from the H1 and ZEUS Collaborations at HERA using four models. First, a Pomeron Structure Function (PSF) model, in which the Pomeron is considered as an object with parton distribution functions. Then, the Bartels Ellis Kowalski Wuesthoff (BEKW) approach is discussed, assuming the simplest perturbative description of the Pomeron using a two-gluon ladder. A third approach, the Bialas Peschanski (BP) model, based on the dipole formalism, is then described. Finally, we discuss the Golec-Biernat-Wuesthoff (GBW) saturation model which takes into account saturation effects. The best description of all available measurements can be achieved with either the PSF based model or the BEKW approach. In particular, the BEKW prediction makes it possible to include the highest β measurements, which are dominated by higher twist effects, and provides an efficient and compact parametrisation of the diffractive cross section. The two other models also give a good description of cross section measurements at small x with a small number of parameters. The comparison of all predictions allows us to identify interesting differences in the behaviour of the effective pomeron intercept and in the shape of the longitudinal component of the diffractive structure functions. In this last part, we present some features that can be discriminated by new experimental measurements, completing the HERA program. (authors)

  16. Caltrans : construction manual

    Science.gov (United States)

    2009-08-01

    Caltrans intends this manual as a resource for all personnel engaged in contract administration. The manual establishes policies and procedures for the construction phase of Caltrans projects. However, this manual is not a contract document. It impos...

  17. User's manual for ASTERIX-2: A two-dimensional modular code system for the steady state and xenon transient analysis of a pebble bed high temperature reactor

    International Nuclear Information System (INIS)

    Wu, T.; Cowan, C.L.; Lauer, A.; Schwiegk, H.J.

    1982-03-01

    The ASTERIX modular code package was developed at KFA Laboratory-Juelich for the steady state and xenon transient analysis of a pebble bed high temperature reactor. The code package was implemented on the Stanford Linear Accelerator Center Computer in August, 1980, and a user's manual for the current version of the code, identified as ASTERIX-2, was prepared as a cooperative effort by KFA Laboratory and GE-ARSD. The material in the manual includes the requirements for accessing the program, a description of the major subroutines, a listing of the input options, and a listing of the input data for a sample problem. The material is provided in sufficient detail for the user to carry out a wide range of analysis from steady state operations to the xenon induced power transients in which the local xenon, temperature, buckling and control feedback effects have been incorporated in the problem solution. (orig.)

  18. Manual Handling Risk Factors for Low Back Pain Complaints among Brick Makers

    Directory of Open Access Journals (Sweden)

    Heru Subaris Kasjono

    2017-08-01

    When manual handling work is performed on heavy objects, it creates a risk of injury to the musculoskeletal system. Risk assessment of manual handling work with the Key Indicator Method (Leitmerkmal-Methode, LMM) was intended to determine the relationship between the time, load, body posture, and working conditions of manual handling and complaints of low back pain, as perceived by brick makers, at all stages of brick making. The study was a cross sectional survey. Data were collected with a low back pain questionnaire, supported by physical examination by nurses, and a Key-LMM checklist. Relationships were analysed with Spearman's correlation. For the time variable of manual handling, based on the frequency of lifting or transfer, relationships with low back pain complaints were found at the raw material excavation, forming and drying stages, with p values of 0.039, 0.047 and 0.038 respectively, while for the working conditions variable at the raw material excavation stage a p value of 0.028 was obtained, so it can be said that there was a correlation between working conditions during manual handling and low back pain complaints. The load and body posture variables of manual handling were not significantly related to low back pain at any stage of brick making. The researchers conclude that the time variable of manual handling is significantly related to low back pain complaints at the raw material excavation, forming and drying stages, but not at the raw material processing stage; the load and body posture variables show no correlation at any stage; and the working conditions variable is related to low back pain complaints only at the raw material excavation stage.

  19. Functional analysis of the cross-section form and X-ray density of human ulnae

    International Nuclear Information System (INIS)

    Hilgen, B.

    1981-01-01

    On 20 ulnae the form of the cross sections and distribution of the X-ray density were investigated in five different cross-section heights. The analysis of the cross-section forms was carried through using plane contraction figures, the X-ray density was established by means of the equidensity line method. (orig.) [de

  20. Chuna (or Tuina) Manual Therapy for Musculoskeletal Disorders: A Systematic Review and Meta-Analysis of Randomized Controlled Trials

    Directory of Open Access Journals (Sweden)

    Nam-Woo Lee

    2017-01-01

    Objective. To review the literature and systematically evaluate the effectiveness of Chuna (or Tuina) manual therapy (C(T)MT) on pain and function for musculoskeletal disorders. Methods. We searched 15 English, Chinese, Japanese, and Korean databases using relevant keywords. All randomized controlled trials (RCTs) of C(T)MT for musculoskeletal disorders were considered, and we limited analyses to studies with a low-risk bias for randomization and/or allocation concealment. Results. Sixty-six RCTs with 6,170 participants were included. One sham-controlled RCT showed that C(T)MT relieved pain more effectively than a sham control (SMD -3.09 [-3.59, -2.59]). For active-controlled RCTs, pooled meta-analysis showed that C(T)MT had statistically significant effects on pain reduction, especially compared to traction (P<0.00001), drugs (P=0.04), and physical therapies (P<0.0001). For functional improvement, combined effects of C(T)MT with drugs (P=0.04) and traction (P=0.05) also showed similar positive effects. Conclusions. This systematic review suggests that C(T)MT is safe and effective for pain reduction and functional improvement for musculoskeletal diseases; however, the evidence for functional improvement was not as strong as for pain reduction. For future studies, high-quality RCTs such as sham-controlled studies with standardized interventions are needed to provide sufficient evidence on the effects of C(T)MT for musculoskeletal diseases. Protocol registration number is CRD42016038307 04/07/2016.

  1. Spectral analysis of the electromyograph of the erector spinae muscle before and after a dynamic manual load-lifting test

    Directory of Open Access Journals (Sweden)

    A.C. Cardozo

    2004-07-01

    The aim of the present study was to assess the spectral behavior of the erector spinae muscle during isometric contractions performed before and after a dynamic manual load-lifting test carried out by the trunk in order to determine the capacity of the muscle to perform this task. Nine healthy female students participated in the experiment. Their average age, height, and body mass (± SD) were 20 ± 1 years, 1.6 ± 0.03 m, and 53 ± 4 kg, respectively. The development of muscle fatigue was assessed by spectral analysis (median frequency) and root mean square with time. The test consisted of repeated bending movements of the trunk, starting from a 45° angle of flexion, with the application of approximately 15, 25 and 50% of maximum individual load, to the stand-up position. The protocol used proved to be more reliable with loads exceeding 50% of the maximum for the identification of muscle fatigue by electromyography as a function of time. Most of the volunteers showed an increase in root mean square versus time on both the right (N = 7) and the left (N = 6) side, indicating a tendency to become fatigued. With respect to the changes in median frequency of the electromyographic signal, the loads used in this study had no significant effect on either the right or the left side of the erector spinae muscle at this frequency, suggesting that a higher amount and percentage of loads would produce more substantial results in the study of isotonic contractions.
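
    For readers unfamiliar with the two fatigue indicators, the sketch below computes the RMS amplitude and the median frequency of a Welch power spectrum for one EMG epoch; the signal is synthetic and the authors' exact processing parameters are assumed, not known.

```python
import numpy as np
from scipy.signal import welch

fs = 1000.0                                        # assumed sampling rate (Hz)
rng = np.random.default_rng(1)
emg = rng.normal(size=2000)                        # synthetic 2-s EMG epoch

rms = np.sqrt(np.mean(emg**2))                     # amplitude (root mean square) indicator

f, psd = welch(emg, fs=fs, nperseg=256)            # power spectral density estimate
cum = np.cumsum(psd)
mdf = f[np.searchsorted(cum, 0.5 * cum[-1])]       # frequency splitting spectral power in half

print(f"RMS = {rms:.3f}, median frequency = {mdf:.1f} Hz")
```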

  2. Shell Analysis Manual

    Science.gov (United States)

    1968-04-01


  3. A fast reactor transient analysis methodology for PCs: Volume 3, LTC program manual of the QuickBASIC code

    International Nuclear Information System (INIS)

    Ott, K.O.; Chung, L.

    1992-06-01

    This manual augments the detailed manual of the GW-BASIC version of the LTC code for an application in QuickBASIC. As most of the GW-BASIC coding of this program for "LMR Transient Calculations" is compatible with QuickBASIC, this manual pertains primarily to the required changes, such as the handling of input and output. The considerable reduction in computation time achieved by this conversion is demonstrated for two sample problems, using a variety of hardware and execution options. The revised code is listed. Although the severe storage limitations of GW-BASIC no longer apply, the LOF transient path has not been completed in this QuickBASIC code. Its advantages are thus primarily in the much faster running time for TOP and LOHS transients. For the fastest PC hardware (486) and execution option the computation time is reduced by a factor of 124 compared to GW-BASIC on a 386/20

  4. Comparison of Manual Mapping and Automated Object-Based Image Analysis of Non-Submerged Aquatic Vegetation from Very-High-Resolution UAS Images

    Directory of Open Access Journals (Sweden)

    Eva Husson

    2016-09-01

    Aquatic vegetation has important ecological and regulatory functions and should be monitored in order to detect ecosystem changes. Field data collection is often costly and time-consuming; remote sensing with unmanned aircraft systems (UASs) provides aerial images with sub-decimetre resolution and offers a potential data source for vegetation mapping. In a manual mapping approach, UAS true-colour images with 5-cm-resolution pixels allowed for the identification of non-submerged aquatic vegetation at the species level. However, manual mapping is labour-intensive, and while automated classification methods are available, they have rarely been evaluated for aquatic vegetation, particularly at the scale of individual vegetation stands. We evaluated classification accuracy and time-efficiency for mapping non-submerged aquatic vegetation at three levels of detail at five test sites (100 m × 100 m) differing in vegetation complexity. We used object-based image analysis and tested two classification methods (threshold classification and Random Forest) using eCognition®. The automated classification results were compared to results from manual mapping. Using threshold classification, overall accuracy at the five test sites ranged from 93% to 99% for the water-versus-vegetation level and from 62% to 90% for the growth-form level. Using Random Forest classification, overall accuracy ranged from 56% to 94% for the growth-form level and from 52% to 75% for the dominant-taxon level. Overall classification accuracy decreased with increasing vegetation complexity. In test sites with more complex vegetation, automated classification was more time-efficient than manual mapping. This study demonstrated that automated classification of non-submerged aquatic vegetation from true-colour UAS images was feasible, indicating good potential for operative mapping of aquatic vegetation. When choosing the preferred mapping method (manual versus automated) the desired level of
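
    As a schematic of the automated alternative to manual mapping, the snippet below trains a Random Forest on per-object features and reports overall accuracy; the features and class labels are synthetic stand-ins for the eCognition® object attributes used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(7)
X = rng.random((500, 6))                                   # per-object features (colour, texture, ...)
y = rng.choice(["water", "floating", "emergent"], 500)     # growth-form labels (hypothetical)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("overall accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```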

  5. Qualitative website analysis of information on birth after caesarean section.

    Science.gov (United States)

    Peddie, Valerie L; Whitelaw, Natalie; Cumming, Grant P; Bhattacharya, Siladitya; Black, Mairead

    2015-08-19

    The United Kingdom (UK) caesarean section (CS) rate is largely determined by reluctance to augment trial of labour and vaginal birth. Choice between repeat CS and attempting vaginal birth after CS (VBAC) in the next pregnancy is challenging, with neither offering clear safety advantages. Women may access online information during the decision-making process. Such information is known to vary in its support for either mode of birth when assessed quantitatively. Therefore, we sought to explore qualitatively, the content and presentation of web-based health care information on birth after caesarean section (CS) in order to identify the dominant messages being conveyed. The search engine Google™ was used to conduct an internet search using terms relating to birth after CS. The ten most frequently returned websites meeting relevant purposive sampling criteria were analysed. Sampling criteria were based upon funding source, authorship and intended audience. Images and written textual content together with presence of links to additional media or external web content were analysed using descriptive and thematic analyses respectively. Ten websites were analysed: five funded by Government bodies or professional membership; one via charitable donations, and four funded commercially. All sites compared the advantages and disadvantages of both repeat CS and VBAC. Commercially funded websites favoured a question and answer format alongside images, 'pop-ups', social media forum links and hyperlinks to third-party sites. The relationship between the parent sites and those being linked to may not be readily apparent to users, risking perception of endorsement of either VBAC or repeat CS whether intended or otherwise. Websites affiliated with Government or health services presented referenced clinical information in a factual manner with podcasts of real life experiences. Many imply greater support for VBAC than repeat CS although this was predominantly conveyed through subtle

  6. FY 1998 Report on development project of structural residence of the next generation. Attachment 4. Frame analysis system manual; 1998 nendo jisedai kozo jutaku kaihatsu jigyo shiryohen. 4. Kako kaiseki system manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Attachment 4 of the FY 1998 report on the development project of structural residence of the next generation describes the frame analysis system manual. NAPISOS is a program for analyzing static and dynamic responses of three-dimensional structures. The dynamic response is calculated by numerical integration in the time region. Dynamic load can handle dynamic exciting force, uniform seismic input and forced displacement at a node, and analyze linear elasticity and non-linear properties. The static load can handle nodal force, static seismic coefficient and forced displacement at a node, and analyze linear elasticity and non-linear properties. The static analysis also can perform analysis based on the time history response displacement method as a special case. The program implementation procedures fall into 4 general steps; first: inputting/processing of structural data, second: eigenvalue analysis or equivalent nodal force calculation, third: response calculation by direct integration, preparation of the equivalent damping matrix or pre-stress analysis, and fourth: outputting the results. The input data related to control, structure and load are also described. (NEDO)
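
    A minimal sketch of direct time integration for a single-degree-of-freedom system is given below (average-acceleration Newmark scheme); NAPISOS operates on full three-dimensional frames, so this only illustrates the "response calculation by direct integration" step.

```python
import numpy as np

def newmark_sdof(m, c, k, p, dt, u0=0.0, v0=0.0, beta=0.25, gamma=0.5):
    """Average-acceleration Newmark integration of m*u'' + c*u' + k*u = p(t)."""
    n = len(p)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    u[0], v[0] = u0, v0
    a[0] = (p[0] - c * v0 - k * u0) / m
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)   # effective stiffness
    for i in range(n - 1):
        rhs = (p[i + 1]
               + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt) + a[i] * (0.5 / beta - 1.0))
               + c * (gamma * u[i] / (beta * dt) + v[i] * (gamma / beta - 1.0)
                      + a[i] * dt * (0.5 * gamma / beta - 1.0)))
        u[i + 1] = rhs / keff
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                    - v[i] / (beta * dt) - a[i] * (0.5 / beta - 1.0))
        v[i + 1] = v[i] + dt * ((1.0 - gamma) * a[i] + gamma * a[i + 1])
    return u, v, a

t = np.arange(0.0, 5.0, 0.01)                                 # hypothetical forcing history
u, _, _ = newmark_sdof(m=1.0, c=0.1, k=40.0, p=np.sin(2.0 * np.pi * t), dt=0.01)
print("peak displacement:", u.max())
```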

  7. TAP 2: Performance-Based Training Manual

    International Nuclear Information System (INIS)

    1993-08-01

    The cornerstone of safe operation of DOE nuclear facilities is personnel performing the day-to-day functions that accomplish the facility mission. Performance-based training is fundamental to safe operation. This manual has been developed to support the Training Accreditation Program (TAP) and assist contractors in efforts to develop performance-based training programs. It provides contractors with narrative procedures on performance-based training that can be modified and incorporated for facility-specific application. It is divided into sections dealing with analysis, design, development, implementation, and evaluation

  8. CASKS (Computer Analysis of Storage casKS): A microcomputer based analysis system for storage cask design review. User's manual to Version 1b (including program reference)

    International Nuclear Information System (INIS)

    Chen, T.F.; Gerhard, M.A.; Trummer, D.J.; Johnson, G.L.; Mok, G.C.

    1995-02-01

    CASKS (Computer Analysis of Storage casKS) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent-fuel storage casks. The bulk of the complete program and this user's manual are based upon the SCANS (Shipping Cask ANalysis System) program previously developed at LLNL. A number of enhancements and improvements were added to the original SCANS program to meet requirements unique to storage casks. CASKS is an easy-to-use system that calculates global response of storage casks to impact loads, pressure loads and thermal conditions. This provides reviewers with a tool for an independent check on analyses submitted by licensees. CASKS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests

  9. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Integrated Reliability and Risk Analysis System (IRRAS) reference manual. Volume 2

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification to report generation. Version 1.0 of the IRRAS program was released in February of 1987. Since then, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 5.0 and is the subject of this Reference Manual. Version 5.0 of IRRAS provides the same capabilities as earlier versions and adds the ability to perform location transformations and seismic analysis, and provides enhancements to the user interface as well as improved algorithm performance. Additionally, version 5.0 contains new alphanumeric fault tree and event tree capabilities used for event tree rules, recovery rules, and end state partitioning
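
    To sketch what cut set generation and quantification produce downstream, the snippet below evaluates a top-event probability from minimal cut sets with the rare-event approximation; the cut sets and basic-event probabilities are hypothetical and SAPHIRE's algorithms are considerably more general.

```python
from math import prod

# Hypothetical basic-event failure probabilities
p = {"PUMP_A": 1e-3, "PUMP_B": 1e-3, "VALVE_C": 5e-4, "DG_FAIL": 2e-2}

# Minimal cut sets: the top event occurs if every event in any one set occurs
cut_sets = [("PUMP_A", "PUMP_B"), ("VALVE_C",), ("PUMP_A", "DG_FAIL")]

# Rare-event approximation: sum of cut-set probabilities
top = sum(prod(p[e] for e in cs) for cs in cut_sets)
print(f"Top event probability ~ {top:.3e}")
```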

  10. R-matrix analysis of the 239Pu cross sections up to 1 keV

    International Nuclear Information System (INIS)

    Derrien, H.; de Saussure, G.; Perez, R.B.; Larson, N.M.; Macklin, R.L.

    1986-06-01

    The results are reported of an R-matrix resonance analysis of the 239 Pu neutron cross sections up to 1 keV. After a description of the method of analysis, the nearly 1600 resonance parameters obtained are listed and extensive graphical and numerical comparisons between calculated and measured cross-section and transmission data are presented. 47 refs., 47 figs., 8 tabs
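
    For orientation, resonance parameters enter the cross sections through expressions of the following single-level Breit-Wigner type; this is only a schematic special case, not the multilevel R-matrix formalism actually used in the analysis.

```latex
% Schematic single-level Breit-Wigner capture term for a resonance at energy E_0
\sigma_{n\gamma}(E) \simeq \frac{\pi}{k^{2}}\, g_J\,
\frac{\Gamma_n \Gamma_\gamma}{\left(E - E_0\right)^{2} + \left(\Gamma/2\right)^{2}},
\qquad \Gamma = \Gamma_n + \Gamma_\gamma + \Gamma_f ,
```

    where g_J is the spin statistical factor and Γ_n, Γ_γ and Γ_f are the neutron, radiative and fission widths fitted for each of the listed resonances.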

  11. Overview and Analysis of the Behaviourist Criticism of the "Diagnostic and Statistical Manual of Mental Disorders (DSM)"

    Science.gov (United States)

    Andersson, Gerhard; Ghaderi, Ata

    2006-01-01

    While a majority of cognitive behavioural researchers and clinicians adhere to the classification system provided in the "Diagnostic and Statistical Manual of Mental Disorders (DSM-IV)," strong objections have been voiced among behaviourists who find the dichotomous allocation of patients into psychiatric diagnoses incompatible with the philosophy…

  12. Laboratory Waste Disposal Manual. Revised Edition.

    Science.gov (United States)

    Stephenson, F. G., Ed.

    This manual is designed to provide laboratory personnel with information about chemical hazards and ways of disposing of chemical wastes with minimum contamination of the environment. The manual contains a reference chart section which has alphabetical listings of some 1200 chemical substances with information on the health, fire and reactivity…

  13. Host Families Matter: The Homestay Manual.

    Science.gov (United States)

    Peace Corps, Washington, DC. Information Collection and Exchange Div.

    This manual provides guidelines, sample documents, and sample lesson plans for the trainers, trainees, and host families involved in homestays for Peace Corps volunteers. The manual contains 11 sections that deal with the following topics: (1) introduction; (2) policy, timelines, and responsibilities; (3) medical and financial issues; (4) host…

  14. New Mexico Response to Intervention Framework Manual

    Science.gov (United States)

    New Mexico Public Education Department, 2014

    2014-01-01

    This manual details the instructional framework and guidance on the Response to Intervention (RtI) process in New Mexico. The manual includes: (1) a section on each of the three instructional tiers; (2) a glossary of key terms; (3) sample forms to assist with the Student Assistance Team (SAT) process; and (4) key resources for teachers.

  15. Missouri DECA: 2010-2011 Policy Manual

    Science.gov (United States)

    Missouri Department of Elementary and Secondary Education, 2011

    2011-01-01

    This paper presents the Missouri DECA Policy Manual. This manual contains the following sections: (1) DECA Board of Directors; (2) State Sales Projects; (3) State Officers; (4) Districts; (5) Competitive Events; (6) General Conference Information; (7) Fall Leadership & State Officer Election Conference; (8) Central Region Leadership…

  16. Quantitative right and left ventricular functional analysis during gated whole-chest MDCT: A feasibility study comparing automatic segmentation to semi-manual contouring

    International Nuclear Information System (INIS)

    Coche, Emmanuel; Walker, Matthew J.; Zech, Francis; Crombrugghe, Rodolphe de; Vlassenbroek, Alain

    2010-01-01

    Purpose: To evaluate the feasibility of an automatic, whole-heart segmentation algorithm for measuring global heart function from gated, whole-chest MDCT images. Material and methods: 15 patients with suspicion of PE underwent whole-chest contrast-enhanced MDCT with retrospective ECG synchronization. Two observers computed right and left ventricular functional indices using a semi-manual and an automatic whole-heart segmentation algorithm. The two techniques were compared using Bland-Altman analysis and paired Student's t-test. Measurement reproducibility was calculated using intraclass correlation coefficient. Results: Ventricular analysis with automatic segmentation was successful in 13/15 (86%) and in 15/15 (100%) patients for the right ventricle and left ventricle, respectively. Reproducibility of measurements for both ventricles was perfect (ICC: 1.00) and very good for automatic and semi-manual measurements, respectively. Ventricular volumes and functional indices except right ventricular ejection fraction obtained from the automatic method were significantly higher for the RV compared to the semi-manual methods. Conclusions: The automatic, whole-heart segmentation algorithm enabled highly reproducible global heart function to be rapidly obtained in patients undergoing gated whole-chest MDCT for assessment of acute chest pain with suspicion of pulmonary embolism.

  17. Eddy current manual, volume 2

    International Nuclear Information System (INIS)

    Cecco, V.S.; Van Drunen, G.; Sharp, F.L.

    1984-09-01

    This report on eddy current testing is divided into three sections: (a) Demonstration of Basic Principles, (b) Practical (Laboratory) Tests, and (c) Typical Certification Questions. It is intended to be used as a supplement to "Eddy Current Manual, Volume 1" (AECL-7523) during CSNDT Foundation Level II and III courses.

  18. Technical manual for calculating cooling pond performance

    International Nuclear Information System (INIS)

    Krstulovich, S.F.

    1988-01-01

    This manual is produced in response to a growing number of requests for a technical aid to explain methods for simulating cooling pond performance. As such, it is a compilation of reports, charts and graphs developed through the years for use in analyzing situations. Section II contains a report summarizing the factors affecting cooling pond performance and lists statistical parameters used in developing performance simulations. Section III contains the graphs of simulated cooling pond performance on an hourly basis for various combinations of criteria (wind, solar, depth, air temperature and humidity) developed from the report in Section II. Section IV contains correspondence describing how to develop further data from the graphs in Section III, as well as mathematical models for the system of performance calculation. Section V contains the formulas used to simulate cooling pond performances in a cascade arrangement, such as the Fermilab Main Ring ponds. Section VI contains the calculations currently in use to evaluate the Main Ring pond performance based on current flows and Watts loadings. Section VII contains the overall site drawing of the Main Ring cooling ponds with thermal analysis and physical data

  19. The accuracy of frozen section analysis in ultrasound-guided core needle biopsy of breast lesions

    International Nuclear Information System (INIS)

    Brunner, Andreas H; Sagmeister, Thomas; Kremer, Jolanta; Riss, Paul; Brustmann, Hermann

    2009-01-01

    Limited data are available to evaluate the accuracy of frozen section analysis in ultrasound-guided core needle biopsy of the breast. In a retrospective analysis, data of 120 consecutive handheld ultrasound-guided 14-gauge automated core needle biopsies (CNB) in 109 consecutive patients with breast lesions between 2006 and 2007 were evaluated. In our outpatient clinic 120 CNB were performed. In 59/120 (49.2%) cases we compared the histological diagnosis on frozen sections with that on paraffin sections of the CNB and finally with the result of open biopsy. Of the cases, 42/59 (71.2%) proved to be malignant and 17/59 (28.8%) benign in the definitive histology. 2/59 (3.3%) biopsies had a false negative frozen section result. No false positive results of the intraoperative frozen section analysis were obtained, resulting in a sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of 95%, 100%, 100% and 90%, respectively. Histological and morphobiological parameters showed no relevance for correct frozen section analysis. In cases of malignancy, the time between diagnosis and definitive treatment could not be reduced by frozen section analysis. Frozen section analysis of suspect breast lesions sampled by CNB displays good sensitivity/specificity characteristics. Immediate investigation of CNB is an accurate diagnostic tool and an important step in reducing psychological strain by minimizing the period of uncertainty in patients with a breast tumor
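
    The diagnostic indices quoted above follow directly from the 2 x 2 counts reported in the abstract (42 malignant and 17 benign lesions, 2 false-negative and 0 false-positive frozen sections); a minimal Python sketch of that arithmetic is given here for orientation only.

        # Counts reconstructed from the abstract.
        tp, fn = 42 - 2, 2        # true positives, false negatives
        tn, fp = 17, 0            # true negatives, false positives

        sensitivity = tp / (tp + fn)   # 40/42 = 0.95
        specificity = tn / (tn + fp)   # 17/17 = 1.00
        ppv = tp / (tp + fp)           # 40/40 = 1.00
        npv = tn / (tn + fn)           # 17/19 = 0.89, reported as 90%

        print(sensitivity, specificity, ppv, npv)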

  20. 3D automatic quantification applied to optically sectioned images to improve microscopy analysis

    Directory of Open Access Journals (Sweden)

    JE Diaz-Zamboni

    2009-08-01

    Full Text Available New fluorescence microscopy techniques, such as confocal or digital deconvolution microscopy, make it easy to obtain three-dimensional (3D) information from specimens. However, there are few 3D quantification tools that allow information to be extracted from these volumes, so the amount of information acquired by these techniques is difficult to manipulate and analyze manually. The present study describes a model-based method that, for the first time, shows 3D visualization and quantification of fluorescent apoptotic body signals from optical serial sections of porcine hepatocyte spheroids, correlating them to their morphological structures. The method consists of an algorithm that counts apoptotic bodies in a spheroid structure and extracts information from them, such as their centroids in Cartesian and radial coordinates relative to the spheroid centre, and their integrated intensity. 3D visualization of the extracted information allowed us to quantify the distribution of apoptotic bodies in three different zones of the spheroid.

  1. HEFF---A user's manual and guide for the HEFF code for thermal-mechanical analysis using the boundary-element method

    International Nuclear Information System (INIS)

    St John, C.M.; Sanjeevan, K.

    1991-12-01

    The HEFF Code combines a simple boundary-element method of stress analysis with the closed form solutions for constant or exponentially decaying heat sources in an infinite elastic body to obtain an approximate method for analysis of underground excavations in a rock mass with heat generation. This manual describes the theoretical basis for the code, the code structure, model preparation, and the steps taken to assure that the code correctly performs its intended functions. The material contained within the report addresses the Software Quality Assurance Requirements for the Yucca Mountain Site Characterization Project. 13 refs., 26 figs., 14 tabs

  2. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jakeman, John Davis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stephens, John Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vigil, Dena M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wildey, Timothy Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bohnhoff, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hu, Kenneth T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dalbey, Keith R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauman, Lara E [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hough, Patricia Diane [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-05-01

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  3. Tank farms criticality safety manual

    International Nuclear Information System (INIS)

    FORT, L.A.

    2003-01-01

    This document defines the Tank Farms Contractor (TFC) criticality safety program, as required by Title 10 Code of Federal Regulations (CFR), Subpart 830.204(b)(6), ''Documented Safety Analysis'' (10 CFR 830.204(b)(6)), and US Department of Energy (DOE) O 420.1A, Facility Safety, Section 4.3, ''Criticality Safety.'' In addition, this document contains certain best management practices, adopted by TFC management based on successful Hanford Site facility practices. Requirements in this manual are based on the contractor requirements document (CRD) found in Attachment 2 of DOE O 420.1A, Section 4.3, ''Nuclear Criticality Safety,'' and the cited revisions of applicable standards published jointly by the American National Standards Institute (ANSI) and the American Nuclear Society (ANS) as listed in Appendix A. As an informational device, requirements directly imposed by the CRD or ANSI/ANS Standards are shown in boldface. Requirements developed as best management practices through experience and maintained consistent with Hanford Site practice are shown in italics. Recommendations and explanatory material are provided in plain type

  4. State of laboratory manual instruction in California community college introductory (non-majors) biology laboratory instruction

    Science.gov (United States)

    Priest, Michelle

    College students must complete a life science course prior to graduation for a bachelor's degree. Generally, the course has lecture and laboratory components. It is in the laboratory where there are exceptional opportunities for exploration, challenge and application of the material learned; optimally, this would utilize the best of inquiry-based approaches. Most community colleges use a home-grown or self-written laboratory manual to direct the work in the laboratory period. Little was known about the motivation for, development of, and adaptation in use of these manuals, or about their future in light of the recent learning reform in California Community Colleges, Student Learning Outcomes. Extensive interviews were conducted with laboratory manual authors to determine the motivation, the process of development, who was involved, and the learning framework used in the creation of the manuals. Manual authors were also asked for their ideas about the future of the manual, the development of staff and faculty, and the role Student Learning Outcomes would play in the manual. Science faculty who had taught the non-majors biology laboratories for at least two semesters were surveyed on-line about actual practice with the manual, assessment, manual flexibility, faculty training and incorporation of Student Learning Outcomes. Finally, an evaluation of the laboratory manuals was done using an established Laboratory Task Analysis Instrument; the manuals were evaluated on a variety of categories to determine the level of inquiry instruction done by students in the laboratory section. The results were that the development of homegrown laboratory manuals was done by community colleges in Los Angeles and Orange Counties in an effort to minimize the cost of the manual to the students, to utilize all the exercises in a particular lab and to effectively utilize the materials already owned by the department. Further, schools wanted to

  5. ELCOS: the PSI code system for LWR core analysis. Part II: user's manual for the fuel assembly code BOXER

    International Nuclear Information System (INIS)

    Paratte, J.M.; Grimm, P.; Hollard, J.M.

    1996-02-01

    ELCOS is a flexible code system for the stationary simulation of light water reactor cores. It consists of the four computer codes ETOBOX, BOXER, CORCOD and SILWER. The user's manual for the second of these codes, BOXER, is presented here. BOXER calculates the neutronics in Cartesian geometry. The code can roughly be divided into four stages: organisation (choice of the modules, file manipulations, reading and checking of input data); fine-group fluxes and condensation (one-dimensional calculation of fluxes and computation of the group constants of homogeneous materials and cells); two-dimensional calculations (geometrically detailed simulation of the configuration in few energy groups); and burnup (evolution of the nuclide densities as a function of time). This manual shows all input commands which can be used while running the different modules of BOXER. (author) figs., tabs., refs

  6. R-matrix analysis of the 239Pu neutron cross sections

    Energy Technology Data Exchange (ETDEWEB)

    Saussure, G. de; Perez, R.B.; Macklin, R.L.

    1986-03-01

    239Pu neutron cross-section data in the resolved resonance region were analyzed with the R-Matrix Bayesian Program SAMMY. Below 30 eV the cross sections computed with the multilevel parameters are consistent with recent fission and transmission measurements as well as with older capture and alpha measurements. Above 30 eV no suitable transmission data were available and only fission cross-section measurements were analyzed. However, since the analysis conserves the complete covariance matrix, the analysis can be updated by the Bayes method as transmission measurements become available. To date, the analysis of the fission measurements has been completed up to 300 eV.

  7. Cleaning capacity promoted by motor-driven or manual instrumentation using ProTaper Universal system: Histological analysis

    OpenAIRE

    da Frota, Matheus Franco; Filho, Idomeo Bonetti; Berbert, Fábio Luiz Camargo Villela; Sponchiado, Emilio Carlos; Marques, André Augusto Franco; Garcia, Lucas da Fonseca Roberti

    2013-01-01

    Aim: The aim of this study was to assess the cleaning capacity of the ProTaper system using motor-driven or manual instrumentation. Materials and Methods: Ten mandibular molars were randomly separated into 2 groups (n = 5) according to the type of instrumentation performed, as follows: Group 1 - instrumentation with rotary nickel-titanium (Ni-Ti) files using the ProTaper Universal System (Dentsply/Maillefer); and Group 2 - instrumentation with Ni-Ti hand files using ProTaper Universal (Den...

  8. INIS: Database manual

    International Nuclear Information System (INIS)

    2003-01-01

    This document is one in a series of publications known as the INIS Reference Series. It is intended for users of INIS (International Nuclear Information System) output data on various media (FTP file, CD-ROM, e-mail file, earlier magnetic tape, cartridge, etc.). This manual provides a description of each data element including information on contents, structure and usage as well as historical overview of additions, deletions and changes of data elements and their contents that have taken place over the years. Each record contains certain control data fields (001-009), one, two or three bibliographic levels, a set of descriptors, and zero, one or more abstracts, one in English and optionally one or more in another language. In order to facilitate the description of the system, the sequence of data elements is based on the input or, as it is internally called, worksheet format which differs from the exchange format described in the manual IAEA-INIS-7. A separate section is devoted to each data element and deviations from the exchange format are indicated whenever present. As the Record Leader and the Directory are sufficiently explained in Chapter 3.1 of IAEA-INIS-7, the contents of this manual are limited to control fields and data fields; the detailed explanations are intended to supplement the basic information given in Chapter 3.2 of IAEA-INIS-7. Bibliographic levels are used to identify component parts of a publication, i.e. chapters in a book, articles in a journal issue, conference papers in a proceedings volume. All bibliographic levels contained in a record are given in a control data field. Each bibliographic level identifier appears in the subdirectory with a pointer to its position in the record

  9. Automated Image Analysis in Undetermined Sections of Human Permanent Third Molars

    DEFF Research Database (Denmark)

    Bjørndal, Lars; Darvann, Tron Andre; Bro-Nielsen, Morten

    1997-01-01

    Sixty-three photomicrographs (x100), equally distributed among the three sectioning profiles, were scanned in a high-resolution scanner to produce images for the analysis. After initial user interaction for the description of training classes on one image, an automatic segmentation of the images … sectioning profiles should be analysed. The use of advanced image processing on undemineralized tooth sections provides a rational foundation for further work on the reactions of the odontoblasts to external injuries, including dental caries.

  10. ASSERT-4 user's manual

    International Nuclear Information System (INIS)

    Judd, R.A.; Tahir, A.; Carver, M.B.; Stewart, D.G.; Thibeault, P.R.; Rowe, D.S.

    1984-09-01

    ASSERT-4 is an advanced subchannel code being developed primarily to model single- and two-phase flow and heat transfer in horizontal rod bundles. This manual is intended to facilitate the application of this code to the analysis of flow in reactor fuel channels. It contains a brief description of the thermalhydraulic model and ASSERT-4 solution scheme, and other information required by users. This other information includes a detailed discussion of input data requirements, a sample problem and solution, and information describing how to access and run ASSERT-4 on the Chalk River computers

  11. NASCAP programmer's reference manual

    Science.gov (United States)

    Mandell, M. J.; Stannard, P. R.; Katz, I.

    1993-05-01

    The NASA Charging Analyzer Program (NASCAP) is a computer program designed to model the electrostatic charging of complicated three-dimensional objects, both in a test tank and at geosynchronous altitudes. This document is a programmer's reference manual and user's guide. It is designed as a reference to experienced users of the code, as well as an introduction to its use for beginners. All of the many capabilities of NASCAP are covered in detail, together with examples of their use. These include the definition of objects, plasma environments, potential calculations, particle emission and detection simulations, and charging analysis.

  12. 42 Manual

    International Nuclear Information System (INIS)

    Bergstroem, Mats.

    1990-04-01

    A program for two-dimensional analysis of γ-γ coincidence matrices has been written. The program is tailored to the VT340 colour graphic terminal and written in VAX-FORTRAN. The program provides viewing of the matrix as well as projections onto both energy coordinates, a full two-dimensional Gaussian fit of up to five peaks in one arbitrarily shaped area of the matrix, etc. A method of determining the colour scale needed has been developed. (author)

  13. A survey of cross-section sensitivity analysis as applied to radiation shielding

    International Nuclear Information System (INIS)

    Goldstein, H.

    1977-01-01

    Cross section sensitivity studies revolve around finding the change in the value of an integral quantity, e.g. transmitted dose, for a given change in one of the cross sections. A review is given of the principal methodologies for obtaining the sensitivity profiles: principally direct calculations with altered cross sections, and linear perturbation theory. Some of the varied applications of cross section sensitivity analysis are described, including the practice, of questionable value, of adjusting input cross section data sets so as to provide agreement with integral experiments. Finally, a plea is made for using cross section sensitivity analysis as a powerful tool for analysing the transport mechanisms of particles in radiation shields and for constructing models of how cross section phenomena affect the transport. Cross section sensitivities in the shielding area have proved to be highly problem-dependent. Without the understanding afforded by such models, it is impossible to extrapolate the conclusions of cross section sensitivity analysis beyond the narrow limits of the specific situations examined in detail. Some of the elements that might be of use in developing the qualitative models are presented. (orig.) [de]
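
    For orientation, the energy-dependent sensitivity profile discussed in this record is conventionally defined as the relative change in the integral response R per relative change in a partial cross section; a standard textbook form (not quoted from the paper itself) is

        P_x(E) = \frac{\sigma_x(E)}{R}\,\frac{\partial R}{\partial \sigma_x(E)},

    so that the relative change in R produced by a small relative perturbation of \sigma_x is obtained by integrating the profile over energy.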

  14. PROTEUS-SN User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Shemon, Emily R. [Argonne National Lab. (ANL), Argonne, IL (United States); Smith, Micheal A. [Argonne National Lab. (ANL), Argonne, IL (United States); Lee, Changho [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-02-16

    PROTEUS-SN is a part of the SHARP multi-physics suite for coupled multi-physics analysis of nuclear reactors. This user manual describes how to set up a neutron transport simulation with the PROTEUS-SN code. A companion methodology manual describes the theory and algorithms within PROTEUS-SN.

  15. 21 CFR 864.6160 - Manual blood cell counting device.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs; Food and Drug Administration, Department of Health and Human Services; Medical Devices; Hematology and Pathology Devices; Manual Hematology Devices; § 864.6160 Manual blood cell counting device…

  16. 21 CFR 890.5180 - Manual patient rotation bed.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs; Medical Devices; Physical Medicine Devices; Physical Medicine Therapeutic Devices; § 890.5180 Manual patient rotation bed. (a) Identification. A manual patient rotation bed is a device that turns a patient who is…

  17. 21 CFR 868.5915 - Manual emergency ventilator.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs; Medical Devices; Anesthesiology Devices; Therapeutic Devices; § 868.5915 Manual emergency ventilator. (a) Identification. A manual emergency ventilator is a device, usually incorporating a bag and valve, intended to…

  18. 14 CFR 125.75 - Airplane flight manual.

    Science.gov (United States)

    2010-01-01

    Title 14, Aeronautics and Space; Certification and Operations: Airplanes Having a Seating Capacity of 20 or More Passengers or a Maximum Payload Capacity of 6…; § 125.75 Airplane flight manual. (a) Each certificate holder shall keep a current approved Airplane Flight Manual or…

  19. Analysis of the 239Pu neutron cross sections from 300 to 2000 eV

    International Nuclear Information System (INIS)

    Derrien, H.; de Saussure, G.

    1990-01-01

    A recent high-resolution measurement of the neutron fission cross section of 239 Pu has allowed the extension from 1 to 2 keV of a previously reported resonance analysis of the neutron cross sections, and an improvement of the previous analysis in the range 0.3 to 1 keV. This report analyzes this region. 8 refs., 1 fig., 2 tabs

  20. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, user's reference manual.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.

  1. DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis:version 4.0 reference manual

    Energy Technology Data Exchange (ETDEWEB)

    Griffin, Joshua D. (Sandia National Labs, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L. (Sandia National Labs, Livermore, CA); Watson, Jean-Paul; Kolda, Tamara Gibson (Sandia National Labs, Livermore, CA); Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J. (Sandia National Labs, Livermore, CA); Hough, Patricia Diane (Sandia National Labs, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Guinta, Anthony A.; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.

  2. Sierra Structural Dynamics Theory Manual

    Energy Technology Data Exchange (ETDEWEB)

    Reese, Garth M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-19

    Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of structural systems. This manual describes the theory behind many of the constructs in Sierra/SD. For a more detailed description of how to use Sierra/SD, we refer the reader to the Sierra/SD User's Notes. Many of the constructs in Sierra/SD are pulled directly from published material; where possible, these materials are referenced herein. However, certain functions in Sierra/SD are specific to our implementation, and we try to be far more complete in those areas. The theory manual was developed from several sources, including general notes, a programmer notes manual, the user's notes and, of course, the material in the open literature.

  3. Coal marketing manual 1986

    Energy Technology Data Exchange (ETDEWEB)

    1986-01-01

    This manual presents information for the use of marketers, consumers, analysts and investors. The information is presented in a series of tables and figures. Statistics are given for: Australian export tonnages and average export values for 1978-1985; international pig iron production 1976 to 1985; and international crude steel production 1979 to 1985. Trends in Australian export tonnages and prices of coal are reviewed. Details of international loading and discharge ports are given, together with a historical summary of shipping freight-rates since 1982. Long term contract prices for thermal and coking coal to Japan are tabulated. A review of coal and standards is given, together with Australian standards for coal and coke. A section on coal quality is included containing information on consumer coal quality preferences and Australian and Overseas coal brands and qualities. Finally an index is given of contact details of Australian and Overseas exporting companies, government departments, and the Australian Coal Association.

  4. Analytical methods for analysis of neutron cross sections of amino acids and proteins

    International Nuclear Information System (INIS)

    Voi, Dante L.; Ferreira, Francisco de O.; Nunes, Rogerio Chaffin; Carvalheira, Luciana; Rocha, Hélio F. da

    2017-01-01

    Two unpublished analytical processes were developed at IEN-CNEN-RJ for the analysis of neutron cross sections of chemical compounds and complex molecules: the method of data parceling and grouping (P and G) and the method of data equivalence and similarity (E and S) of cross-sections. The former allows a complex compound or molecule to be divided so that the parts can be manipulated to construct a value of the neutron cross section for the compound or the entire molecule. The second method allows values of the neutron cross-sections of specific parts of the compound or molecule, such as the amino acid radicals or their parts, to be obtained by comparison. The processes were tested by determining the neutron cross-sections of the 20 human amino acids, and a small database was built for future use in constructing the neutron cross-sections of proteins and other components of human cells, as well as in other industrial applications. (author)

  5. Analytical methods for analysis of neutron cross sections of amino acids and proteins

    Energy Technology Data Exchange (ETDEWEB)

    Voi, Dante L.; Ferreira, Francisco de O.; Nunes, Rogerio Chaffin; Carvalheira, Luciana, E-mail: dante@ien.gov.br, E-mail: fferreira@ien.gov.br, E-mail: Chaffin@ien.gov.br, E-mail: luciana@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Rocha, Hélio F. da, E-mail: helionutro@gmail.com.br [Universidade Federal do Rio de Janeiro (IPPMG/UFRJ), Rio de Janeiro, RJ (Brazil). Instituto de Pediatria

    2017-07-01

    Two unpublished analytical processes were developed at IEN-CNEN-RJ for the analysis of neutron cross sections of chemical compounds and complex molecules: the method of data parceling and grouping (P and G) and the method of data equivalence and similarity (E and S) of cross-sections. The former allows a complex compound or molecule to be divided so that the parts can be manipulated to construct a value of the neutron cross section for the compound or the entire molecule. The second method allows values of the neutron cross-sections of specific parts of the compound or molecule, such as the amino acid radicals or their parts, to be obtained by comparison. The processes were tested by determining the neutron cross-sections of the 20 human amino acids, and a small database was built for future use in constructing the neutron cross-sections of proteins and other components of human cells, as well as in other industrial applications. (author)

  6. SENSIT: a cross-section and design sensitivity and uncertainty analysis code

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE
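
    As a hedged sketch of the first-order uncertainty propagation described in this record (a generic formulation, not a statement of SENSIT's exact internal equations), the relative variance of an integral response R computed from a multigroup sensitivity vector S and a relative covariance matrix C of the cross sections takes the familiar sandwich form

        \left(\frac{\Delta R}{R}\right)^2 \simeq S^{T} C\, S, \qquad S_g = \frac{\sigma_g}{R}\,\frac{\partial R}{\partial \sigma_g},

    and the estimated standard deviation is the square root of this quantity.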

  7. SYVAC3 manual

    International Nuclear Information System (INIS)

    Andres, T.H.

    2000-01-01

    SYVAC3 (Systems Variability Analysis Code, generation 3) is a computer program that implements a method called systems variability analysis to analyze the behaviour of a system in the presence of uncertainty. This method is based on simulating the system many times to determine the variation in behaviour it can exhibit. SYVAC3 specializes in systems representing the transport of contaminants, and has several features to simplify the modelling of such systems. It provides a general tool for estimating environmental impacts from the dispersal of contaminants. This report describes the use and structure of SYVAC3. It is intended for modellers, programmers, operators and reviewers who deal with simulation codes based on SYVAC3. From this manual they can learn how to link a model with SYVAC3, how to set up an input file, and how to extract results from output files. The manual lists the subroutines of SYVAC3 that are available for use by models, and describes their argument lists. It also gives an overview of how routines in the File Reading Package, the Parameter Sampling Package and the Time Series Package can be used by programs outside of SYVAC3. (author)

  8. ARDS User Manual

    Science.gov (United States)

    Fleming, David P.

    2001-01-01

    Personal computers (PCs) are now used extensively for engineering analysis. Their capability exceeds that of mainframe computers of only a few years ago. Programs originally written for mainframes have been ported to PCs to make their use easier. One of these programs is ARDS (Analysis of Rotor Dynamic Systems), which was developed at Arizona State University (ASU) by Nelson et al. to quickly and accurately analyze rotor steady state and transient response using the method of component mode synthesis. The original ARDS program was ported to the PC in 1995. Several extensions were made at ASU to increase the capability of mainframe ARDS. These extensions have also been incorporated into the PC version of ARDS. Each mainframe extension had its own user manual, generally covering only that extension. Thus to exploit the full capability of ARDS required a large set of user manuals. Moreover, necessary changes and enhancements for PC ARDS were undocumented. The present document is intended to remedy those problems by combining all pertinent information needed for the use of PC ARDS into one volume.

  9. SYVAC3 manual

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2000-07-01

    SYVAC3 (Systems Variability Analysis Code, generation 3) is a computer program that implements a method called systems variability analysis to analyze the behaviour of a system in the presence of uncertainty. This method is based on simulating the system many times to determine the variation in behaviour it can exhibit. SYVAC3 specializes in systems representing the transport of contaminants, and has several features to simplify the modelling of such systems. It provides a general tool for estimating environmental impacts from the dispersal of contaminants. This report describes the use and structure of SYVAC3. It is intended for modellers, programmers, operators and reviewers who deal with simulation codes based on SYVAC3. From this manual they can learn how to link a model with SYVAC3, how to set up an input file, and how to extract results from output files. The manual lists the subroutines of SYVAC3 that are available for use by models, and describes their argument lists. It also gives an overview of how routines in the File Reading Package, the Parameter Sampling Package and the Time Series Package can be used by programs outside of SYVAC3. (author)

  10. Appendices section

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2018-04-01

    The last chapter presents some papers related to the subject of the book: 1) Some practical samples of defence in depth analysis for category IV gamma irradiators, and 2) Interaction of both gamma radiation and X-rays with matter. It also includes a sample of a manual and of a checklist for weekly maintenance; a sample of a manual and of a checklist for monthly maintenance; a sample of a manual and of a checklist for quarterly, semiannual and yearly maintenance; a spreadsheet for a shield survey of a JS 8900 irradiator made by MDS Nordion (Canada); a sample of a water monitoring spreadsheet; and two more papers: 1) A commercial game redesigned to aid in the teaching of radioprotection, and 2) Recollecting concepts of radioprotection by applying a redesigned commercial game.

  11. Appendices section

    International Nuclear Information System (INIS)

    2018-01-01

    The last chapter presents some papers related to the subject of the book: 1) Some practical samples of defence in depth analysis for category IV gamma irradiators, and 2) Interaction of both gamma radiation and X-rays with matter. It also includes a sample of a manual and of a checklist for weekly maintenance; a sample of a manual and of a checklist for monthly maintenance; a sample of a manual and of a checklist for quarterly, semiannual and yearly maintenance; a spreadsheet for a shield survey of a JS 8900 irradiator made by MDS Nordion (Canada); a sample of a water monitoring spreadsheet; and two more papers: 1) A commercial game redesigned to aid in the teaching of radioprotection, and 2) Recollecting concepts of radioprotection by applying a redesigned commercial game.

  12. Resonance analysis and evaluation of the 235U neutron induced cross sections

    International Nuclear Information System (INIS)

    Leal, L.C.

    1990-06-01

    Neutron cross sections of fissile nuclei are of considerable interest for the understanding of parameters such as resonance absorption, resonance escape probability, resonance self-shielding, and the dependence of the reactivity on temperature. In the present study, new techniques for the evaluation of the 235 U neutron cross sections are described. The Reich-Moore formalism of the Bayesian computer code SAMMY was used to perform consistent R-matrix multilevel analyses of the selected neutron cross-section data. The Δ3-statistics of Dyson and Mehta, along with high-resolution data and the spin-separated fission cross-section data, have provided the possibility of developing a new methodology for the analysis and evaluation of neutron-nucleus cross sections. The results of the analysis consist of a set of resonance parameters which describe the 235 U neutron cross sections up to 500 eV. The set of resonance parameters obtained through an R-matrix analysis is expected to satisfy statistical properties which lead to information on the nuclear structure. The resonance parameters were tested and showed good agreement with the theory. It is expected that the parametrization of the 235 U neutron cross sections obtained in this dissertation represents the current state of the art in data as well as in theory and, therefore, can be of direct use in reactor calculations. 44 refs., 21 figs., 8 tabs

  13. Two-dimensional cross-section sensitivity and uncertainty analysis for fusion reactor blankets

    International Nuclear Information System (INIS)

    Embrechts, M.J.

    1982-02-01

    A two-dimensional sensitivity and uncertainty analysis for the heating of the TF coil for the FED (fusion engineering device) blanket was performed. The uncertainties calculated are of the same order of magnitude as those resulting from a one-dimensional analysis. The largest uncertainties were caused by the cross section uncertainties for chromium

  14. Analysis of Rosen piezoelectric transformers with a varying cross-section.

    Science.gov (United States)

    Xue, H; Yang, J; Hu, Y

    2008-07-01

    We study the effects of a varying cross-section on the performance of Rosen piezoelectric transformers operating with length extensional modes of rods. A theoretical analysis is performed using an extended version of a one-dimensional model developed in a previous paper. Numerical results based on the theoretical analysis are presented.

  15. Reliability and reproducibility analysis of the Cobb angle and assessing sagittal plane by computer-assisted and manual measurement tools.

    Science.gov (United States)

    Wu, Weifei; Liang, Jie; Du, Yuanli; Tan, Xiaoyi; Xiang, Xuanping; Wang, Wanhong; Ru, Neng; Le, Jinbo

    2014-02-06

    Although many studies on the reliability and reproducibility of measurement have been performed for the coronal Cobb angle, few results on reliability and reproducibility are reported for sagittal alignment measurement including the pelvis. We usually use SurgimapSpine software to measure the Cobb angle in our studies; however, there are no reports to date on the reliability and reproducibility of its measurements. Sixty-eight standard standing posteroanterior whole-spine radiographs were reviewed. Three examiners carried out the measurements independently, using manual measurement on the X-ray radiographs and SurgimapSpine software on the computer. Parameters measured included pelvic incidence, sacral slope, pelvic tilt, lumbar lordosis (LL), thoracic kyphosis, and coronal Cobb angle. SPSS 16.0 software was used for statistical analyses. The means, standard deviations, intraclass and interclass correlation coefficients (ICC), and 95% confidence intervals (CI) were calculated. There was no notable difference between the two tools (P = 0.21) for the coronal Cobb angle. For the sagittal plane parameters, the ICC of intraobserver reliability for the manual measures varied from 0.65 (T2-T5 angle) to 0.95 (LL angle); for the SurgimapSpine tool, the ICC ranged from 0.75 to 0.98. No significant difference in intraobserver reliability was found between the two measurements (P > 0.05). As for interobserver reliability, measurements with the SurgimapSpine tool had better ICC (0.71 to 0.98 vs 0.59 to 0.96) and Pearson's coefficients (0.76 to 0.99 vs 0.60 to 0.97). The reliability of the SurgimapSpine measures was significantly higher for all parameters except the coronal Cobb angle, where the difference was not significant (P > 0.05). Although the differences between the two methods are very small, the results of this study indicate that SurgimapSpine measurement is equivalent to the traditional manual method for the coronal Cobb angle, but is advantageous in spino

  16. Traditional manual acupuncture combined with rehabilitation therapy for shoulder hand syndrome after stroke within the Chinese healthcare system: a systematic review and meta-analysis.

    Science.gov (United States)

    Peng, Le; Zhang, Chao; Zhou, Lan; Zuo, Hong-Xia; He, Xiao-Kuo; Niu, Yu-Ming

    2018-04-01

    To investigate the effectiveness of traditional manual acupuncture combined with rehabilitation therapy versus rehabilitation therapy alone for shoulder hand syndrome after stroke. PubMed, EMBASE, the Cochrane Library, Chinese Biomedicine Database, China National Knowledge Infrastructure, VIP Information Database, Wan Fang Database and reference lists of the eligible studies were searched up to July 2017 for relevant studies. Randomized controlled trials that compared the combined effects of traditional manual acupuncture and rehabilitation therapy to rehabilitation therapy alone for shoulder hand syndrome after stroke were included. Two reviewers independently screened the searched records, extracted the data and assessed risk of bias of the included studies. The treatment effect sizes were pooled in a meta-analysis using RevMan 5.3 software. A total of 20 studies involving 1918 participants were included in this study. Compared to rehabilitation therapy alone, the combined therapy significantly reduced pain on the visual analogue scale and improved limb movement on the Fugl-Meyer Assessment scale and the performance of activities of daily living (ADL) on the Barthel Index scale or Modified Barthel Index scale. Of these, the visual analogue scale score changes were significantly higher (mean difference = 1.49, 95% confidence interval = 1.15-1.82, P < 0.00001) favoring the combined therapy after treatment, with severe heterogeneity (I² = 71%, P = 0.0005). Current evidence suggests that traditional manual acupuncture integrated with rehabilitation therapy is more effective in alleviating pain, improving limb movement and ADL. However, considering the relatively low quality of available evidence, further rigorously designed and large-scale randomized controlled trials are needed to confirm the results.
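
    The pooled mean differences in this record were produced with RevMan 5.3; purely as an illustration of the underlying step, a generic inverse-variance pooling of study-level mean differences (shown under a fixed-effect assumption with hypothetical inputs, not the review's data) can be sketched in Python as follows.

        import numpy as np

        # Hypothetical study-level mean differences and their standard errors.
        md = np.array([1.2, 1.6, 1.5])
        se = np.array([0.30, 0.25, 0.40])

        w = 1.0 / se**2                          # inverse-variance weights
        pooled_md = np.sum(w * md) / np.sum(w)   # weighted mean difference
        pooled_se = np.sqrt(1.0 / np.sum(w))
        ci = (pooled_md - 1.96 * pooled_se, pooled_md + 1.96 * pooled_se)

        print(f"pooled MD = {pooled_md:.2f}, 95% CI = {ci[0]:.2f} to {ci[1]:.2f}")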

  17. CTF Theory Manual

    Energy Technology Data Exchange (ETDEWEB)

    Avramova, Maria N. [Pennsylvania State Univ., University Park, PA (United States); Salko, Robert K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-05-25

    Coolant-Boiling in Rod Arrays - Two Fluids (COBRA-TF) is a thermal/hydraulic (T/H) simulation code designed for light water reactor (LWR) vessel analysis. It uses a two-fluid, three-field (i.e. fluid film, fluid drops, and vapor) modeling approach. Both sub-channel and 3D Cartesian forms of the 9 conservation equations are available for LWR modeling. The code was originally developed by Pacific Northwest Laboratory in 1980 and has been used and modified by several institutions over the last few decades. COBRA-TF also found use at the Pennsylvania State University (PSU) by the Reactor Dynamics and Fuel Management Group (RDFMG) and has been improved, updated, and subsequently re-branded as CTF. As part of the improvement process, it was necessary to generate sufficient documentation for the open-source code, which had lacked such material upon being adopted by RDFMG. This document serves mainly as a theory manual for CTF, detailing the many two-phase heat transfer, drag, and important accident scenario models contained in the code as well as the numerical solution process utilized. Coding of the models is also discussed, all with consideration for updates that have been made when transitioning from COBRA-TF to CTF. Further documentation outside of this manual is also available at RDFMG, which focuses on code input deck generation and source code global variable and module listings.

  18. User's Manual for the Naval Interactive Data Analysis System-Climatologies (NIDAS-C), Version 2.0

    Science.gov (United States)

    Abbott, Clifton

    1996-01-01

    This technical note provides the user's manual for the NIDAS-C system developed for the Naval Oceanographic Office. NIDAS-C operates using numerous oceanographic data categories stored in an installed version of the Naval Environmental Operational Nowcast System (NEONS), a relational database management system (rdbms) which employs the ORACLE proprietary rdbms engine. Data management, configuration, and control functions for the supporting rdbms are performed externally. NIDAS-C stores and retrieves data to/from the rdbms but exercises no direct internal control over the rdbms or its configuration. Data are also ingested into the rdbms, for use by NIDAS-C, by external data acquisition processes. The data categories employed by NIDAS-C are as follows: Bathymetry - ocean depth at

  19. SECTIONAL ANALYSIS OF POTENTIAL CONSUMERS OF RETAIL TRADING SERVICES OF POPULATION OF IZHEVSK

    Directory of Open Access Journals (Sweden)

    N.G. Sokolova

    2009-06-01

    Full Text Available Social trends and preferences of potential consumers of retail services in the sale of food products in Izhevsk are studied on the basis of marketing research data. A sectional analysis of this market is carried out, and the profile of the selected market section is described. The article contains a calculation of the total market demand for retail trading services in Izhevsk at a point in 2008.

  20. Graphs of neutron cross sections in JSD1000 for radiation shielding safety analysis

    International Nuclear Information System (INIS)

    Yamano, Naoki

    1984-03-01

    Graphs of neutron cross sections and self-shielding factors in the JSD1000 library are presented for radiation shielding safety analysis. The compilation contains various reaction cross sections for 42 nuclides from 1 H to 241 Am in the energy range from 3.51 x 10^-4 eV to 16.5 MeV. The Bondarenko-type self-shielding factors of each reaction are given by the background cross sections from σ0 = 0 to σ0 = 10000. (author)
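
    For readers unfamiliar with the quantity tabulated in JSD1000, the Bondarenko-type self-shielding factor for a reaction x is commonly defined (this is the standard general form, not a statement of the library's exact processing) as

        f_x(\sigma_0, T) = \frac{\langle\sigma_x\rangle_{\sigma_0,T}}{\langle\sigma_x\rangle_{\sigma_0\to\infty}},
        \qquad
        \langle\sigma_x\rangle_{\sigma_0,T} =
        \frac{\int \dfrac{\sigma_x(E,T)}{\sigma_t(E,T)+\sigma_0}\,\phi(E)\,dE}
             {\int \dfrac{\phi(E)}{\sigma_t(E,T)+\sigma_0}\,dE},

    where \sigma_0 is the background cross section per atom of the resonant nuclide, \phi(E) is a smooth weighting spectrum, and \sigma_0 \to \infty corresponds to infinite dilution.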

  1. Comparative analysis of the neutron cross-sections of iron from various evaluated data libraries

    International Nuclear Information System (INIS)

    Bychkov, V.M.; Vozyakov, V.V.; Manokhin, V.N.; Smoll, F.; Resner, P.; Seeliger, D.; Hermsdorf, D.

    1983-09-01

    A comparative analysis of the neutron cross-sections of iron from the evaluated nuclear data libraries SOKRATOR, KEDAK and ENDL is performed in the energy interval from 0.025 eV to 20 MeV. Some of the iron cross-sections from the SOKRATOR library are revised, and new data, obtained by using new experimental data and more comprehensive theoretical methods, are recommended. As a result, a new version of the iron neutron cross-section file (BNF-2012) is produced for the SOKRATOR library. (author)

  2. Analysis of Sodium-23 Data Cross-Sections for Coolant on Generation IV Reactor - SFR

    International Nuclear Information System (INIS)

    Suwoto; Zuhair

    2009-01-01

    The integral tests of the sodium-23 neutron cross-sections for coolant contained in the JENDL-3.3, ENDF/B-VII.0, BROND-2.2 and JEFF-3.1 files have been performed. Sodium-23 cross-sections such as the total, elastic scattering, inelastic scattering and radiative capture cross-sections were analyzed for several temperatures, i.e. 300K, 800K and 1500K. The analysis of the sodium-23 total cross-sections is based on the experimental result of MAEKER, R.E. through a Broomstick experiment calculation. Differences among the evaluated nuclear data files for the elastic scattering, inelastic scattering and radiative capture cross-sections were analyzed and compared to ENDF/B-VII.0 as the standard reference. The analysis of the sodium-23 total cross-sections through the Broomstick calculation shows that the JENDL-3.3 file gives the best result, with a C/E statistical average value of 1.1043, compared to the other nuclear data files. The differences in the sodium-23 total cross-sections in the JEFF-3.1 file for all working temperatures, especially for energies of 40 keV and 1 MeV-2 MeV, are about 0.2%. Meanwhile, relatively small differences of about ±0.1% in the inelastic total scattering cross-sections are shown for all temperatures in JENDL-3.3. On the other hand, the BROND-2.2 file gives sodium-23 inelastic total scattering cross-sections about ±6% higher in the energy range 450 keV-550 keV. Clearly significant differences appear in the sodium-23 radiative capture cross-sections for the BROND-2.2 file, which at 109.659 keV is higher by more than 446%, while JENDL-3.3 and JEFF-3.1 are 16% higher than the ENDF/B-VII.0 file. Overall, the results show that JENDL-3.3, ENDF/B-VII.0, BROND-2.2 and JEFF-3.1 differ only slightly in the total, elastic scattering and inelastic scattering cross-sections, except for the BROND-2.2 radiative capture cross-sections, which show larger discrepancies. (author)

  3. Effectiveness of Trigger Point Manual Treatment on the Frequency, Intensity, and Duration of Attacks in Primary Headaches: A Systematic Review and Meta-Analysis of Randomized Controlled Trials

    Directory of Open Access Journals (Sweden)

    Luca Falsiroli Maistrello

    2018-04-01

    Full Text Available Background: A variety of interventions has been proposed for symptom relief in primary headaches. Among these, manual trigger point (TrP) treatment is gaining popularity, but its effects have not yet been investigated. Objective: The aim was to establish the effectiveness of manual TrP treatment compared to minimal active or no active interventions in terms of frequency, intensity, and duration of attacks in adults with primary headaches. Methods: We searched the MEDLINE, COCHRANE, Web of Science, and PEDro databases up to November 2017 for randomized controlled trials (RCTs). Two independent reviewers appraised the risk of bias (RoB) and used the grading of recommendations, assessment, development, and evaluation (GRADE) approach to evaluate the overall quality of evidence. Results: Seven RCTs that compared manual treatment vs minimal active intervention were included: 5 focused on tension-type headache (TTH) and 2 on migraine (MH); 3 out of 7 RCTs had high RoB. Combined TTH and MH results show a statistically significant reduction for all outcomes after treatment compared to controls, but the level of evidence was very low. Subgroup analysis showed a statistically significant reduction in attack frequency (no. of attacks per month) after treatment in TTH (MD −3.50; 95% CI from −4.91 to −2.09; 4 RCTs) and in MH (MD −1.92; 95% CI from −3.03 to −0.80; 2 RCTs). Pain intensity (0–100 scale) was reduced in TTH (MD −12.83; 95% CI from −19.49 to −6.17; 4 RCTs) and in MH (MD −13.60; 95% CI from −19.54 to −7.66; 2 RCTs). Duration of attacks (hours) was reduced in TTH (MD −0.51; 95% CI from −0.97 to −0.04; 2 RCTs) and in MH (MD −10.68; 95% CI from −14.41 to −6.95; 1 RCT). Conclusion: Manual TrP treatment of head and neck muscles may reduce the frequency, intensity, and duration of attacks in TTH and MH, but the quality of evidence according to the GRADE approach was very low because of the small number of studies, high RoB, and imprecision of results.

  4. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Models and Results Database (MAR-D) reference manual. Volume 8

    International Nuclear Information System (INIS)

    Russell, K.D.; Skinner, N.L.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The primary function of MAR-D is to create a data repository for completed PRAs and Individual Plant Examinations (IPEs) by providing input, conversion, and output capabilities for data used by IRRAS, SARA, SETS, and FRANTIC software. As probabilistic risk assessments and individual plant examinations are submitted to the NRC for review, MAR-D can be used to convert the models and results from the study for use with IRRAS and SARA. Then, these data can be easily accessed by future studies and will be in a form that will enhance the analysis process. This reference manual provides an overview of the functions available within MAR-D and step-by-step operating instructions

  5. MC2-3: Multigroup Cross Section Generation Code for Fast Reactor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C. H. [Argonne National Lab. (ANL), Argonne, IL (United States); Yang, W. S. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-11-08

    The MC2-3 code is a Multigroup Cross section generation Code for fast reactor analysis, developed by improving the resonance self-shielding and spectrum calculation methods of MC2-2 and integrating the one-dimensional cell calculation capabilities of SDX. The code solves the consistent P1 multigroup transport equation using basic neutron data from ENDF/B data files to determine the fundamental mode spectra for use in generating multigroup neutron cross sections. A homogeneous medium or a heterogeneous slab or cylindrical unit cell problem is solved in ultrafine (~2000) or hyperfine (~400,000) group levels. In the resolved resonance range, pointwise cross sections are reconstructed with Doppler broadening at specified isotopic temperatures. The pointwise cross sections are directly used in the hyperfine group calculation whereas for the ultrafine group calculation, self-shielded cross sections are prepared by numerical integration of the pointwise cross sections based upon the narrow resonance approximation. For both the hyperfine and ultrafine group calculations, unresolved resonances are self-shielded using the analytic resonance integral method. The ultrafine group calculation can also be performed for two-dimensional whole-core problems to generate region-dependent broad-group cross sections. Multigroup cross sections are written in the ISOTXS format for a user-specified group structure. The code is executable on UNIX, Linux, and PC Windows systems, and its library includes all isotopes of the ENDF/B-VII.0 data.
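
    A minimal sketch of the narrow-resonance (Bondarenko-type) self-shielding step mentioned above, assuming a pointwise cross section on an energy grid and a 1/E weighting spectrum; the resonance shape, potential scattering value, and background cross sections are illustrative stand-ins, not MC2-3 data or its actual numerical scheme.

      # Narrow-resonance self-shielded group cross section:
      #   sigma_eff = <sigma(E)/(sigma_t(E)+sigma_0)>_phi / <1/(sigma_t(E)+sigma_0)>_phi
      # with phi(E) ~ 1/E as the weighting spectrum. All inputs are illustrative.
      import numpy as np

      E = np.logspace(0, 2, 20000)                       # eV, one group span
      sigma_pot = 10.0                                   # potential scattering (barns), assumed
      sigma_res = 500.0 / (1.0 + ((E - 36.7) / 0.05)**2) # toy Lorentzian resonance (barns)
      sigma_t = sigma_pot + sigma_res                    # total XS of the resonant nuclide
      sigma_x = sigma_res                                # reaction of interest (resonance part)
      phi_nr = 1.0 / E                                   # asymptotic 1/E flux

      def group_average(f):
          return np.trapz(f, E)

      for sigma_0 in (1e1, 1e2, 1e4):                    # background XS per atom (dilution)
          num = group_average(sigma_x / (sigma_t + sigma_0) * phi_nr)
          den = group_average(1.0 / (sigma_t + sigma_0) * phi_nr)
          print(f"sigma_0 = {sigma_0:8.1f} b -> self-shielded sigma = {num/den:7.3f} b")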

  6. Investigation on macroscopic cross section model for BWR pin-by-pin core analysis - 118

    International Nuclear Information System (INIS)

    Fujita, T.; Tada, K.; Yamamoto, A.; Yamane, Y.; Kosaka, S.; Hirano, G.

    2010-01-01

    A cross section model used in pin-by-pin core analysis for BWRs is investigated. In the pin-by-pin core calculation method, pin-cell averaged cross sections are calculated for many combinations of the state and history variables that influence the cross section and are tabulated prior to the core calculations. The variation of a cross section in a core simulator is classified into two different types, i.e., the instantaneous effect and the history effect. The instantaneous effect is incorporated through the variation of the cross section caused by instantaneous changes of the state variables; for this effect, the exposure, the void fraction, the fuel temperature, the moderator temperature and the control rod are used as indexes. The history effect is the cumulative effect of the state variables, and it is treated with a unified approach using the spectral history. To confirm the accuracy of the cross section model, the pin-by-pin fission rate distribution and the k-infinity of the fuel assembly obtained with the tabulated and the reference cross sections are compared. For the instantaneous effect, the present cross section model reproduces the reference results well for all off-nominal conditions. For the history effect, however, considerable differences in both the pin-by-pin fission rate distribution and the k-infinity are observed at high exposure points. (authors)
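
    To illustrate the tabulate-then-interpolate idea (not the authors' actual cross section model), the sketch below stores a pin-cell macroscopic cross section on a small grid of two instantaneous state variables and interpolates it at a queried core condition; the grid points and table values are hypothetical placeholders.

      # Multilinear interpolation of a tabulated pin-cell macroscopic cross section
      # over two instantaneous state variables (void fraction, fuel temperature).
      # Table values are hypothetical placeholders.
      import numpy as np
      from scipy.interpolate import RegularGridInterpolator

      void = np.array([0.0, 0.4, 0.8])                  # void fraction grid
      tfuel = np.array([600.0, 900.0, 1200.0])          # fuel temperature grid (K)

      # Sigma_a (1/cm) tabulated offline by the lattice code (placeholder numbers):
      sigma_a = np.array([[0.0120, 0.0118, 0.0116],
                          [0.0114, 0.0112, 0.0110],
                          [0.0107, 0.0105, 0.0103]])

      table = RegularGridInterpolator((void, tfuel), sigma_a)

      # The core simulator queries the table at the local instantaneous state:
      state = np.array([[0.55, 1050.0]])
      print("interpolated Sigma_a =", float(table(state)[0]), "1/cm")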

  7. 13th New Mexico Analysis Seminar & Western Spring Sectional Meeting of the AMS

    CERN Document Server

    Marcantognini, Stefania; Stokolos, Alexander; Urbina, Wilfredo. Harmonic analysis, partial differential equations, complex analysis, Banach spaces, and operator theory: celebrating Cora Sadosky's life

    2016-01-01

    Covering a range of subjects from operator theory and classical harmonic analysis to Banach space theory, this book contains survey and expository articles by leading experts in their corresponding fields, and features fully-refereed, high-quality papers exploring new results and trends in spectral theory, mathematical physics, geometric function theory, and partial differential equations. Graduate students and researchers in analysis will find inspiration in the articles collected in this volume, which emphasize the remarkable connections between harmonic analysis and operator theory. Another shared research interest of the contributors of this volume lies in the area of applied harmonic analysis, where a new notion called chromatic derivatives has recently been introduced in communication engineering. The material for this volume is based on the 13th New Mexico Analysis Seminar held at the University of New Mexico, April 3-4, 2014 and on several special sections of the Western Spring Sectional Meeting at th...

  8. Neutron cross-sections database for amino acids and proteins analysis

    Energy Technology Data Exchange (ETDEWEB)

    Voi, Dante L.; Ferreira, Francisco de O.; Nunes, Rogerio Chaffin, E-mail: dante@ien.gov.br, E-mail: fferreira@ien.gov.br, E-mail: Chaffin@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Rocha, Helio F. da, E-mail: hrocha@gbl.com.br [Universidade Federal do Rio de Janeiro (IPPMG/UFRJ), Rio de Janeiro, RJ (Brazil). Instituto de Pediatria

    2015-07-01

    Biological materials may be studied using neutrons as an unconventional analytical tool. Dynamics and structure data can be obtained for amino acids, proteins, and other cellular components through neutron cross-section determinations, especially for applications in nuclear purity and conformation analysis. The instrument used for this is the crystal spectrometer of the Instituto de Engenharia Nuclear (IEN-CNEN-RJ), the only one in Latin America that uses neutrons for this type of analysis; it is installed in one of the irradiation channels of the Argonauta reactor. The experimental values obtained are compared with values calculated from literature data through a rigorous analysis of the chemical composition, conformation, and molecular structure of the materials. A neutron cross-section database was constructed to assist in determining the molecular dynamics, structure, and formulae of biological materials. The database contains neutron cross-section values for all amino acids, chemical elements, molecular groups, and auxiliary radicals, as well as the constants and parameters necessary for the analysis. An unprecedented analytical procedure was developed using the neutron cross-section parceling and grouping method for data manipulation. The database is the result of measurements on twenty amino acids supplied by different manufacturers and used for oral administration to hospitalized individuals in nutritional applications. A small data file was also constructed for compounds with different molecular groups, including carbon, nitrogen, sulfur, and oxygen, all linked to hydrogen atoms. A review of the global and national scene in the acquisition of neutron cross-section data, the formation of libraries, and the application of neutrons to the analysis of biological materials is presented. This database has further application in protein analysis, and the neutron cross section of insulin was estimated. (author)
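
    A hedged sketch of the grouping idea described above: a molecular thermal-neutron cross section estimated as the stoichiometric sum of per-element contributions. The element values are approximate free-atom thermal cross sections quoted only for illustration, and glycine is used as an arbitrary example, so this does not reproduce the database's parceling procedure, which also accounts for molecular binding.

      # Estimate a molecular thermal-neutron cross section as the stoichiometric sum
      # of per-element contributions (the "grouping" idea). Element values below are
      # approximate free-atom thermal values (barns) quoted for illustration only;
      # the actual database accounts for molecular binding and conformation effects.
      ELEMENTS = {          # (scattering, absorption at 2200 m/s), barns, approximate
          "H": (20.5, 0.33),
          "C": (4.7, 0.0035),
          "N": (10.0, 1.90),
          "O": (3.8, 0.0002),
      }

      def molecular_cross_section(formula):
          """formula: dict of element -> atom count, e.g. glycine C2H5NO2."""
          scat = sum(n * ELEMENTS[el][0] for el, n in formula.items())
          absn = sum(n * ELEMENTS[el][1] for el, n in formula.items())
          return scat, absn

      glycine = {"C": 2, "H": 5, "N": 1, "O": 2}
      s, a = molecular_cross_section(glycine)
      print(f"glycine (illustrative): scattering ~ {s:.1f} b, absorption ~ {a:.2f} b")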

  9. Neutron cross-sections database for amino acids and proteins analysis

    International Nuclear Information System (INIS)

    Voi, Dante L.; Ferreira, Francisco de O.; Nunes, Rogerio Chaffin; Rocha, Helio F. da

    2015-01-01

    Biological materials may be studied using neutrons as an unconventional analytical tool. Dynamics and structure data can be obtained for amino acids, proteins, and other cellular components through neutron cross-section determinations, especially for applications in nuclear purity and conformation analysis. The instrument used for this is the crystal spectrometer of the Instituto de Engenharia Nuclear (IEN-CNEN-RJ), the only one in Latin America that uses neutrons for this type of analysis; it is installed in one of the irradiation channels of the Argonauta reactor. The experimental values obtained are compared with values calculated from literature data through a rigorous analysis of the chemical composition, conformation, and molecular structure of the materials. A neutron cross-section database was constructed to assist in determining the molecular dynamics, structure, and formulae of biological materials. The database contains neutron cross-section values for all amino acids, chemical elements, molecular groups, and auxiliary radicals, as well as the constants and parameters necessary for the analysis. An unprecedented analytical procedure was developed using the neutron cross-section parceling and grouping method for data manipulation. The database is the result of measurements on twenty amino acids supplied by different manufacturers and used for oral administration to hospitalized individuals in nutritional applications. A small data file was also constructed for compounds with different molecular groups, including carbon, nitrogen, sulfur, and oxygen, all linked to hydrogen atoms. A review of the global and national scene in the acquisition of neutron cross-section data, the formation of libraries, and the application of neutrons to the analysis of biological materials is presented. This database has further application in protein analysis, and the neutron cross section of insulin was estimated. (author)

  10. Manual therapy for tension-type headache related to quality of work life and work presenteeism: Secondary analysis of a randomized controlled trial.

    Science.gov (United States)

    Monzani, Lucas; Espí-López, Gemma Victoria; Zurriaga, Rosario; Andersen, Lars L

    2016-04-01

    The objective of this research is to evaluate the efficacy of manual therapy for tension-type headache (TTH) in restoring workers' quality of work life, and how work presenteeism affects this relation. This study is a secondary analysis of a factorial, randomized clinical trial on manual therapy interventions. Altogether, 80 patients (85% women) with TTH and without current symptoms of any other concomitant disease participated. An experienced therapist delivered the treatment: myofascial inhibitory technique (IT), articulatory technique (AT), combined technique (IT and AT), and control group (no treatment). In general, all treatments as compared to our control group had a large effect (f≥.69) on the improvement of participants' quality of work life. Work presenteeism interacted with the efficacy of TTH treatment type on participants' quality of work life. The inhibitory technique led to higher reports of quality of work life than other treatment options only for participants with a very low frequency of work presenteeism. In turn, TTH articulatory treatment techniques resulted in higher reports of quality of work life for a high to very high work presenteeism frequency. The articulatory manipulation technique is the more effective treatment for improving quality of work life when the frequency of work presenteeism is high. Implications for future research and practice are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Sensitivity analysis of U238 cross section in thermal nuclear systems

    International Nuclear Information System (INIS)

    Amorim, E.S. do; D'Oliveira, A.B.; Oliveira, E.C. de; Moura Neto, C. de.

    1980-01-01

    A sensitivity analysis system is developed for assessing the implications of uncertainties in nuclear data and related computational methods for light water power reactors. Sensitivities at the equilibrium cycle condition are calculated for the few-group macroscopic cross sections of U238 with respect to its 35-group microscopic absorption cross sections, using the batch depletion code SENTEAV and calculation methods similar to those used in industry. The investigation indicates that improvements are needed in specific energy ranges. These results point out directions for worthwhile experimental measurements based on an analysis of costs and economic benefits. (Author) [pt
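
    As a hedged aside on how such sensitivities can be estimated by direct perturbation (this is not the SENTEAV method), the sketch below computes a relative sensitivity coefficient of k-effective to an absorption cross section with central differences; keff_model is a hypothetical one-group stand-in for the real lattice or depletion calculation.

      # Relative sensitivity S = (dk/k) / (dsigma/sigma) estimated by central differences.
      # keff_model is a hypothetical stand-in for a full lattice/depletion calculation.
      def keff_model(sigma_a_238):
          # Toy one-group relation: k decreases as U-238 absorption increases (illustrative).
          nu_sigma_f, sigma_a_other = 0.160, 0.060      # 1/cm, placeholder values
          return nu_sigma_f / (sigma_a_other + sigma_a_238)

      def relative_sensitivity(func, sigma0, rel_step=0.01):
          dp = sigma0 * rel_step
          k_plus, k_minus = func(sigma0 + dp), func(sigma0 - dp)
          k0 = func(sigma0)
          return ((k_plus - k_minus) / (2 * dp)) * (sigma0 / k0)

      sigma_a_238 = 0.045                                # 1/cm, placeholder
      print("S(k; sigma_a_U238) =", round(relative_sensitivity(keff_model, sigma_a_238), 4))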

  12. Composite Beam Cross-Section Analysis by a Single High-Order Element Layer

    DEFF Research Database (Denmark)

    Couturier, Philippe; Krenk, Steen

    2015-01-01

    An analysis procedure of general cross-section properties is presented. The formulation is based on the stress-strain states in the classic six equilibrium modes of a beam by considering a finite thickness slice modelled by a single layer of 3D finite elements. The theory is illustrated by applic...

  13. Challenges of sample preparation for cross sectional EBSD analysis of electrodeposited nickel films

    DEFF Research Database (Denmark)

    Alimadadi, Hossein; Pantleon, Karen

    2009-01-01

    Thorough microstructure and crystallographic orientation analysis of thin films by means of electron backscatter diffraction requires cross-section preparation of the film-substrate compound. During careful preparation, changes of the rather non-stable as-deposited microstructure must be avoided. Different procedures for sample preparation, including mechanical grinding and polishing, electropolishing, and focused ion beam milling, have been applied to a nickel film electrodeposited on top of an amorphous Ni-P layer on a Cu substrate. Reliable EBSD analysis of the whole cross section can be obtained...

  14. Light-water-reactor hydrogen manual

    International Nuclear Information System (INIS)

    Camp, A.L.; Cummings, J.C.; Sherman, M.P.; Kupiec, C.F.; Healy, R.J.; Caplan, J.S.; Sandhop, J.R.; Saunders, J.H.

    1983-06-01

    A manual concerning the behavior of hydrogen in light water reactors has been prepared. Both normal operations and accident situations are addressed. Topics considered include hydrogen generation, transport and mixing, detection, combustion, and mitigation. Basic physical and chemical phenomena are described, and plant-specific examples are provided where appropriate. A wide variety of readers, including operators, designers, and NRC staff, will find parts of this manual useful. Different sections are written at different levels, according to the most likely audience. The manual is not intended to provide specific plant procedures, but rather to provide general guidance that may assist in the development of such procedures

  15. Uncertainty Analysis of Few Group Cross Sections Based on Generalized Perturbation Theory

    International Nuclear Information System (INIS)

    Han, Tae Young; Lee, Hyun Chul; Noh, Jae Man

    2014-01-01

    In this paper, the methodology of the sensitivity and uncertainty analysis code based on GPT is described, and preliminary verification calculations on the PMR200 pin cell problem are carried out. The results are in good agreement with those from TSUNAMI. From this study, it is expected that the MUSAD code based on GPT can produce the uncertainty of the homogenized few-group microscopic cross sections for a core simulator. For sensitivity and uncertainty analyses of general core responses, a two-step method is available: it utilizes the generalized perturbation theory (GPT) for homogenized few-group cross sections in the first step and a stochastic sampling method for general core responses in the second step. The uncertainty analysis procedure based on GPT in the first step needs the generalized adjoint solution from a cell or lattice code; for this, the generalized adjoint solver was integrated into DeCART in our previous work. In this paper, the MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) code, based on classical perturbation theory, is expanded with the capability of GPT-based sensitivity and uncertainty analysis for few-group cross sections. First, the uncertainty analysis method based on GPT is described and, in the next section, the preliminary results of the verification calculation on a VHTR pin cell problem are compared with the results from TSUNAMI of SCALE 6.1
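
    A hedged illustration of the covariance propagation step that follows once GPT sensitivities are available (not the MUSAD implementation): the sandwich rule var(R) = S C S^T applied to made-up three-group sensitivities and a made-up relative covariance matrix.

      # Sandwich rule: relative variance of a response R from a relative covariance
      # matrix C of group cross sections and a relative sensitivity vector S.
      import numpy as np

      # Hypothetical 3-group relative sensitivities of R to a capture cross section:
      S = np.array([0.05, -0.20, -0.35])

      # Hypothetical relative covariance matrix (uncertainties and correlations):
      std = np.array([0.02, 0.04, 0.06])                # 2%, 4%, 6% standard deviations
      corr = np.array([[1.0, 0.5, 0.2],
                       [0.5, 1.0, 0.6],
                       [0.2, 0.6, 1.0]])
      C = np.outer(std, std) * corr

      rel_var = S @ C @ S
      print(f"relative uncertainty of R = {np.sqrt(rel_var):.4%}")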

  16. Effective inelastic scattering cross-sections for background analysis in HAXPES of deeply buried layers

    Energy Technology Data Exchange (ETDEWEB)

    Risterucci, P., E-mail: paul.risterucci@gmail.com [Univ. Grenoble Alpes, F-38000 Grenoble (France); CEA, LETI, MINATEC Campus, F-38054 Grenoble (France); Université de Lyon, Institut des Nanotechnologies de Lyon, 36 avenue Guy de Collongue, 69134 Ecully (France); Department of Physics, Chemistry and Pharmacy, University of Southern Denmark, DK-5230 Odense M (Denmark); Renault, O., E-mail: olivier.renault@cea.fr [Univ. Grenoble Alpes, F-38000 Grenoble (France); CEA, LETI, MINATEC Campus, F-38054 Grenoble (France); Zborowski, C. [Univ. Grenoble Alpes, F-38000 Grenoble (France); CEA, LETI, MINATEC Campus, F-38054 Grenoble (France); Sorbonne Universités, UPMC Univ. Paris 06, CNRS, UMR 7614, Laboratoire de Chimie Physique-Matière et Rayonnement, F-75005, Paris (France); Université de Lyon, Institut des Nanotechnologies de Lyon, 36 avenue Guy de Collongue, 69134 Ecully (France); Department of Physics, Chemistry and Pharmacy, University of Southern Denmark, DK-5230 Odense M (Denmark); Bertrand, D.; Torres, A. [Univ. Grenoble Alpes, F-38000 Grenoble (France); CEA, LETI, MINATEC Campus, F-38054 Grenoble (France); Rueff, J.-P. [Synchrotron SOLEIL, L' Orme des Merisiers Saint-Aubin, BP 48 91192, Gif-sur-Yvette Cedex (France); Sorbonne Universités, UPMC Univ. Paris 06, CNRS, UMR 7614, Laboratoire de Chimie Physique-Matière et Rayonnement, F-75005, Paris (France); Ceolin, D. [Synchrotron SOLEIL, L' Orme des Merisiers Saint-Aubin, BP 48 91192, Gif-sur-Yvette Cedex (France); Grenet, G. [Université de Lyon, Institut des Nanotechnologies de Lyon, 36 avenue Guy de Collongue, 69134 Ecully (France); Tougaard, S. [Department of Physics, Chemistry and Pharmacy, University of Southern Denmark, DK-5230 Odense M (Denmark)

    2017-04-30

    Highlights: • An effective approach for quantitative background analysis in HAXPES spectra of buried layers underneath complex overlayer structures is proposed. • The approach relies on using a weighted sum of the inelastic scattering cross sections of the pure layers. • The method is validated by the study of an advanced power transistor stack after successive annealing steps. • The depth distribution of crucial elements (Ti, Ga) is determined reliably at depths up to nearly 50 nm. - Abstract: Inelastic background analysis of HAXPES spectra was recently introduced as a powerful method to access the elemental distribution in deeply buried layers or interfaces, at depths up to 60 nm below the surface. However, the accuracy of the analysis relies heavily on suitable scattering cross-sections that effectively describe the transport of photoelectrons through overlayer structures consisting of individual layers with potentially very different scattering properties. Here, we show that within Tougaard’s practical framework as implemented in the Quases-Analyze software, the photoelectron transport through thick (25–40 nm) multi-layer structures with widely different cross-sections can be reliably described with an effective cross-section in the form of a weighted sum of the individual cross-sections of each layer. The high-resolution core-level analysis partly provides a guide for determining the nature of the individual cross-sections to be used. We illustrate this novel approach with the practical case of a top Al/Ti bilayer structure in an AlGaN/GaN power transistor device stack before and after successive annealing treatments. The analysis provides reliable insights on the Ti and Ga depth distributions up to nearly 50 nm below the surface.
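
    A minimal sketch of the weighted-sum idea reported above: an effective inelastic-scattering cross section for a stack formed from the individual layer cross sections weighted by each layer's share of the electron path. The layer thicknesses, the two-parameter cross-section shape, and its parameters are assumptions for illustration and are not taken from the Quases-Analyze analysis.

      # Effective inelastic-scattering cross section K_eff(T) for photoelectrons that
      # traverse a multilayer overlayer, formed as a weighted sum of the individual
      # layer cross sections. Layer thicknesses and cross-section parameters are
      # illustrative assumptions only.
      import numpy as np

      T = np.linspace(1.0, 100.0, 500)          # energy loss grid (eV)

      def two_parameter_shape(T, B, C):
          # Generic two-parameter shape resembling a "universal"-type cross section;
          # B and C here are placeholders, not fitted values.
          return B * T / (C + T**2)**2

      layers = [                                 # (thickness nm, cross-section shape)
          (25.0, two_parameter_shape(T, B=3000.0, C=330.0)),   # e.g. Al-like layer (toy)
          (15.0, two_parameter_shape(T, B=4500.0, C=200.0)),   # e.g. Ti-like layer (toy)
      ]

      weights = np.array([t for t, _ in layers])
      weights = weights / weights.sum()          # weight by path length through each layer
      K_eff = sum(w * k for w, (_, k) in zip(weights, layers))
      print("effective cross section computed on", len(T), "energy-loss points")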

  17. Tank waste remediation system process engineering instruction manual

    International Nuclear Information System (INIS)

    ADAMS, M.R.

    1998-01-01

    The purpose of the Tank Waste Remediation System (TWRS) Process Engineering Instruction Manual is to provide guidance and direction to TWRS Process Engineering staff regarding the conduct of business. The objective is to establish a disciplined and consistent approach to business such that the work processes within TWRS Process Engineering are safe, high quality, disciplined, efficient, and consistent with Lockheed Martin Hanford Corporation Policies and Procedures. The sections within this manual are of two types: for compliance and for guidance. 'For compliance' sections are intended to be followed to the letter until such time as they are formally changed per Section 2.0 of this manual. 'For guidance' sections are intended to be used by the staff for guidance in the conduct of work where technical judgment and discernment are required; the guidance sections shall also be changed per Section 2.0 of this manual. The required header for each manual section is illustrated in Section 2.0, Manual Change Control procedure. It is intended that this manual be used as a training and indoctrination resource for employees of the TWRS Process Engineering organization. The manual shall be required reading for all TWRS Process Engineering staff, matrixed, and subcontracted employees

  18. Comprehensive neutron cross-section and secondary energy distribution uncertainty analysis for a fusion reactor

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.; LaBauve, R.J.; Young, P.G.

    1980-05-01

    Using General Atomic's well-documented Power Generating Fusion Reactor (PGFR) design as an example, this report carries out a comprehensive neutron cross-section and secondary energy distribution (SED) uncertainty analysis. The LASL sensitivity and uncertainty analysis code SENSIT is used to calculate reaction cross-section sensitivity profiles and integral SED sensitivity coefficients. These are then folded with covariance matrices and integral SED uncertainties to obtain the resulting uncertainties of three calculated neutronics design parameters: two critical radiation damage rates and a nuclear heating rate. The report documents the first sensitivity-based data uncertainty analysis that incorporates a quantitative treatment of the effects of SED uncertainties. The results demonstrate quantitatively that the ENDF/B-V cross-section data files for C, H, and O, including their SED data, are fully adequate for this design application, while the data for Fe and Ni are at best marginally adequate because they give rise to response uncertainties of up to 25%. Much higher response uncertainties are caused by cross-section and SED data uncertainties in Cu (26 to 45%), tungsten (24 to 54%), and Cr (up to 98%). Specific recommendations are given for re-evaluations of certain reaction cross-sections, secondary energy distributions, and uncertainty estimates

  19. CSTEM User Manual

    Science.gov (United States)

    Hartle, M.; McKnight, R. L.

    2000-01-01

    This manual is a combination of a user manual, theory manual, and programmer manual. The reader is assumed to have some previous exposure to the finite element method. This manual is written with the idea that the CSTEM (Coupled Structural Thermal Electromagnetic-Computer Code) user needs to have a basic understanding of what the code is actually doing in order to properly use the code. For that reason, the underlying theory and methods used in the code are described to a basic level of detail. The manual gives an overview of the CSTEM code: how the code came into existence, a basic description of what the code does, and the order in which it happens (a flowchart). Appendices provide a listing and very brief description of every file used by the CSTEM code, including the type of file it is, what routine regularly accesses the file, and what routine opens the file, as well as special features included in CSTEM.

  20. Analysis of (n,2n) cross-section measurements for nuclei up to mass 238

    International Nuclear Information System (INIS)

    Davey, W.G.; Goin, R.W.; Ross, J.R.

    1975-06-01

    All suitable measurements of the energy dependence of (n,2n) cross sections of all isotopes up to mass 238 have been analyzed. The objectives were to display the quality of the measured data for each isotope and to examine the systematic dependence of the (n,2n) cross section upon N, Z, and A. Graphs and tables are presented of the ratio of the asymptotic (n,2n) and nonelastic cross section to the neutron-asymmetry parameter (N−Z)/A. Similar data are presented for the derived nuclear temperature, T, and level-density parameter, α, as a function of N, Z, and A. This analysis of the results of over 145 experiments on 61 isotopes is essentially a complete review of the current status of (n,2n) cross-section measurements

  1. Analysis on Indications and Causes of Cesarean Section on Pemba Island of Zanzibar in Africa

    Directory of Open Access Journals (Sweden)

    Liping Zhou

    2013-03-01

    Objective: To explore and analyze the indications and causes of cesarean section on Pemba Island of Zanzibar in Africa in order to improve the quality of obstetrics. Methods: 564 patients who underwent cesarean section in Abdulla Mzee Hospital of Pemba from January 2008 to December 2011 were selected, and the data were analyzed retrospectively. Results: The rate of cesarean section in Abdulla Mzee Hospital of Pemba was 10.01%. The primary indications for cesarean section included cephalopelvic disproportion (27.13%), scar uterus (23.40%), preeclampsia and eclampsia (13.30%), fetal distress in uterus (9.40%), fetal factors (9.75%), and complications of pregnancy (6.91%). Conclusion: Cesarean section plays a great role in the treatment of dystocia and some complications of pregnancy and in reducing the mortality of pregnant women and perinatal infants, but in areas of Africa with relatively undeveloped medical conditions, cesarean section still carries great risks. Unnecessary cesarean section cannot reduce the incidence of postpartum hemorrhage and neonatal morbidity. The local medical staff should improve midwifery techniques and establish and perfect a formal antenatal examination system to improve the quality of maternity care.

  2. JASMINE-pro: A computer code for the analysis of propagation process in steam explosions. User's manual

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yanhua; Nilsuwankosit, Sunchai; Moriyama, Kiyofumi; Maruyama, Yu; Nakamura, Hideo; Hashimoto, Kazuichiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2000-12-01

    A steam explosion is a phenomenon where a high temperature liquid gives its internal energy very rapidly to another low temperature volatile liquid, causing very strong pressure build up due to rapid vaporization of the latter. In the field of light water reactor safety research, steam explosions caused by the contact of molten core and coolant has been recognized as a potential threat which could cause failure of the pressure vessel or the containment vessel during a severe accident. A numerical simulation code JASMINE was developed at Japan Atomic Energy Research Institute (JAERI) to evaluate the impact of steam explosions on the integrity of reactor boundaries. JASMINE code consists of two parts, JASMINE-pre and -pro, which handle the premixing and propagation phases in steam explosions, respectively. JASMINE-pro code simulates the thermo-hydrodynamics in the propagation phase of a steam explosion on the basis of the multi-fluid model for multiphase flow. This report, 'User's Manual', gives the usage of JASMINE-pro code as well as the information on the code structures which should be useful for users to understand how the code works. (author)

  3. JASMINE-pro: A computer code for the analysis of propagation process in steam explosions. User's manual

    International Nuclear Information System (INIS)

    Yang, Yanhua; Nilsuwankosit, Sunchai; Moriyama, Kiyofumi; Maruyama, Yu; Nakamura, Hideo; Hashimoto, Kazuichiro

    2000-12-01

    A steam explosion is a phenomenon where a high temperature liquid gives its internal energy very rapidly to another low temperature volatile liquid, causing very strong pressure build up due to rapid vaporization of the latter. In the field of light water reactor safety research, steam explosions caused by the contact of molten core and coolant has been recognized as a potential threat which could cause failure of the pressure vessel or the containment vessel during a severe accident. A numerical simulation code JASMINE was developed at Japan Atomic Energy Research Institute (JAERI) to evaluate the impact of steam explosions on the integrity of reactor boundaries. JASMINE code consists of two parts, JASMINE-pre and -pro, which handle the premixing and propagation phases in steam explosions, respectively. JASMINE-pro code simulates the thermo-hydrodynamics in the propagation phase of a steam explosion on the basis of the multi-fluid model for multiphase flow. This report, 'User's Manual', gives the usage of JASMINE-pro code as well as the information on the code structures which should be useful for users to understand how the code works. (author)

  4. Differential cross-section measurements at the University of Kentucky - Adventures in analysis

    International Nuclear Information System (INIS)

    Vanhoy, J.R.; Garza, E.A.; Steves, J.L.; Hicks, S.F.; Henderson, S.L.; Sidwell, L.C.; Champine, B.R.; Crider, B.P.; Liu, S.H.; Peters, E.E.; Prados-Estevez, F.M.; McEllistrem, M.T.; Ross, T.J.; Yates, S.W.

    2014-01-01

    Elastic and inelastic neutron scattering cross-sections are determined at the University of Kentucky Accelerator Laboratory (UKAL) using time-of-flight techniques at incident energies in the fast neutron region. Measurements have been completed for scattering from 23Na and for the 23Na(n,n'γ) reaction; similar measurements are in progress for 54Fe. Commencing in the summer of 2014, measurements will address 56Fe. An overview of the facilities and instrumentation at UKAL is given, and our measurement and analysis procedures are outlined. Of particular concern are portions of the analysis that limit the accuracy and precision of the measurements. We briefly examine detector efficiencies derived from the 3H(p,n) cross-sections, attenuation and multiple scattering corrections, and neutron and γ-ray cross-section standardizations. (authors)
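
    A hedged aside on the time-of-flight technique mentioned above: the sketch converts a measured flight time over a known path length to neutron energy with the non-relativistic relation E = m v^2 / 2; the flight path and timing values are invented for illustration.

      # Non-relativistic time-of-flight to neutron energy: E = 0.5 * m_n * (L / t)^2.
      M_N = 1.674927e-27        # neutron mass, kg
      EV = 1.602177e-19         # J per eV

      def tof_to_energy_MeV(path_m, time_ns):
          v = path_m / (time_ns * 1e-9)          # flight speed, m/s
          return 0.5 * M_N * v**2 / EV / 1e6     # kinetic energy, MeV

      # Hypothetical example: 4 m flight path, 150 ns flight time
      print(f"E ~ {tof_to_energy_MeV(4.0, 150.0):.2f} MeV")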

  5. Understanding the gender gap in antibiotic prescribing : a cross-sectional analysis of English primary care

    NARCIS (Netherlands)

    Smith, David R M; Dolk, F Christiaan K; Smieszek, Timo; Robotham, Julie V; Pouwels, Koen B

    2018-01-01

    OBJECTIVES: To explore the causes of the gender gap in antibiotic prescribing, and to determine whether women are more likely than men to receive an antibiotic prescription per consultation. DESIGN: Cross-sectional analysis of routinely collected electronic medical records from The Health

  6. Simultaneous analysis of fission and capture cross section with Adler-Adler resonance formula

    International Nuclear Information System (INIS)

    Cao Hengdao; Qiu Guochun

    1989-01-01

    A method for the simultaneous analysis of fission and capture cross sections of fissile nuclides with the Adler-Adler resonance formula, together with the corresponding computer code, is presented. A simple and convenient method for simultaneously correcting the parameters μ and γ is given in order to obtain optimized parameters. The results are satisfactory

  7. Risk Factors for Premature Births: A Cross-Sectional Analysis of ...

    African Journals Online (AJOL)

    Risk Factors for Premature Births: A Cross-Sectional Analysis of Hospital Records in a Cameroonian Health Facility. Andreas Chiabi, Evelyn M Mah, Nicole Mvondo, Seraphin Nguefack, Lawrence Mbuagbaw, Karen K Kamga, Shiyuan Zhang, Emile Mboudou, Pierre F Tchokoteu, Elie Mbonda ...

  8. Exploring Students' Conceptions of Science Learning via Drawing: A Cross-Sectional Analysis

    Science.gov (United States)

    Hsieh, Wen-Min; Tsai, Chin-Chung

    2017-01-01

    This cross-sectional study explored students' conceptions of science learning via drawing analysis. A total of 906 Taiwanese students in 4th, 6th, 8th, 10th, and 12th grade were asked to use drawing to illustrate how they conceptualise science learning. Students' drawings were analysed using a coding checklist to determine the presence or absence…

  9. Radiological Control Manual

    Energy Technology Data Exchange (ETDEWEB)

    1993-04-01

    This manual has been prepared by Lawrence Berkeley Laboratory to provide guidance for site-specific additions, supplements, and clarifications to the DOE Radiological Control Manual. The guidance provided in this manual is based on the requirements given in Title 10 Code of Federal Regulations Part 835, Radiation Protection for Occupational Workers, DOE Order 5480.11, Radiation Protection for Occupational Workers, and the DOE Radiological Control Manual. The topics covered are (1) excellence in radiological control, (2) radiological standards, (3) conduct of radiological work, (4) radioactive materials, (5) radiological health support operations, (6) training and qualification, and (7) radiological records.

  10. EMSL Operations Manual

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Nancy S.

    2009-06-18

    This manual is a general resource tool to assist EMSL users and Laboratory staff within EMSL in locating official policy, practice and subject matter experts. It is not intended to replace or amend any formal Battelle policy or practice. Users of this manual should rely only on Battelle’s Standard Based Management System (SBMS) for official policy. No contractual commitment or right of any kind is created by this manual. Battelle management reserves the right to alter, change, or delete any information contained within this manual without prior notice.

  11. EMSL Operations Manual

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Nancy S.

    2009-03-25

    This manual is a general resource tool to assist EMSL users and Laboratory staff within EMSL in locating official policy, practice and subject matter experts. It is not intended to replace or amend any formal Battelle policy or practice. Users of this manual should rely only on Battelle’s Standard Based Management System (SBMS) for official policy. No contractual commitment or right of any kind is created by this manual. Battelle management reserves the right to alter, change, or delete any information contained within this manual without prior notice.

  12. Radiological Control Manual

    International Nuclear Information System (INIS)

    1993-04-01

    This manual has been prepared by Lawrence Berkeley Laboratory to provide guidance for site-specific additions, supplements, and clarifications to the DOE Radiological Control Manual. The guidance provided in this manual is based on the requirements given in Title 10 Code of Federal Regulations Part 835, Radiation Protection for Occupational Workers, DOE Order 5480.11, Radiation Protection for Occupational Workers, and the DOE Radiological Control Manual. The topics covered are (1) excellence in radiological control, (2) radiological standards, (3) conduct of radiological work, (4) radioactive materials, (5) radiological health support operations, (6) training and qualification, and (7) radiological records

  13. PCs The Missing Manual

    CERN Document Server

    Karp, David

    2005-01-01

    Your vacuum comes with one. Even your blender comes with one. But your PC--something that costs a whole lot more and is likely to be used daily and for tasks of far greater importance and complexity--doesn't come with a printed manual. Thankfully, that's not a problem any longer: PCs: The Missing Manual explains everything you need to know about PCs, both inside and out, and how to keep them running smoothly and working the way you want them to work. A complete PC manual for both beginners and power users, PCs: The Missing Manual has something for everyone. PC novices will appreciate the una

  14. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    The uncertainty evaluation with the statistical method is performed by repeating the transport calculation while sampling the directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method of sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those obtained from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross sections were sampled from both the normal and lognormal distributions. The uncertainties caused by the covariance of (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem noted in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
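
    A minimal sketch of the lognormal sampling idea described above (not the authors' program): parameters of the lognormal distribution are chosen from the nominal cross section and its relative uncertainty so that the sample mean and spread are preserved while every sample stays positive, in contrast to normal sampling.

      # Sample a cross section from a lognormal distribution that preserves the
      # nominal mean and relative standard deviation, guaranteeing positive samples.
      import numpy as np

      rng = np.random.default_rng(42)

      def sample_lognormal(mean, rel_std, n):
          var = (rel_std * mean)**2
          sigma2 = np.log(1.0 + var / mean**2)      # lognormal parameters from moments
          mu = np.log(mean) - 0.5 * sigma2
          return rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=n)

      nominal, rel_unc = 2.70, 0.30                  # barns, 30% relative uncertainty (assumed)
      normal_draw = rng.normal(nominal, rel_unc * nominal, 100000)
      lognormal_draw = sample_lognormal(nominal, rel_unc, 100000)

      print("negative samples, normal   :", int((normal_draw < 0).sum()))
      print("negative samples, lognormal:", int((lognormal_draw < 0).sum()))
      print("lognormal sample mean      :", round(lognormal_draw.mean(), 3))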

  15. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

    The uncertainty evaluation with the statistical method is performed by repeating the transport calculation while sampling the directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method of sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those obtained from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross sections were sampled from both the normal and lognormal distributions. The uncertainties caused by the covariance of (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem noted in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis

  16. HANSF 1.3 Users Manual FAI/98-40-R2 Hanford Spent Nuclear Fuel (SNF) Safety Analysis Model [SEC 1 and 2]

    Energy Technology Data Exchange (ETDEWEB)

    DUNCAN, D.R.

    1999-10-07

    The HANSF analysis tool is an integrated model considering phenomena inside a multi-canister overpack (MCO) spent nuclear fuel container such as fuel oxidation, convective and radiative heat transfer, and the potential for fission product release. This manual reflects the HANSF version 1.3.2, a revised version of 1.3.1. HANSF 1.3.2 was written to correct minor errors and to allow modeling of condensate flow on the MCO inner surface. HANSF 1.3.2 is intended for use on personal computers such as IBM-compatible machines with Intel processors running under Lahey TI or digital Visual FORTRAN, Version 6.0, but this does not preclude operation in other environments.

  17. Stirling engine design manual

    Science.gov (United States)

    Martini, W. R.

    1978-01-01

    This manual is intended to serve both as an introduction to Stirling engine analysis methods and as a key to the open literature on Stirling engines. Over 800 references are listed and these are cross referenced by date of publication, author and subject. Engine analysis is treated starting from elementary principles and working through cycle analysis. Analysis methodologies are classified as first, second or third order depending upon degree of complexity and probable application; first order for preliminary engine studies, second order for performance prediction and engine optimization, and third order for detailed hardware evaluation and engine research. A few comparisons between theory and experiment are made. A second order design procedure is documented step by step with calculation sheets and a worked-out example to follow. Current high power engines are briefly described and a directory of companies and individuals who are active in Stirling engine development is included. Much remains to be done. Some of the more complicated and potentially very useful design procedures are now only referred to. Future support will enable a more thorough job of comparing all available design procedures against experimental data which should soon be available.
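
    To give a flavor of what a first-order estimate looks like, the sketch below evaluates the ideal Stirling cycle work per cycle and its Carnot-limited efficiency for assumed charge, temperatures, compression ratio, and speed; this is a generic ideal-cycle calculation under stated assumptions, not one of the manual's documented procedures.

      # First-order (ideal isothermal processes, perfect regeneration) Stirling estimate.
      # All inputs are illustrative assumptions, not data from the manual.
      import math

      R = 8.314                                    # J/(mol K)

      def ideal_stirling(n_mol, T_hot, T_cold, comp_ratio, freq_hz):
          w_cycle = n_mol * R * (T_hot - T_cold) * math.log(comp_ratio)   # work per cycle, J
          efficiency = 1.0 - T_cold / T_hot                               # Carnot limit
          power = w_cycle * freq_hz                                       # W
          return w_cycle, efficiency, power

      w, eta, p = ideal_stirling(n_mol=0.05, T_hot=900.0, T_cold=330.0,
                                 comp_ratio=2.0, freq_hz=25.0)
      print(f"work/cycle ~ {w:.0f} J, ideal efficiency ~ {eta:.0%}, power ~ {p/1000:.1f} kW")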

  18. Quality manual. Nuclear Regulatory Authority of the Slovak Republic

    International Nuclear Information System (INIS)

    2006-03-01

    This quality manual of the Nuclear Regulatory Authority of the Slovak Republic (UJD) is presented. The basic characteristics of the UJD, the operative control of the quality manual, and the quality management system (QMS) are described. Management responsibility, process realization, measurement, analysis (assessment) and improvement of the quality management system, and cancellation provisions, as well as the abbreviations used in the Quality Manual, are presented.

  19. Manual for environmental radiological surveillance

    International Nuclear Information System (INIS)

    Sumiya, Shuichi; Matsuura, Kenichi; Nakano, Masanao; Takeyasu, Masanori; Morisawa, Masato; Onuma, Toshimitsu; Fujita, Hiroki; Mizutani, Tomoko; Watanabe, Hajime; Sugai, Masamitsu

    2010-03-01

    Environmental radiation monitoring around the Tokai Reprocessing Plant has been conducted by the Nuclear Fuel Cycle Engineering Laboratories, based on 'Safety Regulations for the Reprocessing Plant of JAEA, Chapter IV - Environmental monitoring' and the Environmental Radiation Monitoring Program decided by the Ibaraki prefectural government. Radiation monitoring installations and equipment are also prepared for emergencies. This manual describes (1) the installations for radiological measurement, (2) the installations for meteorological observation, and (3) the environmental data processing system used by the Environmental Protection Section, Radiation Protection Department, in carrying out terrestrial environmental monitoring. The environmental monitoring had been operated according to the manual published in 1993 (PNC TN8520 93-001); the whole manual has now been revised because some of the installations and equipment have been updated in recent years. (author)

  20. Zooplankton Methodology, Collection & identyification - A field manual

    Digital Repository Service at National Institute of Oceanography (India)

    Goswami, S.C.

    and productivity would largely depend upon the use of correct methodology, which involves collection of samples, fixation, preservation, analysis and computation of data. The detailed procedures on all these aspects are given in this manual.

  1. Energy management manual

    Energy Technology Data Exchange (ETDEWEB)

    1979-06-01

    The Jicarilla reservation lies on the San Juan Basin in New Mexico, with vast oil and gas deposits that have been actively developed since the late 1950s. Constraints on Tribal regulation of energy development are discussed in Section I. Section II describes the relationship between Federal agencies and the Tribe; identifies energy management problems; recommends management activities to address the problems; and points out skill requirements. The Tribe has now adopted a formal statement of goals and objectives for its minerals management program, and details of the program are described in Section III. Information on the legal analysis of oil and gas development on the land of the Tribe is given in the appendix. (MCW)

  2. An analysis of MCNP cross-sections and tally methods for low-energy photon emitters.

    Science.gov (United States)

    Demarco, John J; Wallace, Robert E; Boedeker, Kirsten

    2002-04-21

    Monte Carlo calculations are frequently used to analyse a variety of radiological science applications using low-energy (10-1000 keV) photon sources. This study seeks to create a low-energy benchmark for the MCNP Monte Carlo code by simulating the absolute dose rate in water and the air-kerma rate for monoenergetic point sources with energies between 10 keV and 1 MeV. The analysis compares four cross-section datasets as well as the tally method for collision kerma versus absorbed dose. The total photon attenuation coefficient cross-section for low atomic number elements has changed significantly as cross-section data have changed between 1967 and 1989. Differences of up to 10% are observed in the photoelectric cross-section for water at 30 keV between the standard MCNP cross-section dataset (DLC-200) and the most recent XCOM/NIST tabulation. At 30 keV, the absolute dose rate in water at 1.0 cm from the source increases by 7.8% after replacing the DLC-200 photoelectric cross-sections for water with those from the XCOM/NIST tabulation. The differences in the absolute dose rate are analysed when calculated with either the MCNP absorbed dose tally or the collision kerma tally. Significant differences between the collision kerma tally and the absorbed dose tally can occur when using the DLC-200 attenuation coefficients in conjunction with a modern tabulation of mass energy-absorption coefficients.
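
    As a hedged illustration of the collision-kerma concept discussed above (not MCNP's tally implementation), the sketch computes collision kerma for a monoenergetic photon fluence as fluence times energy times the mass energy-absorption coefficient; the coefficient used is an approximate value for water at 100 keV and the fluence is hypothetical.

      # Collision kerma for a monoenergetic photon fluence:
      #   K_col = Phi * E * (mu_en / rho)
      # The mass energy-absorption coefficient below is an approximate value for
      # water at 100 keV, quoted only for illustration.
      MEV_TO_J = 1.602e-13
      MU_EN_RHO_WATER_100KEV = 0.0255      # cm^2/g, approximate

      def collision_kerma_gray(fluence_per_cm2, energy_mev, mu_en_rho_cm2_per_g):
          # cm^2/g * MeV/cm^2 = MeV/g, then convert to J/kg (Gy)
          return fluence_per_cm2 * energy_mev * mu_en_rho_cm2_per_g * MEV_TO_J * 1000.0

      phi = 1.0e9                          # photons/cm^2, hypothetical fluence
      print(f"K_col ~ {collision_kerma_gray(phi, 0.100, MU_EN_RHO_WATER_100KEV):.3e} Gy")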

  3. Buckling analysis for structural sections and stiffened plates reinforced with laminated composites.

    Science.gov (United States)

    Viswanathan, A. V.; Soong, T.-C.; Miller, R. E., Jr.

    1972-01-01

    A classical buckling analysis is developed for stiffened, flat plates composed of a series of linked flat plate and beam elements. Plates are idealized as multilayered orthotropic elements; structural beads and lips are idealized as beams. The loaded edges of the stiffened plate are simply supported and the conditions at the unloaded edges can be prescribed arbitrarily. The plate and beam elements are matched along their common junctions for displacement continuity and force equilibrium in an exact manner. Offsets between elements are considered in the analysis. Buckling under uniaxial compressive load for plates, sections and stiffened plates is investigated. Buckling loads are found as the lowest of all possible general and local failure modes and the mode shape is used to determine whether buckling is a local or general instability. Numerical correlations with existing analysis and test data for plates, sections and stiffened plates including boron-reinforced structures are discussed. In general, correlations are reasonably good.

  4. Buckling analysis for axially compressed flat plates, structural sections, and stiffened plates reinforced with laminated composites

    Science.gov (United States)

    Viswanathan, A. V.; Soong, T.; Miller, R. E., Jr.

    1971-01-01

    A classical buckling analysis is developed for stiffened, flat plates composed of a series of linked plate and beam elements. Plates are idealized as multilayered orthotropic elements. Structural beads and lips are idealized as beams. The loaded edges of the stiffened plate are simply-supported and the conditions at the unloaded edges can be prescribed arbitrarily. The plate and beam elements are matched along their common junctions for displacement continuity and force equilibrium in an exact manner. Offsets between elements are considered in the analysis. Buckling under uniaxial compressive load for plates, sections, and stiffened plates is investigated. Buckling loads are the lowest of all possible general and local failure modes, and the mode shape is used to determine whether buckling is a local or general instability. Numerical correlations with existing analysis and test data for plates, sections, and stiffened plates including boron-reinforced structures are discussed. In general correlations are reasonably good.
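
    As a hedged numerical aside (not the linked-element analysis of the report), the classical critical load of a long, simply supported isotropic plate under uniaxial compression, N_cr = k pi^2 D / b^2 with D = E t^3 / [12(1 - nu^2)], can be evaluated as below for assumed material and geometry; stiffened, laminated sections require the full matrix procedure described above.

      # Classical buckling of a long, simply supported isotropic plate under
      # uniaxial compression: N_cr = k * pi^2 * D / b^2, D = E t^3 / (12 (1 - nu^2)).
      # Material and geometry values are illustrative assumptions (aluminium-like).
      import math

      E = 70.0e9        # Pa, Young's modulus
      nu = 0.33         # Poisson's ratio
      t = 0.002         # m, plate thickness
      b = 0.150         # m, plate width between supports
      k = 4.0           # buckling coefficient, long simply supported plate

      D = E * t**3 / (12.0 * (1.0 - nu**2))          # flexural rigidity, N*m
      N_cr = k * math.pi**2 * D / b**2               # critical load per unit width, N/m
      sigma_cr = N_cr / t                            # critical stress, Pa

      print(f"D = {D:.2f} N*m, N_cr = {N_cr:.0f} N/m, sigma_cr = {sigma_cr/1e6:.1f} MPa")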

  5. Optimization of multi-group cross sections for fast reactor analysis

    International Nuclear Information System (INIS)

    Chin, M. R.; Manalo, K. L.; Edgar, C. A.; Paul, J. N.; Molinar, M. P.; Redd, E. M.; Yi, C.; Sjoden, G. E.

    2013-01-01

    The selection of the number of broad energy groups, the collapsed broad energy group boundaries, and their evaluation into collapsed macroscopic cross sections from a general 238-group ENDF/B-VII library dramatically impacted the k eigenvalue in fast reactor analysis. An analysis was undertaken to assess the minimum number of energy groups that would preserve the problem physics; this involved studies using the 3D deterministic parallel transport code PENTRAN, the 2D deterministic transport code SCALE6.1, the Monte Carlo based MCNP5 code, and the YGROUP cross section collapsing tool on a spatially discretized MOX fuel pin comprised of 21% PuO2-UO2 with sodium coolant. The various cases resulted in differences of a few hundred pcm among the 238-group multigroup reference, cross sections rendered by the YGROUP tool using various reaction- and adjoint-weighted schemes, and a reference continuous-energy MCNP case. Particular emphasis was placed on the higher energies characteristic of fission neutrons in a fast spectrum; adjoint computations were performed to determine the average per-group adjoint fission importance for the MOX fuel pin. This study concluded that at least 10 energy groups are required in the neutron transport calculations to predict the eigenvalue of a fast reactor system to within 250 pcm of the 238-group case. In addition, the cross section collapsing/weighting scheme within YGROUP that provided a collapsed library rendering eigenvalues closest to the reference was the contribution-collapsed, reaction-rate-weighted scheme. A brief analysis of homogenization of the MOX fuel pin is also provided, although more work is in progress in this area. (authors)
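
    A minimal sketch of flux-weighted group collapsing, the basic operation behind the broad-group libraries discussed above; the fine-group cross sections, fluxes, and the two-group boundary are invented, and the contribution- and adjoint-weighted schemes in YGROUP are more elaborate than this plain flux weighting.

      # Flux-weighted collapse of fine-group cross sections into broad groups:
      #   sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g),  for fine groups g in G
      # Fine-group values below are invented placeholders.
      import numpy as np

      sigma_fine = np.array([1.9, 2.1, 2.4, 3.0, 4.5, 7.8])      # barns, 6 fine groups
      phi_fine = np.array([0.30, 0.25, 0.20, 0.12, 0.08, 0.05])  # group fluxes (arbitrary units)
      broad_map = [slice(0, 3), slice(3, 6)]                     # 6 fine groups -> 2 broad groups

      sigma_broad = [float(np.sum(sigma_fine[s] * phi_fine[s]) / np.sum(phi_fine[s]))
                     for s in broad_map]
      print("collapsed cross sections (barns):", [round(x, 3) for x in sigma_broad])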

  6. Analysis of the 235U neutron cross sections in the resolved resonance range

    International Nuclear Information System (INIS)

    Leal, L.C.; de Saussure, G.; Perez, R.B.

    1989-01-01

    Using recent high-resolution measurements of the neutron transmission of 235U and the spin-separated fission cross-section data of Moore et al., a multilevel analysis of the 235U neutron cross sections was performed up to 300 eV. The Dyson-Mehta Δ3 statistic was used to help locate small levels above 100 eV, where resonances are not clearly resolved even in the best-resolution measurements available. The statistical properties of the resonance parameters are discussed

  7. Analysis of the 235U neutron cross sections in the resolved resonance range

    International Nuclear Information System (INIS)

    Leal, L.C.; de Saussure, G.; Perez, R.B.

    1989-01-01

    Using recent high-resolution measurements of the neutron transmission of 235U and the spin-separated fission cross-section data of Moore et al., a multilevel analysis of the 235U neutron cross sections was performed up to 300 eV. The Dyson-Mehta Δ3 statistic was used to help locate small levels above 100 eV, where resonances are not clearly resolved even in the best-resolution measurements available. The statistical properties of the resonance parameters are discussed. 13 refs., 8 figs., 1 tab
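
    A hedged numerical sketch of the Δ3 statistic mentioned above: for an unfolded level sequence, Δ3(L) is the least-squares deviation of the cumulative level count from the best straight line over intervals of length L, averaged over interval positions. The level sequence below is synthetic (uncorrelated Poisson spacings) and the discretized estimate is only illustrative, not the procedure used in the evaluation.

      # Numerical estimate of the Dyson-Mehta Delta_3 statistic for an unfolded
      # level sequence: least-squares deviation of the staircase N(E) from a straight
      # line over windows of length L, averaged over window positions.
      import numpy as np

      rng = np.random.default_rng(0)
      levels = np.cumsum(rng.exponential(1.0, 2000))   # synthetic unfolded levels (Poisson)

      def delta3(levels, L, n_windows=200, n_grid=400):
          starts = np.linspace(levels[0], levels[-1] - L, n_windows)
          vals = []
          for e0 in starts:
              x = np.linspace(e0, e0 + L, n_grid)
              N = np.searchsorted(levels, x)           # staircase: number of levels below x
              a, b = np.polyfit(x, N, 1)               # best-fit straight line
              vals.append(np.mean((N - (a * x + b))**2))
          return np.mean(vals)

      L = 15.0
      print(f"Delta_3({L}) ~ {delta3(levels, L):.3f}  (Poisson expectation is L/15 = {L/15:.3f})")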

  8. 3-D inelastic analysis methods for hot section components. Volume 2: Advanced special functions models

    Science.gov (United States)

    Wilson, R. B.; Banerjee, P. K.

    1987-01-01

    This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of computer codes that permit more accurate and efficient three-dimensional analyses of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components.

  9. Nuclear forensic applications involving high spatial resolution analysis of Trinitite cross-sections

    International Nuclear Information System (INIS)

    Donohue, P.H.; Antonio Simonetti; Koeman, E.C.; Sara Mana; University of Iowa, Iowa City, IA; Burns, P.C.; University of Notre Dame, Notre Dame, IN

    2015-01-01

    This study reports a comprehensive cross-sectional analysis of major and trace element abundances and 240Pu/239Pu ratios within vertically oriented Trinitite thin sections. The upper glassy layer (∼2 mm thick) represents fused desert sand combined with devolatilized fallout from the debris cloud. The vertical distribution of 240Pu/239Pu ratios indicates that residual fuel was incorporated deeper (up to ∼10 mm depth) into Trinitite than previously reported. This requires thorough mixing and disturbance of the upper cm of the blast site prior to or during the initial melting of the desert sand resulting from the nuclear explosion. (author)

  10. Amino acids analysis using grouping and parceling of neutrons cross sections techniques

    International Nuclear Information System (INIS)

    Voi, Dante Luiz Voi; Rocha, Helio Fenandes da

    2002-01-01

    Amino acids used in parenteral administration to hospital patients, with special importance in nutritional applications, were analyzed for comparison with the manufacturers' data. Individual amino acid samples of phenylalanine, cysteine, methionine, tyrosine and threonine were measured with the neutron crystal spectrometer installed at the J-9 irradiation channel of the 1 kW Argonaut Reactor of the Instituto de Engenharia Nuclear (IEN). High-purity gold and D2O samples were used for calibration of the experimental system. Neutron cross-section values were calculated from analysis of the chemical composition, conformation and molecular structure of the materials. Literature data were manipulated by parceling and grouping neutron cross sections. (author)

  11. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis:version 4.0 developers manual.

    Energy Technology Data Exchange (ETDEWEB)

    Griffin, Joshua D. (Sandia National Laboratory, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L. (Sandia National Laboratory, Livermore, CA); Watson, Jean-Paul; Kolda, Tamara Gibson (Sandia National Laboratory, Livermore, CA); Giunta, Anthony Andrew; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J. (Sandia National Laboratory, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratory, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  12. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, developers manual.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  13. DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 user's manual.

    Energy Technology Data Exchange (ETDEWEB)

    Griffin, Joshua D. (Sandia National Laboratories, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L. (Sandia National Laboratories, Livermore, CA); Watson, Jean-Paul; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA); Giunta, Anthony Andrew; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J. (Sandia National Laboratories, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  14. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, user's manual.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
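
    The sampling-based uncertainty quantification workflow that these manuals describe can be illustrated outside of DAKOTA itself. The following minimal Python sketch is not DAKOTA code; the black-box simulation function and the uniform input bounds are assumptions made purely to illustrate the pattern of driving a simulation with random samples and summarizing the response statistics.

      import numpy as np

      def simulation(x):
          # Stand-in for an external simulation code; in a DAKOTA study this role
          # is played by an analysis driver invoked through the simulation interface.
          return x[0] ** 2 + 3.0 * np.sin(x[1])

      def sampling_uq(n_samples, lower, upper, seed=0):
          """Propagate uniform input uncertainty through the black-box model."""
          rng = np.random.default_rng(seed)
          lower, upper = np.asarray(lower), np.asarray(upper)
          samples = rng.uniform(lower, upper, size=(n_samples, lower.size))
          responses = np.array([simulation(x) for x in samples])
          return responses.mean(), responses.std(ddof=1)

      if __name__ == "__main__":
          mean, std = sampling_uq(1000, lower=[0.0, 0.0], upper=[1.0, np.pi])
          print(f"response mean = {mean:.4f}, std = {std:.4f}")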

  15. Effectiveness of conservative interventions including exercise, manual therapy and medical management in adults with shoulder impingement: a systematic review and meta-analysis of RCTs.

    Science.gov (United States)

    Steuri, Ruedi; Sattelmayer, Martin; Elsig, Simone; Kolly, Chloé; Tal, Amir; Taeymans, Jan; Hilfiker, Roger

    2017-09-01

    To investigate the effectiveness of conservative interventions for pain, function and range of motion in adults with shoulder impingement. Systematic review and meta-analysis of randomised trials. Medline, CENTRAL, CINAHL, Embase and PEDro were searched from inception to January 2017. Randomised controlled trials including participants with shoulder impingement and evaluating at least one conservative intervention against sham or other treatments were included. For pain, exercise was superior to non-exercise control interventions (standardised mean difference (SMD) -0.94, 95% CI -1.69 to -0.19). Specific exercises were superior to generic exercises (SMD -0.65, 95% CI -0.99 to -0.32). Corticosteroid injections were superior to no treatment (SMD -0.65, 95% CI -1.04 to -0.26), and ultrasound-guided injections were superior to non-guided injections (SMD -0.51, 95% CI -0.89 to -0.13). Nonsteroidal anti-inflammatory drugs (NSAIDs) had a small to moderate SMD of -0.29 (95% CI -0.53 to -0.05) compared with placebo. Manual therapy was superior to placebo (SMD -0.35, 95% CI -0.69 to -0.01). When combined with exercise, manual therapy was superior to exercise alone, but only at the shortest follow-up (SMD -0.32, 95% CI -0.62 to -0.01). Laser was superior to sham laser (SMD -0.88, 95% CI -1.48 to -0.27). Extracorporeal shockwave therapy (ECSWT) was superior to sham (-0.39, 95% CI -0.78 to -0.01) and tape was superior to sham (-0.64, 95% CI -1.16 to -0.12), with small to moderate SMDs. Although there was only very low-quality evidence, exercise should be considered for patients with shoulder impingement symptoms, and tape, ECSWT, laser or manual therapy might be added. NSAIDs and corticosteroids are superior to placebo, but it is unclear how these treatments compare to exercise. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
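
    For reference, the standardised mean difference reported throughout this meta-analysis is, in its basic (Cohen's d) form, the difference in group means divided by the pooled standard deviation; whether a small-sample correction such as Hedges' g was applied is not stated in the abstract, so the expression below is the generic definition only:

      \[
      \mathrm{SMD} = \frac{\bar{x}_1 - \bar{x}_2}{s_p}, \qquad
      s_p = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
      \]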

  16. Oil Spill Response Manual

    NARCIS (Netherlands)

    Marieke Zeinstra; Sandra Heins; Wierd Koops

    2014-01-01

    A two-year programme was carried out by the NHL University of Applied Sciences together with private companies in the field of oil and chemical spill response to finalize these manuals on oil and chemical spill response. The manuals give a good overview of all aspects of oil and chemical spill response.

  17. Technical Manual. The ACT®

    Science.gov (United States)

    ACT, Inc., 2014

    2014-01-01

    This manual contains technical information about the ACT® college readiness assessment. The principal purpose of this manual is to document the technical characteristics of the ACT in light of its intended purposes. ACT regularly conducts research as part of the ongoing formative evaluation of its programs. The research is intended to ensure that…

  18. Eco-Innovation Manual

    DEFF Research Database (Denmark)

    O'Hare, Jamie Alexander; McAloone, Tim C.; Pigosso, Daniela Cristina Antelmi

    The aim of this manual is to introduce a methodology for the implementation of eco-innovation within small and medium-sized companies in developing and emerging economies. The intended audience of this manual is organizations that provide professional services to guide and support manufacturing companies in improving their sustainability performance.

  19. Indoor Air Quality Manual.

    Science.gov (United States)

    Baldwin Union Free School District, NY.

    This manual identifies ways to improve a school's indoor air quality (IAQ) and discusses practical actions that can be carried out by school staff in managing air quality. The manual includes discussions of the many sources contributing to school indoor air pollution and the preventive planning for each including renovation and repair work,…

  20. Egyptian Mythological Manuals

    DEFF Research Database (Denmark)

    Jørgensen, Jens Kristoffer Blach

    From the hands of Greek mythographers a great number of myths have survived, along with philosophical discussions of their meaning and relevance for the Greeks. It is little known that something similar existed in ancient Egypt, where temple libraries and archives held scholarly literature used by the native priesthood, much of which has only been published in recent years. As part of this corpus of texts, the ancient Egyptian mythological manuals offer a unique perspective on how the Egyptian priesthood structured and interpreted Egyptian myths. The thesis looks at the different interpretative techniques used in the Tebtunis Mythological Manual (second century CE) and the Mythological Manual of the Delta (sixth century BCE) and the place of these manuals within the larger corpus of priestly scholarly literature from ancient Egypt. To organize the wealth of local myths the manuals use model...

  1. Fuel Element Technical Manual

    Energy Technology Data Exchange (ETDEWEB)

    Burley, H.H. [ed.

    1956-08-01

    It is the purpose of the Fuel Element Technical Manual to provide a single document describing the fabrication processes used in the manufacture of the fuel element as well as the technical bases for these processes. The manual will be instrumental in the indoctrination of personnel new to the field and will provide a single data reference for all personnel involved in the design or manufacture of the fuel element. The material contained in this manual was assembled by members of the Engineering Department and the Manufacturing Department at the Hanford Atomic Products Operation between October 1955 and June 1956. Arrangement of the manual: the manual is divided into six parts: Part I--introduction; Part II--technical bases; Part III--process; Part IV--plant and equipment; Part V--process control and improvement; and Part VI--safety.

  2. The 'Environmental Manual for Power Development': a tool for GHG mitigation and cost analysis in developing countries

    International Nuclear Information System (INIS)

    Fritsche, Uwe R.; Liptow, Holger

    1999-01-01

    The Environmental Manual for Power Development (EM) is a computerised tool for including environmental and cost data in decision-making for energy projects in developing countries. The EM is sponsored by the German BMZ (Ministry for Economic Co-operation and Development), the Dutch DGIS (Directorate General for International Co-operation), the British DfID (Department for International Development), and the World Bank. The EM was developed by GTZ with scientific support from Oeko-Institut (Institute for Applied Ecology). The EM tracks the emissions and costs of, for example, the existing power supply system in a country or region, or of a specific energy project, and compares them to alternative options for delivering the same energy service, e.g. electricity, process heat, or transport services. To do so, the EM maintains a comprehensive database on the environmental and cost impacts of energy technologies and determines environmental impacts over full life-cycles: all impacts from mining, transport, conversion etc. can be accounted for. To handle all life-cycles consistently, the EM database offers a variety of pre-defined fuel- and life-cycles to work with. The EM database covers generic energy technologies in developing countries, especially fossil-fuelled electricity and heating systems, cogeneration, renewable energies, selected energy efficiency technologies and nuclear power systems, as well as data for upstream activities like mining, fuel beneficiation and transport, and emission control technologies like flue-gas desulfurisation, low-NOx burners, etc. The EM analyses and compares airborne and greenhouse gas emissions, solid wastes, and land use, as well as internal and external costs associated with the investment and operation of energy technologies, including their life-cycle (upstream fuel-cycles, materials). The EM helps to check the compliance of energy processes with given emission standards - its database offers such standards for various countries and regions, and users can test if

  3. Measurement and QCD analysis of diffractive jet cross sections in deep inelastic scattering at HERA

    Energy Technology Data Exchange (ETDEWEB)

    Mozer, M.U.

    2006-07-24

    Differential cross sections for the production of two jets in diffractive deep inelastic scattering (DIS) at HERA are presented. The process studied is of the type ep → eXY, where the central hadronic system X contains at least two jets and is separated from the system Y by a gap in rapidity. The forward system Y consists of an elastically scattered proton or a low-mass dissociation system. The data were taken with the H1 detector during the years 1999 and 2000 and correspond to an integrated luminosity of 51.5 pb⁻¹. The measured cross sections are compared to fixed-order NLO QCD predictions that use diffractive parton densities which have previously been determined by a NLO QCD analysis of inclusive diffractive DIS at H1. The prediction and the data show significant differences. However, the dijet cross section is dominated by the diffractive gluon density, which can be extracted by the above-mentioned analysis only with considerable uncertainty. Hence a combined QCD analysis of the previously published inclusive diffractive data and the dijet data is performed. This combined fit analysis allows the determination of diffractive quark and gluon densities with comparable precision. The common description of the inclusive diffractive data and the dijet data confirms QCD factorization. (orig.)

  4. Amino acids analysis by total neutron cross-sections determinations: part V

    International Nuclear Information System (INIS)

    Voi, Dante L.; Ferreira, Francisco de O.; Rocha, Helio F. da

    2013-01-01

    Total neutron cross-sections of twenty amino acids, essential and non-essential to humans, were determined using the crystal spectrometer installed on the Argonauta reactor of the IEN (Instituto de Engenharia Nuclear, CNEN-RJ) and compared with data generated by the parceling and grouping methodologies developed at this institution. For each amino acid the respective neutron cross-section was calculated from analysis of its molecular structure, conformation and chemistry. The results obtained for eighteen of the twenty amino acids confirm the specifications and product formulations indicated by the manufacturers. These initial results make it possible to build a neutron cross-section database as part of the quality control of the amino acids supplied to hospitals for the production of nutriments for parenteral or enteral formulations used in critical patients dependent on artificial feeding, and for application in future studies of the structure and dynamics of more complex molecules, including proteins, enzymes, fatty acids, membranes, organelles and other cell components. (author)
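
    The cross-section-by-composition calculation mentioned above amounts, to a first approximation that neglects molecular binding and interference effects, to summing the constituent atomic cross sections weighted by their stoichiometric abundance:

      \[
      \sigma_{\mathrm{mol}}(E) \approx \sum_i n_i\,\sigma_i(E)
      \]

    where n_i is the number of atoms of element i per molecule and sigma_i(E) is its total neutron cross section at energy E.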

  5. Observed Differences between Males and Females in Surgically Treated Carpal Tunnel Syndrome Among Non-manual Workers: A Sensitivity Analysis of Findings from a Large Population Study

    Science.gov (United States)

    Farioli, Andrea; Curti, Stefania; Bonfiglioli, Roberta; Baldasseroni, Alberto; Spatari, Giovanna; Mattioli, Stefano; Violante, Francesco Saverio

    2018-01-01

    Objectives: We aimed at assessing whether differences between males and females in carpal tunnel syndrome (CTS) epidemiology might be attributable to segregation with respect to occupational biomechanical exposures or to differential access to care by sex. Methods: We analysed surgically treated cases of CTS occurring among non-manual workers in Tuscany between 1997 and 2000. We conducted a Monte Carlo simulation to estimate the difference in occupational biomechanical exposures between males and females necessary to explain the observed incidence rate ratios. We also accounted for the sex-specific probability of receiving surgery after the diagnosis of CTS, as women were reported to be more likely to undergo surgery in a subset of our study population. We quantified the hypothetical biomechanical overload through the hand activity level (HAL) metric proposed by the American Conference of Governmental Industrial Hygienists. To quantify the effect of HAL on CTS risk, we assumed a prior distribution based on findings from two large cohort studies of industrial workers. Results: After adjustment for the probability of receiving surgery, women showed a 4-fold incidence of CTS compared with men. To explain this association among non-manual workers, women would have to have an average HAL value at least 5 points higher. Conclusions: Our analysis does not support the hypothesis that the difference in CTS incidence between males and females is entirely attributable to occupational risk factors or to differential access to surgery. The causal pathway between sex and CTS might include further determinants such as hormonal factors, anthropometric characteristics, and non-occupational exposure to biomechanical overload (e.g. household tasks). PMID:29579135

  6. Food/Hunger Macro-Analysis Seminar. A Do-It-Yourself Manual for College Courses and Action Groups.

    Science.gov (United States)

    Moyer, William; Thorne, Erika

    This guide describes a fifteen-week macro-analysis seminar about food production, distribution, and consumption on international, national, and local levels. The macro-analysis approach emphasizes the interrelatedness of all parts of the food/hunger issue; therefore the seminar also addresses escalating military expenditures, widening poverty, and…

  7. Two-dimensional cross-section and SED uncertainty analysis for the Fusion Engineering Device (FED)

    International Nuclear Information System (INIS)

    Embrechts, M.J.; Urban, W.T.; Dudziak, D.J.

    1982-01-01

    The theory of two-dimensional cross-section and secondary-energy-distribution (SED) sensitivity was implemented by developing a two-dimensional sensitivity and uncertainty analysis code, SENSIT-2D. Analyses of the Fusion Engineering Device (FED) conceptual inboard shield indicate that, although the calculated uncertainties in the 2-D model are of the same order of magnitude as those resulting from the 1-D model, there can be severe differences. The more complex the geometry, the more compulsory a 2-D analysis becomes. Specific results show that the uncertainty for the integral heating of the toroidal field (TF) coil for the FED is 114.6%. The main contributors to the cross-section uncertainty are chromium and iron. Contributions to the total uncertainty were smaller for nickel, copper, hydrogen and carbon. All analyses were performed with the Los Alamos 42-group cross-section library generated from ENDF/B-V data and the COVFILS covariance matrix library. The large uncertainties due to chromium result mainly from large covariances for the chromium total and elastic scattering cross sections.
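
    The uncertainty quoted for an integral response R is conventionally obtained from the first-order "sandwich" rule, which combines the relative sensitivity profile S with the cross-section covariance matrix C (generic notation, not specific to SENSIT-2D):

      \[
      \left(\frac{\Delta R}{R}\right)^{2} = S^{\mathsf{T}}\, C\, S
      \]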

  8. A macroscopic cross-section model for BWR pin-by-pin core analysis

    International Nuclear Information System (INIS)

    Fujita, Tatsuya; Endo, Tomohiro; Yamamoto, Akio

    2014-01-01

    A macroscopic cross-section model used in boiling water reactor (BWR) pin-by-pin core analysis is studied. In the pin-by-pin core calculation method, pin-cell averaged cross sections are calculated for many combinations of core state and depletion history variables and are tabulated prior to the core calculation. Variations of cross sections in a core simulator are caused by two different phenomena (i.e. instantaneous and history effects), which are treated through the core state variables and the exposure-averaged core state variables, respectively. Furthermore, the cross-term effect among the core state and depletion history variables is considered. In order to confirm the calculation accuracy and discuss the treatment of the cross-term effect, the k-infinity and pin-by-pin fission rate distributions in a single fuel assembly geometry are compared. Some cross-term effects can be neglected since their impacts are sufficiently small. However, the cross-term effects between the control rod history (or the void history) and other variables have large impacts; thus, taking them into account is crucial. The present macroscopic cross-section model, which considers such dominant cross-term effects, reproduces the reference results well and is a candidate for practical application to BWR pin-by-pin core analysis of normal operations. (author)
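
    A pin-by-pin core simulator of the kind described must evaluate tabulated cross sections at arbitrary combinations of instantaneous and history variables. The sketch below uses hypothetical variable names and table values and does not reproduce the paper's parametrization or its cross-term correction terms; it only shows a straightforward multilinear interpolation over two state variables with SciPy.

      import numpy as np
      from scipy.interpolate import RegularGridInterpolator

      # Hypothetical table: macroscopic absorption cross section tabulated
      # against instantaneous void fraction and square root of fuel temperature.
      void_fraction = np.array([0.0, 0.4, 0.8])
      sqrt_fuel_temp = np.array([20.0, 25.0, 30.0])           # sqrt(K)
      sigma_a_table = np.array([[0.0250, 0.0253, 0.0256],
                                [0.0231, 0.0234, 0.0237],
                                [0.0210, 0.0213, 0.0216]])    # 1/cm

      sigma_a = RegularGridInterpolator((void_fraction, sqrt_fuel_temp),
                                        sigma_a_table, method="linear")

      # Evaluate at an arbitrary core state encountered during the calculation.
      print(sigma_a([[0.55, 27.0]]))   # interpolated Sigma_a in 1/cm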

  9. Micro/nano analysis of tooth microstructures by Focused Ion Beam (FIB) cross-sectioning

    Directory of Open Access Journals (Sweden)

    Meltem Sezen

    2017-04-01

    Since dental structures are hard and fragile, cross-sectioning these materials by ultramicrotomy and other techniques, and the subsequent micro- and nano-analysis, is problematic. The use of FIB-SEM dual-beam platforms is the most convenient solution for investigating the microstructures site-specifically and in well-defined geometries. Dual-beam platforms allow imaging at high magnification and resolution together with simultaneous elemental analysis. In this study, the micro/nano-structural and chemical differences between dentin and enamel samples were revealed. The investigation of dental tissues having different morphologies and chemical components by ion cross-sectioning is important for the adoption of FIB-SEM platforms in dentistry in Turkey.

  10. Extract from IAEA's Resources Manual in Nuclear Medicine - Part 2. - Human Resources Development

    International Nuclear Information System (INIS)

    2003-01-01

    The Nuclear Medicine Section of the International Atomic Energy Agency is now engaged in finalizing a reference manual in nuclear medicine, entitled 'Resources Manual in Nuclear Medicine'. Several renowned professionals from all over the world, from virtually all fields of nuclear medicine, have contributed to this manual. The World Journal of Nuclear Medicine will publish a series of extracts from this manual as previews. This is the second extract from the Resources Manual, Part 2 of the chapter on Human Resources Development. (author)

  11. Uranium tailings sampling manual

    International Nuclear Information System (INIS)

    Feenstra, S.; Reades, D.W.; Cherry, J.A.; Chambers, D.B.; Case, G.G.; Ibbotson, B.G.

    1985-01-01

    The purpose of this manual is to describe the requisite sampling procedures for the application of uniform high-quality standards to detailed geotechnical, hydrogeological, geochemical and air quality measurements at Canadian uranium tailings disposal sites. The selection and implementation of applicable sampling procedures for such measurements at uranium tailings disposal sites are complicated by two primary factors. Firstly, the physical and chemical nature of uranium mine tailings and effluent is considerably different from that of natural soil materials and natural waters. Consequently, many conventional methods for the collection and analysis of natural soils and waters are not directly applicable to tailings. Secondly, there is a wide range in the physical and chemical nature of uranium tailings. The composition of the ore, the milling process, the nature of tailings deposition, and the effluent treatment vary considerably and are highly site-specific. Therefore, the definition and implementation of sampling programs for uranium tailings disposal sites require considerable evaluation, and often innovation, to ensure that appropriate sampling and analysis methods are used which provide the flexibility to take site-specific considerations into account. The following chapters describe the objectives and scope of a sampling program, preliminary data collection, and the procedures for sampling of tailings solids, surface water and seepage, tailings pore-water, and wind-blown dust and radon.

  12. RootAnalyzer: A Cross-Section Image Analysis Tool for Automated Characterization of Root Cells and Tissues.

    Directory of Open Access Journals (Sweden)

    Joshua Chopin

    The morphology of plant root anatomical features is a key factor in effective water and nutrient uptake. Existing techniques for phenotyping root anatomical traits are often based on manual or semi-automatic segmentation and annotation of microscopic images of root cross sections. In this article, we propose a fully automated tool, hereinafter referred to as RootAnalyzer, for efficiently extracting and analyzing anatomical traits from root cross-section images. Using a range of image processing techniques such as local thresholding and nearest neighbor identification, RootAnalyzer segments the plant root from the image's background, classifies and characterizes the cortex, stele, endodermis and epidermis, and subsequently produces statistics about the morphological properties of the root cells and tissues. We use RootAnalyzer to analyze 15 images of wheat plants and one maize plant image and evaluate its performance against manually obtained ground truth data. The comparison shows that RootAnalyzer can fully characterize most root tissue regions with over 90% accuracy.
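
    The segmentation step described above, separating root tissue from the background by local (adaptive) thresholding, can be approximated with scikit-image as in the sketch below. This is not the RootAnalyzer code itself, and the block size, offset and minimum object size are assumed values.

      from skimage import io, color, filters, morphology

      def segment_root(path, block_size=101, offset=0.0):
          """Rough foreground mask for a root cross-section image."""
          image = io.imread(path)
          gray = color.rgb2gray(image) if image.ndim == 3 else image
          # A local (adaptive) threshold copes with uneven illumination.
          local_thresh = filters.threshold_local(gray, block_size, offset=offset)
          mask = gray < local_thresh            # darker-stained tissue as foreground
          return morphology.remove_small_objects(mask, min_size=64)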

  13. Cross-section sensitivity and uncertainty analysis of the FNG copper benchmark experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kodeli, I., E-mail: ivan.kodeli@ijs.si [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Kondo, K. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany); Japan Atomic Energy Agency, Rokkasho-mura (Japan); Perel, R.L. [Racah Institute of Physics, Hebrew University of Jerusalem, IL-91904 Jerusalem (Israel); Fischer, U. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany)

    2016-11-01

    A neutronics benchmark experiment on a copper assembly was performed from the end of 2014 to the beginning of 2015 at the 14-MeV Frascati neutron generator (FNG) of ENEA Frascati, with the objective of providing the experimental database required for the validation of the copper nuclear data relevant for ITER design calculations, including the related uncertainties. The paper presents the pre- and post-analysis of the experiment performed using cross-section sensitivity and uncertainty codes, both deterministic (SUSD3D) and Monte Carlo (MCSEN5). Cumulative reaction rates and neutron flux spectra, their sensitivity to the cross sections, as well as the corresponding uncertainties were estimated for different selected detector positions up to ∼58 cm into the copper assembly. In the pre-analysis phase this permitted optimization of the geometry, the detector positions and the choice of activation reactions; in the post-analysis phase it allowed interpretation of the results of the measurements and the calculations, conclusions on the quality of the relevant nuclear cross-section data, and estimation of the uncertainties in the calculated nuclear responses and fluxes. Large uncertainties in the calculated reaction rates and neutron spectra of up to 50%, rarely observed at this level in benchmark analyses using today's nuclear data, were predicted, particularly for fast reactions. Observed C/E (dis)agreements with values as low as 0.5 partly confirm these predictions. The benchmark results are therefore expected to contribute to the improvement of both cross-section and covariance data evaluations.

  14. Gridded Surface Subsurface Hydrologic Analysis (GSSHA) User's Manual; Version 1.43 for Watershed Modeling System 6.1

    National Research Council Canada - National Science Library

    Downer, Charles W; Ogden, Fred L

    2006-01-01

    The need to simulate surface water flows in watersheds with diverse runoff production mechanisms has led to the development of the physically-based hydrologic model Gridded Surface Subsurface Hydrologic Analysis (GSSHA...

  15. Advanced composites structural concepts and materials technologies for primary aircraft structures. Structural response and failure analysis: ISPAN modules users manual

    Science.gov (United States)

    Hairr, John W.; Huang, Jui-Ten; Ingram, J. Edward; Shah, Bharat M.

    1992-01-01

    The ISPAN Program (Interactive Stiffened Panel Analysis) is an interactive design tool that is intended to provide a means of performing simple and self-contained preliminary analysis of aircraft primary structures made of composite materials. The program combines a series of modules with the finite element code DIAL as its backbone. Four ISPAN modules were developed and are documented. These include: (1) flat stiffened panel; (2) curved stiffened panel; (3) flat tubular panel; and (4) curved geodesic panel. Users are instructed to input geometric and material properties, load information and types of analysis (linear, bifurcation buckling, or post-buckling) interactively. Using this information, the program will generate a finite element mesh and perform the analysis. The output, in the form of summary tables of stress or margins of safety, contour plots of loads or stress, and deflected shape plots, may be generated and used to evaluate a specific design.

  16. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    Directory of Open Access Journals (Sweden)

    Delphine Ribes

    In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.

  17. Finite Element Analysis and Test Results Comparison for the Hybrid Wing Body Center Section Test Article

    Science.gov (United States)

    Przekop, Adam; Jegley, Dawn C.; Rouse, Marshall; Lovejoy, Andrew E.

    2016-01-01

    This report documents the comparison of test measurements and predictive finite element analysis results for a hybrid wing body center section test article. The testing and analysis efforts were part of the Airframe Technology subproject within the NASA Environmentally Responsible Aviation project. Test results include full-field displacement measurements obtained from digital image correlation systems and discrete strain measurements obtained using both unidirectional and rosette resistive gauges. The most significant results are presented for the five critical load cases exercised during the test. The final test to failure, after inflicting severe damage on the test article, is also documented. Overall, good agreement between the predicted and actual behavior of the test article is found.

  18. SIMON. A computer program for reliability and statistical analysis using Monte Carlo simulation. Program description and manual

    International Nuclear Information System (INIS)

    Kongsoe, H.E.; Lauridsen, K.

    1993-09-01

    SIMON is a program for reliability calculations and statistical analysis. The program is of the Monte Carlo type; it is designed with high flexibility and has a large potential for application to complex problems such as reliability analyses of very large systems and of systems where complex modelling or knowledge of special details is required. Examples of application of the program to reliability and statistical analysis, including input and output, are presented. (au) (3 tabs., 3 ills., 5 refs.)
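
    As a flavour of the Monte Carlo approach to reliability that SIMON implements, the sketch below estimates the failure probability of a hypothetical two-out-of-three system over a mission time; the failure rates and the system logic are invented for illustration and are not taken from the report.

      import numpy as np

      def two_out_of_three_unreliability(failure_rates, mission_time,
                                         n_trials=100_000, seed=1):
          """Monte Carlo estimate of the failure probability of a 2-out-of-3 system."""
          rng = np.random.default_rng(seed)
          rates = np.asarray(failure_rates)
          # Exponential times to failure for each component in each trial.
          ttf = rng.exponential(1.0 / rates, size=(n_trials, rates.size))
          failed = ttf < mission_time
          system_failed = failed.sum(axis=1) >= 2      # 2-out-of-3 failure logic
          return system_failed.mean()

      print(two_out_of_three_unreliability([1e-3, 1e-3, 2e-3], mission_time=100.0))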

  19. ABAQUS-EPGEN: a general-purpose finite element code. Volume 3. Example problems manual

    International Nuclear Information System (INIS)

    Hibbitt, H.D.; Karlsson, B.I.; Sorensen, E.P.

    1983-03-01

    This volume is the Example and Verification Problems Manual for ABAQUS/EPGEN. Companion volumes are the User's, Theory and Systems Manuals. This volume contains two major parts. The bulk of the manual (Sections 1-8) contains worked examples that are discussed in detail, while Appendix A documents a large set of basic verification cases that provide the fundamental check of the elements in the code. The examples in Sections 1-8 illustrate and verify significant aspects of the program's capability. Most of these problems provide verification, but they have also been chosen to allow discussion of modeling and analysis techniques. Appendix A contains basic verification cases. Each of these cases verifies one element in the program's library. The verification consists of applying all possible load or flux types (including thermal loading of stress elements), and all possible foundation or film/radiation conditions, and checking the resulting force and stress solutions or flux and temperature results. This manual provides program verification. All of the problems described in the manual are run and the results checked, for each release of the program, and these verification results are made available

  20. A statistical manual for chemists

    CERN Document Server

    Bauer, Edward

    1971-01-01

    A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect

  1. Salinas : theory manual.

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, Timothy Francis; Reese, Garth M.; Bhardwaj, Manoj Kumar

    2004-08-01

    This manual describes the theory behind many of the constructs in Salinas. For a more detailed description of how to use Salinas, we refer the reader to the Salinas User's Notes. Many of the constructs in Salinas are pulled directly from published material. Where possible, these materials are referenced herein. However, certain functions in Salinas are specific to our implementation. We try to be far more complete in those areas. The theory manual was developed from several sources including general notes, a programmer's notes manual, the user's notes and, of course, the material in the open literature.

  2. Fire Protection Program Manual

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, J A

    2012-05-18

    This manual documents the Lawrence Livermore National Laboratory (LLNL) Fire Protection Program. Department of Energy (DOE) Order 420.1B, Facility Safety, requires LLNL to have a comprehensive and effective fire protection program that protects LLNL personnel and property, the public and the environment. The manual provides LLNL and its facilities with general information and guidance for meeting DOE 420.1B requirements. The recommended readers for this manual are: fire protection officers, fire protection engineers, fire fighters, facility managers, directorate assurance managers, facility coordinators, and ES&H team members.

  3. Image analysis as an adjunct to manual HER-2 immunohistochemical review: a diagnostic tool to standardize interpretation.

    LENUS (Irish Health Repository)

    Dobson, Lynne

    2010-07-01

    AIMS: Accurate determination of HER-2 status is critical to identify patients for whom trastuzumab treatment will be of benefit. Although the recommended primary method of evaluation is immunohistochemistry, numerous reports of variability in interpretation have raised uncertainty about the reliability of results. Recent guidelines have suggested that image analysis could be an effective tool for achieving consistent interpretation, and this study aimed to assess whether this technology has potential as a diagnostic support tool. METHODS AND RESULTS: Across a cohort of 275 cases, image analysis could accurately classify HER-2 status, with 91% agreement between computer-aided classification and the pathology review. Assessment of the continuity of membranous immunoreactivity in addition to intensity of reactivity was critical to distinguish between negative and equivocal cases and enabled image analysis to report a lower referral rate of cases for confirmatory fluorescence in situ hybridization (FISH) testing. An excellent concordance rate of 95% was observed between FISH and the automated review across 136 informative cases. CONCLUSIONS: This study has validated that image analysis can robustly and accurately evaluate HER-2 status in immunohistochemically stained tissue. Based on these findings, image analysis has great potential as a diagnostic support tool for pathologists and biomedical scientists, and may significantly improve the standardization of HER-2 testing by providing a quantitative reference method for interpretation.

  4. GENOVA: a generalized perturbation theory program for various applications to CANDU core physics analysis (II) - a user's manual

    International Nuclear Information System (INIS)

    Kim, Do Heon; Choi, Hang Bok

    2001-03-01

    A user's guide for GENOVA, a GENeralized perturbation theory (GPT)-based Optimization and uncertainty analysis program for Canada deuterium uranium (CANDU) physics VAriables, was prepared. The program was developed under the framework of the CANDU physics design and analysis code RFSP. The generalized perturbation method was implemented in GENOVA to estimate the zone controller unit (ZCU) level upon refueling operation and to calculate various sensitivity coefficients for fuel management studies and uncertainty analyses, respectively. This documentation contains descriptions of, and usage directions for, the four major modules of GENOVA (ADJOINT, GADJINT, PERTURB, and PERTXS) so that it can serve as a practical guide for GENOVA users. It includes sample inputs for the ZCU level estimation and the sensitivity coefficient calculation, which are the main applications of GENOVA. GENOVA can be used as a supplementary tool to the current CANDU physics design code for advanced CANDU core analysis and fuel development.

  5. Optimised design and thermal-hydraulic analysis of the IFMIF/HFTM test section

    Energy Technology Data Exchange (ETDEWEB)

    Gordeev, S.; Heinzel, V.; Lang, K.H.; Moeslang, A.; Schleisiek, K.; Slobodtchouk, V.; Stratmanns, E.

    2003-10-01

    On the basis of previous concepts, analyses and experiments, the high flux test module (HFTM) for the International Fusion Materials Irradiation Facility (IFMIF) was further optimised. The work focused on the design and the thermal-hydraulic analysis of the HFTM section containing the material specimens to be irradiated, the "test section", with the main objective of improving the concept with respect to the optimum use of the available irradiation volume and to the temperature of the specimens. Particular emphasis was laid on the application of design principles which assure stable and reproducible thermal conditions. The present work has confirmed the feasibility and suitability of the optimised design of the HFTM test section with rigs shaped like chocolate plates. In particular it has been shown that the envisaged irradiation temperatures can be reached with acceptable temperature differences inside the specimen stack. The latter can be achieved only by additional electrical heating of the axial ends of the capsules. Division of the heater into three sections with separate power supply and control units is necessary. Maintaining the temperatures during beam-off periods likewise requires electrical heating. The required electrical heaters - mineral-insulated wires - are commercially available. The potential of the CFD code STAR-CD for the thermal-hydraulic analysis of complex systems like the HFTM was confirmed. Nevertheless, experimental confirmation is desirable. Suitable experiments are under preparation. To verify the assumptions made on the thermal conductivity of the contact faces and layers between the two shells of the rig, dedicated experiments are suggested. The present work must be complemented by a thermal-mechanical analysis of the module. The most critical component in this respect seems to be the rig wall. Furthermore, it will be necessary to investigate the response of the HFTM to power transients, and to determine the requirements

  6. Optimised design and thermal-hydraulic analysis of the IFMIF/HFTM test section

    International Nuclear Information System (INIS)

    Gordeev, S.; Heinzel, V.; Lang, K.H.; Moeslang, A.; Schleisiek, K.; Slobodtchouk, V.; Stratmanns, E.

    2003-10-01

    On the basis of previous concepts, analyses and experiments, the high flux test module (HFTM) for the International Fusion Materials Irradiation Facility (IFMIF) was further optimised. The work focused on the design and the thermal-hydraulic analysis of the HFTM section containing the material specimens to be irradiated, the "test section", with the main objective of improving the concept with respect to the optimum use of the available irradiation volume and to the temperature of the specimens. Particular emphasis was laid on the application of design principles which assure stable and reproducible thermal conditions. The present work has confirmed the feasibility and suitability of the optimised design of the HFTM test section with rigs shaped like chocolate plates. In particular it has been shown that the envisaged irradiation temperatures can be reached with acceptable temperature differences inside the specimen stack. The latter can be achieved only by additional electrical heating of the axial ends of the capsules. Division of the heater into three sections with separate power supply and control units is necessary. Maintaining the temperatures during beam-off periods likewise requires electrical heating. The required electrical heaters - mineral-insulated wires - are commercially available. The potential of the CFD code STAR-CD for the thermal-hydraulic analysis of complex systems like the HFTM was confirmed. Nevertheless, experimental confirmation is desirable. Suitable experiments are under preparation. To verify the assumptions made on the thermal conductivity of the contact faces and layers between the two shells of the rig, dedicated experiments are suggested. The present work must be complemented by a thermal-mechanical analysis of the module. The most critical component in this respect seems to be the rig wall. Furthermore, it will be necessary to investigate the response of the HFTM to power transients, and to determine the requirements on the electrical

  7. Manual of Documentation Practices Applicable to Defence-Aerospace Scientific and Technical Information. Volume 1. Section 1 - Acquisition and Sources. Section 2 - Descriptive Cataloguing. Section 3 - Abstracting and Subject Analysis

    Science.gov (United States)

    1978-08-01

    (Fragmentary excerpts only) ... be added for new subcategories. The Dewey Decimal Classification, the Library of Congress Classification of the United States, and the Universal ... to be published. A translation duplicate is a translation of a report or an article into another language. DDC (DOD/USA) Defense Documentation Center ... controls need not be elaborate. The system described below has proved adequate over a number of years of operation at the Defense Documentation Center (DDC ...

  8. Automatic registration of multi-modal microscopy images for integrative analysis of prostate tissue sections

    International Nuclear Information System (INIS)

    Lippolis, Giuseppe; Edsjö, Anders; Helczynski, Leszek; Bjartell, Anders; Overgaard, Niels Chr

    2013-01-01

    Prostate cancer is one of the leading causes of cancer-related deaths. For diagnosis, predicting the outcome of the disease, and for assessing potential new biomarkers, pathologists and researchers routinely analyze histological samples. Morphological and molecular information may be integrated by aligning microscopic histological images in a multiplex fashion. This process is usually time-consuming and results in intra- and inter-user variability. The aim of this study is to investigate the feasibility of using modern image analysis methods for automated alignment of microscopic images from differently stained adjacent paraffin sections of prostatic tissue specimens. Tissue samples, obtained from biopsy or radical prostatectomy, were sectioned and stained with either hematoxylin & eosin (H&E), immunohistochemistry for p63 and AMACR, or Time Resolved Fluorescence (TRF) for the androgen receptor (AR). Image pairs were aligned allowing for translation, rotation and scaling. The registration was performed automatically by first detecting landmarks in both images using the scale-invariant feature transform (SIFT), followed by the well-known RANSAC protocol for finding point correspondences, and finally alignment by Procrustes fit. The registration results were evaluated using both visual and quantitative criteria as defined in the text. Three experiments were carried out. First, images of consecutive tissue sections stained with H&E and p63/AMACR were successfully aligned in 85 of 88 cases (96.6%). The failures occurred in 3 out of 13 cores with highly aggressive cancer (Gleason score ≥ 8). Second, TRF and H&E image pairs were aligned correctly in 103 out of 106 cases (97%). The third experiment considered the alignment of image pairs with the same staining (H&E) coming from a stack of 4 sections. The success rate for alignment dropped from 93.8% in adjacent sections to 22% for the sections furthest away. The proposed method is both reliable and fast and therefore well suited
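
    A minimal OpenCV version of the landmark-based alignment pipeline described (SIFT keypoints, ratio-test matching, RANSAC-filtered fit of a similarity transform, i.e. translation, rotation and scaling) might look like the following sketch; the ratio threshold and other parameters are assumptions, and the study's own implementation may differ in detail.

      import cv2
      import numpy as np

      def align_similarity(fixed_gray, moving_gray, ratio=0.75):
          """Estimate a similarity transform mapping moving_gray onto fixed_gray."""
          sift = cv2.SIFT_create()
          kp1, des1 = sift.detectAndCompute(fixed_gray, None)    # 8-bit grayscale images
          kp2, des2 = sift.detectAndCompute(moving_gray, None)

          matcher = cv2.BFMatcher(cv2.NORM_L2)
          matches = matcher.knnMatch(des2, des1, k=2)
          good = [m for m, n in matches if m.distance < ratio * n.distance]

          src = np.float32([kp2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
          dst = np.float32([kp1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

          # RANSAC-filtered rotation + uniform scaling + translation (4 dof).
          matrix, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
          h, w = fixed_gray.shape
          return cv2.warpAffine(moving_gray, matrix, (w, h))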

  9. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE) version 5.0, technical reference manual

    International Nuclear Information System (INIS)

    Russell, K.D.; Atwood, C.L.; Galyean, W.J.; Sattison, M.B.; Rasmuson, D.M.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. This volume provides information on the principles used in the construction and operation of Version 5.0 of the Integrated Reliability and Risk Analysis System (IRRAS) and the System Analysis and Risk Assessment (SARA) system. It summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms that these programs use to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that are appropriate under various assumptions concerning repairability and mission time. It defines the measures of basic event importance that these programs can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by these programs to generate random basic event probabilities from various distributions. Further references are given, and a detailed example of the reduction and quantification of a simple fault tree is provided in an appendix
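
    The quantification step mentioned above, obtaining the top-event probability from the minimal cut sets, is commonly approximated with the rare-event sum or the min-cut upper bound. The sketch below illustrates both on a made-up fault tree; it is not SAPHIRE code and the event probabilities are arbitrary.

      from math import prod

      # Hypothetical basic-event probabilities and minimal cut sets.
      basic_events = {"A": 1.0e-3, "B": 2.0e-3, "C": 5.0e-4, "D": 1.0e-2}
      minimal_cut_sets = [{"A", "B"}, {"C"}, {"B", "D"}]

      cut_set_probs = [prod(basic_events[e] for e in cs) for cs in minimal_cut_sets]

      rare_event = sum(cut_set_probs)                        # first-order approximation
      min_cut_upper_bound = 1.0 - prod(1.0 - p for p in cut_set_probs)

      print(f"rare-event approximation = {rare_event:.3e}")
      print(f"min-cut upper bound      = {min_cut_upper_bound:.3e}")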

  10. Boltzmann equation analysis of electron-molecule collision cross sections in water vapor and ammonia

    International Nuclear Information System (INIS)

    Yousfi, M.; Benabdessadok, M.D.

    1996-01-01

    Sets of electron-molecule collision cross sections for H2O and NH3 have been determined from a classical technique of electron swarm parameter unfolding. This deconvolution method is based on a simplex algorithm using a powerful multiterm Boltzmann equation analysis established in the framework of the classical hydrodynamic approximation. It is well adapted for the simulation of the different classes of swarm experiments (i.e., time-resolved, time-of-flight, and steady-state experiments). The sets of collision cross sections that exist in the literature are reviewed and analyzed. Fitted sets of cross sections are determined for H2O and NH3 which exhibit features characteristic of polar molecules such as high rotational excitation collision cross sections. The hydrodynamic swarm parameters (i.e., drift velocity, longitudinal and transverse diffusion coefficients, ionization and attachment coefficients) calculated from the fitted sets are in excellent agreement with the measured ones. These sets are finally used to calculate the transport and reaction coefficients needed for discharge modeling in two cases of typical gas mixtures for which experimental swarm data are very sparse or nonexistent (i.e., flue gas mixtures and gas mixtures for rf plasma surface treatment). copyright 1996 American Institute of Physics
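
    The unfolding described above can be caricatured as an outer simplex search wrapped around a Boltzmann solver: cross-section scale factors are adjusted until the calculated swarm parameters reproduce the measured ones. In the sketch below the Boltzmann solver is replaced by a purely illustrative toy model and the "measurements" are synthetic, so only the structure of the fitting loop is meaningful.

      import numpy as np
      from scipy.optimize import minimize

      def toy_swarm_model(scales, e_over_n):
          """Illustrative stand-in for a multiterm Boltzmann solver: maps
          cross-section scale factors to drift- and ionization-like observables."""
          momentum, inelastic = scales
          drift = e_over_n / np.sqrt(momentum)
          ionization = np.exp(-10.0 * inelastic / e_over_n)
          return np.concatenate([drift, ionization])

      e_over_n = np.array([10.0, 30.0, 100.0])                      # arbitrary units
      measured = toy_swarm_model(np.array([1.3, 0.8]), e_over_n)    # synthetic "data"

      def misfit(scales):
          calc = toy_swarm_model(scales, e_over_n)
          return np.sum(((calc - measured) / measured) ** 2)

      # Simplex (Nelder-Mead) search for the scale factors that reproduce the data,
      # mirroring the deconvolution loop described in the abstract.
      result = minimize(misfit, x0=np.array([1.0, 1.0]), method="Nelder-Mead")
      print(result.x)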

  11. Unified Formulation Applied to Free Vibrations Finite Element Analysis of Beams with Arbitrary Section

    Directory of Open Access Journals (Sweden)

    E. Carrera

    2011-01-01

    This paper presents hierarchical finite elements on the basis of the Carrera Unified Formulation for the free vibration analysis of beams with arbitrary section geometries. The displacement components are expanded in terms of the section coordinates (x, y) using a set of 1-D generalized displacement variables. N-order Taylor-type expansions are employed; N is a free parameter of the formulation and is taken as high as 4. Linear (2-node), quadratic (3-node) and cubic (4-node) approximations along the beam axis (z) are introduced to develop the finite element matrices. These are obtained in terms of a few fundamental nuclei whose form is independent of both N and the number of element nodes. Natural frequencies and vibration modes are computed. Convergence and assessment against available results is first made considering different types of beam elements and expansion orders. Additional analyses consider different beam sections (square, annular and airfoil-shaped) as well as boundary conditions (simply supported and cantilever beams). It is concluded that the proposed model is capable of detecting 3-D effects on the vibration modes as well as predicting shell-type vibration modes in the case of thin-walled beam sections.
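
    In the Carrera Unified Formulation the displacement components are written as an N-order expansion over the cross-section coordinates; in generic CUF notation (not taken verbatim from the paper), with an implied sum over tau,

      \[
      u_i(x, y, z) = F_\tau(x, y)\, u_{i\tau}(z), \qquad
      F_1 = 1,\;\; F_2 = x,\;\; F_3 = y \quad (N = 1)
      \]

    so that for N = 1 each displacement component is described by three generalized variables along the beam axis, and higher N adds the corresponding higher-order Taylor monomials.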

  12. Sectional analysis for volume determination and selection of volume equations for the Tapajós National Forest

    Directory of Open Access Journals (Sweden)

    Renato Bezerra da Silva Ribeiro

    2014-12-01

    The aim of this study was to analyze different section lengths for volume determination and to fit volumetric models for estimating timber production in a forest management area of the Tapajós National Forest (FNT). Six sectioning treatments were tested on 152 logs of 12 commercial species. The obtained volumes were statistically compared by analysis of variance (ANOVA) for the choice of the best method of sectioning and the calculation of the actual volume of 2,094 sample trees in different commercial diameter classes. Ten mathematical models were fitted to the whole data set and to the species Manilkara huberi (Ducke) Chevalier (maçaranduba), Lecythis lurida (Miers) Samori (jarana) and Hymenaea courbaril L. (jatobá). The criteria for choosing the best model were the adjusted coefficient of determination in percentage (R2adj%), the standard error of the estimate in percentage (Syx%), the significance of the parameters, the normality of the residuals, the variance inflation factor (VIF) and the graphic distribution of the residuals. There was no statistical difference between the methods of sectioning, and thus using the total length of the logs was more practical in the field. The models in logarithmic form of Schumacher and Hall and of Spurr were the best for estimating the volume for the individual species and for the whole sample set.
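
    The two selected model families are standard forest-mensuration forms; in their usual logarithmic notation (V = volume, D = diameter, H = height, beta_i = fitted coefficients) they are commonly written as

      \[
      \ln V = \beta_0 + \beta_1 \ln D + \beta_2 \ln H \quad \text{(Schumacher–Hall)}
      \]
      \[
      \ln V = \beta_0 + \beta_1 \ln (D^2 H) \quad \text{(Spurr)}
      \]

    the exact parametrization fitted by the authors is not reproduced here.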

  13. Multi-Resolution Wavelet-Transformed Image Analysis of Histological Sections of Breast Carcinomas

    Directory of Open Access Journals (Sweden)

    Hae-Gil Hwang

    2005-01-01

    Multi-resolution images of histological sections of breast cancer tissue were analyzed using texture features of Haar- and Daubechies-transform wavelets. The tissue samples analyzed were from ductal regions of the breast and included benign ductal hyperplasia, ductal carcinoma in situ (DCIS), and invasive ductal carcinoma (CA). To assess the correlation between computerized image analysis and visual analysis by a pathologist, we created a two-step classification system based on feature extraction and classification. In the feature extraction step, we extracted texture features from wavelet-transformed images at 10× magnification. In the classification step, we applied two types of classifiers to the extracted features, namely a statistics-based multivariate (discriminant) analysis and a neural network. Using features from second-level Haar-transform wavelet images in combination with discriminant analysis, we obtained classification accuracies of 96.67% and 87.78% for the training and testing sets (90 images each), respectively. We conclude that the best classifiers of carcinomas in histological sections of breast tissue are the texture features from the second-level Haar-transform wavelet images used in a discriminant function.
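
    The feature-extraction step, texture statistics computed from Haar-transformed images, can be sketched with PyWavelets as below; the choice of energy and entropy statistics is a common convention and an assumption on our part, since the paper's exact feature set is not listed in the abstract.

      import numpy as np
      import pywt

      def haar_texture_features(gray_image, level=2):
          """Energy and entropy of the Haar wavelet detail sub-bands up to `level`."""
          data = np.asarray(gray_image, dtype=float)
          coeffs = pywt.wavedec2(data, "haar", level=level)
          features = []
          for detail in coeffs[1:]:                 # (cH, cV, cD) tuple per level
              for band in detail:
                  energy = np.mean(band ** 2)
                  p = np.abs(band).ravel()
                  p = p / (p.sum() + 1e-12)
                  entropy = -np.sum(p * np.log2(p + 1e-12))
                  features.extend([energy, entropy])
          return np.array(features)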

  14. SU-D-202-02: Quantitative Imaging: Correlation Between Image Feature Analysis and the Accuracy of Manually Drawn Contours On PET Images

    Energy Technology Data Exchange (ETDEWEB)

    Lamichhane, N; Johnson, P; Chinea, F; Patel, V; Yang, F [University of Miami, Miami, FL (United States)

    2016-06-15

    Purpose: To evaluate the correlation between image features and the accuracy of manually drawn target contours on synthetic PET images. Methods: A digital PET phantom was used in combination with Monte Carlo simulation to create a set of 26 simulated PET images featuring a variety of tumor shapes and activity heterogeneity. These tumor volumes were used as a gold standard in comparisons with manual contours delineated by 10 radiation oncologists on the simulated PET images. Metrics used to evaluate segmentation accuracy included the Dice coefficient, false positive dice, false negative dice, symmetric mean absolute surface distance, and absolute volumetric difference. Image features extracted from the simulated tumors consisted of volume, shape complexity, mean curvature, and intensity contrast, along with five texture features derived from the gray-level neighborhood difference matrices: contrast, coarseness, busyness, strength, and complexity. Correlations between these features and contouring accuracy were examined. Results: Contour accuracy was reasonably well correlated with a variety of image features. The Dice coefficient ranged from 0.70 to 0.90 and was correlated closely with contrast (r=0.43, p=0.02) and complexity (r=0.5, p<0.001). False negative dice ranged from 0.10 to 0.50 and was correlated closely with contrast (r=0.68, p<0.001) and complexity (r=0.66, p<0.001). Absolute volumetric difference ranged from 0.0002 to 0.67 and was correlated closely with coarseness (r=0.46, p=0.02) and complexity (r=0.49, p=0.008). Symmetric mean absolute surface distance ranged from 0.02 to 1 and was correlated closely with mean curvature (r=0.57, p=0.02) and contrast (r=0.6, p=0.001). Conclusion: The long-term goal of this study is to assess whether contouring variability can be reduced by providing feedback to the practitioner based on image feature analysis. The results are encouraging and will be used to develop a statistical model which will enable a prediction of
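
    A hedged sketch of the accuracy metrics named above, computed on toy binary masks: the exact definitions of false positive/negative dice used in the abstract are assumed here to follow the common Dice-style forms shown in the comments, and the correlation step uses placeholder per-case vectors.

```python
import numpy as np
from scipy.stats import pearsonr

def dice(truth, contour):
    """Dice coefficient of two binary masks."""
    return 2.0 * np.logical_and(truth, contour).sum() / (truth.sum() + contour.sum())

def false_negative_dice(truth, contour):
    """Assumed form: truth voxels missed by the contour, Dice-normalized."""
    return 2.0 * np.logical_and(truth, ~contour).sum() / (truth.sum() + contour.sum())

def false_positive_dice(truth, contour):
    """Assumed form: contour voxels outside the truth, Dice-normalized."""
    return 2.0 * np.logical_and(~truth, contour).sum() / (truth.sum() + contour.sum())

def abs_volume_difference(truth, contour):
    return abs(int(contour.sum()) - int(truth.sum())) / truth.sum()

# Toy example: a spherical ground-truth "tumor" and a slightly shifted manual contour
z, y, x = np.ogrid[:64, :64, :64]
truth = (z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 < 10 ** 2
manual = (z - 34) ** 2 + (y - 31) ** 2 + (x - 32) ** 2 < 11 ** 2

print("Dice:", dice(truth, manual))
print("FND:", false_negative_dice(truth, manual))
print("FPD:", false_positive_dice(truth, manual))
print("AVD:", abs_volume_difference(truth, manual))

# Correlating an image feature with contour accuracy across the 26 cases would then be
# pearsonr(feature_values, dice_values); placeholder vectors are used here.
r, p = pearsonr(np.random.rand(26), np.random.rand(26))
print("Pearson r, p:", r, p)
```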

  15. The organizational measurement manual

    National Research Council Canada - National Science Library

    Wealleans, David

    2001-01-01

    ... Contents (excerpt): Relationship of process to strategic measurements -- Summary -- Part 2: Establishing a Process Measurement Programme ...

  16. Geochemical engineering reference manual

    Energy Technology Data Exchange (ETDEWEB)

    Owen, L.B.; Michels, D.E.

    1984-01-01

    The following topics are included in this manual: physical and chemical properties of geothermal brine and steam, scale and solids control, processing spent brine for reinjection, control of noncondensable gas emissions, and geothermal mineral recovery. (MHR)

  17. NCDC Archive Documentation Manuals

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Climatic Data Center Tape Deck Documentation library is a collection of over 400 manuals describing NCDC's digital holdings (both historic and current)....

  18. EMAP Users Manual.

    Science.gov (United States)

    Kotz, Arnold; Redondo, Rory

    Presented is the user's manual for the Educational Manpower Information Sources Project (EMAP), an information file containing approximately 325 document abstracts related to the field of educational planning. (The EMAP file is described in document SP 006 747.) (JB)

  19. WAVELET ANALYSIS AND NEURAL NETWORK CLASSIFIERS TO DETECT MID-SAGITTAL SECTIONS FOR NUCHAL TRANSLUCENCY MEASUREMENT

    Directory of Open Access Journals (Sweden)

    Giuseppa Sciortino

    2016-04-01

    Full Text Available We propose a methodology to support the physician in the automatic identification of mid-sagittal sections of the fetus in ultrasound videos acquired during the first trimester of pregnancy. A good mid-sagittal section is a key requirement for correct measurement of nuchal translucency (NT), one of the main markers for screening of chromosomal defects such as trisomy 13, 18 and 21. NT measurement itself is beyond the scope of this article. The proposed methodology is based mainly on wavelet analysis and neural network classifiers to detect the jawbone, and on radial symmetry analysis to detect the choroid plexus. These steps allow the identification of the frames that represent correct mid-sagittal sections to be processed. The performance of the proposed methodology was analyzed on 3000 random frames uniformly extracted from 10 real clinical ultrasound videos. With respect to a ground truth provided by an expert physician, we obtained a true positive rate, a true negative rate and a balanced accuracy of 87.26%, 94.98% and 91.12%, respectively.

  20. Evaluation of conservatisms and environmental effects in ASME Code, Section III, Class 1 fatigue analysis

    International Nuclear Information System (INIS)

    Deardorff, A.F.; Smith, J.K.

    1994-08-01

    This report documents the results of a study of the conservatisms in ASME Code Section III, Class 1 component fatigue evaluations and the effects of light water reactor (LWR) water environments on fatigue margins. After review of numerous Class 1 stress reports, it is apparent that a substantial amount of conservatism is present in many existing component fatigue evaluations. With little effort, existing evaluations could be modified to reduce the overall predicted fatigue usage. Areas of conservatism include design transients considerably more severe than those experienced during service, conservative grouping of transients, conservatisms that have been removed in later editions of Section III, bounding heat transfer and stress analysis, and use of the ''elastic-plastic penalty factor'' (K3). Environmental effects were evaluated for two typical components that experience severe transient thermal cycling during service, based on both design transients and actual plant data. For all reasonable values of actual operating parameters, environmental effects reduced predicted margins, but fatigue usage was still bounded by the ASME Section III fatigue design curves. It was concluded that the potential increase in predicted fatigue usage due to environmental effects should be more than offset by decreases in predicted fatigue usage if re-analysis were conducted to reduce the conservatisms present in existing component fatigue evaluations.

  1. QCD analysis of neutral and charged current cross sections and search for contact interactions at HERA

    Energy Technology Data Exchange (ETDEWEB)

    Pirumov, Hayk

    2013-11-15

    A QCD analysis of the inclusive deep inelastic ep scattering cross section measured by the H1 experiment at HERA is presented. The data correspond to a total integrated luminosity of about 0.5 fb^-1 and cover a kinematic range of 0.5 GeV^2 to 30000 GeV^2 in the negative four-momentum transfer squared Q^2 and 3x10^-5 to 0.65 in Bjorken x. The QCD analysis of the double differential neutral and charged current cross sections results in a set of parton distribution functions, H1PDF 2012. The precise data from the HERA II period in the kinematic region of high Q^2 considerably improve the accuracy of the PDFs at high x. In addition, a search for signs of new physics using single differential neutral current cross section measurements at high Q^2 is performed. The observed good agreement of the analysed data with the Standard Model predictions allows constraints to be set on various new physics models within the framework of contact interactions. Limits are derived on the compositeness scale for general contact interactions, on the ratio of mass to Yukawa coupling for heavy leptoquark models, on the effective Planck mass scale in large extra dimension models, and on the quark radius.

  2. Sensitivity analysis of U238 cross sections in fast nuclear systems-SENSEAV-R computer code

    International Nuclear Information System (INIS)

    Amorim, E.S. de; D'Oliveira, A.B.; Oliveira, E.C. de

    1981-01-01

    For many reactor performance parameters, the tabulated calculation/experiment ratios indicate that potential problems may exist either in the cross section data or in the calculational models used to investigate the critical experimental data. A first step toward drawing a more definite conclusion is to perform a selective analysis of sensitivity profiles and covariance data files for the cross section data used in the calculation. Many works in the current literature show that some of these uncertainties come from uncertainties in 238U(n,γ), 238U(n,f) and 239Pu(n,f). Perturbation methods were developed to analyze the effects of finite changes in a large number of cross sections and to summarize the investigation by a group-dependent sensitivity coefficient. As an application, the results of this investigation indicate that improvements should be made only in the medium and low energy ranges of 238U(n,γ), based on an analysis of cost and economic benefits. (Author) [pt

  3. DIMAC program user's manual

    International Nuclear Information System (INIS)

    Lee, Byoung Oon; Song, Tae Young

    2003-11-01

    DIMAC (A DIspersion Metallic fuel performance Analysis Code) is a computer program for simulating the behavior of dispersion fuel rods under normal operating conditions of HYPER. It computes the one-dimensional temperature distribution and the thermo-mechanical characteristics of the fuel rod under steady-state operating conditions, including swelling and rod deformation. DIMAC was developed based on experience with research reactor fuel. It consists of a temperature calculation module, a mechanical swelling calculation module, and a fuel deformation calculation module, in order to predict the deformation of a dispersion fuel as a function of power history. Because little data is available on U-TRU-Zr or TRU-Zr characteristics, the material data of U-Pu-Zr or Pu-Zr are used in their place. This report is mainly intended as a user's manual for the DIMAC code. A general description of the code, descriptions of the input parameters and of each subroutine, a sample problem, and sample input and partial output are given in this report.

  4. DIMAC program user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Byoung Oon; Song, Tae Young

    2003-11-01

    DIMAC (A DIspersion Metallic fuel performance Analysis Code) is a computer program for simulating the behavior of dispersion fuel rods under normal operating conditions of HYPER. It computes the one-dimensional temperature distribution and the thermo-mechanical characteristics of the fuel rod under steady-state operating conditions, including swelling and rod deformation. DIMAC was developed based on experience with research reactor fuel. It consists of a temperature calculation module, a mechanical swelling calculation module, and a fuel deformation calculation module, in order to predict the deformation of a dispersion fuel as a function of power history. Because little data is available on U-TRU-Zr or TRU-Zr characteristics, the material data of U-Pu-Zr or Pu-Zr are used in their place. This report is mainly intended as a user's manual for the DIMAC code. A general description of the code, descriptions of the input parameters and of each subroutine, a sample problem, and sample input and partial output are given in this report.

  5. IAEA safeguards technical manual

    International Nuclear Information System (INIS)

    1982-03-01

    Part F of the Safeguards Technical Manual is being issued in three volumes. Volume 1 was published in 1977 and revised slightly in 1979. Volume 1 discusses basic probability concepts, statistical inference, models and measurement errors, estimation of measurement variances, and calibration. These topics, of general interest in a number of application areas, are presented with examples drawn from nuclear materials safeguards. The final two chapters of Volume 1 deal with problem areas unique to safeguards: calculating the variance of MUF and of D, respectively. Volume 2 continues where Volume 1 left off with a presentation of topics of specific interest to Agency safeguards. These topics include inspection planning from a design and effectiveness evaluation viewpoint, on-facility site inspection activities, variables data analysis as applied to inspection data, preparation of inspection reports with respect to the statistical aspects of the inspection, and the distribution of inspection samples to more than one analytical laboratory. Volume 3 covers generally the same material as Volumes 1 and 2 but with much greater unity and cohesiveness. Further, the cook-book style of the previous two volumes has been replaced by one that makes use of equations and formulas rather than computational steps, and that also provides the bases for the statistical procedures discussed. Hopefully, this will help minimize the frequency of misapplications of the techniques.

  6. Evaluation of pressed powders and thin section standards for multi-elemental analysis by conventional and micro-PIXE analysis

    International Nuclear Information System (INIS)

    Homma-Takeda, Shino; Iso, Hiroyuki; Ito, Masaki

    2010-01-01

    For multi-elemental analysis, various standards are used to quantify the elements contained in environmental and biological samples. In this paper, two standards of different configuration, pressed powders and thin sections, were assessed for their suitability as standards in conventional and micro-PIXE analysis. The homogeneity of the manganese, iron, zinc (Zn), copper and yttrium added to the pressed powder standard materials was validated; the relative standard deviation (RSD) of the X-ray intensity of the standards was about 2% over the analysed area, and the metal concentration was acceptable. (author)

  7. Peace Corps Aquaculture Training Manual. Training Manual T0057.

    Science.gov (United States)

    Peace Corps, Washington, DC. Information Collection and Exchange Div.

    This Peace Corps training manual was developed from two existing manuals to provide a comprehensive training program in fish production for Peace Corps volunteers. The manual encompasses the essential elements of the University of Oklahoma program that has been training volunteers in aquaculture for 25 years. The 22 chapters of the manual are…

  8. Special Section on "Tools and Algorithms for the Construction and Analysis of Systems"

    DEFF Research Database (Denmark)

    2006-01-01

This special section contains the revised and expanded versions of eight of the papers from the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS) held in March/April 2004 in Barcelona, Spain. The conference proceedings appeared as volume 2988 in the Lecture Notes in Computer Science series published by Springer. TACAS is a forum for researchers, developers and users interested in rigorously based tools for the construction and analysis of systems. The conference serves to bridge the gaps between different communities – including but not limited to those devoted to formal methods, software and hardware verification, static analysis, programming languages, software engineering, real-time systems, and communications protocols – that share common interests in, and techniques for, tool development. Other more theoretical papers from the conference...

  9. Nonlinear Analysis and Preliminary Testing Results of a Hybrid Wing Body Center Section Test Article

    Science.gov (United States)

    Przekop, Adam; Jegley, Dawn C.; Rouse, Marshall; Lovejoy, Andrew E.; Wu, Hsi-Yung T.

    2015-01-01

    A large test article was recently designed, analyzed, fabricated, and successfully tested up to the representative design ultimate loads to demonstrate that stiffened composite panels with through-the-thickness reinforcement are a viable option for the next generation large transport category aircraft, including non-conventional configurations such as the hybrid wing body. This paper focuses on finite element analysis and test data correlation of the hybrid wing body center section test article under mechanical, pressure and combined load conditions. Good agreement between predictive nonlinear finite element analysis and test data is found. Results indicate that a geometrically nonlinear analysis is needed to accurately capture the behavior of the non-circular pressurized and highly-stressed structure when the design approach permits local buckling.

  10. Role of ''standard'' fine-group cross section libraries in shielding analysis

    International Nuclear Information System (INIS)

    Weisbin, C.R.; Roussin, R.W.; Oblow, E.M.; Cullen, D.E.; White, J.E.; Wright, R.Q.

    1977-01-01

    The Divisions of Magnetic Fusion Energy (DMFE) and Reactor Development and Demonstration (DRDD) of the United States Energy Research and Development Administration (ERDA) have jointly sponsored the development of a 171 neutron, 36 gamma ray group pseudo composition independent cross section library based upon ENDF/B-IV. This library (named VITAMIN-C and packaged by RSIC as DLC-41) is intended to be generally applicable to fusion blanket and LMFBR core and shield analysis. The purpose of this paper is to evaluate this library as a possible candidate for specific designation as a ''standard'' in light of American Nuclear Society standards for fine-group cross section data sets. The rationale and qualification procedure for such a standard are discussed. Finally, current limitations and anticipated extensions to this processed data file are described

  11. Finite element analysis of composite beam-to-column connection with cold-formed steel section

    Science.gov (United States)

    Firdaus, Muhammad; Saggaff, Anis; Tahir, Mahmood Md

    2017-11-01

    Cold-formed steel (CFS) sections are well known for their light weight and high structural performance, which makes them very popular in building construction. Conventionally, they are used as purlins and side rails in the building envelopes of industrial buildings. Recent research on cold-formed steel has shown that their use can be extended to composite construction. This paper presents the modelling of a proposed composite beam-to-column connection in which a cold-formed lipped steel section is positioned back-to-back to act as the beam. Reinforcement bars are used to provide the composite action, anchored to the column with part of their length embedded in the slab. The results of the finite element and numerical analyses show good agreement. The results show that the proposed composite connection contributes a significant increase in moment capacity.

  12. A Cross-Section Adjustment Method for Double Heterogeneity Problem in VHTGR Analysis

    International Nuclear Information System (INIS)

    Yun, Sung Hwan; Cho, Nam Zin

    2011-01-01

    Very High Temperature Gas-Cooled Reactors (VHTGRs) draw strong interest as candidates for a Gen-IV reactor concept, in which TRISO (tristructural-isotropic) fuel is employed to enhance fuel performance. However, the TRISO fuel particles randomly dispersed in a graphite matrix induce the so-called double heterogeneity problem. For the design and analysis of reactors with the double heterogeneity problem, the Monte Carlo method is widely used because of its complex-geometry and continuous-energy capabilities. However, even with modern computing power, its huge computational burden still makes whole-core analysis in the reactor design procedure problematic. To address the double heterogeneity problem with conventional lattice codes, the RPT (Reactivity-equivalent Physical Transformation) method considers a homogenized fuel region that is geometrically transformed to provide an equivalent self-shielding effect. Another method is the coupled Monte Carlo/Collision Probability method, in which the absorption and nu-fission resonance cross-section libraries in the deterministic CPM3 lattice code are modified group-wise by double heterogeneity factors determined from Monte Carlo results. In this paper, a new two-step Monte Carlo homogenization method is described as an alternative to the methods above. In the new method, a single cross-section adjustment factor is introduced to provide a self-shielding effect equivalent to the self-shielding in the heterogeneous geometry of a unit cell of compact fuel. The homogenized fuel compact material with the equivalent cross-section adjustment factor is then used in continuous-energy Monte Carlo calculations for various types of fuel blocks (or assemblies). The procedure for the cross-section adjustment is implemented in the MCNP5 code.
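
    The equivalence idea behind a single cross-section adjustment factor can be illustrated with a toy model (this is not the MCNP5 procedure described above): choose the factor so that a simple flux-weighted absorption rate of the homogenized compact reproduces a reference value that would come from the heterogeneous calculation. All numbers below are illustrative, not evaluated data.

```python
import numpy as np
from scipy.optimize import brentq

# Toy narrow-resonance-style model for a homogenized fuel region.
SIGMA_A0 = 50.0   # unadjusted resonance absorption cross section (barn), illustrative
SIGMA_B = 60.0    # background (moderator + potential scattering) cross section (barn)

def shielded_rate(f):
    """Flux-weighted absorption rate with flux ~ sigma_b / (f*sigma_a0 + sigma_b)."""
    sa = f * SIGMA_A0
    return sa * SIGMA_B / (sa + SIGMA_B)

# Reference value standing in for the heterogeneous (e.g. continuous-energy
# Monte Carlo) result for the double-heterogeneous compact.
reference_rate = 22.0

# Solve for the single adjustment factor that reproduces the reference rate.
f_adj = brentq(lambda f: shielded_rate(f) - reference_rate, 1e-6, 10.0)
print("cross-section adjustment factor:", round(f_adj, 4))
print("check (adjusted rate):", shielded_rate(f_adj))
```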

  13. Measurement and QCD analysis of the diffractive deep-inelastic scattering cross section at HERA

    International Nuclear Information System (INIS)

    Aktas, A.; Andreev, V.; Anthonis, T.

    2006-05-01

    A detailed analysis is presented of the diffractive deep-inelastic scattering process ep→eXY, where Y is a proton or a low mass proton excitation carrying a fraction 1-x_P>0.95 of the incident proton longitudinal momentum and the squared four-momentum transfer at the proton vertex satisfies |t|<1 GeV^2. Using data taken by the H1 experiment, the cross section is measured for photon virtualities in the range 3.5≤Q^2≤1600 GeV^2, triple differentially in x_P, Q^2 and β=x/x_P, where x is the Bjorken scaling variable. At low x_P, the data are consistent with a factorisable x_P dependence, which can be described by the exchange of an effective pomeron trajectory with intercept α_P(0)=1.118±0.008(exp.)+0.029-0.010(model). Diffractive parton distribution functions and their uncertainties are determined from a next-to-leading order DGLAP QCD analysis of the Q^2 and β dependences of the cross section. The resulting gluon distribution carries an integrated fraction of around 70% of the exchanged momentum in the Q^2 range studied. Total and differential cross sections are also measured for the diffractive charged current process e+p → anti-ν_e XY and are found to be well described by predictions based on the diffractive parton distributions. The ratio of the diffractive to the inclusive neutral current ep cross sections is studied. Over most of the kinematic range, this ratio shows no significant dependence on Q^2 at fixed x_P and x or on x at fixed Q^2 and β. (Orig.)

  14. Measurement and QCD analysis of the diffractive deep-inelastic scattering cross section at HERA

    Science.gov (United States)

    Aktas, A.; Andreev, V.; Anthonis, T.; Antunovic, B.; Aplin, S.; Asmone, A.; Astvatsatourov, A.; Babaev, A.; Backovic, S.; Baghdasaryan, A.; Baranov, P.; Barrelet, E.; Bartel, W.; Baudrand, S.; Baumgartner, S.; Beckingham, M.; Behnke, O.; Behrendt, O.; Belousov, A.; Berger, N.; Bizot, J. C.; Boenig, M.-O.; Boudry, V.; Bracinik, J.; Brandt, G.; Brisson, V.; Bruncko, D.; Büsser, F. W.; Bunyatyan, A.; Buschhorn, G.; Bystritskaya, L.; Campbell, A. J.; Cassol-Brunner, F.; Cerny, K.; Cerny, V.; Chekelian, V.; Contreras, J. G.; Coughlan, J. A.; Coppens, Y. R.; Cox, B. E.; Cozzika, G.; Cvach, J.; Dainton, J. B.; Dau, W. D.; Daum, K.; de Boer, Y.; Delcourt, B.; Del Degan, M.; de Roeck, A.; de Wolf, E. A.; Diaconu, C.; Dodonov, V.; Dubak, A.; Eckerlin, G.; Efremenko, V.; Egli, S.; Eichler, R.; Eisele, F.; Eliseev, A.; Elsen, E.; Essenov, S.; Falkewicz, A.; Faulkner, P. J. W.; Favart, L.; Fedotov, A.; Felst, R.; Feltesse, J.; Ferencei, J.; Finke, L.; Fleischer, M.; Flucke, G.; Fomenko, A.; Franke, G.; Frisson, T.; Gabathuler, E.; Garutti, E.; Gayler, J.; Gerlich, C.; Ghazaryan, S.; Ginzburgskaya, S.; Glazov, A.; Glushkov, I.; Goerlich, L.; Goettlich, M.; Gogitidze, N.; Gorbounov, S.; Grab, C.; Greenshaw, T.; Gregori, M.; Grell, B. R.; Grindhammer, G.; Gwilliam, C.; Haidt, D.; Hansson, M.; Heinzelmann, G.; Henderson, R. C. W.; Henschel, H.; Herrera, G.; Hildebrandt, M.; Hiller, K. H.; Hoffmann, D.; Horisberger, R.; Hovhannisyan, A.; Hreus, T.; Hussain, S.; Ibbotson, M.; Ismail, M.; Jacquet, M.; Janssen, X.; Jemanov, V.; Jönsson, L.; Johnson, C. L.; Johnson, D. P.; Jung, A. W.; Jung, H.; Kapichine, M.; Katzy, J.; Kenyon, I. R.; Kiesling, C.; Klein, M.; Kleinwort, C.; Klimkovich, T.; Kluge, T.; Knies, G.; Knutsson, A.; Korbel, V.; Kostka, P.; Krastev, K.; Kretzschmar, J.; Kropivnitskaya, A.; Krüger, K.; Landon, M. P. J.; Lange, W.; Laštovička-Medin, G.; Laycock, P.; Lebedev, A.; Leibenguth, G.; Lendermann, V.; Levonian, S.; Lindfeld, L.; Lipka, K.; Liptaj, A.; List, B.; List, J.; Lobodzinska, E.; Loktionova, N.; Lopez-Fernandez, R.; Lubimov, V.; Lucaci-Timoce, A.-I.; Lueders, H.; Lux, T.; Lytkin, L.; Makankine, A.; Malden, N.; Malinovski, E.; Marage, P.; Marshall, R.; Marti, L.; Martisikova, M.; Martyn, H.-U.; Maxfield, S. J.; Mehta, A.; Meier, K.; Meyer, A. B.; Meyer, H.; Meyer, J.; Michels, V.; Mikocki, S.; Milcewicz-Mika, I.; Milstead, D.; Mladenov, D.; Mohamed, A.; Moreau, F.; Morozov, A.; Morris, J. V.; Mozer, M. U.; Müller, K.; Murín, P.; Nankov, K.; Naroska, B.; Naumann, T.; Newman, P. R.; Niebuhr, C.; Nikiforov, A.; Nowak, G.; Nowak, K.; Nozicka, M.; Oganezov, R.; Olivier, B.; Olsson, J. E.; Osman, S.; Ozerov, D.; Palichik, V.; Panagoulias, I.; Papadopoulou, T.; Pascaud, C.; Patel, G. D.; Peng, H.; Perez, E.; Perez-Astudillo, D.; Perieanu, A.; Petrukhin, A.; Pitzl, D.; Plačakytė, R.; Portheault, B.; Povh, B.; Prideaux, P.; Rahmat, A. J.; Raicevic, N.; Reimer, P.; Rimmer, A.; Risler, C.; Rizvi, E.; Robmann, P.; Roland, B.; Roosen, R.; Rostovtsev, A.; Rurikova, Z.; Rusakov, S.; Salvaire, F.; Sankey, D. P. C.; Sauter, M.; Sauvan, E.; Schilling, F.-P.; Schmidt, S.; Schmitt, S.; Schmitz, C.; Schoeffel, L.; Schöning, A.; Schultz-Coulon, H.-C.; Sefkow, F.; Shaw-West, R. N.; Sheviakov, I.; Shtarkov, L. N.; Sloan, T.; Smirnov, P.; Soloviev, Y.; South, D.; Spaskov, V.; Specka, A.; Steder, M.; Stella, B.; Stiewe, J.; Stoilov, A.; Straumann, U.; Sunar, D.; Tchoulakov, V.; Thompson, G.; Thompson, P. D.; Toll, T.; Tomasz, F.; Traynor, D.; Trinh, T. 
N.; Truöl, P.; Tsakov, I.; Tsipolitis, G.; Tsurin, I.; Turnau, J.; Tzamariudaki, E.; Urban, K.; Urban, M.; Usik, A.; Utkin, D.; Valkárová, A.; Vallée, C.; van Mechelen, P.; Vargas Trevino, A.; Vazdik, Y.; Veelken, C.; Vinokurova, S.; Volchinski, V.; Wacker, K.; Weber, G.; Weber, R.; Wegener, D.; Werner, C.; Wessels, M.; Wessling, B.; Wissing, C.; Wolf, R.; Wünsch, E.; Xella, S.; Yan, W.; Yeganov, V.; Žáček, J.; Zálešák, J.; Zhang, Z.; Zhelezov, A.; Zhokin, A.; Zhu, Y. C.; Zimmermann, J.; Zimmermann, T.; Zohrabyan, H.; Zomer, F.

    2006-12-01

    A detailed analysis is presented of the diffractive deep-inelastic scattering process ep→eXY, where Y is a proton or a low mass proton excitation carrying a fraction 1-xIP>0.95 of the incident proton longitudinal momentum and the squared four-momentum transfer at the proton vertex satisfies |t|<1 GeV2. Using data taken by the H1 experiment, the cross section is measured for photon virtualities in the range 3.5≤Q2≤1600 GeV2, triple differentially in xIP, Q2 and β=x/xIP, where x is the Bjorken scaling variable. At low xIP, the data are consistent with a factorisable xIP dependence, which can be described by the exchange of an effective pomeron trajectory with intercept αIP(0)=1.118±0.008(exp.)+0.029 -0.010(model). Diffractive parton distribution functions and their uncertainties are determined from a next-to-leading order DGLAP QCD analysis of the Q2 and β dependences of the cross section. The resulting gluon distribution carries an integrated fraction of around 70% of the exchanged momentum in the Q2 range studied. Total and differential cross sections are also measured for the diffractive charged current process e+p→ν¯eXY and are found to be well described by predictions based on the diffractive parton distributions. The ratio of the diffractive to the inclusive neutral current ep cross sections is studied. Over most of the kinematic range, this ratio shows no significant dependence on Q2 at fixed xIP and x or on x at fixed Q2 and β.

  15. Nuclear science references coding manual

    International Nuclear Information System (INIS)

    Ramavataram, S.; Dunford, C.L.

    1996-08-01

    This manual is intended as a guide for Nuclear Science References (NSR) compilers. The basic conventions followed at the National Nuclear Data Center (NNDC), which are compatible with the maintenance and updating of, and retrieval from, the Nuclear Science References (NSR) file, are outlined. In Section II, the structure of the NSR file, such as the valid record identifiers, record contents and text fields, as well as the major TOPICS for which keyword abstracts are prepared, are enumerated. Relevant comments regarding a new entry into the NSR file, assignment and generation of keynumbers, and linkage characteristics are also given in Section II. In Section III, a brief definition of the Keyword abstract is given, followed by specific examples; for each TOPIC, the criteria for inclusion of an article as an entry into the NSR file as well as the coding procedures are described. Authors preparing Keyword abstracts either to be published in a journal (e.g., Nucl. Phys. A) or to be sent directly to NNDC (e.g., Phys. Rev. C) should follow the illustrations in Section III. The scope of the literature covered at the NNDC, the categorization into Primary and Secondary sources, etc., is discussed in Section IV. Useful information regarding permitted character sets, recommended abbreviations, etc., is given in Section V as appendices.

  16. SMACS: a system of computer programs for probabilistic seismic analysis of structures and subsystems. Volume I. User's manual

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Johnson, J.J.; Tiong, L.W.; Mraz, M.J.; Bumpus, S.; Gerhard, M.A.

    1985-03-01

    The SMACS (Seismic Methodology Analysis Chain with Statistics) system of computer programs, one of the major computational tools of the Seismic Safety Margins Research Program (SSMRP), links the seismic input with the calculation of soil-structure interaction, major structure response, and subsystem response. The seismic input is defined by ensembles of acceleration time histories in three orthogonal directions. Soil-structure interaction and detailed structural response are then determined simultaneously, using the substructure approach to SSI as implemented in the CLASSI family of computer programs. The modus operandi of SMACS is to perform repeated deterministic analyses, each analysis simulating an earthquake occurrence. Parameter values for each simulation are sampled from assumed probability distributions according to a Latin hypercube experimental design. The user may specify values of the coefficients of variation (COV) for the distributions of the input variables. At the heart of the SMACS system is the computer program SMAX, which performs the repeated SSI response calculations for major structure and subsystem response. This report describes SMAX and the pre- and post-processor codes, used in conjunction with it, that comprise the SMACS system
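
    A minimal sketch of the Latin hypercube sampling step described above, assuming lognormally distributed input variables specified by a median and a coefficient of variation; the variable names and values are hypothetical, and SciPy's qmc module stands in for whatever sampler SMACS uses internally.

```python
import numpy as np
from scipy.stats import qmc, lognorm

# Hypothetical input variables: (median, coefficient of variation)
variables = {
    "soil_shear_modulus": (1.0, 0.35),
    "structure_damping": (0.05, 0.30),
    "subsystem_frequency": (8.0, 0.20),
}

n_simulations = 30  # each sample corresponds to one simulated earthquake analysis
sampler = qmc.LatinHypercube(d=len(variables), seed=7)
u = sampler.random(n_simulations)            # uniform samples in [0, 1)^d

samples = np.empty_like(u)
for j, (name, (median, cov)) in enumerate(variables.items()):
    # Lognormal with given median and COV: sigma_ln = sqrt(ln(1 + COV^2))
    sigma_ln = np.sqrt(np.log(1.0 + cov**2))
    samples[:, j] = lognorm.ppf(u[:, j], s=sigma_ln, scale=median)

for name, col in zip(variables, samples.T):
    print(f"{name}: median~{np.median(col):.3f}, cov~{col.std()/col.mean():.2f}")
```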

  17. R-matrix analysis of the 235U neutron cross sections

    International Nuclear Information System (INIS)

    Leal, L.C.; de Saussure, G.; Perez, R.B.

    1988-01-01

    The ENDF/B-V representation of the 235U neutron cross sections in the resolved resonance region is unsatisfactory: below 1 eV the cross sections are given by ''smooth files'' (file 3) rather than by resonance parameters; above 1 eV the single-level formalism used by ENDF/B-V necessitates a structured file 3 contribution consisting of more than 1300 energy points; furthermore, information on level spins has not been included. Indeed, the ENDF/B-V 235U resonance region is based on an analysis done in 1970 for ENDF/B-III and therefore does not include the results of the high quality measurements made in the past 18 years. The present paper presents the results of an R-matrix multilevel analysis of recent measurements as well as older data. The analysis also extends the resolved resonance region from its ENDF/B-V upper limit of 81 eV to 110 eV. 13 refs., 2 figs., 1 tab

  18. Quantitative allochem compositional analysis of Lochkovian-Pragian boundary sections in the Prague Basin (Czech Republic)

    Science.gov (United States)

    Weinerová, Hedvika; Hron, Karel; Bábek, Ondřej; Šimíček, Daniel; Hladil, Jindřich

    2017-06-01

    Quantitative allochem compositional trends across the Lochkovian-Pragian boundary Event were examined at three sections recording the proximal to more distal carbonate ramp environment of the Prague Basin. Multivariate statistical methods (principal component analysis, correspondence analysis, cluster analysis) of point-counted thin section data were used to reconstruct facies stacking patterns and sea-level history. Both the closed-nature allochem percentages and their centred log-ratio (clr) coordinates were used. Both these approaches allow for distinguishing of lowstand, transgressive and highstand system tracts within the Praha Formation, which show gradual transition from crinoid-dominated facies deposited above the storm wave base to dacryoconarid-dominated facies of deep-water environment below the storm wave base. Quantitative compositional data also indicate progradative-retrogradative trends in the macrolithologically monotonous shallow-water succession and enable its stratigraphic correlation with successions from deeper-water environments. Generally, the stratigraphic trends of the clr data are more sensitive to subtle changes in allochem composition in comparison to the results based on raw data. A heterozoan-dominated allochem association in shallow-water environments of the Praha Formation supports the carbonate ramp environment assumed by previous authors.
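
    The centred log-ratio (clr) step is well defined: clr(x)_i = ln(x_i / g(x)), with g(x) the geometric mean of the composition. Below is a minimal sketch on hypothetical point-count percentages, followed by an ordinary PCA as a stand-in for the multivariate step (zero counts would need replacement before taking logarithms).

```python
import numpy as np
from sklearn.decomposition import PCA

def clr(compositions):
    """Centred log-ratio transform: ln(x_i / geometric_mean(x)) per sample (row)."""
    log_x = np.log(compositions)
    return log_x - log_x.mean(axis=1, keepdims=True)

# Hypothetical point-count percentages per thin section (rows sum to 100);
# columns could be, e.g., crinoids, dacryoconarids, brachiopods, peloids.
counts = np.array([
    [55.0, 10.0, 20.0, 15.0],
    [40.0, 25.0, 20.0, 15.0],
    [20.0, 50.0, 15.0, 15.0],
    [10.0, 65.0, 15.0, 10.0],
])

scores = PCA(n_components=2).fit_transform(clr(counts))
print(scores)  # principal component scores of the clr coordinates
```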

  19. Age-Related Trends in Hip Arthroscopy: A Large Cross-Sectional Analysis.

    Science.gov (United States)

    Sing, David C; Feeley, Brian T; Tay, Bobby; Vail, Thomas P; Zhang, Alan L

    2015-12-01

    To analyze a large national private-payer population in the United States for trends over time in hip arthroscopy by age group and to determine the rate of conversion to total hip arthroplasty (THA) after hip arthroscopy. We performed a retrospective analysis using the PearlDiver private insurance patient record database from 2007 through 2011. Hip arthroscopy procedures, including newly introduced codes such as osteochondroplasty of cam and pincer lesions and labral repair, were queried. Hip arthroscopy incidence and conversion rates to THA were stratified by age. Chi-squared analysis was used for statistical comparison. Conversion to THA was evaluated using Kaplan-Meier analysis. From 2007 through 2011, 20,484,172 orthopaedic patients were analyzed. Hip arthroscopy was performed in 8,227 cases (mean annual incidence, 2.7 cases per 10,000 orthopaedic patients). The incidence of hip arthroscopies increased over 250%, from 1.6 cases per 10,000 in 2007 to 4.0 cases per 10,000 in 2011. Within 2 years of hip arthroscopy, 17% of patients older than 50 required conversion to THA. Hip arthroscopy procedures are increasing in popularity across all age groups, with patients ages 40 to 49 having the highest incidence in this large cross-sectional population, despite a high rate of early conversion to THA within 2 years in patients over 50. Level of evidence: IV, cross-sectional study. Copyright © 2015 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
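
    The two statistical steps named in the abstract (chi-squared comparison and Kaplan-Meier analysis) can be sketched on hypothetical numbers as follows; none of the counts or follow-up times below are from the study.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: conversion to THA within 2 years vs. not, by age group
table = np.array([[34, 166],    # age > 50: converted, not converted
                  [12, 388]])   # age <= 50
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-square={chi2:.2f}, p={p:.4f}")

def kaplan_meier(times, events):
    """Product-limit survival estimate; events: 1 = conversion to THA, 0 = censored."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, s = [], 1.0
    n_at_risk = len(times)
    for t in np.unique(times):
        at_t = times == t
        d = events[at_t].sum()
        if d:
            s *= 1.0 - d / n_at_risk
        n_at_risk -= at_t.sum()
        surv.append((t, s))
    return surv

# Hypothetical follow-up (months) and conversion indicator
t = [3, 6, 6, 9, 12, 15, 18, 24, 24, 24]
e = [1, 1, 0, 1, 0, 1, 0, 0, 0, 1]
for ti, si in kaplan_meier(t, e):
    print(f"t={ti:>4.0f} months  S(t)={si:.3f}")
```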

  20. Resource Manual for Functional Vision Evaluation. Technical Assistance Bulletin.

    Science.gov (United States)

    Texas Education Agency, Austin. Div. of Special Education.

    The manual was intended to help school personnel (either an educator of the visually handicapped and/or a certified orientation and mobility instructor) evaluate the functional use of residual vision. The manual is organized in three major sections, which cover the following: (1) the general areas addressed in functional vision evaluation (such as…

  1. Partners Plus: Families and Caregivers in Partnerships. A Family-Centered Guide to Respite Care. Trainer's Workshop Manual [and] Community Planning Manual [and] A Family Manual [and] Caregiver Manual.

    Science.gov (United States)

    Ownby, Lisa L.; Hooke, Amanda C.; Moore, Dee Wylie; Garland, Corinne W.; Frank, Adrienne

    Four manuals on implementing the Partners Respite Model, which provides respite care for children with disabilities or chronic illnesses, comprise this document. The Community Planning Manual offers a step-by-step guide to replication of the Partners Respite Model and is divided into sections on developing the Partners program, implementing the…

  2. Softball for Boys and Girls. Skills Test Manual.

    Science.gov (United States)

    Rikli, Roberta E., Ed.

    The first section of this manual provides information on the history of softball and the development of testing for proficiency in the game. The tests in the manual cover batting, fielding ground balls, overhand throwing, and baserunning. Test norms are listed for males and females at each grade level. A review is included of proper techniques and…

  3. Environmental Measurements Laboratory (EML) procedures manual

    International Nuclear Information System (INIS)

    Chieco, N.A.; Bogen, D.C.; Knutson, E.O.

    1990-11-01

    Volume 1 of this manual documents the procedures and existing technology that are currently used by the Environmental Measurements Laboratory. A section devoted to quality assurance has been included. These procedures have been updated and revised and new procedures have been added. They include: sampling; radiation measurements; analytical chemistry; radionuclide data; special facilities; and specifications. 228 refs., 62 figs., 37 tabs. (FL)

  4. Public Affairs Manual. Revised 1976 Edition.

    Science.gov (United States)

    American Alliance for Health, Physical Education, and Recreation, Washington, DC.

    This public affairs manual is designed for health, physical education, and recreation personnel. It begins with a position statement by the American Alliance for Health, Physical Education, and Recreation (AAHPER). In section two, resources and procedures for crises action at the local and state level are discussed. Several organizational models…

  5. NJOY nuclear data processing system: user's manual

    International Nuclear Information System (INIS)

    MacFarlane, R.E.; Barrett, R.J.; Muir, D.W.; Boicourt, R.M.

    1978-12-01

    The NJOY nuclear data processing system is a comprehensive computer code package for producing cross sections for neutron and photon transport calculations from ENDF/B-IV and -V evaluated nuclear data. This user's manual provides a concise description of the code, input instructions, sample problems, and installation instructions. 1 figure, 3 tables

  6. Interlibrary Loan Communications Subsystem: Users Manual.

    Science.gov (United States)

    OCLC Online Computer Library Center, Inc., Dublin, OH.

    The OCLC Interlibrary Loan (ILL) Communications Subsystem provides participating libraries with on-line control of ILL transactions. This user manual includes a glossary of terms related to the procedures in using the system. Sections describe computer entry, searching, loan request form, loan response form, ILL procedures, the special message…

  7. User's manual for the National Water-Quality Assessment Program Invertebrate Data Analysis System (IDAS) software, version 5

    Science.gov (United States)

    Cuffney, Thomas F.; Brightbill, Robin A.

    2011-01-01

    The Invertebrate Data Analysis System (IDAS) software was developed to provide an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program. The IDAS software is a stand-alone program for personal computers that run Microsoft Windows(Registered). It allows users to read data downloaded from the NAWQA Program Biological Transactional Database (Bio-TDB) or to import data from other sources either as Microsoft Excel(Registered) or Microsoft Access(Registered) files. The program consists of five modules: Edit Data, Data Preparation, Calculate Community Metrics, Calculate Diversities and Similarities, and Data Export. The Edit Data module allows the user to subset data on the basis of taxonomy or sample type, extract a random subsample of data, combine or delete data, summarize distributions, resolve ambiguous taxa (see glossary) and conditional/provisional taxa, import non-NAWQA data, and maintain and create files of invertebrate attributes that are used in the calculation of invertebrate metrics. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa on the basis of laboratory processing notes, delete pupae or terrestrial adults, combine lifestages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa on the basis of the number of sites where a taxon occurs and (or) the abundance of a taxon in a sample, and resolve taxonomic ambiguities by one of four methods. The Calculate Community Metrics module allows the user to calculate 184 community metrics, including metrics based on organism tolerances, functional feeding groups, and behavior. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data Export module allows the user to export data to other software packages (CANOCO, Primer
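
    As an illustration of the kind of indices the Calculate Diversities and Similarities module computes, here is a hedged sketch of one diversity index (Shannon) and one similarity index (Bray-Curtis) on hypothetical abundance vectors; IDAS itself offers nine diversity and eight similarity indices, and its exact formulations are not reproduced here.

```python
import numpy as np

def shannon_diversity(abundances):
    """Shannon-Wiener index H' = -sum(p_i * ln p_i) over taxa with nonzero counts."""
    x = np.asarray(abundances, float)
    p = x[x > 0] / x.sum()
    return -np.sum(p * np.log(p))

def bray_curtis_similarity(a, b):
    """Bray-Curtis similarity = 2 * sum(min(a_i, b_i)) / (sum(a) + sum(b))."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return 2.0 * np.minimum(a, b).sum() / (a.sum() + b.sum())

# Hypothetical invertebrate abundance vectors (taxa as columns) for two samples
site1 = [12, 0, 30, 5, 3]
site2 = [10, 4, 18, 0, 9]
print("H' site1:", round(shannon_diversity(site1), 3))
print("H' site2:", round(shannon_diversity(site2), 3))
print("Bray-Curtis similarity:", round(bray_curtis_similarity(site1, site2), 3))
```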

  8. NETL CO2 Storage prospeCtive Resource Estimation Excel aNalysis (CO2-SCREEN) User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Sanguinito, Sean M. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Goodman, Angela [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Levine, Jonathan [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2017-04-03

    This user’s manual guides the use of the National Energy Technology Laboratory’s (NETL) CO2 Storage prospeCtive Resource Estimation Excel aNalysis (CO2-SCREEN) tool, which was developed to aid users in screening saline formations for prospective CO2 storage resources. CO2-SCREEN applies U.S. Department of Energy (DOE) methods and equations for estimating prospective CO2 storage resources in saline formations. CO2-SCREEN was developed to be substantive and user-friendly. It also provides a consistent method for calculating prospective CO2 storage resources that allows results from different research efforts, such as the Regional Carbon Sequestration Partnerships (RCSP), to be compared on a common basis. CO2-SCREEN consists of an Excel spreadsheet containing geologic inputs and outputs, linked to a GoldSim Player model that calculates prospective CO2 storage resources via Monte Carlo simulation.
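
    A minimal sketch of a DOE-style volumetric estimate evaluated by Monte Carlo, of the general form G = A · h · φ · ρ · E; the parameter ranges, distributions and efficiency factors below are illustrative assumptions, not CO2-SCREEN defaults or DOE-recommended values.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000  # Monte Carlo realizations

# Illustrative saline-formation inputs (not CO2-SCREEN defaults):
area_m2 = rng.uniform(4.0e8, 6.0e8, n)           # formation area (m^2)
thick_m = rng.uniform(30.0, 80.0, n)             # net thickness (m)
porosity = rng.uniform(0.10, 0.25, n)            # total porosity (-)
rho_co2 = rng.uniform(600.0, 750.0, n)           # CO2 density at reservoir conditions (kg/m^3)
eff = rng.triangular(0.01, 0.02, 0.06, n)        # storage efficiency factor (-)

# Volumetric prospective storage resource: G = A * h * phi * rho * E
g_kg = area_m2 * thick_m * porosity * rho_co2 * eff
g_mt = g_kg / 1.0e9   # kilograms -> million metric tons

p10, p50, p90 = np.percentile(g_mt, [10, 50, 90])
print(f"Prospective storage (Mt CO2): P10={p10:.0f}, P50={p50:.0f}, P90={p90:.0f}")
```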

  9. FAMIAS User Manual

    Science.gov (United States)

    Zima, Wolfgang

    2008-10-01

    The excitation of pulsation modes in Beta Cephei and Slowly Pulsating B stars is known to be very sensitive to opacity changes in the stellar interior where T ~ 2 x 10^5 K. In this region, differences in opacity of up to ~50% can be induced by the choice between OPAL and OP opacity tables, and between two different metal mixtures (Grevesse & Noels 1993 and Asplund et al. 2005). We have extended the non-adiabatic computations presented in Miglio et al. (2007) towards models of higher mass and pulsation modes of degree l = 3, and we present here the instability domains in the HR and log P-log Teff diagrams resulting from different choices of opacity tables, and for three different metallicities. FAMIAS (Frequency Analysis and Mode Identification for AsteroSeismology) is a collection of state-of-the-art software tools for the analysis of photometric and spectroscopic time series data. It is one of the deliverables of the Work Package NA5: Asteroseismology of the European Coordination Action in Helio- and Asteroseismology (HELAS). Two main sets of tools are incorporated in FAMIAS. The first set allows the user to search for periodicities in the data using Fourier and non-linear least-squares fitting algorithms. The other set allows a mode identification to be carried out for the detected pulsation frequencies, to determine their pulsational quantum numbers, the harmonic degree l and the azimuthal order m. The types of stars to which FAMIAS is applicable are main-sequence pulsators hotter than the Sun. This includes the Gamma Dor stars, Delta Sct stars, the Slowly Pulsating B stars and the Beta Cep stars - basically all pulsating main-sequence stars for which empirical mode identification is required to successfully carry out asteroseismology. This user manual describes how to use the different features of FAMIAS and provides two tutorials that demonstrate the use of FAMIAS for spectroscopic and photometric mode identification.
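
    For the period-search side described above, here is a minimal sketch of a Lomb-Scargle style frequency scan of an unevenly sampled synthetic light curve; it uses SciPy's lombscargle as a generic stand-in and is not FAMIAS code, and the signal parameters are invented.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(3)

# Unevenly sampled synthetic light curve with one pulsation frequency
t = np.sort(rng.uniform(0.0, 30.0, 400))          # observation times (days)
f_true = 6.4                                      # true frequency (cycles per day)
y = 0.01 * np.sin(2 * np.pi * f_true * t + 0.3) + rng.normal(0, 0.004, 400)

# Scan trial frequencies; scipy's lombscargle expects angular frequencies
freqs = np.linspace(0.5, 15.0, 20000)             # cycles per day
power = lombscargle(t, y - y.mean(), 2 * np.pi * freqs)

print("strongest peak at %.3f cycles/day (true %.1f)" % (freqs[np.argmax(power)], f_true))
```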

  10. SAM Theory Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Rui [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-03-01

    The System Analysis Module (SAM) is an advanced and modern system analysis tool being developed at Argonne National Laboratory under the U.S. DOE Office of Nuclear Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM development aims for advances in physical modeling, numerical methods, and software engineering to enhance its user experience and usability for reactor transient analyses. To facilitate the code development, SAM utilizes an object-oriented application framework (MOOSE), and its underlying meshing and finite-element library (libMesh) and linear and non-linear solvers (PETSc), to leverage modern advanced software environments and numerical methods. SAM focuses on modeling advanced reactor concepts such as SFRs (sodium fast reactors), LFRs (lead-cooled fast reactors), and FHRs (fluoride-salt-cooled high temperature reactors) or MSRs (molten salt reactors). These advanced concepts are distinguished from light-water reactors in their use of single-phase, low-pressure, high-temperature, and low Prandtl number (sodium and lead) coolants. As a new code development, the initial effort has been focused on modeling and simulation capabilities of heat transfer and single-phase fluid dynamics responses in Sodium-cooled Fast Reactor (SFR) systems. The system-level simulation capabilities of fluid flow and heat transfer in general engineering systems and typical SFRs have been verified and validated. This document provides the theoretical and technical basis of the code to help users understand the underlying physical models (such as governing equations, closure models, and component models), system modeling approaches, numerical discretization and solution methods, and the overall capabilities in SAM. As the code is still under ongoing development, this SAM Theory Manual will be updated periodically to keep it consistent with the state of the development.

  11. COBRA-SFS [Spent Fuel Storage]: A thermal-hydraulic analysis computer code: Volume 2, User's manual

    International Nuclear Information System (INIS)

    Rector, D.R.; Cuta, J.M.; Lombardo, N.J.; Michener, T.E.; Wheeler, C.L.

    1986-11-01

    COBRA-SFS (Spent Fuel Storage) is a general thermal-hydraulic analysis computer code used to predict temperatures and velocities in a wide variety of systems. The code was refined and specialized for spent fuel storage system analyses for the US Department of Energy's Commercial Spent Fuel Management Program. The finite-volume equations governing mass, momentum, and energy conservation are written for an incompressible, single-phase fluid. The flow equations model a wide range of conditions including natural circulation. The energy equations include the effects of solid and fluid conduction, natural convection, and thermal radiation. The COBRA-SFS code is structured to perform both steady-state and transient calculations; however, the transient capability has not yet been validated. This volume contains the input instructions for COBRA-SFS and an auxiliary radiation exchange factor code, RADX-1. It is intended to aid the user in becoming familiar with the capabilities and modeling conventions of the code

  12. Analysis of RF section of 250 kW CW C-Band high power klystron

    International Nuclear Information System (INIS)

    Badola, Richa; Kaushik, Meenu; Baloda, Suman; Kirti; Vrati; Lamba, O.S.; Joshi, L.M.

    2012-01-01

    A klystron is a microwave tube used as a power amplifier in various applications such as radar, particle accelerators and thermonuclear reactors. This paper deals with the analysis of the RF section of a 250 kW CW C-band high power klystron for a beam voltage of 50 to 60 kV. The simulation is done using the Poisson Superfish and AJ Disk software. The cavity design is done using Superfish; its results are used to decide the dimensions of the cavity geometry, and AJ Disk is used to determine the centre-to-centre distances between the cavities in order to obtain the desired power. (author)

  13. Microbial ecology laboratory procedures manual NASA/MSFC

    Science.gov (United States)

    Huff, Timothy L.

    1990-01-01

    An essential part of the efficient operation of any microbiology laboratory involved in sample analysis is a standard procedures manual. The purpose of this manual is to provide concise and well defined instructions on routine technical procedures involving sample analysis and methods for monitoring and maintaining quality control within the laboratory. Of equal importance is the safe operation of the laboratory. This manual outlines detailed procedures to be followed in the microbial ecology laboratory to assure safety, analytical control, and validity of results.

  14. TA-55 change control manual

    International Nuclear Information System (INIS)

    Blum, T.W.; Selvage, R.D.; Courtney, K.H.

    1997-11-01

    This manual is the guide for initiating change at the Plutonium Facility, which handles the processing of plutonium as well as research on plutonium metallurgy. It describes the change and work control processes employed at TA-55 to ensure that all proposed changes are properly identified, reviewed, approved, implemented, tested, and documented so that operations are maintained within the approved safety envelope. All Laboratory groups, their contractors, and subcontractors doing work at TA-55 follow requirements set forth herein. This manual applies to all new and modified processes and experiments inside the TA-55 Plutonium Facility; general plant project (GPP) and line item funded construction projects at TA-55; temporary and permanent changes that directly or indirectly affect structures, systems, or components (SSCs) as described in the safety analysis, including Facility Control System (FCS) software; and major modifications to procedures. This manual does not apply to maintenance performed on process equipment or facility SSCs or the replacement of SSCs or equipment with documented approved equivalents

  15. Cuerpo de Paz Manual de Sistema de Programacion y Capacitacion (Peace Corps Programming and Training System Manual): T0063.

    Science.gov (United States)

    Peace Corps, Washington, DC.

    This Spanish version of the Peace Corps Programming and Training System Manual is designed to help field staff members of the Peace Corps train volunteers. Its task descriptions, guidelines, examples, and definitions are intended to be practical and informative rather than restrictive. The manual is divided into six major sections: (1)…

  16. TV Trouble-Shooting Manual. Volumes 7-8. Part 3: Synchronisation and Deflection Circuits. Student and Instructor's Manuals.

    Science.gov (United States)

    Mukai, Masaaki; Kobayashi, Ryozo

    These volumes are, respectively, the self-instructional student manual and the teacher manual that cover the third set of training topics in this course for television repair technicians. Both contain identical information on synchronization and deflection circuits, including sections on the principle of synchronized deflection, synchronization…

  17. Analysis of neutron cross sections using the coupled-channel theory

    International Nuclear Information System (INIS)

    Tanaka, Shigeya

    1975-01-01

    Fast-neutron total and scattering cross sections calculated with the coupled-channel theory and the spherical optical model are compared with experimental data. The optical-potential parameters used in both calculations were obtained from comparison of calculations with scattering data for 209Bi. Total cross sections were calculated for thirty-five nuclides from 23Na to 239Pu in the energy range of 0.25 to 15 MeV, and good results were obtained with the coupled-channel calculations. Comparisons of the calculations with elastic scattering data for about twenty nuclides were made at incident energies of 8 and 14 MeV. In general, the coupled-channel calculations at 8 MeV gave better agreement with the experimental data than the spherical optical-model calculations. At 14 MeV, the differences between the two calculations were small. The analysis was also made for elastic and inelastic scattering by several nuclei, such as Fe, Ni, 120Sn and Pu, in the low energy region, and good results were obtained with the coupled-channel calculations. Thus, it is demonstrated that coupled-channel calculations with one set of optical parameters reproduce the total and scattering cross sections well over a wide energy and mass region. (auth.)

  18. Measurement and QCD Analysis of Neutral and Charged Current Cross Sections at HERA

    CERN Document Server

    Adloff, C.; Andrieu, B.; Anthonis, T.; Astvatsatourov, A.; Babaev, A.; Bahr, J.; Baranov, P.; Barrelet, E.; Bartel, W.; Baumgartner, S.; Becker, J.; Beckingham, M.; Beglarian, A.; Behnke, O.; Belousov, A.; Berger, C.; Berndt, T.; Bizot, J.C.; Bohme, J.; Boudry, V.; Braunschweig, W.; Brisson, V.; Broker, H.B.; Brown, D.P.; Bruncko, D.; Busser, F.W.; Bunyatyan, A.; Burrage, A.; Buschhorn, G.; Bystritskaya, L.; Campbell, A.J.; Cao, Jun; Caron, S.; Cassol-Brunner, F.; Chekelian, V.; Clarke, D.; Collard, C.; Contreras, J.G.; Coppens, Y.R.; Coughlan, J.A.; Cousinou, M.C.; Cox, B.E.; Cozzika, G.; Cvach, J.; Dainton, J.B.; Dau, W.D.; Daum, K.; Davidsson, M.; Delcourt, B.; Delerue, N.; Demirchyan, R.; De Roeck, A.; De Wolf, E.A.; Diaconu, C.; Dingfelder, J.; Dixon, P.; Dodonov, V.; Dowell, J.D.; Dubak, A.; Duprel, C.; Eckerlin, Guenter; Eckstein, D.; Efremenko, V.; Egli, S.; Eichler, R.; Eisele, F.; Eisenhandler, E.; Ellerbrock, M.; Elsen, E.; Erdmann, M.; Erdmann, W.; Faulkner, P.J.W.; Favart, L.; Fedotov, A.; Felst, R.; Ferencei, J.; Ferron, S.; Fleischer, M.; Fleischmann, P.; Fleming, Y.H.; Flucke, G.; Flugge, G.; Fomenko, A.; Foresti, I.; Formanek, J.; Franke, G.; Frising, G.; Gabathuler, E.; Gabathuler, K.; Garvey, J.; Gassner, J.; Gayler, Joerg; Gerhards, R.; Gerlich, C.; Ghazaryan, Samvel; Goerlich, L.; Gogitidze, N.; Grab, C.; Grabski, V.; Grassler, H.; Greenshaw, T.; Grindhammer, Guenter; Haidt, D.; Hajduk, L.; Haller, J.; Heinemann, B.; Heinzelmann, G.; Henderson, R.C.W.; Hengstmann, S.; Henschel, H.; Henshaw, O.; Heremans, R.; Herrera, G.; Herynek, I.; Hildebrandt, M.; Hilgers, M.; Hiller, K.H.; Hladky, J.; Hoting, P.; Hoffmann, D.; Horisberger, R.; Hovhannisyan, A.; Ibbotson, M.; Issever, C .; Jacquet, M.; Jaffre, M.; Janauschek, L.; Janssen, X.; Jemanov, V.; Jonsson, L.; Johnson, C.; Johnson, D.P.; Jones, M.A.S.; Jung, H.; Kant, D.; Kapichine, M.; Karlsson, M.; Karschnick, O.; Katzy, J.; Keil, F.; Keller, N.; Kennedy, J.; Kenyon, I.R.; Kiesling, Christian M.; Kjellberg, P.; Klein, M.; Kleinwort, C.; Kluge, T.; Knies, G.; Koblitz, B.; Kolya, S.D.; Korbel, V.; Kostka, P.; Koutouev, R.; Koutov, A.; Kroseberg, J.; Kruger, K.; Kuhr, T.; Lamb, D.; Landon, M.P.J.; Lange, W.; Lastovicka, T.; Laycock, P.; Lebailly, E.; Lebedev, A.; Leissner, B.; Lemrani, R.; Lendermann, V.; Levonian, S.; List, B.; Lobodzinska, E.; Lobodzinski, B.; Loginov, A.; Loktionova, N.; Lubimov, V.; Luders, S.; Luke, D.; Lytkin, L.; Malden, N.; Malinovski, E.; Mangano, S.; Marage, P.; Marks, J.; Marshall, R.; Martyn, H.U.; Martyniak, J.; Maxfield, S.J.; Meer, D.; Mehta, A.; Meier, K.; Meyer, A.B.; Meyer, H.; Meyer, J.; Michine, S.; Mikocki, S.; Milstead, D.; Mohrdieck, S.; Mondragon, M.N.; Moreau, F.; Morozov, A.; Morris, J.V.; Muller, K.; Murin, P.; Nagovizin, V.; Naroska, B.; Naumann, J.; Naumann, T.; Newman, Paul R.; Niebergall, F.; Niebuhr, C.; Nix, O.; Nowak, G.; Nozicka, M.; Olivier, B.; Olsson, J.E.; Ozerov, D.; Panassik, V.; Pascaud, C.; Patel, G.D.; Peez, M.; Perez, E.; Petrukhin, A.; Phillips, J.P.; Pitzl, D.; Portheault, B.; Poschl, R.; Potachnikova, I.; Povh, B.; Rauschenberger, J.; Reimer, P.; Reisert, B.; Risler, C.; Rizvi, E.; Robmann, P.; Roosen, R.; Rostovtsev, A.; Rusakov, S.; Rybicki, K.; Sankey, D.P.C.; Sauvan, E.; Schatzel, S.; Scheins, J.; Schilling, F.P.; Schleper, P.; Schmidt, D.; Schmidt, S.; Schmitt, S.; Schneider, M.; Schoeffel, L.; Schoning, A.; Schoerner-Sadenius, Thomas; Schroder, V.; Schultz-Coulon, H.C.; Schwanenberger, C.; Sedlak, K.; Sefkow, F.; Sheviakov, I.; Shtarkov, L.N.; Sirois, 
Y.; Sloan, T.; Smirnov, P.; Soloviev, Y.; South, D.; Spaskov, V.; Specka, Arnd E.; Spitzer, H.; Stamen, R.; Stella, B.; Stiewe, J.; Strauch, I.; Straumann, U.; Tchetchelnitski, S.; Thompson, Graham; Thompson, P.D.; Tomasz, F.; Traynor, D.; Truoel, Peter; Tsipolitis, G.; Tsurin, I.; Turnau, J.; Turney, J.E.; Tzamariudaki, E.; Uraev, A.; Urban, Marcel; Usik, A.; Valkar, S.; Valkarova, A.; Vallee, C.; Van Mechelen, P.; Vargas Trevino, A.; Vassiliev, S.; Vazdik, Y.; Veelken, C.; Vest, A.; Vichnevski, A.; Volchinski, V.; Wacker, K.; Wagner, J.; Wallny, R.; Waugh, B.; Weber, G.; Weber, R.; Wegener, D.; Werner, C.; Werner, N.; Wessels, M.; Wiesand, S.; Winde, M.; Winter, G.G.; Wissing, C.; Wobisch, M.; Woehrling, E.E.; Wunsch, E.; Wyatt, A.C.; Zacek, J.; Zalesak, J.; Zhang, Z.; Zhokin, A.; Zomer, F.; zur Nedden, M.

    2003-01-01

    The inclusive e^+ p single and double differential cross sections for neutral and charged current processes are measured with the H1 detector at HERA. The data were taken in 1999 and 2000 at a centre-of-mass energy of √s = 319 GeV and correspond to an integrated luminosity of 65.2 pb^-1. The cross sections are measured in the range of four-momentum transfer squared Q^2 between 100 and 30000 GeV^2 and Bjorken x between 0.0013 and 0.65. The neutral current analysis for the new e^+ p data and the earlier e^- p data taken in 1998 and 1999 is extended to small energies of the scattered electron and therefore to higher values of inelasticity y, allowing a determination of the longitudinal structure function F_L at high Q^2 (110 - 700 GeV^2). A new measurement of the structure function x F_3 is obtained using the new e^+ p and previously published e^± p neutral current cross section data at high Q^2. These data together with H1 low Q^2 precision data are further used to perform new next-to-leading order QCD ...
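
    For orientation, the structure functions named in this abstract enter the neutral current cross section through the standard reduced cross-section decomposition; this is a textbook relation quoted here for context, not a formula taken from the record itself:

    \[
      \sigma_r^{\mathrm{NC},\pm}(x,Q^2)
        = \frac{Q^4 x}{2\pi\alpha^2 Y_+}\,
          \frac{\mathrm{d}^2\sigma^{\mathrm{NC},\pm}}{\mathrm{d}x\,\mathrm{d}Q^2}
        = \tilde{F}_2 \mp \frac{Y_-}{Y_+}\, x\tilde{F}_3 - \frac{y^2}{Y_+}\, \tilde{F}_L ,
      \qquad Y_\pm = 1 \pm (1-y)^2 .
    \]

    At high inelasticity y the F_L term becomes sizeable, which is why extending the analysis to small scattered-electron energies gives sensitivity to F_L, while the sign flip of the xF_3 term between e^+ p and e^- p allows xF_3 to be extracted by combining both data sets.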

  19. Operation manual for EDXRDDA - a software package for Bragg peak analysis of energy dispersive powder X-ray diffraction data

    International Nuclear Information System (INIS)

    Jayaswal, Balhans; Vijaykumar, V.; Momin, S.N.; Sikka, S.K.

    1992-01-01

    EDXRDDA is a software package for the analysis of raw energy dispersive X-ray diffraction data from powder samples. It resolves the spectra into individual peaks by a constrained non-linear least squares method (Hughes and Sexton, 1988). The profile function adopted is the Gaussian/Lorentzian product, with the mixing ratio refinable in the program. The program is implemented on an IBM PC and is highly interactive, with extensive plotting facilities. This report is a user's guide for running the program. In the first step, after the spectrum is read in, the full spectrum is plotted on the screen. The user then chooses a portion of it for peak resolution. Initial guesses for the peak intensity and peak position are entered with the help of a cursor or a mouse. Up to twenty peaks can be fitted at a time in an interval of 500 channels. For overlapping peaks, various constraints can be applied. Bragg peaks and fluorescence peaks with different half widths can be handled simultaneously. On execution the program produces a look-up table containing the refined values of the peak position, half width, peak intensity and integrated intensity of each peak, together with their error estimates. The program is very general and can also be used for curve fitting of data from many other experiments. (author). 2 refs., 7 figs., 2 appendices
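
    As an illustration of the profile model described above, the following minimal sketch fits a single Bragg peak with a Gaussian/Lorentzian product profile and a refinable mixing ratio using a general-purpose constrained least-squares routine. It is not the EDXRDDA code; the function names, the particular product parameterisation and the parameter bounds are assumptions made for this example.

    import numpy as np
    from scipy.optimize import curve_fit

    def gl_product_peak(ch, amp, centre, fwhm, m):
        """One common Gaussian/Lorentzian *product* profile with mixing ratio m
        (m = 0: pure Gaussian, m = 1: pure Lorentzian). EDXRDDA may use a
        different parameterisation."""
        dx2 = 4.0 * (ch - centre) ** 2 / fwhm ** 2
        return amp * np.exp(-np.log(2.0) * (1.0 - m) * dx2) / (1.0 + m * dx2)

    def fit_single_peak(channels, counts, p0):
        """Constrained non-linear least-squares fit of one peak (illustrative only)."""
        bounds = ([0.0, channels.min(), 0.1, 0.0],      # amp, centre, fwhm, m
                  [np.inf, channels.max(), 500.0, 1.0])
        popt, pcov = curve_fit(gl_product_peak, channels, counts, p0=p0, bounds=bounds)
        perr = np.sqrt(np.diag(pcov))                   # 1-sigma error estimates
        return popt, perr

    In the program itself, up to twenty such peaks are fitted simultaneously over a 500-channel window, with constraints tying together the parameters of overlapping peaks.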

  20. Partial neutron capture cross sections of actinides using cold neutron prompt gamma activation analysis

    International Nuclear Information System (INIS)

    Genreith, Christoph

    2015-01-01

    Nuclear waste needs to be characterized for its safe handling and storage. In particular, long-lived actinides make waste characterization challenging. The results described in this thesis demonstrate that Prompt Gamma Neutron Activation Analysis (PGAA) with cold neutrons is a reliable tool for the non-destructive analysis of actinides. Nuclear data required for an accurate identification and quantification of actinides were acquired. To this end, a sample design suitable for accurate and precise measurements of prompt γ-ray energies and partial cross sections of long-lived actinides at existing PGAA facilities is presented. Using this sample design, the fundamental prompt γ-ray data on ²³⁷Np, ²⁴¹Am and ²⁴²Pu were measured. The data were validated by repeated analysis of different samples at two separate irradiation and counting facilities - the BRR in Budapest and the FRM II in Garching near Munich. Because cold neutrons were employed, resonance neutron capture by low-energy resonances was avoided during the experiments, an improvement over older neutron-activation-based work at thermal reactor neutron energies. 152 prompt γ-rays of ²³⁷Np were identified, as well as 19 of ²⁴¹Am and 127 of ²⁴²Pu. In all cases, both high- and low-energy prompt γ-rays were identified. The most intense line of ²³⁷Np was observed at an energy of E_γ = 182.82(10) keV, associated with a partial capture cross section of σ_γ = 22.06(39) b. The most intense prompt γ-ray lines of ²⁴¹Am and ²⁴²Pu were observed at E_γ = 154.72(7) keV with σ_γ = 72.80(252) b and at E_γ = 287.69(8) keV with σ_γ = 7.07(12) b, respectively. The measurements described in this thesis provide the first reported quantification of partial radiative capture cross sections for ²³⁷Np, ²⁴¹Am and ²⁴²Pu measured simultaneously over the wide energy range from 45 keV to 12 MeV. Detailed uncertainty assessments were performed and the validity of the given uncertainties was
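
    For context, partial γ-ray production cross sections in PGAA are commonly obtained relative to an internal comparator line of known partial cross section measured in the same spectrum; the following is the generic relation of that method, written down here as background rather than quoted from the thesis:

    \[
      \sigma_{\gamma,x}(E_x)
        = \sigma_{\gamma,c}(E_c)\,
          \frac{A_x/\varepsilon(E_x)}{A_c/\varepsilon(E_c)}\,
          \frac{n_c}{n_x},
    \]

    where A is the net count rate of the prompt γ-ray peak, ε(E) the full-energy detection efficiency at that energy, and n the number of atoms of the analyte x and the comparator c in the sample; the neutron flux cancels because both are irradiated simultaneously.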