WorldWideScience

Sample records for preliminary computational analysis

  1. Computer content analysis of schizophrenic speech: a preliminary report.

    Science.gov (United States)

    Tucker, G J; Rosenberg, S D

    1975-06-01

    Computer analysis significantly differentiated the thematic content of the free speech of 10 schizophrenic patients from that of 10 nonschizophrenic patients and from the content of transcripts of dream material from 10 normal subjects. Schizophrenic patients used the thematic categories in factor 1 (the "schizophrenic factor") 3 times more frequently than the nonschizophrenics and 10 times more frequently than the normal subjects (p < .01). In general, the language content of the schizophrenic patient mirrored an almost agitated attempt to locate oneself in time and space and to defend against internal discomfort and confusion. The authors discuss the implications of this study for future research.

  2. Preliminary Computational Analysis of the (HIRENASD) Configuration in Preparation for the Aeroelastic Prediction Workshop

    Science.gov (United States)

    Chwalowski, Pawel; Florance, Jennifer P.; Heeg, Jennifer; Wieseman, Carol D.; Perry, Boyd P.

    2011-01-01

    This paper presents preliminary computational aeroelastic analysis results generated in preparation for the first Aeroelastic Prediction Workshop (AePW). These results were produced using FUN3D software developed at NASA Langley and are compared against the experimental data generated during the HIgh REynolds Number Aero-Structural Dynamics (HIRENASD) Project. The HIRENASD wind-tunnel model was tested in the European Transonic Windtunnel in 2006 by Aachen University's Department of Mechanics with funding from the German Research Foundation. The computational effort discussed here was performed (1) to obtain a preliminary assessment of the ability of the FUN3D code to accurately compute physical quantities experimentally measured on the HIRENASD model and (2) to translate the lessons learned from the FUN3D analysis of HIRENASD into a set of initial guidelines for the first AePW, which includes test cases for the HIRENASD model and its experimental data set. This paper compares the computational and experimental results obtained at Mach 0.8 for a Reynolds number of 7 million based on chord, corresponding to the HIRENASD test conditions No. 132 and No. 159. Aerodynamic loads and static aeroelastic displacements are compared at two levels of grid resolution. Harmonic perturbation numerical results are compared with the experimental data using the magnitude and phase relationship between pressure coefficients and displacement. A dynamic aeroelastic numerical calculation is presented at one wind-tunnel condition in the form of the time history of the generalized displacements. Additional FUN3D validation results are also presented for the AGARD 445.6 wing data set. This wing was tested in the Transonic Dynamics Tunnel and is commonly used in the preliminary benchmarking of computational aeroelastic software.

  3. Preliminary Design and Computational Fluid Dynamics Analysis of Supercritical Carbon Dioxide Turbine Blade

    International Nuclear Information System (INIS)

    Jeong, Wi S.; Kim, Tae W.; Suh, Kune Y.

    2007-01-01

    The supercritical gas turbine Brayton cycle has been adopted in the secondary loop of Generation IV Nuclear Energy Systems and is also planned for the power conversion cycles of nuclear fusion reactors. Supercritical carbon dioxide (SCO2) is one of the most widely considered fluids for this concept. The potential beneficiaries include the Secure Transportable Autonomous Reactor-Liquid Metal (STAR-LM), the Korea Advanced Liquid Metal Reactor (KALIMER) and the Battery Omnibus Reactor Integral System (BORIS), which is being developed at Seoul National University. The reason for these welcomed applications is that the SCO2 Brayton cycle can achieve higher overall energy conversion efficiency than the steam turbine Rankine cycle. Seoul National University has recently been working on the SCO2-based Modular Optimized Brayton Integral System (MOBIS). The MOBIS design power conversion efficiency is about 45%. Gas turbine design is a crucial part of achieving this high efficiency. In this paper, the preliminary analysis of the first stage of the gas turbine was performed using CFX as the solver

  4. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    Science.gov (United States)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk to criminal hackers. Presently, most vulnerability research uses data from software vendors, and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data is always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
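
    The distribution-fitting step described above can be illustrated with a small sketch. This is not the authors' parsing pipeline; the daily counts below are placeholder values, and only the choice of distributions (log-normal for arrivals, exponential for deletions) follows the abstract.

```python
import numpy as np
from scipy import stats

# Placeholder daily counts of vulnerability arrivals and deletions; the operational
# "in the wild" data used in the paper is not reproduced here.
arrivals = np.array([12.0, 30.0, 7.0, 55.0, 18.0, 25.0, 9.0, 41.0, 16.0, 22.0])
deletions = np.array([3.0, 0.5, 1.2, 4.8, 2.1, 0.9, 1.7, 3.3, 2.6, 1.1])

# Fit the distributions the authors report as best-fitting: log-normal for arrivals,
# exponential for deletions (location fixed at zero in both cases).
shape, loc, scale = stats.lognorm.fit(arrivals, floc=0)
loc_e, scale_e = stats.expon.fit(deletions, floc=0)

# Check the fits with Kolmogorov-Smirnov tests against the fitted distributions.
print(stats.kstest(arrivals, "lognorm", args=(shape, loc, scale)))
print(stats.kstest(deletions, "expon", args=(loc_e, scale_e)))

# The fitted parameters could then serve as priors for the Bayesian analysis
# mentioned in the abstract.
```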

  5. Preliminary analysis of the MER magnetic properties experiment using a computational fluid dynamics model

    DEFF Research Database (Denmark)

    Kinch, K.M.; Merrison, J.P.; Gunnlaugsson, H.P.

    2006-01-01

    Motivated by questions raised by the magnetic properties experiments on the NASA Mars Pathfinder and Mars Exploration Rover (MER) missions, we have studied in detail the capture of airborne magnetic dust by permanent magnets using a computational fluid dynamics (CFD) model supported by laboratory...... simulations. The magnets studied are identical to the capture magnet and filter magnet on MER, though results are more generally applicable. The dust capture process is found to be dependent upon wind speed, dust magnetization, dust grain size and dust grain mass density. Here we develop an understanding...... of how these parameters affect dust capture rates and patterns on the magnets and set bounds for these parameters based on MER data and results from the numerical model. This results in a consistent picture of the dust as containing varying amounts of at least two separate components with different...

  6. UVISS preliminary visibility analysis

    DEFF Research Database (Denmark)

    Betto, Maurizio

    1998-01-01

    The goal of this work is to obtain a preliminary assessment of the sky visibility for an astronomical telescope located on the express pallet of the International Space Station (ISS), taking into account the major constraints imposed on the instrument by the ISS attitude and structure. Part of the work is also to set up the kernel of a software tool for the visibility analysis that should be easily expandable to consider more complex structures for future activities. This analysis is part of the UVISS assessment study and it is meant to provide elements for the definition and the selection...

  7. Pickering safeguards: a preliminary analysis

    International Nuclear Information System (INIS)

    Todd, J.L.; Hodgkinson, J.G.

    1977-05-01

    A summary is presented of thoughts relative to a systems approach for implementing international safeguards. Included is a preliminary analysis of the Pickering Generating Station followed by a suggested safeguards system for the facility

  8. Age estimation by pulp-to-tooth area ratio using cone-beam computed tomography: A preliminary analysis.

    Science.gov (United States)

    Rai, Arpita; Acharya, Ashith B; Naikmasur, Venkatesh G

    2016-01-01

    Age estimation of living or deceased individuals is an important aspect of forensic sciences. Conventionally, the pulp-to-tooth area ratio (PTR) measured from periapical radiographs has been utilized as a nondestructive method of age estimation. Cone-beam computed tomography (CBCT) is a new method to acquire three-dimensional images of the teeth in living individuals. The present study investigated age estimation based on PTR of the maxillary canines measured in three planes obtained from CBCT image data. Sixty subjects aged 20-85 years were included in the study. For each tooth, the mid-sagittal, the mid-coronal, and three axial sections (at the cementoenamel junction (CEJ), at one-fourth root level from the CEJ, and at mid-root) were assessed. PTR was calculated using AutoCAD software after outlining the pulp and tooth. All statistical analyses were performed using SPSS 17.0 software. Linear regression analysis showed that only PTR in the axial plane at the CEJ had a significant age correlation (r = 0.32; P < 0.05). This is probably because of the clearer demarcation of the pulp and tooth outlines at this level.
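
    As a rough illustration of the regression step, the sketch below computes PTR from segmented pulp and tooth areas and regresses age on PTR. The area and age values are invented for the example and are not the study's data; only the PTR-versus-age regression mirrors the method described.

```python
import numpy as np
from scipy import stats

# Hypothetical pulp and tooth areas (mm^2) traced on the axial CBCT slice at the
# cementoenamel junction, with the corresponding subject ages (illustrative only).
pulp_area  = np.array([4.1, 3.6, 2.9, 2.4, 1.9, 1.5])
tooth_area = np.array([52.0, 50.5, 49.8, 51.2, 48.9, 50.1])
age        = np.array([24, 33, 41, 55, 68, 79])

ptr = pulp_area / tooth_area          # pulp-to-tooth area ratio per subject

# Simple linear regression of age on PTR, analogous to the study's axial-CEJ model.
res = stats.linregress(ptr, age)
print(f"r = {res.rvalue:.2f}; age = {res.intercept:.1f} + {res.slope:.1f} * PTR")
```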

  9. Computer code and users' guide for the preliminary analysis of dual-mode space nuclear fission solid core power and propulsion systems, NUROC3A. AMS report No. 1239b

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, R.A.; Smith, W.W.

    1976-06-30

    The three-volume report describes a dual-mode nuclear space power and propulsion system concept that employs an advanced solid-core nuclear fission reactor coupled via heat pipes to one of several electric power conversion systems. The second volume describes the computer code and users' guide for the preliminary analysis of the system.

  10. Preliminary HECTOR analysis by Dragon

    Energy Technology Data Exchange (ETDEWEB)

    Presser, W; Woloch, F

    1972-06-02

    From the different cores measured in HECTOR, only ACH 4/B-B was selected for the Dragon analysis, since it presented the largest amount of uniform fuel loading in the central test region and is therefore nearest to an infinite lattice. Preliminary results are discussed.

  11. Concept Overview & Preliminary Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruth, Mark

    2017-07-12

    'H2@Scale' is an opportunity for wide-scale use of hydrogen as an intermediate that carries energy from various production options to multiple uses. It is based on identifying and developing opportunities for low-cost hydrogen production and investigating opportunities for using that hydrogen across the electricity, industrial, and transportation sectors. One of the key production opportunities is the use of low-cost electricity that may be generated under high penetrations of variable renewable generators such as wind and solar photovoltaics. The technical potential demand for hydrogen across these sectors is 60 million metric tons per year. The U.S. has sufficient domestic renewable resources that each resource type alone could meet that demand, and it could readily meet the demand using a portfolio of generation options. This presentation provides an overview of the concept and the technical potential demand and resources. It also motivates analysis and research on H2@Scale.

  12. Preliminary hazards analysis -- vitrification process

    International Nuclear Information System (INIS)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment

  13. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  14. Preliminary Phase Field Computational Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hu, Shenyang Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xu, Ke [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Suter, Jonathan D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCloy, John S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnson, Bradley R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ramuhalli, Pradeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using monocrystalline Fe (i.e., ferrite) films as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of large enough systems that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in
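
    For reference, the Landau-Lifshitz-Gilbert equation cited above as the basis of the phase-field models has the standard form (symbols as usually defined; the report's particular parameterization is not reproduced here):

```latex
\frac{\partial \mathbf{M}}{\partial t}
  = -\gamma \,\mathbf{M} \times \mathbf{H}_{\mathrm{eff}}
  + \frac{\alpha}{M_s}\,\mathbf{M} \times \frac{\partial \mathbf{M}}{\partial t}
```

    Here M is the magnetization, gamma the gyromagnetic ratio, alpha the Gilbert damping constant, M_s the saturation magnetization, and H_eff the effective field, which typically collects exchange, anisotropy, magnetostatic, and applied-field contributions.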

  15. Preliminary Context Analysis of Community Informatics Social ...

    African Journals Online (AJOL)

    Preliminary context analysis is always part of the feasibility study phase in the development of information system for Community Development (CD) purposes. In this paper, a context model and a preliminary context analysis are presented for Social Network Web Application (SNWA) for CD in the Niger Delta region of ...

  16. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  17. Preliminary evaluation of the BIODOSE computer program

    International Nuclear Information System (INIS)

    Bonner, N.A.; Ng, Y.C.

    1979-09-01

    The BIODOSE computer program simulates the environmental transport of radionuclides released to surface water and predicts the dosage to humans. We have evaluated the program for its suitability to the needs of the Nuclear Regulatory Commission Waste Management Program; in particular, we evaluated whether the BIODOSE models account for the significant pathways and mechanisms resulting in radiological doses to man. In general, BIODOSE is a satisfactory code for converting radionuclide releases to the aqueous environment into doses to man.

  18. Original Article PRELIMINARY BIOAUTOGRAPHIC ANALYSIS OF ...

    African Journals Online (AJOL)

    PRELIMINARY BIOAUTOGRAPHIC ANALYSIS OF THE SEEDS OF GLYPHAEA BREVIS. (SPRENG) MONACHINO FOR ANTIOXIDANT AND ANTIBACTERIAL PRINCIPLES. Michael Lahai1, Tiwalade Adewale Olugbade2. 1Department of Pharmaceutical Chemistry, Faculty of Pharmaceutical Sciences, College of Medicine ...

  19. Preliminary Analysis of Reinforced Concrete Waffle Walls

    National Research Council Canada - National Science Library

    Shugar, Theodore

    1997-01-01

    A preliminary analytical method based upon modified plate bending theory is offered for structural analysis of a promising new construction method for walls of small buildings and residential housing...

  20. Licensing support system preliminary needs analysis: Volume 1

    International Nuclear Information System (INIS)

    1989-01-01

    This Preliminary Needs Analysis, together with the Preliminary Data Scope Analysis (next in this series of reports), is a first effort under the LSS Design and Implementation Contract toward developing a sound requirements foundation for subsequent design work. Further refinements must be made before requirements can be specified in sufficient detail to provide a basis for suitably specific system specifications. This preliminary analysis of the LSS requirements has been divided into a "needs" and a "data scope" portion only for project management and scheduling reasons. The Preliminary Data Scope Analysis will address all issues concerning the content and size of the LSS data base, providing the requirements basis for data acquisition, cataloging and storage sizing specifications. This report addresses all other requirements for the LSS. The LSS consists of both computer subsystems and non-computer archives. This study addresses only the computer subsystems, focusing on the Access Subsystems. After providing background on previous LSS-related work, this report summarizes the findings from previous examinations of needs and describes a number of other requirements that have an impact on the LSS. The results of interviews conducted for this report are then described and analyzed. The final section of the report brings all of the key findings together and describes how these needs analyses will continue to be refined and utilized in on-going design activities. 14 refs., 2 figs., 1 tab

  1. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  2. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today...... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering...... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry....

  3. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330MWt SMART (System-integrated Modular Advanced ReacTor) which has been developed by Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, further validated final safety analysis methodology will be developed. Current licensing safety analysis methodology of the Westinghouse and KSNPP PWRs operating and under development in Korea as well as the Russian licensing safety analysis methodology for the integral reactors have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or KNGR (Korean Next Generation Reactor) under construction in Korea. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break and the small break loss of coolant accident. SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. Validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended for the nuclear regulatory authority to establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  4. Preliminary failure mode and effect analysis

    International Nuclear Information System (INIS)

    Addison, J.V.

    1972-01-01

    A preliminary Failure Mode and Effect Analysis (FMEA) was made on the overall 5 kWe system. A general discussion of the system and failure effect is given in addition to the tabulated FMEA and a primary block diagram of the system. (U.S.)

  5. Preliminary safety analysis report for the TFTR

    International Nuclear Information System (INIS)

    Lind, K.E.; Levine, J.D.; Howe, H.J.

    A Preliminary Safety Analysis Report has been prepared for the Tokamak Fusion Test Reactor. No accident scenarios have been identified which would result in exposures to on-site personnel or the general public in excess of the guidelines defined for the project by DOE

  6. A Generative Computer Model for Preliminary Design of Mass Housing

    Directory of Open Access Journals (Sweden)

    Ahmet Emre DİNÇER

    2014-05-01

    Today, we live in what we call the "Information Age", an age in which information technologies are constantly being renewed and developed. Out of this has emerged a new approach called "Computational Design" or "Digital Design". In addition to significantly influencing all fields of engineering, this approach has come to play a similar role in all stages of the design process in the architectural field. In providing solutions for analytical problems in design such as cost estimation, circulation system evaluation and environmental effects, which are similar to engineering problems, this approach is also used in the evaluation, representation and presentation of traditionally designed buildings. With developments in software and hardware technology, it has evolved into studies in which architectural products are designed and produced with digital tools from the preliminary design stages onward. This paper presents a digital model which may be used in the preliminary stage of mass housing design with Cellular Automata, one of the generative design systems based on computational design approaches. This computational model, developed with scripts in 3Ds Max software, has been applied to the site plan design of mass housing, floor plan organizations based on user preferences, and facade designs. By using the developed computer model, many alternative housing types can be rapidly produced. The interactive design tool of this computational model allows the user to transfer dimensional and functional housing preferences by means of the interface prepared for the model. The results of the study are discussed in the light of innovative architectural approaches.
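
    The generative step can be sketched with a minimal 2-D cellular automaton over a site grid. The growth rule, grid size, and occupancy threshold below are assumptions chosen for illustration; they are not the rule set the authors scripted in 3Ds Max.

```python
import numpy as np
from scipy.ndimage import convolve

rng = np.random.default_rng(seed=1)
grid = (rng.random((20, 20)) < 0.3).astype(int)    # 1 = housing cell, 0 = open space

NEIGHBOURS = np.array([[1, 1, 1],
                       [1, 0, 1],
                       [1, 1, 1]])

def step(g: np.ndarray) -> np.ndarray:
    """One generation: keep or create a housing cell where it has 2-4 built neighbours."""
    n = convolve(g, NEIGHBOURS, mode="constant", cval=0)
    return ((n >= 2) & (n <= 4)).astype(int)

for _ in range(5):                                  # iterate to grow a layout proposal
    grid = step(grid)

print(grid.sum(), "cells allocated to housing in this alternative")
```

    Re-running with different seeds, rules, or user-driven parameters yields different layouts, which is the sense in which such a model can rapidly produce many housing alternatives.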

  7. Analysis of Vector Models in Quantification of Artifacts Produced by Standard Prosthetic Inlays in Cone-Beam Computed Tomography (CBCT) – a Preliminary Study

    Directory of Open Access Journals (Sweden)

    Ingrid Różyło-Kalinowska

    2014-11-01

    Cone-beam computed tomography (CBCT) is a relatively new, but highly efficient imaging method first applied in dentistry in 1998. However, the quality of the obtained slices depends, among other things, on artifacts generated by dental restorations as well as orthodontic and prosthetic appliances. The aim of the study was to quantify the artifacts produced by standard prosthetic inlays in CBCT images. The material consisted of 17 standard prosthetic inlays mounted in dental roots embedded in resin. The samples were examined by means of a large field of view CBCT unit, Galileos (Sirona, Germany), at 85 kV and 14 mAs. The analysis was performed using Able 3DDoctor software for data in the CT raster space as well as by means of Materialise Magics software for generated vector models (STL). The masks generated in the raster space included the area of the inlays together with image artifacts. The region of interest (ROI) of the raster space is a set of voxels from a selected range of Hounsfield units (109-3071). The ceramic inlay with zirconium dioxide (Cera Post) as well as the epoxy resin inlay including silica fibers enriched with zirconium (Easy Post) produced the most intense artifacts. The smallest image distortions were created by titanium inlays, both passive (Harald Nordin) and active (Flexi Flange). Inlays containing zirconium generated the strongest artifacts, thus leading to the greatest distortions in the CBCT images. The carbon fiber inlay did not considerably affect the image quality.
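
    The ROI-building step described above (selecting voxels within a Hounsfield window) can be sketched as a simple threshold mask. The volume here is random placeholder data; in the study the segmentation was done in Able 3DDoctor, not in Python.

```python
import numpy as np

# Placeholder CBCT volume in Hounsfield units (the study's image data are not reproduced).
volume_hu = np.random.default_rng(0).integers(-1000, 3072, size=(64, 64, 64))

lower, upper = 109, 3071                    # HU window reported for the raster-space ROI
mask = (volume_hu >= lower) & (volume_hu <= upper)

# The mask covers the inlay together with its bright artifacts; its voxel count (or the
# STL model generated from it) can then be compared across the different inlay types.
print(int(mask.sum()), "voxels inside the ROI window")
```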

  8. Computer aided safety analysis

    International Nuclear Information System (INIS)

    1988-05-01

    The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs

  9. Licensing Support System: Preliminary data scope analysis

    International Nuclear Information System (INIS)

    1989-01-01

    The purpose of this analysis is to determine the content and scope of the Licensing Support System (LSS) data base. Both user needs and currently available data bases that, at least in part, address those needs have been analyzed. This analysis, together with the Preliminary Needs Analysis (DOE, 1988d) is a first effort under the LSS Design and Implementation Contract toward developing a sound requirements foundation for subsequent design work. These reports are preliminary. Further refinements must be made before requirements can be specified in sufficient detail to provide a basis for suitably specific system specifications. This document provides a baseline for what is known at this time. Additional analyses, currently being conducted, will provide more precise information on the content and scope of the LSS data base. 23 refs., 4 figs., 8 tabs

  10. Preliminary Analysis of Google+'s Privacy

    OpenAIRE

    Mahmood, Shah; Desmedt, Yvo

    2011-01-01

    In this paper we provide a preliminary analysis of Google+ privacy. We identified that Google+ shares photo metadata with users who can access the photograph and discuss its potential impact on privacy. We also identified that Google+ encourages the provision of other names including maiden name, which may help criminals performing identity theft. We show that Facebook lists are a superset of Google+ circles, both functionally and logically, even though Google+ provides a better user interfac...

  11. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)

  12. Preliminary Hazards Analysis Plasma Hearth Process

    International Nuclear Information System (INIS)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standards DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. This PSAR then leads to performance of the Final Safety Analysis Report performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment

  13. Repository Subsurface Preliminary Fire Hazard Analysis

    International Nuclear Information System (INIS)

    Logan, Richard C.

    2001-01-01

    This fire hazard analysis identifies preliminary design and operations features, fire, and explosion hazards, and provides a reasonable basis to establish the design requirements of fire protection systems during development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M and O 2001c) which was prepared in accordance with AP-2.21Q, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities"; Attachment 4 of AP-ESH-008, "Hazards Analysis System"; and AP-3.11Q, "Technical Reports". The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents

  14. Plasma brake model for preliminary mission analysis

    Science.gov (United States)

    Orsini, Leonardo; Niccolai, Lorenzo; Mengali, Giovanni; Quarta, Alessandro A.

    2018-03-01

    Plasma brake is an innovative propellantless propulsion system concept that exploits the Coulomb collisions between a charged tether and the ions in the surrounding environment (typically, the ionosphere) to generate an electrostatic force orthogonal to the tether direction. Previous studies on the plasma brake effect have emphasized the existence of a number of different parameters necessary to obtain an accurate description of the propulsive acceleration from a physical viewpoint. The aim of this work is to discuss an analytical model capable of estimating, with the accuracy required by a preliminary mission analysis, the performance of a spacecraft equipped with a plasma brake in a (near-circular) low Earth orbit. The simplified mathematical model is first validated through numerical simulations, and is then used to evaluate the plasma brake performance in some typical mission scenarios, in order to quantify the influence of the system parameters on the mission performance index.
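
    As a back-of-the-envelope companion to such a preliminary mission analysis, the sketch below integrates the circular-orbit limit of the Gauss variational equation, da/dt = 2 f_t / n, for a small constant along-track deceleration. The deceleration value and altitudes are assumptions chosen for illustration and are not taken from the paper's plasma brake model.

```python
import numpy as np

MU = 3.986e14                    # Earth's gravitational parameter [m^3/s^2]
a = 6_778e3                      # initial semi-major axis (~400 km altitude) [m]
f_t = -1e-6                      # assumed along-track deceleration [m/s^2] (placeholder)
dt = 60.0                        # integration step [s]

t = 0.0
while a > 6_578e3:               # integrate down to ~200 km altitude
    n = np.sqrt(MU / a**3)       # mean motion of the near-circular orbit
    a += (2.0 * f_t / n) * dt    # da/dt = 2 f_t / n (Gauss equation, e ~ 0)
    t += dt

print(f"deorbit from ~400 km to ~200 km in roughly {t / 86400:.0f} days")
```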

  15. Preliminary Shielding Analysis for HCCB TBM Transport

    Science.gov (United States)

    Miao, Peng; Zhao, Fengchao; Cao, Qixiang; Zhang, Guoshu; Feng, Kaiming

    2015-09-01

    A preliminary shielding analysis on the transport of the Chinese helium cooled ceramic breeder test blanket module (HCCB TBM) from France back to China after being irradiated in ITER is presented in this contribution. Emphasis was placed on irradiation safety during transport. The dose rate calculated by MCNP/4C for the conceptual package design satisfies the relevant IAEA dose limit, which requires that the dose rate 3 m away from the surface of a package containing low specific activity (LSA-III) materials be less than 10 mSv/h. The variation with location and the time evolution of dose rates after shutdown have also been studied. This will be helpful for devising the detailed transport plan of the HCCB TBM back to China in the near future. supported by the Major State Basic Research Development Program of China (973 Program) (No. 2013GB108000)

  16. Introducing handheld computing into a residency program: preliminary results from qualitative and quantitative inquiry.

    OpenAIRE

    Manning, B.; Gadd, C. S.

    2001-01-01

    Although published reports describe specific handheld computer applications in medical training, we know very little yet about how, and how well, handheld computing fits into the spectrum of information resources available for patient care and physician training. This paper reports preliminary quantitative and qualitative results from an evaluation study designed to track changes in computer usage patterns and computer-related attitudes before and after introduction of handheld computing. Pre...

  17. Preliminary ATWS analysis for the IRIS PRA

    International Nuclear Information System (INIS)

    Maddalena Barra; Marco S Ghisu; David J Finnicum; Luca Oriani

    2005-01-01

    Full text of publication follows: The pressurized light water cooled, medium power (1000 MWt) IRIS (International Reactor Innovative and Secure) has been under development for four years by an international consortium of over 21 organizations from ten countries. The plant conceptual design was completed in 2001 and the preliminary design is nearing completion. The pre-application licensing process with NRC started in October 2002. IRIS has been primarily focused on establishing a design with innovative safety characteristics. The first line of defense in IRIS is to eliminate event initiators that could potentially lead to core damage. In IRIS, this concept is implemented through the 'safety by design' approach, which allows the number and complexity of the safety systems and required operator actions to be minimized. The end result is a design with significantly reduced complexity and improved operability, and extensive plant simplifications to enhance construction. To support the optimization of the plant design and confirm the effectiveness of the safety by design approach in mitigating or eliminating events and thus providing a significant reduction in the probability of severe accidents, the PRA is being used as an integral part of the design process. A preliminary but extensive Level 1 PRA model has been developed to support the pre-application licensing of the IRIS design. As a result of the Preliminary IRIS PRA, an optimization of the design from a reliability point of view was completed, and an extremely low (about 1.2E-8) core damage frequency (CDF) was assessed to confirm the impact of the safety by design approach. This first assessment is a result of a PRA model including internal initiating events. During this assessment, several assumptions were necessary to complete the CDF evaluation. In particular Anticipated Transients Without Scram (ATWS) were not included in this initial assessment, because their contribution to core damage frequency was assumed

  18. Activation analysis by filtered neutrons. Preliminary investigation

    International Nuclear Information System (INIS)

    Skarnemark, G.; Rodinson, T.; Skaalberg, M.; Tokay, R.K.

    1986-01-01

    In order to investigate whether the measurement sensitivity and precision of epithermal neutron activation analysis may be improved, different types of geological and biological test samples were irradiated. The test samples were enclosed in an extra filter of tungsten or sodium in order to reduce the flux of those neutrons that would otherwise induce interfering activity in the sample. The geological test samples consisted of granites containing lanthanides which had been crushed in a tungsten carbide grinder. Normally such test samples show an interfering 187W activity. By use of a tungsten filter the activity was reduced by up to 60%, which resulted in a considerable improvement in the sensitivity and precision of the measurement. The biological test samples consisted of evaporated urine from patients treated with the cell poison cis-platinol. A reliable method to measure the platinum content has not existed so far. This method, however, enables platinum contents as low as about 0.1 ppm to be determined, which is quite adequate. To sum up, this preliminary study has demonstrated that activation analysis using filtered neutrons, correctly applied, is a satisfactory method of reducing interferences without complicated and time-consuming chemical separation procedures. (O.S.)

  19. Preliminary hazard analysis using sequence tree method

    International Nuclear Information System (INIS)

    Huang Huiwen; Shih Chunkuan; Hung Hungchih; Chen Minghuei; Yih Swu; Lin Jiinming

    2007-01-01

    A system-level PHA using the sequence tree method was developed to perform safety-related digital I and C system SSA. The conventional PHA is a brainstorming session among experts on various portions of the system to identify hazards through discussions. However, this conventional PHA is not a systematic technique; the analysis results strongly depend on the experts' subjective opinions, and the analysis quality cannot be appropriately controlled. Therefore, this research developed a system-level sequence-tree-based PHA, which can clarify the relationships among the major digital I and C systems. Two major phases are included in this sequence-tree-based technique. The first phase uses a table to analyze each event in SAR Chapter 15 for a specific safety-related I and C system, such as the RPS. The second phase uses a sequence tree to recognize which I and C systems are involved in the event, how the safety-related systems work, and how the backup systems can be activated to mitigate the consequences if the primary safety systems fail. In the sequence tree, the defense-in-depth echelons, including the Control echelon, the Reactor trip echelon, the ESFAS echelon, and the Indication and display echelon, are arranged to construct the sequence tree structure. All the related I and C systems, including the digital systems and the analog back-up systems, are allocated to their specific echelons. By this system-centric, sequence-tree-based analysis, not only can preliminary hazards be identified systematically, but the vulnerability of the nuclear power plant can also be recognized. Therefore, an effective simplified D3 evaluation can be performed as well. (author)
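
    A toy sketch of the echelon walk underlying such a sequence tree is given below. The event, the availability flags, and the linear ordering of echelons are illustrative simplifications; they are not the actual SAR Chapter 15 analysis or the authors' tree structure.

```python
from dataclasses import dataclass

@dataclass
class Echelon:
    name: str
    available: bool          # does this echelon (digital system or analog backup) respond?

def first_mitigating_echelon(echelons: list[Echelon]) -> str:
    """Return the first echelon able to terminate the event, or flag a candidate hazard."""
    for e in echelons:
        if e.available:
            return f"event mitigated by the {e.name}"
    return "no echelon available: candidate preliminary hazard"

# Defense-in-depth echelons named in the abstract, with assumed availabilities.
sequence = [
    Echelon("Control echelon", available=False),
    Echelon("Reactor trip echelon", available=False),   # e.g. postulated digital RPS failure
    Echelon("ESFAS echelon", available=True),
]
print(first_mitigating_echelon(sequence))
```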

  20. Simulation of the preliminary General Electric SP-100 space reactor concept using the ATHENA computer code

    International Nuclear Information System (INIS)

    Fletcher, C.D.

    1986-01-01

    The capability to perform thermal-hydraulic analyses of a space reactor using the ATHENA computer code is demonstrated. The fast reactor, liquid-lithium coolant loops, and lithium-filled heat pipes of the preliminary General Electric SP-100 design were modeled with ATHENA. Two demonstration transient calculations were performed simulating accident conditions. Calculated results are available for display using the Nuclear Plant Analyzer color graphics analysis tool in addition to traditional plots. ATHENA-calculated results appear reasonable, both for steady state full power conditions and for the two transients. This analysis represents the first known transient thermal-hydraulic simulation using an integral space reactor system model incorporating heat pipes. 6 refs., 17 figs., 1 tab

  1. Preliminary Analysis and Selection of Mooring Solution Candidates

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Delaney, Martin

    This report covers a preliminary analysis of mooring solution candidates for four large floating wave energy converters. The work is part of the EUDP project “Mooring Solutions for Large Wave Energy Converters” and is the outcome of "Work Package 3: Preliminary Analysis". The report further...... composes "Milestone 4: Report on results of preliminary analysis and selection of final candidates". The report is produced by Aalborg University with input from the partner WECs Floating Power Plant, KNSwing, LEANCON and Wave Dragon. Tension Technology International (TTI) has provided a significant...

  2. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims to identify some methods of syntax analysis which can be used for computer programming languages, while setting aside the computer hardware considerations which influence the choice of the programming language and the methods of analysis and compilation. In the first part, the author proposes attempts at formalizing Chomsky grammar languages. In the second part, he studies analytical grammars, and then a compiler or analytic grammar for the Fortran language

  3. Preliminary safety analysis of the Gorleben site

    International Nuclear Information System (INIS)

    Bracke, G.; Fischer-Appelt, K.

    2014-01-01

    The safety requirements governing the final disposal of heat-generating radioactive waste in Germany were implemented by the Federal Ministry of Environment, Natural Conservation and Nuclear Safety (BMU) in 2010. The Ministry considers as a fundamental objective the protection of man and the environment against the hazards of radioactive waste. Unreasonable burdens and obligations for future generations shall be avoided. The main safety principles are concentration and inclusion of radioactive and other pollutants in a containment-providing rock zone. Any release of radioactive nuclides may increase the risk for man and the environment only negligibly compared to natural radiation exposure. No intervention or maintenance work shall be necessary in the post-closure phase. Retrieval/recovery of the waste shall be possible up to 500 years after closure. The Gorleben salt dome has been discussed since the 1970s as a possible repository site for heat-generating radioactive waste in Germany. The objective of the project preliminary safety analysis of the Gorleben site (VSG) was to assess if repository concepts at the Gorleben site or other sites with a comparable geology could comply with these requirements based on currently available knowledge (Fischer-Appelt, 2013; Bracke, 2013). In addition to this it was assessed if methodological approaches can be used for a future site selection procedure and which technological and conceptual considerations can be transferred to other geological situations. The objective included the compilation and review of the available exploration data of the Gorleben site and on disposal in salt rock, the development of repository designs, and the identification of the needs for future R and D work and further site investigations. (authors)

  4. The Scrap Tire Problem: A Preliminary Economic Analysis (1985)

    Science.gov (United States)

    The purpose of the study was to conduct a preliminary economic analysis of the social benefits of EPA action to require more appropriate disposal of scrap tires versus the social costs of such an action.

  5. Original Article PRELIMINARY BIOAUTOGRAPHIC ANALYSIS OF ...

    African Journals Online (AJOL)

    Sierra Leone 2Department of Pharmaceutical Chemistry, Faculty of Pharmacy, ... the seeds are used in the treatment of skin infections. ... Screening with DPPH showed prominent antioxidant spots on silica at Rf 0.8, 0.5, 0.4 .... underpins conditions like rheumatoid arthritis, ..... As a follow-up to the preliminary TLC studies.

  6. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks and switch architectures and buffering strategies.   ·         Provides techniques for modeling and analysis of network software and switching equipment; ·         Discusses design options used to build efficient switching equipment; ·         Includes many worked examples of the application of discrete-time Markov chains to communication systems; ·         Covers the mathematical theory and techniques necessary for ana...

  7. Krypton for computed tomography lung ventilation imaging: preliminary animal data.

    Science.gov (United States)

    Mahnken, Andreas H; Jost, Gregor; Pietsch, Hubertus

    2015-05-01

    The objective of this study was to assess the feasibility and safety of krypton ventilation imaging with intraindividual comparison to xenon ventilation computed tomography (CT). In a first step, attenuation of different concentrations of xenon and krypton was analyzed in a phantom setting. Thereafter, 7 male New Zealand white rabbits (4.4-6.0 kg) were included in an animal study. After orotracheal intubation, an unenhanced CT scan was obtained in end-inspiratory breath-hold. Thereafter, xenon- (30%) and krypton-enhanced (70%) ventilation CT was performed in random order. After a 2-minute wash-in of gas A, CT imaging was performed. After a 45-minute wash-out period and another 2-minute wash-in of gas B, another CT scan was performed using the same scan protocol. Heart rate and oxygen saturation were measured. Unenhanced and krypton or xenon data were registered and subtracted using a nonrigid image registration tool. Enhancement was quantified and statistically analyzed. One animal had to be excluded from data analysis owing to problems during intubation. The CT scans in the remaining 6 animals were completed without complications. There were no relevant differences in oxygen saturation or heart rate between the scans. Xenon resulted in a mean increase of enhancement of 35.3 ± 5.5 HU, whereas krypton achieved a mean increase of 21.9 ± 1.8 HU in enhancement (P = 0.0055). The use of krypton for lung ventilation imaging appears to be feasible and safe. Despite the use of a markedly higher concentration of krypton, enhancement is significantly worse when compared with xenon CT ventilation imaging, but sufficiently high for CT ventilation imaging studies.
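
    The reported comparison (35.3 ± 5.5 HU for xenon versus 21.9 ± 1.8 HU for krypton, P = 0.0055 in six animals) is consistent with a paired test; a minimal sketch of such a test is shown below with invented per-animal values, since the individual measurements are not given in the abstract.

```python
import numpy as np
from scipy import stats

# Invented per-animal mean enhancement values (HU) for six rabbits; only the group
# means and standard deviations are reported in the abstract.
xenon   = np.array([36.1, 28.4, 41.2, 33.0, 38.7, 34.4])
krypton = np.array([22.5, 19.8, 24.0, 21.1, 23.2, 20.8])

# Paired comparison, since each animal received both gases in random order.
t_stat, p_value = stats.ttest_rel(xenon, krypton)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```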

  8. Preliminary Analysis For Wolsong Par Effects Using ISACC Calculations

    International Nuclear Information System (INIS)

    Song, Yong Mann; Kim, Dong Ha

    2012-01-01

    In the paper, hydrogen control effects using PARs only are analyzed for severe station blackout (SBO) sequences beyond the design basis accidents in WS-1, which is a CANDU6-type reactor. As a computational tool, the latest version of ISAAC4.3 (Integrated Severe Accident Analysis Code for CANDU), which is a fully integrated and lumped severe accident computer code, is used to simulate hydrogen generation and transport inside the reactor building (R/B) before its failure. For the performance of hydrogen removal, the depletion rate equation of the K-PAR developed in Korea is applied. In a CANDU reactor, three areas are identified as sources of hydrogen under severe accidents: fuel-coolant interactions in intact channels, suspended fuel or debris interactions in the calandria tank, and debris interactions in the calandria vault. The first two origins provide the source for the late ('late' terminology is used because it takes more than one day before calandria tank failure) potential hydrogen combustion before calandria tank failure, and all three origins provide the source for the very late potential hydrogen combustion occurring at or after calandria tank failure. If the hydrogen mitigation system fails, the AICC (adiabatic isochoric complete combustion) burning of highly flammable hydrogen may cause Wolsong R/B failure. So the hydrogen-induced failure possibility is evaluated, using preliminary ISAAC calculations, under several SBO conditions with and without PARs for both the late and very late accident periods

  9. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect -- including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in the looking at news

  10. Life cycle analysis in preliminary design stages

    OpenAIRE

    Agudelo , Lina-Maria; Mejía-Gutiérrez , Ricardo; Nadeau , Jean-Pierre; PAILHES , Jérôme

    2014-01-01

    International audience; In a design process the product is decomposed into systems along disciplinary lines. Each stage has its own goals and constraints that must be satisfied and has control over a subset of design variables that describe the overall system. When using different tools to initiate a product life cycle, including the environment and impacts, it is noticeable that there is a gap in the tools that link the stages of preliminary design and the stages of materialization. Differen...

  11. Review of Preliminary Analysis Techniques for Tension Structures.

    Science.gov (United States)

    1984-02-01

    however, a linear dynamic analysis can be conducted for purposes of preliminary design, relative to the static configuration. It is noted that the amount of... [Contents excerpt: Chapter 3, Preliminary Design of Tension Structures — 3.1 Cable Systems; 3.1.1 Singly-Connected Segments; 3.1.2 Multiply-Connected Segments; 3.1.3 Linearized Dynamics of Cable Systems]

  12. Computer codes for safety analysis

    International Nuclear Information System (INIS)

    Holland, D.F.

    1986-11-01

    Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper will discuss five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code will be presented, a sample application given, and the present status and future development plans discussed

  13. Surface Properties of TNOs: Preliminary Statistical Analysis

    Science.gov (United States)

    Antonieta Barucci, Maria; Fornasier, S.; Alvarez-Cantal, A.; de Bergh, C.; Merlin, F.; DeMeo, F.; Dumas, C.

    2009-09-01

    An overview of the surface properties based on the last results obtained during the Large Program performed at ESO-VLT (2007-2008) will be presented. Simultaneous high quality visible and near-infrared spectroscopy and photometry have been carried out on 40 objects with various dynamical properties, using FORS1 (V), ISAAC (J) and SINFONI (H+K bands) mounted respectively at UT2, UT1 and UT4 VLT-ESO telescopes (Cerro Paranal, Chile). For spectroscopy we computed the spectral slope for each object and searched for possible rotational inhomogeneities. A few objects show features in their visible spectra such as Eris, whose spectral bands are displaced with respect to pure methane-ice. We identify new faint absorption features on 10199 Chariklo and 42355 Typhon, possibly due to the presence of aqueous altered materials. The H+K band spectroscopy was performed with the new instrument SINFONI which is a 3D integral field spectrometer. While some objects show no diagnostic spectral bands, others reveal surface deposits of ices of H2O, CH3OH, CH4, and N2. To investigate the surface properties of these bodies, a radiative transfer model has been applied to interpret the entire 0.4-2.4 micron spectral region. The diversity of the spectra suggests that these objects represent a substantial range of bulk compositions. These different surface compositions can be diagnostic of original compositional diversity, interior source and/or different evolution with different physical processes affecting the surfaces. A statistical analysis is in progress to investigate the correlation of the TNOs’ surface properties with size and dynamical properties.
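
    The spectral slope mentioned above is commonly expressed as a percent of reddening per 100 nm for a reflectance spectrum normalized at a reference wavelength. The sketch below is a generic illustration of that calculation (a linear fit to normalized reflectance with synthetic data); it is not the reduction pipeline used in the Large Program.

        import numpy as np

        def spectral_slope(wavelength_nm, reflectance, ref_nm=550.0):
            """Spectral slope in % per 100 nm of reflectance normalized at ref_nm."""
            norm = reflectance / np.interp(ref_nm, wavelength_nm, reflectance)
            a, b = np.polyfit(wavelength_nm, norm, 1)   # norm ~ a * wavelength + b
            return 100.0 * a * 100.0                    # convert slope to % per 100 nm

        wl = np.linspace(450.0, 900.0, 50)                   # illustrative visible-range grid
        refl = 0.05 * (1.0 + 0.0008 * (wl - 550.0))          # synthetic, slightly red spectrum
        print(spectral_slope(wl, refl))                      # about 8 %/100 nm for this case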

  14. Preliminary study on computer automatic quantification of brain atrophy

    International Nuclear Information System (INIS)

    Li Chuanfu; Zhou Kangyuan

    2006-01-01

    Objective: To study the variability of normal brain volume with sex and age, and to put forward an objective standard for computer-automated quantification of brain atrophy. Methods: The cranial volume, brain volume and brain parenchymal fraction (BPF) of 487 cases of brain atrophy (310 males, 177 females) and 1901 normal subjects (993 males, 908 females) were calculated with a newly developed algorithm for automatic quantification of brain atrophy. Using polynomial curve fitting, the mathematical relationship of BPF with age in normal subjects was analyzed. Results: The cranial volume, brain volume and BPF of normal subjects were (1 271 322 ± 128 699) mm³, (1 211 725 ± 122 077) mm³ and (95.3471 ± 2.3453)%, respectively, and those of the atrophy subjects were (1 276 900 ± 125 180) mm³, (1 203 400 ± 117 760) mm³ and (91.8115 ± 2.3035)%, respectively. The difference in BPF between the two groups was extremely significant (P < 0.05). The expression P(x) = -0.0008x² + 0.0193x + 96.9999 accurately described the mathematical relationship between BPF and age in normal subjects (lower limit of the 95% CI: y = -0.0008x² + 0.0184x + 95.1090). Conclusion: The lower limit of the 95% confidence interval of the BPF-age relationship could be used as an objective criterion for automated quantification of brain atrophy. (authors)
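
    The age-dependent criterion reported above can be applied directly: a subject whose BPF falls below the lower 95% confidence bound for his or her age would be flagged as atrophic. A small sketch using the coefficients quoted in the abstract (the subject volumes are hypothetical example values):

        def bpf_lower_bound(age_years):
            """Lower 95% CI of normal brain parenchymal fraction (%) vs. age, from the abstract."""
            return -0.0008 * age_years ** 2 + 0.0184 * age_years + 95.1090

        def is_atrophic(brain_volume_mm3, cranial_volume_mm3, age_years):
            bpf = 100.0 * brain_volume_mm3 / cranial_volume_mm3   # brain parenchymal fraction in %
            return bpf < bpf_lower_bound(age_years), bpf

        # Hypothetical 70-year-old subject; volumes are placeholders, not study data.
        flag, bpf = is_atrophic(1_150_000, 1_270_000, age_years=70)
        print(flag, round(bpf, 2), round(bpf_lower_bound(70), 2))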

  15. Use of electronic computers for processing of spectrometric data in instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Vyropaev, V.Ya.; Zlokazov, V.B.; Kul'kina, L.I.; Maslov, O.D.; Fefilov, B.V.

    1977-01-01

    A computer program is described for processing gamma spectra in the instrumental activation analysis of multicomponent objects. Structural diagrams of various variants of connection with the computer are presented. The possibility of using a mini-computer as an analyser and for preliminary processing of gamma spectra is considered

  16. Dynamic computed tomography scanning of benign bone lesions: Preliminary results

    International Nuclear Information System (INIS)

    Levine, E.; Neff, J.R.

    1983-01-01

    The majority of benign bone lesions can be evaluated adequately using conventional radiologic techniques. However, it is not always possible to differentiate reliably between different types of benign bone lesions on the basis of plain film appearances alone. Dynamic computed tomography (CT) scanning provides a means for further characterizing such lesions by assessing their degree of vascularity. Thus, it may help in distinguishing an osteoid osteoma, which has a hypervascular nidus, from a Brodie's abscess, which is avascular. Dynamic CT scanning may also help in the differentiation between a fluid-containing simple bone cyst, which is avascular, and other solid or semi-solid benign bone lesions which show varying degrees of vascularity. However, because of the additional irradiation involved, dynamic CT scanning should be reserved for evaluation of selected patients with benign bone lesions in whom the plain film findings are not definitive and in whom the CT findings may have a significant influence on management. (orig.)

  17. Systems analysis and the computer

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, A S

    1983-08-01

    The words systems analysis are used in at least two senses. Whilst the general nature of the topic is well understood in the OR community, the nature of the term as used by computer scientists is less familiar. In this paper, the nature of systems analysis as it relates to computer-based systems is examined from the point of view that the computer system is an automaton embedded in a human system, and some facets of this are explored. It is concluded that OR analysts and computer analysts have things to learn from each other and that this ought to be reflected in their education. The important role played by change in the design of systems is also highlighted, and it is concluded that, whilst the application of techniques developed in the artificial intelligence field has considerable relevance to constructing automata able to adapt to change in the environment, study of the human factors affecting the overall systems within which the automata are embedded has an even more important role. 19 references.

  18. Computer aided safety analysis 1989

    International Nuclear Information System (INIS)

    1990-04-01

    The meeting was conducted in a workshop style, to encourage involvement of all participants during the discussions. Forty-five (45) experts from 19 countries, plus 22 experts from the GDR participated in the meeting. A list of participants can be found at the end of this volume. Forty-two (42) papers were presented and discussed during the meeting. Additionally an open discussion was held on the possible directions of the IAEA programme on Computer Aided Safety Analysis. A summary of the conclusions of these discussions is presented in the publication. The remainder of this proceedings volume comprises the transcript of selected technical papers (22) presented in the meeting. It is the intention of the IAEA that the publication of these proceedings will extend the benefits of the discussions held during the meeting to a larger audience throughout the world. The Technical Committee/Workshop on Computer Aided Safety Analysis was organized by the IAEA in cooperation with the National Board for Safety and Radiological Protection (SAAS) of the German Democratic Republic in Berlin. The purpose of the meeting was to provide an opportunity for discussions on experiences in the use of computer codes used for safety analysis of nuclear power plants. In particular it was intended to provide a forum for exchange of information among experts using computer codes for safety analysis under the Technical Cooperation Programme on Safety of WWER Type Reactors (RER/9/004) and other experts throughout the world. A separate abstract was prepared for each of the 22 selected papers. Refs, figs tabs and pictures

  19. Preliminary safety design analysis of KALIMER

    Energy Technology Data Exchange (ETDEWEB)

    Suk, Soo Dong; Kwon, Y. M.; Kim, K. D. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-03-01

    The national long-term R and D program updated in 1997 requires the Korea Atomic Energy Research Institute (KAERI) to complete by the year 2006 the basic design of the Korea Advanced Liquid Metal Reactor (KALIMER), along with supporting R and D work, with the capability of resolving the issue of spent fuel storage as well as with significantly enhanced safety. KALIMER is a 150 MWe pool-type sodium-cooled prototype reactor that uses metallic fuel. The conceptual design is currently under way to establish a self-consistent design meeting a set of major safety design requirements for accident prevention. Current areas of emphasis include inherent and passive means of negative reactivity insertion and decay heat removal, high shutdown reliability, prevention of and protection from sodium chemical reactions, and high seismic margin, among others. All of these requirements affect the reactor design significantly and involve supporting R and D programs of substance. This document first introduces the set of safety design requirements and accident evaluation criteria established for the conceptual design of KALIMER and then summarizes some of the preliminary results of engineering and design analyses performed for the safety of KALIMER. 19 refs., 19 figs., 6 tabs. (Author)

  20. Preliminary shielding analysis of VHTR reactors

    International Nuclear Information System (INIS)

    Flaspoehler, Timothy M.; Petrovic, Bojan

    2011-01-01

    Over the last 20 years a number of methods have been established for automated variance reduction in Monte Carlo shielding simulations. Hybrid methods rely on deterministic adjoint and/or forward calculations to generate these parameters. In the present study, we use the FW-CADIS method implemented in the MAVRIC sequence of the SCALE6 package to perform preliminary shielding analyses of a VHTR reactor. MAVRIC has been successfully used by a number of researchers for a range of shielding applications, including modeling of LWRs, spent fuel storage, radiation fields throughout a nuclear power plant, studies of irradiation facilities, and others. However, experience in using MAVRIC for shielding studies of VHTRs is more limited. Thus, the objective of this work is to contribute toward validating MAVRIC for such applications, and to identify areas for potential improvement. A simplified model of a prismatic VHTR has been devised, based on general features of the 600 MWt reactor considered as one of the NGNP options. Fuel elements have been homogenized, and the core region is represented as an annulus. However, the overall mix of materials and the relatively large dimensions of the spatial domain that challenge the shielding simulations have been preserved. Simulations are performed to evaluate fast neutron fluence, dpa, and other parameters of interest at relevant positions. The paper will investigate and discuss both the effectiveness of the automated variance reduction and the applicability of the physics models from the standpoint of specific VHTR features. (author)

  1. Developing ontological model of computational linear algebra - preliminary considerations

    Science.gov (United States)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Lirkov, I.

    2013-10-01

    The aim of this paper is to propose a method for application of ontologically represented domain knowledge to support Grid users. The work is presented in the context provided by the Agents in Grid system, which aims at development of an agent-semantic infrastructure for efficient resource management in the Grid. Decision support within the system should provide functionality beyond the existing Grid middleware, specifically, help the user to choose optimal algorithm and/or resource to solve a problem from a given domain. The system assists the user in at least two situations. First, for users without in-depth knowledge about the domain, it should help them to select the method and the resource that (together) would best fit the problem to be solved (and match the available resources). Second, if the user explicitly indicates the method and the resource configuration, it should "verify" if her choice is consistent with the expert recommendations (encapsulated in the knowledge base). Furthermore, one of the goals is to simplify the use of the selected resource to execute the job; i.e., provide a user-friendly method of submitting jobs, without required technical knowledge about the Grid middleware. To achieve the mentioned goals, an adaptable method of expert knowledge representation for the decision support system has to be implemented. The selected approach is to utilize ontologies and semantic data processing, supported by multicriterial decision making. As a starting point, an area of computational linear algebra was selected to be modeled, however, the paper presents a general approach that shall be easily extendable to other domains.

  2. A Preliminary Tsunami vulnerability analysis for Bakirkoy district in Istanbul

    Science.gov (United States)

    Tufekci, Duygu; Lutfi Suzen, M.; Cevdet Yalciner, Ahmet; Zaytsev, Andrey

    2016-04-01

    Resilience of coastal utilities after earthquakes and tsunamis has major importance for efficient and proper rescue and recovery operations soon after the disasters. Vulnerability assessment of coastal areas under extreme events has major importance for preparedness and the development of mitigation strategies. The Sea of Marmara has experienced numerous earthquakes as well as associated tsunamis. There is a variety of coastal facilities such as ports, small craft harbors, terminals for maritime transportation, waterfront roads and business centers, mainly on the north coast of the Marmara Sea in the megacity of Istanbul. A detailed vulnerability analysis for the Yenikapi region and a detailed resilience analysis for Haydarpasa port in Istanbul were studied previously by Cankaya et al. (2015) and Aytore et al. (2015) in the SATREPS project. In this study, the methodology of vulnerability analysis under tsunami attack given in Cankaya et al. (2015) is modified and applied to the Bakirkoy district of Istanbul. Bakirkoy district is located in the western part of Istanbul and faces the north coast of the Marmara Sea from 28.77°E to 28.89°E. The high-resolution spatial dataset of the Istanbul Metropolitan Municipality (IMM) is used and analyzed. The bathymetry and topography database and the spatial dataset containing all buildings/structures/infrastructures in the district are collated and utilized for tsunami numerical modeling and the subsequent vulnerability analysis. The tsunami parameters from deterministically defined worst-case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability and resilience assessment parameters in the district are defined and scored by implementing a GIS-based TVA with appropriate MCDA methods. The risk level is computed using tsunami intensity (level of flow depth from simulations) and the TVA results at every location in Bakirkoy district. The preliminary results are presented and discussed
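
    A common way to combine tsunami intensity with vulnerability scores in a GIS-based TVA is a weighted aggregation per location. The sketch below is only a schematic of that idea; the weights and attribute names are invented for illustration, and the actual Cankaya et al. (2015) methodology and NAMI DANCE outputs are considerably more detailed.

        # Schematic MCDA-style risk scoring per location; weights and attributes are illustrative.
        def vulnerability_score(attrs, weights):
            """Weighted sum of normalized vulnerability attributes (each in 0..1)."""
            return sum(weights[k] * attrs[k] for k in weights)

        def risk_level(flow_depth_m, attrs, weights, max_depth_m=5.0):
            intensity = min(flow_depth_m / max_depth_m, 1.0)   # normalized tsunami intensity
            return intensity * vulnerability_score(attrs, weights)

        weights = {"building_density": 0.4, "critical_facility": 0.35, "road_access": 0.25}
        location = {"building_density": 0.8, "critical_facility": 1.0, "road_access": 0.3}
        print(round(risk_level(flow_depth_m=2.0, attrs=location, weights=weights), 3))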

  3. A Comparison between the Occurrence of Pauses, Repetitions and Recasts under Conditions of Face-to-Face and Computer-Mediated Communication: A Preliminary Study

    Science.gov (United States)

    Cabaroglu, Nese; Basaran, Suleyman; Roberts, Jon

    2010-01-01

    This study compares pauses, repetitions and recasts in matched task interactions under face-to-face and computer-mediated conditions. Six first-year English undergraduates at a Turkish University took part in Skype-based voice chat with a native speaker and face-to-face with their instructor. Preliminary quantitative analysis of transcripts showed…

  4. Preliminary analysis of B. E. C. I

    Energy Technology Data Exchange (ETDEWEB)

    Sugimoto, H; Sato, S [Waseda Univ., Tokyo (Japan). Science and Engineering Research Lab.; Saito, T; Noma, M; Matsubayashi, T

    1974-10-01

    An emulsion chamber (B.E.C.I.) with a generating layer, mounted on a balloon, was flown as a preliminary experiment in May 1973. The objectives of this experiment were (1) the observation of high-energy cosmic rays, (2) the study of ultra-high-energy multiple-generation phenomena, and (3) the study of ultra-high-energy heavy-ion nuclear reactions. The emulsion chamber comprises three portions. The upper portion consists of 130 sheets arranged vertically at 3 mm intervals, each sheet being a 1,500 μm methacrylic base coated on one side with 200 μm of emulsion. The middle portion comprises horizontally arranged 800 μm methacrylic bases coated on both sides with 50 μm of emulsion, with a 1 mm methacrylic sheet inserted after every five bases. The lower portion comprises, first, five layers of a sandwich of 1 mm lead sheet and 800 μm methacrylic base coated on both sides with 50 μm of emulsion and, second, ten layers of a sandwich of 2 mm lead sheet, 800 μm methacrylic base coated on one side with 50 μm of emulsion, and N-type X-ray film. Cascades with energy E0 > 400 GeV at the scanning efficiency of the lower E.C.C., events with incident energy E0 > 3 TeV among the jets occurring in the lower E.C.C., and events with incident energy E0 > 10 TeV among the jets occurring in the generating layer have been observed. The angular distribution of the secondary charged particles of jets produced in the generating layer can be obtained accurately.

  5. Practical Recommendations for the Preliminary Design Analysis of ...

    African Journals Online (AJOL)

    Interior-to-exterior shear ratios for equal and unequal bay frames, as well as column inflection points were obtained to serve as practical aids for preliminary analysis/design of fixed-feet multistory sway frames. Equal and unequal bay five story frames were analysed to show the validity of the recommended design ...

  6. Yucca Mountain transportation routes: Preliminary characterization and risk analysis

    International Nuclear Information System (INIS)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R.

    1991-01-01

    This report presents appendices related to the preliminary assessment and risk analysis for high-level radioactive waste transportation routes to the proposed Yucca Mountain Project repository. Information includes data on population density, traffic volume, ecologically sensitive areas, and accident history

  7. Preliminary analysis of patent trends for magnetic fusion technology

    International Nuclear Information System (INIS)

    Levine, L.O.; Ashton, W.B.; Campbell, R.S.

    1984-02-01

    This study presents a preliminary analysis of development trends in magnetic fusion technology based on data from US patents. The research is limited to identification and description of general patent activity and ownership characteristics for 373 patents. The results suggest that more detailed studies of fusion patents could provide useful R and D planning information

  8. Preliminary thermal and stress analysis of the SINQ window

    International Nuclear Information System (INIS)

    Heidenreich, G.

    1991-01-01

    Preliminary results of a finite element analysis for the SINQ proton beam window are presented. Temperatures and stresses are calculated in an axisymmetric model. As a result of these calculations, the H2O-cooled window (safety window) could be redesigned in such a way that plastic deformation resulting from excessive stress in some areas is avoided. (author)

  9. Preliminary Integrated Safety Analysis Status Report

    International Nuclear Information System (INIS)

    Gwyn, D.

    2001-01-01

    This report provides the status of the potential Monitored Geologic Repository (MGR) Integrated Safety Analysis (ISA) by identifying the initial work scope scheduled for completion during the ISA development period, the schedules associated with the tasks identified, safety analysis issues encountered, and a summary of accomplishments during the reporting period. This status covers the period from October 1, 2000 through March 30, 2001

  10. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Full Text Available Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, averages, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system proved useful for the geostatistical analysis process, in contrast to the manipulation of the computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interaction, functionality rarely available in similar programs. Given its quick prototyping and the simplicity of incorporating related routines, the Delphi environment has the main advantage of permitting the evolution of this system.
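
    For readers unfamiliar with the classical tool at the core of such systems, a minimal experimental (isotropic) semivariogram can be computed as below. This is a generic sketch with synthetic data, not code from the Delphi program described in the paper.

        import numpy as np

        def experimental_semivariogram(coords, values, lags, tol):
            """Classical Matheron estimator: gamma(h) = half the mean squared difference per lag bin."""
            coords, values = np.asarray(coords, float), np.asarray(values, float)
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)   # pairwise distances
            sq = (values[:, None] - values[None, :]) ** 2
            gamma = []
            for h in lags:
                mask = (d > h - tol) & (d <= h + tol)
                gamma.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
            return np.array(gamma)

        rng = np.random.default_rng(0)
        pts = rng.uniform(0, 100, size=(50, 2))                   # synthetic sample locations
        z = np.sin(pts[:, 0] / 20.0) + 0.1 * rng.normal(size=50)  # synthetic soil attribute
        print(experimental_semivariogram(pts, z, lags=[10, 20, 30, 40], tol=5.0))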

  11. Preliminary analysis of alternative fuel cycles for proliferation evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Steindler, M. J.; Ripfel, H. C.F.; Rainey, R. H.

    1977-01-01

    The ERDA Division of Nuclear Research and Applications proposed 67 nuclear fuel cycles for assessment as to their nonproliferation potential. The object of the assessment was to determine which fuel cycles pose inherently low risk for nuclear weapon proliferation while retaining the major benefits of nuclear energy. This report is a preliminary analysis of these fuel cycles to develop the fuel-recycle data that will complement reactor data, environmental data, and political considerations, which must be included in the overall evaluation. This report presents the preliminary evaluations from ANL, HEDL, ORNL, and SRL and is the basis for a continuing in-depth study. (DLC)

  12. Computational analysis of cerebral cortex

    Energy Technology Data Exchange (ETDEWEB)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)

    2010-08-15

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  13. Computational analysis of cerebral cortex

    International Nuclear Information System (INIS)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni

    2010-01-01

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  14. antibacterial properties and preliminary phytochemical analysis

    African Journals Online (AJOL)

    DR. AMINU

    2Department of Pharmaceutical Chemistry, Faculty of Pharmacy, University of Benin, Benin City. *Correspondence ... phytochemical analysis of the dried leaves extracts revealed the presence of alkaloids, ... for the synthesis of useful drugs.

  15. Preliminary analysis of ROSAIII experiment, (2)

    International Nuclear Information System (INIS)

    Kitaguchi, Hidemi; Suzuki, Mitsuhiro; Sobajima, Makoto; Adachi, Hiromichi; Shiba, Masayoshi.

    1978-02-01

    Loss-of-coolant accident (LOCA) experiments to be performed in ROSAIII have been examined with the computer code RELAP-4J with respect to the experimental conditions. From results (1) to (3), needs (4) to (6) follow. (1) The initial enthalpy distribution is important for simulating the break flow of an actual BWR. (2) The simulations of lower plenum flashing and of the pressure transient in the pressure vessel are good except when power is lacking. (3) The simulation of the cladding temperature transient is difficult because of a lack of physical property data. (4) The initial pressure distribution in the facility must be obtained for different core flow rates up to 72 lb/sec to allow accurate analysis. (5) Reverse core flow detectors and reverse jet pump flow detectors are necessary to compare the flow pattern of the recirculation loops between calculation and experiment. (6) Further information is necessary on the physical properties of the fuels. (auth.)

  16. Preliminary conceptual design and analysis on KALIMER reactor structures

    International Nuclear Information System (INIS)

    Kim, Jong Bum

    1996-10-01

    The objectives of this study are to perform preliminary conceptual design and structural analyses for KALIMER (Korea Advanced Liquid Metal Reactor) reactor structures to assess the design feasibility and to identify detailed analysis requirements. Since KALIMER thermal-hydraulic system analysis results and neutronic analysis results are not available at present, only limited preliminary structural analyses have been performed, with assumptions made on the thermal loads. The responses of the reactor vessel and reactor internal structures were based on the temperature difference between core inlet and outlet and on engineering judgment. Thermal stresses from the assumed temperatures were calculated using the ANSYS code through parametric finite element heat transfer and elastic stress analyses. While, based on the results of the preliminary conceptual design and structural analyses, the ASME Code limits for the reactor structures were satisfied for the pressure boundary, the need for inelastic analyses was indicated for evaluating the design adequacy of the support barrel and the thermal liner. To reduce thermal striping effects in the bottom area of the UIS due to sodium flowing up from the reactor core, installation of an Inconel-718 liner on the bottom area was proposed, and to mitigate thermal shock loads an additional stainless steel liner was also suggested. The design feasibility of these measures was validated through simplified preliminary analyses. In the conceptual design phase, these results will be implemented in the design of the reactor structures and the reactor internal structures in conjunction with the thermal-hydraulic, neutronic, and seismic analysis results. 4 tabs., 24 figs., 4 refs. (Author)

  17. Experience with PET FDG - Preliminary analysis

    International Nuclear Information System (INIS)

    Massardo, Teresa; Jofre, Josefina; Canessa, Jose; Gonzalez, Patricio; Humeres, Pamela; Sierralta, Paulina; Galaz, Rodrigo; Miranda, Karina

    2004-01-01

    Full text: The objective of this preliminary communication was to analyse the indications and data in initial group of patients studied with first dedicated PET scanner in the country at Hospital Militar in Santiago Chile. The main application of positron emission tomography (PET) with 18-Fluoro deoxyglucose (FDG) is related with oncological patients management. We studied 136 patients, 131 (97%) with known or suspected malignant disease and remaining 5 for cardiological or neuropsychiatric disease. Ten patients were controlled diabetics (1 insulin dependent). Their mean age was 51.6±18 years ranging from 6 to 84 years and 65% were females. A total of 177 scans were acquired using a dedicated PET (Siemens HR + with 4mm resolution) system. Mean F18-FDG injected dose was 477±107 MBq (12.9±2.9 mCi). Mean blood glucose levels, performed prior the injection, were 94±17mg/dl (range 62-161). F18-FDG was obtained from the cyclotron IBA Cyclone 18/9 installed in the Chilean Agency of Nuclear Energy, distant about 15 miles away from the clinical PET facility. PET studies were analyzed by at least 4 independent observers visually. Standardized uptake value (SUV) was calculated in some cases. Image fusion of FDG images with recent anatomical (CT, MRI) studies was performed where available. Data acquisition protocol consisted in 7-8 beds/study from head to mid-thighs, with 6-7-min/bed acquisitions, 36% transmission with germanium 68 rods. Data was reconstructed with standard OSEM protocol. The main indications included pulmonary lesions in 31%, gastrointestinal cancers in 21%, melanoma in 13% and lymphoma in 9% patients. The remaining were of breast, thyroid, testes, ovary, musculoskeletal (soft tissue and bone), brain tumour etc. Abnormal focal tracer uptake was observed in 83/131 oncological patients, 54% corroborating with clinical diagnosis of primary tumor or recurrence while 46% showed new metastatic localization. FDG scans were normal 36/131 patients. In 9 patients

  18. Experience with PET FDG - Preliminary analysis

    Energy Technology Data Exchange (ETDEWEB)

    Massardo, Teresa; Jofre, Josefina; Canessa, Jose; Gonzalez, Patricio; Humeres, Pamela; Sierralta, Paulina; Galaz, Rodrigo; Miranda, Karina [Centro PET de Imagenes Moleculares, Hospital Militar de Santiago, Santiago (Chile)

    2004-01-01

    Full text: The objective of this preliminary communication was to analyse the indications and data in initial group of patients studied with first dedicated PET scanner in the country at Hospital Militar in Santiago Chile. The main application of positron emission tomography (PET) with 18-Fluoro deoxyglucose (FDG) is related with oncological patients management. We studied 136 patients, 131 (97%) with known or suspected malignant disease and remaining 5 for cardiological or neuropsychiatric disease. Ten patients were controlled diabetics (1 insulin dependent). Their mean age was 51.6{+-}18 years ranging from 6 to 84 years and 65% were females. A total of 177 scans were acquired using a dedicated PET (Siemens HR + with 4mm resolution) system. Mean F18-FDG injected dose was 477{+-}107 MBq (12.9{+-}2.9 mCi). Mean blood glucose levels, performed prior the injection, were 94{+-}17mg/dl (range 62-161). F18-FDG was obtained from the cyclotron IBA Cyclone 18/9 installed in the Chilean Agency of Nuclear Energy, distant about 15 miles away from the clinical PET facility. PET studies were analyzed by at least 4 independent observers visually. Standardized uptake value (SUV) was calculated in some cases. Image fusion of FDG images with recent anatomical (CT, MRI) studies was performed where available. Data acquisition protocol consisted in 7-8 beds/study from head to mid-thighs, with 6-7-min/bed acquisitions, 36% transmission with germanium 68 rods. Data was reconstructed with standard OSEM protocol. The main indications included pulmonary lesions in 31%, gastrointestinal cancers in 21%, melanoma in 13% and lymphoma in 9% patients. The remaining were of breast, thyroid, testes, ovary, musculoskeletal (soft tissue and bone), brain tumour etc. Abnormal focal tracer uptake was observed in 83/131 oncological patients, 54% corroborating with clinical diagnosis of primary tumor or recurrence while 46% showed new metastatic localization. FDG scans were normal 36/131 patients. In 9

  19. Preliminary engineering analysis for clothes washers

    Energy Technology Data Exchange (ETDEWEB)

    Biermayer, Peter J.

    1996-10-01

    The Engineering Analysis provides information on efficiencies, manufacturer costs, and other characteristics of the appliance class being analyzed. For clothes washers, there are two classes: standard and compact. Since data were not available to analyze the compact class, only standard clothes washers were analyzed in this report. For this analysis, individual design options were combined and ordered in a manner that resulted in the lowest cumulative cost/savings ratio. The cost/savings ratio is the increase in manufacturer cost for a design option divided by the reduction in operating costs due to fuel and water savings.
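
    The ordering rule described above (combine design options so that the cumulative cost/savings ratio stays as low as possible) can be sketched as a simple greedy sort on incremental values. The option names and numbers below are invented placeholders, not figures from the engineering analysis.

        # Greedy ordering of design options by incremental cost / incremental operating-cost savings.
        options = [                     # (name, added manufacturer cost $, annual operating savings $)
            ("better motor", 12.0, 6.0),
            ("spray rinse", 4.0, 3.5),
            ("thermostatic fill", 9.0, 2.0),
        ]

        ordered = sorted(options, key=lambda o: o[1] / o[2])   # lowest cost/savings ratio first

        cum_cost = cum_savings = 0.0
        for name, cost, savings in ordered:
            cum_cost += cost
            cum_savings += savings
            print(f"{name:18s} cumulative cost/savings = {cum_cost / cum_savings:.2f}")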

  20. Computer aided analysis of disturbances

    International Nuclear Information System (INIS)

    Baldeweg, F.; Lindner, A.

    1986-01-01

    Computer-aided analysis of disturbances and the prevention of failures (diagnosis and therapy control) in technological plants are among the most important tasks of process control. Research in this field is very intensive owing to increasing requirements on the safety and economy of process control and to a remarkable increase in the efficiency of digital electronics. This publication deals with the analysis of disturbances in complex technological plants, especially in so-called high-risk processes. The presentation emphasizes the theoretical concepts of diagnosis and therapy control, modelling of the disturbance behaviour of the technological process, and man-machine communication integrating artificial intelligence methods, e.g., the expert system approach. An application is given for nuclear power plants. (author)

  1. [Efficiency of computer-based documentation in long-term care--preliminary project].

    Science.gov (United States)

    Lüngen, Markus; Gerber, Andreas; Rupprecht, Christoph; Lauterbach, Karl W

    2008-06-01

    In Germany the documentation of processes in long-term care is mainly paper-based. Planning, realization and evaluation are not supported in an optimal way. In a preliminary study we evaluated the consequences of the introduction of a computer-based documentation system using handheld devices. We interviewed 16 persons before and after introducing the computer-based documentation and assessed costs for the documentation process and administration. The results show that cost reductions are likely. The job satisfaction of the personnel increased, and more time could be spent caring for the residents. We suggest further research to reach conclusive results.

  2. Preliminary analysis of the KAERI RCCS Experiment Using GAMMA+

    Energy Technology Data Exchange (ETDEWEB)

    Khoza, Samukelisiwe; Tak, Nam-il; Lim, Hong-Sik; Lee, Sung-Nam; Cho, Bong-Hyun; Kim, Jong-Hwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    This paper describes the analysis of the KAERI RCCS experiment. The GAMMA+ code was used for analysis of the RCCS 1/4-scale natural cooling experimental facility designed and built at KAERI to verify the performance of the natural circulation phenomenon. The results obtained from the GAMMA+ analysis, showing the temperature profiles and flow rates at steady state, were compared with the results from the preliminary experiments conducted in this facility. The GAMMA+ analysis for the KAERI RCCS experimental setup was carried out to understand its natural circulation behavior. The air flow rate at the chimney exit obtained in the experiments was found to be almost the same as that predicted by GAMMA+.

  3. Gravity field of Venus - A preliminary analysis

    Science.gov (United States)

    Phillips, R. J.; Sjogren, W. L.; Abbott, E. A.; Smith, J. C.; Wimberly, R. N.; Wagner, C. A.

    1979-01-01

    The gravitational field of Venus obtained by tracking the Pioneer Venus Orbiter is examined. For each spacecraft orbit, two hours of Doppler data centered around periapsis were used to estimate spacecraft position and velocity and the velocity residuals obtained were spline fit and differentiated to produce line of sight gravitational accelerations. Consistent variations in line of sight accelerations from orbit to orbit reveal the presence of gravitational anomalies. A simulation of isostatic compensation for an elevated region on the surface of Venus indicates that the mean depth of compensation is no greater than about 100 km. Gravitational spectra obtained from a Fourier analysis of line of sight accelerations from selected Venus orbits are compared to the earth's gravitational spectrum and spherical harmonic gravitational potential power spectra of the earth, the moon and Mars. The Venus power spectrum is found to be remarkably similar to that of the earth, however systematic variations in the harmonics suggest differences in dynamic processes or lithospheric behavior.
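
    The processing step described above (spline-fitting the line-of-sight velocity residuals and differentiating to obtain accelerations) can be sketched as follows. The data here are synthetic and the smoothing parameter is arbitrary; the actual Pioneer Venus reduction relied on full orbit-determination software.

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        # Synthetic line-of-sight velocity residuals (m/s) over ~2 hours around periapsis.
        t = np.linspace(-3600.0, 3600.0, 241)        # seconds from periapsis
        vel_residual = (0.02 * np.exp(-(t / 900.0) ** 2)
                        + 0.001 * np.random.default_rng(1).normal(size=t.size))

        # Smoothing spline fit, then analytic differentiation to get LOS acceleration (m/s^2).
        spline = UnivariateSpline(t, vel_residual, k=4, s=len(t) * 1e-6)
        los_acceleration = spline.derivative()(t)

        # Convert to milligals (1 mGal = 1e-5 m/s^2), the usual unit for gravity anomalies.
        print(np.max(np.abs(los_acceleration)) / 1e-5)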

  4. Advanced high conversion PWR: preliminary analysis

    International Nuclear Information System (INIS)

    Golfier, H.; Bellanger, V.; Bergeron, A.; Dolci, F.; Gastaldi, B.; Koberl, O.; Mignot, G.; Thevenot, C.

    2007-01-01

    In this paper, physical aspects of an HCPWR (High Conversion Light Water Reactor) are examined; this is an innovative PWR fuelled with mixed oxide and having a higher conversion ratio owing to a lower moderation ratio. Moderation ratios lower than unity are considered, which has led to low-moderation PWR fuel assembly designs. The objectives of this parametric study are to define a feasibility area with regard to the following neutronic aspects: moderation ratio, Pu loading, reactor spectrum, irradiation time, and neutronic coefficients. Important thermohydraulic parameters are the pressure drop, the critical heat flux, the maximum temperature in the fuel rod and the pumping power. The thermohydraulic analysis shows that a range of moderation ratios from 0.8 to 1.2 is technically possible. A compromise between improved fuel utilization and research and development effort has been found for a moderation ratio of about 1. The parametric study shows that there are two ranges of interest for the moderation ratio: -) moderation ratio between 0.8 and 1.2 with reduced fissile heights (> 3 m), where both hexagonal and square fuel assembly arrangements are possible; and -) moderation ratio between 0.6 and 0.7 with a modification of the reactor operating conditions (reduction of the primary flow and of the thermal power), where the fuel rods could be arranged in a hexagonal fuel rod assembly. (A.C.)

  5. A preliminary analysis of bidayuh Jagoi patun

    Directory of Open Access Journals (Sweden)

    Dayang Sariah Abang Suhai

    2013-12-01

    Full Text Available Bidayuh Pantun or Patun remains an under-researched topic in Borneo studies and language research due to the difficulties associated with obtaining critical, poetic information in oral culture, language variations and societal mobility. Existing data from anthologies, however, provide little detail about the intrinsic and extrinsic features ascribed to the poems by the people who produce and use them. This paper attempts to explore patun from the Jagoi community. In this study, the structural aspects, themes and moral values of 47 patun from the Jagoi community were analysed. The initial explanations suggested by the poets were further analysed to determine the various structural features and to place the patun alongside existing mainstream lyric poetry. The analysis of the intrinsic features showed that a good rhythmic patun has four to six words per line and eight to 12 syllables per line, and that the final syllables of each line have assonance and consonance patterns of a-a-a-a and a-b-a-b. The themes of the patun include love, advice, forgiveness, beliefs, hopelessness and happiness, and the moral values take the form of subtle advice and admonishments. The Bidayuh patun is indeed a projection of the knowledge, experiences, beliefs, values, and emotions of the community.

  6. Preliminary Seismic Response and Fragility Analysis for DACS Cabinet

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jinho; Kwag, Shinyoung; Lee, Jongmin; Kim, Youngki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-05-15

    A DACS cabinet is installed in the main control room. The objective of this paper is to perform seismic analyses and to evaluate the preliminary structural integrity and seismic capacity of the DACS cabinet. For this purpose, a 3-D finite element model of the DACS cabinet was developed, and modal analyses were carried out to analyze its dynamic characteristics. Response spectrum analyses and the related safety evaluation were then performed for the DACS cabinet subjected to seismic loads. Finally, the seismic margin and seismic fragility of the DACS cabinet were investigated. The seismic response and preliminary structural integrity of the DACS cabinet under self-weight and SSE loads have been evaluated. From the structural analysis results, the DACS cabinet remains below the structural design limit under the SSE of 0.3 g and, based on an evaluation of the maximum effective stresses, can structurally withstand loads up to just below an SSE of 3 g. The HCLPF capacity for the DGRS of the SSE 0.3 g is 0.55 g. Therefore, it is concluded that the DACS cabinet was safely designed, in that no damage to its structural integrity is expected and a sufficient seismic margin exists.

  7. Preliminary Seismic Response and Fragility Analysis for DACS Cabinet

    International Nuclear Information System (INIS)

    Oh, Jinho; Kwag, Shinyoung; Lee, Jongmin; Kim, Youngki

    2013-01-01

    A DACS cabinet is installed in the main control room. The objective of this paper is to perform seismic analyses and to evaluate the preliminary structural integrity and seismic capacity of the DACS cabinet. For this purpose, a 3-D finite element model of the DACS cabinet was developed, and modal analyses were carried out to analyze its dynamic characteristics. Response spectrum analyses and the related safety evaluation were then performed for the DACS cabinet subjected to seismic loads. Finally, the seismic margin and seismic fragility of the DACS cabinet were investigated. The seismic response and preliminary structural integrity of the DACS cabinet under self-weight and SSE loads have been evaluated. From the structural analysis results, the DACS cabinet remains below the structural design limit under the SSE of 0.3 g and, based on an evaluation of the maximum effective stresses, can structurally withstand loads up to just below an SSE of 3 g. The HCLPF capacity for the DGRS of the SSE 0.3 g is 0.55 g. Therefore, it is concluded that the DACS cabinet was safely designed, in that no damage to its structural integrity is expected and a sufficient seismic margin exists.
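
    The HCLPF value quoted above follows from the standard lognormal fragility model, HCLPF = Am · exp[-1.645(βR + βU)]. The sketch below shows that generic relation only; the median capacity and logarithmic standard deviations are illustrative values, not the parameters of the DACS cabinet analysis.

        import math

        def hclpf(median_capacity_g, beta_r, beta_u):
            """High Confidence of Low Probability of Failure capacity for a lognormal fragility
            (95% confidence of less than 5% failure probability)."""
            return median_capacity_g * math.exp(-1.645 * (beta_r + beta_u))

        # Illustrative values only: a 1.3 g median capacity with typical randomness/uncertainty.
        print(round(hclpf(median_capacity_g=1.3, beta_r=0.25, beta_u=0.28), 2))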

  8. Computational systems analysis of dopamine metabolism.

    Directory of Open Access Journals (Sweden)

    Zhen Qi

    2008-06-01

    Full Text Available A prominent feature of Parkinson's disease (PD is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment in PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.

  9. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
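
    The "solution vector ... in the order of their inlet dependency" described above is essentially a topological ordering of components by their upstream connections. A schematic sketch of that idea follows; the component names and the per-component update call are hypothetical, not PCTAP's actual API.

        from graphlib import TopologicalSorter

        # Hypothetical connectivity: each component lists the components feeding its inlet.
        inlet_dependencies = {
            "pump": [],
            "cold_plate": ["pump"],
            "heat_exchanger": ["cold_plate"],
            "tank": ["heat_exchanger"],
        }

        solution_vector = list(TopologicalSorter(inlet_dependencies).static_order())

        def step(components, dt):
            """Advance every component one time step in inlet-dependency order."""
            for name in components:
                # placeholder for the per-component outlet update function
                pass

        for _ in range(10):          # ten illustrative time steps
            step(solution_vector, dt=0.1)
        print(solution_vector)       # ['pump', 'cold_plate', 'heat_exchanger', 'tank']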

  10. Preliminary Mass Spectrometric Analysis of Uranium on Environmental Swipe Materials

    International Nuclear Information System (INIS)

    Cheong, Chang-Sik; Jeong, Youn-Joong; Ryu, Jong-Sik; Shin, Hyung-Seon; Cha, Hyun-Ju; Ahn, Gil-Hoon; Park, Il-Jin; Min, Gyung-Sik

    2006-01-01

    It is well-known that the uranium and plutonium isotopic compositions of safeguards samples are very useful for investigating the history of nuclear activities. To strengthen the capabilities of environmental sampling analysis in the ROK through MOST/DOE collaboration, a round robin test for uranium and plutonium was designed in 2003. As the first round robin test, a set of dried uranium-containing solutions (∼35 ng and ∼300 ng) was distributed to the participating laboratories in November of 2003, with results reported in April of 2004. The KBSI (Korea Basic Science Institute) and ORNL (Oak Ridge National Laboratory) are currently in the process of analyzing uranium on cotton swipes for the second round robin test. As a preliminary test for the second round, KBSI intends to analyze home-made swipe samples into which international uranium standards are added. Here we describe the technical steps of sample preparation and mass spectrometry at KBSI, and report some results of the preliminary test

  11. Active cooling for downhole instrumentation: Preliminary analysis and system selection

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, G.A.

    1988-03-01

    A feasibility study and a series of preliminary designs and analyses were done to identify candidate processes or cycles for use in active cooling systems for downhole electronic instruments. A matrix of energy types and their possible combinations was developed and the energy conversion process for each pair was identified. The feasibility study revealed conventional as well as unconventional processes and possible refrigerants and identified parameters needing further clarification. A conceptual design or series of designs for each system was formulated and a preliminary analysis of each design was completed. The resulting coefficient of performance for each system was compared with the Carnot COP and all systems were ranked by decreasing COP. The system showing the best combination of COP, exchangeability to other operating conditions, failure mode, and system serviceability is chosen for use as a downhole refrigerator. 85 refs., 48 figs., 33 tabs.
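
    The Carnot comparison used for the ranking can be reproduced with the textbook relation COP_Carnot = Tc/(Th - Tc). The sketch below ranks a few hypothetical candidate systems by the ratio of their estimated COP to the Carnot limit; the candidate names and numbers are placeholders, not values from the report.

        def carnot_cop(t_cold_k, t_hot_k):
            """Ideal refrigeration COP between a cold instrument bay and a hot downhole environment."""
            return t_cold_k / (t_hot_k - t_cold_k)

        # Hypothetical candidates: (name, estimated COP) for a 50 C instrument bay in a 200 C well.
        candidates = [("vapor compression", 1.1), ("absorption", 0.5), ("thermoelectric", 0.15)]
        cop_ideal = carnot_cop(t_cold_k=323.15, t_hot_k=473.15)

        for name, cop in sorted(candidates, key=lambda c: c[1], reverse=True):
            print(f"{name:18s} COP={cop:.2f}  fraction of Carnot={cop / cop_ideal:.2f}")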

  12. Preliminary study of soil permeability properties using principal component analysis

    Science.gov (United States)

    Yulianti, M.; Sudriani, Y.; Rustini, H. A.

    2018-02-01

    Soil permeability measurement is undoubtedly important in carrying out soil-water research such as rainfall-runoff modelling, irrigation water distribution systems, etc. It is also known that acquiring reliable soil permeability data is rather laborious, time-consuming, and costly. Therefore, it is desirable to develop a prediction model. Several studies of empirical equations for predicting permeability have been undertaken by many researchers. These studies derived the models from areas whose soil characteristics differ from Indonesian soils, which suggests that these permeability models may be site-specific. The purpose of this study is to identify which soil parameters correspond strongly to soil permeability and to propose a preliminary model for permeability prediction. Principal component analysis (PCA) was applied to 16 parameters analysed from 37 sites, consisting of 91 samples obtained from the Batanghari Watershed. Findings indicated five variables that have a strong correlation with soil permeability, and we recommend a preliminary permeability model, which has potential for further development.
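
    As a generic illustration of the PCA step (standardize the measured soil parameters, then inspect the loadings of the leading components to see which variables align with permeability), consider the sketch below; the data and the number of parameters are hypothetical stand-ins, not the Batanghari dataset.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(42)
        # Hypothetical matrix: 91 samples x 5 soil parameters (e.g. sand %, clay %, porosity, ...).
        X = rng.normal(size=(91, 5))

        X_std = StandardScaler().fit_transform(X)        # standardize before PCA
        pca = PCA(n_components=3).fit(X_std)

        print(pca.explained_variance_ratio_)             # share of variance per component
        print(pca.components_[0])                        # loadings of the first component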

  13. Piping stress analysis with personal computers

    International Nuclear Information System (INIS)

    Revesz, Z.

    1987-01-01

    The growing market for personal computers is providing an increasing number of professionals with unprecedented and surprisingly inexpensive computing capacity which, if used with powerful software, can immensely enhance an engineer's capabilities. This paper focuses on the possibilities opened up in piping stress analysis by the widespread distribution of personal computers, on the necessary changes in the software, and on the limitations of using personal computers for engineering design and analysis. Reliability and quality assurance aspects of using personal computers for nuclear applications are also mentioned. The paper concludes with the personal views of the author and experiences gained during the development of interactive graphic piping software for personal computers. (orig./GL)

  14. Preliminary Safety Analysis Report for the Tokamak Physics Experiment

    International Nuclear Information System (INIS)

    Motloch, C.G.; Bonney, R.F.; Levine, J.D.; Masson, L.S.; Commander, J.C.

    1995-04-01

    This Preliminary Safety Analysis Report (PSAR) includes an indication of the magnitude of facility hazards, the complexity of facility operations, and the stage of the facility life-cycle. It presents the results of safety analyses, safety assurance programs, identified vulnerabilities, compensatory measures, and, in general, the rationale describing why the Tokamak Physics Experiment (TPX) can be safely operated. It discusses application of the graded approach to the TPX safety analysis, including the basis for using Department of Energy (DOE) Order 5480.23 and DOE-STD-3009-94 in the development of the PSAR

  15. Sensitivity Measurement of Transmission Computer Tomography: the Preliminary Experimental Study

    International Nuclear Information System (INIS)

    Widodo, Chomsin-S; Sudjatmoko; Kusminarto; Agung-BS Utomo; Suparta, Gede B

    2000-01-01

    This paper reports the result of a preliminary experimental study on a measurement method for the sensitivity of a computed tomography (CT) scanner. A CT scanner has been built at the Department of Physics, FMIPA UGM, and its performance, based on its sensitivity, was measured. The results confirmed that this sensitivity measurement method may be developed further as a measurement standard. Although the CT scanner developed has a number of shortcomings, the analytical results from the sensitivity measurement suggest a number of repairs and improvements for the system so that improved reconstructed CT images can be obtained. (author)

  16. ORNL: PWR-BDHT analysis procedure, a preliminary overview

    International Nuclear Information System (INIS)

    Cliff, S.B.

    1978-01-01

    The computer programs currently used in the analysis of the ORNL-PWR Blowdown Heat Transfer Separate-Effects Program are overviewed. The current linkages and relationships among the programs are given along with general comments about the future directions of some of these programs. The overview is strictly from the computer science point of view with only minimal information concerning the engineering aspects of the analysis procedure

  17. Waste Feed Delivery System Phase 1 Preliminary RAM Analysis

    International Nuclear Information System (INIS)

    DYKES, A.A.

    2000-01-01

    This report presents the updated results of the preliminary reliability, availability, and maintainability (RAM) analysis of selected waste feed delivery (WFD) operations to be performed by the Tank Farm Contractor (TFC) during Phase 1 activities in support of the Waste Treatment and Immobilization Plant (WTP). For planning purposes, waste feed tanks are being divided into five classes in accordance with the type of waste in each tank and the activities required to retrieve, qualify, and transfer waste feed. This report reflects the baseline design and operating concept, as of the beginning of Fiscal Year 2000, for the delivery of feed from three of these classes, represented by source tanks 241-AN-102, 241-AZ-101 and 241-AN-105. The preliminary RAM analysis quantifies the potential schedule delay associated with operations and maintenance (O&M) field activities needed to accomplish these operations. The RAM analysis is preliminary because the system design, process definition, and activity planning are in a state of evolution. The results are being used to support the continuing development of an O&M concept tailored to the unique requirements of the WFD Program, which is being documented in various volumes of the Waste Feed Delivery Technical Basis (Carlson 1999, Rasmussen 1999, and Orme 2000). The waste feed provided to the WTP must: (1) meet limits for chemical and radioactive constituents based on pre-established compositional envelopes (i.e., feed quality); (2) be delivered in acceptable quantities within a prescribed sequence (i.e., feed quantity); and (3) meet schedule requirements (i.e., feed timing). In the absence of new criteria related to acceptable schedule performance due to the termination of the TWRS Privatization Contract, the original criteria from the Tank Waste Remediation System (TWRS) Privatization Contract (DOE 1998) will continue to be used for this analysis

  18. Computer-Based Linguistic Analysis.

    Science.gov (United States)

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  19. Preliminary Study on the Enhancement of Reconstruction Speed for Emission Computed Tomography Using Parallel Processing

    International Nuclear Information System (INIS)

    Park, Min Jae; Lee, Jae Sung; Kim, Soo Mee; Kang, Ji Yeon; Lee, Dong Soo; Park, Kwang Suk

    2009-01-01

    Conventional image reconstruction uses simplified physical models of projection. However, realistic physics, for example full 3D reconstruction, takes too long to process all the data in the clinic and cannot run on a common reconstruction machine because of the large memory required by complex physical models. We propose a distributed-memory model for fast reconstruction using parallel processing on personal computers to enable such large-scale techniques. Preliminary feasibility tests on virtual machines and various performance tests on a commercial supercomputer, Tachyon, were performed. The expectation-maximization algorithm was tested with a common 2D projector and with a realistic 3D line-of-response model. Since processing slowed down (by up to 6 times) after a certain number of iterations, compiler optimization was performed to maximize the efficiency of the parallelization. Parallel processing of a program across multiple computers was available on Linux with MPICH and NFS. We verified that differences between the parallel-processed image and the single-processed image at the same iterations were within the least significant bits of the floating-point representation (about 6 bits). Two processors showed good parallel-computing efficiency (a 1.96-times speed-up). The delay phenomenon was resolved by vectorization using SSE. Through this study, a realistic parallel computing system for clinical use was established, making it possible to reconstruct with ample memory using realistic physical models that cannot be simplified
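
    As an illustrative aside, a minimal sketch of the distribution pattern described above is given below, assuming projection data are split across MPI ranks and each rank holds a copy of the image; the random system matrix and data are stand-ins for a real projector, and this is not the authors' code.

        # Hypothetical sketch: one MLEM iteration with projection rows split across MPI ranks.
        # A_local and y_local are placeholders for a real projector and measured data.
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        n_pix = 64 * 64
        rng = np.random.default_rng(rank)
        A_local = rng.random((500, n_pix)) * 1e-3        # this rank's share of projection rows
        y_local = rng.poisson(A_local @ np.ones(n_pix))  # simulated counts for those rows

        x = np.ones(n_pix)                               # current image estimate (same on all ranks)
        sens = comm.allreduce(A_local.sum(axis=0), op=MPI.SUM)   # global A^T 1

        ratio = y_local / np.maximum(A_local @ x, 1e-12)
        back = comm.allreduce(A_local.T @ ratio, op=MPI.SUM)     # sum of partial backprojections

        x = x * back / np.maximum(sens, 1e-12)           # identical MLEM update on every rank
        if rank == 0:
            print("updated image mean:", x.mean())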

  20. Cognitive impairment and computer tomography image in patients with arterial hypertension -preliminary results

    International Nuclear Information System (INIS)

    Yaneva-Sirakova, T.; Tarnovska-Kadreva, R.; Traykov, L.; Zlatareva, D.

    2012-01-01

    Arterial hypertension is the leading risk factor for cognitive impairment, but cognitive impairment develops only in some of the patients with poor control. On the other hand, not all of the patients with white matter changes have a cognitive deficit. There may be a variety of reasons for this: the accuracy of the methods for blood pressure measurement, the specific brain localization, or some other reason. Here we present the preliminary results of a study of the potential correlation between self-measured, office-, and ambulatory-monitored blood pressure, central aortic blood pressure, minimal cognitive impairment and the specific brain image on contrast computer tomography. We expect to answer the question whether central aortic or self-measured blood pressure has the leading role in the development of cognitive impairment in the presence of a specific neuroimaging finding, as well as what the prerequisite is for the clinical manifestation of cognitive dysfunction in patients with computer tomographic pathology. (authors)

  1. The Square Kilometre Array Science Data Processor. Preliminary compute platform design

    International Nuclear Information System (INIS)

    Broekema, P.C.; Nieuwpoort, R.V. van; Bal, H.E.

    2015-01-01

    The Square Kilometre Array is a next-generation radio telescope, to be built in South Africa and Western Australia. It is currently in its detailed design phase, with procurement and construction scheduled to start in 2017. The SKA Science Data Processor is the high-performance computing element of the instrument, responsible for producing science-ready data. This is a major IT project, with the Science Data Processor expected to challenge the computing state of the art even in 2020. In this paper we introduce the preliminary Science Data Processor design and the principles that guide the design process, as well as the constraints on the design. We introduce a highly scalable and flexible system architecture capable of handling the SDP workload

  2. In vivo bioprinting for computer- and robotic-assisted medical intervention: preliminary study in mice

    International Nuclear Information System (INIS)

    Keriquel, Virginie; Guillemot, Fabien; Arnault, Isabelle; Guillotin, Bertrand; Amedee, Joelle; Fricain, Jean-Christophe; Catros, Sylvain; Miraux, Sylvain

    2010-01-01

    We present the first attempt to apply bioprinting technologies in the context of computer-assisted medical interventions. A workstation dedicated to high-throughput biological laser printing has been designed. Nano-hydroxyapatite (n-HA) was printed in the mouse calvaria defect model in vivo. Critical-size bone defects were created in the calvaria of OF-1 male mice with a 4 mm diameter trephine. Prior to the laser printing experiments, the absence of inflammation due to laser irradiation onto the mouse dura mater was shown by means of magnetic resonance imaging. Procedures for in vivo bioprinting and results obtained using decalcified sections and x-ray microtomography are discussed. Although heterogeneous, these preliminary results demonstrate that in vivo bioprinting is possible. Bioprinting may prove to be helpful in the future for medical robotics and computer-assisted medical interventions.

  3. In vivo bioprinting for computer- and robotic-assisted medical intervention: preliminary study in mice

    Energy Technology Data Exchange (ETDEWEB)

    Keriquel, Virginie; Guillemot, Fabien; Arnault, Isabelle; Guillotin, Bertrand; Amedee, Joelle; Fricain, Jean-Christophe; Catros, Sylvain [INSERM, U577, Bordeaux, F-33076 (France) and Universite Victor Segalen Bordeaux 2, UMR-S577 Bordeaux, F-33076 (France); Miraux, Sylvain [Centre de Resonance Magnetique des Systemes Biologiques, UMR 5536 (France)

    2010-03-15

    We present the first attempt to apply bioprinting technologies in the context of computer-assisted medical interventions. A workstation dedicated to high-throughput biological laser printing has been designed. Nano-hydroxyapatite (n-HA) was printed in the mouse calvaria defect model in vivo. Critical-size bone defects were created in the calvaria of OF-1 male mice with a 4 mm diameter trephine. Prior to the laser printing experiments, the absence of inflammation due to laser irradiation onto the mouse dura mater was shown by means of magnetic resonance imaging. Procedures for in vivo bioprinting and results obtained using decalcified sections and x-ray microtomography are discussed. Although heterogeneous, these preliminary results demonstrate that in vivo bioprinting is possible. Bioprinting may prove to be helpful in the future for medical robotics and computer-assisted medical interventions.

  4. Preliminary analysis of a target factory for laser fusion

    International Nuclear Information System (INIS)

    Sherohman, J.W.; Hendricks, C.D.

    1980-01-01

    An analysis of a target factory leading to the determination of production expressions has provided the basis for a parametric study. Parameters involving the input and output rates of a process system, processing yield factors, and multiple processing steps and production lines have been used to develop an understanding of their dependence on the rate of target injection for laser fusion. Preliminary results indicate that a parametric study of this type will be important in the selection of processing methods to be used in the final production scheme of a target factory

  5. Determinants of Trade Credit: A Preliminary Analysis on Construction Sector

    Directory of Open Access Journals (Sweden)

    Nicoleta Barbuta-Misu

    2016-07-01

    Full Text Available This paper introduces a preliminary analysis of the correlations between trade credit and selected measures of financial performance for a sample of 958 firms operating in the construction sector. The examined period covers 2004-2013. The sample, derived from the Amadeus database, contains firms that have both sold and bought on credit. Results showed that larger firms offered and used more credit than their counterparties. Firms offered and used credit at the same time, but not to the same extent. Firms with a higher return on assets and profit margin used less credit from suppliers and offered less credit to clients. Moreover, more liquid firms used fewer trade payables.

  6. Preliminary Evaluation of MapReduce for High-Performance Climate Data Analysis

    Science.gov (United States)

    Duffy, Daniel Q.; Schnase, John L.; Thompson, John H.; Freeman, Shawn M.; Clune, Thomas L.

    2012-01-01

    MapReduce is an approach to high-performance analytics that may be useful to data-intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. We are particularly interested in the potential of MapReduce to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we are prototyping a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. Our initial focus has been on averaging operations over arbitrary spatial and temporal extents within Modern-Era Retrospective Analysis for Research and Applications (MERRA) data. Preliminary results suggest this approach can improve efficiencies within data-intensive analytic workflows.
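
    As an illustrative aside, the averaging pattern described above can be sketched with a plain map and reduce in Python; the records below are invented stand-ins for data granules, not MERRA data, and a real system would run the two phases on a cluster rather than in one process.

        # Hypothetical sketch of the map/reduce averaging pattern: map emits keyed
        # partial sums, reduce combines them into per-key averages.
        from functools import reduce

        records = [
            {"year": 1980, "region": "tropics", "t_mean": 299.1, "n": 1000},
            {"year": 1980, "region": "tropics", "t_mean": 299.4, "n": 1200},
            {"year": 1981, "region": "tropics", "t_mean": 299.0, "n": 1100},
        ]

        # map: emit (key, (weighted sum, count)) for the spatial/temporal extent of interest
        mapped = [((r["year"], r["region"]), (r["t_mean"] * r["n"], r["n"]))
                  for r in records if r["region"] == "tropics"]

        # reduce: combine partial sums per key
        def combine(acc, item):
            key, (s, n) = item
            s0, n0 = acc.get(key, (0.0, 0))
            acc[key] = (s0 + s, n0 + n)
            return acc

        totals = reduce(combine, mapped, {})
        averages = {k: s / n for k, (s, n) in totals.items()}
        print(averages)   # {(1980, 'tropics'): ~299.26, (1981, 'tropics'): 299.0}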

  7. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  8. Preliminary CFD Analysis for HVAC System Design of a Containment Building

    Energy Technology Data Exchange (ETDEWEB)

    Son, Sung Man; Choi, Choengryul [ELSOLTEC, Yongin (Korea, Republic of); Choo, Jae Ho; Hong, Moonpyo; Kim, Hyungseok [KEPCO Engineering and Construction, Gimcheon (Korea, Republic of)

    2016-10-15

    HVAC (Heating, Ventilation, Air Conditioning) systems have mainly been designed based on overall heat balance and averaging concepts, which are simple and useful for designing the overall system. However, such a method has the disadvantage that it cannot predict the local flow and temperature distributions in a containment building. In this study, a preliminary CFD (Computational Fluid Dynamics) analysis was carried out to obtain detailed flow and temperature distributions in a containment building and to ensure that such information can be obtained via CFD analysis. This approach can also be useful for hydrogen analysis in an accident in which hydrogen is released into a containment building. We confirmed that CFD analysis can offer sufficiently detailed information about flow patterns and the temperature field and that the CFD technique is a useful tool for the HVAC design of nuclear power plants.

  9. Numerical Analysis of Multiscale Computations

    CERN Document Server

    Engquist, Björn; Tsai, Yen-Hsi R

    2012-01-01

    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.

  10. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    Full Text Available This theoretical essay aims to reflect on three models of text interpretation used in qualitative research, which are often confused in their concepts and methodologies (Content Analysis, Discourse Analysis, and Conversation Analysis). After presenting the concepts, the essay proposes a preliminary discussion of the conceptual and theoretical-methodological differences perceived between them. A review of the literature was performed to support this discussion. It could be verified that the models differ in the type of strategy used in the treatment of texts, the type of approach, and the appropriate theoretical position.

  11. Batch Computed Tomography Analysis of Projectiles

    Science.gov (United States)

    2016-05-01

    ARL-TR-7681, May 2016, US Army Research Laboratory. Batch Computed Tomography Analysis of Projectiles, by Michael C Golt, Chris M..., and Matthew S Bratcher, Weapons and Materials Research... ...values to account for projectile variability in the ballistic evaluation of armor. Subject terms: computed tomography, CT, BS41, projectiles

  12. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available This paper describes the basic statistical methods used in the genetic analysis of human traits: segregation analysis, linkage analysis, and allelic association studies. Software supporting the implementation of these methods has been developed.

  13. Enhanced Accident Tolerant Fuels for LWRS - A Preliminary Systems Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gilles Youinou; R. Sonat Sen

    2013-09-01

    The severe accident at Fukushima Daiichi nuclear plants illustrates the need for continuous improvements through developing and implementing technologies that contribute to safe, reliable and cost-effective operation of the nuclear fleet. Development of enhanced accident tolerant fuel contributes to this effort. These fuels, in comparison with the standard zircaloy – UO2 system currently used by the LWR industry, should be designed such that they tolerate loss of active cooling in the core for a longer time period (depending on the LWR system and accident scenario) while maintaining or improving the fuel performance during normal operations, operational transients, and design-basis events. This report presents a preliminary systems analysis related to most of these concepts. The potential impacts of these innovative LWR fuels on the front-end of the fuel cycle, on the reactor operation and on the back-end of the fuel cycle are succinctly described without having the pretension of being exhaustive. Since the design of these various concepts is still a work in progress, this analysis can only be preliminary and could be updated as the designs converge on their respective final version.

  14. Preliminary hazards analysis of thermal scrap stabilization system. Revision 1

    International Nuclear Information System (INIS)

    Lewis, W.S.

    1994-01-01

    This preliminary analysis examined the HA-21I glovebox and its supporting systems for potential process hazards. Upon further analysis, the thermal stabilization system has been installed in gloveboxes HC-21A and HC-21C. The use of HC-21C and HC-21A simplified the initial safety analysis. In addition, these gloveboxes were cleaner and required less modification for operation than glovebox HA-21I. While this document refers to glovebox HA-21I for the hazards analysis performed, glovebox HC-21C is sufficiently similar that the following analysis is also valid for HC-21C. This hazards analysis document is being re-released as Revision 1 to include the updated flowsheet document (Appendix C) and the updated design basis (Appendix D). The revised Process Flow Schematic has also been included (Appendix E). This current revision also incorporates the recommendations provided in the original hazards analysis. The System Design Description (SDD) has also been appended (Appendix H) to document the bases for the safety classification of thermal stabilization equipment

  15. Preliminary phytochemical analysis of the crude extract of Nephrolepis pectinata leaves

    Directory of Open Access Journals (Sweden)

    Natally Marreiros Gomes

    2017-06-01

    Full Text Available Nephrolepis pectinata, popularly known as the paulista fern, ladder-to-heaven, or cat's tail, belongs to the family Davalliaceae. Because of the beauty of the arrangement of their leaves, ferns are widely commercialized in Brazil; however, studies on their pharmacological potential have not been described in the literature. Thus, the objective of this research was to analyze the phytochemical properties of the crude extract of the leaves of Nephrolepis pectinata. To perform the phytochemical analysis, the plant material was first collected, a voucher specimen was prepared, and the material was washed, dried and ground. Extraction was then carried out by the percolation method, followed by the phytochemical analysis. Preliminary phytochemical results showed that the crude extract of the leaves of Nephrolepis pectinata tested positive for reducing sugars, phenols/tannins (catechin tannins) and catechins.

  16. Preliminary analysis of accident in SST-1 current feeder system

    International Nuclear Information System (INIS)

    Roy, Swati; Kanabar, Deven; Garg, Atul; Singh, Amit; Tanna, Vipul; Prasad, Upendra; Srinivasan, R.

    2017-01-01

    Steady-state Tokamak-1 (SST-1) has 16 superconducting toroidal field (TF) and 9 superconducting poloidal field (PF) coils rated for 10 kA DC. All the TF coils are connected in series and are operated in DC mode, whereas the PF coils are individually operated in pulsed mode during SST-1 campaigns. The SST-1 current feeder system (CFS) houses 9 pairs of PF current leads and 1 pair of TF current leads. During a past SST-1 campaign, there were arcing incidents within the SST-1 CFS chamber which caused significant damage to the PF superconducting current leads as well as to the helium cooling lines of the current leads. This paper presents a preliminary analysis of the arcing incident and its possible causes, and lays out the sequence of events. Based on this analysis and these observations, various measures to avoid such arcing incidents are also proposed. (author)

  17. Rayleigh to Compton ratio scatter tomography applied to breast cancer diagnosis: A preliminary computational study

    International Nuclear Information System (INIS)

    Antoniassi, M.; Conceição, A.L.C.; Poletti, M.E.

    2014-01-01

    In the present work, a tomographic technique based on the Rayleigh to Compton scattering ratio (R/C) was studied using computational simulation in order to assess its application to breast cancer diagnosis. In this preliminary study, some parameters that affect image quality were evaluated, such as: (i) beam energy, (ii) size and glandularity of the breast, and (iii) statistical count noise. The results showed that the R/C contrast increases with increasing photon energy and decreases with increasing glandularity of the sample. Statistical noise proved to be a significant parameter, although the quality of the obtained images was acceptable for a considerable range of noise levels. The preliminary results suggest that the R/C tomographic technique has the potential to be applied as a complementary tool in breast cancer diagnosis. - Highlights: ► A tomographic technique based on the Rayleigh to Compton scattering ratio is proposed in order to study breast tissues. ► The Rayleigh to Compton scattering ratio technique is compared with the conventional transmission technique. ► The influence of experimental parameters (energy, sample, detection system) is studied

  18. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    International Nuclear Information System (INIS)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-06-01

    A computational approach used for subsurface explosion cratering was extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for the first computer simulation because it is one of the most thoroughly studied craters. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s, meteorite mass of 1.67 x 10^8 kg, with a corresponding kinetic energy of 1.88 x 10^16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation, a Tillotson equation-of-state description for iron and limestone was used with no shear strength. Results obtained for this preliminary calculation of the formation of Meteor Crater are in good agreement with field measurements. A color movie based on this calculation was produced using computer-generated graphics. 19 figures, 5 tables, 63 references
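
    As a quick consistency check (not part of the original report), the quoted kinetic energy follows directly from the stated mass and impact velocity:

        # Kinetic energy implied by the initial conditions quoted above.
        m = 1.67e8            # meteorite mass, kg
        v = 15.0e3            # impact velocity, m/s
        E = 0.5 * m * v**2    # kinetic energy, J
        print(E)              # ~1.88e16 J
        print(E / 4.184e15)   # ~4.5 megatons TNT equivalent (1 Mt TNT = 4.184e15 J)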

  19. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    International Nuclear Information System (INIS)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-04-01

    A computational approach used for subsurface explosion cratering has been extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for our first computer simulation because it was the most thoroughly studied. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Shoemaker estimates that the impact occurred about 20,000 to 30,000 years ago [Roddy (1977)]. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s, meteorite mass of 1.67E+08 kg, with a corresponding kinetic energy of 1.88E+16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation a Tillotson equation-of-state description for iron and limestone was used with no shear strength. A color movie based on this calculation was produced using computer-generated graphics. Results obtained for this preliminary calculation of the formation of Meteor Crater, Arizona, are in good agreement with Meteor Crater measurements

  20. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-06-01

    A computational approach used for subsurface explosion cratering was extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for the first computer simulation because it is one of the most thoroughly studied craters. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s, meteorite mass of 1.67 x 10^8 kg, with a corresponding kinetic energy of 1.88 x 10^16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation, a Tillotson equation-of-state description for iron and limestone was used with no shear strength. Results obtained for this preliminary calculation of the formation of Meteor Crater are in good agreement with field measurements. A color movie based on this calculation was produced using computer-generated graphics. 19 figures, 5 tables, 63 references.

  1. Preliminary RAMI analysis of WCLL blanket and breeder systems

    International Nuclear Information System (INIS)

    Arroyo, Jose Manuel; Brown, Richard; Harman, Jon; Rosa, Elena; Ibarra, Angel

    2015-01-01

    Highlights: • A preliminary RAMI model for WCLL has been developed. • The critical parts and parameters influencing WCLL availability have been identified. • Necessary developments of tools/models to represent system performance have been identified. - Abstract: DEMO will be a prototype fusion reactor designed to prove the capability to produce electrical power in a commercially acceptable way. One of the key factors in that endeavor is the achievement of a certain level of plant availability. Therefore, RAMI (Reliability, Availability, Maintainability and Inspectability) will be a key element in the engineering development of DEMO. Studies have been started to develop the tools and models needed to assess different design alternatives from the RAMI point of view. The main objective of these studies is to be able to evaluate the influence of different parameters on DEMO availability and to identify the critical parts that should be further researched and improved in order to develop a high-availability oriented DEMO design. A preliminary RAMI analysis of the Water Cooled Lithium-Lead (WCLL) blanket and breeder concept for DEMO has been developed. The number of individual elements that may fail (e.g. more than 180,000 C-shaped tubes) and the mean down time associated with failures inside the vacuum vessel (around 3 months) have been highlighted as the critical parameters influencing system availability. In addition, the necessary developments of tools/models to better represent system performance have been identified and proposed for future work.

  2. Preliminary RAMI analysis of WCLL blanket and breeder systems

    Energy Technology Data Exchange (ETDEWEB)

    Arroyo, Jose Manuel, E-mail: josemanuel.arroyo@ciemat.es [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, Madrid (Spain); Brown, Richard [Culham Centre for Fusion Energy, Culham Science Centre, Abingdon (United Kingdom); Harman, Jon [EFDA Close Support Unit, Garching (Germany); Rosa, Elena; Ibarra, Angel [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, Madrid (Spain)

    2015-10-15

    Highlights: • A preliminary RAMI model for WCLL has been developed. • The critical parts and parameters influencing WCLL availability have been identified. • Necessary developments of tools/models to represent system performance have been identified. - Abstract: DEMO will be a prototype fusion reactor designed to prove the capability to produce electrical power in a commercially acceptable way. One of the key factors in that endeavor is the achievement of a certain level of plant availability. Therefore, RAMI (Reliability, Availability, Maintainability and Inspectability) will be a key element in the engineering development of DEMO. Studies have been started to develop the tools and models needed to assess different design alternatives from the RAMI point of view. The main objective of these studies is to be able to evaluate the influence of different parameters on DEMO availability and to identify the critical parts that should be further researched and improved in order to develop a high-availability oriented DEMO design. A preliminary RAMI analysis of the Water Cooled Lithium-Lead (WCLL) blanket and breeder concept for DEMO has been developed. The number of individual elements that may fail (e.g. more than 180,000 C-shaped tubes) and the mean down time associated with failures inside the vacuum vessel (around 3 months) have been highlighted as the critical parameters influencing system availability. In addition, the necessary developments of tools/models to better represent system performance have been identified and proposed for future work.

  3. Impact analysis on a massively parallel computer

    International Nuclear Information System (INIS)

    Zacharia, T.; Aramayo, G.A.

    1994-01-01

    Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper

  4. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  5. Preliminary analysis of biomass potentially useful for producing biodiesel

    International Nuclear Information System (INIS)

    Cabrera Cifuentes, Gerardo; Burbano Jaramillo, Juan Carlos; Garcia Melo, Jose Isidro

    2011-01-01

    Given that biodiesel is emerging as a viable solution for some energy and environmental problems, research on raw materials appropriate for its production is a matter of growing interest. In this study we present the results of research devoted to a preliminary analysis of several vegetable (biomass) species potentially useful for producing biodiesel. The bioprospection zone is a region on the Colombian Pacific coast. The candidate species collected underwent different standardized ASTM tests in order for us to define properties that facilitate their evaluation. Some of the species underwent a transesterification process. Comparisons between the thermo-physical properties of the biofuels obtained and the properties of commercial diesel were carried out. Also, performance tests for these biofuels were conducted in compression ignition engines, particularly evaluating efficiency, fuel consumption, and power at different RPMs.

  6. Preliminary radar systems analysis for Venus orbiter missions

    Science.gov (United States)

    Brandenburg, R. K.; Spadoni, D. J.

    1971-01-01

    A short, preliminary analysis is presented of the problems involved in mapping the surface of Venus with radar from an orbiting spacecraft. Two types of radar, the noncoherent side-looking and the focused synthetic-aperture systems, are sized to fulfill two assumed levels of Venus exploration. The two exploration levels, regional and local, assumed for this study are based on previous Astro Sciences work (Klopp 1969). The regional level is defined as 1 to 3 kilometer spatial and 0.5 to 1 km vertical resolution over 100 percent of the planet's surface. The local level is defined as 100 to 200 meter spatial and 50-10 m vertical resolution over about 100 percent of the surface (based on the regional survey). A 10-cm operating wavelength was chosen for both radar systems in order to minimize the antenna size and maximize the apparent radar cross section of the surface.

  7. Computational methods for corpus annotation and analysis

    CERN Document Server

    Lu, Xiaofei

    2014-01-01

    This book reviews computational tools for lexical, syntactic, semantic, pragmatic and discourse analysis, with instructions on how to obtain, install and use each tool. Covers studies using Natural Language Processing, and offers ideas for better integration.

  8. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  9. Preliminary Study on Hybrid Computational Phantom for Radiation Dosimetry Based on Subdivision Surface

    International Nuclear Information System (INIS)

    Jeong, Jong Hwi; Choi, Sang Hyoun; Cho, Sung Koo; Kim, Chan Hyeong

    2007-01-01

    The anthropomorphic computational phantoms are classified into two groups. One group is the stylized phantoms, or MIRD phantoms, which are based on mathematical representations of the anatomical structures. The shapes and positions of the organs and tissues in these phantoms can be adjusted by changing the coefficients of the equations in use. The other group is the voxel phantoms, which are based on tomographic images of a real person such as CT, MR and serially sectioned color slice images from a cadaver. Obviously, the voxel phantoms represent the anatomical structures of a human body much more realistically than the stylized phantoms. A realistic representation of anatomical structure is very important for an accurate calculation of radiation dose in the human body. Consequently, the ICRP recently has decided to use the voxel phantoms for the forthcoming update of the dose conversion coefficients. However, the voxel phantoms also have some limitations: (1) The topology and dimensions of the organs and tissues in a voxel model are extremely difficult to change, and (2) The thin organs, such as oral mucosa and skin, cannot be realistically modeled unless the voxel resolution is prohibitively high. Recently, a new approach has been implemented by several investigators. The investigators converted their voxel phantoms to hybrid computational phantoms based on NURBS (Non-Uniform Rational B-Splines) surface, which is smooth and deformable. It is claimed that these new phantoms have the flexibility of the stylized phantom along with the realistic representations of the anatomical structures. The topology and dimensions of the anatomical structures can be easily changed as necessary. Thin organs can be modeled without affecting computational speed or memory requirement. The hybrid phantoms can be also used for 4-D Monte Carlo simulations. In this preliminary study, the external shape of a voxel phantom (i.e., skin), HDRK-Man, was converted to a hybrid computational

  10. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  11. A computational description of simple mediation analysis

    Directory of Open Access Journals (Sweden)

    Caron, Pier-Olivier

    2018-04-01

    Full Text Available Simple mediation analysis is an increasingly popular statistical analysis in psychology and in other social sciences. However, there are very few detailed accounts of the computations within the model. Articles more often focus on explaining mediation analysis conceptually rather than mathematically. Thus, the purpose of the current paper is to introduce the computational modelling within simple mediation analysis, accompanied by examples in R. First, mediation analysis is described. Then, the method to simulate data in R (with standardized coefficients) is presented. Finally, the bootstrap method, the Sobel test and the Baron and Kenny test, all used to evaluate mediation (i.e., the indirect effect), are developed. The R code to implement the computations presented is offered, as well as a script to carry out a power analysis and a complete example.
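
    As an illustrative aside, a minimal sketch of the bootstrap test of the indirect effect is given below in Python rather than R (the paper itself uses R); the simulated data, path coefficients and number of resamples are arbitrary choices for illustration, not values from the paper.

        # Hypothetical sketch: percentile-bootstrap confidence interval for the
        # indirect effect a*b in simple mediation (X -> M -> Y).
        import numpy as np

        rng = np.random.default_rng(0)
        n = 200
        x = rng.normal(size=n)
        m = 0.5 * x + rng.normal(size=n)                 # mediator, path a = 0.5
        y = 0.4 * m + 0.1 * x + rng.normal(size=n)       # outcome, path b = 0.4, direct c' = 0.1

        def slope(pred, dep, covar=None):
            """OLS slope of dep on pred (optionally adjusting for one covariate)."""
            cols = [np.ones_like(pred), pred] if covar is None else [np.ones_like(pred), pred, covar]
            beta, *_ = np.linalg.lstsq(np.column_stack(cols), dep, rcond=None)
            return beta[1]

        boot = []
        for _ in range(2000):
            idx = rng.integers(0, n, n)                  # resample cases with replacement
            a = slope(x[idx], m[idx])                    # X -> M
            b = slope(m[idx], y[idx], covar=x[idx])      # M -> Y controlling for X
            boot.append(a * b)

        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")   # an interval excluding 0 supports mediation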

  12. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes having scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows the sensitivity indices of each scalar model input to be estimated, while the 'dispersion model' allows the total sensitivity index of the functional model inputs to be derived. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.
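
    As an illustrative aside, the first-order Sobol index of a scalar input can be estimated very crudely by binning, as sketched below for the standard Ishigami test function; this illustrates only the definition of the index and does not reproduce the joint GLM/GAM metamodeling or the treatment of functional inputs proposed in the paper.

        # Crude sketch: first-order Sobol index S_i = Var(E[Y|X_i]) / Var(Y),
        # estimated by binning X_i and averaging Y within each bin.
        import numpy as np

        rng = np.random.default_rng(1)
        N = 100_000
        X = rng.uniform(-np.pi, np.pi, size=(N, 3))

        # Ishigami function, a common sensitivity-analysis benchmark
        Y = np.sin(X[:, 0]) + 7.0 * np.sin(X[:, 1])**2 + 0.1 * X[:, 2]**4 * np.sin(X[:, 0])

        var_Y = Y.var()
        for i in range(3):
            edges = np.quantile(X[:, i], np.linspace(0.0, 1.0, 51))     # 50 equal-count bins
            which = np.clip(np.digitize(X[:, i], edges) - 1, 0, 49)
            cond_means = np.array([Y[which == b].mean() for b in range(50)])
            print(f"S_{i+1} ~ {cond_means.var() / var_Y:.2f}")          # roughly 0.31, 0.44, 0.00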

  13. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  14. Computer assisted functional analysis. Computer gestuetzte funktionelle Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, H A.E.; Roesler, H

    1982-01-01

    The latest developments in computer-assisted functional analysis (CFA) in nuclear medicine are presented in about 250 papers of the 19th international annual meeting of the Society of Nuclear Medicine (Bern, September 1981). Apart from the mathematical and instrumental aspects of CFA, computerized emission tomography is given particular attention. Advances in nuclear medical diagnosis in the fields of radiopharmaceuticals, cardiology, angiology, neurology, ophthalmology, pulmonology, gastroenterology, nephrology, endocrinology, oncology and osteology are discussed.

  15. Preliminary systems-interaction results from the Digraph Matrix Analysis of the Watts Bar Nuclear Power Plant safety-injection systems

    International Nuclear Information System (INIS)

    Sacks, I.J.; Ashmore, B.C.; Champney, J.M.; Alesso, H.P.

    1983-06-01

    This report provides preliminary results generated by a Digraph Matrix Analysis (DMA) for a Systems Interaction analysis performed on the Safety Injection System of the Tennessee Valley Authority Watts Bar Nuclear Power Plant. An overview of DMA is provided along with a brief description of the computer codes used in DMA

  16. DFT computational analysis of piracetam

    Science.gov (United States)

    Rajesh, P.; Gunasekaran, S.; Seshadri, S.; Gnanasambandan, T.

    2014-11-01

    Density functional theory calculations with B3LYP using the 6-31G(d,p) and 6-31++G(d,p) basis sets have been used to determine ground state molecular geometries. The first-order hyperpolarizability (β0) and related properties (β, α0 and Δα) of piracetam are calculated using the B3LYP/6-31G(d,p) method within the finite-field approach. The stability of the molecule has been analyzed by using NBO/NLMO analysis. The calculation of the first hyperpolarizability shows that the molecule is an attractive candidate for future applications in non-linear optics. The molecular electrostatic potential (MEP) at a point in the space around a molecule gives an indication of the net electrostatic effect produced at that point by the total charge distribution of the molecule. The calculated HOMO and LUMO energies show that charge transfer occurs within these molecules. Mulliken population analysis of the atomic charges is also performed. From the vibrational analysis, the thermodynamic properties of the title compound at different temperatures have been calculated. Finally, the UV-Vis spectra and electronic absorption properties are explained and illustrated from the frontier molecular orbitals.

  17. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies

  18. Preliminary analysis on incore performance of nuclear fuel: pt. 4

    International Nuclear Information System (INIS)

    Noh, S.K.; Chang, M.H.; Lee, C.C.; Chung, Y.H.; Kuk, K.Y.; Park, C.Y.; Lee, S.K.

    1981-01-01

    An analysis has been performed for the thermal-hydraulic design parameters of the Wolsung-1 reactor core in steady state with the help of the computer code COBRA-IV-I. The design parameters are coolant enthalpy, flow velocity, coolant quality, pressure, and fuel temperature distribution. The maximum power channel has been taken into account in this work. The results are in reasonable agreement with data from the PSR, with the maximum difference between this work and the PSR being 4.3%

  19. Synthesis, Preliminary Bioevaluation and Computational Analysis of Caffeic Acid Analogues

    Directory of Open Access Journals (Sweden)

    Zhiqian Liu

    2014-05-01

    Full Text Available A series of caffeic acid amides were designed, synthesized and evaluated for anti-inflammatory activity. Most of them exhibited promising anti-inflammatory activity against nitric oxide (NO) generation in murine macrophage RAW264.7 cells. A 3D pharmacophore model was created based on the biological results for further structural optimization. Moreover, prediction of the potential targets was also carried out with the PharmMapper server. These amide analogues represent a promising class of anti-inflammatory scaffolds for further exploration and target identification.

  20. Turbo Pascal Computer Code for PIXE Analysis

    International Nuclear Information System (INIS)

    Darsono

    2002-01-01

    To optimize the utilization of the 150 kV ion accelerator facilities and to master the analysis techniques using the ion accelerator, research and development of low-energy PIXE technology has been carried out. The R and D for the hardware of the low-energy PIXE installation in P3TM has been carried out since the year 2000. To support the R and D of the PIXE accelerator facilities, in harmony with the R and D of the PIXE hardware, the development of PIXE analysis software is also needed. The development of a database for PIXE analysis software using a Turbo Pascal computer code is reported in this paper. This computer code computes the ionization cross-section, the fluorescence yield, and the stopping power of elements; it also computes the attenuation coefficient as a function of X-ray energy. The computer code is named PIXEDASIS and it is part of a larger computer code planned for PIXE analysis that will be constructed in the near future. PIXEDASIS is designed to be communicative with the user. It takes input from the keyboard. The output is shown on the PC monitor and can also be printed. The performance test of PIXEDASIS shows that it operates well and provides data in agreement with data from other literature. (author)
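
    As an illustrative aside, the attenuation part of such a computation reduces to the Beer-Lambert law; a minimal sketch is given below in Python rather than Turbo Pascal, with a purely illustrative mass attenuation coefficient that is not a value from PIXEDASIS.

        # Hypothetical sketch of an X-ray attenuation computation (Beer-Lambert law).
        import math

        def transmitted_fraction(mu_rho, density, thickness):
            """I/I0 = exp(-(mu/rho) * rho * x); mu/rho in cm^2/g, rho in g/cm^3, x in cm."""
            return math.exp(-mu_rho * density * thickness)

        # Example: a hypothetical absorber with mu/rho = 0.2 cm^2/g and density 2.7 g/cm^3
        for x_cm in (0.01, 0.05, 0.10):
            print(f"x = {x_cm:4.2f} cm  ->  I/I0 = {transmitted_fraction(0.2, 2.7, x_cm):.3f}")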

  1. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with 'direct' and 'adjoint' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  2. Computer graphics in reactor safety analysis

    International Nuclear Information System (INIS)

    Fiala, C.; Kulak, R.F.

    1989-01-01

    This paper describes a family of three computer graphics codes designed to assist the analyst in three areas: the modelling of complex three-dimensional finite element models of reactor structures; the interpretation of computational results; and the reporting of the results of numerical simulations. The purpose and key features of each code are presented. The graphics output used in actual safety analysis are used to illustrate the capabilities of each code. 5 refs., 10 figs

  3. Preliminary CFD analysis methodology for flow in a LFR fuel assembly

    International Nuclear Information System (INIS)

    Catana, A.; Ioan, M.; Serbanel, M.

    2013-01-01

    In this paper a preliminary Computational Fluid Dynamics (CFD) analysis was performed in order to set up a methodology to be used for more complex coolant flow analyses inside the ALFRED nuclear reactor fuel assembly. The core contains 171 separate fuel assemblies, each consisting of a hexagonal array of 127 fuel rods. Three honeycomb spacer grids are proposed along the fuel rods with the aim of keeping the flow geometry intact during reactor operation. The main goal of this paper is to compute some hydraulic parameters: pressure, velocity, wall shear stress and turbulence parameters, with and without spacer grids. In this analysis we consider an adiabatic case; no heat transfer is considered so far, but this paves the way toward more complex thermal-hydraulic analyses for ALFRED (and LFRs in general). The CAELinux CFD distribution was used with its main components: Salome-Meca (for geometry and mesh) and Code_Saturne as the single-phase CFD solver. The ParaView and VisIt postprocessors were used for data extraction and graphical displays. (authors)

  4. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbations but requires the most computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes for criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recently increasing interest in reducing the administrative margin of subcriticality makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in the k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
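
    As an illustrative aside, the sampling-based approach mentioned above can be sketched as follows; the run_keff function is a stand-in for a real Monte Carlo transport calculation, and the response model and uncertainty magnitudes are invented for illustration only.

        # Hypothetical sketch: propagate an input (e.g., nuclear data) uncertainty
        # through repeated k_eff calculations and separate it from the statistical noise.
        import numpy as np

        rng = np.random.default_rng(42)

        def run_keff(scale):
            """Placeholder transport run: returns (k_eff estimate, its statistical std dev)."""
            return 0.95 * scale + rng.normal(0.0, 0.0005), 0.0005

        scales = rng.normal(1.0, 0.01, size=100)               # sampled cross-section scaling, +/- 1%
        runs = np.array([run_keff(s) for s in scales])
        k_samples, k_stat = runs[:, 0], runs[:, 1]

        sigma_total = k_samples.std(ddof=1)                    # observed spread includes Monte Carlo noise
        sigma_input = np.sqrt(max(sigma_total**2 - np.mean(k_stat**2), 0.0))
        print(f"k_eff = {k_samples.mean():.4f}, input-driven uncertainty ~ {sigma_input:.4f}")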

  5. ASTEC: Controls analysis for personal computers

    Science.gov (United States)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  6. Non-invasive coronary angiography with multislice computed tomography. Technology, methods, preliminary experience and prospects.

    Science.gov (United States)

    Traversi, Egidio; Bertoli, Giuseppe; Barazzoni, Giancarlo; Baldi, Maurizia; Tramarin, Roberto

    2004-02-01

    The recent technical developments in multislice computed tomography (MSCT), with ECG retro-gated image reconstruction, have elicited great interest in the possibility of accurate non-invasive imaging of the coronary arteries. The latest generation of MSCT systems, with 8-16 rows of detectors, permits acquisition of the whole cardiac volume during a single 15-20 s breath-hold with submillimetric definition of the images and an outstanding signal-to-noise ratio. Thus the race among MSCT, electron beam computed tomography and cardiac magnetic resonance imaging to provide routine and reliable imaging of the coronary arteries in clinical practice has recommenced. Currently available MSCT systems offer different options for both cardiac image acquisition and reconstruction, including multiplanar and curved multiplanar reconstruction, three-dimensional volume rendering, maximum intensity projection, and virtual angioscopy. In our preliminary experience including 176 patients with known or suspected coronary artery disease, MSCT was feasible in 161 (91.5%) and showed a sensitivity of 80.4% and a specificity of 80.3%, with respect to standard coronary angiography, in detecting critical stenosis in coronary arteries and arterial or venous bypass grafts. These results correspond to a positive predictive value of 58.6% and a negative predictive value of 92.2%. The true role that MSCT is likely to play in the future of non-invasive coronary imaging is still to be defined. Nevertheless, the huge amount of data obtainable by MSCT, along with rapid technological advances, shorter acquisition times and reconstruction algorithm developments, will make the technique stronger, and possible applications are expected not only for non-invasive coronary angiography, but also for cardiac function and myocardial perfusion evaluation, as an all-in-one examination.
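
    As an illustrative aside, the reported predictive values follow from the sensitivity, the specificity and the disease prevalence via Bayes' rule; the sketch below uses an assumed prevalence of about 26%, chosen only to show how PPV and NPV of this magnitude arise, since the prevalence itself is not quoted in the abstract.

        # Predictive values from sensitivity, specificity and (assumed) prevalence.
        def predictive_values(sens, spec, prev):
            ppv = sens * prev / (sens * prev + (1.0 - spec) * (1.0 - prev))
            npv = spec * (1.0 - prev) / (spec * (1.0 - prev) + (1.0 - sens) * prev)
            return ppv, npv

        ppv, npv = predictive_values(sens=0.804, spec=0.803, prev=0.26)
        print(f"PPV ~ {ppv:.1%}, NPV ~ {npv:.1%}")   # roughly 59% and 92%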

  7. Preliminary Analysis of a Submerged Wave Energy Device

    Science.gov (United States)

    Wagner, J. R.; Wagner, J. J.; Hayatdavoodi, M.; Ertekin, R. C.

    2016-02-01

    A preliminary analysis of a submerged wave energy harvesting device is presented. The device is composed of a thin, horizontally submerged plate that is restricted to heave oscillations under the influence of surface waves. The submerged plate oscillates, and it can be attached to a fixed rotor, or a piston, to harvest the wave energy. A fully submerged wave energy converter is preferred over a surface converter because of its durability and the smaller visual and physical intrusion it presents. In this study, the device is subjected to nonlinear shallow-water waves. Wave loads on the submerged oscillating plate are obtained via the Level I Green-Naghdi equations. The unsteady motion of the plate is obtained by solving the nonlinear equations of motion. Results are obtained for a range of waves with varying heights and periods. The amplitude and period of the plate oscillations are analyzed as functions of the wave parameters and plate width. Particular attention is given to the selection of the site with the desired wave field. An initial estimate of the amount of energy extracted by the device, located near shore at a given site, is provided.

  8. Preliminary radiation criteria and nuclear analysis for ETF

    International Nuclear Information System (INIS)

    Engholm, B.A.

    1980-09-01

    Preliminary biological and materials radiation dose criteria for the Engineering Test Facility are described and tabulated. In keeping with the ETF Mission Statement, a key biological dose criterion is a 24-hour shutdown dose rate of 2 mrem/hr on the surface of the outboard bulk shield. Materials dose criteria, which primarily govern the inboard shield design, include a 10⁹ rads exposure limit to epoxy insulation, 3 × 10⁻⁴ dpa damage to the TF coil copper stabilizer, and a total nuclear heating rate of 5 kW in the inboard TF coils. Nuclear analysis performed during FY 80 was directed primarily at the inboard and outboard bulk shielding, and at radiation streaming in the neutral beam drift ducts. Inboard and outboard shield thicknesses to achieve the biological and materials radiation criteria are 75 cm inboard and 125 cm outboard, the configuration consisting of alternating layers of stainless steel and borated water. The outboard shield also includes a 5 cm layer of lead. NBI duct streaming analyses performed by ORNL and LASL will play a key role in the design of the duct and NBI shielding in FY 81. The NBI aluminum cryopanel nuclear heating rate during the heating cycle is about 1 milliwatt/cm³, which is far less than the permissible limit

  9. Preliminary analysis of accelerated space flight ionizing radiation testing

    Science.gov (United States)

    Wilson, J. W.; Stock, L. V.; Carter, D. J.; Chang, C. K.

    1982-01-01

    A preliminary analysis shows that radiation dose equivalent to 30 years in the geosynchronous environment can be accumulated in a typical composite material exposed to space for 2 years or less onboard a spacecraft orbiting from perigee of 300 km out to the peak of the inner electron belt (approximately 2750 km). Future work to determine spacecraft orbits better tailored to materials accelerated testing is indicated. It is predicted that a range of 10 to the 9th power to 10 to the 10th power rads would be accumulated in 3-6 mil thick epoxy/graphite exposed by a test spacecraft orbiting in the inner electron belt. This dose is equivalent to the accumulated dose that this material would be expected to have after 30 years in a geosynchronous orbit. It is anticipated that material specimens would be brought back to Earth after 2 years in the radiation environment so that space radiation effects on materials could be analyzed by laboratory methods.

  10. Investigation of Sorption and Diffusion Mechanisms, and Preliminary Economic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bhave, Ramesh R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jubin, Robert Thomas [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Spencer, Barry B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nair, Sankar [Georgia Inst. of Technology, Atlanta, GA (United States)

    2017-02-01

    This report describes the synthesis and evaluation of molecular sieve zeolite membranes to separate and concentrate tritiated water (HTO) from dilute HTO-bearing aqueous streams. Several monovalent and divalent cation-exchanged silicoaluminophosphate (SAPO-34) molecular sieve zeolite membranes were synthesized on disk supports and characterized with gas and vapor permeation measurements. The pervaporation process performance was evaluated for the separation and concentration of tritiated water. Experiments were performed using a tritiated water feed solution containing tritium at the high end of the range (1 mCi/mL) anticipated in a nuclear fuel processing system that includes recycling of both acid and water streams. The tritium concentration was about 0.1 ppm. The permeate was recovered under vacuum. The HTO/H2O selectivity and separation factor calculated from the measured tritium concentrations ranged from 0.99 to 1.23 and from 0.83 to 0.98, respectively. Although the membrane performance for HTO separation was lower than expected, several encouraging observations including molecular sieving and high vapor permeance are reported. Additionally, several new approaches are proposed, such as tuning the sorption and diffusion properties offered by small-pore LTA zeolite materials, and cation-exchanged aluminosilicates with high metal loading. It is hypothesized that substantially improved preferential transport of tritium (HTO) resulting in a more concentrated permeate can be achieved. Preliminary economic analysis for the membrane-based process to concentrate tritiated water is also discussed.
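
    As a simple illustration of the quantities quoted above, the sketch below evaluates a pervaporation separation factor and a permeate-to-feed ratio from tritium mole fractions. The feed and permeate values are invented, and the exact definitions used in the report may differ.

        # Separation factor and simple permeate/feed ratio for HTO/H2O pervaporation.
        # Concentrations are illustrative, not data from the report.
        def separation_factor(x_feed, y_perm):
            """alpha = (y/(1-y)) / (x/(1-x)) for mole fractions of the tritiated species."""
            return (y_perm / (1.0 - y_perm)) / (x_feed / (1.0 - x_feed))

        x_feed = 1.0e-7       # HTO mole fraction in the feed (~0.1 ppm)
        y_perm = 0.95e-7      # HTO mole fraction in the permeate (assumed)

        alpha = separation_factor(x_feed, y_perm)
        ratio = y_perm / x_feed          # simple permeate/feed concentration ratio
        print(f"separation factor = {alpha:.3f}, permeate/feed ratio = {ratio:.3f}")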

  11. Preliminary analysis of public dose from CFETR gaseous tritium release

    Energy Technology Data Exchange (ETDEWEB)

    Nie, Baojie [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui 230031 (China); University of Science and Technology of China, Hefei, Anhui 230027 (China); Ni, Muyi, E-mail: muyi.ni@fds.org.cn [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui 230031 (China); Lian, Chao; Jiang, Jieqiong [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui 230031 (China)

    2015-02-15

    Highlights: • Present the amounts and dose limits for tritium release to the environment for CFETR. • Perform a preliminary simulation of the radiation dose from gaseous tritium release. • Sensitivity analysis was performed for key parameters: soil type, wind speed, stability class, effective release height and age group. • The tritium release amount is recalculated consistently with the dose limit in Chinese regulation for CFETR. - Abstract: To demonstrate tritium self-sufficiency and other engineering issues, the scientific conception of the Chinese Fusion Engineering Test Reactor (CFETR) has been proposed in China in parallel with ITER and ahead of a DEMO reactor. Tritium environmental safety for CFETR is an important issue and must be evaluated because of the huge amounts of tritium circulating in the reactor. In this work, the different tritium release scenarios of CFETR and the dose limit regulations in China are introduced, and the public dose is preliminarily analyzed under normal and accidental events. Furthermore, after a sensitivity analysis of the key input parameters, the public dose is reevaluated based on extreme parameters. Finally, the tritium release amount is recalculated consistently with the dose limit in Chinese regulation for CFETR, which would provide a reference for the tritium system design of CFETR.

  12. Cost risk analysis of radioactive waste management Preliminary study

    International Nuclear Information System (INIS)

    Forsstroem, J.

    2006-12-01

    This work begins with an exposition of the basics of risk analysis. These basics are then applied to the Finnish radioactive waste disposal environment, in which the nuclear power companies are responsible for all costs of radioactive waste management, including long-term disposal of spent fuel. The nuclear power companies prepare cost estimates of the waste disposal on a yearly basis to support decision making on the accumulation of resources in the nuclear waste disposal fund. These cost estimates are based on the cost level of the ongoing year. A Monte Carlo simulation model of the costs of the waste disposal system was defined and used to produce preliminary results on its cost risk characteristics. Input data was synthesised by modifying the original coefficients of cost uncertainty to define a cost range for each cost item. This is a suitable method for demonstrating results obtainable by the model, but it is not accurate enough to support decision making. Two key areas of further development were identified: the preparation of input data, and the identification and handling (i.e. elimination or merging) of interacting cost elements in the simulation model. Further development in both of the mentioned areas can be carried out in co-operation with the power companies, as they are the sources of the original data. (orig.)
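
    The cost-risk approach described above can be sketched as a Monte Carlo summation over cost items, each sampled from a range derived from its uncertainty coefficient. The items, ranges and triangular distributions below are placeholders, not the Finnish programme's actual data.

        # Monte Carlo sketch of total waste-management cost from per-item ranges.
        import numpy as np

        rng = np.random.default_rng(seed=1)

        # (low, most likely, high) cost per item in M EUR -- illustrative values only
        cost_items = {
            "encapsulation plant": (300.0, 400.0, 550.0),
            "repository construction": (500.0, 700.0, 1000.0),
            "operation": (800.0, 900.0, 1100.0),
            "closure and monitoring": (100.0, 150.0, 250.0),
        }

        n = 100_000
        total = np.zeros(n)
        for low, mode, high in cost_items.values():
            total += rng.triangular(low, mode, high, size=n)   # items assumed independent

        p5, p50, p95 = np.percentile(total, [5, 50, 95])
        print(f"total cost: median {p50:.0f} M EUR, 90% interval [{p5:.0f}, {p95:.0f}] M EUR")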

  13. Temporal fringe pattern analysis with parallel computing

    International Nuclear Information System (INIS)

    Tuck Wah Ng; Kar Tien Ang; Argentini, Gianluca

    2005-01-01

    Temporal fringe pattern analysis is invaluable in transient phenomena studies but necessitates long processing times. Here we describe a parallel computing strategy based on the single-program multiple-data model and hyperthreading processor technology to reduce the execution time. In a two-node cluster workstation configuration we found that execution periods were reduced by 1.6 times when four virtual processors were used. To allow even lower execution times with an increasing number of processors, the time allocated for data transfer, data read, and waiting should be minimized. Parallel computing is found here to present a feasible approach to reduce execution times in temporal fringe pattern analysis
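
    A minimal single-program multiple-data sketch of the strategy described above: frames of a temporal fringe sequence are distributed over a pool of worker processes and the wall time is compared with the serial time to estimate the speedup. The per-frame work here is a placeholder FFT, not the authors' phase-extraction algorithm, and the frame sizes are arbitrary.

        # SPMD-style parallel processing of fringe frames with a process pool.
        import time
        import numpy as np
        from multiprocessing import Pool

        def analyse_frame(frame):
            # Placeholder per-frame work (real phase extraction would go here).
            return np.abs(np.fft.fft2(frame)).sum()

        if __name__ == "__main__":
            frames = [np.random.rand(512, 512) for _ in range(64)]

            t0 = time.perf_counter()
            serial = [analyse_frame(f) for f in frames]
            t_serial = time.perf_counter() - t0

            t0 = time.perf_counter()
            with Pool(processes=4) as pool:
                parallel = pool.map(analyse_frame, frames)
            t_parallel = time.perf_counter() - t0

            print(f"speedup ~ {t_serial / t_parallel:.2f} on 4 workers")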

  14. A computer program for activation analysis

    International Nuclear Information System (INIS)

    Rantanen, J.; Rosenberg, R.J.

    1983-01-01

    A computer program for calculating the results of activation analysis is described. The program comprises two gamma spectrum analysis programs, STOAV and SAMPO and one program for calculating elemental concentrations, KVANT. STOAV is based on a simple summation of channels and SAMPO is based on fitting of mathematical functions. The programs are tested by analyzing the IAEA G-1 test spectra. In the determination of peak location SAMPO is somewhat better than STOAV and in the determination of peak area SAMPO is more than twice as accurate as STOAV. On the other hand, SAMPO is three times as expensive as STOAV with the use of a Cyber 170 computer. (author)
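
    The two approaches contrasted above can be illustrated on a single synthetic gamma peak: a channel-summation estimate with a simple background correction (STOAV-like) versus a Gaussian-plus-background fit (SAMPO-like). The spectrum and peak parameters below are invented for the example.

        # Peak area by channel summation vs. Gaussian fitting on a synthetic gamma peak.
        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(0)
        ch = np.arange(200)
        true_area, centre, sigma = 5000.0, 100.0, 3.0
        expected = 50.0 + true_area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-(ch - centre) ** 2 / (2 * sigma ** 2))
        counts = rng.poisson(expected).astype(float)

        # (a) summation: counts in a window minus a linear background estimate
        lo, hi = 88, 112
        bkg = 0.5 * (counts[lo - 5:lo].mean() + counts[hi + 1:hi + 6].mean()) * (hi - lo + 1)
        area_sum = counts[lo:hi + 1].sum() - bkg

        # (b) Gaussian + constant background fit
        def model(x, area, mu, sig, b):
            return b + area / (sig * np.sqrt(2 * np.pi)) * np.exp(-(x - mu) ** 2 / (2 * sig ** 2))

        popt, _ = curve_fit(model, ch, counts, p0=[3000.0, 100.0, 2.0, 40.0])
        print(f"summation area ~ {area_sum:.0f}, fitted area ~ {popt[0]:.0f}, true {true_area:.0f}")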

  15. Grid-connected ICES: preliminary feasibility analysis and evaluation. Volume 2. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1977-06-30

    The HEAL Complex in New Orleans will serve as a Demonstration Community for which the ICES Demonstration System will be designed. The complex is a group of hospitals, clinics, research facilities, and medical educational facilities. The five tasks reported on are: preliminary energy analysis; preliminary institutional assessment; conceptual design; firming-up of commitments; and detailed work management plan.

  16. Preliminary safety analysis of molten salt breeder reactor

    International Nuclear Information System (INIS)

    Cheng Maosong; Dai Zhimin

    2013-01-01

    Background: The molten salt reactor is one of the six advanced reactor concepts identified by the Generation IV International Forum as a candidate for cooperative development; it is characterized by remarkable advantages in inherent safety, fuel cycle, miniaturization, effective utilization of nuclear resources and proliferation resistance. ORNL completed the conceptual design of the Molten Salt Breeder Reactor (MSBR) based on the design, construction and operation of the Molten Salt Reactor Experiment (MSRE). Purpose: We attempt to carry out a preliminary safety analysis of the MSBR in order to provide a reference for its future design and optimization. Methods: Based on the conceptual design of the MSBR, a safety analysis model using point kinetics coupled with a simplified heat transfer mechanism is presented. The model is applied to simulate transients of the MSBR initiated by abnormal step and ramp reactivity additions at the full-power equilibrium condition. Results: The thermal power in the core increases rapidly at the beginning and is accompanied by a rise of the fuel and graphite temperatures after 100, 300, 500 and 600 pcm reactivity additions. The maximum outlet temperature of the fuel in the core reaches 1250°C for a 500 pcm reactivity addition, but rises to 1350°C for a 600 pcm addition. The power and temperature maxima are delayed and lower for a ramp reactivity addition than for a step addition. Conclusions: Based on the results, when the inserted reactivity is at most 500 pcm at the full-power equilibrium condition, the Hastelloy-N structural material does not melt and keeps its integrity without external control action. It is also necessary to avoid inserting large reactivity over a short time. (authors)
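
    A minimal sketch of the modelling approach described (point kinetics coupled to a lumped heat balance with temperature feedback) for a step reactivity insertion is given below. A single delayed-neutron group is used and all constants are generic illustrative values, not MSBR design data.

        # Point kinetics (one delayed group) + lumped fuel heat balance with feedback.
        import numpy as np
        from scipy.integrate import solve_ivp

        beta, Lambda, lam = 0.0065, 1.0e-4, 0.08   # generic kinetics constants (assumed)
        alpha_T = -3.0e-5          # fuel temperature feedback, dk/k per K (assumed)
        P0, T0 = 2250.0e6, 700.0   # nominal power (W) and fuel temperature (C), assumed
        mc = 5.0e7                 # lumped fuel heat capacity, J/K (assumed)
        h = P0 / 200.0             # heat removal slope: +P0 per 200 K rise (assumed)
        rho_step = 300e-5          # 300 pcm step insertion

        def rhs(t, y):
            p, c, T = y                              # p = P/P0, c = scaled precursors
            rho = rho_step + alpha_T * (T - T0)
            dp = (rho - beta) / Lambda * p + lam * c
            dc = beta / Lambda * p - lam * c
            dT = (P0 * p - P0 - h * (T - T0)) / mc   # linearized about the nominal point
            return [dp, dc, dT]

        y0 = [1.0, beta / (lam * Lambda), T0]        # equilibrium initial conditions
        sol = solve_ivp(rhs, (0.0, 30.0), y0, method="Radau", max_step=0.01)
        print(f"peak power ~ {sol.y[0].max():.2f} x nominal, peak T ~ {sol.y[2].max():.0f} C")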

  17. Conversion Preliminary Safety Analysis Report for the NIST Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Diamond, D. J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Baek, J. S. [Brookhaven National Lab. (BNL), Upton, NY (United States); Hanson, A. L. [Brookhaven National Lab. (BNL), Upton, NY (United States); Cheng, L-Y [Brookhaven National Lab. (BNL), Upton, NY (United States); Brown, N. [Brookhaven National Lab. (BNL), Upton, NY (United States); Cuadra, A. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2015-01-30

    The NIST Center for Neutron Research (NCNR) is a reactor-laboratory complex providing the National Institute of Standards and Technology (NIST) and the nation with a world-class facility for the performance of neutron-based research. The heart of this facility is the NIST research reactor (aka NBSR); a heavy water moderated and cooled reactor operating at 20 MW. It is fueled with high-enriched uranium (HEU) fuel elements. A Global Threat Reduction Initiative (GTRI) program is underway to convert the reactor to low-enriched uranium (LEU) fuel. This program includes the qualification of the proposed fuel, uranium and molybdenum alloy foil clad in an aluminum alloy, and the development of the fabrication techniques. This report is a preliminary version of the Safety Analysis Report (SAR) that would be submitted to the U.S. Nuclear Regulatory Commission (NRC) for approval prior to conversion. The report follows the recommended format and content from the NRC codified in NUREG-1537, “Guidelines for Preparing and Reviewing Applications for the Licensing of Non-power Reactors,” Chapter 18, “Highly Enriched to Low-Enriched Uranium Conversions.” The emphasis in any conversion SAR is to explain the differences between the LEU and HEU cores and to show the acceptability of the new design; there is no need to repeat information regarding the current reactor that will not change upon conversion. Hence, as seen in the report, the bulk of the SAR is devoted to Chapter 4, Reactor Description, and Chapter 13, Safety Analysis.

  18. Safety analysis of control rod drive computers

    International Nuclear Information System (INIS)

    Ehrenberger, W.; Rauch, G.; Schmeil, U.; Maertz, J.; Mainka, E.U.; Nordland, O.; Gloee, G.

    1985-01-01

    The analysis of the most significant user programmes revealed no errors in these programmes. The evaluation of approximately 82 cumulated years of operation demonstrated that the operating system of the control rod positioning processor has a reliability that is sufficiently good for the tasks this computer has to fulfil. Computers can be used for safety relevant tasks. The experience gained with the control rod positioning processor confirms that computers are not less reliable than conventional instrumentation and control system for comparable tasks. The examination and evaluation of computers for safety relevant tasks can be done with programme analysis or statistical evaluation of the operating experience. Programme analysis is recommended for seldom used and well structured programmes. For programmes with a long, cumulated operating time a statistical evaluation is more advisable. The effort for examination and evaluation is not greater than the corresponding effort for conventional instrumentation and control systems. This project has also revealed that, where it is technologically sensible, process controlling computers or microprocessors can be qualified for safety relevant tasks without undue effort. (orig./HP) [de

  19. Surface computing and collaborative analysis work

    CERN Document Server

    Brown, Judith; Gossage, Stevenson; Hack, Chris

    2013-01-01

    Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personne...

  20. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  1. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  2. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  3. First fungal genome sequence from Africa: A preliminary analysis

    Directory of Open Access Journals (Sweden)

    Rene Sutherland

    2012-01-01

    Full Text Available Some of the most significant breakthroughs in the biological sciences this century will emerge from the development of next generation sequencing technologies. The ease of availability of DNA sequence made possible through these new technologies has given researchers opportunities to study organisms in a manner that was not possible with Sanger sequencing. Scientists will, therefore, need to embrace genomics, as well as develop and nurture the human capacity to sequence genomes and utilise the 'tsunami' of data that emerge from genome sequencing. In response to these challenges, we sequenced the genome of Fusarium circinatum, a fungal pathogen of pine that causes pitch canker, a disease of great concern to the South African forestry industry. The sequencing work was conducted in South Africa, making F. circinatum the first eukaryotic organism for which the complete genome has been sequenced locally. Here we report on the process that was followed to sequence, assemble and perform a preliminary characterisation of the genome. Furthermore, details of the computer annotation and manual curation of this genome are presented. The F. circinatum genome was found to be nearly 44 million bases in size, which is similar to that of four other Fusarium genomes that have been sequenced elsewhere. The genome contains just over 15 000 open reading frames, which is less than that of the related species, Fusarium oxysporum, but more than that for Fusarium verticillioides. Amongst the various putative gene clusters identified in F. circinatum, those encoding the secondary metabolites fumonisin and fusarin appeared to harbour evidence of gene translocation. It is anticipated that similar comparisons of other loci will provide insights into the genetic basis for pathogenicity of the pitch canker pathogen. Perhaps more importantly, this project has engaged a relatively large group of scientists

  4. Computational analysis of a multistage axial compressor

    Science.gov (United States)

    Mamidoju, Chaithanya

    Turbomachines are used extensively in Aerospace, Power Generation, and Oil & Gas Industries. Efficiency of these machines is often an important factor and has led to the continuous effort to improve the design to achieve better efficiency. The axial flow compressor is a major component in a gas turbine, with the turbine's overall performance depending strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade passage analysis, Quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15-stage axial compressor is analyzed using a 3-D Navier-Stokes CFD solver in a parallel computing environment. Methodology is described for steady-state (frozen rotor-stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, tip clearance and numerical issues such as turbulence model choice, advection model choice, and parallel processing performance are analyzed. A high sensitivity of the predictions to the above was found. Physical explanations of the flow features observed in the computational study are given. The total pressure rise versus mass flow rate was computed.

  5. An explorative study of the technology transfer coach as a preliminary for the design of a computer aid

    OpenAIRE

    Jönsson, Oscar

    2014-01-01

    The university technology transfer coach has an important role in supporting the commercialization of research results. This thesis has studied the technology transfer coach and their needs in the coaching process. The goal has been to investigate information needs of the technology transfer coach as a preliminary for the design of computer aids. Using a grounded theory approach, we interviewed 17 coaches working in the Swedish technology transfer environment. Extracted quotes from interviews ...

  6. Risk Assessment of Healthcare Waste by Preliminary Hazard Analysis Method

    Directory of Open Access Journals (Sweden)

    Pouran Morovati

    2017-09-01

    Full Text Available Introduction and purpose: Improper management of healthcare waste (HCW can pose considerable risks to human health and the environment and cause serious problems in developing countries such as Iran. In this study, we sought to determine the hazards of HCW in the public hospitals affiliated to Abadan School of Medicine using the preliminary hazard analysis (PHA method. Methods: In this descriptive and analytic study, health risk assessment of HCW in government hospitals affiliated to Abadan School of Medicine (4 public hospitals was carried out by using PHA in the summer of  2016. Results: We noted the high risk of sharps and infectious wastes. Considering the dual risk of injury and disease transmission, sharps were classified in the very high-risk group, and pharmaceutical and chemical and radioactive wastes were classified in the medium-risk group. Sharps posed the highest risk, while pharmaceutical and chemical wastes had the lowest risk. Among the various stages of waste management, the waste treatment stage was the most hazardous in all the studied hospitals. Conclusion: To diminish the risks associated with healthcare waste management in the studied hospitals, adequate training of healthcare workers and care providers, provision of suitable personal protective and transportation equipment, and supervision of the environmental health manager of hospitals should be considered by the authorities.  
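
    The preliminary hazard analysis ranking used in the study amounts to combining a severity class and a probability class in a risk matrix. A sketch of such a matrix is shown below; the class boundaries and the example waste streams are assumptions for illustration, not the study's actual scoring.

        # Preliminary hazard analysis: risk ranking from severity x probability classes.
        SEVERITY = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}
        PROBABILITY = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4, "frequent": 5}

        def risk_class(severity, probability):
            score = SEVERITY[severity] * PROBABILITY[probability]
            if score >= 15:
                return "very high"
            if score >= 9:
                return "high"
            if score >= 5:
                return "medium"
            return "low"

        # Illustrative waste streams (classes assumed for the example only)
        hazards = {
            "sharps": ("catastrophic", "probable"),
            "infectious waste": ("critical", "probable"),
            "pharmaceutical/chemical waste": ("critical", "remote"),
        }
        for name, (sev, prob) in hazards.items():
            print(f"{name:32s} -> {risk_class(sev, prob)}")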

  7. Elastic and inelastic methods of piping systems analysis: a preliminary review

    International Nuclear Information System (INIS)

    Reich, M.; Esztergar, E.P.; Spence, J.; Boyle, J.; Chang, T.Y.

    1975-02-01

    A preliminary review of the methods used for elastic and inelastic piping system analysis is presented. The following principal conclusions are reached: techniques for the analysis of complex piping systems operating in the high temperature creep regime should be further developed; accurate analysis of a complete pipework system in creep using the 'complete shell finite element method' is not feasible at present, and the 'reduced shell finite element method' still requires excessive computer time and also requires further investigation regarding the compatibility problems associated with the pipe bend element, particularly when applied to cases involving general loading conditions; and with the current size of proposed high temperature systems requiring the evaluation of long-term operating life (30 to 40 years), it is important to adopt a simplified analysis method. A design procedure for a simplified analysis method based on currently available techniques applied in a three-stage approach is outlined. The work required to implement these procedures, together with desirable future developments, is also briefly discussed. Other proposed simplified approximations are also reviewed in the text. 101 references. (U.S.)

  8. Computer-Aided Diagnosis Based on Convolutional Neural Network System for Colorectal Polyp Classification: Preliminary Experience.

    Science.gov (United States)

    Komeda, Yoriaki; Handa, Hisashi; Watanabe, Tomohiro; Nomura, Takanobu; Kitahashi, Misaki; Sakurai, Toshiharu; Okamoto, Ayana; Minami, Tomohiro; Kono, Masashi; Arizumi, Tadaaki; Takenaka, Mamoru; Hagiwara, Satoru; Matsui, Shigenaga; Nishida, Naoshi; Kashida, Hiroshi; Kudo, Masatoshi

    2017-01-01

    Computer-aided diagnosis (CAD) is becoming a next-generation tool for the diagnosis of human disease. CAD for colon polyps has been suggested as a particularly useful tool for trainee colonoscopists, as the use of a CAD system avoids the complications associated with endoscopic resections. In addition to conventional CAD, a convolutional neural network (CNN) system utilizing artificial intelligence (AI) has been developing rapidly over the past 5 years. We attempted to generate a unique CNN-CAD system with an AI function that studied endoscopic images extracted from movies obtained with colonoscopes used in routine examinations. Here, we report our preliminary results of this novel CNN-CAD system for the diagnosis of colon polyps. A total of 1,200 images from cases of colonoscopy performed between January 2010 and December 2016 at Kindai University Hospital were used. These images were extracted from the video of actual endoscopic examinations. Additional video images from 10 cases of unlearned processes were retrospectively assessed in a pilot study. They were simply diagnosed as either an adenomatous or nonadenomatous polyp. The number of images used by AI to learn to distinguish adenomatous from nonadenomatous was 1,200:600. These images were extracted from the videos of actual endoscopic examinations. The size of each image was adjusted to 256 × 256 pixels. A 10-fold cross-validation was carried out. The accuracy of the 10-fold cross-validation is 0.751, where the accuracy is the ratio of the number of correct answers over the number of all the answers produced by the CNN. The decisions by the CNN were correct in 7 of 10 cases. A CNN-CAD system using routine colonoscopy might be useful for the rapid classification of colorectal polyps. Further prospective studies in an in vivo setting are required to confirm the effectiveness of a CNN-CAD system in routine colonoscopy. © 2017 S. Karger AG, Basel.
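
    The accuracy figure quoted above comes from 10-fold cross-validation. The sketch below shows that procedure schematically, with a generic scikit-learn classifier standing in for the authors' CNN and random features standing in for the endoscopic images (so the resulting accuracy is near chance; only the mechanics of the fold loop are illustrated).

        # 10-fold cross-validation accuracy, with a stand-in classifier and synthetic data.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import StratifiedKFold
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(1200, 64))              # placeholder image features
        y = rng.integers(0, 2, size=1200)            # 0 = nonadenomatous, 1 = adenomatous

        accs = []
        splitter = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
        for train_idx, test_idx in splitter.split(X, y):
            clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
            accs.append(accuracy_score(y[test_idx], clf.predict(X[test_idx])))

        print(f"mean 10-fold accuracy: {np.mean(accs):.3f}")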

  9. Computation system for nuclear reactor core analysis

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, along with aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals
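
    The diffusion-theory eigenvalue problem that codes such as VENTURE solve in up to three dimensions and many energy groups can be illustrated in its simplest form: a one-group, one-dimensional slab discretized by finite differences and solved by power iteration for k-effective. The cross sections below are arbitrary illustrative numbers, not data from the report.

        # One-group, 1-D slab diffusion eigenvalue problem via power iteration.
        import numpy as np

        D, siga, nusigf = 1.0, 0.07, 0.08     # cm, 1/cm, 1/cm (illustrative one-group data)
        L, n = 100.0, 200                     # slab width (cm) and number of mesh cells
        h = L / n

        # Finite-difference loss operator (-D d2/dx2 + siga) with zero-flux boundaries
        A = np.zeros((n, n))
        for i in range(n):
            A[i, i] = 2.0 * D / h**2 + siga
            if i > 0:
                A[i, i - 1] = -D / h**2
            if i < n - 1:
                A[i, i + 1] = -D / h**2

        phi = np.ones(n)
        k = 1.0
        for _ in range(200):                  # power iteration
            source = nusigf * phi
            phi_new = np.linalg.solve(A, source / k)
            k *= phi_new.sum() / phi.sum()    # update k from the fission source ratio
            phi = phi_new

        print(f"k-effective ~ {k:.4f}")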

  10. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  11. Computer image analysis of etched tracks from ionizing radiation

    Science.gov (United States)

    Blanford, George E.

    1994-01-01

    I proposed to continue a cooperative research project with Dr. David S. McKay concerning image analysis of tracks. Last summer we showed that we could measure track densities using the Oxford Instruments eXL computer and software that is attached to an ISI scanning electron microscope (SEM) located in building 31 at JSC. To reduce the dependence on JSC equipment, we proposed to transfer the SEM images to UHCL for analysis. Last summer we developed techniques to use digitized scanning electron micrographs and computer image analysis programs to measure track densities in lunar soil grains. Tracks were formed by highly ionizing solar energetic particles and cosmic rays during near surface exposure on the Moon. The track densities are related to the exposure conditions (depth and time). Distributions of the number of grains as a function of their track densities can reveal the modality of soil maturation. As part of a consortium effort to better understand the maturation of lunar soil and its relation to its infrared reflectance properties, we worked on lunar samples 67701,205 and 61221,134. These samples were etched for a shorter time (6 hours) than last summer's sample and this difference has presented problems for establishing the correct analysis conditions. We used computer counting and measurement of area to obtain preliminary track densities and a track density distribution that we could interpret for sample 67701,205. This sample is a submature soil consisting of approximately 85 percent mature soil mixed with approximately 15 percent immature, but not pristine, soil.

  12. Preliminary analysis of the proposed BN-600 benchmark core

    International Nuclear Information System (INIS)

    John, T.M.

    2000-01-01

    The Indira Gandhi Centre for Atomic Research is actively involved in the design of Fast Power Reactors in India. The core physics calculations are performed with computer codes that are developed in-house or obtained from other laboratories and suitably modified to meet the computational requirements. The basic philosophy of the core physics calculations is to use diffusion theory codes with 25-group nuclear cross sections. Parameters that are very sensitive to core leakage, such as the power distribution at the core-blanket interface, are calculated using transport theory codes under the DSN approximation. All these codes use the finite difference approximation to treat the spatial variation of the neutron flux. Criticality problems with geometries too irregular to be represented by the conventional codes are solved using Monte Carlo methods. These codes and methods have been validated by the analysis of various critical assemblies and calculational benchmarks. The reactor core design procedure at IGCAR consists of: two- and three-dimensional diffusion theory calculations (codes ALCIALMI and 3DB); auxiliary calculations (neutron balance, power distributions, etc., done by codes developed in-house); transport theory corrections from two-dimensional transport calculations (DOT); irregular geometries treated by the Monte Carlo method (KENO); and the cross section data library CV2M (25 group)

  13. Cost analysis of small hydroelectric power plants components and preliminary estimation of global cost

    International Nuclear Information System (INIS)

    Basta, C.; Olive, W.J.; Antunes, J.S.

    1990-01-01

    An analysis of cost for each component of Small Hydroelectric Power Plants, taking into account the real costs of these projects, is shown. It also presents a global equation which allows a preliminary estimation of the cost of each construction. (author)

  14. Preliminary analysis of knee stress in Full Extension Landing

    Directory of Open Access Journals (Sweden)

    Majid Davoodi Makinejad

    2013-09-01

    Full Text Available OBJECTIVE: This study provides an experimental and finite element analysis of knee-joint structure during extended-knee landing based on the extracted impact force, and it numerically identifies the contact pressure, stress distribution and possibility of bone-to-bone contact when a subject lands from a safe height. METHODS: The impact time and loads were measured via inverse dynamic analysis of free landing without knee flexion from three different heights (25, 50 and 75 cm), using five subjects with an average body mass index of 18.8. Three-dimensional data were developed from computed tomography scans and were reprocessed with modeling software before being imported and analyzed by finite element analysis software. The whole leg was considered to be a fixed middle-hinged structure, while impact loads were applied to the femur in an upward direction. RESULTS: Straight landing exerted an enormous amount of pressure on the knee joint as a result of the body's inability to utilize the lower extremity muscles, thereby maximizing the threat of injury when the load exceeds the height-safety threshold. CONCLUSIONS: The researchers conclude that extended-knee landing results in serious deformation of the meniscus and cartilage and increases the risk of bone-to-bone contact and serious knee injury when the load exceeds the threshold safety height. This risk is considerably greater than the risk of injury associated with walking downhill or flexion landing activities.
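
    The impact loads applied to the finite element model were obtained from inverse dynamics of the landing; a back-of-the-envelope version of that step is the impulse-momentum estimate below, in which the body mass and contact time are assumed values for illustration rather than the study's measured data.

        # Rough impulse-momentum estimate of landing impact force from drop height.
        import math

        g = 9.81          # m/s^2
        mass = 60.0       # kg, assumed subject mass
        dt = 0.05         # s, assumed deceleration time for a stiff, extended-knee landing

        for h in (0.25, 0.50, 0.75):                  # drop heights studied, m
            v = math.sqrt(2.0 * g * h)                # touchdown velocity
            F = mass * v / dt + mass * g              # average ground reaction force
            print(f"h = {h:.2f} m: v = {v:.2f} m/s, F ~ {F/1000:.1f} kN ({F/(mass*g):.1f} x body weight)")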

  15. The Organic Food Market and Marketing Initiatives in Europe: a Preliminary Analysis

    DEFF Research Database (Denmark)

    Kristensen, Niels Heine; Nielsen, Thorkild; Bruselius-Jensen, Maria Louisa

    2003-01-01

    Kristensen NH, Nielsen T, Bruselius-Jensen M, Scheperlen-Bøgh P, Beckie M, Foster C, Midmore P, Padel S (2002): The Organic Food Market and Marketing Initiatives in Europe: a Preliminary Analysis. Final Report to the EU Commission.

  16. Plasma geometric optics analysis and computation

    International Nuclear Information System (INIS)

    Smith, T.M.

    1983-01-01

    Important practical applications in the generation, manipulation, and diagnosis of laboratory thermonuclear plasmas have created a need for elaborate computational capabilities in the study of high frequency wave propagation in plasmas. A reduced description of such waves suitable for digital computation is provided by the theory of plasma geometric optics. The existing theory is beset by a variety of special cases in which the straightforward analytical approach fails, and has been formulated with little attention to problems of numerical implementation of that analysis. The standard field equations are derived for the first time from kinetic theory. A discussion of certain terms previously, and erroneously, omitted from the expansion of the plasma constitutive relation is given. A powerful but little known computational prescription for determining the geometric optics field in the neighborhood of caustic singularities is rigorously developed, and a boundary layer analysis for the asymptotic matching of the plasma geometric optics field across caustic singularities is performed for the first time with considerable generality. A proper treatment of birefringence is detailed, wherein a breakdown of the fundamental perturbation theory is identified and circumvented. A general ray tracing computer code suitable for applications to radiation heating and diagnostic problems is presented and described
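
    The core of such a ray-tracing code is integration of the geometric-optics ray equations dx/dt = ∂ω/∂k, dk/dt = -∂ω/∂x. The sketch below does this for the simplest case, an unmagnetized plasma with dispersion ω² = ωpe²(x) + c²k² and a linear density profile; the profile, launch angle and frequency are assumptions for illustration and do not reproduce the code described in the report.

        # Ray tracing in an unmagnetized plasma with a linear density profile.
        import numpy as np
        from scipy.integrate import solve_ivp

        c = 3.0e8
        omega = 2.0 * np.pi * 60e9            # 60 GHz probing wave (assumed)
        Ln = 1.0                              # density gradient scale length, m (assumed)

        def wpe2(x):                          # plasma frequency squared, linear in x
            return (0.9 * omega) ** 2 * max(x, 0.0) / Ln

        def dwpe2_dx(x):
            return (0.9 * omega) ** 2 / Ln if x > 0.0 else 0.0

        def rhs(t, y):
            x, z, kx, kz = y
            w = np.sqrt(wpe2(x) + c**2 * (kx**2 + kz**2))   # local wave frequency
            dxdt = c**2 * kx / w                            # group velocity components
            dzdt = c**2 * kz / w
            dkxdt = -0.5 * dwpe2_dx(x) / w                  # refraction by the density gradient
            return [dxdt, dzdt, dkxdt, 0.0]

        k0 = omega / c
        y0 = [0.0, 0.0, k0 * np.cos(0.3), k0 * np.sin(0.3)]   # oblique launch from the edge
        sol = solve_ivp(rhs, (0.0, 30e-9), y0, max_step=1e-11)
        print(f"turning point near x = {sol.y[0].max():.3f} m")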

  17. Oxygenates in automotive fuels. Consequence analysis - preliminary study

    International Nuclear Information System (INIS)

    Brandberg, Aa.; Saevbark, B.

    1994-01-01

    Oxygenates are used in gasoline for several reasons. They are added as high-octane components in unleaded gasoline and as agents to reduce the emission of harmful substances. Oxygenates produced from biomass might constitute an emerging market for alternative fuels. This preliminary study describes the prerequisites and consequences of such oxygenate utilization. 39 refs, 9 figs, 5 tabs

  18. Computer-assisted learning in anatomy at the international medical school in Debrecen, Hungary: a preliminary report.

    Science.gov (United States)

    Kish, Gary; Cook, Samuel A; Kis, Gréta

    2013-01-01

    The University of Debrecen's Faculty of Medicine has an international, multilingual student population with anatomy courses taught in English to all but Hungarian students. An elective computer-assisted gross anatomy course, the Computer Human Anatomy (CHA), has been taught in English at the Anatomy Department since 2008. This course focuses on an introduction to anatomical digital images along with clinical cases. This low-budget course has a large visual component using images from magnetic resonance imaging and computer axial tomogram scans, ultrasound clinical studies, and readily available anatomy software that presents topics which run in parallel to the university's core anatomy curriculum. From the combined computer images and CHA lecture information, students are asked to solve computer-based clinical anatomy problems in the CHA computer laboratory. A statistical comparison was undertaken of core anatomy oral examination performances of English program first-year medical students who took the elective CHA course and those who did not in the three academic years 2007-2008, 2008-2009, and 2009-2010. The results of this study indicate that the CHA-enrolled students improved their performance on required anatomy core curriculum oral examinations, suggesting that computer-assisted learning may play an active role in anatomy curriculum improvement. These preliminary results have prompted ongoing evaluation of what specific aspects of CHA are valuable and which students benefit from computer-assisted learning in a multilingual and diverse cultural environment. Copyright © 2012 American Association of Anatomists.

  19. Analysis of electronic circuits using digital computers

    International Nuclear Information System (INIS)

    Tapu, C.

    1968-01-01

    Various programmes have been proposed for studying electronic circuits with the help of computers. It is shown here how it is possible to use the programme ECAP, developed by I.B.M., to study the behaviour of an operational amplifier from different points of view: direct-current, alternating-current and transient-state analysis, optimisation of the open-loop gain, and study of reliability. (author) [fr

  20. [Management of cytostatic drugs by nurses: analysis of preliminary results].

    Science.gov (United States)

    Bilski, Bartosz

    2004-01-01

    Cytostatic drugs pose a quite specific occupational risk to health care workers. There is a wide range of potential harmful effects, including remote effects, exerted by this group of drugs. In Polish and international regulations, standards of work safety and hygiene concerning these substances are clearly defined. Nevertheless, working conditions in Polish health care institutions are now mostly influenced by economic and organizational problems, which may also be reflected in the compliance with the work safety rules. This paper presents a preliminary analysis of subjective assessment of practice with regard to the management of cytostatics reported by nurses, an occupational group mostly exposed to these substances. The study was carried out at hospital departments in the Warmińsko-Mazurskie Voivodship, where exposure of the staff to these drugs was observed. The study covered the whole nursing staff exposed. Completed questionnaires were obtained from 60 nurses with a mean age of 32 years (range: 20-54 years) and mean job seniority of 8 years (range: 2-18 years), including 58 nurses with secondary education and two university graduates. Undergraduate education did not equip the respondents with skills to work with cytostatics. There is a need to increase the involvement of nursing schools, research institutes and teaching hospitals in the improvement of vocational training of nurses working with cytostatic drugs. To this end, all nurses should be covered by obligatory training on how to handle this group of drugs. The respondents reported that they had acquired their knowledge and experience of managing cytostatics in their work and during training organized at the workplace. Despite the acquired knowledge and experience, the interviewed nurses did not always comply with work safety and hygiene regulations. The problem of exposure to cytostatic drugs in the form of tablets was most frequently neglected. Some of the nurses were additionally exposed to ionizing radiation. Shortage of the nursing

  1. Analysis of rainfall-induced shallow landslides in Jamne and Jaszcze stream valleys (Polish Carpathians – preliminary results

    Directory of Open Access Journals (Sweden)

    Zydroń Tymoteusz

    2016-03-01

    Full Text Available Preliminary shallow landslide susceptibility mapping of the Jamne and Jaszcze stream valleys, located in the Polish Flysch Carpathians, is presented in the paper. For the purpose of mapping, the SINMAP and Iverson's models, which integrate infiltration and slope stability calculations, were used. Calibration of the model parameters, obtained from limited field and laboratory tests, was performed using data from 8-9 July 1997, when, as a consequence of a very intense rainfall, 94 shallow landslides were observed on meadows and arable land. A comparison of the slope stability calculation results with the locations of the observed shallow landslides showed satisfactory agreement between the observed and computed unstable areas. However, it was concluded that better simulation results were obtained using Iverson's model.
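
    SINMAP-type susceptibility mapping is built on the infinite-slope stability model. The sketch below evaluates the standard infinite-slope factor of safety over a few slope angles and relative saturations; the soil parameters are generic assumptions, not the calibrated values for the Jamne and Jaszcze catchments.

        # Infinite-slope factor of safety over slope angle and relative saturation.
        import numpy as np

        phi = np.radians(32.0)              # effective friction angle (assumed)
        c_eff = 4.0e3                       # effective cohesion incl. roots, Pa (assumed)
        gamma_s, gamma_w = 18.0e3, 9.81e3   # unit weights of soil and water, N/m^3
        z = 1.5                             # depth of the potential slip surface, m (assumed)

        def factor_of_safety(theta_deg, m):
            """theta: slope angle; m: fraction of the soil column that is saturated (0-1)."""
            th = np.radians(theta_deg)
            resisting = c_eff + (gamma_s * z - m * gamma_w * z) * np.cos(th) ** 2 * np.tan(phi)
            driving = gamma_s * z * np.sin(th) * np.cos(th)
            return resisting / driving

        for theta in (20, 30, 40):
            for m in (0.0, 0.5, 1.0):
                print(f"slope {theta:2d} deg, saturation {m:.1f}: FS = {factor_of_safety(theta, m):.2f}")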

  2. Computational Chemical Synthesis Analysis and Pathway Design

    Directory of Open Access Journals (Sweden)

    Fan Feng

    2018-06-01

    Full Text Available With the idea of retrosynthetic analysis, which was introduced in the 1960s, chemical synthesis analysis and pathway design have been transformed from a complex problem to a regular process of structural simplification. This review aims to summarize the developments of computer-assisted synthetic analysis and design in recent years, and how machine-learning algorithms contributed to them. The LHASA system started the pioneering work of encoding semi-empirical reaction modes in computers, and the rule-based and network-searching work that followed not only expanded the databases but also built new approaches to representing reaction rules. Programs like ARChem Route Designer replaced hand-coded reaction modes with automatically extracted rules, and programs like Chematica turned traditional design into network searching. Afterward, with the help of machine learning, two-step models which combine reaction rules and statistical methods became the mainstream. Recently, fully data-driven learning methods using deep neural networks, which do not even require prior knowledge, were applied in this field. Up to now, however, these methods still cannot replace experienced human organic chemists because of their relatively low accuracies. New algorithms, aided by powerful computational hardware, should make this a promising topic with good prospects.

  3. Ten Years toward Equity: Preliminary Results from a Follow-Up Case Study of Academic Computing Culture

    Directory of Open Access Journals (Sweden)

    Tanya L. Crenshaw

    2017-05-01

    Full Text Available Just over 10 years ago, we conducted a culture study of the Computer Science Department at the flagship University of Illinois at Urbana-Champaign, one of the top five computing departments in the country. The study found that while the department placed an emphasis on research, it did so in a way that, in conjunction with a lack of communication and transparency, devalued teaching and mentoring, and negatively impacted the professional development, education, and sense of belonging of the students. As one part of a multi-phase case study spanning over a decade, this manuscript presents preliminary findings from our latest work at the university. We detail early comparisons between data gathered at the Department of Computer Science at the University of Illinois at Urbana-Champaign in 2005 and our most recent pilot case study, a follow-up research project completed in 2016. Though we have not yet completed the full data collection, we find it worthwhile to reflect on the pilot case study data we have collected thus far. Our data reveals improvements in the perceptions of undergraduate teaching quality and undergraduate peer mentoring networks. However, we also found evidence of continuing feelings of isolation, incidents of bias, policy opacity, and uneven policy implementation that are areas of concern, particularly with respect to historically underrepresented groups. We discuss these preliminary follow-up findings, offer research and methodological reflections, and share next steps for applied research that aims to create positive cultural change in computing.

  4. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50-million-event exercise that included all the steps of the analysis chain, from prompt reconstruction, data streaming, iterative calibration and alignment executions, and data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also exercised to provide access to the data and the resources through a user-friendly interface for the physicists submitting the production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.

  5. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50-million-event exercise that included all the steps of the analysis chain, from prompt reconstruction, data streaming, iterative calibration and alignment executions, and data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also exercised to provide access to the data and the resources through a user-friendly interface for the physicists submitting the production and analysis jobs. An overview of the status and results of CSA06 is presented in this work

  6. Measurement of single kidney contrast media clearance by multiphasic spiral computed tomography: preliminary results

    International Nuclear Information System (INIS)

    Hackstein, N.; Puille, M.F.; Bak, Benjamin H.; Scharwat, Oliver; Rau, W.S.

    2001-01-01

    Objective. We present preliminary results of a new method (hereinafter called 'CT-clearance') to measure single kidney contrast media clearance by performing multiphasic helical CT of the kidneys. CT-clearance was calculated according to an extension of the Patlak plot. In contrast to prior investigators, who repeatedly measured a single slice, this method makes it possible to calculate single kidney clearance from at least three spiral CTs, utilizing the whole kidney volume. Methods. Spiral CT of the kidneys was performed unenhanced and about 30 and 100 s after administration of about 120 ml iopromide. The sum-density of the whole kidneys and the aortic density were calculated from these data. From these data, the renal clearance of contrast media was calculated by CT-clearance in 29 patients. As a reference, serum-clearance was calculated in 24 patients by applying a modified one-exponential slope model. Information on the relative kidney function was gained by renal scintigraphy with Tc99m-MAG-3 or Tc99m-DMSA in 29 patients. Results. Linear regression analysis revealed a correlation coefficient of CT-clearance with serum-clearance of r=0.78 with Cl(CT) [ml/min] = 22.2 + 1.03 × Cl(serum), n=24. Linear regression of the relative kidney function (rkf) of the right kidney calculated by CT-clearance compared to scintigraphy results provided a correlation coefficient r=0.89 with rkf(CT) [%] = 18.6 + 0.58 × rkf(scintigraphy), n=29. Conclusion. The obtained results of contrast media clearance measured by CT-clearance are in the physiological range of the parameter. Future studies should be performed to improve the methodology with the aim of higher accuracy. More specifically, better determination of the aortic density curve might improve the accuracy
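
    The clearance estimate rests on a Patlak-type linear relation: plotting the kidney-to-aorta enhancement ratio against the time integral of the aortic enhancement, normalized by the aortic enhancement, gives the clearance as the slope. A schematic version of that regression step is shown below; all enhancement values are invented and the authors' actual processing certainly differs in detail.

        # Patlak-plot estimate of contrast-media clearance from a few CT time points.
        # All enhancement numbers are illustrative, not patient data.
        import numpy as np

        t = np.array([0.0, 30.0, 100.0])            # s after injection (three spiral CTs)
        aorta = np.array([0.0, 300.0, 180.0])       # aortic enhancement, HU
        kidney = np.array([0.0, 2.0e4, 4.5e4])      # whole-kidney summed enhancement, HU*ml

        # Patlak transform: kidney/aorta vs. integral(aorta dt)/aorta; slope = clearance
        cum_aorta = np.concatenate(([0.0], np.cumsum(np.diff(t) * 0.5 * (aorta[1:] + aorta[:-1]))))
        x = cum_aorta[1:] / aorta[1:]               # skip t=0 where aortic enhancement is zero
        y = kidney[1:] / aorta[1:]

        slope, intercept = np.polyfit(x, y, 1)      # slope in ml/s here; x60 for ml/min
        print(f"single-kidney clearance ~ {slope * 60:.0f} ml/min (illustrative numbers)")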

  7. A Preliminary Tsunami Vulnerability Analysis for Yenikapi Region in Istanbul

    Science.gov (United States)

    Ceren Cankaya, Zeynep; Suzen, Lutfi; Cevdet Yalciner, Ahmet; Kolat, Cagil; Aytore, Betul; Zaytsev, Andrey

    2015-04-01

    One of the main requirements during post-disaster recovery operations is to maintain proper transportation and reliable communication in the disaster areas. Ports and harbors are the main transportation hubs, which must perform properly at all times, especially after disasters. Resilience of coastal utilities after earthquakes and tsunamis is of major importance for efficient and proper rescue and recovery operations soon after the disasters. Istanbul is a mega city with its various coastal utilities located on the north coast of the Sea of Marmara. In the Yenikapi region of Istanbul, there are critical coastal utilities and vulnerable coastal structures, and critical activities occur daily. Fishery ports, commercial ports, small craft harbors, passenger terminals of intercity maritime transportation, and waterfront commercial and/or recreational structures are some examples of coastal utilization that are vulnerable to marine disasters. Therefore, the vulnerability of the Yenikapi region of Istanbul to tsunamis or any other marine hazard is an important issue. In this study, a methodology for vulnerability analysis under tsunami attack is proposed, with application to the Yenikapi region. In the study, the high-resolution (1 m) GIS database of the Istanbul Metropolitan Municipality (IMM) is used and analyzed through a GIS implementation. The bathymetry and topography database and the vector dataset containing all buildings/structures/infrastructure in the study area are obtained for tsunami numerical modeling of the study area. GIS-based tsunami vulnerability assessment is conducted by applying Multi-criteria Decision Making Analysis (MCDA). The tsunami parameters from deterministically defined worst-case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability parameters in the region due to two different classifications, i) vulnerability of buildings/structures and ii) vulnerability of (human) evacuation
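
    Computationally, a GIS-based MCDA vulnerability assessment reduces to a weighted overlay of normalized criterion layers. The toy example below combines three small raster layers with assumed weights; the criteria, weights and values are placeholders, not those of the Yenikapi study.

        # Weighted-overlay MCDA on small raster layers (values and weights are illustrative).
        import numpy as np

        # Normalized criterion layers on a 0-1 scale (higher = more vulnerable)
        flow_depth   = np.array([[0.9, 0.6], [0.3, 0.1]])   # from a tsunami simulation
        structure    = np.array([[0.8, 0.8], [0.4, 0.2]])   # building fragility class
        distance_sea = np.array([[1.0, 0.7], [0.5, 0.2]])   # proximity to the shoreline

        weights = {"flow_depth": 0.5, "structure": 0.3, "distance_sea": 0.2}   # assumed

        vulnerability = (weights["flow_depth"] * flow_depth
                         + weights["structure"] * structure
                         + weights["distance_sea"] * distance_sea)

        classes = np.digitize(vulnerability, bins=[0.33, 0.66])   # 0=low, 1=medium, 2=high
        print(vulnerability.round(2))
        print(classes)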

  8. Computational Analysis of the G-III Laminar Flow Glove

    Science.gov (United States)

    Malik, Mujeeb R.; Liao, Wei; Lee-Rausch, Elizabeth M.; Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan

    2011-01-01

    Under NASA's Environmentally Responsible Aviation Project, flight experiments are planned with the primary objective of demonstrating the Discrete Roughness Elements (DRE) technology for passive laminar flow control at chord Reynolds numbers relevant to transport aircraft. In this paper, we present a preliminary computational assessment of the Gulfstream-III (G-III) aircraft wing-glove designed to attain natural laminar flow for a leading-edge sweep angle of 34.6 deg. Analysis for a flight Mach number of 0.75 shows that it should be possible to achieve natural laminar flow at twice the highest transition Reynolds number ever achieved at this sweep angle. However, the wing-glove needs to be redesigned to effectively demonstrate passive laminar flow control using DREs. As a by-product of the computational assessment, the effect of surface curvature on stationary crossflow disturbances is found to be strongly stabilizing for the current design, and it is suggested that convex surface curvature could be used as a control parameter for natural laminar flow design, provided transition occurs via stationary crossflow disturbances.

  9. Adaptation of Toodee-2 computer code for reflood analysis in Angra-1 reactor

    International Nuclear Information System (INIS)

    Praes, J.G.L.; Onusic Junior, J.

    1981-01-01

    A method of calculating the heat transfer coefficient used in the Toodee-2 computer code for core reflood analysis in a loss-of-coolant accident is presented. Preliminary results are presented with the use of heat transfer correlations based on FLECHT experiments appropriate to a 16 x 16 geometric arrangement (Angra I). Optional calculations are suggested for the heat transfer coefficients when cooling of the fuel cladding by steam is considered. (Author) [pt

  10. Preliminary Analysis of Helicopter Options to Support Tunisian Counterterrorism Operations

    Science.gov (United States)

    2016-04-27

    helicopters from Sikorsky to fulfill a number of roles in counterterrorism operations. Rising costs and delays in delivery raised the question of...whether other cost-effective options exist to meet Tunisia’s helicopter requirement. Approach Our team conducted a preliminary assessment of...alternative helicopters for counterterrorism air assault missions. Any decision to acquire an aircraft must consider many factors, including technical

  11. Computer-based attention training in the schools for children with attention deficit/hyperactivity disorder: a preliminary trial.

    Science.gov (United States)

    Steiner, Naomi J; Sheldrick, Radley Christopher; Gotthelf, David; Perrin, Ellen C

    2011-07-01

    Objective. This study examined the efficacy of 2 computer-based training systems to teach children with attention deficit/hyperactivity disorder (ADHD) to attend more effectively. Design/methods. A total of 41 children with ADHD from 2 middle schools were randomly assigned to receive 2 sessions a week at school of either neurofeedback (NF) or attention training through a standard computer format (SCF), either immediately or after a 6-month wait (waitlist control group). Parents, children, and teachers completed questionnaires pre- and postintervention. Results. Primary parents in the NF condition reported significant (P < .05) improvements on the ADHD index, the BASC Attention Problems Scale, and on the Behavioral Rating Inventory of Executive Functioning (BRIEF). Conclusion. This randomized control trial provides preliminary evidence of the effectiveness of computer-based interventions for ADHD and supports the feasibility of offering them in a school setting.

  12. Synopsis of some preliminary computational studies related to unsaturated zone transport at Area G

    International Nuclear Information System (INIS)

    Vold, E.

    1998-03-01

    Computational transport models are described with applications in three problem areas related to unsaturated zone moisture movement beneath Area G. These studies may be used to support the ongoing maintenance of the site Performance Assessment. The three areas include: a 1-D transient analysis with average tuff hydraulic properties in the near surface region with computed results compared to field data; the influence on near surface transient moisture percolation due to realistic distributions in hydraulic properties derived statistically from the observed variance in the field data; and the west to east moisture flow in a 2-D steady geometry approximation of the Pajarito Plateau. Results indicate that a simple transient model for transport of moisture volume fraction fits field data well compared to a moisture pulse observed in the active disposal unit, pit 37. Using realistic infiltration boundary conditions for summer showers and for spring snow melt conditions, the computed moisture pulses show significant propagation to less than 10-ft depth. Next, the hydraulic properties were varied on a 2-D grid using statistical distributions based on the field data means and variances for the hydraulic parameters. Near surface transient percolation in these conditions shows a qualitatively realistic percolation with a spatially variable wave front moving into the tuff; however, the flow does not channel into preferred paths and suggests there is no formation of fast paths which could enhance transportation of contaminants. Finally, moisture transport is modeled through an unsaturated 2-D slice representing the upper stratigraphic layers beneath Area G and a west-to-east cut of several miles to examine possible lateral movement from the west where percolation is assumed to be greater than at Area G. Results show some west-to-east moisture flux consistent with the assumed profile for the percolation boundary conditions
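
    A minimal sketch of the kind of 1-D transient moisture calculation described above, using an explicit finite-difference scheme with a constant effective diffusivity and an assumed surface infiltration pulse; the tuff hydraulic properties, boundary conditions, and code used in the Area G study are not reproduced here.

      import numpy as np

      # 1-D explicit finite-difference sketch for moisture volume fraction theta(z, t).
      # The constant effective diffusivity and pulse magnitude are assumptions, not the
      # tuff hydraulic properties or infiltration data used in the Area G study.
      nz, dz = 200, 0.05            # 10 m column discretized into 5 cm cells
      D      = 1.0e-6               # effective moisture diffusivity, m^2/s (assumed)
      dt     = 0.4 * dz**2 / D      # time step satisfying the explicit stability limit
      theta  = np.full(nz, 0.05)    # initial moisture volume fraction

      for _ in range(5000):
          theta[0]  = 0.30          # surface cell wetted by an infiltration event
          lap = (theta[2:] - 2.0 * theta[1:-1] + theta[:-2]) / dz**2
          theta[1:-1] += dt * D * lap
          theta[-1] = theta[-2]     # zero-gradient bottom boundary

      depth = np.arange(nz) * dz
      print("wetting front (theta > 0.06) reaches %.2f m depth" % depth[theta > 0.06].max())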

  13. Introduction to scientific computing and data analysis

    CERN Document Server

    Holmes, Mark H

    2016-01-01

    This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression-based methods using a singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how the method works, as well as how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author’s website and SpringerLink.

  14. Aerodynamic analysis of Pegasus - Computations vs reality

    Science.gov (United States)

    Mendenhall, Michael R.; Lesieutre, Daniel J.; Whittaker, C. H.; Curry, Robert E.; Moulton, Bryan

    1993-01-01

    Pegasus, a three-stage, air-launched, winged space booster was developed to provide fast and efficient commercial launch services for small satellites. The aerodynamic design and analysis of Pegasus was conducted without benefit of wind tunnel tests using only computational aerodynamic and fluid dynamic methods. Flight test data from the first two operational flights of Pegasus are now available, and they provide an opportunity to validate the accuracy of the predicted pre-flight aerodynamic characteristics. Comparisons of measured and predicted flight characteristics are presented and discussed. Results show that the computational methods provide reasonable aerodynamic design information with acceptable margins. Post-flight analyses illustrate certain areas in which improvements are desired.

  15. Protective Alternatives of SMR against Extreme Threat Scenario – A Preliminary Risk Analysis

    International Nuclear Information System (INIS)

    Shohet, I.M.; Ornai, D.; Gal, E.; Ronen, Y.; Vidra, M.

    2014-01-01

    The article presents a preliminary risk analysis of the main features of an NPP (Nuclear Power Plant) that includes SMRs (Small and Modular Reactors), given an extreme threat scenario. A review of the structure and systems of the SMR is followed by systematic definitions and analysis of the threat scenario, for which a preliminary risk analysis was carried out. The article outlines the basic events caused by the referred threat scenario, which lead to possible failure mechanisms according to FTA (Fault Tree Analysis), to critical protective circuits, and to detecting critical topics for the protection and safety of the reactor

  16. Thermal Hydraulic Analysis of K-DEMO Single Blanket Module for Preliminary Accident Analysis using MELCOR

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Sung Bo; Bang, In Cheol [UNIST, Ulsan (Korea, Republic of)

    2016-05-15

    To develop a Korean fusion commercial reactor, a preliminary design concept for K-DEMO (Korean fusion demonstration reactor) has been announced by NFRI (National Fusion Research Institute). This pre-conceptual study of K-DEMO has been introduced to identify technical details of a fusion power plant for the future commercialization of fusion reactors in Korea. Before K-DEMO can be built, however, accident analysis is essential. Since the Fukushima accident, a severe accident caused by an unexpected disaster, the safety analysis of nuclear power plants has become increasingly important. The safety analysis of both fission and fusion reactors is deemed crucial in demonstrating the low radiological effect of these reactors on the environment during severe accidents. A risk analysis of K-DEMO should be performed as a prerequisite for the construction of a fusion reactor. In this research, a thermal-hydraulic analysis of a single blanket module of K-DEMO is conducted as a preliminary accident analysis for K-DEMO. A further study on the effect of the flow distributor is also conducted. The normal K-DEMO operating condition is applied as the boundary condition and simulated to verify the material temperature limit using MELCOR. MELCOR is a fully integrated, relatively fast-running code developed by Sandia National Laboratories. MELCOR had been used for Light Water Reactors, and a fusion reactor version of MELCOR was developed for ITER accident analysis. This study shows the results of a thermal-hydraulic simulation of a single blanket module with MELCOR, a severe accident code for nuclear fusion safety analysis. The difference in mass flow rate for each coolant channel with or without the flow distributor is presented. With the flow distributor, the advantages of broadening the temperature gradient in the K-DEMO blanket module and of increasing the mass flow toward the first wall are obtained. This can enhance the safety of the K-DEMO blanket module. A temperature difference of at most 13 °C in the blanket module is obtained.

  17. Computed image analysis of neutron radiographs

    International Nuclear Information System (INIS)

    Dinca, M.; Anghel, E.; Preda, M.; Pavelescu, M.

    2008-01-01

    Similar to X-radiography, there is in practice a nondestructive technique, named neutron radiology, that uses neutrons as the penetrating particles. When the registration of information is done on a film with the help of a conversion foil (with a high cross section for neutrons) that emits secondary radiation (β,γ) creating a latent image, the technique is named neutron radiography. A radiographic industrial film that contains the image of the internal structure of an object, obtained by neutron radiography, must be subsequently analyzed to obtain qualitative and quantitative information about the structural integrity of that object. It is possible to perform a computed analysis of a film using a facility with the following main components: an illuminator for the film, a CCD video camera and a computer (PC) with suitable software. The qualitative analysis intends to reveal possible anomalies of the structure due to manufacturing processes or induced by working processes (for example, the irradiation activity in the case of nuclear fuel). The quantitative determination is based on measurements of some image parameters: dimensions and optical densities. The illuminator has been built specially to perform this application but can also be used for simple visual observation. The illuminated area is 9x40 cm. The frame of the system is a comparator of the Abbe Carl Zeiss Jena type, which has been adapted for this application. The video camera ensures the capture of the image, which is stored and processed by the computer. A special program, SIMAG-NG, has been developed at INR Pitesti which, together with the program SMTV II of the special acquisition module SM 5010, can analyze the images of a film. The major application of the system was the quantitative analysis of a film containing the images of some nuclear fuel pins beside a dimensional standard. The system was used to measure the length of the pellets of the TRIGA nuclear fuel. (authors)
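
    The quantitative measurement described above, a length taken from the film image and calibrated against the dimensional standard, can be sketched as follows. The intensity profile, threshold, and standard dimensions are synthetic placeholders, not output of the SIMAG-NG system.

      import numpy as np

      # Synthetic 1-D optical-density profile along a fuel pin image: the pellet stack
      # appears as a region of distinctly different film density. Profile, threshold and
      # standard dimensions are placeholders, not SIMAG-NG data.
      profile = np.ones(1000)
      profile[200:800] = 0.2

      # Calibration: a dimensional standard of known length spans a known pixel count.
      standard_length_mm = 50.0     # assumed reference standard
      standard_pixels    = 500.0
      mm_per_pixel       = standard_length_mm / standard_pixels

      # Threshold the profile to locate the pellet edges and convert pixels to millimetres.
      pellet_pixels = np.flatnonzero(profile < 0.5)
      pellet_length = (pellet_pixels[-1] - pellet_pixels[0]) * mm_per_pixel
      print("measured pellet stack length: %.1f mm" % pellet_length)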

  18. Preliminary analysis of engineered barrier performance in geological disposal of high level waste

    International Nuclear Information System (INIS)

    Ohe, Toshiaki; Maki, Yasuo; Tanaka, Hiroshi; Kawanishi, Motoi.

    1988-01-01

    This report presents preliminary results of a safety analysis of an engineered barrier system in the geological disposal of high level radioactive waste. Three well-known computer codes, ORIGEN 2, TRUMP, and SWIFT, were used in the simulation. The main conceptual design of the repository was almost identical to that of SKB in Sweden and NAGRA in Switzerland; the engineered barrier consists of glass-solidified waste, a steel overpack, and compacted bentonite. Two different underground formations are considered, granite and Neogene sedimentary rock, which are typically found in Japan. We first determined the repository configuration, particularly the space between disposal pits. ORIGEN 2 was used to estimate heat generation in the waste glass reprocessed 4 years after removal from a PWR. Then, the temperature distribution was calculated by TRUMP. The results of two- or three-dimensional calculations indicated that the pit interval should be kept at more than 5 m in the case of a granite formation at 500 m depth, according to the temperature criteria in the bentonite layer. The nuclides 90 Sr, 241 Am, 239 Pu, and 237 Np were chosen in one- or two-dimensional calculations. For both cases of steady release and instantaneous release, the maximum concentration in the pore water at the boundary between the bentonite and the surrounding rock had the following order: 237 Np > 239 Pu > 90 Sr > 241 Am. Sensitivity analysis showed that the order is mainly due to the different adsorption characteristics of the nuclides in the bentonite layer. (author)

  19. Social sciences via network analysis and computation

    CERN Document Server

    Kanduc, Tadej

    2015-01-01

    In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing and mathematics. This study presents some of the latest results, practices and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering and classification in the context of the social sciences. It also covers various real-life examples such as t

  20. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. The summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  1. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using an automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
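
    As a minimal illustration of the Monte-Carlo ingredient discussed above, the sketch below estimates the first two probabilistic moments of a simple structural response with a Gaussian random parameter. It is written in Python rather than the MAPLE environment used in the paper, and the response function and input statistics are assumptions made for illustration.

      import numpy as np

      # Plain Monte-Carlo sketch (Python rather than MAPLE): estimate the first two
      # probabilistic moments of a cantilever tip deflection u = P*L^3 / (3*E*I)
      # when Young's modulus E is a Gaussian random variable. All values are assumed.
      rng = np.random.default_rng(0)

      P, L, I = 1.0e3, 2.0, 8.0e-6            # load [N], length [m], second moment of area [m^4]
      E_mean, E_cov = 2.0e11, 0.10            # mean of E [Pa] and its coefficient of variation

      E_samples = rng.normal(E_mean, E_cov * E_mean, size=100_000)
      u_samples = P * L**3 / (3.0 * E_samples * I)

      print("E[u]   = %.4e m" % u_samples.mean())
      print("Var[u] = %.4e m^2" % u_samples.var())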

  2. Advanced analysis of finger-tapping performance: a preliminary study.

    Science.gov (United States)

    Barut, Cağatay; Kızıltan, Erhan; Gelir, Ethem; Köktürk, Fürüzan

    2013-06-01

    The finger-tapping test is a commonly employed quantitative assessment tool used to measure motor performance in the upper extremities. This task is a complex motion that is affected by external stimuli, mood and health status. The complexity of this task is difficult to explain with a single average intertap-interval value (the time difference between successive taps), which only provides general information and neglects the temporal effects of the aforementioned factors. This study evaluated the time course of average intertap-interval values and the patterns of variation in both the right and left hands of right-handed subjects using a computer-based finger-tapping system. Cross-sectional study. Thirty-eight male individuals aged between 20 and 28 years (Mean±SD = 22.24±1.65) participated in the study. Participants were asked to perform a single-finger-tapping test over a 10-second test period. Only the results of the 35 right-handed (RH) participants were considered in this study. The test records the time of each tap and saves the data as the time difference between successive taps for further analysis. The average number of taps and the temporal fluctuation patterns of the intertap-intervals were calculated and compared. The variations in the intertap-interval were evaluated with the best curve fit method. An average tapping speed or tapping rate can reliably be defined for a single-finger tapping test by analysing the graphically presented data on the number of taps within the test period. However, a different presentation of the same data, namely the intertap-interval values, shows temporal variation as the number of taps increases. Curve fitting applications indicate that the variation has a biphasic nature. The measures obtained in this study reflect the complex nature of the finger-tapping task and are suggested to provide reliable information regarding hand performance. Moreover, the equation reflects both the variations in and the general
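
    A minimal sketch of the intertap-interval analysis described above: compute the intervals from tap timestamps and fit a curve to their time course. The timestamps are synthetic and the biphasic model is an assumed functional form, not the study's fitted equation.

      import numpy as np
      from scipy.optimize import curve_fit

      # Sketch of the intertap-interval (ITI) analysis: tap timestamps are synthetic, and
      # the biphasic model is an assumed functional form, not the study's exact equation.
      def biphasic(n, a, b, c, tau):
          # slow linear drift plus a fast initial transient
          return a + b * n + c * np.exp(-n / tau)

      rng = np.random.default_rng(1)
      n_taps = 60
      n = np.arange(1, n_taps)                                  # index of each intertap interval
      iti = biphasic(n, 0.20, 0.0005, -0.03, 8.0) + 0.005 * rng.standard_normal(n.size)
      tap_times = np.concatenate(([0.0], np.cumsum(iti)))       # reconstructed tap timestamps

      iti_from_times = np.diff(tap_times)                       # what the test software stores
      params, _ = curve_fit(biphasic, n, iti_from_times, p0=[0.2, 0.0, -0.05, 5.0])
      print("mean ITI: %.3f s" % iti_from_times.mean())
      print("fitted (a, b, c, tau):", np.round(params, 4))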

  3. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis to control the spread of computer viruses.
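
    A minimal sketch of an SIR-type virus propagation model with inflow of new computers, removal of old ones, and an antivirus cure rate; the equations and parameter values are illustrative stand-ins for the model analyzed in the paper.

      import numpy as np
      from scipy.integrate import solve_ivp

      # SIR-type sketch with inflow of new computers, removal of old ones, and a cure
      # rate supplied by antivirus software; parameter values are illustrative only.
      b, d  = 0.02, 0.02      # rates of computers joining / leaving the network
      beta  = 0.5             # infection contact rate
      gamma = 0.1             # cure rate provided by antivirus software

      def rhs(t, y):
          S, I, R = y
          N = S + I + R
          dS = b * N - beta * S * I / N - d * S
          dI = beta * S * I / N - (gamma + d) * I
          dR = gamma * I - d * R
          return [dS, dI, dR]

      sol = solve_ivp(rhs, (0.0, 200.0), [0.99, 0.01, 0.0])
      S, I, R = sol.y[:, -1]
      print("basic reproduction number R0 = %.2f" % (beta / (gamma + d)))
      print("infected fraction at t = 200: %.3f" % (I / (S + I + R)))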

  4. Using Gender Schema Theory to Examine Gender Equity in Computing: a Preliminary Study

    Science.gov (United States)

    Agosto, Denise E.

    Women continue to constitute a minority of computer science majors in the United States and Canada. One possible contributing factor is that most Web sites, CD-ROMs, and other digital resources do not reflect girls' design and content preferences. This article describes a pilot study that considered whether gender schema theory can serve as a framework for investigating girls' Web site design and content preferences. Eleven 14- and 15-year-old girls participated in the study. The methodology included the administration of the Children's Sex-Role Inventory (CSRI), Web-surfing sessions, interviews, and data analysis using iterative pattern coding. On the basis of their CSRI scores, the participants were divided into feminine-high (FH) and masculine-high (MH) groups. Data analysis uncovered significant differences in the criteria the groups used to evaluate Web sites. The FH group favored evaluation criteria relating to graphic and multimedia design, whereas the MH group favored evaluation criteria relating to subject content. Models of the two groups' evaluation criteria are presented, and the implications of the findings are discussed.

  5. Application of computed tomography virtual noncontrast spectral imaging in evaluation of hepatic metastases: a preliminary study.

    Science.gov (United States)

    Tian, Shi-Feng; Liu, Ai-Lian; Liu, Jing-Hong; Sun, Mei-Yu; Wang, He-Qing; Liu, Yi-Jun

    2015-03-05

    The objective was to qualitatively and quantitatively evaluate hepatic metastases using computed tomography (CT) virtual noncontrast (VNC) spectral imaging in a retrospective analysis. Forty hepatic metastases patients underwent CT scans including the conventional true noncontrast (TNC) and the tri-phasic contrast-enhanced dual energy spectral scans in the hepatic arterial, portal venous, and equilibrium phases. The tri-phasic spectral CT images were used to obtain three groups of VNC images, in the arterial (VNCa), venous (VNCv), and equilibrium (VNCe) phases, by the material decomposition process using water and iodine as a base material pair. The image quality and the contrast-to-noise ratio (CNR) of metastasis of the four groups were compared with ANOVA analysis. The metastasis detection rates with the four nonenhanced image groups were calculated and compared using the Chi-square test. There were no significant differences in image quality among the TNC, VNCa and VNCv images (P > 0.05). The quality of VNCe images was significantly worse than that of the other three groups (P < 0.05). The metastasis detection rates of the four nonenhanced groups showed no statistically significant difference (P > 0.05). The quality of VNCa and VNCv images is identical to that of TNC images, and the metastasis detection rate in VNC images is similar to that in TNC images. VNC images obtained from the arterial phase show metastases more clearly. Thus, VNCa imaging may be a surrogate for TNC imaging in hepatic metastasis diagnosis.
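
    The quantitative comparison described above, contrast-to-noise ratios compared across image groups with one-way ANOVA, can be sketched as follows; the ROI values and noise levels are synthetic stand-ins, not the study's measurements.

      import numpy as np
      from scipy import stats

      # Compute a contrast-to-noise ratio (CNR) per patient for each image group and
      # compare the groups with one-way ANOVA. ROI means and noise levels are synthetic.
      rng = np.random.default_rng(2)

      def cnr(lesion_hu, liver_hu, noise_sd):
          return np.abs(lesion_hu - liver_hu) / noise_sd

      groups = {}
      for name, noise_sd in [("TNC", 12.0), ("VNCa", 12.5), ("VNCv", 12.5), ("VNCe", 18.0)]:
          lesion_hu = rng.normal(55.0, 5.0, size=40)   # illustrative lesion ROI values (HU)
          liver_hu  = rng.normal(95.0, 5.0, size=40)   # illustrative liver ROI values (HU)
          groups[name] = cnr(lesion_hu, liver_hu, noise_sd)

      f_stat, p_value = stats.f_oneway(*groups.values())
      print("one-way ANOVA across TNC/VNCa/VNCv/VNCe CNRs: F = %.2f, p = %.3g" % (f_stat, p_value))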

  6. Application of Computed Tomography Virtual Noncontrast Spectral Imaging in Evaluation of Hepatic Metastases: A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Shi-Feng Tian

    2015-01-01

    Full Text Available Objective: The objective was to qualitatively and quantitatively evaluate hepatic metastases using computed tomography (CT) virtual noncontrast (VNC) spectral imaging in a retrospective analysis. Methods: Forty hepatic metastases patients underwent CT scans including the conventional true noncontrast (TNC) and the tri-phasic contrast-enhanced dual energy spectral scans in the hepatic arterial, portal venous, and equilibrium phases. The tri-phasic spectral CT images were used to obtain three groups of VNC images, in the arterial (VNCa), venous (VNCv), and equilibrium (VNCe) phases, by the material decomposition process using water and iodine as a base material pair. The image quality and the contrast-to-noise ratio (CNR) of metastasis of the four groups were compared with ANOVA analysis. The metastasis detection rates with the four nonenhanced image groups were calculated and compared using the Chi-square test. Results: There were no significant differences in image quality among the TNC, VNCa and VNCv images (P > 0.05). The quality of VNCe images was significantly worse than that of the other three groups (P < 0.05). The metastasis detection rates of the four nonenhanced groups showed no statistically significant difference (P > 0.05). Conclusions: The quality of VNCa and VNCv images is identical to that of TNC images, and the metastasis detection rate in VNC images is similar to that in TNC images. VNC images obtained from the arterial phase show metastases more clearly. Thus, VNCa imaging may be a surrogate for TNC imaging in hepatic metastasis diagnosis.

  7. Preliminary Hazard Analysis applied to Uranium Hexafluoride - UF6 production plant

    International Nuclear Information System (INIS)

    Tomzhinsky, David; Bichmacher, Ricardo; Braganca Junior, Alvaro; Peixoto, Orpet Jose

    1996-01-01

    The purpose of this paper is to present the results of the Preliminary Hazard Analysis applied to the UF 6 Production Process, which is part of the UF 6 Conversion Plant. The Conversion Plant has been designed to produce highly purified UF 6 in accordance with nuclear-grade standards. This Preliminary Hazard Analysis is the first step in the Risk Management Studies, which are currently under development. The analysis evaluated the impact of the production process on the plant operators, members of the public, equipment, systems and installations, as well as on the environment. (author)

  8. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

    Maragni, M.G.

    1992-01-01

    Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied, aiming at the qualification of these methods at IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce the computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and the KENO-IV code shows conservative results when the generalized geometry option is not used. (author)

  9. Extension of a simplified computer program for analysis of solid-propellant rocket motors

    Science.gov (United States)

    Sforzini, R. H.

    1973-01-01

    A research project to develop a computer program for the preliminary design and performance analysis of solid propellant rocket engines is discussed. The following capabilities are included as computer program options: (1) treatment of wagon wheel cross sectional propellant configurations alone or in combination with circular perforated grains, (2) calculation of ignition transients with the igniter treated as a small rocket engine, (3) representation of spherical circular perforated grain ends as an alternative to the conical end surface approximation used in the original program, and (4) graphical presentation of program results using a digital plotter.

  10. Computational advances in transition phase analysis

    International Nuclear Information System (INIS)

    Morita, K.; Kondo, S.; Tobita, Y.; Shirakawa, N.; Brear, D.J.; Fischer, E.A.

    1994-01-01

    In this paper, historical perspective and recent advances are reviewed on computational technologies to evaluate the transition phase of core disruptive accidents in liquid-metal fast reactors. An analysis of the transition phase requires treatment of multi-phase, multi-component thermohydraulics coupled with space- and energy-dependent neutron kinetics. Such a comprehensive modeling effort was initiated when the program of SIMMER-series computer code development began in the late 1970s in the USA. Successful applications of the latest SIMMER-II in the USA, western Europe and Japan have proved its effectiveness, but, at the same time, several areas that require further research have been identified. Based on the experience and lessons learned during the SIMMER-II applications through the 1980s, a new project of SIMMER-III development is underway at the Power Reactor and Nuclear Fuel Development Corporation (PNC), Japan. The models and methods of SIMMER-III are briefly described with emphasis on recent advances in multi-phase, multi-component fluid dynamics technologies and their expected implication for a future reliable transition phase analysis. (author)

  11. Gas turbine designer computer program - a study of using a computer for preliminary design of gas turbines

    Energy Technology Data Exchange (ETDEWEB)

    Petersson, Rickard

    1995-11-01

    This thesis presents calculation schemes and theories for the preliminary design of the fan, high pressure compressor and turbine of a gas turbine. The calculations are presented step by step, making them easier to implement in other applications. The calculation schemes have been implemented as a subroutine in a thermodynamic program. The combination of the thermodynamic cycle calculation and the design calculation turned out to give quite relevant results when predicting the geometry and performance of an existing aero engine. The program developed is able to handle several different gas turbines, including those in which the flow is split (i.e. turbofan engines). The design process is limited to the fan, compressor and turbine of the gas turbine; the rest of the components have not been considered. Outputs from the program are the main geometry, presented both numerically and as a scale plot, component efficiencies, stresses at critical points and a simple prediction of turbine blade temperatures. 11 refs, 21 figs, 1 tab

  12. Preliminary development of a global 3-D magnetohydrodynamic computational model for solar wind-cometary and planetary interactions

    International Nuclear Information System (INIS)

    Stahara, S.S.

    1986-05-01

    This is the final summary report by Resource Management Associates, Inc., of the first year's work under Contract No. NASW-4011 to the National Aeronautics and Space Administration. The work under this initial phase of the contract relates to the preliminary development of a global, 3-D magnetohydrodynamic computational model to quantitatively describe the detailed continuum field and plasma interaction process of the solar wind with cometary and planetary bodies throughout the solar system. The work extends a highly-successful, observationally-verified computational model previously developed by the author, and is appropriate for the global determination of supersonic, super-Alfvenic solar wind flows past planetary obstacles. This report provides a concise description of the problems studied, a summary of all the important research results, and copies of the publications

  13. Preliminary analysis of the transient overpower accident for CRBRP. Final report

    International Nuclear Information System (INIS)

    Kastenberg, W.E.; Frank, M.V.

    1975-07-01

    A preliminary analysis of the transient overpower accident for the Clinch River Breeder Reactor Plant (CRBRP) is presented. Several uncertainties in the analysis and the estimation of ramp rates during the transition to disassembly are discussed. The major conclusions are summarized

  14. SUMS preliminary design and data analysis development. [shuttle upper atmosphere mass spectrometer experiment

    Science.gov (United States)

    Hinson, E. W.

    1981-01-01

    The preliminary analysis and data analysis system development for the shuttle upper atmosphere mass spectrometer (SUMS) experiment are discussed. The SUMS experiment is designed to provide free stream atmospheric density, pressure, temperature, and mean molecular weight for the high altitude, high Mach number region.

  15. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed assuming the blood flow to be laminar with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software, coupled with Solidworks, a modeling software, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branches and angle-shaped geometries, were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.

  16. Children's strategies to solving additive inverse problems: a preliminary analysis

    Science.gov (United States)

    Ding, Meixia; Auxter, Abbey E.

    2017-03-01

    Prior studies show that elementary school children generally "lack" formal understanding of inverse relations. This study goes beyond lack to explore what children might "have" in their existing conception. A total of 281 students, kindergarten to third grade, were recruited to respond to a questionnaire that involved both contextual and non-contextual tasks on inverse relations, requiring both computational and explanatory skills. Results showed that children demonstrated better performance in computation than explanation. However, many students' explanations indicated that they did not necessarily utilize inverse relations for computation. Rather, they appeared to possess partial understanding, as evidenced by their use of part-whole structure, which is a key to understanding inverse relations. A close inspection of children's solution strategies further revealed that the sophistication of children's conception of part-whole structure varied in representation use and unknown quantity recognition, which suggests rich opportunities to develop students' understanding of inverse relations in lower elementary classrooms.

  17. Preliminary frequency-domain analysis for the reconstructed spatial resolution of muon tomography

    Science.gov (United States)

    Yu, B.; Zhao, Z.; Wang, X.; Wang, Y.; Wu, D.; Zeng, Z.; Zeng, M.; Yi, H.; Luo, Z.; Yue, X.; Cheng, J.

    2014-11-01

    Muon tomography is an advanced technology to non-destructively detect high atomic number materials. It exploits the multiple Coulomb scattering information of muons to reconstruct the scattering density image of the traversed object. Because of the statistics of muon scattering, the measurement error of the system and the data incompleteness, the reconstruction is always accompanied by a certain level of interference, which will influence the reconstructed spatial resolution. While statistical noise can be reduced by extending the measuring time, system parameters determine the ultimate spatial resolution that a system can reach. In this paper, an effective frequency-domain model is proposed to analyze the reconstructed spatial resolution of muon tomography. The proposed method modifies the resolution analysis of conventional computed tomography (CT) to fit the different imaging mechanism of muon scattering tomography. The measured scattering information is described in the frequency domain, and then a relationship between the measurements and the original image is proposed in the Fourier domain, which is named the "Muon Central Slice Theorem". Furthermore, a preliminary analytical expression for the ultimate reconstructed spatial resolution is derived, and simulations are performed for validation. While the method is able to predict the ultimate spatial resolution of a given system, it can also be utilized for the optimization of system design and construction.

  18. Preliminary frequency-domain analysis for the reconstructed spatial resolution of muon tomography

    International Nuclear Information System (INIS)

    Yu, B.; Zhao, Z.; Wang, X.; Wang, Y.; Wu, D.; Zeng, Z.; Zeng, M.; Yi, H.; Luo, Z.; Yue, X.; Cheng, J.

    2014-01-01

    Muon tomography is an advanced technology to non-destructively detect high atomic number materials. It exploits the multiple Coulomb scattering information of muons to reconstruct the scattering density image of the traversed object. Because of the statistics of muon scattering, the measurement error of the system and the data incompleteness, the reconstruction is always accompanied by a certain level of interference, which will influence the reconstructed spatial resolution. While statistical noise can be reduced by extending the measuring time, system parameters determine the ultimate spatial resolution that a system can reach. In this paper, an effective frequency-domain model is proposed to analyze the reconstructed spatial resolution of muon tomography. The proposed method modifies the resolution analysis of conventional computed tomography (CT) to fit the different imaging mechanism of muon scattering tomography. The measured scattering information is described in the frequency domain, and then a relationship between the measurements and the original image is proposed in the Fourier domain, which is named the "Muon Central Slice Theorem". Furthermore, a preliminary analytical expression for the ultimate reconstructed spatial resolution is derived, and simulations are performed for validation. While the method is able to predict the ultimate spatial resolution of a given system, it can also be utilized for the optimization of system design and construction.

  19. Computer-aided Fault Tree Analysis

    International Nuclear Information System (INIS)

    Willie, R.R.

    1978-08-01

    A computer-oriented methodology for deriving minimal cut and path set families associated with arbitrary fault trees is discussed first. Then the use of the Fault Tree Analysis Program (FTAP), an extensive FORTRAN computer package that implements the methodology is described. An input fault tree to FTAP may specify the system state as any logical function of subsystem or component state variables or complements of these variables. When fault tree logical relations involve complements of state variables, the analyst may instruct FTAP to produce a family of prime implicants, a generalization of the minimal cut set concept. FTAP can also identify certain subsystems associated with the tree as system modules and provide a collection of minimal cut set families that essentially expresses the state of the system as a function of these module state variables. Another FTAP feature allows a subfamily to be obtained when the family of minimal cut sets or prime implicants is too large to be found in its entirety; this subfamily consists only of sets that are interesting to the analyst in a special sense
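
    A minimal sketch of the cut-set idea underlying this kind of analysis (not FTAP itself, nor its FORTRAN implementation): expand a small, invented fault tree bottom-up and keep only the minimal cut sets.

      from itertools import product

      # A small, invented fault tree: the top event is reached through two AND gates
      # combined by an OR gate. Basic events are any names that do not appear as gates.
      tree = {
          "TOP": ("OR",  ["G1", "G2"]),
          "G1":  ("AND", ["A", "B"]),
          "G2":  ("AND", ["B", "C"]),
      }

      def cut_sets(event):
          if event not in tree:                       # basic event
              return [frozenset([event])]
          gate, inputs = tree[event]
          child_sets = [cut_sets(i) for i in inputs]
          if gate == "OR":                            # union of the children's cut sets
              return [cs for child in child_sets for cs in child]
          # AND gate: merge one cut set from each child, over all combinations
          return [frozenset().union(*combo) for combo in product(*child_sets)]

      def minimize(sets):
          # keep only sets that do not properly contain another cut set
          return [s for s in sets if not any(t < s for t in sets)]

      print(sorted(map(sorted, minimize(cut_sets("TOP")))))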

  20. A preliminary analysis of the reactor-based plutonium disposition alternative deployment schedules

    International Nuclear Information System (INIS)

    Zurn, R.M.

    1997-09-01

    This paper discusses the preliminary analysis of the implementation schedules of the reactor-based plutonium disposition alternatives. These schedule analyses are a part of a larger process to examine the nine decision criteria used to determine the most appropriate method of disposing of U.S. surplus weapons plutonium. The preliminary analysis indicates that the mission durations for the reactor-based alternatives range from eleven years to eighteen years and the initial mission fuel assemblies containing surplus weapons-usable plutonium could be loaded into the reactors between nine and fourteen years after the Record of Decision

  1. Purification, crystallization and preliminary X-ray structure analysis of the laccase from Ganoderma lucidum

    International Nuclear Information System (INIS)

    Lyashenko, Andrey V.; Belova, Oksana; Gabdulkhakov, Azat G.; Lashkov, Alexander A.; Lisov, Alexandr V.; Leontievsky, Alexey A.; Mikhailov, Al’bert M.

    2011-01-01

    The purification, crystallization and preliminary X-ray structure analysis of the laccase from G. lucidum are reported. The ligninolytic enzymes of the basidiomycetes play a key role in the global carbon cycle. A characteristic property of these enzymes is their broad substrate specificity, which has led to their use in various biotechnologies, thus stimulating research into the three-dimensional structures of ligninolytic enzymes. This paper presents the purification, crystallization and preliminary X-ray analysis of the laccase from the ligninolytic basidiomycete Ganoderma lucidum

  2. NRC staff preliminary analysis of public comments on advance notice of proposed rulemaking on emergency planning

    International Nuclear Information System (INIS)

    Peabody, C.A.; Hickey, J.W.N.

    1980-01-01

    The Nuclear Regulatory Commission (NRC) published an advance notice of proposed rulemaking on emergency planning on July 17, 1979 (44 FR 41483). In October and November 1979, the NRC staff submitted several papers to the Commission related to the emergency planning rulemaking. One of these papers was a preliminary analysis of public comments received on the advance notice (SECY-79-591B, November 13, 1979). This document consists of the preliminary analysis as it was submitted to the Commission, with minor editorial changes

  3. Computational System For Rapid CFD Analysis In Engineering

    Science.gov (United States)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  4. Preliminary Axial Flow Turbine Design and Off-Design Performance Analysis Methods for Rotary Wing Aircraft Engines. Part 1; Validation

    Science.gov (United States)

    Chen, Shu-cheng, S.

    2009-01-01

    For the preliminary design and the off-design performance analysis of axial flow turbines, a pair of intermediate level-of-fidelity computer codes, TD2-2 (design; reference 1) and AXOD (off-design; reference 2), are being evaluated for use in turbine design and performance prediction of the modern high performance aircraft engines. TD2-2 employs a streamline curvature method for design, while AXOD approaches the flow analysis with an equal radius-height domain decomposition strategy. Both methods resolve only the flows in the annulus region while modeling the impact introduced by the blade rows. The mathematical formulations and derivations involved in both methods are documented in references 3 and 4 (for TD2-2) and in reference 5 (for AXOD). The focus of this paper is to discuss the fundamental issues of applicability and compatibility of the two codes as a pair of companion pieces, to perform preliminary design and off-design analysis for modern aircraft engine turbines. Two validation cases for the design and the off-design prediction using TD2-2 and AXOD, conducted on two existing high efficiency turbines developed and tested in the NASA/GE Energy Efficient Engine (GE-E3) Program, the High Pressure Turbine (HPT; two stages, air cooled) and the Low Pressure Turbine (LPT; five stages, un-cooled), are provided in support of the analysis and discussion presented in this paper.

  5. Thick Concrete Specimen Construction, Testing, and Preliminary Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Clayton, Dwight A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hoegh, Kyle [Univ. of Minnesota, Minneapolis, MN (United States); Khazanovich, Lev [Univ. of Minnesota, Minneapolis, MN (United States)

    2015-03-01

    The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years. Since many important safety structures in an NPP are constructed of concrete, inspection techniques must be developed and tested to evaluate the internal condition. In-service containment structures generally do not allow for the destructive measures necessary to validate the accuracy of these inspection techniques. This creates a need for comparative testing of the various nondestructive evaluation (NDE) measurement techniques on concrete specimens with known material properties, voids, internal microstructure flaws, and reinforcement locations. A preliminary report detailed some of the challenges associated with thick reinforced concrete sections and prioritized conceptual designs of specimens that could be fabricated to represent NPP concrete structures for use in NDE evaluation comparisons. This led to the construction of the concrete specimen presented in this report, which has sufficient reinforcement density and cross-sectional size to represent an NPP containment wall. Details on how a suitably thick concrete specimen was constructed are presented, including the construction materials, final nominal design schematic, as well as formwork and rigging required to safely meet the desired dimensions of the concrete structure. The report also details the type and methods of forming the concrete specimen as well as information on how the rebar and simulated defects were embedded. Details on how the resulting specimen was transported, safely anchored, and marked to allow access for systematic comparative NDE testing of defects in a representative NPP containment wall concrete specimen are also given. Data collection using the MIRA Ultrasonic NDE equipment and

  6. Development of the GOSAT-2 FTS-2 Simulator and Preliminary Sensitivity Analysis for CO2 Retrieval

    Science.gov (United States)

    Kamei, A.; Yoshida, Y.; Dupuy, E.; Hiraki, K.; Yokota, Y.; Oishi, Y.; Murakami, K.; Morino, I.; Matsunaga, T.

    2013-12-01

    The Greenhouse Gases Observing Satellite-2 (GOSAT-2), which is a successor mission to the GOSAT, is planned to be launched in FY 2017. The Fourier Transform Spectrometer-2 (FTS-2) onboard the GOSAT-2 is a primary sensor to observe infrared light reflected and emitted from the Earth's surface and atmosphere. The FTS-2 obtains high-spectral resolution spectra with four bands from near to short-wavelength infrared (SWIR) region and one band in the thermal infrared (TIR) region. The column amounts of carbon dioxide (CO2) and methane (CH4) are retrieved from the obtained radiance spectra with SWIR bands. Compared to the FTS onboard the GOSAT, the FTS-2 includes an additional SWIR band to allow for carbon monoxide (CO) measurement. We have been developing a tool, named GOSAT-2 FTS-2 simulator, which is capable of simulating the spectral radiance data observed by the FTS-2 using the Pstar2 radiative transfer code. The purpose of the GOSAT-2 FTS-2 simulator is to obtain data which is exploited in the sensor specification, the optimization of parameters for Level 1 processing, and the improvement of Level 2 algorithms. The GOSAT-2 FTS-2 simulator, composed of the six components: 1) Overall control, 2) Onboarding platform, 3) Spectral radiance calculation, 4) Fourier transform, 5) L1B processing, and 6) L1B data output, has been installed on the GOSAT Research Computation Facility (GOSAT RCF), which is a large-scale, high-performance, and energy-efficient computer. We present the progress in the development of the GOSAT-2 FTS-2 simulator and the preliminary sensitivity analysis, relating to the engineering parameters, the aerosols and clouds, and so on, on the Level 1 processing for CO2 retrieval from the obtained data by simulating the FTS-2 SWIR observation using the GOSAT-2 FTS-2 simulator.

  7. How Well Can a Computer Program Teach German Culture? Some Preliminary Findings from EthnoDeutsch.

    Science.gov (United States)

    Ashby, Wendy; Ostertag, Veronica

    2002-01-01

    Investigates the effectiveness of an interactive, computer-mediated instructional segment designed to educate students about ethnicity in German-speaking countries. Fifty-two intermediate German students worked with computer-mediated segments and rated the segments' effectiveness on a Likert-scale questionnaire. (AS)

  8. Preliminary Computational Fluid Dynamics (CFD) Simulation of EIIB Push Barge in Shallow Water

    Science.gov (United States)

    Beneš, Petr; Kollárik, Róbert

    2011-12-01

    This study presents a preliminary CFD simulation of an EIIb push barge in inland conditions using the CFD software Ansys Fluent. RANSE (Reynolds Averaged Navier-Stokes Equation) methods are used for the viscous solution of the turbulent flow around the ship hull. Different RANSE methods are used and their results compared for ship resistance calculations, in order to select the appropriate methods and remove inappropriate ones. This study further describes the creation of the geometrical model, which considers the exact water depth to vessel draft ratio in shallow water conditions, the grid generation, the setting of the mathematical model in Fluent, and the evaluation of the simulation results.

  9. Preliminary Coupling of MATRA Code for Multi-physics Analysis

    International Nuclear Information System (INIS)

    Kim, Seongjin; Choi, Jinyoung; Yang, Yongsik; Kwon, Hyouk; Hwang, Daehyun

    2014-01-01

    The boundary conditions, such as the inlet temperature, mass flux, averaged heat flux, power distributions of the rods, and core geometry, are given as constant values or as functions of time. These conditions are separately calculated and provided to the MATRA code by other codes, such as neutronics or system codes. In addition, the focus is on the coupling of several codes from different physics fields and its implementation. In this study, multiphysics coupling methods were developed for a subchannel code (MATRA) with neutronics codes (MASTER, DeCART) and a fuel performance code (FRAPCON-3). Preliminary evaluation results for representative sample cases are presented. The MASTER and DeCART codes provide the power distribution of the rods in the core to the MATRA code. In the case of the FRAPCON-3 code, the variation of the rod diameter induced by thermal expansion is calculated and provided. The MATRA code transfers the thermal-hydraulic conditions that each code needs. Moreover, the coupling method with each code is described
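
    A minimal sketch of the kind of fixed-point coupling loop implied above: a neutronics solver supplies rod power, a thermal solver returns a fuel temperature, and the exchange repeats until both fields stop changing. The two stand-in functions are toy models, not MASTER, DeCART, FRAPCON-3 or MATRA.

      # Fixed-point (Picard) coupling sketch: a toy neutronics model supplies rod power,
      # a toy thermal model returns fuel temperature, and the exchange repeats until both
      # fields stop changing. The stand-in functions are not the actual coupled codes.

      def neutronics_power(fuel_temp):
          # toy Doppler feedback: power drops slightly as the fuel temperature rises
          return 100.0 * (1.0 - 1.0e-4 * (fuel_temp - 900.0))

      def thermal_fuel_temp(power):
          # toy thermal response: fuel temperature rises with linear power
          return 600.0 + 3.0 * power

      power, fuel_temp = 120.0, 1000.0             # deliberately off the converged state
      for it in range(50):
          new_power = neutronics_power(fuel_temp)
          new_temp  = thermal_fuel_temp(new_power)
          if abs(new_power - power) < 1e-6 and abs(new_temp - fuel_temp) < 1e-6:
              break
          power, fuel_temp = new_power, new_temp

      print("converged after %d iterations: power = %.3f, fuel T = %.2f K" % (it, power, fuel_temp))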

  10. Seismic response of the Transamerica Building. I. Data and preliminary analysis

    Science.gov (United States)

    Celebi, M.; Safak, E.

    1991-01-01

    The objective of this paper is to present preliminary analyses of a set of acceleration response records obtained during the October 17, 1989 Loma Prieta earthquake (Ms = 7.1) from the 60-story, vertically tapered, pyramid-shaped Transamerica Building, a landmark of San Francisco. The building was instrumented in 1985 with 22 channels of synchronized sensors consisting of 13 uniaxial accelerometers deployed throughout the structure and connected to a central recording system, and three triaxial strong-motion accelerographs at three different levels of the structure. There are no free-field accelerographs at the site. The acceleration records permit the study of the behavior of this unique structure. The predominant translational response of the building and the associated frequency at approximately 0.28 Hz are identified from the records and their Fourier amplitude spectra. The records do not indicate any significant torsional motion. However, there is rocking-type soil-structure interaction, and an associated frequency of approximately 2.0 Hz is identified from the Fourier amplitude spectra of the differential motions between the ground level and the basement. In addition, the response spectra for the basement motions indicate significant resonance in both directions at a period of approximately 0.5 seconds.
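
    Identifying a dominant frequency from a recorded acceleration history, as done above with Fourier amplitude spectra, can be sketched as follows; the synthetic record simply embeds components near the reported 0.28 Hz and 2.0 Hz frequencies and is not the Transamerica Building data.

      import numpy as np

      # Identify a dominant frequency from an acceleration time history via its Fourier
      # amplitude spectrum. The synthetic record embeds components near the reported
      # 0.28 Hz translational and 2.0 Hz rocking-related frequencies.
      fs = 100.0                                      # sampling rate, Hz (assumed)
      t  = np.arange(0.0, 120.0, 1.0 / fs)
      rng = np.random.default_rng(3)
      record = (np.sin(2 * np.pi * 0.28 * t)
                + 0.3 * np.sin(2 * np.pi * 2.0 * t)
                + 0.2 * rng.standard_normal(t.size))

      spectrum = np.abs(np.fft.rfft(record))
      freqs    = np.fft.rfftfreq(record.size, d=1.0 / fs)

      dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the zero-frequency bin
      print("dominant frequency: %.2f Hz" % dominant)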

  11. Preliminary Analysis of Species Partitioning in the DWPF Melter

    Energy Technology Data Exchange (ETDEWEB)

    Choi, A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Kesterson, M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Johnson, F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); McCabe, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-07-15

    The work described in this report is preliminary in nature since its goal was to demonstrate the feasibility of estimating the off-gas entrainment rates from the Defense Waste Processing Facility (DWPF) melter based on a simple mass balance using measured feed and glass pour stream compositions and time-averaged melter operating data over the duration of one canister-filling cycle. The only case considered in this study involved the SB6 pour stream sample taken while Canister #3472 was being filled over a 20-hour period on 12/20/2010, approximately three months after the bubblers were installed. The analytical results for that pour stream sample provided the necessary glass composition data for the mass balance calculations. To estimate the “matching” feed composition, which is not necessarily the same as that of the Melter Feed Tank (MFT) batch being fed at the time of pour stream sampling, a mixing model was developed involving three preceding MFT batches as well as the one being fed at that time, based on the assumption of perfect mixing in the glass pool but with an induction period to account for the process delays involved in the calcination/fusion step in the cold cap and the melter turnover.

  12. Monitoring the Microgravity Environment Quality On-board the International Space Station Using Soft Computing Techniques. Part 2; Preliminary System Performance Results

    Science.gov (United States)

    Jules, Kenol; Lin, Paul P.; Weiss, Daniel S.

    2002-01-01

    This paper presents the preliminary performance results of the artificial intelligence monitoring system in full operational mode using near-real-time acceleration data downlinked from the International Space Station. Preliminary microgravity environment characterization analysis results for the International Space Station (Increment-2), obtained using the monitoring system, are presented. Also presented is a comparison between the system's predicted performance, based on ground test data for the US laboratory "Destiny" module, and its actual on-orbit performance, using measured acceleration data from the U.S. laboratory module of the International Space Station. Finally, preliminary on-orbit disturbance magnitude levels are presented for the Experiment of Physics of Colloids in Space, and are compared with ground test data. The ground test data for the Experiment of Physics of Colloids in Space were acquired from the Microgravity Emission Laboratory, located at the NASA Glenn Research Center, Cleveland, Ohio. The artificial intelligence monitoring system was developed by the NASA Glenn Principal Investigator Microgravity Services Project to help the principal investigator teams identify the primary vibratory disturbance sources that are active, at any moment of time, on board the International Space Station and that might impact the microgravity environment their experiments are exposed to. From the Principal Investigator Microgravity Services' web site, the principal investigator teams can monitor, via a dynamic graphical display implemented in Java, in near real time which event(s) is/are on, such as crew activities, pumps, fans, centrifuges, compressors, crew exercise, structural modes, etc., and decide whether or not to run their experiments, whenever that is an option, based on the acceleration magnitude and frequency sensitivity associated with that experiment. This monitoring system detects primarily the vibratory disturbance sources. The system has built-in capability to detect both known

  13. Analysis on the security of cloud computing

    Science.gov (United States)

    He, Zhonglin; He, Yuhua

    2011-02-01

    Cloud computing is a new technology arising from the fusion of computer technology and Internet development, and it will lead a revolution in the IT and information field. However, in cloud computing, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy; the resulting security problems are the main obstacle to improving the quality of cloud services. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspectives of cloud computing users and service providers.

  14. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.
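
    As a hedged illustration only (not the SNL methodology or the MELCOR interface), the sketch below shows how a handful of uncertain core melt progression parameters could be stratified with Latin hypercube sampling before being propagated through a severe-accident code; the parameter names and ranges are invented.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical uncertain inputs with (low, high) ranges -- illustrative only.
        parameters = {
            "zircaloy_failure_temp_K": (2100.0, 2500.0),
            "debris_porosity": (0.30, 0.50),
            "candling_heat_transfer_W_m2K": (500.0, 2000.0),
        }

        def latin_hypercube(n_samples, bounds):
            """Return n_samples stratified draws for each (low, high) parameter."""
            samples = {}
            for name, (lo, hi) in bounds.items():
                strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
                samples[name] = lo + rng.permutation(strata) * (hi - lo)
            return samples

        samples = latin_hypercube(200, parameters)
        # Each of the 200 rows would define one code input deck; the resulting figure
        # of merit (e.g., hydrogen production) is then summarized with percentiles.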

  15. Challenge for knowledge information processing systems (preliminary report on Fifth Generation Computer Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Moto-oka, T

    1982-01-01

    The author explains the reasons, aims and strategies for the Fifth Generation Computer Project in Japan. The project aims to introduce a radical new breed of computer by 1990. This article outlines the economic and social reasons for the project. It describes the impacts and effects that these computers are expected to have. The areas of technology which will form the contents of the research and development are highlighted. These are areas such as VLSI technology, speech and image understanding systems, artificial intelligence and advanced architecture design. Finally a schedule for completion of research is given which aims for a completed project by 1990.

  16. Incremental ALARA cost/benefit computer analysis

    International Nuclear Information System (INIS)

    Hamby, P.

    1987-01-01

    Commonwealth Edison Company has developed and is testing an enhanced Fortran computer program to be used for cost/benefit analysis of radiation reduction projects at its six nuclear power facilities and corporate technical support groups. This paper describes a macro-driven IBM mainframe program comprising two different types of analyses: an abbreviated version with fixed costs and base values, and an extended engineering version for a detailed, more thorough and time-consuming approach. The extended engineering version breaks radiation exposure costs down into two components: health-related costs and replacement labor costs. According to user input, the program automatically adjusts these two cost components and applies the results to company economic analyses such as replacement power costs, carrying charges, debt interest, and capital investment costs. The results from one or more program runs using different parameters may be compared in order to determine the most appropriate ALARA dose reduction technique. Benefits of this particular cost/benefit analysis technique include flexibility to accommodate a wide range of user data and pre-job preparation, as well as the use of proven and standardized company economic equations
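
    A minimal sketch of the kind of incremental cost/benefit arithmetic such a program automates is given below; the dollar-per-person-rem value, discount rate, and project figures are hypothetical placeholders, not Commonwealth Edison values.

        def alara_net_benefit(dose_averted_person_rem, project_cost_usd,
                              annual_replacement_labor_usd,
                              usd_per_person_rem=2000.0, years=10, discount_rate=0.07):
            """Present worth of annual health-related and replacement-labor savings
            from the dose averted, minus the up-front project cost (illustrative)."""
            annual_saving = (dose_averted_person_rem * usd_per_person_rem
                             + annual_replacement_labor_usd)
            present_worth = sum(annual_saving / (1.0 + discount_rate) ** year
                                for year in range(1, years + 1))
            return present_worth - project_cost_usd

        # Compare two candidate dose-reduction techniques and keep the better one.
        options = {"add shielding": alara_net_benefit(5.0, 40_000.0, 3_000.0),
                   "system flush": alara_net_benefit(8.0, 90_000.0, 1_000.0)}
        best_option = max(options, key=options.get)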

  17. Computing in Qualitative Analysis: A Healthy Development?

    Science.gov (United States)

    Richards, Lyn; Richards, Tom

    1991-01-01

    Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a software package for qualitative analysis. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…

  18. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
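
    The sketch below is a toy discrete-event model in the same spirit (stochastic request demand against a fixed server pool); it is not the authors' framework, and the distributions and sizes are made up.

        import random

        def simulate(n_servers=4, n_requests=10_000, arrival_rate=3.0, service_rate=1.0):
            """Mean queueing delay for exponential arrivals and service times."""
            random.seed(0)
            busy_until = [0.0] * n_servers      # time each server next becomes free
            clock, total_wait = 0.0, 0.0
            for _ in range(n_requests):
                clock += random.expovariate(arrival_rate)          # next arrival
                server = min(range(n_servers), key=lambda i: busy_until[i])
                start = max(clock, busy_until[server])
                total_wait += start - clock
                busy_until[server] = start + random.expovariate(service_rate)
            return total_wait / n_requests

        print(f"mean wait: {simulate():.3f} time units")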

  19. Preliminary Evaluation of the Computer-Based Tactics Certification Course--Principles of War Module

    National Research Council Canada - National Science Library

    Pleban, Robert

    1997-01-01

    This report describes a portion of the U.S. Army Research Institute for the Behavioral and Social Sciences Infantry Forces Research Unit's work in the formative evaluation of the computer based Tactics Certification Course (TCC...

  20. Preliminary Experimental Analysis of Soil Stabilizers for Contamination Control

    International Nuclear Information System (INIS)

    Lagos, L.; Varona, J.; Zidan, A.; Gudavalli, R.; Wu, Kuang-His

    2006-01-01

    A major focus of the Department of Energy's (DOE's) environmental management mission at the Hanford site involves characterizing and remediating contaminated soil and groundwater; stabilizing contaminated soil; remediating disposal sites; decontaminating and decommissioning structures, and demolishing former plutonium production process buildings, nuclear reactors, and separation plants; maintaining inactive waste sites; transitioning facilities into the surveillance and maintenance program; and mitigating effects to biological and cultural resources from site development and environmental cleanup and restoration activities. For example, a total of 470,914 metric tons of contaminated soil from 100 Areas remediation activities was disposed at the Environmental Restoration Disposal Facility (ERDF) during 2004. The Applied Research Center (ARC) at Florida International University (FIU) is supporting the Hanford site remediation program by analyzing the effectiveness of several soil stabilizers (fixatives) for contamination control during excavation activities. The study focuses on determining the effects of varying soil conditions, temperature, humidity and wind velocity on the effectiveness of the candidate stabilizers. The test matrix consists of a soil penetration-depth study, wind tunnel experiments for determination of threshold velocity, and temperature- and moisture-controlled drying/curing experiments. These three sets of experiments are designed to verify performance metrics, as well as provide insight into which fundamental forces are altered by the use of the stabilizer. This paper only presents the preliminary results obtained during wind tunnel experiments using dry Hanford soil samples (with 2.7% moisture by weight). These dry soil samples were exposed to varying wind speeds from 2.22 m/sec to 8.88 m/sec. Furthermore, airborne particulate data were collected for the dry Hanford soil experiments using an aerosol analyzer instrument. (authors)

  1. Preliminary study of elemental analysis of hydroxyapatite used neutron activation analysis method

    International Nuclear Information System (INIS)

    Yustinus Purwamargapratala; Rina Mulyaningsih

    2010-01-01

    A preliminary elemental analysis of synthesized hydroxyapatite has been carried out using the neutron activation analysis method. Hydroxyapatite, the main constituent of bones and teeth, can be synthesized from limestone and phosphoric acid, and can be used as a substitute material for human and animal bones and teeth. Tests of the metal content are necessary to prevent the risk of damage to bones and teeth due to contamination. In samples irradiated at a neutron flux of 10^3 n s^-1 cm^-2 for one minute, impurities of Al (48.60±6.47 mg/kg), Cl (38.00±7.47 mg/kg), Mn (1.05±0.19 mg/kg), and Mg (2095.30±203.66 mg/kg) were detected, whereas with irradiation times of 10 and 40 minutes and a decay time of three days, K (103.89±26.82 mg/kg), Br (1617.06±193.66 mg/kg), and Na (125.10±9.57 mg/kg) were found. These results indicate the presence of Al, Cl, Mn, Mg, Br, K and Na impurities, although in very small amounts that do not cause damage to bones and teeth. (author)

  2. Development of Graphical Solution for Computer-Assisted Fault Diagnosis: Preliminary Study

    International Nuclear Information System (INIS)

    Yoon, Han Bean; Yun, Seung Man; Han, Jong Chul

    2009-01-01

    We have developed software for converting the volumetric voxel data obtained from X-ray computed tomography (CT) into computer-aided design (CAD) data. The developed software can be used for non-destructive testing and evaluation, reverse engineering, rapid prototyping, etc. The main algorithms employed in the software are image reconstruction, volume rendering, segmentation, and mesh data generation. The feasibility of the developed software is demonstrated with the CT data of human maxilla and mandible bones
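
    The authors' software is not described in detail here, so the following is only a generic analogue of the segmentation and mesh-generation steps named above, using scikit-image's marching cubes and a hand-written OBJ export.

        import numpy as np
        from skimage import measure

        def volume_to_obj(volume, iso_level, path, voxel_size=(1.0, 1.0, 1.0)):
            """Extract an iso-surface from a 3-D CT volume and write it as an OBJ mesh."""
            verts, faces, _normals, _values = measure.marching_cubes(
                volume, level=iso_level, spacing=voxel_size)
            with open(path, "w") as f:
                for v in verts:
                    f.write(f"v {v[0]} {v[1]} {v[2]}\n")
                for tri in faces + 1:           # OBJ face indices are 1-based
                    f.write(f"f {tri[0]} {tri[1]} {tri[2]}\n")

        # Synthetic sphere standing in for a segmented bone region.
        x, y, z = np.mgrid[-32:32, -32:32, -32:32]
        volume_to_obj((x**2 + y**2 + z**2 < 20**2).astype(float), 0.5, "sphere.obj")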

  3. Can cloud computing benefit health services? - a SWOT analysis.

    Science.gov (United States)

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare but there are a number of issues that will need to be addressed before its widespread use in healthcare.

  4. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, Ispra on the use of computer codes for complex systems analysis. The computer codes dealt with are the CAFTS-SALP software package, FRANTIC, FTAP, the RALLY computer code package, and the BOUNDS codes. Two reference study cases were executed by each code. The results obtained from the logic/probabilistic analyses, as well as the computation times, are compared
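
    As a hedged illustration of the kind of logic/probabilistic calculation such codes perform (not the JRC benchmark cases themselves), the sketch below evaluates a top-event probability from invented minimal cut sets, assuming independent basic events.

        from itertools import combinations

        basic_events = {"pump_a": 1e-3, "pump_b": 1e-3, "valve": 5e-4, "power": 1e-4}
        minimal_cut_sets = [{"pump_a", "pump_b"}, {"valve", "power"}, {"pump_a", "valve"}]

        def cut_set_probability(cut_set):
            p = 1.0
            for event in cut_set:
                p *= basic_events[event]
            return p

        # Rare-event (first-order) approximation and a second-order lower bound.
        first_order = sum(cut_set_probability(c) for c in minimal_cut_sets)
        second_order = first_order - sum(cut_set_probability(a | b)
                                         for a, b in combinations(minimal_cut_sets, 2))
        print(f"top-event probability between {second_order:.3e} and {first_order:.3e}")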

  5. Chemical Analysis of the Moon at the Surveyor VI Landing Site: Preliminary Results.

    Science.gov (United States)

    Turkevich, A L; Patterson, J H; Franzgrote, E J

    1968-06-07

    The alpha-scattering experiment aboard soft-landing Surveyor VI has provided a chemical analysis of the surface of the moon in Sinus Medii. The preliminary results indicate that, within experimental errors, the composition is the same as that found by Surveyor V in Mare Tranquillitatis. This finding suggests that large portions of the lunar maria resemble basalt in composition.

  6. Chemical Analysis of the Moon at the Surveyor VII Landing Site: Preliminary Results.

    Science.gov (United States)

    Turkevich, A L; Franzgrote, E J; Patterson, J H

    1968-10-04

    The alpha-scattering experiment aboard Surveyor VII has provided a chemical analysis of the moon in the area of the crater Tycho. The preliminary results indicate a chemical composition similar to that already found at two mare sites, but with a lower concentration of elements of the iron group (titanium through copper).

  7. Current Mooring Design in Partner WECs and Candidates for Preliminary Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Ferri, Francesco; Kofoed, Jens Peter

    This report is the combined report of Commercial Milestone "CM1: Design and Cost of Current Mooring Solutions of Partner WECs" and Milestone "M3: Mooring Solutions for Preliminary Analysis" of the EUDP project "Mooring Solutions for Large Wave Energy Converters". The report covers a description o...

  8. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  9. Preliminary fire hazard analysis for the PUTDR and TRU trenches in the Solid Waste Burial Ground

    International Nuclear Information System (INIS)

    Gaschott, L.J.

    1995-01-01

    This document represents the Preliminary Fire Hazards Analysis for the Pilot Unvented TRU Drum Retrieval effort and for the Transuranic drum trenches in the low level burial grounds. The FHA was developed in accordance with DOE Order 5480.7A to address major hazards inherent in the facility

  10. A Preliminary Analysis of the Outcomes of Students Assisted by VET FEE-HELP: Summary

    Science.gov (United States)

    National Centre for Vocational Education Research (NCVER), 2015

    2015-01-01

    This summary highlights the key findings from the report "A preliminary analysis of the outcomes of students assisted by VET FEE-HELP". VET FEE-HELP is an income-contingent loan scheme that assists eligible students undertaking certain vocational education training (VET) courses with an approved provider by paying for all or part of…

  11. Expression, purification, crystallization and preliminary X-ray analysis of Aeromonas hydrophila metallo-β-lactamase

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Nandini, E-mail: nandini-sharma@merck.com; Toney, Jeffrey H.; Fitzgerald, Paula M. D.

    2005-02-01

    Crystallization and preliminary X-ray analysis of the CphA metallo-β-lactamase from A. hydrophila are described. The crystals belong to space group P2₁2₁2, with unit-cell parameters a = 40.75, b = 42.05, c = 128.88 Å, and diffract to 1.8 Å.

  12. Preliminary safety analysis of unscrammed events for KLFR

    International Nuclear Information System (INIS)

    Kim, S.J.; Ha, G.S.

    2005-01-01

    The report presents the design features of KLFR, the safety analysis code, steady-state calculation results, and analysis results for unscrammed events. The calculations of the steady state and the unscrammed events have been performed for the conceptual design of KLFR using the SSC-K code. The UTOP event results in no fuel damage and no centre-line melting. The inherent safety features are demonstrated through the analysis of the ULOHS event. Although the analysis of ULOF involves many uncertainties in the pump design, the results show the inherent safety characteristics. Natural circulation of about 6% of rated flow is established in the case of ULOF. In the metallic fuel rod, the cladding temperature is somewhat high due to the low heat transfer coefficient of lead. The ULOHS event should be considered in the design of the RVACS for long-term cooling

  13. Preliminary analysis of in-reactor behavior of three MOX fuel rods in the halden reactor

    International Nuclear Information System (INIS)

    Koo, Yang Hyun; Lee, Byung Ho; Sohn, Dong Seong; Joo, Hyung Kook

    1999-09-01

    A preliminary analysis of the in-reactor thermal performance of three MOX fuel rods, which are to be irradiated in the Halden reactor from the first quarter of the year 2000, has been conducted using the computer code COSMOS. Assuming that the microstructures of MOX fuel fabricated by the SBR and dry-milling methods are the same, parametric studies have been carried out considering four kinds of uncertainty (thermal conductivity, linear power, manufacturing parameters, and model constants) to investigate the effect of each uncertainty on in-reactor behavior. It is found that the uncertainty in the model constants for fission gas release (FGR) has the greatest impact of all, because the amount of gas released to the gap is one of the parameters that dominantly affect the gap conductance. The parametric analysis shows that, in the case of MOX-1, calculational results vary widely depending on the choice of model constants for FGR. Therefore, the model constants for FGR for the present test need to be established through the measured fuel centerline temperature, rod internal pressure, stack length if any, and finally the thermal conductivity derived from measured data during irradiation. On the other hand, the difference in thermal performance of MOX-3 resulting from the choice of FGR model constants is not so large as that for MOX-1. This might arise because, since the temperature of MOX-3 is high, the capacity of the grain boundaries to retain gas atoms is not sufficient to accommodate the large amount of gas atoms reaching the grain boundaries through diffusion. (Author). 20 refs., 7 tabs., 47 figs

  14. Preliminary analysis of in-reactor behavior of three MOX fuel rods in the halden reactor

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Yang Hyun; Lee, Byung Ho; Sohn, Dong Seong; Joo, Hyung Kook

    1999-09-01

    A preliminary analysis of the in-reactor thermal performance of three MOX fuel rods, which are to be irradiated in the Halden reactor from the first quarter of the year 2000, has been conducted using the computer code COSMOS. Assuming that the microstructures of MOX fuel fabricated by the SBR and dry-milling methods are the same, parametric studies have been carried out considering four kinds of uncertainty (thermal conductivity, linear power, manufacturing parameters, and model constants) to investigate the effect of each uncertainty on in-reactor behavior. It is found that the uncertainty in the model constants for fission gas release (FGR) has the greatest impact of all, because the amount of gas released to the gap is one of the parameters that dominantly affect the gap conductance. The parametric analysis shows that, in the case of MOX-1, calculational results vary widely depending on the choice of model constants for FGR. Therefore, the model constants for FGR for the present test need to be established through the measured fuel centerline temperature, rod internal pressure, stack length if any, and finally the thermal conductivity derived from measured data during irradiation. On the other hand, the difference in thermal performance of MOX-3 resulting from the choice of FGR model constants is not so large as that for MOX-1. This might arise because, since the temperature of MOX-3 is high, the capacity of the grain boundaries to retain gas atoms is not sufficient to accommodate the large amount of gas atoms reaching the grain boundaries through diffusion. (Author). 20 refs., 7 tabs., 47 figs.

  15. Ferrofluids: Modeling, numerical analysis, and scientific computation

    Science.gov (United States)

    Tomas, Ignacio

    This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a
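
    For readers unfamiliar with the MNSE, a commonly used form of the system coupling linear velocity u, angular velocity w, and pressure p is sketched below in LaTeX; the coefficients and notation follow the standard micropolar-fluid literature and may differ from the dissertation's exact formulation.

        \begin{aligned}
          \partial_t \mathbf{u} + (\mathbf{u}\cdot\nabla)\mathbf{u}
            - (\nu + \nu_r)\,\Delta \mathbf{u} + \nabla p
            &= 2\nu_r\, \nabla \times \mathbf{w} + \mathbf{f},
          \qquad \nabla\cdot\mathbf{u} = 0, \\
          \partial_t \mathbf{w} + (\mathbf{u}\cdot\nabla)\mathbf{w}
            - c_1\,\Delta \mathbf{w} - c_2\,\nabla(\nabla\cdot\mathbf{w}) + 4\nu_r\,\mathbf{w}
            &= 2\nu_r\, \nabla \times \mathbf{u} + \mathbf{g}.
        \end{aligned}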

  16. The Preliminary Study for Numerical Computation of 37 Rod Bundle in CANDU Reactor

    International Nuclear Information System (INIS)

    Jeon, Yu Mi; Bae, Jun Ho; Park, Joo Hwan

    2010-01-01

    A typical CANDU 6 fuel bundle consists of 37 fuel rods supported by two endplates and separated by spacer pads at various locations. In addition, bearing pads are brazed to each outer fuel rod with the aim of reducing the contact area between the fuel bundle and the pressure tube. Although the recent progress of CFD methods has provided opportunities for computing the thermal-hydraulic phenomena inside a fuel channel, it is still impractical to represent the detailed shape of the rod bundle in the numerical computation because of the large computing mesh and memory requirements. Hence, previous studies conducted numerical computations for smooth channels without considering spacers and bearing pads. However, it is well known that these components are important factors in predicting the pressure drop and heat transfer rate in a channel. In this study, a new computational method is proposed to handle complex geometries such as a fuel rod bundle. Before applying the method to the 37-rod bundle problem, its validity and accuracy are tested by applying it to a simple geometry. Based on the present results, the calculation for the fully shaped 37-rod bundle is scheduled for future work

  17. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  18. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements in big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and their applications in handling real-life problems. The applications are mostly drawn from real-life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  19. Preliminary design analysis of the ALT-II limiter for TEXTOR

    International Nuclear Information System (INIS)

    Koski, J.A.; Boyd, R.D.; Kempka, S.M.; Romig, A.D. Jr.; Smith, M.F.; Watson, R.D.; Whitley, J.B.; Conn, R.W.; Grotz, S.P.

    1984-01-01

    Installation of a large toroidal belt pump limiter, Advanced Limiter Test II (ALT-II), on the TEXTOR tokamak at Juelich, FRG is anticipated for early 1986. This paper discusses the preliminary mechanical design and materials considerations undertaken as part of the feasibility study phase for ALT-II. Since the actively cooled limiter blade is the component in direct contact with the plasma edge, and thus subject to the severe plasma environment, most preliminary design efforts have concentrated on analysis of the blade. The screening process, which led to the recommended preliminary design consisting of a dispersion-strengthened copper or OFHC copper cover plate over an austenitic stainless steel base plate, is discussed. A 1 to 3 mm thick, low-atomic-number coating consisting of a graded plasma-sprayed silicon carbide-aluminium composite is recommended, subject to further experiments and evaluation. Thermal-hydraulic and stress analyses of the limiter blade are also discussed. (orig.)

  20. The Preliminary Study for Numerical Computation of 37 Rod Bundle in CANDU Reactor

    International Nuclear Information System (INIS)

    Jeon, Yu Mi; Park, Joo Hwan

    2010-09-01

    A typical CANDU 6 fuel bundle consists of 37 fuel rods supported by two endplates and separated by spacer pads at various locations. In addition, bearing pads are brazed to each outer fuel rod with the aim of reducing the contact area between the fuel bundle and the pressure tube. Although the recent progress of CFD methods has provided opportunities for computing the thermal-hydraulic phenomena inside a fuel channel, it is still impractical to represent the detailed shape of the rod bundle in the numerical computation because of the large computing mesh and memory requirements. Hence, previous studies conducted numerical computations for smooth channels without considering spacers and bearing pads. However, it is well known that these components are important factors in predicting the pressure drop and heat transfer rate in a channel. In this study, a new computational method is proposed to handle complex geometries such as a fuel rod bundle. Before applying the method to the 37-rod bundle problem, its validity and accuracy are tested by applying it to a simple geometry. The split channel method has been proposed with the aim of computing the fully shaped CANDU fuel channel with detailed components. The validity was tested by applying the method to the single channel problem. The average temperatures have similar values for the two methods considered, while the local temperature shows a slight difference due to the effect of conduction heat transfer in the solid region of a rod. Based on the present results, the calculation for the fully shaped 37-rod bundle is scheduled for future work

  1. Computational Analysis of Pharmacokinetic Behavior of Ampicillin

    Directory of Open Access Journals (Sweden)

    Mária Ďurišová

    2016-07-01

    The objective of this study was to perform a computational analysis of the pharmacokinetic behavior of ampicillin, using data from the literature. A method based on the theory of dynamic systems was used for modeling purposes. The method has been introduced to pharmacokinetics with the aim of contributing to the knowledge base by enabling researchers to develop mathematical models of various pharmacokinetic processes in an identical way, using identical model structures. A few examples of the successful use of this modeling method in pharmacokinetics can be found in full-text articles available free of charge at the author's website, and in the example given in this study. The modeling method employed in this study can be used to develop a mathematical model of the pharmacokinetic behavior of any drug, under the condition that the pharmacokinetic behavior of the drug under study can be at least partially approximated using linear models.
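
    As a hedged, generic illustration of treating drug disposition as a linear dynamic system (the paper's actual transfer-function structure is not reproduced here), the sketch below convolves a short intravenous infusion with a one-compartment impulse response; all parameter values are invented.

        import numpy as np

        dt = 0.1                                   # time step (h)
        t = np.arange(0.0, 24.0, dt)               # hours after start of infusion
        ke, volume = 0.6, 15.0                     # elimination rate (1/h), volume (L)

        impulse_response = np.exp(-ke * t) / volume           # concentration per unit dose
        infusion_rate = np.where(t < 0.5, 1000.0 / 0.5, 0.0)  # 1000 mg over 30 min (mg/h)

        # Linear-system output: convolution of input rate with impulse response (mg/L).
        concentration = np.convolve(infusion_rate, impulse_response)[:t.size] * dt
        peak_mg_per_l = concentration.max()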

  2. Failure mode analysis of preliminary design of ITER divertor impurity monitor

    International Nuclear Information System (INIS)

    Kitazawa, Sin-iti; Ogawa, Hiroaki

    2016-01-01

    Highlights: • The divertor impurity influx monitor for ITER (DIM) is procured by JADA. • The DIM is designed to observe light from the fusion plasma directly. • The DIM is in the preliminary design phase. • Failure modes of the DIM were prepared for RAMI analysis. • A RAMI analysis of the DIM was performed to reduce technical risks. - Abstract: The objective of the divertor impurity influx monitor (DIM) for ITER is to measure the parameters of impurities and hydrogen isotopes (tritium, deuterium, and hydrogen) in divertor plasma using visible and UV spectroscopic techniques in the 200–1000 nm wavelength range. In ITER, special provisions are required to ensure accuracy and full functionality of the diagnostic components under harsh conditions (high temperature, high magnetic field, high vacuum, and high radiation field). The Japan Domestic Agency is preparing the preliminary design of the ITER DIM system, which will be installed in the upper, equatorial and lower ports. The optical and mechanical designs of the DIM are developed to fit ITER's requirements and meet the requirements for spatial resolution. Some auxiliary systems were examined via prototyping. The preliminary design of the ITER DIM system was evaluated by RAMI analysis. The availability of the designed system is adequately high to satisfy the project requirements. However, the designs of some equipment are not yet fixed, which may pose technical risks. The preliminary design should be modified to reduce these technical risks and to prepare the final design.

  3. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  4. Summary of the Preliminary Analysis of Savannah River Depleted Uranium Trioxide

    International Nuclear Information System (INIS)

    2010-01-01

    This report summarizes a preliminary special analysis of the Savannah River Depleted Uranium Trioxide waste stream (SVRSURANIUM03, Revision 2). The analysis is considered preliminary because a final waste profile has not been submitted for review. The special analysis is performed to determine the acceptability of the waste stream for shallow land burial at the Area 5 Radioactive Waste Management Site (RWMS) at the Nevada National Security Site (NNSS). The Savannah River Depleted Uranium Trioxide waste stream requires a special analysis because the waste stream's sum of fractions exceeds one. The 99Tc activity concentration is 98 percent of the NNSS Waste Acceptance Criteria and the largest single contributor to the sum of fractions.
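
    The screening quantity referred to above is a simple ratio sum; the sketch below shows the arithmetic with placeholder concentrations and limits (not the actual SVRSURANIUM03 inventory or NNSS waste acceptance criteria).

        # Placeholder activity concentrations and disposal limits (illustrative units).
        concentrations = {"Tc-99": 3.0e1, "U-238": 4.0e2}
        limits = {"Tc-99": 3.06e1, "U-238": 1.0e5}

        fractions = {nuclide: concentrations[nuclide] / limits[nuclide]
                     for nuclide in concentrations}
        sum_of_fractions = sum(fractions.values())
        needs_special_analysis = sum_of_fractions > 1.0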

  5. Bioelectrical impedance analysis for bovine milk: Preliminary results

    Science.gov (United States)

    Bertemes-Filho, P.; Valicheski, R.; Pereira, R. M.; Paterno, A. S.

    2010-04-01

    This work reports the investigation and analysis of bovine milk quality using electrical impedance spectroscopy (EIS). The samples were first characterized chemically using Fourier transform mid-infrared spectroscopy (FTIR) and flow cytometry. A set of milk samples (100 ml each) obtained from 17 different cows in lactation, with and without mastitis, was analyzed with the proposed EIS technique. The samples were adulterated by adding distilled water and hydrogen peroxide in a controlled manner. FTIR spectroscopy and flow cytometry were performed, and impedance measurements were made in a frequency range from 500 Hz up to 1 MHz with an implemented EIS system. The system's phase shift was compensated by measuring saline solutions. It was possible to show that the results obtained with the Bioelectrical Impedance Analysis (BIA) technique may detect changes in the milk caused by mastitis, as well as the presence of water and hydrogen peroxide in the bovine milk.

  6. Computer aided instruction. Preliminary experience in the Radiological Sciences Institute of the University of Milan

    International Nuclear Information System (INIS)

    Gardani, G.; Bertoli, M.A.; Bellomi, M.

    1987-01-01

    Computerised instruction means teaching by computer, using a program that alternates information with self-checking multiple-choice questions. This system was used to create a fully computerized lesson on the diagnosis and treatment of breast cancer, which was then tested on a small group of medical students attending the Radiology School of the Milan University Institute of Radiological Sciences. At the end of the test, the students were asked to complete a questionnaire, which was then analysed. The computer lesson consisted of 66 text messages and 21 self-checking questions. It aroused considerable interest, though the most common reason was curiosity about a novel system. The degree of fatigue caused was modest, despite the fact that the computer lesson was at least as demanding as a traditional lesson, if not more so. The level of learning was considered high and was optimised by the use of self-checking questions, which were considered an essential element. However, no student agreed to sit an official examination, even interactively, using the computer

  7. Computational implementation of a systems prioritization methodology for the Waste Isolation Pilot Plant: A preliminary example

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States). Dept. of Mathematics; Anderson, D.R. [Sandia National Labs., Albuquerque, NM (United States). WIPP Performance Assessments Departments; Baker, B.L. [Technadyne Engineering Consultants, Albuquerque, NM (United States)] [and others]

    1996-04-01

    A systems prioritization methodology (SPM) is under development to provide guidance to the US DOE on experimental programs and design modifications to be supported in the development of a successful licensing application for the Waste Isolation Pilot Plant (WIPP) for the geologic disposal of transuranic (TRU) waste. The purpose of the SPM is to determine the probabilities that the implementation of different combinations of experimental programs and design modifications, referred to as activity sets, will lead to compliance. Appropriate tradeoffs between compliance probability, implementation cost and implementation time can then be made in the selection of the activity set to be supported in the development of a licensing application. Descriptions are given for the conceptual structure of the SPM and the manner in which this structure determines the computational implementation of an example SPM application. Due to the sophisticated structure of the SPM and the computational demands of many of its components, the overall computational structure must be organized carefully to provide the compliance probabilities for the large number of activity sets under consideration at an acceptable computational cost. Conceptually, the determination of each compliance probability is equivalent to a large numerical integration problem. 96 refs., 31 figs., 36 tabs.
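
    Conceptually, each compliance probability is a high-dimensional integral; the toy Monte Carlo sketch below shows that form for a made-up two-parameter performance model and is not the WIPP SPM calculation itself.

        import numpy as np

        rng = np.random.default_rng(1)

        def compliance_probability(n_samples=100_000):
            # Hypothetical uncertain inputs for one activity set.
            permeability = rng.lognormal(mean=-30.0, sigma=1.0, size=n_samples)
            gas_generation = rng.uniform(0.1, 1.0, size=n_samples)
            # Hypothetical normalized release compared against a limit of 1.0.
            normalized_release = 1.0e12 * permeability * gas_generation
            return float(np.mean(normalized_release < 1.0))

        print(f"estimated compliance probability: {compliance_probability():.3f}")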

  8. Computational implementation of a systems prioritization methodology for the Waste Isolation Pilot Plant: A preliminary example

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-04-01

    A systems prioritization methodology (SPM) is under development to provide guidance to the US DOE on experimental programs and design modifications to be supported in the development of a successful licensing application for the Waste Isolation Pilot Plant (WIPP) for the geologic disposal of transuranic (TRU) waste. The purpose of the SPM is to determine the probabilities that the implementation of different combinations of experimental programs and design modifications, referred to as activity sets, will lead to compliance. Appropriate tradeoffs between compliance probability, implementation cost and implementation time can then be made in the selection of the activity set to be supported in the development of a licensing application. Descriptions are given for the conceptual structure of the SPM and the manner in which this structure determines the computational implementation of an example SPM application. Due to the sophisticated structure of the SPM and the computational demands of many of its components, the overall computational structure must be organized carefully to provide the compliance probabilities for the large number of activity sets under consideration at an acceptable computational cost. Conceptually, the determination of each compliance probability is equivalent to a large numerical integration problem. 96 refs., 31 figs., 36 tabs

  9. A Solar Powered Wireless Computer Mouse: Design, Assembly and Preliminary Testing of 15 Prototypes

    NARCIS (Netherlands)

    van Sark, W.G.J.H.M.; Reich, N.H.; Alsema, E.A.; Netten, M.P.; Veefkind, M.; Silvester, S.; Elzen, B.; Verwaal, M.

    2007-01-01

    The concept and design of a solar powered wireless computer mouse has been completed, and 15 prototypes have been successfully assembled. After necessary cutting, the crystalline silicon cells show satisfactory efficiency: up to 14% when implemented into the mouse device. The implemented voltage

  10. Isolation and preliminary function analysis of a Na + /H + antiporter ...

    African Journals Online (AJOL)

    A full-length cDNA Na+/H+ antiporter gene (MzNHX1) was isolated from Malus zumi according to the homologous Na+/H+ antiporter gene region in plants. Sequence analysis indicated that the cDNA was 2062 bp in length, including an open reading frame (ORF) of 1629 bp, which encoded a predicted polypeptide of 542 ...

  11. A Preliminary Analysis of a Behavioral Classrooms Needs Assessment

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McCray, Cynthia; Lamkins, Carol; Taubman, Mitchell; McEachin, John; Cihon, Joseph H.

    2016-01-01

    Today many special education classrooms implement procedures based upon the principles of Applied Behavior Analysis (ABA) to establish educationally relevant skills and decrease aberrant behaviors. However, it is difficult for school staff and consultants to evaluate the implementation of various components of ABA and general classroom set up. In…

  12. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1996-01-01

    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  13. Architectural analysis for wirelessly powered computing platforms

    NARCIS (Netherlands)

    Kapoor, A.; Pineda de Gyvez, J.

    2013-01-01

    We present a design framework for wirelessly powered generic computing platforms that takes into account various system parameters in response to a time-varying energy source. These parameters are the charging profile of the energy source, computing speed (fclk), digital supply voltage (VDD), energy

  14. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous as they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature has already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. As one of only a few researchers in that field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On the occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell...

  15. Computer vision syndrome (CVS) - Thermographic Analysis

    Science.gov (United States)

    Llamosa-Rincón, L. E.; Jaime-Díaz, J. M.; Ruiz-Cardona, D. F.

    2017-01-01

    The use of computers has grown exponentially in recent decades; the possibility of carrying out many tasks for both professional and leisure purposes has contributed to their wide acceptance by users. The consequences and impact of uninterrupted work with computer screens or displays on visual health have attracted researchers' attention. When spending long periods of time in front of a computer screen, human eyes are subjected to great effort, which in turn triggers a set of symptoms known as Computer Vision Syndrome (CVS). The most common of these are blurred vision, visual fatigue and Dry Eye Syndrome (DES) due to inadequate lubrication of the ocular surface when blinking decreases. An experimental protocol was designed and implemented to perform thermographic studies on healthy human eyes during exposure to computer displays, with the main purpose of comparing the existing differences in temperature variations of healthy ocular surfaces.

  16. Preliminary analysis of a 1:4 scale prestressed concrete containment vessel model

    International Nuclear Information System (INIS)

    Dameron, R.A.; Rashid, Y.R.; Luk, V.K.; Hessheimer, M.F.

    1997-01-01

    Sandia National Laboratories is conducting a research program to investigate the integrity of nuclear containment structures. As part of the program Sandia will construct an instrumented 1:4 scale model of a prestressed concrete containment vessel (PCCV) for pressurized water reactors (PWR), which will be pressure tested up to its ultimate capacity. One of the key program objectives is to develop validated methods to predict the structural performance of containment vessels when subjected to beyond design basis loadings. Analytical prediction of structural performance requires a stepwise, systematic approach that addresses all potential failure modes. The analysis effort includes two and three-dimensional nonlinear finite element analyses of the PCCV test model to evaluate its structural performance under very high internal pressurization. Such analyses have been performed using the nonlinear concrete constitutive model, ANACAP-U, in conjunction with the ABAQUS general purpose finite element code. The analysis effort is carried out in three phases: preliminary analysis; pretest prediction; and post-test data interpretation and analysis evaluation. The preliminary analysis phase serves to provide instrumentation support and identify candidate failure modes. The associated tasks include the preliminary prediction of failure pressure and probable failure locations and the development of models to be used in the detailed failure analyses. This paper describes the modeling approaches and some of the results obtained in the first phase of the analysis effort

  17. Pilot Workload and Speech Analysis: A Preliminary Investigation

    Science.gov (United States)

    Bittner, Rachel M.; Begault, Durand R.; Christopher, Bonny R.

    2013-01-01

    Prior research has questioned the effectiveness of speech analysis to measure the stress, workload, truthfulness, or emotional state of a talker. The question remains regarding the utility of speech analysis for restricted vocabularies such as those used in aviation communications. A part-task experiment was conducted in which participants performed Air Traffic Control read-backs in different workload environments. Participants' subjective workload and the speech qualities of fundamental frequency (F0) and articulation rate were evaluated. A significant increase in subjective workload rating was found for high workload segments. F0 was found to be significantly higher during high workload, while articulation rates were found to be significantly slower. No correlation was found to exist between subjective workload and F0 or articulation rate.

  18. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives, and people use computers to make their lives more convenient; at the same time, however, there are many network information problems that require attention. This paper analyzes computer network information security on the basis of "big data" analysis and puts forward some solutions.

  19. Preliminary analysis of productivity of fruiting fungi on Strzeleckie meadows

    Directory of Open Access Journals (Sweden)

    Barbara Sadowska

    2014-11-01

    Analysis demonstrated that the fresh and dry weight as well as the ash content of fungal fruit bodies collected on a forest-surrounded unmown meadow (Stellario-Deschampsietum Freitag 1957 and Caricetum elatae W.Koch 1926) were lower than the same values for a plot of exploited mown meadow and higher than on an exploited unmown meadow (Arrhenatheretum medioeuropaeum (Br.-Bl.) Oberd. 1952).

  20. Job Search Success in Local Labour Markets - A Preliminary Analysis

    OpenAIRE

    Greig, Malcolm; McQuaid, Ronald W.

    2001-01-01

    This study tests the appropriateness of current government employment policies, in particular the New Deal, in targeting specific groups of unemployed jobseekers. A sample of 169 unemployed jobseekers is divided into those who were successful and unsuccessful in finding employment and each group is analysed in terms of their attributes. A factor analysis of these attributes is then carried out in order to develop typical profiles of unsuccessful jobseekers who are possibly in need of special ...

  1. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  2. Cone-Beam Computed Tomography Evaluation of Mental Foramen Variations: A Preliminary Study

    International Nuclear Information System (INIS)

    Sheikhi, Mahnaz; Karbasi Kheir, Mitra; Hekmatian, Ehsan

    2015-01-01

    Background. The mental foramen is important in surgical operations on premolars because it transmits the mental nerve and vessels. This study evaluated the variations of the mental foramen by cone-beam computed tomography among a selected Iranian population. Materials and Methods. A total of 180 cone-beam computed tomography projections were analyzed in terms of shape, size, direction, and horizontal and vertical positions of the mental foramen on the right and left sides. Results. The most common shape was oval, the most common opening direction was posterior-superior, the most common horizontal position was in line with the second premolar, and the most common vertical position was apical to the adjacent dental root. The mean foramen diameter was 3.59 mm. Conclusion. In addition to the most common types of mental foramen, other variations exist, too. This reflects the significance of preoperative radiographic examinations, especially 3-dimensional images, to prevent nerve damage

  3. Single-photon emission computed tomography in human immunodeficiency virus encephalopathy: A preliminary report

    International Nuclear Information System (INIS)

    Masdeu, J.C.; Yudd, A.; Van Heertum, R.L.; Grundman, M.; Hriso, E.; O'Connell, R.A.; Luck, D.; Camli, U.; King, L.N.

    1991-01-01

    Depression or psychosis in a previously asymptomatic individual infected with the human immunodeficiency virus (HIV) may be psychogenic, related to brain involvement by the HIV or both. Although prognosis and treatment differ depending on etiology, computed tomography (CT) and magnetic resonance imaging (MRI) are usually unrevealing in early HIV encephalopathy and therefore cannot differentiate it from psychogenic conditions. Thirty of 32 patients (94%) with HIV encephalopathy had single-photon emission computed tomography (SPECT) findings that differed from the findings in 15 patients with non-HIV psychoses and 6 controls. SPECT showed multifocal cortical and subcortical areas of hypoperfusion. In 4 cases, cognitive improvement after 6-8 weeks of zidovudine (AZT) therapy was reflected in amelioration of SPECT findings. CT remained unchanged. SPECT may be a useful technique for the evaluation of HIV encephalopathy

  4. Preliminary experimentally-validated forced and mixed convection computational simulations of the Rotatable Buoyancy Tunnel

    International Nuclear Information System (INIS)

    Clifford, Corey E.; Kimber, Mark L.

    2015-01-01

    Although computational fluid dynamics (CFD) has not been directly utilized to perform safety analyses of nuclear reactors in the United States, several vendors are considering adopting commercial numerical packages for current and future projects. To ensure the accuracy of these computational models, it is imperative to validate the assumptions and approximations built into commercial CFD codes against physical data from flows analogous to those in modern nuclear reactors. To this end, researchers at Utah State University (USU) have constructed the Rotatable Buoyancy Tunnel (RoBuT) test facility, which is designed to provide flow and thermal validation data for CFD simulations of forced and mixed convection scenarios. In order to evaluate the ability of current CFD codes to capture the complex physics associated with these types of flows, a computational model of the RoBuT test facility is created using the ANSYS Fluent commercial CFD code. The numerical RoBuT model is analyzed at identical conditions to several experimental trials undertaken at USU. Each experiment is reconstructed numerically and evaluated with the second-order Reynolds stress model (RSM). Two different thermal boundary conditions at the heated surface of the RoBuT test section are investigated: constant temperature (isothermal) and constant surface heat flux (isoflux). Additionally, the fluid velocity at the inlet of the test section is varied in an effort to modify the relative importance of natural convection heat transfer from the heated wall of the RoBuT. Mean velocity, both in the streamwise and transverse directions, as well as components of the Reynolds stress tensor at three points downstream of the RoBuT test section inlet are compared to results obtained from experimental trials. Early computational results obtained from this research initiative are in good agreement with experimental data obtained from the RoBuT facility and both the experimental data and numerical method can be used

  5. Analysis for preliminary evaluation of discrete fracture flow and large-scale permeability in sedimentary rocks

    International Nuclear Information System (INIS)

    Kanehiro, B.Y.; Lai, C.H.; Stow, S.H.

    1987-05-01

    Conceptual models for sedimentary rock settings that could be used in future evaluation and suitability studies are being examined through the DOE Repository Technology Program. One area of concern for the hydrologic aspects of these models is discrete fracture flow analysis as related to the estimation of the size of the representative elementary volume, evaluation of the appropriateness of continuum assumptions and estimation of the large-scale permeabilities of sedimentary rocks. A basis for preliminary analysis of flow in fracture systems of the types that might be expected to occur in low permeability sedimentary rocks is presented. The approach used involves numerical modeling of discrete fracture flow for the configuration of a large-scale hydrologic field test directed at estimation of the size of the representative elementary volume and large-scale permeability. Analysis of fracture data on the basis of this configuration is expected to provide a preliminary indication of the scale at which continuum assumptions can be made

  6. Preliminary RAMI analysis of DFLL TBS for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Dagui [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); University of Science and Technology of China, Hefei, Anhui, 230031 (China); Yuan, Run [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); Wang, Jiaqun, E-mail: jiaqun.wang@fds.org.cn [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); Wang, Fang; Wang, Jin [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China)

    2016-11-15

    Highlights: • We performed the functional analysis of the DFLL TBS. • We performed a failure mode analysis of the DFLL TBS. • We estimated the reliability and availability of the DFLL TBS. • The ITER RAMI approach was applied to the DFLL TBS for technical risk control in the design phase. - Abstract: ITER is the first fusion machine fully designed to prove the physics and technological basis for future fusion power plants. Among the main technical objectives of ITER is to test and validate design concepts of tritium breeding blankets relevant to future fusion power plants. To achieve this goal, China has proposed the dual functional lithium-lead test blanket module (DFLL TBM) concept design. The DFLL TBM and its associated ancillary systems are collectively referred to as the DFLL TBS. The DFLL TBS plays a key role in next-generation fusion reactors. In order to ensure the reliability and availability of the DFLL TBS, a risk control project for the DFLL TBS has been put on the schedule. As part of the ITER technical risk control policy, the RAMI (Reliability, Availability, Maintainability, Inspectability) approach is used to control the technical risk of ITER. In this paper, the RAMI approach was applied to the conceptual design of the DFLL TBS. A functional breakdown was prepared for the DFLL TBS, and the system was divided into 3 main functions and 72 basic functions. Based on the result of the functional breakdown of the DFLL TBS, reliability block diagrams were prepared to estimate the reliability and availability of each function under the stipulated operating conditions. The inherent availability of the DFLL TBS expected after implementation of mitigation actions was calculated to be 98.57% over 2 years based on the ITER reliability database. A Failure Modes, Effects and Criticality Analysis (FMECA) was performed with criticality charts highlighting the risk level of the different failure modes with regard to their probability of occurrence and their effects on the availability.
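
    As a rough illustration of the reliability block diagram arithmetic that underlies such availability estimates (a sketch only: the MTBF and MTTR figures below are hypothetical placeholders, not values from the ITER reliability database or the DFLL TBS functional breakdown), the inherent availability of blocks in series can be computed as follows:

        # Sketch: inherent availability of a series reliability block diagram.
        # All MTBF/MTTR numbers are hypothetical, for illustration only.

        def block_availability(mtbf_h, mttr_h):
            # Inherent availability of one block: A = MTBF / (MTBF + MTTR)
            return mtbf_h / (mtbf_h + mttr_h)

        def series_availability(blocks):
            # Blocks in series: overall availability is the product of block availabilities
            a = 1.0
            for mtbf_h, mttr_h in blocks:
                a *= block_availability(mtbf_h, mttr_h)
            return a

        # Hypothetical (MTBF, MTTR) pairs in hours for three basic functions
        blocks = [(50000.0, 24.0), (80000.0, 48.0), (120000.0, 72.0)]
        print("Inherent availability:", round(series_availability(blocks), 4))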

  7. Preliminary safety analysis for key design features of KALIMER

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, D. H.; Kwon, Y. M.; Chang, W. P.; Suk, S. D.; Lee, S. O.; Lee, Y. B.; Jeong, K. S

    2000-07-01

    KAERI is currently developing the conceptual design of a liquid metal reactor, KALIMER (Korea Advanced Liquid Metal Reactor), under the long-term nuclear R&D program. In this report, descriptions of the KALIMER safety design features and safety analysis results for selected ATWS accidents are presented. First, the basic approach to achieve the safety goal is introduced in chapter 1, and the safety evaluation procedure for the KALIMER design is described in chapter 2. It includes event selection, event categorization, description of design basis events, and beyond design basis events. In chapter 3, results of inherent safety evaluations for the KALIMER conceptual design are presented. The KALIMER core and plant system are designed to assure design performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transients without scram (ATWS) have been performed to investigate the KALIMER system response to the events. They are categorized as bounding events (BEs) because of their low probability of occurrence. In chapter 4, the design of the KALIMER containment dome and the results of its performance analysis are presented. The designs of the existing LMR containment and the KALIMER containment dome have been compared in this chapter. The procedure of the containment performance analysis and the analysis results are described along with the accident scenario and source terms. Finally, a simple methodology is introduced to investigate the core kinetics and hydraulic behavior during an HCDA in chapter 5. Mathematical formulations have been developed in the framework of the modified Bethe-Tait method, and scoping analyses have been performed for the KALIMER core behavior during super-prompt critical excursions.

  8. Preliminary report on the PIXE analysis of the squid statoliths

    International Nuclear Information System (INIS)

    Ikeda, Yuzuru; Arai, Nobuaki; Sakamoto, Wataru; Murayama, Tatsuro; Maeda, Kuniko; Yoshida, Koji.

    1996-01-01

    Trace elements in the squid statolith, a calcareous stone which acts as a balance and hearing organ, were analyzed with Particle Induced X-ray Emission (PIXE) for the Japanese common squid for the first time. Calcium is the main component of the squid statoliths, which means that the squid statolith is a purely calcified structure similar to the fish otolith. Besides Ca, Sr was detected at a high level, and other elements such as Mn, Fe, Cu, Zn and As were also detected. Possible mechanisms of microelement uptake into the statoliths and the suitability of PIXE for statolith analysis are discussed. (author)

  9. Macroalgae as a Biomass Feedstock: A Preliminary Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Roesijadi, Guritno; Jones, Susanne B.; Snowden-Swan, Lesley J.; Zhu, Yunhua

    2010-09-26

    A thorough analysis of macroalgae as a biofuels feedstock is warranted due to the size of this biomass resource and the need to consider all potential sources of feedstock to meet current biomass production goals. Understanding how to harness this untapped biomass resource will require additional research and development. A detailed assessment of environmental resources, cultivation and harvesting technology, conversion to fuels, connectivity with existing energy supply chains, and the associated economic and life cycle analyses will facilitate evaluation of this potentially important biomass resource.

  10. Preliminary safety analysis report for the Waste Characterization Facility

    International Nuclear Information System (INIS)

    1994-10-01

    This safety analysis report outlines the safety concerns associated with the Waste Characterization Facility located in the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory. The three main objectives of the report are to: define and document a safety basis for the Waste Characterization Facility activities; demonstrate how the activities will be carried out to adequately protect the workers, public, and environment; and provide a basis for review and acceptance of the identified risk that the managers, operators, and owners will assume. 142 refs., 38 figs., 39 tabs

  11. Preliminary results of very fast computation of Moment Magnitude and focal mechanism in the context of tsunami warning

    Science.gov (United States)

    Schindelé, François; Roch, Julien; Rivera, Luis

    2015-04-01

    Various methodologies were recently developed to compute the moment magnitude and the focal mechanism, thanks to real-time access to numerous broad-band seismic data. Several methods were implemented at the CENALT, in particular the W-Phase method developed by H. Kanamori and L. Rivera. For earthquakes of magnitudes in the range 6.5-9.0, this method provides accurate results in less than 40 minutes. In the context of tsunami warning in the Mediterranean, a small basin that can be impacted in less than one hour, with small sources but some with high tsunami potential (Boumerdes 2003), a comprehensive tsunami warning system in that region should include very fast computation of the seismic parameters. The values of Mw, the focal depth and the type of fault (reverse, normal, strike-slip) are the most relevant parameters expected for tsunami warning. Preliminary results will be presented using data from the North-eastern and Mediterranean region for the recent period 2010-2014. This work is funded by project ASTARTE - Assessment, Strategy And Risk Reduction for Tsunamis in Europe - FP7-ENV2013 6.4-3, Grant 603839

  12. Preliminary uranium enrichment analysis results using cadmium zinc telluride detectors

    International Nuclear Information System (INIS)

    Lavietes, A.D.; McQuaid, J.H.; Paulus, T.J.

    1995-01-01

    Lawrence Livermore National Laboratory (LLNL) and EG&G ORTEC have jointly developed a portable ambient-temperature detection system that can be used in a number of application scenarios. The detection system uses a planar cadmium zinc telluride (CZT) detector with custom-designed detector support electronics developed at LLNL and is based on the recently released MicroNOMAD multichannel analyzer (MCA) produced by ORTEC. Spectral analysis is performed using software developed at LLNL that was originally designed for use with high-purity germanium (HPGe) detector systems. In one application, the CZT detection system determines uranium enrichments ranging from less than 3% to over 75% to within accuracies of 20%. The analysis was performed using sample sizes of 200 g or larger and acquisition times of 30 min. The authors have demonstrated the capabilities of this system by analyzing the spectra gathered by the CZT detection system from uranium sources of several enrichments. These experiments demonstrate that current CZT detectors can, in some cases, approach performance criteria that were previously the exclusive domain of larger HPGe detector systems
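
    A sketch of the "enrichment meter" principle that gamma-spectrometric enrichment measurements of this kind commonly rely on (the calibration and count-rate numbers below are invented for illustration and are not taken from the LLNL analysis software): for a sample that is effectively infinitely thick at 185.7 keV, the net count rate of that U-235 line scales linearly with enrichment, so a single calibration standard fixes the proportionality constant.

        # Sketch of the enrichment-meter principle; all numbers are hypothetical.
        known_enrichment = 4.95        # % U-235 of a calibration standard
        known_rate_186 = 1230.0        # net counts/s in its 185.7 keV peak
        k = known_enrichment / known_rate_186   # calibration constant

        unknown_rate_186 = 7450.0      # net 185.7 keV peak rate of an unknown sample
        estimated = k * unknown_rate_186
        print("Estimated enrichment: %.1f %% U-235" % estimated)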

  13. Computer science: Data analysis meets quantum physics

    Science.gov (United States)

    Schramm, Steven

    2017-10-01

    A technique that combines machine learning and quantum computing has been used to identify the particles known as Higgs bosons. The method could find applications in many areas of science. See Letter p.375

  14. Analysis On Security Of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Muhammad Zunnurain Hussain

    2017-01-01

    Full Text Available In this paper the author discusses the security issues and challenges faced by the industry in securing cloud computing and how these problems can be tackled. Cloud computing is a modern technique of sharing resources, such as data and files, without launching one's own infrastructure, instead using third-party resources to avoid a huge investment. It is very challenging these days to secure the communication between two users, even though different encryption techniques are used.

  15. Schottky signal analysis: tune and chromaticity computation

    CERN Document Server

    Chanon, Ondine

    2016-01-01

    Schottky monitors are used to determine important beam parameters in a non-destructive way. The Schottky signal is due to the internal statistical fluctuations of the particles inside the beam. In this report, after explaining the different components of a Schottky signal, an algorithm to compute the betatron tune is presented, followed by some ideas to compute machine chromaticity. The tests have been performed with offline and/or online LHC data.
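
    The report itself details the algorithm; as a generic illustration of how a fractional betatron tune can be read off a Schottky-like spectrum (synthetic data, an assumed revolution frequency and a simple peak search, not the CERN implementation), one possible sketch is:

        # Sketch only: estimate a fractional tune from the offset of a betatron
        # sideband relative to a revolution harmonic in a synthetic spectrum.
        import numpy as np

        f_rev = 11245.0            # assumed revolution frequency [Hz]
        q_true = 0.31              # fractional tune used to synthesize the signal
        fs = 200000.0              # sampling frequency [Hz]
        t = np.arange(0.0, 1.0, 1.0 / fs)

        harmonic = 5
        f_sideband = (harmonic + q_true) * f_rev
        signal = np.sin(2 * np.pi * f_sideband * t) + 0.1 * np.random.randn(t.size)

        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

        # Strongest line between the harmonic and half a revolution band above it
        window = (freqs > harmonic * f_rev) & (freqs < (harmonic + 0.5) * f_rev)
        f_peak = freqs[window][np.argmax(spectrum[window])]
        q_est = (f_peak - harmonic * f_rev) / f_rev
        print("Estimated fractional tune:", round(q_est, 3))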

  16. Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer

    International Nuclear Information System (INIS)

    Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi

    1975-10-01

    Usage of the computer code MLCOSP (Multiple Correlation and Spectrum), developed for the hybrid computer installed at JAERI, is described. Functions of the hybrid computer and its terminal devices are utilized ingeniously in the code to reduce the complexity of the data handling that occurs in the analysis of multivariable experimental data and to keep the analysis in perspective. Features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting a data recorder. The computed results are displayed in figures, and hardcopies are taken when necessary. Messages from the code are shown on the terminal, so man-machine communication is possible; furthermore, data can be entered through a keyboard, so case studies based on the results of the analysis are possible. (auth.)

  17. Yucca Mountain transportation routes: Preliminary characterization and risk analysis

    International Nuclear Information System (INIS)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R.

    1991-01-01

    In this study, rail and highway routes which may be used for shipments of high-level nuclear waste to a proposed repository at Yucca Mountain, Nevada are characterized. This characterization facilitates three types of impact analysis: comparative study, limited worst-case assessment, and more sophisticated probabilistic risk assessment techniques. Data for relative and absolute impact measures are provided to support comparisons of routes based on selected characteristics. A worst-case scenario assessment is included to determine potentially critical and most likely places for accidents or incidents to occur. The assessment facilitated by the data in this study is limited because impact measures are restricted to the identification of potential areas or persons affected. No attempt is made to quantify the magnitude of these impacts. Most likely locations for accidents to occur are determined relative to other locations within the scope of this study. Independent factors and historical trends used to identify these likely locations are only proxies for accident probability

  18. City of Hoboken Energy Surety Analysis: Preliminary Design Summary

    Energy Technology Data Exchange (ETDEWEB)

    Stamp, Jason Edwin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military and Energy Systems Analysis Dept.; Baca, Michael J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military and Energy Systems Analysis Dept.; Munoz-Ramos, Karina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military and Energy Systems Analysis Dept.; Schenkman, Benjamin L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military and Energy Systems Analysis Dept.; Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Systems Readiness and Sustainment Technology Dept.; Smith, Mark A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Systems Readiness and Sustainment Technology Dept.; Guttromson, Ross [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Electric Power Systems Research Dept.; Henry, Jordan M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Critical Infrastructure Systems Dept.; Jensen, Richard Pearson [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Geomechanics Dept.

    2014-09-01

    In 2012, Hurricane Sandy devastated much of the U.S. northeast coastal areas. Among those hardest hit was the small community of Hoboken, New Jersey, located on the banks of the Hudson River across from Manhattan. This report describes a city-wide electrical infrastructure design that uses microgrids and other infrastructure to ensure the city retains functionality should such an event occur in the future. The designs ensure that up to 55 critical buildings will retain power during blackout or flooded conditions and include analysis for microgrid architectures, performance parameters, system control, renewable energy integration, and financial opportunities (while grid connected). The results presented here are not binding and are subject to change based on input from the Hoboken stakeholders, the integrator selected to manage and implement the microgrid, or other subject matter experts during the detailed (final) phase of the design effort.

  19. The Σ − D relation for planetary nebulae: Preliminary analysis

    Directory of Open Access Journals (Sweden)

    Urošević D.

    2007-01-01

    Full Text Available An analysis of the relation between radio surface brightness and diameter, the so-called Σ − D relation, for planetary nebulae (PNe) is presented: (i) the theoretical Σ − D relation for the evolution of bremsstrahlung surface brightness is derived; (ii) contrary to the results obtained earlier for the Galactic supernova remnant (SNR) samples, our results show that the updated sample of Galactic PNe does not severely suffer from the volume selection effect - the Malmquist bias (the same as for the extragalactic SNR samples); and (iii) we conclude that the empirical Σ − D relation for PNe derived in this paper is not useful for valid determination of distances for all observed PNe with unknown distances.
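
    In generic terms (the specific exponent and calibration derived in the paper are not reproduced here), a Σ − D relation is a power law of the form below; once calibrated, it turns a measured surface brightness into a linear diameter and, together with the observed angular size, into a distance:

        \Sigma_\nu = A\,D^{-\beta}
        \quad\Rightarrow\quad
        D = \left(\frac{\Sigma_\nu}{A}\right)^{-1/\beta},
        \qquad d \simeq \frac{D}{\theta}\ \ (\theta\ \text{in radians})

    Here Σ_ν is the radio surface brightness at frequency ν, D the linear diameter, θ the observed angular diameter, and d the distance; the exponent β and constant A are what a calibrated Σ − D relation supplies.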

  20. Feature Extraction on Brain Computer Interfaces using Discrete Dyadic Wavelet Transform: Preliminary Results

    International Nuclear Information System (INIS)

    Gareis, I; Gentiletti, G; Acevedo, R; Rufiner, L

    2011-01-01

    The purpose of this work is to evaluate different feature extraction alternatives to detect the event-related evoked potential signal in brain-computer interfaces, trying to minimize the time employed and the classification error, in terms of sensitivity and specificity of the method, while looking for alternatives to coherent averaging. In this context, the results obtained by performing the feature extraction with the discrete dyadic wavelet transform using different mother wavelets are presented. For the classification a single-layer perceptron was used. The results obtained with and without the wavelet decomposition were compared, showing an improvement in the classification rate, the specificity and the sensitivity for the feature vectors obtained using some mother wavelets.
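
    As a generic sketch of this kind of pipeline (not the authors' data or configuration: the epochs, labels, mother wavelet and decomposition level below are placeholders), dyadic wavelet coefficients can be used as feature vectors for a single-layer perceptron, for example with PyWavelets and scikit-learn:

        # Sketch only: dyadic DWT coefficients as features for a perceptron.
        import numpy as np
        import pywt
        from sklearn.linear_model import Perceptron

        rng = np.random.default_rng(0)
        n_epochs, n_samples = 200, 256            # hypothetical epoch set
        X_raw = rng.standard_normal((n_epochs, n_samples))
        y = rng.integers(0, 2, n_epochs)          # dummy target / non-target labels

        def wavelet_features(epoch, wavelet="db4", level=4):
            # Concatenate approximation and detail coefficients of the dyadic DWT
            return np.concatenate(pywt.wavedec(epoch, wavelet, level=level))

        X = np.array([wavelet_features(e) for e in X_raw])
        clf = Perceptron().fit(X, y)
        print("Training accuracy on dummy data:", clf.score(X, y))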

  1. Preliminary analysis of Psoroptes ovis transcriptome in different developmental stages

    Directory of Open Access Journals (Sweden)

    Man-Li He

    2016-11-01

    Full Text Available Abstract Background Psoroptic mange is a chronic, refractory, contagious and infectious disease mainly caused by the mange mite Psoroptes ovis, which can infect horses, sheep, buffaloes, rabbits, other domestic animals, deer, wild camels, foxes, minks, lemurs, alpacas, elks and other wild animals. Features of the disease include intense pruritus and dermatitis, depilation and hyperkeratosis, which ultimately result in emaciation or death caused by secondary bacterial infections. The infestation is usually transmitted by close contact between animals. Psoroptic mange is widespread in the world. In this paper, the transcriptome of P. ovis is described following sequencing and analysis of transcripts from samples of larvae (i.e. the Pso_L group) and of nymphs and adults (i.e. the Pso_N_A group). The study describes differentially expressed genes (DEGs) and genes encoding allergens, which help in understanding the biology of P. ovis and lay the foundations for the development of vaccine antigens and drug target screening. Methods The transcriptome of P. ovis was assembled and analyzed using bioinformatic tools. The unigenes of P. ovis from each developmental stage and the unigenes differentially expressed between developmental stages were compared with allergen protein sequences contained in the allergen database website to predict potential allergens. Results We identified 38,836 unigenes, whose mean length was 825 bp. On the basis of sequence similarity with seven databases, a total of 17,366 unigenes were annotated. A total of 1,316 DEGs were identified, including 496 upregulated and 820 downregulated in the Pso_L group compared with the Pso_N_A group. We predicted 205 allergen genes in the two developmental stages similar to genes from other mites and ticks; of these, 14 were among the upregulated DEGs and 26 among the downregulated DEGs. Conclusion This study provides a reference transcriptome of P. ovis in the absence of a reference genome. The analysis of DEGs and

  2. Preliminary analysis of a new IAEA lichen AQCS material

    International Nuclear Information System (INIS)

    Grass, F.; Bichler, M.; Dorner, J.; Ismail, S.; Kregshammer, P.; Zamini, S.; Gwozdz, R.

    2000-01-01

    Lichen with a higher content of interesting trace elements was analyzed by activation analysis and by X-RF measurements on pressed lichen samples. The activation analyses were performed in three different ways: short-time AA in the Fast Irradiation and Measurement System, in which up to 580 mg of lichen were irradiated for 5-300 s in polyethylene containers; single spectra and spectra of 6 samples were summed up and evaluated. Longer irradiation at the ASTRA reactor: 2 h at 8E13/(s cm2); 100-150 mg of lichen were irradiated in quartz Suprasil vials. Longer irradiation at the Institute's TRIGA reactor: 6-7 h at 1.8E12/(s cm2); sample sizes of 7-48 g of lichen were irradiated in polyethylene containers, transferred after irradiation to new measurement containers and measured in a device constructed by Gwozdz. The X-RF analysis was performed with a Spectrace 5000 energy-dispersive X-ray fluorescence analyzer with a rhodium anode tube for excitation. From the activation analyses, the following elements were determined: Ag, Al, As, Au, Ba, Br, Ca, Cd, Ce, Cl, Co, Cr, Cs, Cu, Dy, Eu, Fe, Hf, Hg, I, K, La, Lu, Mg, Mn, Mo, Na, Nd, Ni, Rb, Sb, Sc, Se, Sr, Ta, Tb, Th, Ti, U, V, Yb, Zn. From the X-RF measurements, the elements Ag, Al, Ba, Br, Ca, Cd, Cu, Fe, I, K, Mg, Mn, P, Pb, Rb, S, Sb, Si, Sn, Sr, Ti, Y, Zn, and Zr were evaluated. From the X-RF data as well as from the AA data of samples of different weight it is apparent that milling to a particle size of 200 μm is not sufficient for all elements, especially not for gold, cadmium, and cobalt, which may be present as nuggets or accessory heavy minerals. It is therefore advisable to mill the sample to a particle size an order of magnitude smaller and remove the non-adhering dust, even if this lowers the content of these elements. (author)

  3. Reconstruction for interior region-of-interest inverse geometry computed tomography: preliminary study

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Su; Kim, Tae Ho; Kim, Kyeong Hyeon; Yoon, Do Kun; Suh, Tae Suk [Dept. of Biomedical Engineering, Research Institute of Biomedical Engineering, College of Medicine, The Catholic University of Korea, Seoul (Korea, Republic of); Kang, Seong Hee [Dept. of Radiation Oncology, Seoul National University Hospital, Seoul (Korea, Republic of); Cho, Min Seok [Dept. of Radiation Oncology, Asan Medical Center, Seoul (Korea, Republic of); Noh, Yu Yoon [Dept. of Radiation Oncology, Eulji University Hospital, Daejeon (Korea, Republic of)

    2017-04-15

    Inverse geometry computed tomography (IGCT), composed of multiple sources and a small detector, has several merits, such as reduction of the scatter effect and large volumetric imaging within one rotation without cone-beam artifacts, compared to conventional cone-beam computed tomography (CBCT). Using this multi-source characteristic, we present a selective, multiple interior region-of-interest (ROI) imaging method based on a designed source on-off sequence of the IGCT. ROI-IGCT showed comparable image quality and has the capability to provide multi-ROI images within a single rotation. Projection in ROI-IGCT is performed by selective irradiation, hence the unnecessary imaging dose to the non-interest region can be reduced. In this regard, it seems to be useful for diagnostics or image guidance for radiotherapy.

  4. Preparing computers for affective communication: a psychophysiological concept and preliminary results.

    Science.gov (United States)

    Whang, Min Cheol; Lim, Joa Sang; Boucsein, Wolfram

    Despite rapid advances in technology, computers remain incapable of responding to human emotions. An exploratory study was conducted to find out what physiological parameters might be useful to differentiate among 4 emotional states, based on 2 dimensions: pleasantness versus unpleasantness and arousal versus relaxation. The 4 emotions were induced by exposing 26 undergraduate students to different combinations of olfactory and auditory stimuli, selected in a pretest from 12 stimuli by subjective ratings of arousal and valence. Changes in electroencephalographic (EEG), heart rate variability, and electrodermal measures were used to differentiate the 4 emotions. EEG activity separates pleasantness from unpleasantness only in the aroused but not in the relaxed domain, where electrodermal parameters are the differentiating ones. All three classes of parameters contribute to a separation between arousal and relaxation in the positive valence domain, whereas the latency of the electrodermal response is the only differentiating parameter in the negative domain. We discuss how such a psychophysiological approach may be incorporated into a systemic model of a computer responsive to affective communication from the user.

  5. Preliminary results of standard quantitative analysis by ED-XRF

    Energy Technology Data Exchange (ETDEWEB)

    Lara, Alessandro L. de; Franca, Alana C.; Neto, Manoel R.M.; Paschuk, Sergei A., E-mail: alellara@hotmail.com [Universidade Tecnologica Federal do Parana (UTFPR), Curitiba, PR (Brazil). Dept. de Fisica; Denyak, Valeriy, E-mail: denyak@gmail.com [Instituto de Pesquisa Pele Pequeno Principe (IPPP), Curitiba, PR (Brazil)

    2013-07-01

    A comparison was performed between the elemental concentrations proposed by the XRS-FP software, using data obtained by the EDXRF technique, and those obtained by stoichiometric calculation. For this purpose, five standard samples of known compounds were produced from two lead oxides, magnesium chloride and iodine in controlled amounts. The compounds were subsequently mixed and compressed to form tablets. The samples were irradiated at three points, according to an orientation. The measurements were performed at the Radiological Laboratory of UTFPR using the Amptek X-123SDD detector and an X-ray tube with a silver target from the same manufacturer. The tube operating conditions were a 05 μA current at a 40 kV voltage. Finally, the 15 spectra were analyzed with the software to determine the concentrations of chlorine, iodine and lead. Data from this analysis were compared with the results expected from stoichiometric calculations. The data provided by the program showed a convergence of results, indicating homogeneity of the samples. Compared to the stoichiometric calculation, a considerable discrepancy was found, which may be the result of a misconfiguration or contamination of the sample. Finally, a proposal was created for continuation of the work using an auxiliary calculation to be developed in the next step.

  6. A preliminary study of DTI Fingerprinting on stroke analysis.

    Science.gov (United States)

    Ma, Heather T; Ye, Chenfei; Wu, Jun; Yang, Pengfei; Chen, Xuhui; Yang, Zhengyi; Ma, Jingbo

    2014-01-01

    DTI (Diffusion Tensor Imaging) is a well-known MRI (Magnetic Resonance Imaging) technique which provides useful structural information about the human brain. However, a quantitative measurement of the physiological variation among subtypes of ischemic stroke is not available. An automatic quantitative method for DTI analysis would enhance the application of DTI in clinics. In this study, we proposed a DTI Fingerprinting technology to quantitatively analyze white matter tissue, which was applied to stroke classification. The TBSS (Tract Based Spatial Statistics) method was employed to generate masks automatically. To evaluate the clustering performance of the automatic method, lesion ROIs (Regions of Interest) were manually drawn on the DWI images as a reference. The results from the DTI Fingerprinting were compared with those obtained from the reference ROIs. They indicate that the DTI Fingerprinting could identify different states of ischemic stroke and has promising potential to provide a more comprehensive measure of the DTI data. Further development should be carried out to improve DTI Fingerprinting technology in clinics.

  7. FFTF vertical sodium storage tank preliminary thermal analysis

    International Nuclear Information System (INIS)

    Irwin, J.J.

    1995-01-01

    In the FFTF Shutdown Program, sodium from the primary and secondary heat transport loops, Interim Decay Storage (IDS), and Fuel Storage Facility (FSF) will be transferred to four large storage tanks for temporary storage. Three of the storage tanks will be cylindrical vertical tanks having a diameter of 28 feet, height of 22 feet and fabricated from carbon steel. The fourth tank is a horizontal cylindrical tank but is not the subject of this report. The storage tanks will be located near the FFTF in the 400 Area and rest on a steel-lined concrete slab in an enclosed building. The purpose of this work is to document the thermal analyses that were performed to ensure that the vertical FFTF sodium storage tank design is feasible from a thermal standpoint. The key criterion for this analysis is the time to heat up the storage tank containing frozen sodium at ambient temperature to 400 F. Normal operating conditions include an ambient temperature range of 32 F to 120 F. A key parameter in the evaluation of the sodium storage tank is the type of insulation. The baseline case assumed six inches of calcium silicate insulation. An alternate case assumed refractory fiber (Cerablanket) insulation also with a thickness of six inches. Both cases assumed a total electrical trace heat load of 60 kW, with 24 kW evenly distributed on the bottom head and 36 kW evenly distributed on the tank side wall

  8. Social network analysis in identifying influential webloggers: A preliminary study

    Science.gov (United States)

    Hasmuni, Noraini; Sulaiman, Nor Intan Saniah; Zaibidi, Nerda Zura

    2014-12-01

    In recent years, second-generation internet-based services such as weblogs have become an effective communication tool to publish information on the Web. Weblogs have unique characteristics that deserve users' attention. Some webloggers have seen weblogs as an appropriate medium to initiate and expand business. These webloggers, also known as direct profit-oriented webloggers (DPOWs), communicate and share knowledge with each other through social interaction. However, survivability is the main issue among DPOWs. Frequent communication with influential webloggers is one of the ways to survive as a DPOW. This paper aims to understand the network structure and identify influential webloggers within the network. Proper understanding of the network structure can assist us in knowing how the information is exchanged among members and enhance survivability among DPOWs. 30 DPOWs were involved in this study. Degree centrality and betweenness centrality measurements in Social Network Analysis (SNA) were used to examine the strength of relations and identify influential webloggers within the network. Thus, webloggers with the highest values of these measurements are considered the most influential webloggers in the network.
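
    A minimal sketch of the two centrality measures mentioned, computed with networkx on a small invented interaction network (the weblogger names and edges are hypothetical, not the 30 DPOWs studied):

        # Sketch only: rank webloggers by degree and betweenness centrality.
        import networkx as nx

        edges = [("blog_A", "blog_B"), ("blog_A", "blog_C"), ("blog_B", "blog_C"),
                 ("blog_C", "blog_D"), ("blog_D", "blog_E"), ("blog_C", "blog_E")]
        g = nx.Graph(edges)

        degree = nx.degree_centrality(g)
        betweenness = nx.betweenness_centrality(g)

        for node in sorted(g, key=lambda n: betweenness[n], reverse=True):
            print(node, "degree=%.2f" % degree[node], "betweenness=%.2f" % betweenness[node])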

  9. Preliminary results of standard quantitative analysis by ED-XRF

    International Nuclear Information System (INIS)

    Lara, Alessandro L. de; Franca, Alana C.; Neto, Manoel R.M.; Paschuk, Sergei A.

    2013-01-01

    A comparison was performed between the elemental concentrations proposed by the XRS-FP software, using data obtained by the EDXRF technique, and those obtained by stoichiometric calculation. For this purpose, five standard samples of known compounds were produced from two lead oxides, magnesium chloride and iodine in controlled amounts. The compounds were subsequently mixed and compressed to form tablets. The samples were irradiated at three points, according to an orientation. The measurements were performed at the Radiological Laboratory of UTFPR using the Amptek X-123SDD detector and an X-ray tube with a silver target from the same manufacturer. The tube operating conditions were a 05 μA current at a 40 kV voltage. Finally, the 15 spectra were analyzed with the software to determine the concentrations of chlorine, iodine and lead. Data from this analysis were compared with the results expected from stoichiometric calculations. The data provided by the program showed a convergence of results, indicating homogeneity of the samples. Compared to the stoichiometric calculation, a considerable discrepancy was found, which may be the result of a misconfiguration or contamination of the sample. Finally, a proposal was created for continuation of the work using an auxiliary calculation to be developed in the next step

  10. Preliminary analysis of space mission applications for electromagnetic launchers

    Science.gov (United States)

    Miller, L. A.; Rice, E. E.; Earhart, R. W.; Conlon, R. J.

    1984-01-01

    The technical and economic feasibility of using electromagnetically launched (EML) payloads propelled from the Earth's surface to LEO, GEO, lunar orbit, or to interplanetary space was assessed. Analyses of the designs of rail accelerators and coaxial magnetic accelerators show that each is capable of launching payloads of 800 kg or more to space. A hybrid launcher in which EML is used for the first 2 km/s followed by chemical rocket stages was also considered. A cost estimate study shows that one to two EML launches per day are needed to break even, compared to a four-stage rocket. Development models are discussed for: (1) Earth orbital missions; (2) a lunar base supply mission; (3) a solar system escape mission; (4) Earth escape missions; (5) suborbital missions; (6) electromagnetic boost missions; and (7) space-based missions. Safety factors, environmental impacts, and EML systems analysis are discussed. Alternate systems examined include electrothermal thrusters, an EML rocket gun, an EML theta gun, and Soviet electromagnetic accelerators.

  11. A simplified procedure of linear regression in a preliminary analysis

    Directory of Open Access Journals (Sweden)

    Silvia Facchinetti

    2013-05-01

    Full Text Available The analysis of a large statistical data-set can be led by the study of a particularly interesting variable Y, the regressed variable, and an explicative variable X, chosen among the remaining variables, jointly observed. The study gives a simplified procedure to obtain the functional link of the variables y = y(x) by a partition of the data-set into m subsets, in which the observations are synthesized by location indices (mean or median) of X and Y. Polynomial models for y(x) of order r are considered to verify the characteristics of the given procedure; in particular we assume r = 1 and 2. The distributions of the parameter estimators are obtained by simulation when the fitting is done for m = r + 1. Comparisons of the results, in terms of distribution and efficiency, are made with the results obtained by the ordinary least squares method. The study also gives some considerations on the consistency of the estimated parameters obtained by the given procedure.
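
    A sketch of the partition-and-fit idea on synthetic data (assuming r = 1, group means as the location indices, and m = r + 1 subsets; this is an illustration, not the paper's exact procedure or simulation design):

        # Sketch: fit a line through the means of m = r + 1 ordered subsets
        # and compare with ordinary least squares on the full data-set.
        import numpy as np

        rng = np.random.default_rng(1)
        x = rng.uniform(0, 10, 1000)
        y = 2.0 + 0.5 * x + rng.normal(0, 1, x.size)

        r = 1                       # polynomial order
        m = r + 1                   # number of subsets
        order = np.argsort(x)
        groups = np.array_split(order, m)

        # Location indices (means) of each subset
        gx = np.array([x[idx].mean() for idx in groups])
        gy = np.array([y[idx].mean() for idx in groups])

        simplified = np.polyfit(gx, gy, r)   # fit through the m group means
        ols = np.polyfit(x, y, r)            # ordinary least squares on all data
        print("simplified fit (slope, intercept):", simplified)
        print("OLS fit        (slope, intercept):", ols)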

  12. Preliminary Analysis of Slope Stability in Kuok and Surrounding Areas

    Directory of Open Access Journals (Sweden)

    Dewandra Bagus Eka Putra

    2016-12-01

    Full Text Available The level of slope is influenced by the condition of the rocks beneath the surface. On steep slopes, the amount of surface runoff and the water transport energy are also enlarged. This is caused by greater gravity, in line with the surface tilt from the horizontal plane. In other words, more and more topsoil is eroded. When the slope becomes twice as steep, the amount of erosion per unit area becomes 2.0-2.5 times greater. Kuok and the surrounding area contain the access road between West Sumatra and Riau, which plays an important role in the economies of both provinces. The purpose of this study is to map the locations that have fairly steep slopes and potential modes of landslides. Based on the SRTM data obtained, the roads in the Kuok area have a minimum elevation of +33 m and a maximum of +217.329 m. Rugged road conditions with slopes ranging from 24.08° to 44.68° cause this area to have frequent landslides. The result of the slope stability analysis of a slope near the Water Power Plant Koto Panjang indicated that the mode of active failure is toppling failure or rock fall and that the potential zone of failure is in the center part of the slope.

  13. Electrical field of electrical appliances versus distance: A preliminary analysis

    International Nuclear Information System (INIS)

    Mustafa, Nur Badariah Ahmad; Nordin, Farah Hani; Ismail, Fakaruddin Ali Ahmad; Alkahtani, Ammar Ahmed; Balasubramaniam, Nagaletchumi; Hock, Goh Chin; Shariff, Z A M

    2013-01-01

    Every household electrical appliance that is plugged in emits an electric field even if it is not operating. The source into which the appliance is plugged and the components of the appliance contribute to the electric field emission. The electric field may cause unknown disturbance to the environment or affect human health, and the effect might depend on the strength of the electric field emitted by the appliance. This paper investigates the strength of the electric field emitted by four different electrical appliances using a spectrum analyser. The strength is captured at three different distances, (i) 1 m, (ii) 2 m and (iii) 3 m, and the analysis of the electric field strength is done based on these three distances. The measurement results show that the strength of the electric field is strongest when captured at 1 m and weakest at 3 m from the electrical appliance. The results show that the farther an object is located from the electrical appliance, the less effect the electric field has.

  14. Computer-Assisted Linguistic Analysis of the Peshitta

    NARCIS (Netherlands)

    Roorda, D.; Talstra, Eep; Dyk, Janet; van Keulen, Percy; Sikkel, Constantijn; Bosman, H.J.; Jenner, K.D.; Bakker, Dirk; Volkmer, J.A.; Gutman, Ariel; van Peursen, Wido Th.

    2014-01-01

    CALAP (Computer-Assisted Linguistic Analysis of the Peshitta), a joint research project of the Peshitta Institute Leiden and the Werkgroep Informatica at the Vrije Universiteit Amsterdam (1999-2005), concerned the computer-assisted analysis of the Peshitta to Kings (Janet Dyk and Percy van

  15. Run 2 analysis computing for CDF and D0

    International Nuclear Information System (INIS)

    Fuess, S.

    1995-11-01

    Two large experiments at the Fermilab Tevatron collider will use upgraded detectors for the upcoming period of running. The associated analysis software is also expected to change, both to account for higher data rates and to embrace new computing paradigms. A discussion is given of the problems facing current and future High Energy Physics (HEP) analysis computing, and several issues are explored in detail

  16. Gas cooled fast reactor 2400 MWTh, status on the conceptual design studies and preliminary safety analysis

    International Nuclear Information System (INIS)

    Malo, J.Y.; Alpy, N.; Bentivoglio, F.

    2009-01-01

    The Gas cooled Fast Reactor (GFR) is considered by the French Commissariat a l'Energie Atomique as a promising concept, combining the benefits of a fast spectrum and high temperature, using helium as coolant. A status report on the GFR preliminary viability was made at the end of 2007, ending the pre-conceptual design phase. A consistent overall systems arrangement was proposed, and a preliminary safety analysis based on operating transient calculations and a simplified PSA had established global confidence in the feasibility and safety of this baseline concept. Its potential for attractive performance had been pointed out. Compared to the more mature Sodium Fast Reactor technology, no demonstrator has ever been built and the feasibility demonstration will require a longer lead time. The next main project milestone is related to the GFR viability, scheduled in 2012. The current studies consist of revisiting the reactor reference design options as selected at the end of 2007. Most of them are being consolidated by going more in depth in the analysis, and some possible alternatives are assessed. The paper gives a status on the latest studies performed on the core design and the corresponding neutronics and cycle performance, the Decay Heat Removal strategy and preliminary safety analysis, systems design and balance of plant. This paper is complementary to the ICAPP'09 papers 9062, dealing with the Gas cooled Fast Reactor demonstrator ALLEGRO, and 9378, related to GFR transient analysis. (author)

  17. Frequency modulation television analysis: Threshold impulse analysis. [with computer program

    Science.gov (United States)

    Hodge, W. H.

    1973-01-01

    A computer program is developed to calculate the FM threshold impulse rates as a function of the carrier-to-noise ratio for a specified FM system. The system parameters and a vector of 1024 integers, representing the probability density of the modulating voltage, are required as input parameters. The computer program is utilized to calculate threshold impulse rates for twenty-four sets of measured probability data supplied by NASA and for sinusoidal and Gaussian modulating waveforms. As a result of the analysis several conclusions are drawn: (1) The use of preemphasis in an FM television system improves the threshold by reducing the impulse rate. (2) Sinusoidal modulation produces a total impulse rate which is a practical upper bound for the impulse rates of TV signals providing the same peak deviations. (3) As the moment of the FM spectrum about the center frequency of the predetection filter increases, the impulse rate tends to increase. (4) A spectrum having an expected frequency above (below) the center frequency of the predetection filter produces a higher negative (positive) than positive (negative) impulse rate.

  18. Clinical registry for rheumatoid arthritis; a preliminary analysis

    International Nuclear Information System (INIS)

    Fakhr, A.; Hakim, F.; Zaidi, S.K.; Sharif, A.

    2017-01-01

    To establish a clinical registry for Rheumatoid Arthritis and delineate the most common symptoms that rheumatoid arthritis (RA) patients experience in our set up. Study Design: Cross sectional study. Place and Duration of Study: Study was carried out at Military Hospital (MH) Rawalpindi at Rheumatology Department during the period of Jan 2013 to Jun 2015. Material and Methods: A clinical registry for Rheumatoid Arthritis was developed as per criteria jointly developed by American College of Rheumatology (ACR) along with European League against Rheumatism (EULAR) (2010). Fifty-eight patients were registered after their informed consent and approval by Military Hospital (MH) Rawalpindi ethical committee. Age, gender and relevant clinical parameters of RA patients were recorded on case report forms and stored for analysis in the RA registry in Excel 2010. The figures were reported in frequencies and percentages. Results: Multiple joint pains (48.28%), fever (24.14%), morning stiffness of joints (22.41%) were the most common symptoms in RA patients. Other clinical manifestations included painful bilateral swollen joints (13.79%), pain in different parts of the body (10.34%), Raynaud's phenomenon (10.34%), malaise (8.62%), swollen body parts (8.62%), ulcers (8.62%), fatigue (6.90%), nodules on skin/elbow/interphalangeal joints (6.90%), deformities of fingers/ hand (3.45%), redness of eyes (3.45%), body rash (3.45%), inability to walk (3.45%), cervical lymphadenopathy (1.72%), stiffness of spine (1.72%) and myalgias (1.72%). Conclusion: It is concluded that multiple joint pains, fever and morning stiffness of joints are the most common symptoms of RA patients. (author)

  19. Diisocyanate emission from a paint product: a preliminary analysis.

    Science.gov (United States)

    Jarand, Curtis W; Akapo, Samuel O; Swenson, Lonie J; Kelman, Bruce J

    2002-07-01

    Exposure of workers to diisocyanates in the polyurethane foam manufacturing industry is well documented. However, very little quantitative data have been published on exposure to diisocyanates from the use of paints and coatings. The purpose of this study was to evaluate emission of 2,4-toluene diisocyanate, 2,6-toluene diisocyanate (2,6-TDI), and isophorone diisocyanate from a commercially available two-stage concrete coating and sealant. A laboratory model of an outdoor deck coating process was developed and diisocyanate concentrations determined by derivatization with 1-(2-methoxyphenyl)-piperazine and subsequent high performance liquid chromatographic analysis with UV detection. The detection limit for 2,4-toluene diisocyanate and 2,6-toluene diisocyanate urea derivatives was 0.6 microg TDI/gm wet product, and 0.54 microg IPDI/gm wet product for the isophorone diisocyanate urea derivative. No 2,4-toluene diisocyanate or isophorone diisocyanate was detected in the mixed product. A maximum mean 2,6-TDI emission rate of 0.32 microg of 2,6-TDI/gram of wet product applied/hour was observed for the 1-hour sampling time, 0.38 microg of 2,6-TDI/gram of wet product applied/hour was observed for the 5-hour sampling time, and 0.02 microg of 2,6-TDI/gram of wet product applied/hour was observed for the 15-hour sampling time. The decrease in the rate of 2,6-TDI emission over the 15-hour period indicates that emission of 2,6-TDI is virtually complete after 5 hours. These emission rates should allow industrial hygienists to calculate exposures to isocyanates emitted from at least one curing sealant.

  20. Cognitive Task Analysis of Business Jet Pilots' Weather Flying Behaviors: Preliminary Results

    Science.gov (United States)

    Latorella, Kara; Pliske, Rebecca; Hutton, Robert; Chrenka, Jason

    2001-01-01

    This report presents preliminary findings from a cognitive task analysis (CTA) of business aviation piloting. Results describe challenging weather-related aviation decisions and the information and cues used to support these decisions. Further, these results demonstrate the role of expertise in business aviation decision-making in weather flying, and how weather information is acquired and assessed for reliability. The challenging weather scenarios and novice errors identified in the results provide the basis for experimental scenarios and dependent measures to be used in future flight simulation evaluations of candidate aviation weather information systems. Finally, we analyzed these preliminary results to recommend design and training interventions to improve business aviation decision-making with weather information. The primary objective of this report is to present these preliminary findings and to document the extended CTA methodology used to elicit and represent expert business aviator decision-making with weather information. These preliminary findings will be augmented with results from additional subjects using this methodology. A summary of the complete results, absent the detailed treatment of methodology provided in this report, will be documented in a separate publication.

  1. Crystallization and preliminary X-ray diffraction analysis of rat autotaxin

    International Nuclear Information System (INIS)

    Day, Jacqueline E.; Hall, Troii; Pegg, Lyle E.; Benson, Timothy E.; Hausmann, Jens; Kamtekar, Satwik

    2010-01-01

    Autotaxin (ATX), a pyrophosphatase/phosphodiesterase enzyme, is a promising drug target for many indications and is only distantly related to enzymes of previously determined structure. Here, the cloning, expression, purification, crystallization and preliminary diffraction analysis of ATX are reported. Rat autotaxin has been cloned, expressed, purified to homogeneity and crystallized via hanging-drop vapour diffusion using PEG 3350 as precipitant and ammonium iodide and sodium thiocyanate as salts. The crystals diffracted to a maximum resolution of 2.05 Å and belonged to space group P1, with unit-cell parameters a = 53.8, b = 63.3, c = 70.5 Å, α = 98.8, β = 106.2, γ = 99.8°. Preliminary X-ray diffraction analysis indicated the presence of one molecule per asymmetric unit, with a solvent content of 47%

  2. Relative risk analysis in regulating the use of radiation-emitting medical devices. A preliminary application

    Energy Technology Data Exchange (ETDEWEB)

    Jones, E.D.; Banks, W.W.; Altenbach, T.J.; Fischer, L.E. [Lawrence Livermore National Lab., CA (United States)

    1995-09-01

    This report describes a preliminary application of an analysis approach for assessing relative risks in the use of radiation- emitting medical devices. Results are presented on human-initiated actions and failure modes that are most likely to occur in the use of the Gamma Knife, a gamma irradiation therapy device. This effort represents an initial step in a US Nuclear Regulatory Commission (NRC) plan to evaluate the potential role of risk analysis in regulating the use of nuclear medical devices. For this preliminary application of risk assessment, the focus was to develop a basic process using existing techniques for identifying the most likely risk contributors and their relative importance. The approach taken developed relative risk rankings and profiles that incorporated the type and quality of data available and could present results in an easily understood form. This work was performed by the Lawrence Livermore National Laboratory for the NRC.

  3. Crystallization and preliminary X-ray diffraction analysis of West Nile virus

    International Nuclear Information System (INIS)

    Kaufmann, Bärbel; Plevka, Pavel; Kuhn, Richard J.; Rossmann, Michael G.

    2010-01-01

    Crystals of infectious West Nile virus were obtained and diffracted at best to about 25 Å resolution. Preliminary analysis of the diffraction pattern suggested tight hexagonal packing of the intact virus. West Nile virus, a human pathogen, is closely related to other medically important flaviviruses of global impact such as dengue virus. The infectious virus was purified from cell culture using polyethylene glycol (PEG) precipitation and density-gradient centrifugation. Thin amorphously shaped crystals of the lipid-enveloped virus were grown in quartz capillaries equilibrated by vapor diffusion. Crystal diffraction extended at best to a resolution of about 25 Å using synchrotron radiation. A preliminary analysis of the diffraction images indicated that the crystals had unit-cell parameters a ≃ b ≃ 480 Å, γ = 120°, suggesting a tight hexagonal packing of one virus particle per unit cell

  4. Preliminary Dynamic Feasibility and Analysis of a Spherical, Wind-Driven (Tumbleweed), Martian Rover

    Science.gov (United States)

    Flick, John J.; Toniolo, Matthew D.

    2005-01-01

    The process and findings are presented from a preliminary feasibility study examining the dynamic characteristics of a spherical wind-driven (or Tumbleweed) rover, which is intended for exploration of the Martian surface. The results of an initial feasibility study involving several worst-case mobility situations that a Tumbleweed rover might encounter on the surface of Mars are discussed. Additional topics include the evaluation of several commercially available analysis software packages that were examined as possible platforms for the development of a Monte Carlo Tumbleweed mission simulation tool. This evaluation led to the development of the Mars Tumbleweed Monte Carlo Simulator (or Tumbleweed Simulator) using the Vortex physics software package from CM-Labs, Inc. Discussions regarding the development and evaluation of the Tumbleweed Simulator, as well as the results of a preliminary analysis using the tool, are also presented. Finally, a brief conclusions section is presented.

  5. Preliminary X-ray analysis of twinned crystals of sarcosine dimethylglycine methyltransferase from Halorhodospira halochoris

    International Nuclear Information System (INIS)

    Kallio, Juha Pekka; Jänis, Janne; Nyyssölä, Antti; Hakulinen, Nina; Rouvinen, Juha

    2009-01-01

    The crystallization and preliminary X-ray diffraction analysis of sarcosine dimethylglycine methyltransferase from H. halochoris is reported. Sarcosine dimethylglycine methyltransferase (EC 2.1.1.157) is an enzyme from the extremely halophilic anaerobic bacterium Halorhodospira halochoris. This enzyme catalyzes the twofold methylation of sarcosine to betaine, with S-adenosylmethionine (AdoMet) as the methyl-group donor. This study presents the crystallization and preliminary X-ray analysis of recombinant sarcosine dimethylglycine methyltransferase produced in Escherichia coli. Mass spectroscopy was used to determine the purity and homogeneity of the enzyme material. Two different crystal forms, which initially appeared to be hexagonal and tetragonal, were obtained. However, on analyzing the diffraction data it was discovered that both crystal forms were pseudo-merohedrally twinned. The true crystal systems were monoclinic and orthorhombic. The monoclinic crystal diffracted to a maximum of 2.15 Å resolution and the orthorhombic crystal diffracted to 1.8 Å resolution

  6. Preliminary Disposal Analysis for Selected Accelerator Production of Tritium Waste Streams

    International Nuclear Information System (INIS)

    Ades, M.J.; England, J.L.

    1998-06-01

    A preliminary analysis was performed for two selected Accelerator Production of Tritium (APT) generated mixed and low-level waste streams to determine if one mixed low-level waste (MLLW) stream, which includes the Mixed Waste Lead (MWL), can be disposed of at the Nevada Test Site (NTS) and at the Hanford Site, and if one low-level radioactive waste (LLW) stream, which includes the Tungsten waste stream (TWS) generated by the Tungsten Neutron Source modules and used in the Target/Blanket cavity vessel, can be disposed of in the LLW Vaults at the Savannah River Plant (SRP). The preliminary disposal analysis indicates that the radionuclide concentrations of the two selected APT waste streams are not in full compliance with the Waste Acceptance Criteria (WAC) and the Performance Assessment (PA) radionuclide limits of the disposal sites considered

  7. Relative risk analysis in regulating the use of radiation-emitting medical devices. A preliminary application

    International Nuclear Information System (INIS)

    Jones, E.D.; Banks, W.W.; Altenbach, T.J.; Fischer, L.E.

    1995-09-01

    This report describes a preliminary application of an analysis approach for assessing relative risks in the use of radiation- emitting medical devices. Results are presented on human-initiated actions and failure modes that are most likely to occur in the use of the Gamma Knife, a gamma irradiation therapy device. This effort represents an initial step in a US Nuclear Regulatory Commission (NRC) plan to evaluate the potential role of risk analysis in regulating the use of nuclear medical devices. For this preliminary application of risk assessment, the focus was to develop a basic process using existing techniques for identifying the most likely risk contributors and their relative importance. The approach taken developed relative risk rankings and profiles that incorporated the type and quality of data available and could present results in an easily understood form. This work was performed by the Lawrence Livermore National Laboratory for the NRC

  8. National Data Center Preparedness Exercise 2015 (NPE 2015): MY-NDC Preliminary Analysis Result

    International Nuclear Information System (INIS)

    Faisal Izwan Abdul Rashid; Muhammed Zulfakar Zolkaffly

    2016-01-01

    Malaysia established the CTBT National Data Centre (MY-NDC) in December 2005. MY-NDC is tasked with performing Comprehensive Nuclear-Test-Ban Treaty (CTBT) data management as well as providing information on Treaty-related events to Nuclear Malaysia as the CTBT National Authority. In 2015, MY-NDC participated in the National Data Centre Preparedness Exercise 2015 (NPE 2015). This paper presents the MY-NDC preliminary analysis results of NPE 2015. In NPE 2015, MY-NDC performed five different analyses, namely radionuclide, atmospheric transport modelling (ATM), data fusion, seismic analysis and site forensics. The preliminary findings show that the hypothetical scenario in NPE 2015 is most probably an uncontained event that resulted in a high release of radionuclides to the air. (author)

  9. Preliminary safety analysis of the HTTR-IS nuclear hydrogen production system

    International Nuclear Information System (INIS)

    Sato, Hiroyuki; Ohashi, Hirofumi; Tazawa, Yujiro; Tachibana, Yukio; Sakaba, Nariaki

    2010-06-01

    Japan Atomic Energy Agency is planning to demonstrate hydrogen production by thermochemical water-splitting IS process utilizing heat from the high-temperature gas-cooled reactor HTTR (HTTR-IS system). The previous study identified that the HTTR modification due to the coupling of hydrogen production plant requires an additional safety review since the scenario and quantitative values of the evaluation items would be altered from the original HTTR safety review. Hence, preliminary safety analyses are conducted by using the system analysis code. Calculation results showed that evaluation items such as a coolant pressure, temperatures of heat transfer tubes at the pressure boundary, etc., did not exceed allowable values. Also, the peak fuel temperature did not exceed allowable value and therefore the reactor core was not damaged and cooled sufficiently. This report compiles calculation conditions, event scenarios and the calculation results of the preliminary safety analysis. (author)

  10. Preliminary Hazards Analysis of K-Basin Fuel Encapsulation and Storage

    International Nuclear Information System (INIS)

    Strickland, G.C.

    1994-01-01

    This Preliminary Hazards Analysis (PHA) systematically examines the K-Basin facilities and their supporting systems for hazards created by abnormal operating conditions and external events (e.g., earthquakes) which have the potential for causing undesirable consequences to the facility worker, the onsite individual, or the public. The operational activities examined are fuel encapsulation, fuel storage and cooling. Encapsulation of sludges in the basins is not examined. A team of individuals from Westinghouse produced a set of Hazards and Operability (HAZOP) tables documenting their examination of abnormal process conditions in the systems and activities examined in K-Basins. The purpose of this report is to reevaluate and update the HAZOP in the original Preliminary Hazard Analysis of K-Basin Fuel Encapsulation and Storage originally developed in 1991

  11. Computational Analysis of SAXS Data Acquisition.

    Science.gov (United States)

    Dong, Hui; Kim, Jin Seob; Chirikjian, Gregory S

    2015-09-01

    Small-angle x-ray scattering (SAXS) is an experimental biophysical method used for gaining insight into the structure of large biomolecular complexes. Under appropriate chemical conditions, the information obtained from a SAXS experiment can be equated to the pair distribution function, which is the distribution of distances between every pair of points in the complex. Here we develop a mathematical model to calculate the pair distribution function for a structure of known density, and analyze the computational complexity of these calculations. Efficient recursive computation of this forward model is an important step in solving the inverse problem of recovering the three-dimensional density of biomolecular structures from their pair distribution functions. In particular, we show that integrals of products of three spherical-Bessel functions arise naturally in this context. We then develop an algorithm for the efficient recursive computation of these integrals.
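
    The quantity at the heart of the abstract, the pair distribution function of a structure of known density, can be illustrated with a direct brute-force sketch. This is not the recursive spherical-Bessel scheme the authors develop, only a plain histogram of density-weighted pairwise distances on assumed sample points.

```python
# Hypothetical sketch: brute-force pair distribution function P(r) for a structure
# given as density samples in 3D. Points and densities below are random placeholders.
import numpy as np

def pair_distribution(points, densities, r_max, n_bins=100):
    """Histogram of pairwise distances weighted by the product of sample densities."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)  # all pair distances
    w = densities[:, None] * densities[None, :]                           # density products
    mask = ~np.eye(len(points), dtype=bool)                               # drop self-pairs
    hist, edges = np.histogram(d[mask], bins=n_bins, range=(0.0, r_max), weights=w[mask])
    return 0.5 * (edges[:-1] + edges[1:]), hist

rng = np.random.default_rng(0)
pts = rng.uniform(-10, 10, size=(200, 3))      # toy "density sample" positions
rho = rng.uniform(0.5, 1.0, size=200)          # toy densities at those positions
r, p_r = pair_distribution(pts, rho, r_max=35.0)
print(r[:5], p_r[:5])
```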

  12. Single photon emission computed tomography study of human pulmonary perfusion: preliminary findings

    Energy Technology Data Exchange (ETDEWEB)

    Carratu, L; Sofia, M [Naples Univ. (Italy). Facolta di Medicina e Chirurgia; Salvatore, M; Muto, P; Ariemma, G [Istituto Nazionale per la Prevenzione, Lo Studio e La Cura dei Tumori Fondazione Pascale, Naples (Italy); Lopez-Majano, V [Cook County Hospital, Chicago, IL (USA). Nuclear Medicine Div.

    1984-02-01

    Single photon emission computed tomography (SPECT) was performed with 99mTc-albumin macroaggregates to study human pulmonary perfusion in healthy subjects and patients with respiratory diseases such as chronic obstructive pulmonary disease (COPD) and lung neoplasms. The reconstructed SPECT data were displayed in coronal, transverse and sagittal plane sections and compared to conventional perfusion scans. The SPECT data gave more detailed anatomical information about the extent of damage and the morphology of the pulmonary vascular bed. In healthy subjects and COPD patients, qualitative and quantitative assessment of pulmonary perfusion could be obtained from serial SPECT scans with respect to distribution and relative concentration of the injected radiopharmaceutical. Furthermore, SPECT of pulmonary perfusion has been useful in detecting the extent of damage to the pulmonary circulation. This is useful for the preoperative evaluation and staging of lung cancer.

  13. Skin lesions diagnostics by on diffuse reflection spectres using computational algorithms: a preliminary study

    International Nuclear Information System (INIS)

    Orozco-Guillen, E.E.; Delgado-Atencio, J.A.; Vazquez-Montiel, S.; Castro-Ramos, J.; Villanueva-Luna, E.; Gutierrez-Delgado, F.

    2009-01-01

    The determination of the diffuse reflection spectrum of human skin in the spectral range 400-1000 nm using optical fiber spectrometers is a non-invasive technique widely used to study the optical parameters of this tissue; it provides information about the absorption and scattering of light that can be employed to study the morphology and physiology of the tissue and to detect and diagnose skin diseases in early stages. In this paper we present a computational algorithm for the selection of the most important attributes of diffuse reflection spectra of human skin obtained with an experimental system that basically consists of a spectrometer, a white light source and a bifurcated fiber-optic probe that sends and collects light. To classify the spectral signal, a Matlab 2006 graphical interface was designed that uses support vector machines and an attribute-selection algorithm, achieving a sensitivity and specificity exceeding 80% and an accuracy of 85% in the classification. (Author)
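
    A pipeline of this kind, attribute selection followed by a support vector machine with sensitivity and specificity as the figures of merit, can be sketched as follows. This is not the authors' Matlab interface; it is a hedged re-expression in Python with scikit-learn, and the spectra and labels are random placeholders rather than measured skin spectra.

```python
# Hypothetical sketch of an attribute-selection + SVM classifier for reflectance spectra.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
spectra = rng.normal(size=(60, 300))    # 60 spectra x 300 wavelengths (placeholder data)
labels = rng.integers(0, 2, size=60)    # 0 = healthy skin, 1 = lesion (placeholder labels)

clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=20),   # keep the 20 most informative wavelengths
                    SVC(kernel="rbf", C=1.0))
pred = cross_val_predict(clf, spectra, labels, cv=5)

tn, fp, fn, tp = confusion_matrix(labels, pred).ravel()
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```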

  14. Computational fluid dynamics in three dimensional angiography: Preliminary hemodynamic results of various proximal geometry

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ha Youn; Park, Sung Tae; Bae, Won Kyoung; Goo, Dong Erk [Dept. of Radiology, Soonchunhyang University Hospital, Seoul (Korea, Republic of)

    2014-12-15

    We studied the influence of proximal geometry on the results of computational fluid dynamics (CFD). We made five models of different proximal geometry from the three-dimensional angiography of a 63-year-old woman with an intracranial aneurysm. CFD results were analyzed as the peak systolic velocity (PSV) at the inlet and outlet, as well as the flow velocity profile proximal to the internal carotid artery (ICA) aneurysm. The modified cavernous model with proximal tubing showed a faster PSV at the outlet than at the inlet. The PSVs at the outlets of the other models were slower than those at the inlets. The flow velocity profiles immediately proximal to the ICA aneurysm showed similar patterns in all models, suggesting that proximal vessel geometries could affect CFD results.

  15. Preliminary spatial analysis of combined BATSE/Ulysses gamma-ray burst locations

    International Nuclear Information System (INIS)

    Kippen, R. Marc; Hurley, Kevin; Pendleton, Geoffrey N.

    1998-01-01

    We present the preliminary spatial analysis of 278 bursts that have been localized by BATSE and the two-spacecraft Compton/Ulysses Interplanetary Network. The large number and superior accuracy of the combined BATSE/Ulysses locations provide improved sensitivity to small-angle source properties. We find that the locations are consistent with large- and small-scale isotropy, with no significant small-angle clustering. We constrain the fraction of sources in clusters and discuss the implications for burst repetition

  16. Sludge Treatment Project Engineered Container Retrieval And Transfer System Preliminary Design Hazard Analysis Supplement 1

    International Nuclear Information System (INIS)

    Franz, G.R.; Meichle, R.H.

    2011-01-01

    This 'What-If' hazards analysis addresses hazards affecting the Sludge Treatment Project Engineered Container Retrieval and Transfer System (ECRTS), including NPH and external events, at the preliminary design stage. In addition, it addresses the hazards of the operating sequence steps for the mechanical handling operations: preparing the Sludge Transport and Storage Container (STSC), disconnecting the STSC, and preparing the STSC and Sludge Transport System (STS) for shipping.

  17. Preliminary Failure Modes and Effects Analysis of the US DCLL Test Blanket Module

    Energy Technology Data Exchange (ETDEWEB)

    Lee C. Cadwallader

    2010-06-01

    This report presents the results of a preliminary failure modes and effects analysis (FMEA) of a small tritium-breeding test blanket module design for the International Thermonuclear Experimental Reactor. The FMEA was quantified with “generic” component failure rate data, and the failure events are binned into postulated initiating event families and frequency categories for safety assessment. An appendix to this report contains repair time data to support an occupational radiation exposure assessment for test blanket module maintenance.

  18. Preliminary Failure Modes and Effects Analysis of the US DCLL Test Blanket Module

    Energy Technology Data Exchange (ETDEWEB)

    Lee C. Cadwallader

    2007-08-01

    This report presents the results of a preliminary failure modes and effects analysis (FMEA) of a small tritium-breeding test blanket module design for the International Thermonuclear Experimental Reactor. The FMEA was quantified with “generic” component failure rate data, and the failure events are binned into postulated initiating event families and frequency categories for safety assessment. An appendix to this report contains repair time data to support an occupational radiation exposure assessment for test blanket module maintenance.

  19. Preliminary Failure Modes and Effects Analysis of the US DCLL Test Blanket Module

    International Nuclear Information System (INIS)

    Lee C. Cadwallader

    2007-01-01

    This report presents the results of a preliminary failure modes and effects analysis (FMEA) of a small tritium-breeding test blanket module design for the International Thermonuclear Experimental Reactor. The FMEA was quantified with 'generic' component failure rate data, and the failure events are binned into postulated initiating event families and frequency categories for safety assessment. An appendix to this report contains repair time data to support an occupational radiation exposure assessment for test blanket module maintenance

  20. Preliminary crystallographic analysis of a possible transcription factor encoded by the mimivirus L544 gene

    International Nuclear Information System (INIS)

    Ciaccafava, Alexandre; Lartigue, Audrey; Mansuelle, Pascal; Jeudy, Sandra; Abergel, Chantal

    2011-01-01

    The mimivirus L544 gene product was expressed in E. coli and crystallized; preliminary phasing of a MAD data set was performed using the selenium signal present in a crystal of recombinant selenomethionine-substituted protein. Mimivirus is the prototype of a new family (the Mimiviridae) of nucleocytoplasmic large DNA viruses (NCLDVs), which already include the Poxviridae, Iridoviridae, Phycodnaviridae and Asfarviridae. Mimivirus specifically replicates in cells from the genus Acanthamoeba. Proteomic analysis of purified mimivirus particles revealed the presence of many subunits of the DNA-directed RNA polymerase II complex. A fully functional pre-transcriptional complex appears to be loaded in the virions, allowing mimivirus to initiate transcription within the host cytoplasm immediately upon infection independently of the host nuclear apparatus. To fully understand this process, a systematic study of mimivirus proteins that are predicted (by bioinformatics) or suspected (by proteomic analysis) to be involved in transcription was initiated by cloning and expressing them in Escherichia coli in order to determine their three-dimensional structures. Here, preliminary crystallographic analysis of the recombinant L544 protein is reported. The crystals belonged to the orthorhombic space group C222₁ with one monomer per asymmetric unit. A MAD data set was used for preliminary phasing using the selenium signal present in a selenomethionine-substituted protein crystal

  1. Computational and Physical Analysis of Catalytic Compounds

    Science.gov (United States)

    Wu, Richard; Sohn, Jung Jae; Kyung, Richard

    2015-03-01

    Nanoparticles exhibit unique physical and chemical properties depending on their geometrical properties. For this reason, synthesis of nanoparticles with controlled shape and size is important to use their unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalyzing ability of supported nano metal oxides, such as silicon oxide and titanium oxide compounds, has been analyzed using computational chemistry methods. Computational programs such as Gamess and Chemcraft have been used to compute the efficiencies of the catalytic compounds and the bonding energy changes during optimization convergence. The result illustrates how the metal oxides stabilize and the steps that this takes. The graph of computation step (N) versus energy (kcal/mol) shows that the energy of the titania converges faster, at the 7th iteration, whereas the silica converges at the 9th iteration.

  2. Classification and Analysis of Computer Network Traffic

    DEFF Research Database (Denmark)

    Bujlow, Tomasz

    2014-01-01

    various classification modes (decision trees, rulesets, boosting, softening thresholds) regarding the classification accuracy and the time required to create the classifier. We showed how to use our VBS tool to obtain per-flow, per-application, and per-content statistics of traffic in computer networks...

  3. Computer programs simplify optical system analysis

    Science.gov (United States)

    1965-01-01

    The optical ray-trace computer program performs geometrical ray tracing. The energy-trace program calculates the relative monochromatic flux density on a specific target area. This program uses the ray-trace program as a subroutine to generate a representation of the optical system.

  4. Analysis of airways in computed tomography

    DEFF Research Database (Denmark)

    Petersen, Jens

    Chronic Obstructive Pulmonary Disease (COPD) is major cause of death and disability world-wide. It affects lung function through destruction of lung tissue known as emphysema and inflammation of airways, leading to thickened airway walls and narrowed airway lumen. Computed Tomography (CT) imaging...

  5. Affect and Learning : a computational analysis

    NARCIS (Netherlands)

    Broekens, Douwe Joost

    2007-01-01

    In this thesis we have studied the influence of emotion on learning. We have used computational modelling techniques to do so, more specifically, the reinforcement learning paradigm. Emotion is modelled as artificial affect, a measure that denotes the positiveness versus negativeness of a situation

  6. Individualized computer-aided education in mammography based on user modeling: concept and preliminary experiments.

    Science.gov (United States)

    Mazurowski, Maciej A; Baker, Jay A; Barnhart, Huiman X; Tourassi, Georgia D

    2010-03-01

    The authors propose the framework for an individualized adaptive computer-aided educational system in mammography that is based on user modeling. The underlying hypothesis is that user models can be developed to capture the individual error making patterns of radiologists-in-training. In this pilot study, the authors test the above hypothesis for the task of breast cancer diagnosis in mammograms. The concept of a user model was formalized as the function that relates image features to the likelihood/extent of the diagnostic error made by a radiologist-in-training and therefore to the level of difficulty that a case will pose to the radiologist-in-training (or "user"). Then, machine learning algorithms were implemented to build such user models. Specifically, the authors explored k-nearest neighbor, artificial neural networks, and multiple regression for the task of building the model using observer data collected from ten Radiology residents at Duke University Medical Center for the problem of breast mass diagnosis in mammograms. For each resident, a user-specific model was constructed that predicts the user's expected level of difficulty for each presented case based on two BI-RADS image features. In the experiments, leave-one-out data handling scheme was applied to assign each case to a low-predicted-difficulty or a high-predicted-difficulty group for each resident based on each of the three user models. To evaluate whether the user model is useful in predicting difficulty, the authors performed statistical tests using the generalized estimating equations approach to determine whether the mean actual error is the same or not between the low-predicted-difficulty group and the high-predicted-difficulty group. When the results for all observers were pulled together, the actual errors made by residents were statistically significantly higher for cases in the high-predicted-difficulty group than for cases in the low-predicted-difficulty group for all modeling
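
    One of the user models described above, a nearest-neighbor mapping from two image features to a resident's expected diagnostic error evaluated leave-one-out, could look like the following sketch. The feature values and error scores are random placeholders, not the Duke observer data, and k-NN regression is only one of the three modeling approaches the authors explored.

```python
# Hypothetical sketch: k-NN user model predicting per-case difficulty from two features.
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(2)
features = rng.uniform(1, 5, size=(40, 2))    # stand-ins for two BI-RADS image features
errors = rng.uniform(0, 1, size=40)           # resident's diagnostic error per case (placeholder)

model = KNeighborsRegressor(n_neighbors=5)
predicted_difficulty = cross_val_predict(model, features, errors, cv=LeaveOneOut())

# split cases into low- and high-predicted-difficulty groups at the median prediction
median = np.median(predicted_difficulty)
high = errors[predicted_difficulty > median]
low = errors[predicted_difficulty <= median]
print("mean actual error: high group", high.mean(), "low group", low.mean())
```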

  7. Adapting computational text analysis to social science (and vice versa

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models to train data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.

  8. The opinions of the kindergarten teachers in relation to the introduction of computers to nursery schools: Preliminary approach

    Directory of Open Access Journals (Sweden)

    Irene Sivropoulou

    2009-03-01

    Computers were introduced in Greek kindergartens with the new curricula for kindergarten (Inter-disciplinary Integrated Framework of Study Programs, Official Journal of the Hellenic Republic 376/t.B/18-10-2001, article 6) in order to contribute to the all-round growth of children and to extend their learning. In other words, the computer is intended to increase interest and motivation for learning, to encourage active learning, to strengthen the dynamics of visualization, the importance of feedback and the possibility of monitoring, and to connect school activities with extracurricular activities in order to strengthen the social and cultural dimension of kindergarten. Nevertheless, technology cannot by itself bring the sought-after change in preschool education. Kindergarten teachers are the key to the successful use of computers in kindergarten. However, while kindergarten teachers in certain countries approve of the introduction and use of computers and believe that education with computers is developmentally suitable for small children, in other countries the attitude of kindergarten teachers towards computers is rather negative. Is this negative attitude related to the teachers' knowledge of computers and how often they use them, or to cultural factors and the prevailing educational philosophies? These questions led us to investigate the opinions of kindergarten teachers in Thessaloniki in regard to the introduction of new technologies in kindergarten. The research is made up of three interactive parts: it begins with a theoretical discussion of the introduction of computers in kindergarten, continues with an investigation of the opinions of 122 kindergarten teachers using a questionnaire of 33 questions, and ends with the interpretative analysis.

  9. Use of ultrafast computed tomography to quantitate regional myocardial perfusion: a preliminary report

    International Nuclear Information System (INIS)

    Rumberger, J.A.; Feiring, A.J.; Lipton, M.J.; Higgins, C.B.; Ell, S.R.; Marcus, M.L.

    1987-01-01

    The purpose of this study was to assess the potential for rapid acquisition computed axial tomography (Imatron C-100) to quantify regional myocardial perfusion. Myocardial and left ventricular cavity contrast clearance curves were constructed after injecting nonionic contrast (1 ml/kg over 2 to 3 seconds) into the inferior vena cava of six anesthetized, closed chest dogs (n = 14). Independent myocardial perfusion measurements were obtained by coincident injection of radiolabeled microspheres into the left atrium during control, intermediate and maximal myocardial vasodilation with adenosine (0.5 to 1.0 mg/kg per min, intravenously, respectively). At each flow state, 40 serial short-axis scans of the left ventricle were taken near end-diastole at the midpapillary muscle level. Contrast clearance curves were generated and analyzed from the left ventricular cavity and posterior papillary muscle regions after excluding contrast recirculation and minimizing partial volume effects. The area under the curve (gamma variate function) was determined for a region of interest placed within the left ventricular cavity. Characteristics of contrast clearance data from the posterior papillary muscle region that were evaluated included the peak myocardial opacification, area under the contrast clearance curve and a contrast clearance time defined by the full width/half maximal extent of the clearance curve. Myocardial perfusion (microspheres) ranged from 35 to 450 ml/100 g per min (mean 167 +/- 125)
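
    The gamma-variate fit mentioned above, used to obtain the area under each contrast clearance curve, can be sketched as follows. The time-density samples are synthetic placeholders, not the canine study data, and the functional form is the standard first-pass gamma-variate model rather than anything specific to the Imatron analysis.

```python
# Hypothetical sketch: fit a gamma-variate function to a contrast clearance curve and
# report the area under the fitted curve. All samples and parameters are illustrative.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import curve_fit

def gamma_variate(t, k, t0, alpha, beta):
    """Standard gamma-variate model for first-pass contrast curves."""
    dt = np.clip(t - t0, 0.0, None)
    return k * dt**alpha * np.exp(-dt / beta)

t = np.linspace(0, 30, 40)                                   # seconds (assumed sampling)
signal = gamma_variate(t, 80.0, 3.0, 2.0, 3.5)
signal += np.random.default_rng(3).normal(0, 2.0, t.size)    # add measurement noise

popt, _ = curve_fit(gamma_variate, t, signal, p0=(50.0, 2.0, 1.5, 3.0))
area, _ = quad(lambda x: gamma_variate(x, *popt), 0, 60)
print("fitted parameters:", popt, "area under curve:", area)
```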

  10. Preliminary Investigation of Time Remaining Display on the Computer-based Emergency Operating Procedure

    Science.gov (United States)

    Suryono, T. J.; Gofuku, A.

    2018-02-01

    One of the important things in the mitigation of nuclear power plant accidents is time management. Accidents should be resolved as soon as possible in order to prevent core melting and the release of radioactive material to the environment. In this case, operators should follow the emergency operating procedure related to the accident, step by step and within the allowable time. Nowadays, advanced main control rooms are equipped with computer-based procedures (CBPs), which make it easier for operators to perform their tasks of monitoring and controlling the reactor. However, most CBPs do not include a time-remaining display feature, which would inform operators of the time available to execute procedure steps and warn them when they reach the time limit. Furthermore, such a feature would increase operators' awareness of their current situation in the procedure. This paper investigates this issue. A simplified emergency operating procedure (EOP) for a steam generator tube rupture (SGTR) accident in a PWR plant is applied. In addition, the sequence of actions in each step of the procedure is modelled using multilevel flow modelling (MFM) and influence propagation rules. The prediction of the action time for each step is obtained from similar accident cases using support vector regression. The derived time will be processed and then displayed on a CBP user interface.
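
    The regression step described above, predicting the execution time of a procedure step so that a time-remaining value can be displayed, might be sketched as below. The step features, timing data, and model settings are assumptions for illustration; the paper's actual feature set and training data are not given in the abstract.

```python
# Hypothetical sketch: support vector regression mapping simple features of a procedure
# step to its expected execution time, to feed a time-remaining display. Placeholder data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(4)
# assumed features per step: [number of actions, number of checks, operator workload index]
X = rng.uniform(0, 10, size=(50, 3))
y = 20 + 5 * X[:, 0] + 3 * X[:, 1] + rng.normal(0, 2, 50)   # execution time in seconds

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
model.fit(X, y)

remaining = model.predict(X[:1])[0]
print(f"predicted time for the next step: {remaining:.1f} s")
```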

  11. Experience with a distributed computing system for magnetic field analysis

    International Nuclear Information System (INIS)

    Newman, M.J.

    1978-08-01

    The development of a general-purpose computer system, THESEUS, is described; its initial use has been magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, and others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers. It can easily be adapted for a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and the problems experienced are highlighted, together with a mention of possible future developments. (U.K.)

  12. Interface between computational fluid dynamics (CFD) and plant analysis computer codes

    International Nuclear Information System (INIS)

    Coffield, R.D.; Dunckhorst, F.F.; Tomlinson, E.T.; Welch, J.W.

    1993-01-01

    Computational fluid dynamics (CFD) can provide valuable input to the development of advanced plant analysis computer codes. The types of interfacing discussed in this paper will directly contribute to modeling and accuracy improvements throughout the plant system and should result in significant reduction of design conservatisms that have been applied to such analyses in the past

  13. Computational analysis of ozonation in bubble columns

    International Nuclear Information System (INIS)

    Quinones-Bolanos, E.; Zhou, H.; Otten, L.

    2002-01-01

    This paper presents a new computational ozonation model based on the principles of computational fluid dynamics along with the kinetics of ozone decay and microbial inactivation to predict the performance of ozone disinfection in fine bubble columns. The model can be represented using a mixture two-phase flow model to simulate the hydrodynamics of the water flow and using two transport equations to track the concentration profiles of ozone and microorganisms along the height of the column, respectively. The applicability of this model was then demonstrated by comparing the simulated ozone concentrations with experimental measurements obtained from a pilot-scale fine bubble column. One distinct advantage of this approach is that it does not require prerequisite assumptions such as plug-flow conditions, perfect mixing, tanks-in-series, or uniform radial or longitudinal dispersion, and it can predict the performance of disinfection contactors without carrying out expensive and tedious tracer studies. (author)
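
    Stripped of the two-phase hydrodynamics, the two transport equations mentioned above reduce, in one dimension, to coupled balances for ozone decay and microbial inactivation along the column height. The sketch below integrates such a reduced system; the rate constants, velocity, and inlet values are assumptions for illustration, not parameters from the pilot column.

```python
# Hypothetical 1D sketch: ozone decay and Chick-Watson microbial inactivation along a
# column, ignoring two-phase hydrodynamics. All constants are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

k_decay = 0.15        # 1/min, assumed first-order ozone decay constant
k_inact = 2.0         # L/(mg*min), assumed inactivation rate constant
u = 0.5               # m/min, assumed liquid superficial velocity

def column(z, y):
    c_o3, n = y                                   # ozone (mg/L), microorganisms (CFU/L)
    dc = -k_decay * c_o3 / u
    dn = -k_inact * c_o3 * n / u                  # Chick-Watson inactivation term
    return [dc, dn]

sol = solve_ivp(column, (0.0, 4.0), [2.0, 1e6], dense_output=True)   # assumed 4 m column
heights = np.linspace(0, 4, 5)
print(sol.sol(heights))   # ozone and microorganism profiles along the column height
```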

  14. Preliminary results of BRAVO project: brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks.

    Science.gov (United States)

    Bergamasco, Massimo; Frisoli, Antonio; Fontana, Marco; Loconsole, Claudio; Leonardis, Daniele; Troncossi, Marco; Foumashi, Mohammad Mozaffari; Parenti-Castelli, Vincenzo

    2011-01-01

    This paper presents the preliminary results of the project BRAVO (Brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks). The objective of this project is to define a new approach to the development of assistive and rehabilitative robots for motor impaired users to perform complex visuomotor tasks that require a sequence of reaches, grasps and manipulations of objects. BRAVO aims at developing new robotic interfaces and HW/SW architectures for rehabilitation and regain/restoration of motor function in patients with upper limb sensorimotor impairment through extensive rehabilitation therapy and active assistance in the execution of Activities of Daily Living. The final system developed within this project will include a robotic arm exoskeleton and a hand orthosis that will be integrated together for providing force assistance. The main novelty that BRAVO introduces is the control of the robotic assistive device through the active prediction of intention/action. The system will actually integrate the information about the movement carried out by the user with a prediction of the performed action through an interpretation of current gaze of the user (measured through eye-tracking), brain activation (measured through BCI) and force sensor measurements. © 2011 IEEE

  15. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  16. The preliminary exploration of 64-slice volume computed tomography in the accurate measurement of pleural effusion.

    Science.gov (United States)

    Guo, Zhi-Jun; Lin, Qiang; Liu, Hai-Tao; Lu, Jun-Ying; Zeng, Yan-Hong; Meng, Fan-Jie; Cao, Bin; Zi, Xue-Rong; Han, Shu-Ming; Zhang, Yu-Huan

    2013-09-01

    Using computed tomography (CT) to rapidly and accurately quantify pleural effusion volume benefits medical and scientific research. However, precise measurement of pleural effusion volume still involves many challenges, and there is currently no recognized accurate measurement method. The aim was to explore the feasibility of using 64-slice CT volume-rendering technology to accurately measure pleural fluid volume and to then analyze the correlation between the volume of the free pleural effusion and the different diameters of the pleural effusion. The 64-slice CT volume-rendering technique was used for measurement and analysis in three parts. First, the fluid volume of a self-made thoracic model was measured and compared with the actual injected volume. Second, the pleural effusion volume was measured before and after pleural fluid drainage in 25 patients, and the volume reduction was compared with the actual volume of the liquid extract. Finally, the free pleural effusion volume was measured in 26 patients to analyze the correlation between it and the diameter of the effusion, which was then used to calculate the regression equation. After using the 64-slice CT volume-rendering technique to measure the fluid volume of the self-made thoracic model, the results were compared with the actual injection volume. No significant differences were found, P = 0.836. For the 25 patients with drained pleural effusions, the comparison of the reduction volume with the actual volume of the liquid extract revealed no significant differences, P = 0.989. The following linear regression equation was used to compare the pleural effusion volume (V) (measured by the CT volume-rendering technique) with the pleural effusion greatest depth (d): V = 158.16 × d - 116.01 (r = 0.91, P = 0.000). The following linear regression was used to compare the volume with the product of the pleural effusion diameters (l × h × d): V = 0.56 × (l × h × d) + 39.44 (r = 0.92, P = 0.000). The 64-slice CT volume-rendering technique can
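
    Applying the two reported regression equations is straightforward; the sketch below evaluates both for an assumed effusion. The input measurements are invented, and the abstract does not state the measurement units used in the study, so the outputs are illustrative only.

```python
# Worked application of the two regression equations reported above, with assumed inputs.
def volume_from_depth(d):
    """V = 158.16 * d - 116.01 (r = 0.91)."""
    return 158.16 * d - 116.01

def volume_from_diameters(l, h, d):
    """V = 0.56 * (l * h * d) + 39.44 (r = 0.92)."""
    return 0.56 * (l * h * d) + 39.44

print(volume_from_depth(5.0))                   # 674.79 for an assumed greatest depth of 5
print(volume_from_diameters(12.0, 10.0, 5.0))   # 375.44 for assumed diameters 12 x 10 x 5
```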

  17. The preliminary exploration of 64-slice volume computed tomography in the accurate measurement of pleural effusion

    International Nuclear Information System (INIS)

    Guo, Zhi-Jun; Lin, Qiang; Liu, Hai-Tao

    2013-01-01

    Background: Using computed tomography (CT) to rapidly and accurately quantify pleural effusion volume benefits medical and scientific research. However, the precise volume of pleural effusions still involves many challenges and currently does not have a recognized accurate measuring. Purpose: To explore the feasibility of using 64-slice CT volume-rendering technology to accurately measure pleural fluid volume and to then analyze the correlation between the volume of the free pleural effusion and the different diameters of the pleural effusion. Material and Methods: The 64-slice CT volume-rendering technique was used to measure and analyze three parts. First, the fluid volume of a self-made thoracic model was measured and compared with the actual injected volume. Second, the pleural effusion volume was measured before and after pleural fluid drainage in 25 patients, and the volume reduction was compared with the actual volume of the liquid extract. Finally, the free pleural effusion volume was measured in 26 patients to analyze the correlation between it and the diameter of the effusion, which was then used to calculate the regression equation. Results: After using the 64-slice CT volume-rendering technique to measure the fluid volume of the self-made thoracic model, the results were compared with the actual injection volume. No significant differences were found, P = 0.836. For the 25 patients with drained pleural effusions, the comparison of the reduction volume with the actual volume of the liquid extract revealed no significant differences, P = 0.989. The following linear regression equation was used to compare the pleural effusion volume (V) (measured by the CT volume-rendering technique) with the pleural effusion greatest depth (d): V = 158.16 X d - 116.01 (r = 0.91, P = 0.000). The following linear regression was used to compare the volume with the product of the pleural effusion diameters (l X h X d): V = 0.56 X (l X h X d) + 39.44 (r = 0.92, P = 0

  18. The preliminary exploration of 64-slice volume computed tomography in the accurate measurement of pleural effusion

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Zhi-Jun [Dept. of Radiology, North China Petroleum Bureau General Hospital, Renqiu, Hebei (China)], e-mail: Gzj3@163.com; Lin, Qiang [Dept. of Oncology, North China Petroleum Bureau General Hospital, Renqiu, Hebei (China); Liu, Hai-Tao [Dept. of General Surgery, North China Petroleum Bureau General Hospital, Renqiu, Hebei (China)] [and others])

    2013-09-15

    Background: Using computed tomography (CT) to rapidly and accurately quantify pleural effusion volume benefits medical and scientific research. However, the precise volume of pleural effusions still involves many challenges and currently does not have a recognized accurate measuring. Purpose: To explore the feasibility of using 64-slice CT volume-rendering technology to accurately measure pleural fluid volume and to then analyze the correlation between the volume of the free pleural effusion and the different diameters of the pleural effusion. Material and Methods: The 64-slice CT volume-rendering technique was used to measure and analyze three parts. First, the fluid volume of a self-made thoracic model was measured and compared with the actual injected volume. Second, the pleural effusion volume was measured before and after pleural fluid drainage in 25 patients, and the volume reduction was compared with the actual volume of the liquid extract. Finally, the free pleural effusion volume was measured in 26 patients to analyze the correlation between it and the diameter of the effusion, which was then used to calculate the regression equation. Results: After using the 64-slice CT volume-rendering technique to measure the fluid volume of the self-made thoracic model, the results were compared with the actual injection volume. No significant differences were found, P = 0.836. For the 25 patients with drained pleural effusions, the comparison of the reduction volume with the actual volume of the liquid extract revealed no significant differences, P = 0.989. The following linear regression equation was used to compare the pleural effusion volume (V) (measured by the CT volume-rendering technique) with the pleural effusion greatest depth (d): V = 158.16 X d - 116.01 (r = 0.91, P = 0.000). The following linear regression was used to compare the volume with the product of the pleural effusion diameters (l X h X d): V = 0.56 X (l X h X d) + 39.44 (r = 0.92, P = 0

  19. Preliminary hazard analysis for the Brayton Isotope Ground Demonstration System (including vacuum test chamber)

    International Nuclear Information System (INIS)

    Miller, L.G.

    1975-01-01

    The Preliminary Hazard Analysis (PHA) of the BIPS-GDS is a tabular summary of hazards and undesired events which may lead to system damage or failure and/or hazard to personnel. The PHA reviews the GDS as it is envisioned to operate in the Vacuum Test Chamber (VTC) of the GDS Test Facility. The VTC and other equipment which will comprise the test facility are presently in an early stage of preliminary design and will undoubtedly undergo numerous changes before the design is frozen. The PHA and the FMECA to follow are intended to aid the design effort by identifying areas of concern which are critical to the safety and reliability of the BIPS-GDS and test facility

  20. Expression, purification, crystallization and preliminary crystallographic analysis of the proliferation-associated protein Ebp1

    International Nuclear Information System (INIS)

    Kowalinski, Eva; Bange, Gert; Wild, Klemens; Sinning, Irmgard

    2007-01-01

    Preliminary X-ray analysis of the proliferation-associated protein Ebp1 from Homo sapiens is provided. ErbB-3-binding protein 1 (Ebp1) is a member of the family of proliferation-associated 2G4 proteins (PA2G4s) and plays a role in cellular growth and differentiation. Ligand-induced activation of the transmembrane receptor ErbB3 leads to dissociation of Ebp1 from the receptor in a phosphorylation-dependent manner. The non-associated protein is involved in transcriptional and translational regulation in the cell. Here, the overexpression, purification, crystallization and preliminary crystallographic studies of Ebp1 from Homo sapiens are reported. Initially observed crystals were improved by serial seeding to single crystals suitable for data collection. The optimized crystals belong to the tetragonal space group P4₁2₁2 or P4₃2₁2 and diffracted to a resolution of 1.6 Å

  1. Most significant preliminary results of the probabilistic safety analysis on the Juragua nuclear power plant

    International Nuclear Information System (INIS)

    Perdomo, Manuel

    1995-01-01

    Since 1990 the Group for PSA Development and Applications (GDA/APS) has been working on the Level-1 PSA for the Juragua-1 NPP, as a part of an IAEA Technical Assistance Project. The main objective of this study, which is still under way, is to assess, in a preliminary way, the reactor design safety and to find its potential 'weak points' at the construction stage, using a generic data base. At the same time, the study allows the PSA team to become familiar with the plant design and with analysis techniques for the future operational PSA of the plant. This paper presents the most significant preliminary results of the study, which reveal some advantages of the safety characteristics of the plant design in comparison with the homologous VVER-440 reactors, and some areas where slight modifications would improve the plant safety, considering the level of detail at which the study was carried out. (author). 13 refs, 1 fig, 2 tabs

  2. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purposes. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  3. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purposes. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  4. Accident sequence analysis of human-computer interface design

    International Nuclear Information System (INIS)

    Fan, C.-F.; Chen, W.-H.

    2000-01-01

    It is important to predict potential accident sequences of human-computer interaction in a safety-critical computing system so that vulnerable points can be disclosed and removed. We address this issue by proposing a Multi-Context human-computer interaction Model along with its analysis techniques, an Augmented Fault Tree Analysis, and a Concurrent Event Tree Analysis. The proposed augmented fault tree can identify the potential weak points in software design that may induce unintended software functions or erroneous human procedures. The concurrent event tree can enumerate possible accident sequences due to these weak points
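
    As a minimal, generic illustration of the quantification that sits behind fault-tree techniques like the augmented fault tree described above, the sketch below evaluates a top-event probability from basic-event probabilities through AND/OR gates, assuming independent events. The gate structure and probabilities are invented and are not taken from the paper.

```python
# Generic sketch: top-event probability of a small fault tree with independent events.
from functools import reduce

def and_gate(*p):                      # all inputs must occur
    return reduce(lambda a, b: a * b, p, 1.0)

def or_gate(*p):                       # at least one input occurs
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), p, 1.0)

p_ui_misleads = 1e-3                   # assumed: interface presents an ambiguous state
p_operator_slip = 5e-2                 # assumed: operator misreads the display
p_interlock_fails = 1e-4               # assumed: software interlock fails to block the action

top = and_gate(or_gate(p_ui_misleads, p_operator_slip), p_interlock_fails)
print(f"top-event probability per demand: {top:.2e}")
```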

  5. Application of microarray analysis on computer cluster and cloud platforms.

    Science.gov (United States)

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
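
    The computational independence the authors rely on can be shown with a tiny permutation test: each iteration is independent, so the same code can be spread over local cores, a departmental cluster, or rented cloud instances. The data and test statistic below are placeholders, and the multiprocessing pool stands in for whichever cluster or cloud backend is actually used.

```python
# Sketch: an embarrassingly parallel permutation test on placeholder "expression" data.
import numpy as np
from multiprocessing import Pool

rng = np.random.default_rng(5)
group_a = rng.normal(0.0, 1.0, 30)      # placeholder expression values, condition A
group_b = rng.normal(0.4, 1.0, 30)      # placeholder expression values, condition B
observed = group_a.mean() - group_b.mean()
pooled = np.concatenate([group_a, group_b])

def one_permutation(seed):
    r = np.random.default_rng(seed)
    perm = r.permutation(pooled)
    return perm[:30].mean() - perm[30:].mean()

if __name__ == "__main__":
    with Pool() as pool:                # swap for a cluster or cloud backend as needed
        null = pool.map(one_permutation, range(10000))
    p_value = np.mean(np.abs(null) >= abs(observed))
    print("permutation p-value:", p_value)
```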

  6. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high performance computing are analyzed by bibliometric approaches. This study aims at providing computational physicists utilizing high-performance computing and policy planners with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles of the Scopus database from Elsevier covering the time period 2004-2013. We extracted the author rank in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.

  7. Isogeometric analysis : a calculus for computational mechanics

    NARCIS (Netherlands)

    Benson, D.J.; Borst, de R.; Hughes, T.J.R.; Scott, M.A.; Verhoosel, C.V.; Topping, B.H.V.; Adam, J.M.; Pallarés, F.J.; Bru, R.; Romero, M.L.

    2010-01-01

    The first paper on isogeometric analysis appeared only five years ago [1], and the first book appeared last year [2]. Progress has been rapid. Isogeometric analysis has been applied to a wide variety of problems in solids, fluids and fluid-structure interactions. Superior accuracy to traditional

  8. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  9. Isotopic analysis of plutonium by computer controlled mass spectrometry

    International Nuclear Information System (INIS)

    1974-01-01

    Isotopic analysis of plutonium chemically purified by ion exchange is achieved using a thermal ionization mass spectrometer. Data acquisition from and control of the instrument is done automatically with a dedicated system computer in real time with subsequent automatic data reduction and reporting. Separation of isotopes is achieved by varying the ion accelerating high voltage with accurate computer control

  10. Computer Programme for the Dynamic Analysis of Tall Regular ...

    African Journals Online (AJOL)

    The traditional method of dynamic analysis of tall rigid frames assumes the shear frame model. Models that allow joint rotations with/without the inclusion of the column axial loads give improved results but pose much more computational difficulty. In this work a computer program Natfrequency that determines the dynamic ...

  11. Development and preliminary user testing of the DCIDA (Dynamic computer interactive decision application) for 'nudging' patients towards high quality decisions.

    Science.gov (United States)

    Bansback, Nick; Li, Linda C; Lynd, Larry; Bryan, Stirling

    2014-08-01

    Patient decision aids (PtDA) are developed to facilitate informed, value-based decisions about health. Research suggests that even when patients are informed with the necessary evidence and information, cognitive errors can prevent them from choosing the option that is most congruent with their own values. We sought to utilize principles of behavioural economics to develop a computer application that presents information from conventional decision aids in a way that reduces these errors, subsequently promoting higher quality decisions. The Dynamic Computer Interactive Decision Application (DCIDA) was developed to target four common errors that can impede quality decision making with PtDAs: unstable values, order effects, overweighting of rare events, and information overload. Healthy volunteers were recruited to an interview to use three PtDAs converted to the DCIDA on a computer equipped with an eye tracker. Participants first used a conventional PtDA and then used the DCIDA version. User testing assessed whether respondents found the software both usable, evaluated using a) eye-tracking, b) the system usability scale, and c) user verbal responses from a 'think aloud' protocol; and useful, evaluated using a) eye-tracking, b) whether preferences for options were changed, and c) the decisional conflict scale. Of the 20 participants recruited to the study, 11 were male (55%), the mean age was 35, 18 had at least a high school education (90%), and 8 (40%) had a college or university degree. Eye-tracking results, alongside a mean system usability scale score of 73 (range 68-85), indicated a reasonable degree of usability for the DCIDA. The think aloud study suggested areas for further improvement. The DCIDA also appeared to be useful to participants wherein subjects focused more on the features of the decision that were most important to them (21% increase in time spent focusing on the most important feature). Seven subjects (25%) changed their

  12. Computer-assisted quantification of interstitial lung disease associated with rheumatoid arthritis: Preliminary technical validation

    International Nuclear Information System (INIS)

    Marten, K.; Dicken, V.; Kneitz, C.; Hoehmann, M.; Kenn, W.; Hahn, D.; Engelke, C.

    2009-01-01

    Purpose: To validate a threshold-based prototype software application (MeVis PULMO 3D) for quantification of chronic interstitial lung disease (ILD) in patients with rheumatoid arthritis (RA) using variable threshold settings for segmentation of diseased lung areas. Methods: Twenty-two patients with rheumatoid arthritis were included and underwent thin-section CT (4 x 1.25 mm collimation). CT scans were assessed by two observers for extent of ILD (EoILD), and twice by MeVis PULMO 3D for each protocol. MeVis PULMO 3D used four segmentation threshold (ST) settings (ST = -740, -780, -800 and -840 HU). Pulmonary function tests were obtained in all patients. Statistical evaluation used 95% limits of agreement (LoA) and linear regression analysis. Results: There was total concordance between the software measurements. Interobserver agreement was good (LoA = -28.36 to 17.58%). EoILD by readers correlated strongly with DL CO (r = -0.702, p CO at ST of -800 HU (r = -0.44, -0.49, -0.58 and -0.57 for ST = -740, -780, -800 and -840, respectively; p = 0.007-0.05) and moderately with FVC (r = -0.44, -0.51, -0.59 and -0.45 for ST = -740, -780, -800 and -840), respectively; p = 0.007-0.05). Conclusion: The MeVis PULMO 3D system used holds promise to become a valuable instrument for quantification of chronic ILD in patients with RA when using the threshold value of -800 HU, with evidence of the closest correlations, both with human observers and physiologic impairment.

  13. Computer use and carpal tunnel syndrome: A meta-analysis.

    Science.gov (United States)

    Shiri, Rahman; Falah-Hassani, Kobra

    2015-02-15

    Studies have reported contradictory results on the role of keyboard or mouse use in carpal tunnel syndrome (CTS). This meta-analysis aimed to assess whether computer use causes CTS. Literature searches were conducted in several databases until May 2014. Twelve studies qualified for a random-effects meta-analysis. Heterogeneity and publication bias were assessed. In a meta-analysis of six studies (N=4964) that compared computer workers with the general population or other occupational populations, computer/typewriter use (pooled odds ratio (OR)=0.72, 95% confidence interval (CI) 0.58-0.90), computer/typewriter use ≥1 vs. computer/typewriter use ≥4 vs. computer/typewriter use (pooled OR=1.34, 95% CI 1.08-1.65), mouse use (OR=1.93, 95% CI 1.43-2.61), frequent computer use (OR=1.89, 95% CI 1.15-3.09), frequent mouse use (OR=1.84, 95% CI 1.18-2.87) and with years of computer work (OR=1.92, 95% CI 1.17-3.17 for long vs. short). There was no evidence of publication bias for both types of studies. Studies that compared computer workers with the general population or several occupational groups did not control their estimates for occupational risk factors. Thus, office workers with no or little computer use are a more appropriate comparison group than the general population or several occupational groups. This meta-analysis suggests that excessive computer use, particularly mouse usage might be a minor occupational risk factor for CTS. Further prospective studies among office workers with objectively assessed keyboard and mouse use, and CTS symptoms or signs confirmed by a nerve conduction study are needed. Copyright © 2014 Elsevier B.V. All rights reserved.
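
    The pooling step behind the reported odds ratios can be illustrated with a small random-effects (DerSimonian-Laird) calculation on the log-odds-ratio scale. The study-level odds ratios and confidence intervals below are invented and are not the estimates from the twelve studies in this meta-analysis.

```python
# Hypothetical sketch: DerSimonian-Laird random-effects pooling of study odds ratios.
import numpy as np

or_est = np.array([0.8, 1.2, 1.5, 0.9, 1.3])          # assumed study odds ratios
ci_low = np.array([0.5, 0.8, 1.0, 0.6, 0.9])           # assumed lower 95% CI bounds
ci_high = np.array([1.3, 1.8, 2.3, 1.4, 1.9])          # assumed upper 95% CI bounds

y = np.log(or_est)                                     # work on the log-OR scale
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # back out standard errors from CIs
w = 1.0 / se**2                                        # fixed-effect weights

# DerSimonian-Laird between-study variance tau^2
q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)
tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_star = 1.0 / (se**2 + tau2)                          # random-effects weights
pooled = np.sum(w_star * y) / np.sum(w_star)
pooled_se = np.sqrt(1.0 / np.sum(w_star))
print("pooled OR:", np.exp(pooled),
      "95% CI:", np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se))
```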

  14. Tolerance analysis through computational imaging simulations

    Science.gov (United States)

    Birch, Gabriel C.; LaCasse, Charles F.; Stubbs, Jaclynn J.; Dagel, Amber L.; Bradley, Jon

    2017-11-01

    The modeling and simulation of non-traditional imaging systems require holistic consideration of the end-to-end system. We demonstrate this approach through a tolerance analysis of a random scattering lensless imaging system.

  15. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  16. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    International Nuclear Information System (INIS)

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  18. From Digital Imaging to Computer Image Analysis of Fine Art

    Science.gov (United States)

    Stork, David G.

    An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.

  19. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)

  20. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  1. Computer System Analysis for Decommissioning Management of Nuclear Reactor

    International Nuclear Information System (INIS)

    Nurokhim; Sumarbagiono

    2008-01-01

    Nuclear reactor decommissioning is a complex activity that should be planned and implemented carefully. A computer-based system needs to be developed to support nuclear reactor decommissioning. Some computer systems have been studied for the management of nuclear power reactors. The software systems COSMARD and DEXUS, developed in Japan, and IDMT, developed in Italy, were used as models for analysis and discussion. It can be concluded that a computer system for nuclear reactor decommissioning management is quite complex, involving computer codes for radioactive inventory database calculation, calculation modules for the stages of the decommissioning phases, and spatial data system development for virtual reality. (author)

  2. Preliminary design and off-design performance analysis of an Organic Rankine Cycle for geothermal sources

    International Nuclear Information System (INIS)

    Hu, Dongshuai; Li, Saili; Zheng, Ya; Wang, Jiangfeng; Dai, Yiping

    2015-01-01

    Highlights: • A method for preliminary design and performance prediction is established. • Preliminary data of radial inflow turbine and plate heat exchanger are obtained. • Off-design performance curves of critical components are researched. • Performance maps in sliding pressure operation are illustrated. - Abstract: Geothermal fluid of 90 °C and 10 kg/s can be exploited together with oil in Huabei Oilfield of China. The Organic Rankine Cycle is regarded as a reasonable method to utilize these geothermal sources. This study conducts a detailed design and off-design performance analysis based on the preliminary design of the turbine and heat exchangers. A radial inflow turbine and plate heat exchangers are selected in this paper. Sliding pressure operation is applied in the simulation and three parameters are considered: geothermal fluid mass flow rate, geothermal fluid temperature and condensing pressure. The results indicate that in all considered conditions the designed radial inflow turbine has smooth off-design performance and no choking or supersonic flow is found at the nozzle and rotor exit. A larger geothermal fluid mass flow rate, a higher geothermal fluid temperature and a lower condensing pressure contribute to the increase of cycle efficiency and net power. Performance maps are illustrated to make the system meet different load requirements, especially when the geothermal fluid temperature and condensing pressure deviate from the design condition. This model can be used to provide basic data for future detailed design, and to predict off-design performance in the initial design phase

  3. System Matrix Analysis for Computed Tomography Imaging

    Science.gov (United States)

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high-quality CT images be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482
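
    The Siddon-type approach mentioned above computes, for each ray, the lengths of its intersections with the image pixels; those lengths form the nonzero entries of one row of the system matrix. The sketch below is a minimal 2D illustration of that parametric idea (uniform unit grid; function and variable names are illustrative assumptions, not the authors' implementation).

    ```python
    import numpy as np

    def ray_pixel_intersections(p0, p1, nx, ny, dx=1.0, dy=1.0):
        """Return (row, col, length) for the pixels crossed by the segment
        p0 -> p1, using the parametric grid-crossing idea of Siddon's method."""
        p0 = np.asarray(p0, float)
        p1 = np.asarray(p1, float)
        d = p1 - p0
        # Parameter values (0..1) where the ray crosses vertical/horizontal grid lines.
        alphas = [0.0, 1.0]
        if d[0] != 0.0:
            alphas += [(i * dx - p0[0]) / d[0] for i in range(nx + 1)]
        if d[1] != 0.0:
            alphas += [(j * dy - p0[1]) / d[1] for j in range(ny + 1)]
        alphas = np.unique([a for a in alphas if 0.0 <= a <= 1.0])
        ray_length = np.linalg.norm(d)
        entries = []
        for a0, a1 in zip(alphas[:-1], alphas[1:]):
            mid = p0 + 0.5 * (a0 + a1) * d          # midpoint identifies the pixel
            col, row = int(mid[0] // dx), int(mid[1] // dy)
            if 0 <= row < ny and 0 <= col < nx:
                entries.append((row, col, (a1 - a0) * ray_length))
        return entries   # one sparse row of the system matrix

    print(ray_pixel_intersections((0.0, 0.5), (4.0, 2.5), nx=4, ny=3))
    ```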

  4. Computational analysis of sequence selection mechanisms.

    Science.gov (United States)

    Meyerguz, Leonid; Grasso, Catherine; Kleinberg, Jon; Elber, Ron

    2004-04-01

    Mechanisms leading to gene variations are responsible for the diversity of species and are important components of the theory of evolution. One constraint on gene evolution is that of protein foldability; the three-dimensional shapes of proteins must be thermodynamically stable. We explore the impact of this constraint and calculate properties of foldable sequences using 3660 structures from the Protein Data Bank. We seek a selection function that receives sequences as input, and outputs survival probability based on sequence fitness to structure. We compute the number of sequences that match a particular protein structure with energy lower than the native sequence, the density of the number of sequences, the entropy, and the "selection" temperature. The mechanism of structure selection for sequences longer than 200 amino acids is approximately universal. For shorter sequences, it is not. We speculate on concrete evolutionary mechanisms that show this behavior.

  5. Waste Feed Delivery System Phase 1 Preliminary RAM Analysis [SEC 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    DYKES, A.A.

    2000-10-11

    This report presents the updated results of the preliminary reliability, availability, and maintainability (RAM) analysis of selected waste feed delivery (WFD) operations to be performed by the Tank Farm Contractor (TFC) during Phase I activities in support of the Waste Treatment and Immobilization Plant (WTP). For planning purposes, waste feed tanks are being divided into five classes in accordance with the type of waste in each tank and the activities required to retrieve, qualify, and transfer waste feed. This report reflects the baseline design and operating concept, as of the beginning of Fiscal Year 2000, for the delivery of feed from three of these classes, represented by source tanks 241-AN-102, 241-AZ-101 and 241-AN-105. The preliminary RAM analysis quantifies the potential schedule delay associated with operations and maintenance (O&M) field activities needed to accomplish these operations. The RAM analysis is preliminary because the system design, process definition, and activity planning are in a state of evolution. The results are being used to support the continuing development of an O&M Concept tailored to the unique requirements of the WFD Program, which is being documented in various volumes of the Waste Feed Delivery Technical Basis (Carlson 1999, Rasmussen 1999, and Orme 2000). The waste feed provided to the WTP must: (1) meet limits for chemical and radioactive constituents based on pre-established compositional envelopes (i.e., feed quality); (2) be delivered in acceptable quantities within a prescribed sequence (i.e., feed quantity); and (3) meet schedule requirements (i.e., feed timing). In the absence of new criteria related to acceptable schedule performance due to the termination of the TWRS Privatization Contract, the original criteria from the Tank Waste Remediation System (TWRS) Privatization Contract (DOE 1998) will continue to be used for this analysis.

  6. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
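
    As a rough illustration of applying such nodal perturbations (the patented method derives the displacement vectors from mean-value coordinate calculations, which are not reproduced here; all names and numbers below are hypothetical), one Monte Carlo realization of a perturbed geometry might be generated like this:

    ```python
    import numpy as np

    def perturb_mesh(nominal_nodes, displacement_field, scale):
        """Shift nominal node coordinates by a scaled displacement field to
        obtain one perturbed realization of the model geometry."""
        return nominal_nodes + scale * displacement_field

    rng = np.random.default_rng(0)
    nominal = rng.random((100, 3))                          # nominal node coordinates
    disp = rng.normal(scale=0.01, size=(100, 3))            # per-node displacement vectors
    perturbed = perturb_mesh(nominal, disp, rng.normal())   # one probabilistic sample
    ```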

  7. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two units of personal computer were successfully networked together to form a small scale cluster. Each of the processor involved are multicore processor which has four cores in it, thus made this cluster to have eight processors. Here, the cluster incorporate Ubuntu 14.04 LINUX environment with MPI implementation (MPICH2). Two main tests were conducted in order to test the cluster, which is communication test and performance test. The communication test was done to make sure that the computers are able to pass the required information without any problem and were done by using simple MPI Hello Program where the program written in C language. Additional, performance test was also done to prove that this cluster calculation performance is much better than single CPU computer. In this performance test, four tests were done by running the same code by using single node, 2 processors, 4 processors, and 8 processors. The result shows that with additional processors, the time required to solve the problem decrease. Time required for the calculation shorten to half when we double the processors. To conclude, we successfully develop a small scale cluster computer using common hardware which capable of higher computing power when compare to single CPU processor, and this can be beneficial for research that require high computing power especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics analysis.
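
    The communication test described is essentially an MPI "hello world". The original test program was written in C; the snippet below is an analogous sketch using Python and mpi4py (an assumption on our part, not the authors' code), launched across the cluster with mpiexec.

    ```python
    # Run with, e.g.: mpiexec -n 8 python hello_mpi.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()              # id of this process
    size = comm.Get_size()              # total number of processes in the job
    node = MPI.Get_processor_name()     # which cluster node this process runs on

    print(f"Hello from process {rank} of {size} on {node}")
    ```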

  8. Preliminary analysis for model development of groundwater evolution in Horonobe area

    International Nuclear Information System (INIS)

    Yoshida, Yasushi; Yui, Mikazu

    2003-03-01

    The preliminary analysis for model development of groundwater evolution in the Horonobe area was performed with data from the D-1, HDB-1 and HDB-2 boreholes, where hydrogen/oxygen isotope concentrations, mineral properties of the sedimentary rock and physico-chemical parameters (pH, Eh and ionic concentrations) were measured. The analysis of the hydrogen and oxygen isotope concentrations showed that saline water in the marine sediment was diluted by mixing with shallow groundwater and by diffusion. From the analysis of the bicarbonate concentration, based on the difference between pH values measured in situ and under air, the estimated in-situ bicarbonate concentration differs from that measured under air. Minerals assumed to be in equilibrium with the groundwater were selected by thermodynamic calculation. This report presents the results of the preliminary analysis of groundwater evolution using data derived from the D-1, HDB-1 and HDB-2 boring research. In order to establish, as a next step, a model which interprets the groundwater evolution, data representative of the in-situ groundwater chemistry in the Horonobe area are needed. Reliable measurements of the physico-chemical parameters and of the mineral properties of the sedimentary rock in the dominant layer and at a variety of depths are also needed. (author)

  9. Ares-I-X Vehicle Preliminary Range Safety Malfunction Turn Analysis

    Science.gov (United States)

    Beaty, James R.; Starr, Brett R.; Gowan, John W., Jr.

    2008-01-01

    Ares-I-X is the designation given to the flight test version of the Ares-I rocket (also known as the Crew Launch Vehicle - CLV) being developed by NASA. As part of the preliminary flight plan approval process for the test vehicle, a range safety malfunction turn analysis was performed to support the launch area risk assessment and vehicle destruct criteria development processes. Several vehicle failure scenarios were identified which could cause the vehicle trajectory to deviate from its normal flight path, and the effects of these failures were evaluated with an Ares-I-X 6 degrees-of-freedom (6-DOF) digital simulation, using the Program to Optimize Simulated Trajectories Version 2 (POST2) simulation framework. The Ares-I-X simulation analysis provides output files containing vehicle state information, which are used by other risk assessment and vehicle debris trajectory simulation tools to determine the risk to personnel and facilities in the vicinity of the launch area at Kennedy Space Center (KSC), and to develop the vehicle destruct criteria used by the flight test range safety officer. The simulation analysis approach used for this study is described, including descriptions of the failure modes which were considered and the underlying assumptions and ground rules of the study, and preliminary results are presented, determined by analysis of the trajectory deviation of the failure cases, compared with the expected vehicle trajectory.

  10. Data analysis through interactive computer animation method (DATICAM)

    International Nuclear Information System (INIS)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG and G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process

  11. Computational Analysis of Spray Jet Flames

    Science.gov (United States)

    Jain, Utsav

    There is a boost in the utilization of renewable sources of energy but because of high energy density applications, combustion will never be obsolete. Spray combustion is a type of multiphase combustion which has tremendous engineering applications in different fields, varying from energy conversion devices to rocket propulsion system. Developing accurate computational models for turbulent spray combustion is vital for improving the design of combustors and making them energy efficient. Flamelet models have been extensively used for gas phase combustion because of their relatively low computational cost to model the turbulence-chemistry interaction using a low dimensional manifold approach. This framework is designed for gas phase non-premixed combustion and its implementation is not very straight forward for multiphase and multi-regime combustion such as spray combustion. This is because of the use of a conserved scalar and various flamelet related assumptions. Mixture fraction has been popularly employed as a conserved scalar and hence used to parameterize the characteristics of gaseous flamelets. However, for spray combustion, the mixture fraction is not monotonic and does not give a unique mapping in order to parameterize the structure of spray flames. In order to develop a flamelet type model for spray flames, a new variable called the mixing variable is introduced which acts as an ideal conserved scalar and takes into account the convection and evaporation of fuel droplets. In addition to the conserved scalar, it has been observed that though gaseous flamelets can be characterized by the conserved scalar and its dissipation, this might not be true for spray flamelets. Droplet dynamics has a significant influence on the spray flamelet and because of effects such as flame penetration of droplets and oscillation of droplets across the stagnation plane, it becomes important to accommodate their influence in the flamelet formulation. In order to recognize the

  12. Computational analysis of thresholds for magnetophosphenes

    International Nuclear Information System (INIS)

    Laakso, Ilkka; Hirata, Akimasa

    2012-01-01

    In international guidelines, basic restriction limits on the exposure of humans to low-frequency magnetic and electric fields are set with the objective of preventing the generation of phosphenes, visual sensations of flashing light not caused by light. Measured data on magnetophosphenes, i.e. phosphenes caused by a magnetically induced electric field on the retina, are available from volunteer studies. However, there is no simple way for determining the retinal threshold electric field or current density from the measured threshold magnetic flux density. In this study, the experimental field configuration of a previous study, in which phosphenes were generated in volunteers by exposing their heads to a magnetic field between the poles of an electromagnet, is computationally reproduced. The finite-element method is used for determining the induced electric field and current in five different MRI-based anatomical models of the head. The direction of the induced current density on the retina is dominantly radial to the eyeball, and the maximum induced current density is observed at the superior and inferior sides of the retina, which agrees with literature data on the location of magnetophosphenes at the periphery of the visual field. On the basis of computed data, the macroscopic retinal threshold current density for phosphenes at 20 Hz can be estimated as 10 mA m⁻² (−20% to +30%, depending on the anatomical model); this current density corresponds to an induced eddy current of 14 μA (−20% to +10%), and about 20% of this eddy current flows through each eye. The ICNIRP basic restriction limit for the induced electric field in the case of occupational exposure is not exceeded until the magnetic flux density is about two to three times the measured threshold for magnetophosphenes, so the basic restriction limit does not seem to be conservative. However, the reasons for the non-conservativeness are purely technical: removal of the highest 1% of

  13. Computer-automated neutron activation analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references

  14. Dual-energy bone removal computed tomography (BRCT): preliminary report of efficacy of acute intracranial hemorrhage detection.

    Science.gov (United States)

    Naruto, Norihito; Tannai, Hidenori; Nishikawa, Kazuma; Yamagishi, Kentaro; Hashimoto, Masahiko; Kawabe, Hideto; Kamisaki, Yuichi; Sumiya, Hisashi; Kuroda, Satoshi; Noguchi, Kyo

    2018-02-01

    One of the major applications of dual-energy computed tomography (DECT) is automated bone removal (BR). We hypothesized that the visualization of acute intracranial hemorrhage could be improved on BRCT by removing bone as it has the highest density tissue in the head. This preliminary study evaluated the efficacy of a DE BR algorithm for the head CT of trauma patients. Sixteen patients with acute intracranial hemorrhage within 1 day after head trauma were enrolled in this study. All CT examinations were performed on a dual-source dual-energy CT scanner. BRCT images were generated using the Bone Removal Application. Simulated standard CT and BRCT images were visually reviewed in terms of detectability (presence or absence) of acute hemorrhagic lesions. DECT depicted 28 epidural/subdural hemorrhages, 17 contusional hemorrhages, and 7 subarachnoid hemorrhages. In detecting epidural/subdural hemorrhage, BRCT [28/28 (100%)] was significantly superior to simulated standard CT [17/28 (61%)] (p = .001). In detecting contusional hemorrhage, BRCT [17/17 (100%)] was also significantly superior to simulated standard CT [11/17 (65%)] (p = .0092). BRCT was superior to simulated standard CT in detecting acute intracranial hemorrhage. BRCT could improve the detection of small intracranial hemorrhages, particularly those adjacent to bone, by removing bone that can interfere with the visualization of small acute hemorrhage. In an emergency such as head trauma, BRCT can be used as support imaging in combination with simulated standard CT and bone scale CT, although BRCT cannot replace a simulated standard CT.
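
    The vendor bone-removal algorithm itself is not described in the abstract. Purely as an illustration of the underlying principle (calcified bone attenuates low-kV photons relatively more strongly than soft tissue or fresh blood), a toy two-energy discrimination step might look like the following; the ratio threshold, the HU shift and the fill value are arbitrary, hypothetical choices, not the clinical algorithm.

    ```python
    import numpy as np

    def remove_bone(low_kv_hu, high_kv_hu, ratio_threshold=1.2, fill_hu=0.0):
        """Toy dual-energy bone removal: voxels whose low/high-kV attenuation
        ratio is high (calcium-like) are replaced by a soft-tissue value."""
        # Shift HU by +1000 (approximate attenuation scale) to avoid dividing near zero.
        ratio = (low_kv_hu + 1000.0) / (high_kv_hu + 1000.0)
        bone_mask = ratio > ratio_threshold
        cleaned = high_kv_hu.astype(float).copy()
        cleaned[bone_mask] = fill_hu
        return cleaned, bone_mask

    low = np.array([[1400.0, 60.0], [55.0, 70.0]])    # synthetic low-kV HU values
    high = np.array([[900.0, 58.0], [50.0, 65.0]])    # synthetic high-kV HU values
    print(remove_bone(low, high)[1])                  # only the bone-like voxel is flagged
    ```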

  15. Computed tomographic analysis of urinary calculi

    International Nuclear Information System (INIS)

    Naito, Akira; Ito, Katsuhide; Ito, Shouko

    1986-01-01

    Computed tomography (CT) was employed in an effort to analyze the chemical composition of urinary calculi. Twenty-three surgically removed calculi were scanned in a water bath (in vitro study). Fourteen of them were scanned in the body (in vivo study). The calculi consisted of four types: mixed calcium oxalate and phosphate, mixed calcium carbonate and phosphate, magnesium ammonium phosphate, and uric acid. The in vitro study showed that the mean and maximum CT values of uric acid stones were significantly lower than those of the other three types of stones. This indicated that stones with less than 450 HU are composed of uric acid. In the in vivo study, CT did not help to differentiate the three types of urinary calculi, except for uric acid stones. Regarding the mean CT values, there was no correlation between the in vitro and in vivo studies. An experiment with commercially available drugs showed that CT values of urinary calculi were not dependent upon the composition, but upon the density of the calculi. (Namekawa, K.)
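
    The 450 HU figure reported above amounts to a simple decision rule, sketched below for illustration only; it applies to mean in vitro CT values and cannot separate the three non-uric-acid stone types.

    ```python
    def classify_stone(mean_ct_hu):
        """Rule implied by the in vitro results: stones below ~450 HU are
        likely uric acid; above that, CT does not resolve the composition."""
        return "uric acid" if mean_ct_hu < 450 else "non-uric-acid (composition indeterminate)"

    print(classify_stone(380))   # -> uric acid
    ```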

  16. Analysis of computational vulnerabilities in digital repositories

    Directory of Open Access Journals (Sweden)

    Valdete Fernandes Belarmino

    2015-04-01

    Objective. Demonstrates the results of research that aimed to analyze the computational vulnerabilities of digital repositories in public universities. Argues the relevance of information in contemporary societies as an invaluable resource, emphasizing scientific information as an essential element of scientific progress. Characterizes the emergence of digital repositories and highlights their use in the academic environment to preserve, promote, disseminate and encourage scientific production. Describes the main software for the construction of digital repositories. Method. The investigation identified and analyzed the vulnerabilities to which digital repositories are exposed by running penetration tests, discriminating the levels of risk and the types of vulnerabilities. Results. From a sample of 30 repositories, 20 could be examined; of these, 5% have critical vulnerabilities, 85% high, 25% medium and 100% low. Conclusions. This demonstrates the need to adopt measures in these environments that promote information security, minimizing the incidence of external and/or internal attacks.

  17. A preliminary analysis of the risk of transporting nuclear waste to potential candidate commercial repository sites

    International Nuclear Information System (INIS)

    Madsen, M.M.

    1984-01-01

    In accordance with the provisions of the Nuclear Waste Policy Act of 1982, environmental assessments for potential candidate sites are required to provide a basis for selection of the first site for disposal of commercial radioactive waste in deep geologic repositories. A preliminary analysis of the impacts of transportation for each of the five potential sites will be described. Transportation was assumed to be entirely by truck or entirely by rail in order to obtain bounding impacts. This paper presents both radiological and nonradiological risks for the once-through fuel cycle

  18. Foregrounds in the BOOMERANG-LDB data: a preliminary rms analysis

    OpenAIRE

    Masi, S.; Ade, P. A. R.; Bock, J.; Boscaleri, A.; Crill, B. P.; de Bernardis, P.; Ganga, K.; Giacometti, M.; Hivon, E.; Hristov, V. V.; Lange, A. E.; Martinis, L.; Mauskopf, P. D.; Montroy, T.; Netterfield, C. B.

    2000-01-01

    We present a preliminary analysis of the BOOMERanG LDB maps, focused on foregrounds. BOOMERanG detects dust emission at moderately low galactic latitudes ($b > -20^o$) in bands centered at 90, 150, 240, 410 GHz. At higher Galactic latitudes, we use the BOOMERanG data to set conservative upper limits on the level of contamination at 90 and 150 GHz. We find that the mean square signal correlated with the IRAS/DIRBE dust template is less than 3% of the mean square signal due to CMB anisotropy.

  19. Documentation of Hanford Site independent review of the Hanford Waste Vitrification Plant Preliminary Safety Analysis Report

    International Nuclear Information System (INIS)

    Herborn, D.I.

    1991-10-01

    The requirements for Westinghouse Hanford independent review of the Preliminary Safety Analysis Report (PSAR) are contained in Section 1.0, Subsection 4.3 of WCH-CM-4-46. Specifically, this manual requires the following: (1) Formal functional reviews of the HWVP PSAR by the future operating organization (HWVP Operations), and the independent review organizations (HWVP and Environmental Safety Assurance, Environmental Assurance, and Quality Assurance); and (2) Review and approval of the HWVP PSAR by the Tank Waste Disposal (TWD) Subcouncil of the Safety and Environmental Advisory Council (SEAC), which provides independent advice to the Westinghouse Hanford President and executives on matters of safety and environmental protection. 7 refs

  20. Preliminary study of energy confinement data with a statistical analysis system in HL-2A tokamak

    International Nuclear Information System (INIS)

    Xu Yuan; Cui Zhengying; Ji Xiaoquan; Dong Chunfeng; Yang Qingwei; O J W F Kardaun

    2010-01-01

    Taking advantage of the HL-2A experimental data, an energy confinement database conforming to the ITERL DB2.0 format has been established. For this database, the widely used statistical analysis system (SAS) has been adopted for the first time to analyze and evaluate the confinement data from HL-2A, and scaling laws of the energy confinement time with respect to plasma density have been investigated, with some preliminary results achieved. Finally, through comparison with both the ITER scaling law and the earlier ASDEX database, the L-mode confinement quality on HL-2A and the influence of temperature on the Spitzer resistivity are discussed. (authors)
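
    Scaling studies of this kind typically fit a power law such as tau_E = C * n_e^alpha by linear regression in log space. The snippet below is a minimal illustration of such a fit (the original work used SAS; the numbers here are synthetic, not HL-2A data).

    ```python
    import numpy as np

    n_e = np.array([0.8, 1.2, 1.8, 2.5, 3.3])       # line-averaged density (assumed units, 10^19 m^-3)
    tau = np.array([18.0, 23.0, 29.0, 35.0, 41.0])  # energy confinement time in ms (synthetic)

    # Ordinary least squares in log space: log(tau) = alpha * log(n_e) + log(C)
    alpha, logC = np.polyfit(np.log(n_e), np.log(tau), 1)
    print(f"tau_E ~ {np.exp(logC):.1f} * n_e^{alpha:.2f}")
    ```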

  1. Geoscientific long-term prognosis. Preliminary safety analysis for the site Gorleben

    International Nuclear Information System (INIS)

    Mrugalla, Sabine

    2011-07-01

    The preliminary safety analysis of the site Gorleben includes the following chapters: (1) Introduction; (2) Aim and content of the geoscientific long-term prognosis for the site Gorleben; (3) Boundary conditions at the site Gorleben: climate; geomorphology; overlying and adjoining rocks; hydrogeology; the Gorleben salt deposit. (4) Probable future geological developments at the site Gorleben: supraregional developments with effects on the site Gorleben; glacial-period developments; developments of the geomorphology and of the overlying and adjoining rocks; future developments of the hydrological systems at the site Gorleben; future salt-specific developments of the Gorleben salt deposit. (5) Commentary on the unlikely or excludable developments of the site Gorleben.

  2. Classification and Analysis of Computer Network Traffic

    OpenAIRE

    Bujlow, Tomasz

    2014-01-01

    Traffic monitoring and analysis can be done for multiple different reasons: to investigate the usage of network resources, assess the performance of network applications, adjust Quality of Service (QoS) policies in the network, log the traffic to comply with the law, or create realistic models of traffic for academic purposes. We define the objective of this thesis as finding a way to evaluate the performance of various applications in a high-speed Internet infrastructure. To satisfy the obje...

  3. Basic principles of computers

    International Nuclear Information System (INIS)

    Royal, H.D.; Parker, J.A.; Holmen, B.L.

    1988-01-01

    This chapter presents preliminary concepts of computer operations. It describes the hardware used in a nuclear medicine computer system. It discusses the software necessary for acquisition and analysis of nuclear medicine studies. The chapter outlines the integrated package of hardware and software that is necessary to perform specific functions in nuclear medicine

  4. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data are a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics models developed in this study will improve our understanding of these phenomena. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe is compared with experimental results.

  5. An approach to quantum-computational hydrologic inverse analysis.

    Science.gov (United States)

    O'Malley, Daniel

    2018-05-02

    Making predictions about flow and transport in an aquifer requires knowledge of the heterogeneous properties of the aquifer such as permeability. Computational methods for inverse analysis are commonly used to infer these properties from quantities that are more readily observable such as hydraulic head. We present a method for computational inverse analysis that utilizes a type of quantum computer called a quantum annealer. While quantum computing is in an early stage compared to classical computing, we demonstrate that it is sufficiently developed that it can be used to solve certain subsurface flow problems. We utilize a D-Wave 2X quantum annealer to solve 1D and 2D hydrologic inverse problems that, while small by modern standards, are similar in size and sometimes larger than hydrologic inverse problems that were solved with early classical computers. Our results and the rapid progress being made with quantum computing hardware indicate that the era of quantum-computational hydrology may not be too far in the future.
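
    A quantum annealer minimizes an objective over binary variables (a QUBO). As a purely classical stand-in for that step, the toy 1D problem below recovers a binary permeability pattern from synthetic head data by exhaustive search over the same kind of binary objective; the forward model, problem size and noise level are all hypothetical, not the authors' formulation.

    ```python
    import itertools
    import numpy as np

    rng = np.random.default_rng(2)

    def forward(perm):
        """Hypothetical forward model: cumulative head drop along a 1D column
        whose cells have binary (low/high) permeability."""
        return np.cumsum(1.0 / (0.5 + perm))

    true_perm = np.array([1, 0, 1, 1, 0, 1])              # unknown binary pattern
    obs = forward(true_perm) + rng.normal(0.0, 0.01, 6)   # noisy "observed" heads

    # Exhaustive search over the 2^6 binary candidates; an annealer would instead
    # minimize an equivalent QUBO formulation of this misfit.
    best = min(itertools.product([0, 1], repeat=6),
               key=lambda k: float(np.sum((forward(np.array(k)) - obs) ** 2)))
    print("recovered pattern:", best)
    ```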

  6. Preliminary Report: Analysis of the baseline study on the prevalence of Salmonella in laying hen flocks of Gallus gallus

    DEFF Research Database (Denmark)

    Hald, Tine

    This is a preliminary report on the analysis of the Community-wide baseline study to estimate the prevalence of Salmonella in laying hen flocks. It is being published pending the full analysis of the entire dataset from the baseline study. The report contains the elements necessary...

  7. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  8. Cafts: computer aided fault tree analysis

    International Nuclear Information System (INIS)

    Poucet, A.

    1985-01-01

    The fault tree technique has become a standard tool for the analysis of safety and reliability of complex systems. In spite of the costs, which may be high for a complete and detailed analysis of a complex plant, the fault tree technique is popular and its benefits are fully recognized. Nevertheless, applications of existing fault tree codes have mostly been restricted to simple academic examples and rarely concern complex, real-world systems. In this paper an interactive approach to fault tree construction is presented. The aim is not to replace the analyst, but to offer him an intelligent tool which can assist him in modeling complex systems. Using the CAFTS method, the analyst interactively constructs a fault tree in two phases: (1) In a first phase he generates an overall failure logic structure of the system, the macrofault tree. In this phase, CAFTS features an expert system approach to assist the analyst. It makes use of a knowledge base containing generic rules on the behavior of subsystems and components; (2) In a second phase the macrofault tree is further refined and transformed into a fully detailed and quantified fault tree. In this phase a library of plant-specific component failure models is used

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  10. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  11. Preliminary phytochemical screening, Antibacterial potential and GC-MS analysis of two medicinal plant extracts.

    Science.gov (United States)

    Vijayaram, Seerangaraj; Kannan, Suruli; Saravanan, Konda Mani; Vasantharaj, Seerangaraj; Sathiyavimal, Selvam; P, Palanisamy Senthilkumar

    2016-05-01

    The present study aimed to catalogue the primary metabolites, confirm them using GC-MS analysis, and assess the antibacterial potential of the leaf extracts of two important medicinal plants, viz. Eucalyptus and Azadirachta indica. The antibacterial potential of the methanol leaf extracts of the studied species was tested against Escherichia coli, Pseudomonas aeruginosa, Klebsiella pneumoniae, Streptococcus pyogenes and Staphylococcus aureus using the agar well diffusion method. The highest zone of inhibition (16 mm) was observed against the bacterium Pseudomonas aeruginosa at a 100 μl concentration of the methanol leaf extract. Preliminary phytochemical analysis of the studied species showed the presence of phytochemical compounds such as steroids, phenolic compounds and flavonoids. GC-MS analysis confirmed the occurrence of 20 different compounds in the methanol leaf extracts of both studied species.

  12. 1972 preliminary safety analysis report based on a conceptual design of a proposed repository in Kansas

    International Nuclear Information System (INIS)

    Blomeke, J.O.

    1977-08-01

    This preliminary safety analysis report is based on a proposed Federal Repository at Lyons, Kansas, for receiving, handling, and depositing radioactive solid wastes in bedded salt during the remainder of this century. The safety analysis applies to a hypothetical site in central Kansas identical to the Lyons site, except that it is free of nearby salt solution-mining operations and bore holes that cannot be plugged to Repository specifications. This PSAR contains much information that also appears in the conceptual design report. Much of the geological-hydrological information was gathered in the Lyons area. This report is organized in 16 sections: considerations leading to the proposed Repository, design requirements and criteria, a description of the Lyons site and its environs, land improvements, support facilities, utilities, different impacts of Repository operations, safety analysis, design confirmation program, operational management, requirements for eventually decommissioning the facility, design criteria for protection from severe natural events, and the proposed program of experimental investigations

  13. Modeling and preliminary thermal analysis of the capsule for a creep test in HANARO

    International Nuclear Information System (INIS)

    Choi, Myoung Hwan; Cho, Man Soon; Choo, Kee Nam; Kang, Young Hwan; Sohn, Jae Min; Shin, Yoon Taeg; Park, Sung Jae; Kim, Bong Goo; Kim, Young Jin

    2005-01-01

    A creep capsule is a device to investigate the creep characteristics of nuclear materials during in-pile irradiation tests. To obtain design data for the capsule through a preliminary thermal analysis, a 2-dimensional model of the cross section of the capsule, including the specimens and components, is generated, and an analysis using the ANSYS program is performed. The gamma-heating rates of the materials for the HANARO power of 30 MW are considered, and the effect of the gap size and the control rod position on the temperature of the specimen is discussed. From the analysis it is found that the gap between the thermal media and the external tube has a significant effect on the temperature of the specimen. The specimen temperature decreases as the control rod position is increased

  14. 1972 preliminary safety analysis report based on a conceptual design of a proposed repository in Kansas

    Energy Technology Data Exchange (ETDEWEB)

    Blomeke, J.O.

    1977-08-01

    This preliminary safety analysis report is based on a proposed Federal Repository at Lyons, Kansas, for receiving, handling, and depositing radioactive solid wastes in bedded salt during the remainder of this century. The safety analysis applies to a hypothetical site in central Kansas identical to the Lyons site, except that it is free of nearby salt solution-mining operations and bore holes that cannot be plugged to Repository specifications. This PSAR contains much information that also appears in the conceptual design report. Much of the geological-hydrological information was gathered in the Lyons area. This report is organized in 16 sections: considerations leading to the proposed Repository, design requirements and criteria, a description of the Lyons site and its environs, land improvements, support facilities, utilities, different impacts of Repository operations, safety analysis, design confirmation program, operational management, requirements for eventually decommissioning the facility, design criteria for protection from severe natural events, and the proposed program of experimental investigations. (DLC)

  15. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Computer-based image analysis methods for chest computed tomography (CT) used in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, the density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary function, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologists for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.

  16. Conference “Computational Analysis and Optimization” (CAO 2011)

    CERN Document Server

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday

    2013-01-01

    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  17. Computer code for qualitative analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Yule, H.P.

    1979-01-01

    Computer code QLN1 provides complete analysis of gamma-ray spectra observed with Ge(Li) detectors and is used at both the National Bureau of Standards and the Environmental Protection Agency. It locates peaks, resolves multiplets, identifies component radioisotopes, and computes quantitative results. The qualitative-analysis (or component identification) algorithms feature thorough, self-correcting steps which provide accurate isotope identification in spite of errors in peak centroids, energy calibration, and other typical problems. The qualitative-analysis algorithm is described in this paper
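
    The first step such a code performs, locating photopeaks in the spectrum, can be illustrated with a short sketch; this is not the QLN1 algorithm itself, and the peak positions, widths and prominence threshold below are arbitrary illustrative values.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    # Synthetic Ge(Li)-style spectrum: two Gaussian photopeaks on a noisy background.
    channels = np.arange(4096)
    spectrum = (1000.0 * np.exp(-0.5 * ((channels - 661) / 3.0) ** 2)
                + 400.0 * np.exp(-0.5 * ((channels - 1332) / 4.0) ** 2)
                + np.random.default_rng(0).poisson(20, channels.size))

    # Peak search: require a minimum prominence above the local background.
    peaks, _ = find_peaks(spectrum, prominence=100)
    print("peak channels:", peaks)   # ~661 and ~1332
    ```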

  18. A single-chip computer analysis system for liquid fluorescence

    International Nuclear Information System (INIS)

    Zhang Yongming; Wu Ruisheng; Li Bin

    1998-01-01

    The single-chip computer analysis system for liquid fluorescence is an intelligent analytical instrument based on the principle that liquids containing hydrocarbons emit several characteristic fluorescences when irradiated by strong light. Besides a single-chip computer, the system makes use of the keyboard and the calculation and printing functions of a CASIO printing calculator. It combines optics, mechanics and electronics into one unit, and is small, light and practical, so it can be used for surface water sample analysis in oil fields and for impurity analysis of other materials

  19. A Computational Discriminability Analysis on Twin Fingerprints

    Science.gov (United States)

    Liu, Yu; Srihari, Sargur N.

    Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. The similarity between twins' prints is critical to establishing the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical twins and fraternal twins) recently collected from a twin population, using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of fingerprints of twins is significantly different from that between genuine prints of the same finger. Twin fingerprints are discriminable, with a 1.5%-1.7% higher EER than non-twins, and identical twins can be distinguished by fingerprint examination with a slightly higher error rate than fraternal twins.
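
    For readers unfamiliar with the metric, the equal error rate (EER) is the operating point where the false-accept and false-reject rates coincide. A minimal sketch of estimating it from match scores follows (the scores below are synthetic, not the twin data set).

    ```python
    import numpy as np

    def equal_error_rate(genuine, impostor):
        """Estimate the EER from match scores (higher score = more similar)."""
        thresholds = np.sort(np.concatenate([genuine, impostor]))
        frr = np.array([(genuine < t).mean() for t in thresholds])    # false rejects
        far = np.array([(impostor >= t).mean() for t in thresholds])  # false accepts
        i = np.argmin(np.abs(frr - far))
        return 0.5 * (frr[i] + far[i])

    rng = np.random.default_rng(1)
    print(equal_error_rate(rng.normal(0.8, 0.1, 500), rng.normal(0.5, 0.1, 500)))
    ```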

  20. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  1. Computer-Aided Communication Satellite System Analysis and Optimization.

    Science.gov (United States)

    Stagl, Thomas W.; And Others

    Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…

  2. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
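
    The article's examples use MATLAB and R; as an analogous sketch in Python (the model and replication count are placeholders), a set of embarrassingly parallel simulation replications can be spread over cores like this.

    ```python
    import multiprocessing as mp
    import random

    def one_replication(seed):
        """One independent simulation run (placeholder model)."""
        rng = random.Random(seed)
        return sum(rng.random() for _ in range(10_000)) / 10_000

    if __name__ == "__main__":
        seeds = range(1_000)                    # independent replications
        with mp.Pool() as pool:                 # uses all available cores by default
            results = pool.map(one_replication, seeds)
        print(min(results), max(results))
    ```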

  3. Computer-Aided Qualitative Data Analysis with Word

    Directory of Open Access Journals (Sweden)

    Bruno Nideröst

    2002-05-01

    Despite some fragmentary references in the literature about qualitative methods, it is fairly unknown that Word can be successfully used for computer-aided qualitative data analysis (QDA). Based on several Word standard operations, elementary QDA functions such as sorting data, code-and-retrieve and frequency counts can be realized. Word is particularly interesting for those users who wish to gain first experience with computer-aided analysis before investing time and money in a specialized QDA program. The well-known standard software could also be an option for those qualitative researchers who usually work with word processing but have certain reservations towards computer-aided analysis. The following article deals with the most important requirements and options of Word for computer-aided QDA. URN: urn:nbn:de:0114-fqs0202225

  4. Computer programs for analysis of geophysical data

    Energy Technology Data Exchange (ETDEWEB)

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution.

  5. Computer programs for analysis of geophysical data

    International Nuclear Information System (INIS)

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution

  6. The purification, crystallization and preliminary X-ray diffraction analysis of dihydrodipicolinate synthase from Clostridium botulinum

    International Nuclear Information System (INIS)

    Dobson, Renwick C. J.; Atkinson, Sarah C.; Gorman, Michael A.; Newman, Janet M.; Parker, Michael W.; Perugini, Matthew A.

    2008-01-01

    Dihydrodipicolinate synthase (DHDPS), an enzyme in the lysine-biosynthetic pathway, is a promising target for antibiotic development against pathogenic bacteria. Here, the expression, purification, crystallization and preliminary diffraction analysis of DHDPS from C. botulinum are reported. In recent years, dihydrodipicolinate synthase (DHDPS; EC 4.2.1.52) has received considerable attention from both mechanistic and structural viewpoints. This enzyme, which is part of the diaminopimelate pathway leading to lysine, couples (S)-aspartate-β-semialdehyde with pyruvate via a Schiff base to a conserved active-site lysine. In this paper, the expression, purification, crystallization and preliminary X-ray diffraction analysis of DHDPS from Clostridium botulinum, an important bacterial pathogen, are presented. The enzyme was crystallized in a number of forms, predominantly using PEG precipitants, with the best crystal diffracting to beyond 1.9 Å resolution and displaying P4₂2₁2 symmetry. The unit-cell parameters were a = b = 92.9, c = 60.4 Å. The crystal volume per protein weight (V_M) was 2.07 Å³ Da⁻¹, with an estimated solvent content of 41%. The structure of the enzyme will help guide the design of novel therapeutics against the C. botulinum pathogen

  7. A Preliminary Analysis of Reactor Performance Test (LOEP) for a Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonil; Park, Su-Ki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    The final phase of commissioning is the reactor performance test, which is to prove the integrated performance and safety of the research reactor at full power with fuel loaded, through tests such as neutron power calibration, Control Absorber Rod/Second Shutdown Rod drop time, InC function tests, criticality, rod worth, core heat removal by natural mechanisms, and so forth. The last test is a safety-related one to assure that the result of the safety analysis of the research reactor has sufficient margin, by showing that the reactor satisfies the acceptance criteria of the safety functions such as reactivity control, maintenance of auxiliaries, reactor pool water inventory control, core heat removal, and confinement isolation. Finally, the fuel integrity is ensured by verifying that there is no meaningful change in the radiation levels. To confirm the performance of safety equipment, loss of normal electric power (LOEP), possibly categorized as an Anticipated Operational Occurrence (AOO), is selected as a key experiment to determine how safe the research reactor is before turning the reactor over to the owner. This paper presents a preliminary analysis of the reactor performance test (LOEP) for a research reactor. The results show how the transient differs between conservative-estimate and best-estimate assumptions. The preliminary analyses have shown all the probable thermal-hydraulic transient behavior of importance, including the opening of the flap valve, the minimum critical heat flux ratio, the change of flow direction, and important values of thermal-hydraulic parameters.

  8. Crystallization and preliminary X-ray diffraction analysis of diaminopimelate epimerase from Escherichia coli

    International Nuclear Information System (INIS)

    Hor, Lilian; Dobson, Renwick C. J.; Dogovski, Con; Hutton, Craig A.; Perugini, Matthew A.

    2009-01-01

    Diaminopimelate (DAP) epimerase, an enzyme in the lysine-biosynthetic pathway, is a promising target for antibiotic development against pathogenic bacteria. Here, the cloning, expression, purification, crystallization and preliminary diffraction analysis of DAP epimerase from E. coli are reported. Diaminopimelate (DAP) epimerase (EC 5.1.1.7) catalyzes the penultimate step of lysine biosynthesis in bacteria and plants, converting L,L-diaminopimelate to meso-diaminopimelate. Here, the cloning, expression, purification, crystallization and preliminary X-ray diffraction analysis of DAP epimerase from Escherichia coli are presented. Crystals were obtained in space group P4₁2₁2 and diffracted to 2.0 Å resolution, with unit-cell parameters a = b = 89.4, c = 179.6 Å. Molecular replacement was conducted using Bacillus anthracis DAP epimerase as a search model and showed the presence of two molecules in the asymmetric unit, with an initial R_free of 0.456 and R_work of 0.416.

  9. Introducing remarks upon the analysis of computer systems performance

    International Nuclear Information System (INIS)

    Baum, D.

    1980-05-01

    Some of the basic ideas of analytical techniques to study the behaviour of computer systems are presented. Single systems as well as networks of computers are viewed as stochastic dynamical systems which may be modelled by queueing networks. This report therefore primarily serves as an introduction to probabilistic methods for the qualitative analysis of systems. It is supplemented by an application example of Chandy's collapsing method. (orig.) [de
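    As a concrete illustration of the queueing-network viewpoint, the snippet below evaluates the classic closed-form results for a single M/M/1 service centre (utilization, mean population and mean response time), the elementary building block of such network models; the arrival and service rates are hypothetical.

      def mm1_metrics(arrival_rate, service_rate):
          rho = arrival_rate / service_rate        # server utilization
          if rho >= 1.0:
              raise ValueError("unstable queue: utilization must be below 1")
          mean_jobs = rho / (1.0 - rho)            # mean number of jobs in the system
          mean_response = 1.0 / (service_rate - arrival_rate)   # mean response time
          return rho, mean_jobs, mean_response

      print(mm1_metrics(arrival_rate=8.0, service_rate=10.0))    # (0.8, 4.0, 0.5)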

  10. Computer-aided visualization and analysis system for sequence evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.

    2004-05-11

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  11. Strategic Analysis of Autodesk and the Move to Cloud Computing

    OpenAIRE

    Kewley, Kathleen

    2012-01-01

    This paper provides an analysis of the opportunity for Autodesk to move its core technology to a cloud delivery model. Cloud computing offers clients a number of advantages, such as lower costs for computer hardware, increased access to technology and greater flexibility. With the IT industry embracing this transition, software companies need to plan for future change and lead with innovative solutions. Autodesk is in a unique position to capitalize on this market shift, as it is the leader i...

  12. Mini-DIAL system measurements coupled with multivariate data analysis to identify TIC and TIM simulants: preliminary absorption database analysis

    International Nuclear Information System (INIS)

    Gaudio, P; Malizia, A; Gelfusa, M; Poggi, L.A.; Martinelli, E.; Di Natale, C.; Bellecci, C.

    2017-01-01

    Nowadays, Toxic Industrial Components (TICs) and Toxic Industrial Materials (TIMs) are among the most dangerous and widespread vehicles of contamination in urban and industrial areas. The academic world, together with the industrial and military ones, is working on innovative solutions to monitor the diffusion of such pollutants in the atmosphere. At present the most common commercial sensors are based on “point detection” technology, but it is clear that such instruments cannot satisfy the needs of smart cities. The new challenge is developing stand-off systems to continuously monitor the atmosphere. The Quantum Electronics and Plasma Physics (QEP) research group has long experience in laser system development and has built two demonstrators based on DIAL (Differential Absorption of Light) technology that could be able to identify chemical agents in the atmosphere. In this work the authors present one of those DIAL systems, the miniaturized one, together with the preliminary results of an experimental campaign conducted on TIC and TIM simulants in a cell, with the aim of using the absorption database for further atmospheric analysis with the same DIAL system. The experimental results are analysed with a standard multivariate data analysis technique, Principal Component Analysis (PCA), to develop a classification model aimed at identifying organic chemical compounds in the atmosphere. The preliminary absorption coefficients of some chemical compounds are shown together with the preliminary PCA analysis. (paper)
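    The multivariate step mentioned above can be illustrated in a few lines of Python: a matrix of absorption coefficients (rows = measurements of a simulant, columns = wavelengths) is projected onto its first principal components, whose scores can then be used to separate compound classes. The data below are synthetic placeholders, not DIAL measurements.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(1)
      spectra = rng.random((12, 40))          # 12 simulant measurements x 40 wavelengths
      pca = PCA(n_components=2)
      scores = pca.fit_transform(spectra)     # sample coordinates in the principal-component plane
      print(pca.explained_variance_ratio_)    # variance captured by each component
      print(scores[:3])                       # first three samples in PC space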

  13. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    Full Text Available In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.
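    As an illustration of the first benchmark theme, the probability of a sliding failure can be estimated by Monte Carlo sampling of the resisting parameters in a simple limit-equilibrium expression. The geometry, loads and distributions below are hypothetical placeholders, not values from the benchmark itself.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 200_000
      area = 60.0                                           # sliding-plane area per unit width, m^2
      weight, uplift, thrust = 30_000.0, 8_000.0, 18_000.0  # kN (assumed deterministic)
      cohesion = rng.normal(100.0, 30.0, n).clip(min=0.0)   # kPa
      tan_phi = np.tan(np.radians(rng.normal(35.0, 3.0, n)))

      fs = (cohesion * area + (weight - uplift) * tan_phi) / thrust   # factor of safety samples
      print("P(sliding failure) =", np.mean(fs < 1.0))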

  14. Three-dimensional reconstruction of colorectal tumors from serial tissue sections by computer graphics: a preliminary study.

    Science.gov (United States)

    Kikuchi, S; Matsuzaki, H; Kondo, K; Ohtani, Y; Ihara, A; Hiki, Y; Kakita, A; Kuwao, S

    2000-01-01

    We present herein the three-dimensional reconstruction of colorectal tumors, with particular reference to the growth pattern into each layer of the colorectal wall, and the measurement of tumor volume and surface area. Conventional tissue section images of colorectal tumors were analyzed using a computer graphics analysis program. The two-dimensional extent of invasion by each tumor into each layer of the intestinal wall was determined from the images of each section. Based on data from multiple sections, the tumor and surrounding normal tissue layers were reconstructed three-dimensionally, and the volume and surface area of the tumors were determined. Using this technique, the three-dimensional morphology of the tumor and tumor progression into the colorectal wall could be determined. The volume and surface area of the colon tumor were 4871 mm³ and 1741 mm², respectively. The volume and surface area of the rectal tumor were 1090 mm³ and 877 mm², respectively. This technique may provide a new approach for the pathological analysis of colorectal carcinoma.
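    The volume (and a crude lateral surface) estimate follows naturally from the serial-section geometry: each section contributes its measured cross-sectional area (or perimeter) multiplied by the section spacing. The sketch below shows this Cavalieri-type summation with hypothetical per-section values; the paper itself used a dedicated graphics analysis program.

      section_thickness_mm = 0.5
      areas_mm2 = [4.0, 9.5, 14.2, 16.8, 15.1, 10.3, 5.2]       # tumour area per section
      perimeters_mm = [7.5, 11.6, 14.0, 15.3, 14.4, 12.0, 8.6]  # tumour outline per section

      volume_mm3 = sum(a * section_thickness_mm for a in areas_mm2)               # Cavalieri sum
      lateral_surface_mm2 = sum(p * section_thickness_mm for p in perimeters_mm)  # crude estimate
      print(f"volume = {volume_mm3:.1f} mm^3, lateral surface = {lateral_surface_mm2:.1f} mm^2")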

  15. The calorimetric spectrum of the electron-capture decay of $^{163}$Ho. A preliminary analysis of the preliminary data

    CERN Document Server

    De Rújula, A.

    2015-01-01

    It is in principle possible to measure directly the electron neutrino mass (or masses and mixing angles) in weak electron-capture decays. The optimal nuclide in this respect is $^{163}$Ho. The favoured experimental technique, currently pursued in various experiments (ECHo, HOLMES and NuMECS), is "calorimetric". The calorimetric energy spectrum is a sum over the unstable vacant orbitals, or "holes", left by the electrons weakly captured by the nucleus. We discuss the current progress in this field and analyze the preliminary data. Our conclusion is that, as pointed out by Robertson, the contribution of two-hole states is not negligible. But --in strong contradistinction with the tacit conclusion of previous comparisons of theory and observations-- we find a quite satisfactory agreement. A crucial point is that, in the creation of secondary holes, electron shakeoff and not only electron shakeup must be taken into account.

  16. A SURVEY ON DOCUMENT CLUSTERING APPROACH FOR COMPUTER FORENSIC ANALYSIS

    OpenAIRE

    Monika Raghuvanshi*, Rahul Patel

    2016-01-01

    In a forensic analysis, large numbers of files are examined. Much of the information is in unstructured format, so it is quite a difficult task for a computer forensic examiner to perform such an analysis. That is why performing the forensic analysis of documents within a limited period of time requires a special approach such as document clustering. This paper reviews different document clustering algorithms and methodologies, for example K-means, K-medoid, single link, complete link and average link, in accordance...
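    A minimal sketch of the kind of clustering pipeline the survey covers is shown below: documents are turned into TF-IDF vectors and grouped with K-means. The four toy documents stand in for the files examined in a real forensic case and are purely illustrative.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.cluster import KMeans

      documents = [
          "invoice payment transfer account",
          "wire transfer offshore account payment",
          "meeting schedule project deadline",
          "project milestone meeting agenda",
      ]
      X = TfidfVectorizer().fit_transform(documents)    # sparse TF-IDF document vectors
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
      print(labels)   # e.g. [0 0 1 1]: financial versus project-related documents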

  17. Preliminary design and thermal analysis of device for finish cooling Jaffa biscuits in a.d. 'Jaffa'- Crvenka

    Directory of Open Access Journals (Sweden)

    Salemović Duško R.

    2015-01-01

    Full Text Available In this paper, the preliminary design of a device for the final cooling of the chocolate topping of biscuits in A.D. 'Jaffa' - Crvenka was carried out. The proposed preliminary design follows the required technological process of final biscuit cooling and the required process parameters that were to be achieved, which formed part of the project task. A thermal analysis was made, and the percentage error between the air/chocolate-topping contact surface obtained from the heat balance and that obtained geometrically from the proposed preliminary design was no more than 0.67%. The preliminary design is thus completely justified, because with the required belt-conveyor length the required chocolate-topping temperature is reached at the end of the cooling process.

  18. Preliminary Nuclear Analysis for the HANARO Fuel Element with Burnable Absorber

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Chul Gyo; Kim, So Young; In, Won Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Burnable absorber is used to reduce the reactivity swing and power peaking in high-performance research reactors. Development of the HANARO fuel element with burnable absorber was started in the U-Mo fuel development program at HANARO, but a detailed full-core analysis was not performed because the current HANARO fuel management system cannot reliably analyze the HANARO core with burnable absorber. A sophisticated reactor physics system is required to analyze the core. The McCARD code was selected, and the detailed McCARD core models, in which the basic HANARO core model was developed by one of the McCARD developers, are used in this study. The development of nuclear fuel requires a long time and a correct development direction, guided especially by the nuclear analysis. This paper presents a preliminary nuclear analysis to promote the fuel development. Based on the developed fuel, further nuclear analysis will improve reactor performance and safety. Basic nuclear analyses for HANARO and the AHR were performed to determine the proper fuel elements with burnable absorber. Addition of 0.3 - 0.4% Cd to the fuel meat is promising for the current HANARO fuel element. A small addition of burnable absorber may not change any fuel characteristics of the HANARO fuel element, but various basic tests and irradiation tests at the HANARO core are required.

  19. Crystallization and preliminary X-ray analysis of isomaltase from Saccharomyces cerevisiae

    International Nuclear Information System (INIS)

    Yamamoto, Keizo; Miyake, Hideo; Kusunoki, Masami; Osaki, Shigeyoshi

    2008-01-01

    The crystallization and preliminary X-ray analysis of isomaltase is reported. Isomaltase from Saccharomyces cerevisiae is an oligo-1,6-glucosidase that preferentially hydrolyzes isomaltose, with little activity towards isomaltotriose or longer oligosaccharides. An amino-acid sequence analysis of the isomaltase revealed that it belongs to glucoside hydrolase family 13. Recombinant isomaltase was purified and crystallized by the hanging-drop vapour-diffusion method with PEG 3350 as the precipitant. The crystals belonged to space group C2, with unit-cell parameters a = 95.67, b = 115.42, c = 61.77 Å, β = 91.17°. X-ray diffraction data were collected to 1.35 Å resolution from a single crystal on a synchrotron-radiation source

  20. Preliminary Uncertainty Analysis for SMART Digital Core Protection and Monitoring System

    International Nuclear Information System (INIS)

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun

    2012-01-01

    The Korea Atomic Energy Research Institute (KAERI) developed on-line digital core protection and monitoring systems, called SCOPS and SCOMS, as part of the SMART plant protection and monitoring system. SCOPS simplified the protection system by directly connecting the four RSPT signals to each core protection channel and eliminated the control element assembly calculator (CEAC) hardware. SCOMS adopted the DPCM3D method for synthesizing the core power distribution instead of the Fourier expansion method used in conventional PWRs. The DPCM3D method produces a synthetic 3-D power distribution by coupling a neutronics code and measured in-core detector signals. An overall uncertainty analysis methodology, which statistically combines the uncertainty components of the SMART core protection and monitoring system, was developed. In this paper, preliminary overall uncertainty factors for SCOPS/SCOMS of the SMART initial core were evaluated by applying the newly developed uncertainty analysis method.
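    The abstract does not spell out the combination formula itself; a common way to combine statistically independent uncertainty components into an overall factor is the root-sum-square rule, sketched below with purely hypothetical component uncertainties (percent of the monitored quantity), not the factors evaluated in the paper.

      import math

      components = {
          "in-core detector signal": 2.5,
          "power-distribution synthesis (DPCM3D)": 3.0,
          "thermal-hydraulic correlation": 2.0,
          "calibration / measurement": 1.5,
      }
      overall = math.sqrt(sum(u ** 2 for u in components.values()))
      print(f"overall uncertainty = {overall:.2f} %")   # about 4.6 %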

  1. Preliminary Analysis of the Bundle-Duct Interaction for the fuel of SFR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Byoung Oon; Cheon, Jin Sik; Hahn, Do Hee; Lee, Chan Bock [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-10-15

    BDI (Bundle-Duct Interaction) occurs in the fuel of an SFR (Sodium-cooled Fast Reactor) due to the radial expansion and bowing of a fuel pin bundle. Under the BDI condition, excess cladding strain and hot spots would occur. Therefore BDI, which is the dominant deformation mechanism in a fuel pin bundle, should be considered to evaluate the FBR fuel integrity. Analysis codes, such as ETOILE and BMBOO, have been developed to evaluate the BDI behavior. A bundle-duct interaction model is also being developed for the SFR in Korea. This model is based on ANSYS. In this paper, the fuel pin configuration model for the BDI calculation was established. A preliminary analysis of the bundle-duct interaction was performed to evaluate the fuel design concept.

  2. Los Alamos National Laboratory corrugated metal pipe saw facility preliminary safety analysis report. Volume I

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-09-19

    This Preliminary Safety Analysis Report addresses site assessment, facility design and construction, and design operation of the processing systems in the Corrugated Metal Pipe Saw Facility with respect to normal and abnormal conditions. Potential hazards are identified, credible accidents relative to the operation of the facility and the process systems are analyzed, and the consequences of postulated accidents are presented. The risk associated with normal operations, abnormal operations, and natural phenomena are analyzed. The accident analysis presented shows that the impact of the facility will be acceptable for all foreseeable normal and abnormal conditions of operation. Specifically, under normal conditions the facility will have impacts within the limits posted by applicable DOE guidelines, and in accident conditions the facility will similarly meet or exceed the requirements of all applicable standards. 16 figs., 6 tabs.

  3. Preliminary analysis on hybrid Box-Jenkins - GARCH modeling in forecasting gold price

    Science.gov (United States)

    Yaziz, Siti Roslindar; Azizan, Noor Azlinna; Ahmad, Maizah Hura; Zakaria, Roslinazairimah; Agrawal, Manju; Boland, John

    2015-02-01

    Gold has long been regarded as a valuable precious metal and is among the most popular commodities for a healthy-return investment. Hence, the analysis and prediction of the gold price are very significant to investors. This study is a preliminary analysis of the gold price and its volatility that focuses on the performance of hybrid Box-Jenkins models together with GARCH in analyzing and forecasting the gold price. The Box-Cox formula is used as the data transformation method because of its strengths in normalizing data, stabilizing variance and reducing heteroscedasticity; the analysis uses a 41-year daily gold price series starting 2 January 1973. Our study indicates that the proposed hybrid ARIMA-GARCH model with t-innovations can be a promising new approach to forecasting the gold price. This finding demonstrates the strength of GARCH in handling volatility in the gold price and overcomes the non-linearity limitation of Box-Jenkins modeling.
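    The hybrid modelling chain described above (a Box-Cox transform, a Box-Jenkins model for the conditional mean, and a GARCH model with Student-t innovations for the residual volatility) can be sketched in a few lines of Python. The synthetic price series and the (1,1,1)/(1,1) orders below are placeholders, not the data or orders selected in the study.

      import numpy as np
      from scipy import stats
      from statsmodels.tsa.arima.model import ARIMA
      from arch import arch_model

      rng = np.random.default_rng(7)
      prices = 400.0 * np.exp(np.cumsum(rng.normal(0.0002, 0.01, 2000)))   # synthetic "gold" prices

      transformed, lam = stats.boxcox(prices)           # Box-Cox transform stabilizes the variance
      mean_fit = ARIMA(transformed, order=(1, 1, 1)).fit()
      residuals = mean_fit.resid[1:]                    # drop the first (undefined) residual

      vol_fit = arch_model(residuals, vol="GARCH", p=1, q=1, dist="t", rescale=True).fit(disp="off")
      print(vol_fit.params)                             # includes the estimated t degrees of freedom "nu"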

  4. Preliminary X-ray crystallographic analysis of sulfide:quinone oxidoreductase from Acidithiobacillus ferrooxidans

    International Nuclear Information System (INIS)

    Zhang, Yanfei; Cherney, Maia M.; Solomonson, Matthew; Liu, Jianshe; James, Michael N. G.; Weiner, Joel H.

    2009-01-01

    The sulfide:quinone oxidoreductase from A. ferrooxidans ATCC 23270 was overexpressed in E. coli and purified. Crystallization and preliminary X-ray crystallographic analysis were performed for the recombinant enzyme. The gene product of open reading frame AFE-1293 from Acidithiobacillus ferrooxidans ATCC 23270 is annotated as encoding a sulfide:quinone oxidoreductase, an enzyme that catalyses electron transfer from sulfide to quinone. Following overexpression in Escherichia coli, the enzyme was purified and crystallized using the hanging-drop vapour-diffusion method. The native crystals belonged to the tetragonal space group P4₂2₁2, with unit-cell parameters a = b = 131.7, c = 208.8 Å, and diffracted to 2.3 Å resolution. Preliminary crystallographic analysis indicated the presence of a dimer in the asymmetric unit, with an extreme value of the Matthews coefficient (V_M) of 4.53 Å³ Da⁻¹ and a solvent content of 72.9%.

  5. Preliminary code development for seismic signal analysis related to test ban treaty questions

    International Nuclear Information System (INIS)

    Brolley, J.E.

    1977-01-01

    Forensic seismology, from a present day viewpoint, appears to be divided into several areas. Overwhelmingly important, in view of current Complete Test Ban (CTB) discussions, is the seismological study of waves generated in the earth by underground nuclear explosions. Over the last two decades intensive effort has been devoted to developing improved observational apparatus and to the interpretation of the data produced by this equipment. It is clearly desirable to extract the maximum amount of information from seismic signals. It is, therefore, necessary to quantitatively compare various modes of analysis to establish which mode or combination of modes provides the most useful information. Preliminary code development for application of some modern developments in signal processing to seismic signals is described. Applications of noncircular functions are considered and compared with circular function results. The second portion of the discussion concerns maximum entropy analysis. Lastly, the multivariate aspects of the general problem are considered

  6. Treatment by gliding arc of epoxy resin: preliminary analysis of surface modifications

    Science.gov (United States)

    Faubert, F.; Wartel, M.; Pellerin, N.; Pellerin, S.; Cochet, V.; Regnier, E.; Hnatiuc, B.

    2016-12-01

    Treatments with atmospheric-pressure non-thermal plasma are easy to implement and inexpensive. Among them, the gliding arc (GlidArc) remains rarely used in the surface treatment of polymers. However, it offers an economical and flexible way to treat large areas quickly. In addition, the choice of carrier gas makes it possible to bring in active species and other radicals, allowing different types of grafting and functionalization of the treated surfaces, for example for anti-biofouling prevention. This preliminary work includes analysis of the surface of epoxy resins by infrared spectroscopy: the different affected chemical bonds were studied as a function of the duration of treatment. The degree of oxidation (the C/O ratio) was obtained by X-ray microanalysis, and contact-angle analyses were performed to determine the wettability of the treated surface. A spectroscopic study of the plasma allows the possible active species in the different zones of the discharge to be determined.

  7. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  8. PIXAN: the Lucas Heights PIXE analysis computer package

    International Nuclear Information System (INIS)

    Clayton, E.

    1986-11-01

    To fully utilise the multielement capability and short measurement time of PIXE it is desirable to have an automated computer evaluation of the measured spectra. Because of the complex nature of PIXE spectra, a critical step in the analysis is the data reduction, in which the areas of characteristic peaks in the spectrum are evaluated. In this package the computer program BATTY is presented for such an analysis. The second step is to determine element concentrations, knowing the characteristic peak areas in the spectrum. This requires a knowledge of the expected X-ray yield for that element in the sample. The computer program THICK provides that information for both thick and thin PIXE samples. Together, these programs form the package PIXAN used at Lucas Heights for PIXE analysis

  9. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts.

  10. Conceptual design of pipe whip restraints using interactive computer analysis

    International Nuclear Information System (INIS)

    Rigamonti, G.; Dainora, J.

    1975-01-01

    Protection against pipe break effects necessitates a complex interaction between failure mode analysis, piping layout, and structural design. Many iterations are required to finalize structural designs and equipment arrangements. The magnitude of the pipe break loads transmitted by the pipe whip restraints to structural embedments precludes the application of conservative design margins. A simplified analytical formulation of the nonlinear dynamic problems associated with pipe whip has been developed and applied using interactive computer analysis techniques. In the dynamic analysis, the restraint and the associated portion of the piping system are modeled using the finite element lumped mass approach to properly reflect the dynamic characteristics of the piping/restraint system. The analysis is performed as a series of piecewise linear increments. Each of these linear increments is terminated by either the formation of plastic conditions or the closing/opening of gaps. The stiffness matrix is modified to reflect the changed stiffness characteristics of the system and re-started using the previous boundary conditions. The formation of yield hinges is related to the plastic moment of the section, and unloading paths are automatically considered. The conceptual design of the piping/restraint system is performed using interactive computer analysis. The application of the simplified analytical approach with interactive computer analysis results in an order of magnitude reduction in engineering time and computer cost. (Auth.)

  11. Computer aided plant engineering: An analysis and suggestions for computer use

    International Nuclear Information System (INIS)

    Leinemann, K.

    1979-09-01

    To obtain indications of and boundary conditions for computer use in plant engineering, an analysis of the engineering process was carried out. The structure of plant engineering is represented by a network of subtasks and subsets of data which are to be manipulated. The main tool for the integration of CAD subsystems in plant engineering should be a central database, which is described by its characteristic requirements and a possible simple conceptual schema. The main features of an interactive system for computer-aided plant engineering are briefly illustrated by two examples. The analysis leads to the conclusion that an interactive graphic system for the manipulation of net-like structured data, usable for various subtasks, should be the basis for computer-aided plant engineering. (orig.) [de

  12. Investigating the computer analysis of eddy current NDT data

    International Nuclear Information System (INIS)

    Brown, R.L.

    1979-01-01

    The objective of this activity was to investigate and develop techniques for computer analysis of eddy current nondestructive testing (NDT) data. A single frequency commercial eddy current tester and a precision mechanical scanner were interfaced with a PDP-11/34 computer to obtain and analyze eddy current data from samples of 316 stainless steel tubing containing known discontinuities. Among the data analysis techniques investigated were: correlation, Fast Fourier Transforms (FFT), clustering, and Adaptive Learning Networks (ALN). The results were considered encouraging. ALN, for example, correctly identified 88% of the defects and non-defects from a group of 153 signal indications

  13. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operations. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.

  14. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  15. Preliminary failure modes and effects analysis on Korean HCCR TBS to be tested in ITER

    International Nuclear Information System (INIS)

    Ahn, Mu-Young; Cho, Seungyon; Jin, Hyung Gon; Lee, Dong Won; Park, Yi-Hyun; Lee, Youngmin

    2015-01-01

    Highlights: • Postulated initiating events are identified through failure modes and effects analysis on the current HCCR TBS design. • A set of postulated initiating events is selected for consideration in deterministic analysis. • Accident evolutions for the selected postulated initiating events are qualitatively described for deterministic analysis. - Abstract: The Korean Helium cooled ceramic reflector (HCCR) Test blanket system (TBS), which comprises the Test blanket module (TBM) and ancillary systems in various locations of the ITER building, is operated at high temperature and pressure with decay heat. Therefore, safety is of utmost concern in the design process, and it is required to demonstrate that the HCCR TBS is designed to comply with the safety requirements and guidelines of ITER. Due to the complexity of the system, with many interfaces with ITER, a systematic approach is necessary for safety analysis. This paper presents the preliminary failure modes and effects analysis (FMEA) study performed for the HCCR TBS. FMEA is a systematic methodology in which failure modes for components in the system and their consequences are studied from the bottom up. Over eighty failure modes have been investigated for the HCCR TBS. The failure modes that have similar consequences are grouped as postulated initiating events (PIEs), and a total of seven reference accident scenarios are derived from the FMEA study for deterministic accident analysis. Failure modes not covered here, due to the evolving design of the HCCR TBS and uncertainty in maintenance procedures, will be studied further in the near future.

  16. Preliminary failure modes and effects analysis on Korean HCCR TBS to be tested in ITER

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Mu-Young, E-mail: myahn74@nfri.re.kr [National Fusion Research Institute, Daejeon (Korea, Republic of); Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of); Jin, Hyung Gon; Lee, Dong Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Park, Yi-Hyun; Lee, Youngmin [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Highlights: • Postulated initiating events are identified through failure modes and effects analysis on the current HCCR TBS design. • A set of postulated initiating events is selected for consideration in deterministic analysis. • Accident evolutions for the selected postulated initiating events are qualitatively described for deterministic analysis. - Abstract: The Korean Helium cooled ceramic reflector (HCCR) Test blanket system (TBS), which comprises the Test blanket module (TBM) and ancillary systems in various locations of the ITER building, is operated at high temperature and pressure with decay heat. Therefore, safety is of utmost concern in the design process, and it is required to demonstrate that the HCCR TBS is designed to comply with the safety requirements and guidelines of ITER. Due to the complexity of the system, with many interfaces with ITER, a systematic approach is necessary for safety analysis. This paper presents the preliminary failure modes and effects analysis (FMEA) study performed for the HCCR TBS. FMEA is a systematic methodology in which failure modes for components in the system and their consequences are studied from the bottom up. Over eighty failure modes have been investigated for the HCCR TBS. The failure modes that have similar consequences are grouped as postulated initiating events (PIEs), and a total of seven reference accident scenarios are derived from the FMEA study for deterministic accident analysis. Failure modes not covered here, due to the evolving design of the HCCR TBS and uncertainty in maintenance procedures, will be studied further in the near future.

  17. Expression, purification, crystallization and preliminary crystallographic analysis of the proliferation-associated protein Ebp1

    Energy Technology Data Exchange (ETDEWEB)

    Kowalinski, Eva; Bange, Gert; Wild, Klemens; Sinning, Irmgard, E-mail: irmi.sinning@bzh.uni-heidelberg.de [Heidelberg University Biochemistry Center, INF 328, D-69120 Heidelberg (Germany)

    2007-09-01

    Preliminary X-ray analysis of the proliferation-associated protein Ebp1 from Homo sapiens is provided. ErbB-3-binding protein 1 (Ebp1) is a member of the family of proliferation-associated 2G4 proteins (PA2G4s) and plays a role in cellular growth and differentiation. Ligand-induced activation of the transmembrane receptor ErbB3 leads to dissociation of Ebp1 from the receptor in a phosphorylation-dependent manner. The non-associated protein is involved in transcriptional and translational regulation in the cell. Here, the overexpression, purification, crystallization and preliminary crystallographic studies of Ebp1 from Homo sapiens are reported. Initially observed crystals were improved by serial seeding to single crystals suitable for data collection. The optimized crystals belong to the tetragonal space group P4₁2₁2 or P4₃2₁2 and diffracted to a resolution of 1.6 Å.

  18. Preliminary Design and Analysis of an In-plane PRSEUS Joint

    Science.gov (United States)

    Lovejoy, Andrew E.; Poplawski, Steven

    2013-01-01

    As part of the National Aeronautics and Space Administration's (NASA's) Environmentally Responsible Aviation (ERA) program, the Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) has been designed, developed and tested. However, PRSEUS development efforts to date have only addressed joints required to transfer bending moments between PRSEUS panels. Development of in-plane joints for the PRSEUS concept is necessary to facilitate in-plane transfer of load from PRSEUS panels to an adjacent structure, such as from a wing panel into a fuselage. This paper presents preliminary design and analysis of an in-plane PRSEUS joint for connecting PRSEUS panels at the termination of the rod-stiffened stringers. Design requirements are provided, the PRSEUS blade joint concept is presented, and preliminary design changes and analyses are carried out to examine the feasibility of the proposed in-plane PRSEUS blade joint. The study conducted herein focuses mainly on the PRSEUS structure on one side of the joint. In particular, the design requirements for the rod shear stress and bolt bearing stress are examined. A PRSEUS blade joint design was developed that demonstrates the feasibility of this in-plane PRSEUS joint concept to terminate the rod-stiffened stringers. The presented design only demonstrates feasibility, therefore, some areas of refinement are presented that would lead to a more optimum and realistic design.

  19. Analysis of the computed tomography in the acute abdomen

    International Nuclear Information System (INIS)

    Hochhegger, Bruno; Moraes, Everton; Haygert, Carlos Jesus Pereira; Antunes, Paulo Sergio Pase; Gazzoni, Fernando; Lopes, Luis Felipe Dias

    2007-01-01

    Introduction: This study aims to test the capacity of computed tomography to assist in the diagnosis and management of the acute abdomen. Material and method: This is a longitudinal and prospective study in which patients with a diagnosis of acute abdomen were analyzed. A total of 105 cases of acute abdomen were obtained and, after application of the exclusion criteria, 28 patients were included in the study. Results: Computed tomography changed the diagnostic hypothesis of the physicians in 50% of the cases (p 0.05); 78.57% of the patients had a surgical indication before computed tomography and 67.86% after computed tomography (p = 0.0546). An accurate diagnosis by computed tomography, when compared with the anatomopathologic examination and the final diagnosis, was observed in 82.14% of the cases (p = 0.013). When the analysis was performed dividing the patients into surgical and nonsurgical groups, an accuracy of 89.28% was obtained (p 0.0001). A difference of 7.2 days of hospitalization (p = 0.003) was obtained compared with the mean for acute abdomen cases managed without computed tomography. Conclusion: Computed tomography correlates well with the anatomopathology and has high accuracy for surgical indication; it increases the physicians' diagnostic confidence, reduces the hospitalization time, reduces the number of surgeries and is cost-effective. (author)

  20. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box, " you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white.This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  1. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence approaches form a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Then, two quantification indices, the variation rate and the progress rate, are defined to indicate the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  2. Analysis of Biosignals During Immersion in Computer Games.

    Science.gov (United States)

    Yeo, Mina; Lim, Seokbeen; Yoon, Gilwon

    2017-11-17

    The number of computer game users is increasing as computers and various IT devices connected to the Internet are commonplace in all ages. In this research, in order to find the relationship between behavioral activity and its associated biosignals, biosignal changes before, during and after computer games were measured and analyzed for 31 subjects. For this purpose, a device to measure electrocardiogram, photoplethysmogram and skin temperature was developed such that the effect of motion artifacts could be minimized. The device was made wearable for convenient measurement. The game selected for the experiments was League of Legends™. Analysis of the pulse transit time, heart rate variability and skin temperature showed increased sympathetic nerve activity during the computer game, while the parasympathetic nerves became less active. Interestingly, the sympathetic-predominance group showed less change in heart rate variability than the normal group. The results can be valuable for studying internet gaming disorder.

  3. PLATO: a computer code for the analysis of fission product plateout in HTGRs

    International Nuclear Information System (INIS)

    Suzuki, Katsuo; Morimoto, Toshio.

    1981-01-01

    The computer code PLATO for estimating plateout activities on the surfaces of the primary cooling system of HTGRs has been developed; in this report, the analytical model and the digital calculation method incorporated in the code are described. The code utilizes a mass transfer model analogous to heat transfer, coupled with an expression for the adsorption-desorption phenomenon, and is able to analyze plateout behaviour in a closed circuit, like a reactor cooling system, constructed from various kinds of components, as well as in an open-ended tube. With the code, the fission product concentration in the coolant and the plateout amount on the surfaces are calculated along the coolant stream, and the total removal rate by the plateout process is also obtained. Comparison of the analytical results with the experimental results, including checks of the effects of some calculation conditions on the results, and a preliminary analysis of the VHTR plant have been made. (author)

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing, with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  6. SALP-PC, a computer program for fault tree analysis on personal computers

    International Nuclear Information System (INIS)

    Contini, S.; Poucet, A.

    1987-01-01

    The paper presents the main characteristics of the SALP-PC computer code for fault tree analysis. The program has been developed in Fortran 77 on an Olivetti M24 personal computer (IBM compatible) in order to reach a high degree of portability. It is composed of six processors implementing the different phases of the analysis procedure. This particular structure presents some advantages like, for instance, the restart facility and the possibility to develop an event tree analysis code. The set of allowed logical operators, i.e. AND, OR, NOT, K/N, XOR, INH, together with the possibility to define boundary conditions, make the SALP-PC code a powerful tool for risk assessment. (orig.)

  7. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or computer implementation--of numerical algorithms, depending on the background and interests of students. Designed for upper-division undergraduates in mathematics or computer science classes, the textbook assumes that students have prior knowledge of linear algebra and calculus, although these topics are reviewed in the text. Short discussions of the history of numerical methods are interspersed throughout the chapters. The book a...

  8. Documentation of Hanford Site independent review of the Hanford Waste Vitrification Plant Preliminary Safety Analysis Report

    International Nuclear Information System (INIS)

    Herborn, D.I.

    1993-11-01

    Westinghouse Hanford Company (WHC) is the Integrating Contractor for the Hanford Waste Vitrification Plant (HWVP) Project, and as such is responsible for preparation of the HWVP Preliminary Safety Analysis Report (PSAR). The HWVP PSAR was prepared pursuant to the requirements for safety analyses contained in US Department of Energy (DOE) Orders 4700.1, Project Management System (DOE 1987); 5480.5, Safety of Nuclear Facilities (DOE 1986a); 5481.1B, Safety Analysis and Review System (DOE 1986b), which was superseded by DOE Order 5480.23, Nuclear Safety Analysis Reports, for nuclear facilities effective April 30, 1992 (DOE 1992); and 6430.1A, General Design Criteria (DOE 1989). The WHC procedures that, in large part, implement these DOE requirements are contained in WHC-CM-4-46, Nonreactor Facility Safety Analysis Manual. This manual describes the overall WHC safety analysis process in terms of requirements for safety analyses, responsibilities of the various contributing organizations, and required reviews and approvals.

  9. MAAP4 CANDU analysis of a generic CANDU-6 plant: preliminary results

    Energy Technology Data Exchange (ETDEWEB)

    Petoukhov, S.M.; Mathew, P.M

    2001-10-01

    To support the generic probabilistic safety analysis (PSA) program at AECL, in particular to conduct Level 2 PSA analysis of a CANDU 6 plant undergoing a postulated severe accident, the capability to conduct severe accident consequence analysis for a CANDU plant is required. For this purpose, AECL selected MAAP4 CANDU from a number of other severe accident codes. The necessary models for a generic CANDU 6 station have been implemented in the code, and the code version 0.2 beta was tested using station data, which were assembled for a generic CANDU 6 station. This paper describes the preliminary results of the consequence analysis using MAAP4 CANDU for a generic CANDU 6 station, when it undergoes a station blackout and a large loss-of-coolant accident scenario. The analysis results show that the plant response is consistent with the physical phenomena modeled and the failure criteria used. The results also confirm that the CANDU design is robust with respect to severe accidents, which is reflected in the calculated long times that are available for administering accident management measures to arrest the accident progression before the calandria vessel or containment become at risk. (author)

  10. Preliminary Analysis of a Loss of Condenser Vacuum Accident Using the MARS-KS Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jieun Kim; Bang, Young Seok; Oh, Deog Yeon; Kim, Kap; Woo, Sweng-Wong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    In accordance with the revision of USNRC NUREG-0800, the area of review for the loss of condenser vacuum (LOCV) accident has been expanded to analyze the peak pressures of the primary and secondary systems separately. Until now, the analysis of the LOCV accident, which is caused by a malfunction of the condenser, has focused on fuel cladding integrity and the peak pressure in the primary system. In this paper, accident analyses for LOCV using the MARS-KS code were conducted to support the licensing review of the transient behavior of the secondary system pressure of the APR1400 plant. The accident analysis for the loss of condenser vacuum (LOCV) of the APR1400 was conducted with the MARS-KS code to support the review of the pressure behavior of the primary and secondary systems. A total of four cases with different combinations of offsite power availability and pressurizer spray availability were considered. The preliminary analysis results show that the initial conditions or assumptions that lead to the most severe consequence differ for each acceptance criterion and, in some cases, conflict with one another. Therefore, for each acceptance criterion, identification and sensitivity analysis of the initial conditions and assumptions governing the system pressure would be necessary.

  11. Recent developments of the NESSUS probabilistic structural analysis computer program

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  12. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  13. Practical computer analysis of switch mode power supplies

    CERN Document Server

    Bennett, Johnny C

    2006-01-01

    When designing switch-mode power supplies (SMPSs), engineers need much more than simple "recipes" for analysis. Such plug-and-go instructions are not at all helpful for simulating larger and more complex circuits and systems. Offering more than merely a "cookbook," Practical Computer Analysis of Switch Mode Power Supplies provides a thorough understanding of the essential requirements for analyzing SMPS performance characteristics. It demonstrates the power of the circuit averaging technique when used with powerful computer circuit simulation programs. The book begins with SMPS fundamentals and the basics of circuit averaging models, reviewing most basic topologies and explaining all of their various modes of operation and control. The author then discusses the general analysis requirements of power supplies and how to develop the general types of SMPS models, demonstrating the use of SPICE for analysis. He examines the basic first-order analyses generally associated with SMPS performance along with more pra...

  14. The role of the computer in automated spectral analysis

    International Nuclear Information System (INIS)

    Rasmussen, S.E.

    This report describes how a computer can be an extremely valuable tool for routine analysis of spectra, which is a time consuming process. A number of general-purpose algorithms that are available for the various phases of the analysis can be implemented, if these algorithms are designed to cope with all the variations that may occur. Since this is basically impossible, one must find a compromise between obscure error and program complexity. This is usually possible with human interaction at critical points. In spectral analysis this is possible if the user scans the data on an interactive graphics terminal, makes the necessary changes and then returns control to the computer for completion of the analysis

  15. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  16. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  17. MULGRES: a computer program for stepwise multiple regression analysis

    Science.gov (United States)

    A. Jeff Martin

    1971-01-01

    MULGRES is a computer program source deck designed for multiple regression analysis employing the technique of stepwise deletion in the search for the most significant variables. The features of the program, along with its inputs and outputs, are briefly described, with a note on machine compatibility.
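
    The abstract names stepwise deletion as the variable-selection technique. As a rough illustration only, the Python sketch below shows backward elimination driven by coefficient p-values; the 0.05 threshold, the use of statsmodels and all names are assumptions, not details of the original FORTRAN source deck.

      # Hedged sketch of stepwise deletion (backward elimination) for multiple regression.
      # The alpha threshold and the statsmodels estimator are illustrative assumptions.
      import pandas as pd
      import statsmodels.api as sm

      def stepwise_deletion(X: pd.DataFrame, y: pd.Series, alpha: float = 0.05):
          """Repeatedly drop the least significant predictor until all remaining ones are significant."""
          predictors = list(X.columns)
          while predictors:
              model = sm.OLS(y, sm.add_constant(X[predictors])).fit()
              pvals = model.pvalues.drop("const")      # ignore the intercept
              worst = pvals.idxmax()                   # least significant variable
              if pvals[worst] <= alpha:                # everything left is significant
                  return model, predictors
              predictors.remove(worst)                 # stepwise deletion of the weakest variable
          return None, []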

  18. Conversation Analysis in Computer-Assisted Language Learning

    Science.gov (United States)

    González-Lloret, Marta

    2015-01-01

    The use of Conversation Analysis (CA) in the study of technology-mediated interactions is a recent methodological addition to qualitative research in the field of Computer-assisted Language Learning (CALL). The expansion of CA in Second Language Acquisition research, coupled with the need for qualitative techniques to explore how people interact…

  19. Computational content analysis of European Central Bank statements

    NARCIS (Netherlands)

    Milea, D.V.; Almeida, R.J.; Sharef, N.M.; Kaymak, U.; Frasincar, F.

    2012-01-01

    In this paper we present a framework for the computational content analysis of European Central Bank (ECB) statements. Based on this framework, we provide two approaches that can be used in a practical context. Both approaches use the content of ECB statements to predict upward and downward movement

  20. Componential analysis of kinship terminology a computational perspective

    CERN Document Server

    Pericliev, V

    2013-01-01

    This book presents the first computer program automating the task of componential analysis of kinship vocabularies. The book examines the program in relation to two basic problems: the commonly occurring inconsistency of componential models; and the huge number of alternative componential models.

  1. HAMOC: a computer program for fluid hammer analysis

    International Nuclear Information System (INIS)

    Johnson, H.G.

    1975-12-01

    A computer program has been developed for fluid hammer analysis of piping systems attached to a vessel which has undergone a known rapid pressure transient. The program is based on the method of characteristics for the solution of the partial differential equations of motion and continuity. Column-separation logic is included for situations in which pressures fall to saturation values.
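
    As a hedged illustration of the solution approach named in the abstract (the method of characteristics for single-phase water hammer), the Python sketch below advances head and flow one time step on a single pipe; the variable names, friction treatment and boundary conditions are assumptions and do not reproduce the HAMOC program or its column-separation logic.

      # Hedged sketch of one method-of-characteristics step for water hammer in a single pipe.
      # Wave speed a, pipe area A, diameter D, friction factor f and reach length dx are
      # illustrative values, not data from HAMOC.
      import numpy as np

      def moc_step(H, Q, a=1000.0, g=9.81, A=0.05, D=0.25, f=0.02, dx=10.0):
          """Advance head H and flow Q (1-D arrays over pipe nodes) by one time step dt = dx / a."""
          B = a / (g * A)                        # characteristic impedance
          R = f * dx / (2.0 * g * D * A**2)      # friction coefficient
          Hn, Qn = H.copy(), Q.copy()
          # Interior nodes: intersect the C+ characteristic from the left neighbour
          # with the C- characteristic from the right neighbour.
          Cp = H[:-2] + B * Q[:-2] - R * Q[:-2] * np.abs(Q[:-2])
          Cm = H[2:]  - B * Q[2:]  + R * Q[2:]  * np.abs(Q[2:])
          Hn[1:-1] = 0.5 * (Cp + Cm)
          Qn[1:-1] = (Cp - Cm) / (2.0 * B)
          # Very simple boundaries: fixed-head reservoir upstream, closed valve downstream.
          Hn[0] = H[0]
          Qn[0] = (Hn[0] - (H[1] - B * Q[1] + R * Q[1] * abs(Q[1]))) / B
          Qn[-1] = 0.0
          Hn[-1] = H[-2] + B * Q[-2] - R * Q[-2] * abs(Q[-2])
          return Hn, Qn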

  2. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
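
    The abstract is truncated here, but the first step it describes, estimating the distribution of a focal estimate across all sensible model specifications, can be sketched as follows; the OLS estimator, the variable names and the exhaustive enumeration of control subsets are assumptions for illustration, not the authors' software.

      # Hedged sketch of a multimodel (specification) analysis: re-estimate the focal
      # coefficient under every combination of candidate controls and collect the results.
      from itertools import combinations
      import pandas as pd
      import statsmodels.api as sm

      def modeling_distribution(df: pd.DataFrame, y: str, focal: str, controls: list):
          """Return the focal-variable estimate and p-value from every control specification."""
          rows = []
          for k in range(len(controls) + 1):
              for subset in combinations(controls, k):
                  X = sm.add_constant(df[[focal, *subset]])
                  fit = sm.OLS(df[y], X).fit()
                  rows.append({"controls": subset,
                               "estimate": fit.params[focal],
                               "p_value": fit.pvalues[focal]})
          return pd.DataFrame(rows)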

  3. Informational-computer system for the neutron spectra analysis

    International Nuclear Information System (INIS)

    Berzonis, M.A.; Bondars, H.Ya.; Lapenas, A.A.

    1979-01-01

    This article presents the basic principles underlying the construction of an informational-computer system for neutron spectra analysis based on measured reaction rates. The basic data files of the system, together with the software and hardware needed for its operation, are described.

  4. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

    The Short Circuit Analysis Program (SCAP) is used to assess the composite effects of unbalanced and balanced faults on the overall reliability of an electric power system. The program uses the symmetrical components method to compute all phase and sequence quantities for any bus or branch of a given power network ...
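
    The symmetrical-components method mentioned above is the classical Fortescue transformation; a minimal Python sketch follows. It is a textbook illustration of the transformation only, not the SCAP program.

      # Hedged sketch of the Fortescue (symmetrical components) transformation:
      # phase quantities [Va, Vb, Vc] -> sequence quantities [V0, V1, V2].
      import numpy as np

      a = np.exp(2j * np.pi / 3)                    # 120-degree rotation operator
      A = np.array([[1, 1,     1   ],
                    [1, a**2,  a   ],
                    [1, a,     a**2]], dtype=complex)   # sequence -> phase matrix

      def phase_to_sequence(v_abc):
          """Return [V0, V1, V2] for the phase vector [Va, Vb, Vc]."""
          return np.linalg.solve(A, np.asarray(v_abc, dtype=complex))

      # A balanced set has only a positive-sequence component:
      print(np.round(phase_to_sequence([1.0, a**2, a]), 6))   # ~ [0, 1, 0]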

  5. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pascucci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  6. COALA--A Computational System for Interlanguage Analysis.

    Science.gov (United States)

    Pienemann, Manfred

    1992-01-01

    Describes a linguistic analysis computational system that responds to highly complex queries about morphosyntactic and semantic structures contained in large sets of language acquisition data by identifying, displaying, and analyzing sentences that meet the defined linguistic criteria. (30 references) (Author/CB)

  7. Computer system for environmental sample analysis and data storage and analysis

    International Nuclear Information System (INIS)

    Brauer, F.P.; Fager, J.E.

    1976-01-01

    A mini-computer-based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system.

  8. Preliminary Analysis of Rapid Condensation Experiment with MARS-KS Code

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Jae Ho; Jun, Hwang Yong; Jeong, Hae Yong [Sejong University, Seoul (Korea, Republic of)

    2016-05-15

    In the present study, the rapid condensation experiment performed in the MANOTEA facility is analyzed with the MARS-KS code. It is known that system codes have limitations in predicting this kind of very active condensation, which is driven by the direct mixing of the cold injected flow with steam. Through the analysis we investigated the applicability of the MARS-KS code to the design of various passive safety systems in the future. The configuration of the experimental facility MANOTEA, which was constructed at the University of Maryland - United States Naval Academy, is described and the modeling approach using the MARS-KS code is also provided. The preliminary result shows that MARS-KS correctly predicts the general trend of pressure and temperature in the condensing part. However, it is also found that there are some limitations in the simulation, such as an unexpected pressure peak or a sudden temperature change.

  9. Preliminary analysis of surface radiation measurements recorded at the Nansen ice sheet (Antarctica)

    International Nuclear Information System (INIS)

    Bonafe', U.; Dalpane, E.; Georgiadis, T.; Pitacco, A.

    1996-01-01

    An experiment on radiation and surface energy balance was conducted during the 9th Italian expedition in Antarctica at the Nansen ice sheet, a glacier situated close to the Italian base at Terra Nova Bay, to correlate surface balances with the formation and development of katabatic winds. Measurements were taken by radiometers covering the whole spectrum of solar and terrestrial emissions and by fast sensors of atmospheric wind velocity and humidity for the application of the eddy-correlation technique. A preliminary analysis of the radiometric data, collected in order to quantify the major components of the radiative energy balance during the Antarctic summer in clear-sky conditions, is reported and discussed. The findings show the very low available energy (mean about 1 W/m²), in terms of net radiation, for physical processes such as sensible- and latent-heat fluxes. The long-wave radiation balance was used to assess the reliability of Swinbank's parametrization relative to general conditions of the atmosphere.

  10. Crystallization and preliminary X-ray analysis of Escherichia coli RNase G

    International Nuclear Information System (INIS)

    Fang, Pengfei; Wang, Jing; Li, Xu; Guo, Min; Xing, Li; Cao, Xu; Zhu, Yi; Gao, Yan; Niu, Liwen; Teng, Maikun

    2009-01-01

    Full-length E. coli RNase G was overexpressed, purified and crystallized. Diffraction data were collected to a resolution of 3.4 Å. The homologous RNases RNase E and RNase G are widely distributed in bacteria and function in many important physiological processes, including mRNA degradation and rRNA maturation. In this study, the crystallization and preliminary X-ray analysis of RNase G from Escherichia coli are described. Purified recombinant E. coli RNase G, which has 497 amino acids, was crystallized in the cubic space group F432, with unit-cell parameters a = b = c = 219.84 Å. X-ray diffraction data were collected to a resolution of 3.4 Å.

  11. Crystallization and preliminary X-ray diffraction analysis of the middle domain of Paip1

    International Nuclear Information System (INIS)

    Kanaan, Ahmad Seif; Frank, Filipp; Maedler-Kron, Chelsea; Verma, Karan; Sonenberg, Nahum; Nagar, Bhushan

    2009-01-01

    The crystallization of the putative MIF4G domain of Paip1 is described. The crystals belonged to the monoclinic space group P2₁ and diffracted X-rays to beyond 2.2 Å resolution. The poly(A)-binding protein (PABP) simultaneously interacts with the poly(A) tail of mRNAs and the scaffolding protein eIF4G to mediate mRNA circularization, resulting in stimulation of protein translation. PABP is regulated by the PABP-interacting protein Paip1. Paip1 is thought to act as a translational activator in 5′ cap-dependent translation by interacting with PABP and the initiation factors eIF4A and eIF3. Here, the crystallization and preliminary diffraction analysis of the middle domain of Paip1 (Paip1M), which produces crystals that diffract to a resolution of 2.2 Å, are presented.

  12. Preliminary analysis of the effect of the grid spacers on the reflood heat transfer

    International Nuclear Information System (INIS)

    Sugimoto, Jun; Murao, Yoshio

    1982-02-01

    The results of a preliminary analysis of the effect of the grid spacers on heat transfer during the reflood phase of a PWR LOCA are described. Experiments at JAERI and other facilities showed substantial heat transfer enhancement near the grid spacers. The enhancement decreases with distance from the grid spacers in the downstream region. Several mechanisms for the heat transfer enhancement near the grid spacers are discussed. A model of the coalescence of water droplets downstream of the spacers is proposed based on a review of the experimental data. The heat transfer correlation for saturated film boiling is used to quantify the heat transfer augmentation by the grid spacers. (author)

  13. A preliminary analysis of the groundwater recharge to the Karoo formations, mid-Zambesi basin, Zimbabwe

    DEFF Research Database (Denmark)

    Larsen, Flemming; Owen, R.; Dahlin, T.

    2002-01-01

    A multi-disciplinary study is being carried out on recharge to the Karoo sandstone aquifer in the western part of Zimbabwe, where recharge is controlled by the presence of a thick, confining basalt layer. The aquifer is geographically extensive, and has been identified throughout the southern part......, before it dips below an impervious basalt cover. However, resistivity profiling shows that the basalt at the basin margin is weathered and fractured, and probably permeable, while the basalt deeper into the basin is fresh, solid and impermeable. Field and laboratory analysis of 22 groundwater samples......–130 mm/yr, with an average value of 25 mm/yr. Preliminary results of the recharge estimate using ³⁶Cl data suggest lower direct infiltration rates, but further studies are needed. The combination of hydro-chemical, isotopic and geophysical investigations shows that the recharge area extends well beyond...

  14. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Qualitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Hart, Reid; Athalye, Rahul A.; Rosenberg, Michael I.; Richman, Eric E.; Winiarski, David W.

    2014-03-01

    Section 304(b) of the Energy Conservation and Production Act (ECPA), as amended, requires the Secretary of Energy to make a determination each time a revised version of ASHRAE Standard 90.1 is published with respect to whether the revised standard would improve energy efficiency in commercial buildings. When the U.S. Department of Energy (DOE) issues an affirmative determination on Standard 90.1, states are statutorily required to certify within two years that they have reviewed and updated the commercial provisions of their building energy code, with respect to energy efficiency, to meet or exceed the revised standard. This report provides a preliminary qualitative analysis of all addenda to ANSI/ASHRAE/IES Standard 90.1-2010 (referred to as Standard 90.1-2010 or 2010 edition) that were included in ANSI/ASHRAE/IES Standard 90.1-2013 (referred to as Standard 90.1-2013 or 2013 edition).

  15. Crystallization and preliminary X-ray analysis of Leishmania major glyoxalase I

    Energy Technology Data Exchange (ETDEWEB)

    Ariza, Antonio; Vickers, Tim J.; Greig, Neil; Fairlamb, Alan H.; Bond, Charles S., E-mail: c.s.bond@dundee.ac.uk [Division of Biological Chemistry and Molecular Microbiology, Wellcome Trust Biocentre, School of Life Sciences, University of Dundee, Dundee DD1 5EH,Scotland (United Kingdom)

    2005-08-01

    The detoxification enzyme glyoxalase I from L. major has been crystallized. Preliminary molecular-replacement calculations indicate the presence of three glyoxalase I dimers in the asymmetric unit. Glyoxalase I (GLO1) is a putative drug target for trypanosomatids, which are pathogenic protozoa that include the causative agents of leishmaniasis. Significant sequence and functional differences between Leishmania major and human GLO1 suggest that it may make a suitable template for rational inhibitor design. L. major GLO1 was crystallized in two forms: the first is extremely disordered and does not diffract, while the second, an orthorhombic form, diffracts to 2.0 Å. Molecular-replacement calculations indicate that there are three GLO1 dimers in the asymmetric unit, which adopt a helical arrangement with their molecular dyads oriented approximately perpendicular to the c axis. Further analysis of these data is under way.

  16. Preliminary analysis of a membrane-based atmosphere-control subsystem

    Science.gov (United States)

    Mccray, Scott B.; Newbold, David D.; Ray, Rod; Ogle, Kathryn

    1993-01-01

    Controlled ecological life support systems will require subsystems for maintaining the concentrations of atmospheric gases within acceptable ranges in human habitat chambers and plant growth chambers. The goal of this work was to develop a membrane-based atmosphere control (MBAC) subsystem that allows the controlled exchange of atmospheric components (e.g., oxygen, carbon dioxide, and water vapor) between these chambers. The MBAC subsystem promises to offer a simple, non-energy-intensive method to separate, store and exchange atmospheric components, producing optimal concentrations of components in each chamber. In this paper, the results of a preliminary analysis of the MBAC subsystem for the control of oxygen and nitrogen are presented. Additionally, the MBAC subsystem and its operation are described.

  17. Expression, crystallization and preliminary X-ray analysis of the phosphoribosylglycinamide formyltransferase from Streptococcus mutans

    International Nuclear Information System (INIS)

    Zhai, Fangli; Liu, Xiaojuan; Ruan, Jing; Li, Jing; Liu, Zhenlong; Hu, Yulin; Li, Shentao

    2011-01-01

    Phosphoribosylglycinamide formyltransferase (PurN) from Streptococcus mutans was expressed in E. coli, purified and studied crystallographically. Phosphoribosylglycinamide formyltransferase (PurN) from Streptococcus mutans was recombinantly expressed in Escherichia coli. An effective purification protocol was established. The purified protein, which had a purity of >95%, was identified by SDS–PAGE and MALDI–TOF MS. The protein was crystallized using the vapour-diffusion method in hanging-drop mode with PEG 3350 as the primary precipitant. X-ray diffraction data were collected to 2.1 Å resolution. Preliminary X-ray analysis indicated that the crystal belonged to space group P2₁2₁2₁, with unit-cell parameters a = 52.25, b = 63.29, c = 131.81 Å.

  18. Purification, crystallization and preliminary X-ray crystallographic analysis of chitinase from Bacillus cereus NCTU2

    Energy Technology Data Exchange (ETDEWEB)

    Kuo, Chueh-Yuan [Life Science Group, Research Division, National Synchrotron Radiation Research Center, Hsinchu 30076,Taiwan (China); Institute of Bioinformatics and Structural Biology, National Tsing-Hua University, Hsinchu 30013,Taiwan (China); Wu, Yue-Jin [Department of Applied Chemistry, National Chiao Tung University, Hsinchu 30010,Taiwan (China); Hsieh, Yin-Cheng; Guan, Hong-Hsiang [Life Science Group, Research Division, National Synchrotron Radiation Research Center, Hsinchu 30076,Taiwan (China); Institute of Bioinformatics and Structural Biology, National Tsing-Hua University, Hsinchu 30013,Taiwan (China); Tsai, Huei-Ju [Department of Applied Chemistry, National Chiao Tung University, Hsinchu 30010,Taiwan (China); Lin, Yi-Hung; Huang, Yen-Chieh; Liu, Ming-Yih [Life Science Group, Research Division, National Synchrotron Radiation Research Center, Hsinchu 30076,Taiwan (China); Li, Yaw-Kuen, E-mail: ykl@cc.nctu.edu.tw [Department of Applied Chemistry, National Chiao Tung University, Hsinchu 30010,Taiwan (China); Chen, Chun-Jung, E-mail: ykl@cc.nctu.edu.tw [Life Science Group, Research Division, National Synchrotron Radiation Research Center, Hsinchu 30076,Taiwan (China); Department of Physics, National Tsing-Hua University, Hsinchu 30013,Taiwan (China)

    2006-09-01

    The crystallization of B. cereus chitinase is reported. Chitinases (EC 3.2.1.14) are found in a broad range of organisms, including bacteria, fungi and higher plants, and play different roles depending on their origin. A chitinase from Bacillus cereus NCTU2 (ChiNCTU2) capable of hydrolyzing chitin as a carbon and nitrogen nutrient has been identified as a member of the family 18 glycoside hydrolases. ChiNCTU2 of molecular weight 36 kDa has been crystallized using the hanging-drop vapour-diffusion method. According to the diffraction of chitinase crystals at 1.10 Å resolution, the crystal belongs to space group P2₁, with unit-cell parameters a = 50.79, b = 48.79, c = 66.87 Å, β = 99.31°. Preliminary analysis indicates there is one chitinase molecule in the asymmetric unit, with a solvent content of 43.4%.

  19. Containment failure modes preliminary analysis for Atucha-I nuclear power plant during severe accidents

    International Nuclear Information System (INIS)

    Baron, J.; Caballero, C.; Zarate, S.M.

    1997-01-01

    The objective of the present work is to analyze the containment behavior of the Atucha-I nuclear power plant during a severe accident, as part of a probabilistic safety assessment (PSA). Initially, a generic description of the containment failure modes considered in other PSAs is given. Then, the possible containment failure modes for Atucha I are analyzed qualitatively, according to its design peculiarities. These failure modes involve some substantial differences from other PSAs, owing to the particular design of Atucha I. Among other factors, the influence of the following is studied: moderator/coolant separation, the existence of cooling Zircaloy channels, the existence of filling bodies inside the pressure vessel, the reactor cavity geometry, the on-line refueling mode, and the existence of a double-shell containment (steel and concrete) with an annular separation room. On the basis of the preceding analysis, a series of parameters to be taken into account in the definition of the plant damage states is defined on a preliminary basis. (author)

  20. Visual Assessment on Coastal Cruise Tourism: A Preliminary Planning Using Importance Performance Analysis

    Science.gov (United States)

    Trisutomo, S.

    2017-07-01

    Importance-Performance Analysis (IPA) has been widely applied in many cases. In this research, IPA was applied to measure perceptions of coastal tourism objects and their potential to be developed for coastal cruise tourism in Makassar. Three objects, i.e. the Akkarena recreational site, the Losari public waterfront space, and the Paotere traditional Phinisi ship port, were selected and assessed visually from the water by a group of purposively selected resource persons. The importance and performance of 10 attributes of each site were scored using a Likert scale from 1 to 5. Data were processed with SPSS-21, resulting in a Cartesian graph in which the scores were divided over four quadrants: Quadrant I (concentrate here), Quadrant II (keep up the good work), Quadrant III (low priority), and Quadrant IV (possible overkill). The attributes in each quadrant could be considered as the platform for the preliminary planning of a coastal cruise tour in Makassar.
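
    As a hedged sketch of the quadrant classification described above (the attribute names and scores below are hypothetical, and the quadrant split uses the grand means of the 1-5 Likert scores, one common IPA convention):

      # Hedged sketch of importance-performance quadrant assignment; data are hypothetical.
      import numpy as np

      def ipa_quadrants(names, importance, performance):
          """Assign each attribute to an IPA quadrant using grand-mean cut-offs."""
          imp = np.asarray(importance, dtype=float)
          perf = np.asarray(performance, dtype=float)
          i_cut, p_cut = imp.mean(), perf.mean()
          labels = {(True, False): "Quadrant I (concentrate here)",
                    (True, True): "Quadrant II (keep up the good work)",
                    (False, False): "Quadrant III (low priority)",
                    (False, True): "Quadrant IV (possible overkill)"}
          return {n: labels[(i >= i_cut, p >= p_cut)]
                  for n, i, p in zip(names, imp, perf)}

      print(ipa_quadrants(["visual appeal", "accessibility", "signage"],
                          importance=[4.6, 4.2, 3.1],
                          performance=[3.2, 4.4, 4.0]))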