WorldWideScience

Sample records for maccs code version

  1. MELCOR Accident Consequence Code System (MACCS)

    Energy Technology Data Exchange (ETDEWEB)

    Jow, H.N.; Sprung, J.L.; Ritchie, L.T. (Sandia National Labs., Albuquerque, NM (USA)); Rollstin, J.A. (GRAM, Inc., Albuquerque, NM (USA)); Chanin, D.I. (Technadyne Engineering Consultants, Inc., Albuquerque, NM (USA))

    1990-02-01

    This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previously used CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projection, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. Volume I, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems. Volume II, the Model Description, describes the underlying models that are implemented in the code, and Volume III, the Programmer's Reference Manual, describes the code's structure and database management. 59 refs., 14 figs., 15 tabs.

  2. MELCOR Accident Consequence Code System (MACCS)

    Energy Technology Data Exchange (ETDEWEB)

    Rollstin, J.A. (GRAM, Inc., Albuquerque, NM (USA)); Chanin, D.I. (Technadyne Engineering Consultants, Inc., Albuquerque, NM (USA)); Jow, H.N. (Sandia National Labs., Albuquerque, NM (USA))

    1990-02-01

    This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previously used CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projections, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. Volume I, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems. Volume II, the Model Description, describes the underlying models that are implemented in the code, and Volume III, the Programmer's Reference Manual, describes the code's structure and database management.

  3. MELCOR Accident Consequence Code System (MACCS)

    Energy Technology Data Exchange (ETDEWEB)

    Chanin, D.I. (Technadyne Engineering Consultants, Inc., Albuquerque, NM (USA)); Sprung, J.L.; Ritchie, L.T.; Jow, Hong-Nian (Sandia National Labs., Albuquerque, NM (USA))

    1990-02-01

This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previous CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projection, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. This document, Volume 1, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems.

  4. Comparison of MACCS users calculations for the international comparison exercise on probabilistic accident consequence assessment code, October 1989--June 1993

    Energy Technology Data Exchange (ETDEWEB)

Neymotin, L. [Brookhaven National Lab., Upton, NY (United States)]

    1994-04-01

Over the past several years, the OECD/NEA and CEC sponsored an international program intercomparing a group of six probabilistic consequence assessment (PCA) codes designed to simulate the health and economic consequences of releases of radioactive material into the atmosphere following severe accidents at nuclear power plants (NPPs): ARANO (Finland), CONDOR (UK), COSYMA (CEC), LENA (Sweden), MACCS (USA), and OSCAAR (Japan). In parallel with this effort, two separate groups performed similar calculations using the MACCS and COSYMA codes. The present report contains the results of the MACCS Users Group (Greece, Italy, Spain, and USA) calculations and their comparison. Version 1.5.11.1 of the MACCS code was used for the calculations. Good agreement was reached between the four participating calculations, with the exception of the ingestion pathway dose predictions. The scatter in those particular results is attributed mainly to the lack of a straightforward implementation of the specifications for agricultural production and countermeasure criteria provided for the exercise. The significantly smaller scatter in predictions of other consequences was successfully explained by differences in meteorological files and weather sampling, grids, rain distance intervals, dispersion model options, and population distributions.

  5. Application of the MACCS code to DOE production reactor operation

    Energy Technology Data Exchange (ETDEWEB)

O'Kula, K.R.; East, J.M. (Westinghouse Savannah River Co., Aiken, SC (United States))

    1991-01-01

    A three-level probabilistic risk assessment (PRA) of the special materials production reactor operation at the US Department of Energy's (DOE's) Savannah River site (SRS) has been completed. The goals of this analysis were to: (1) analyze existing margins of safety provided by the heavy water reactor (HWR) design challenged by postulated severe accidents; (2) compare measures of risk to the general public and on-site workers to guideline values, as well as to those posed by commercial reactor operation; and (3) develop the methodology and data base necessary to determine the equipment, human actions, and engineering systems that contribute significantly to ensuring overall plant safety. In particular, the third point provides the most tangible benefit of a PRA since the process yields a prioritized approach to increasing safety through design and operating practices. This paper describes key aspects of the consequence analysis portion of the SRS PRA: Given the radiological releases quantified through the level-2 PRA analysis, the consequences to the off-site general public and to the on-site SRS workforce are calculated. This analysis, the third level of the PRA, is conducted primarily with the MACCS 1.5 code. The level-3 PRA yields a probabilistic assessment of health and economic effects based on meteorological conditions sampled from site-specific data.

  6. Code manual for MACCS2: Volume 1, user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Chanin, D.I.; Young, M.L.

    1997-03-01

This report describes the use of the MACCS2 code. The document is primarily a user's guide, though some model description information is included. MACCS2 represents a major enhancement of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, distributed by government code centers since 1990, was developed to evaluate the impacts of severe accidents at nuclear power plants on the surrounding public. The principal phenomena considered are atmospheric transport and deposition under time-variant meteorology, short- and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. No other U.S. code that is publicly available at present offers all these capabilities. MACCS2 was developed as a general-purpose tool applicable to diverse reactor and nonreactor facilities licensed by the Nuclear Regulatory Commission or operated by the Department of Energy or the Department of Defense. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency-response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. Other improvements are in the areas of phenomenological modeling and new output options. Initial installation of the code, written in FORTRAN 77, requires a 486 or higher IBM-compatible PC with 8 MB of RAM.

  7. SNPs in the coding region of the metastasis-inducing gene MACC1 and clinical outcome in colorectal cancer

    Directory of Open Access Journals (Sweden)

Schmid, Felicitas

    2012-07-01

Background: Colorectal cancer is one of the main cancers in the Western world. About 90% of the deaths arise from the formation of distant metastases. The expression of the newly identified gene metastasis associated in colon cancer 1 (MACC1) is a prognostic indicator for colon cancer metastasis. Here, we analyzed for the first time the impact of single nucleotide polymorphisms (SNPs) in the coding region of MACC1 on the clinical outcome of colorectal cancer patients. Additionally, we screened the met proto-oncogene (Met), the transcriptional target gene of MACC1, for mutations. Methods: We sequenced the coding exons of MACC1 in 154 colorectal tumors (stages I, II and III) and the crucial exons of Met in 60 colorectal tumors (stages I, II and III). We analyzed the association of MACC1 polymorphisms with clinical data, including metachronous metastasis, UICC stage, tumor invasion, lymph node metastasis and patients' survival (n = 154, stages I, II and III). Furthermore, we performed biological assays to evaluate the functional impact of MACC1 SNPs on the motility of colorectal cancer cells. Results: We genotyped three MACC1 SNPs in the coding region. Thirteen percent of the tumors had a cg genotype (rs4721888, L31V), 48% a ct genotype (rs975263, S515L) and 84% a gc or cc genotype (rs3735615, R804T). We found no association of these SNPs with clinicopathological parameters or with patients' survival when analyzing the entire patient cohort. An increased risk for shorter metastasis-free survival of patients with a ct genotype (rs975263) was observed in younger colon cancer patients with stage I or II disease (P = 0.041, n = 18). In cell culture, MACC1 SNPs did not affect MACC1-induced cell motility and proliferation. Conclusion: The identification of coding MACC1 SNPs in primary colorectal tumors does not improve the prediction of metastasis formation or of patients' survival compared to MACC1 expression analysis alone. The ct genotype (rs

  8. MACCS2 development and verification efforts

    Energy Technology Data Exchange (ETDEWEB)

    Young, M.; Chanin, D.

    1997-03-01

MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.
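
    The table-lookup option for the plume expansion parameters described above amounts to interpolating user-supplied σy and σz values over incremental downwind distances. A minimal sketch of that idea in Python (the distances and sigma values below are invented for illustration, not MACCS2 input data):

```python
import bisect

# Hypothetical lookup table: downwind distance (m) vs. sigma-y and sigma-z (m),
# standing in for the user-supplied table described in the abstract above.
DIST = [100.0, 500.0, 1000.0, 5000.0]
SIG_Y = [8.0, 36.0, 68.0, 300.0]
SIG_Z = [5.0, 18.0, 32.0, 120.0]

def sigma_lookup(x, dist, table):
    """Linearly interpolate a plume-expansion parameter at downwind distance x,
    clamping to the table endpoints outside the tabulated range."""
    if x <= dist[0]:
        return table[0]
    if x >= dist[-1]:
        return table[-1]
    i = bisect.bisect_right(dist, x)
    frac = (x - dist[i - 1]) / (dist[i] - dist[i - 1])
    return table[i - 1] + frac * (table[i] - table[i - 1])

sig_y = sigma_lookup(750.0, DIST, SIG_Y)  # 52.0 m, halfway between 36 and 68
sig_z = sigma_lookup(750.0, DIST, SIG_Z)
```

    Linear interpolation between tabulated distances is one plausible choice; MACCS2's actual lookup scheme may differ.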

  9. Reevaluation of the emergency planning zone for nuclear power plants in Taiwan using MACCS2 code.

    Science.gov (United States)

    Wu, Jay; Yang, Yung-Muh; Chen, Ing-Jane; Chen, Huan-Tong; Chuang, Keh-Shih

    2006-04-01

According to government regulations, the emergency planning zone (EPZ) of a nuclear power plant (NPP) must be designated before operation and reevaluated every 5 years. Corresponding emergency response planning (ERP) has to be made in advance to guarantee that all necessary resources are available in the event of accidental releases of radioisotopes. In this study, the EPZ for each of the three operating NPPs in Taiwan, Chinshan, Kuosheng, and Maanshan, was reevaluated using the MELCOR Accident Consequence Code System 2 (MACCS2) developed by Sandia National Laboratories. Meteorological data around each nuclear power plant were collected during 2003. The source term data, including inventory, sensible heat content, and release timing, were based on previous PRA information for each plant. The effective dose equivalent and thyroid dose, together with the related individual risk and societal risk, were calculated. By comparing the results to the protective action guide and related safety criteria, EPZ radii of 1.5, 1.5, and 4.5 km were estimated for the Chinshan, Kuosheng, and Maanshan NPPs, respectively. We suggest that a radius of 5.0 km is a reasonably conservative EPZ for each of the three operating NPPs in Taiwan.
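
    The screening logic described above (compute dose versus distance, then compare against a protective action guide to pick an EPZ radius) can be sketched as follows; all numbers here are hypothetical, not the Taiwan results:

```python
# Illustrative EPZ screening: projected doses at ring midpoints (all values
# invented) are compared against a hypothetical protective action guide (PAG)
# to pick the zone radius.
PAG_SV = 0.01  # hypothetical early-phase PAG, Sv

# (radius_km, projected_dose_Sv) for concentric rings; dose falls with distance
rings = [(0.5, 0.120), (1.5, 0.035), (3.0, 0.012), (4.5, 0.008), (8.0, 0.003)]

def epz_radius(rings, pag):
    """Return the outermost ring radius whose projected dose exceeds the PAG,
    or 0.0 if no ring exceeds it."""
    exceeding = [radius for radius, dose in rings if dose > pag]
    return max(exceeding) if exceeding else 0.0

radius_km = epz_radius(rings, PAG_SV)  # -> 3.0
```

    A real evaluation would repeat this for many weather trials and risk measures before rounding the radius up to a conservative planning value.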

  10. Review of the chronic exposure pathways models in MACCS (MELCOR Accident Consequence Code System) and several other well-known probabilistic risk assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Tveten, U. (Institutt for Energiteknikk, Kjeller (Norway))

    1990-06-01

The purpose of this report is to document the results of the work performed by the author in connection with the following task, performed for the US Nuclear Regulatory Commission (USNRC), Office of Nuclear Regulatory Research, Division of Systems Research: MACCS Chronic Exposure Pathway Models: Review the chronic exposure pathway models implemented in the MELCOR Accident Consequence Code System (MACCS) and compare those models to the chronic exposure pathway models implemented in similar codes developed in OECD member countries. The chronic exposures concerned are via the terrestrial food pathways, the water pathways, the long-term groundshine pathway, and the inhalation of resuspended radionuclides pathway. The USNRC indicated during discussions of the task that the major effort should be spent on the terrestrial food pathways. There is one chapter for each of the categories of chronic exposure pathways listed above.

  11. Application of Korean Specific Data to Economic Cost Estimation by KOSCA-MACCS2

    Energy Technology Data Exchange (ETDEWEB)

Choi, Sun Yeong; Jang, Seung-Cheol [KAERI, Daejeon (Korea, Republic of)]

    2015-05-15

Default values for various data provided by MACCS2 (MELCOR Accident Consequence Code System Version 2), such as population, weather, food, and economic cost, are far from current domestic conditions. In the case of economic cost data, the default values came from MACCS and WASH-1400. KAERI (Korea Atomic Energy Research Institute) has developed a Korean-specific level 3 PSA (Probabilistic Safety Assessment) code package based on MACCS2 to reflect domestic conditions in off-site consequence analysis. To this end, we performed a study on the domestic technical issues for level 3 PSA, namely the dose conversion factor, food chain model, atmospheric dispersion model, and domestic-specific economic effect model. Based on this study, we developed a level 3 PSA code, KOSCA-MACCS2 (Korean-specific Off-Site Consequence Analysis based on MACCS2). The purpose of this paper is to introduce the economic cost variables provided by KOSCA-MACCS2 and the application of Korean-specific data to economic cost estimation with KOSCA-MACCS2. We considered data sources for the economic cost variables that reflect Korea-specific features, such as data from Statistics Korea and the Bank of Korea. For the decontamination-related variables, we drew on foreign literature, namely Extern-E and UNESCO Chernobyl Forum data. Based on these data sources, we estimated values for the input variables related to economic cost estimation.

  12. BENCHMARKING UPGRADED HOTSPOT DOSE CALCULATIONS AGAINST MACCS2 RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Brotherton, Kevin

    2009-04-30

The radiological consequence of interest for a documented safety analysis (DSA) is the centerline Total Effective Dose Equivalent (TEDE) incurred by the Maximally Exposed Offsite Individual (MOI) evaluated at the 95th percentile consequence level. An upgraded version of HotSpot (Version 2.07) has been developed with the capabilities to read site meteorological data and perform the necessary statistical calculations to determine the 95th percentile consequence result. These capabilities should allow HotSpot to join MACCS2 (Version 1.13.1) and GENII (Version 1.485) as radiological consequence toolbox codes in the Department of Energy (DOE) Safety Software Central Registry. Using the same meteorological data file, scenarios involving a one curie release of ²³⁹Pu were modeled in both HotSpot and MACCS2. Several sets of release conditions were modeled, and the results compared. In each case, input parameter specifications for each code were chosen to match one another as much as the codes would allow. The results from the two codes are in excellent agreement. Slight differences observed in results are explained by algorithm differences.
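
    The 95th percentile consequence statistic used in both codes can be illustrated with a simple nearest-rank percentile over a year of hourly dose results; the synthetic lognormal doses below stand in for actual HotSpot or MACCS2 output:

```python
import random

# Synthetic stand-in for a year of hourly dose results (8760 h); a lognormal
# spread loosely mimics meteorological variability, not any real site data.
random.seed(1)
hourly_doses = [random.lognormvariate(0.0, 1.0) for _ in range(8760)]

def percentile(values, p):
    """p-th percentile by the nearest-rank method."""
    ordered = sorted(values)
    rank = int(round(p / 100.0 * len(ordered)))
    return ordered[max(0, min(len(ordered) - 1, rank - 1))]

dose_95 = percentile(hourly_doses, 95.0)  # the 95th percentile consequence
```

    Nearest-rank is only one of several percentile conventions; the two codes may interpolate differently, which is one source of the "slight differences" the abstract mentions.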

  13. ALTERNATIVES OF MACCS2 IN LANL DISPERSION ANALYSIS FOR ONSITE AND OFFSITE DOSES

    Energy Technology Data Exchange (ETDEWEB)

Wang, John HC [Los Alamos National Laboratory]

    2012-05-01

    In modeling atmospheric dispersion to determine accidental release of radiological material, one of the common statistical analysis tools used at Los Alamos National Laboratory (LANL) is MELCOR Accident Consequence Code System, Version 2 (MACCS2). MACCS2, however, has some limitations and shortfalls for both onsite and offsite applications. Alternative computer codes, which could provide more realistic calculations, are being investigated for use at LANL. In the Yucca Mountain Project (YMP), the suitability of MACCS2 for the calculation of onsite worker doses was a concern; therefore, ARCON96 was chosen to replace MACCS2. YMP's use of ARCON96 provided results which clearly demonstrated the program's merit for onsite worker safety analyses in a wide range of complex configurations and scenarios. For offsite public exposures, the conservatism of MACCS2 on the treatment of turbulence phenomena at LANL is examined in this paper. The results show a factor of at least two conservatism in calculated public doses. The new EPA air quality model, AERMOD, which implements advanced meteorological turbulence calculations, is a good candidate for LANL applications to provide more confidence in the accuracy of offsite public dose projections.
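
    For context, the straight-line Gaussian plume model that underlies MACCS2's dispersion treatment has, at ground level on the plume centerline, the familiar closed-form dilution factor χ/Q sketched below; the parameter values are illustrative only:

```python
import math

# Ground-level, plume-centerline dilution factor chi/Q (s/m^3) for a
# straight-line Gaussian plume; all parameter values below are illustrative.
def chi_over_q(u, sigma_y, sigma_z, h):
    """chi/Q at ground level on the centerline for release height h (m),
    wind speed u (m/s), and dispersion parameters sigma_y, sigma_z (m)."""
    return math.exp(-h ** 2 / (2.0 * sigma_z ** 2)) / (math.pi * u * sigma_y * sigma_z)

# Ground release: no plume-height attenuation, maximum centerline value.
x_q = chi_over_q(u=2.0, sigma_y=68.0, sigma_z=32.0, h=0.0)
```

    Models such as AERMOD replace the fixed sigma curves implicit here with turbulence-based dispersion, which is the refinement the paper argues for.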

  14. A comparative study of worker and general public risks from nuclear facility operation using MACCS2

    Energy Technology Data Exchange (ETDEWEB)

East, J.M.; O'Kula, K.R. [Westinghouse Savannah River Co., Aiken, SC (United States)]

    1995-12-31

Over the last five years, the US Department of Energy (DOE) has attempted to establish quantitative risk indices as minimum acceptance criteria for assurance of safe operation of its nuclear facilities. The risk indices serve as aiming points or targets that take into account all aspects of operation: normal conditions as well as abnormal, design basis, and beyond-design-basis events. Although the initial focus of these safety targets was on DOE's reactors, more recent assessments have also considered non-reactor facilities, including those encompassing storage and nuclear processing activities. Regardless of the facility's primary function, accident progression models, event tree/fault tree logic models, and a probabilistic (dose) consequence assessment model must be implemented to yield a fully integrated analysis of facility operation. The primary tool for probabilistic consequence assessment in the U.S. is the MELCOR Accident Consequence Code System (MACCS). In this study, two versions of MACCS are applied to representative source terms developed in the safety analysis of a waste processing facility at the Westinghouse Savannah River Company's (WSRC's) Savannah River Site (SRS). The MACCS versions are used to estimate population dose and subsequent health effects to workers and the general public from operation of the SRS reference facility. When combined with the frequency-of-occurrence evaluation, the margin of compliance with the safety targets may be quantified. Additionally, numerical evaluation of the safety targets with the two code versions serves as an indicator of the impact of the enhancements made to MACCS relative to the earlier baseline software.

  15. Tandem Mirror Reactor Systems Code (Version I)

    Energy Technology Data Exchange (ETDEWEB)

    Reid, R.L.; Finn, P.A.; Gohar, M.Y.; Barrett, R.J.; Gorker, G.E.; Spampinaton, P.T.; Bulmer, R.H.; Dorn, D.W.; Perkins, L.J.; Ghose, S.

    1985-09-01

A computer code was developed to model a Tandem Mirror Reactor. This is the first Tandem Mirror Reactor model to couple, in detail, the highly linked physics, magnetics, and neutronic analysis into a single code. This report describes the code architecture, provides a summary description of the modules comprising the code, and includes an example execution of the Tandem Mirror Reactor Systems Code. Results from this code for two sensitivity studies are also included. These studies are: (1) to determine the impact of center cell plasma radius, length, and ion temperature on reactor cost and performance at constant fusion power; and (2) to determine the impact of reactor power level on cost.

  16. A comparative study of worker and general public risks from nuclear facility operation using MACCS2

    Energy Technology Data Exchange (ETDEWEB)

East, J.M.; O'Kula, K.R.

    1994-10-01

Over the last five years, the US Department of Energy (DOE) has attempted to establish quantitative risk indices as minimum acceptance criteria for assurance of safe operation of its nuclear facilities. The risk indices serve as aiming points or targets that take into account all aspects of operation: normal conditions as well as abnormal, design basis, and beyond-design-basis events. Although the initial focus of these safety targets was on DOE's reactors, more recent assessments have also considered non-reactor facilities, including those encompassing storage and nuclear processing activities. Regardless of the facility's primary function, accident progression models, event tree/fault tree logic models, and a probabilistic (dose) consequence assessment model must be implemented to yield a fully integrated analysis of facility operation. The primary tool for probabilistic consequence assessment in the US is the MELCOR Accident Consequence Code System (MACCS). In this study, two versions of MACCS are applied to representative source terms developed in the safety analysis of a waste processing facility at the Westinghouse Savannah River Company's (WSRC's) Savannah River Site (SRS). The MACCS versions are used to estimate population dose and subsequent health effects to workers and the general public from operation of the SRS reference facility. When combined with the frequency-of-occurrence evaluation, the margin of compliance with the safety targets may be quantified.

  17. Version-Centric Visualization of Code Evolution

    NARCIS (Netherlands)

    Voinea, S.L.; Telea, A.; Chaudron, M.

    2005-01-01

    The source code of software systems changes many times during the system lifecycle. We study how developers can get insight in these changes in order to understand the project context and the product artifacts. For this we propose new techniques for code evolution representation and visualization in

  18. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    Energy Technology Data Exchange (ETDEWEB)

Smith, F.; Flach, G.; Brown, K.

    2013-06-01

    This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing to standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) Improved graphical display of model results. 2) Improved error analysis and reporting. 3) Increase in the default maximum model mesh size from 301 to 501 nodes. 4) The ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  19. Fast Scattering Code (FSC) User's Manual: Version 2

    Science.gov (United States)

    Tinetti, Ana F.; Dun, M. H.; Pope, D. Stuart

    2006-01-01

    The Fast Scattering Code (version 2.0) is a computer program for predicting the three-dimensional scattered acoustic field produced by the interaction of known, time-harmonic, incident sound with aerostructures in the presence of potential background flow. The FSC has been developed for use as an aeroacoustic analysis tool for assessing global effects on noise radiation and scattering caused by changes in configuration (geometry, component placement) and operating conditions (background flow, excitation frequency).

  20. An improved version of the MICROX-2 code

    Energy Technology Data Exchange (ETDEWEB)

Mathews, D. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)]

    1997-11-01

The MICROX-2 code prepares broad group neutron cross sections for use in diffusion- and/or transport-theory codes from an input library of fine group and pointwise cross sections. The neutron weighting spectrum is obtained by solving the B₁ neutron balance equations at about 10000 energies in a one-dimensional (planar, spherical or cylindrical), two-region unit cell. The regions are coupled by collision probabilities based upon spatially flat neutron emission. Energy dependent Dancoff factors and bucklings correct the one-dimensional calculations for multi-dimensional lattice effects. A critical buckling search option is also included. The inner region may include two different types of fuel particles (grains). This report describes the present PSI FORTRAN 90 version of the MICROX-2 code which operates on CRAY computers and IBM PCs. The equations which are solved in the various energy ranges are given along with descriptions of various changes that have been made in the present PSI version of the code. A completely re-written description of the user input is also included. (author) 7 figs., 4 tabs., 59 refs.
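
    The broad-group preparation MICROX-2 performs rests on flux-weighted condensation of fine-group cross sections. A toy version of that collapse, with invented numbers and none of MICROX-2's unit-cell coupling detail:

```python
# Flux-weighted group collapse of the kind MICROX-2 performs: fine-group
# cross sections are condensed to broad groups using the weighting spectrum.
def collapse(sigma_fine, flux_fine, groups):
    """Collapse fine-group cross sections into broad groups.
    groups: list of (start, end) index ranges into the fine-group arrays."""
    broad = []
    for start, end in groups:
        phi = sum(flux_fine[start:end])
        weighted = sum(s * f for s, f in zip(sigma_fine[start:end],
                                            flux_fine[start:end]))
        broad.append(weighted / phi)
    return broad

sigma_fine = [10.0, 8.0, 4.0, 2.0]   # illustrative fine-group cross sections
flux_fine = [1.0, 3.0, 2.0, 2.0]     # illustrative weighting spectrum
broad = collapse(sigma_fine, flux_fine, [(0, 2), (2, 4)])  # -> [8.5, 3.0]
```

    In the real code the weighting flux comes from the B₁ balance solution at about 10000 energies rather than being supplied directly.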

  21. Input-output model for MACCS nuclear accident impacts estimation

    Energy Technology Data Exchange (ETDEWEB)

Outkin, Alexander V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Bixler, Nathan E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Vargas, Vanessa N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2015-01-27

Since the original economic model for MACCS was developed, better-quality economic data (as well as the tools to gather and process them) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output based economic impact estimation. This paper presents an updated MACCS model, based on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domestic product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.
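
    Input-Output estimation of this kind propagates direct sectoral losses through inter-industry linkages via the Leontief inverse, x = (I - A)⁻¹ d. A two-sector toy example (coefficients and losses invented, and far simpler than REAcct) is:

```python
# Toy Leontief input-output calculation: total (direct + indirect) loss x
# solves (I - A) x = d for direct-loss vector d and technical-coefficient
# matrix A. Solved here by Cramer's rule for the 2x2 case.
def total_loss(a, d):
    """Solve (I - A) x = d for a 2x2 technical-coefficient matrix A."""
    m = [[1.0 - a[0][0], -a[0][1]],
         [-a[1][0], 1.0 - a[1][1]]]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    x0 = (m[1][1] * d[0] - m[0][1] * d[1]) / det
    x1 = (m[0][0] * d[1] - m[1][0] * d[0]) / det
    return [x0, x1]

A = [[0.2, 0.1],            # hypothetical inter-industry coefficients
     [0.3, 0.4]]
direct = [100.0, 50.0]      # hypothetical direct GDP loss by sector ($M)
total = total_loss(A, direct)  # total loss exceeds the direct loss
```

    The indirect amplification (here, total losses roughly double the direct ones) is exactly the business-disruption effect the updated MACCS economic model is meant to capture.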

  22. Comparison of the MACCS2 atmospheric transport model with Lagrangian puff models as applied to deterministic and probabilistic safety analysis.

    Science.gov (United States)

    Till, John E; Rood, Arthur S; Garzon, Caroline D; Lagdon, Richard H

    2014-09-01

The suitability of a new facility in terms of potential impacts from routine and accidental releases is typically evaluated using conservative models and assumptions to assure dose standards are not exceeded. However, overly conservative dose estimates that exceed target doses can result in unnecessary and costly facility design changes. This paper examines one such case involving the U.S. Department of Energy's pretreatment facility of the Waste Treatment and Immobilization Plant (WTP). The MELCOR Accident Consequence Code System Version 2 (MACCS2) was run using conservative parameter values in prescribed guidance to demonstrate that the dose from a postulated airborne release would not exceed the guideline dose of 0.25 Sv. External review of default model parameters identified the deposition velocity of 1.0 cm s-1 as being non-conservative. The deposition velocity calculated using resistance models was in the range of 0.1 to 0.3 cm s-1. A value of 0.1 cm s-1 would result in the dose guideline being exceeded. To test the overall conservatism of the MACCS2 transport model, the 95th percentile hourly average dispersion factor based on one year of meteorological data was compared to dispersion factors generated from two state-of-the-art Lagrangian puff models. The 95th percentile dispersion factor from MACCS2 was a factor of 3 to 6 higher than those of the Lagrangian puff models at a distance of 9.3 km and a deposition velocity of 0.1 cm s-1. Thus, the inherent conservatism in MACCS2 more than compensated for the high deposition velocity used in the assessment. Applications of models like MACCS2 with a conservative set of parameters are essentially screening calculations, and failure to meet dose criteria should not trigger facility design changes but rather prompt a more in-depth analysis using probabilistic methods with a defined margin of safety in the target dose. A sample application of the probabilistic approach is provided.
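
    Resistance models of the kind mentioned above typically express the dry deposition velocity as the reciprocal of aerodynamic, quasi-laminar, and surface resistances in series, v_d = 1/(r_a + r_b + r_c). A minimal sketch, with purely illustrative resistance values:

```python
# Dry deposition velocity from a series-resistance model: aerodynamic (r_a),
# quasi-laminar sublayer (r_b), and surface (r_c) resistances, all in s/m.
# The resistance values below are invented for illustration.
def deposition_velocity(r_a, r_b, r_c):
    """Dry deposition velocity (m/s) from three resistances (s/m) in series."""
    return 1.0 / (r_a + r_b + r_c)

v_d = deposition_velocity(r_a=50.0, r_b=300.0, r_c=650.0)  # m/s
v_d_cm_s = v_d * 100.0  # about 0.1 cm/s, within the 0.1-0.3 cm/s range cited
```

    Smaller deposition velocities deplete the plume less, leaving more activity airborne at the receptor, which is why lowering v_d from 1.0 to 0.1 cm/s raises the calculated dose.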

  3. Study on the code system for the off-site consequences assessment of severe nuclear accident

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sora; Mn, Byung Il; Park, Ki Hyun; Yang, Byung Mo; Suh, Kyung Suk [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-12-15

The importance of severe nuclear accidents and probabilistic safety assessment (PSA) was brought to international attention by the severe nuclear accident caused by the extreme natural disaster at the Fukushima Daiichi nuclear power plant in Japan. In Korea, studies on Level 3 PSA had made little progress until recently. The code systems for Level 3 PSA, MACCS2 (MELCOR Accident Consequence Code System 2, US), COSYMA (COde SYstem from MAria, EU) and OSCAAR (Off-Site Consequence Analysis code for Atmospheric Releases in reactor accidents, Japan), were reviewed in this study, and the disadvantages and limitations of MACCS2 were also analyzed. Experts from Korea and abroad pointed out the following limitations of MACCS2: it cannot simulate multi-unit accidents or releases from spent fuel pools, and its atmospheric dispersion is based on a simple Gaussian plume model. Some of these limitations have been addressed in updated versions of MACCS2. The absence of marine and aquatic dispersion models and the limited scope of the food-chain and economic models are also important aspects that need to be improved. This paper is expected to serve as basic research material for developing a Korean code system for assessing the off-site consequences of severe nuclear accidents.

  4. Version 3.0 of code Java for 3D simulation of the CCA model

    Science.gov (United States)

    Zhang, Kebo; Zuo, Junsen; Dou, Yifeng; Li, Chao; Xiong, Hailing

    2016-10-01

In this paper we provide a new version of the program that replaces the previous one. The frequency of traversing the clusters list was reduced, and some code blocks were optimized; in addition, we appended and revised the source-code comments for several methods and attributes. Comparative experiments show that the new version has better time efficiency than the previous one.
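The kind of optimization the abstract describes, reducing how often the clusters list is traversed, can be sketched generically. The toy cluster-cluster-aggregation loop below is a hypothetical illustration, not taken from the published Java source: the swap-pop idiom removes a merged cluster in O(1) instead of rescanning or compacting the list after each merge.

```python
import random

def aggregate(n, seed=0):
    """Toy cluster-cluster aggregation: repeatedly merge two randomly
    chosen clusters until one remains. Swap-pop removal keeps the
    clusters list compact without re-traversing it after each merge."""
    rng = random.Random(seed)
    clusters = [[i] for i in range(n)]
    while len(clusters) > 1:
        i, j = rng.sample(range(len(clusters)), 2)
        if i > j:
            i, j = j, i
        clusters[i].extend(clusters[j])  # merge cluster j into cluster i
        clusters[j] = clusters[-1]       # O(1) removal: overwrite j with the last entry
        clusters.pop()                   # ...then drop the (now duplicated) tail
    return clusters[0]
```

Every particle survives the merging, so the final cluster is a permutation of the initial particle indices.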

  5. Estimation of Intervention Distances for Urgent Protective Actions Using Comparative Approach of MACCS and InterRAS

    OpenAIRE

    Mazzammal Hussain; Salah Ud-Din Khan; Waqar A. Adil Syed; Shahab Ud-Din Khan

    2014-01-01

    Distances for taking evacuation as a protective measure during early phase of a nuclear accident have been approximated using MELCOR Accident Consequence Code System (MACCS). As a reference data, the source term of Pakistan Research Reactor 1 (PARR-1) and meteorological data of Islamabad, Pakistan, have been considered. Based on comparison with published data and international radiological assessment (InterRAS) code results, it is concluded that MACCS is a rational tool for estimation of urge...

  6. A Web Server for MACCS Magnetometer Data

    Science.gov (United States)

    Engebretson, Mark J.

    1998-01-01

    NASA Grant NAG5-3719 was provided to Augsburg College to support the development of a web server for the Magnetometer Array for Cusp and Cleft Studies (MACCS), a two-dimensional array of fluxgate magnetometers located at cusp latitudes in Arctic Canada. MACCS was developed as part of the National Science Foundation's GEM (Geospace Environment Modeling) Program, which was designed in part to complement NASA's Global Geospace Science programs during the decade of the 1990s. This report describes the successful use of these grant funds to support a working web page that provides both daily plots and file access to any user accessing the worldwide web. The MACCS home page can be accessed at http://space.augsburg.edu/space/MaccsHome.html.

  7. Establishment of Infrastructure for Domestic-Specific Level 3 PSA based on MACCS2

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seung-Cheol; Han, Seok-Jung; Choi, Sun-Yeong; Lee, Seung-Jun [KAERI, Daejeon (Korea, Republic of); Kim, Wan-Seob [Korea Reliability Technology and System, Daejeon (Korea, Republic of)

    2015-05-15

Research activities related to Level 3 PSA have gradually disappeared since the adoption of risk surrogates. Recently, Level 3 PSA was performed only to the extent required for the operating license of a plant under construction. Since the Fukushima accident, interest in a comprehensive site-specific Level 3 PSA has grown for some compelling reasons, especially the evaluation of the domestic multi-unit site risk effect, including other on-site radiological sources (e.g., spent fuel pools, multiple units). Unfortunately, there is no domestic-specific consequence analysis code or input database required to perform a site-specific Level 3 PSA. This paper focuses on the development of an input data management system for domestic-specific Level 3 PSA based on MACCS2 (MELCOR Accident Consequence Code System), which the authors call KOSCA-MACCS2 (Korea Off-Site Consequence Analysis based on MACCS2). It serves as an integrated platform for a domestic-specific Level 3 PSA and provides pre-processing modules that automatically generate MACCS2 input from diverse types of domestic-specific data, including numerical map data such as meteorological data, numerical population maps, digital land-use maps and economic statistics. Some functions still need to be developed and added, e.g., a post-processing module to convert MACCS2 outputs to graphic report forms. Henceforth, it is necessary to develop a Korean-specific Level 3 PSA code as a substitute for the foreign software MACCS2.

  8. The LIONS code (version 1.0); Programme LIONS (version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Bertrand, P.

    1993-12-31

The new LIONS code (Lancement d'IONS, or Ion Launching), a dynamical code implemented in the SPIRaL project for the CIME cyclotron studies, is presented. The software involves a 3D magnetostatic code, 2D and 3D electrostatic codes for the generation of realistic field maps, and several dynamical codes for studying the behaviour of the reference particle from the cyclotron center up to ejection and for launching particle packets complying with given correlations. Its interactions with the other codes are described. The LIONS code, written in Fortran 90, is already used in studying the CIME cyclotron from the center to ejection. It is designed to be used, with minor modifications, in other contexts, such as the simulation of mass spectrometer facilities.

  9. VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system. Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Shapiro, A.; Huria, H.C.; Cho, K.W. [Cincinnati Univ., OH (United States)

    1991-12-01

VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system that will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules, are discussed. The PC code structure is also given. Version 2 included several enhancements not present in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems that allow sufficient memory for such in-core iterations; this speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data are now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation distributed with the code, to assist in developing the input for the Input Processor.
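The flux iteration that VENTURE/PC performs (in core, when memory allows) is a power iteration on the multigroup diffusion equations. A minimal one-group, one-dimensional slab analogue is sketched below; the cross sections and mesh are illustrative values chosen for this sketch, not benchmark data or VENTURE input.

```python
def thomas(sub, diag, sup, rhs):
    """Solve a tridiagonal linear system by the Thomas algorithm."""
    n = len(rhs)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = sup[0] / diag[0], rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / m
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def k_eff_slab(width=100.0, n=49, d_coef=1.0, sig_a=0.07, nu_sig_f=0.08, iters=500):
    """One-group, 1-D slab diffusion eigenvalue by power (fission-source)
    iteration with zero-flux boundaries: solve A*phi_new = F*phi/k and
    update k from the ratio of successive fission sources."""
    dx = width / (n + 1)
    off = -d_coef / dx ** 2
    sub, sup = [off] * n, [off] * n
    diag = [2.0 * d_coef / dx ** 2 + sig_a] * n
    phi, k = [1.0] * n, 1.0
    for _ in range(iters):
        rhs = [nu_sig_f * p / k for p in phi]
        phi_new = thomas(sub, diag, sup, rhs)
        k *= sum(phi_new) / sum(phi)   # nu_sig_f is constant, so it cancels
        phi = phi_new
    return k
```

For these assumed data the converged k agrees with the analytic one-group result nu*Sig_f / (Sig_a + D*B^2) for the discrete buckling of the mesh.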

  10. Tracking code patterns over multiple software versions with Herodotos

    DEFF Research Database (Denmark)

    Palix, Nicolas Jean-Michel; Lawall, Julia; Muller, Gilles

    2010-01-01

It is useful to study the occurrences of such code patterns, to identify properties such as when and why they are introduced, how long they persist, and the reasons why they are corrected. To enable studying pattern occurrences over time, we propose a tool, Herodotos, that semi-automatically tracks pattern occurrences over multiple versions of a software project, independent of other changes in the source files. Guided by a user-provided configuration file, Herodotos builds various graphs showing the evolution of the pattern occurrences and computes some statistics. We have evaluated this approach...
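The tracking described here can be sketched in miniature: given an ordered series of source versions and a pattern, report per-version occurrence counts and the versions spanning the pattern's lifespan. This is a simplified stand-in for illustration, not the Herodotos implementation (which correlates individual occurrences across diffs); the version tags and the lock-usage pattern are invented.

```python
import re

def track_occurrences(versions, pattern):
    """versions: ordered list of (tag, source_text) pairs.
    Returns the per-version occurrence count of `pattern` and the
    (first, last) version tags in which it appears, or None if never."""
    counts = {tag: len(re.findall(pattern, text)) for tag, text in versions}
    present = [tag for tag, _ in versions if counts[tag]]
    lifespan = (present[0], present[-1]) if present else None
    return counts, lifespan

# Hypothetical history: a bare lock() call appears, multiplies, then is refactored away.
versions = [("v1", "lock(); use(x); unlock();"),
            ("v2", "lock(); use(x); unlock(); lock(); unlock();"),
            ("v3", "with_lock(use, x);")]
counts, lifespan = track_occurrences(versions, r"\block\(\)")
```

The word boundary in the pattern keeps `unlock()` and `with_lock(...)` from being counted as occurrences of `lock()`.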

  11. SITA version 0. A simulation and code testing assistant for TOUGH2 and MARNIE

    Energy Technology Data Exchange (ETDEWEB)

    Seher, Holger; Navarro, Martin

    2016-06-15

High quality standards have to be met by the numerical codes that are applied in long-term safety assessments for deep geological repositories for radioactive waste. The software environment SITA ('a simulation and code testing assistant for TOUGH2 and MARNIE') has been developed by GRS in order to perform automated regression testing for the flow and transport simulators TOUGH2 and MARNIE, which GRS uses to assess the performance of deep geological repositories for radioactive waste. With SITA, simulation results of TOUGH2 and MARNIE can be compared to analytical solutions and to simulation results of other code versions. SITA uses data interfaces to operate with codes whose input and output depend on the code version. The present report is part of a wider GRS programme to assure and improve the quality of TOUGH2 and MARNIE. It addresses users as well as administrators of SITA.
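The core of such regression testing, comparing a simulator's output against an analytical solution within tolerances, can be sketched as follows. The decay problem standing in for a "code result" and the tolerance values are illustrative assumptions, not taken from SITA.

```python
import math

def euler_decay(n0, lam, dt, steps):
    """Explicit-Euler integration of dN/dt = -lam*N (stand-in for a code result)."""
    n, out = n0, []
    for _ in range(steps):
        n += dt * (-lam * n)
        out.append(n)
    return out

def regression_check(computed, reference, rtol=1e-2, atol=1e-12):
    """List the sample indices where the code result deviates from the
    reference beyond a combined absolute/relative tolerance."""
    return [(i, c, r) for i, (c, r) in enumerate(zip(computed, reference))
            if abs(c - r) > atol + rtol * abs(r)]

dt, lam, steps = 0.01, 1.0, 100
computed = euler_decay(1.0, lam, dt, steps)
analytic = [math.exp(-lam * dt * (i + 1)) for i in range(steps)]
failures = regression_check(computed, analytic)
```

Tightening `rtol` flags the accumulating first-order Euler error, which is exactly the kind of drift between code versions such a harness is meant to surface.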

  12. Development of environmental dose assessment system (EDAS) code of PC version

    CERN Document Server

    Taki, M; Kobayashi, H; Yamaguchi, T

    2003-01-01

A computer code (EDAS) was developed to assess the public dose for the safety assessment required to license nuclear reactor operation. The code system is used for the safety analysis of the public around a nuclear reactor in normal operation and severe accidents. The code was revised and composed for personal computer users according to the Nuclear Safety Guidelines reflecting the ICRP 1990 recommendation. These guidelines, revised by the Nuclear Safety Commission in March 2001, are 'Weather analysis guideline for the safety assessment of nuclear power reactor', 'Public dose around the facility assessment guideline corresponding to the objective value for nuclear power light water reactor' and 'Public dose assessment guideline for safety review of nuclear power light water reactor'. The code has already been opened to public users by JAERI, and an English version of the code and user manual are also prepared. This English version code is helpful for international cooperation concerning the nuclear safety assessme...

  13. Estimation of Intervention Distances for Urgent Protective Actions Using Comparative Approach of MACCS and InterRAS

    Directory of Open Access Journals (Sweden)

    Mazzammal Hussain

    2014-01-01

Distances for taking evacuation as a protective measure during the early phase of a nuclear accident have been approximated using the MELCOR Accident Consequence Code System (MACCS). As reference data, the source term of Pakistan Research Reactor 1 (PARR-1) and meteorological data of Islamabad, Pakistan, have been considered. Based on comparison with published data and International Radiological Assessment (InterRAS) code results, it is concluded that MACCS is a rational tool for estimation of urgent protective actions during the early phase of a nuclear accident, taking into account the variations in meteorological and release concentration parameters.

  14. Global Reactive Gases in the MACC project

    Science.gov (United States)

    Schultz, M. G.

    2012-04-01

In preparation for the planned atmospheric service component of the European Global Monitoring for Environment and Security (GMES) initiative, the EU FP7 project Monitoring of Atmospheric Composition and Climate (MACC) developed a preoperational data assimilation and modelling system for monitoring and forecasting of reactive gases, greenhouse gases and aerosols. The project is coordinated by the European Centre for Medium-Range Weather Forecasts (ECMWF), and the system is built on ECMWF's Integrated Forecasting System (IFS), which has been coupled to the chemistry transport models MOZART-3 and TM5. In order to provide daily forecasts of up to 96 hours for global reactive gases, various satellite retrieval products for ozone (total column and profile data), CO, NO2, CH2O and SO2 are either actively assimilated or passively monitored. The MACC system is routinely evaluated with in-situ data from ground-based stations, ozone sondes and aircraft measurements, and with independent satellite retrievals. Global MACC reactive gas forecasts are used in the planning and analysis of large international field campaigns and to provide dynamical chemical boundary conditions to regional air quality models worldwide. Several case studies of outstanding air pollution events have been performed, and they demonstrate the strengths and weaknesses of chemical data assimilation based on current satellite data products. Besides the regular analyses and forecasts of the tropospheric chemical composition, the MACC system is also used to monitor the evolution of stratospheric ozone. A comprehensive reanalysis simulation from 2003 to 2010 provides new insights into the interannual variability of the atmospheric chemical composition.

  15. ORIGEN2: a revised and updated version of the Oak Ridge isotope generation and depletion code

    Energy Technology Data Exchange (ETDEWEB)

    Croff, A.G.

    1980-07-01

    ORIGEN2 is a versatile point depletion and decay computer code for use in simulating nuclear fuel cycles and calculating the nuclide compositions of materials contained therein. This code represents a revision and update of the original ORIGEN computer code which has been distributed world-wide beginning in the early 1970s. The purpose of this report is to give a summary description of a revised and updated version of the original ORIGEN computer code, which has been designated ORIGEN2. A detailed description of the computer code ORIGEN2 is presented. The methods used by ORIGEN2 to solve the nuclear depletion and decay equations are included. Input information necessary to use ORIGEN2 that has not been documented in supporting reports is documented.
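The decay part of the equations ORIGEN2 solves can be illustrated with the analytic Bateman solution for a two-member chain; ORIGEN2 itself handles full chains of many nuclides with matrix methods, and the decay constants below are arbitrary illustrative values, not nuclear data.

```python
import math

def bateman_pair(n0, lam1, lam2, t):
    """Analytic Bateman solution for the chain N1 -> N2 -> (removed),
    with N1(0) = n0 and N2(0) = 0. Requires lam1 != lam2."""
    n1 = n0 * math.exp(-lam1 * t)
    n2 = n0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Parent decaying faster than daughter: daughter builds in, then decays away.
n1, n2 = bateman_pair(1.0, 0.1, 0.05, 10.0)
```

A crude forward-Euler integration of the same pair of ODEs reproduces these values to within its step-size error, which is a handy consistency check on either implementation.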

  16. Regional Atmospheric Transport Code for Hanford Emission Tracking, Version 2(RATCHET2)

    Energy Technology Data Exchange (ETDEWEB)

    Ramsdell, James V.; Rishel, Jeremy P.

    2006-07-01

    This manual describes the atmospheric model and computer code for the Atmospheric Transport Module within SAC. The Atmospheric Transport Module, called RATCHET2, calculates the time-integrated air concentration and surface deposition of airborne contaminants to the soil. The RATCHET2 code is an adaptation of the Regional Atmospheric Transport Code for Hanford Emissions Tracking (RATCHET). The original RATCHET code was developed to perform the atmospheric transport for the Hanford Environmental Dose Reconstruction Project. Fundamentally, the two sets of codes are identical; no capabilities have been deleted from the original version of RATCHET. Most modifications are generally limited to revision of the run-specification file to streamline the simulation process for SAC.

  17. ITS Version 6 : the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    Energy Technology Data Exchange (ETDEWEB)

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2008-04-01

ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear, time-independent, coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend their capabilities to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes and (2) conversion to Fortran 90. The general user-friendliness of the software has been enhanced through memory allocation, reducing the need for users to modify and recompile the code.
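The Monte Carlo transport idea underlying ITS can be shown with a deliberately tiny example: estimating uncollided photon transmission through a slab by sampling exponential free paths (single energy, absorption only, none of the electron/photon cascade physics ITS models) and comparing against the analytic Beer-Lambert result. All parameter values here are illustrative.

```python
import math
import random

def mc_transmission(mu, thickness, n_hist=100_000, seed=1):
    """Fraction of photons crossing a purely absorbing slab without
    interacting: sample free paths s = -ln(xi)/mu and count s > thickness."""
    rng = random.Random(seed)
    passed = sum(1 for _ in range(n_hist)
                 if -math.log(1.0 - rng.random()) / mu > thickness)
    return passed / n_hist

est = mc_transmission(mu=0.2, thickness=5.0)
exact = math.exp(-0.2 * 5.0)  # analytic uncollided transmission
```

With 100,000 histories the statistical uncertainty of the estimate is a few parts per thousand, so it should agree with the analytic value to well under a percent.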

  18. Development of environmental dose assessment system (EDAS) code of PC version

    Energy Technology Data Exchange (ETDEWEB)

    Taki, Mitsumasa; Kikuchi, Masamitsu; Kobayashi, Hideo; Yamaguchi, Takenori [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-05-01

A computer code (EDAS) was developed to assess the public dose for the safety assessment required to license nuclear reactor operation. The code system is used for the safety analysis of the public around a nuclear reactor in normal operation and severe accidents. The code was revised and composed for personal computer users according to the Nuclear Safety Guidelines reflecting the ICRP 1990 recommendation. These guidelines, revised by the Nuclear Safety Commission in March 2001, are 'Weather analysis guideline for the safety assessment of nuclear power reactor', 'Public dose around the facility assessment guideline corresponding to the objective value for nuclear power light water reactor' and 'Public dose assessment guideline for safety review of nuclear power light water reactor'. The code has already been opened to public users by JAERI, and an English version of the code and user manual are also prepared. The English version is helpful for international cooperation with JAERI concerning nuclear safety assessment. (author)

  19. User's Guide for TOUGH2-MP - A Massively Parallel Version of the TOUGH2 Code

    Energy Technology Data Exchange (ETDEWEB)

    Earth Sciences Division; Zhang, Keni; Zhang, Keni; Wu, Yu-Shu; Pruess, Karsten

    2008-05-27

    TOUGH2-MP is a massively parallel (MP) version of the TOUGH2 code, designed for computationally efficient parallel simulation of isothermal and nonisothermal flows of multicomponent, multiphase fluids in one, two, and three-dimensional porous and fractured media. In recent years, computational requirements have become increasingly intensive in large or highly nonlinear problems for applications in areas such as radioactive waste disposal, CO2 geological sequestration, environmental assessment and remediation, reservoir engineering, and groundwater hydrology. The primary objective of developing the parallel-simulation capability is to significantly improve the computational performance of the TOUGH2 family of codes. The particular goal for the parallel simulator is to achieve orders-of-magnitude improvement in computational time for models with ever-increasing complexity. TOUGH2-MP is designed to perform parallel simulation on multi-CPU computational platforms. An earlier version of TOUGH2-MP (V1.0) was based on the TOUGH2 Version 1.4 with EOS3, EOS9, and T2R3D modules, a software previously qualified for applications in the Yucca Mountain project, and was designed for execution on CRAY T3E and IBM SP supercomputers. The current version of TOUGH2-MP (V2.0) includes all fluid property modules of the standard version TOUGH2 V2.0. It provides computationally efficient capabilities using supercomputers, Linux clusters, or multi-core PCs, and also offers many user-friendly features. The parallel simulator inherits all process capabilities from V2.0 together with additional capabilities for handling fractured media from V1.4. 
This report provides a quick starting guide on how to set up and run the TOUGH2-MP program for users with a basic knowledge of running the standard version of the TOUGH2 code. The report also gives a brief technical description of the code, including a discussion of the parallel methodology, the code structure, and the mathematical and numerical methods used.

  20. A generic method for automatic translation between input models for different versions of simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Serfontein, Dawid E., E-mail: Dawid.Serfontein@nwu.ac.za [School of Mechanical and Nuclear Engineering, North West University (PUK-Campus), PRIVATE BAG X6001 (Internal Post Box 360), Potchefstroom 2520 (South Africa); Mulder, Eben J. [School of Mechanical and Nuclear Engineering, North West University (South Africa); Reitsma, Frederik [Calvera Consultants (South Africa)

    2014-05-01

A computer code was developed for the semi-automatic translation of input models for the VSOP-A diffusion neutronics simulation code to the format of the newer VSOP 99/05 code. In this paper, this algorithm is presented as a generic method for producing codes for the automatic translation of input models from the format of one code version to another, or even to that of a completely different code. Normally, such translations are done manually. However, input model files, such as those for the VSOP codes, are often very large and may consist of many thousands of numeric entries that make no particular sense to the human eye. The task of, for instance, nuclear regulators to verify the accuracy of such translated files can therefore be very difficult and cumbersome. This may cause translation errors not to be picked up, which may have disastrous consequences later on when a reactor with such a faulty design is built. A generic algorithm for producing such automatic translation codes may therefore ease the translation and verification process to a great extent. It also removes human error from the process, which may significantly enhance the accuracy and reliability of the process. The developed algorithm additionally creates a verification log file which permanently records the name and value of each variable used, as well as the list of meanings of all the possible values. This should greatly facilitate reactor licensing applications.
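A minimal sketch of such a translation step, including the verification log of every variable name and value, might look like the following. It assumes a hypothetical one-entry-per-line 'KEY value' deck format; the keyword names are invented for illustration and are not actual VSOP input.

```python
def translate_input(old_text, key_map):
    """Translate a simple 'KEY value' input deck from one code version's
    keywords to another's, recording a verification log entry
    (old_key, new_key, value) for every variable encountered."""
    out_lines, log = [], []
    for line in old_text.splitlines():
        if not line.strip() or line.lstrip().startswith("#"):
            out_lines.append(line)       # keep comments and blank lines as-is
            continue
        key, _, value = line.strip().partition(" ")
        new_key = key_map.get(key, key)  # unknown keys pass through unchanged
        log.append((key, new_key, value))
        out_lines.append(f"{new_key} {value}")
    return "\n".join(out_lines), log

# Hypothetical deck and keyword mapping:
old = "# core layout\nNREGION 12\nPOWER 200.0"
new, log = translate_input(old, {"NREGION": "N_REGIONS", "POWER": "THERMAL_POWER_MW"})
```

The log gives a reviewer a flat, human-readable record of every translated variable, which is the verification aid the paper argues for.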

  1. A Spanish version for the new ERA-EDTA coding system for primary renal disease.

    Science.gov (United States)

    Zurriaga, Óscar; López-Briones, Carmen; Martín Escobar, Eduardo; Saracho-Rotaeche, Ramón; Moina Eguren, Íñigo; Pallardó Mateu, Luis; Abad Díez, José María; Sánchez Miret, José Ignacio

    2015-01-01

The European Renal Association and the European Dialysis and Transplant Association (ERA-EDTA) have issued an English-language new coding system for primary kidney disease (PKD) aimed at solving the problems identified in the list of "Primary renal diagnoses" that has been in use for over 40 years. In the context of the Registro Español de Enfermos Renales (Spanish Registry of Renal Patients, REER), the need for a translation and adaptation of the terms, definitions and notes for the new ERA-EDTA codes was perceived in order to help those who have Spanish as their working language when using such codes. Bilingual nephrologists contributed a professional translation and were involved in a terminological adaptation process, which included a number of phases to contrast translation outputs. Codes, paragraphs, definitions and diagnostic criteria were reviewed, and the agreements and disagreements arising for each term were labelled. Finally, the version accepted by a majority of reviewers was agreed. Wide agreement was reached in the first review phase, with only 5 points of discrepancy remaining, which were settled in the final phase. Translation and adaptation into Spanish represent an improvement that will help to introduce and use the new coding system for PKD, as it can help reduce the time devoted to coding and the period of adaptation of health workers to the new codes. Copyright © 2015 The Authors. Published by Elsevier España, S.L.U. All rights reserved.

  2. MACC/MACC-II : the case for composition measurements from the geostationary orbit

    Science.gov (United States)

    Peuch, V.; Lahoz, W.; Orphal, J.; Attie, J. E.; El Amraoui, L.; MACC; MACC-II consortia

    2011-12-01

The MACC (Monitoring Atmospheric Composition and Climate, 2009-2011) and MACC-II (2011-2014) European projects are operating pre-operational services for atmospheric composition and solar/UV radiation in the context of the GMES (Global Monitoring for Environment and Security) program. The services are provided using advanced assimilation and forecast systems: one for the global scale, based upon the Integrated Forecast System (IFS) of ECMWF, and an ensemble of seven regional air quality models (CHIMERE, EMEP, EURAD, LOTOS-EUROS, MATCH, MOCAGE, SILAM) covering Europe. The products comprise analyses, forecasts, delayed-mode analyses (taking into account validated data that are not available in near-real-time) and multi-year re-analyses. The large amount of satellite and in-situ composition data currently used for assimilation and/or validation will be briefly presented: greenhouse gases, aerosol, reactive gases and surface air quality. A significant effort in MACC/MACC-II is devoted to the evaluation of models and products as well as to the assimilation of data of various kinds. This effort is hampered by the current global observing system for atmospheric composition, as the overall dataset is mostly comprised of surface in-situ sites (WMO/GAW, air quality networks...) and of satellite products with limited vertical information and time coverage (LEO orbits), with only a few exceptions, among which are ozone sondes, lidars and MOZAIC-IAGOS aircraft data. The projects for GEO or quasi-GEO composition monitoring developing in Europe (Sentinel-4 and IRS on board Meteosat Third Generation; the MAGEAQ mission concept), in North America (GEO-CAPE in the US, PHEMOS in Canada) and in Asia (GEMS in Korea; projects in Japan and China) are thus regarded as a much needed additional observational component. Combining high spatial and temporal resolutions (as well as some vertical profiling capability in the case of multi-spectral instruments), such instruments would allow to evaluate and...

  3. A user's manual for the Electromagnetic Surface Patch code: ESP version 3

    Science.gov (United States)

    Newman, E. H.; Dilsavor, R. L.

    1987-01-01

This report serves as a user's manual for Version III of the Electromagnetic Surface Patch (ESP) code. ESP is user-oriented and based on the method of moments (MM) for treating geometries consisting of an interconnection of thin wires and perfectly conducting polygonal plates. Wire/plate junctions must be about 0.1 lambda or more from any plate edge. Several plates may intersect along a common edge. Excitation may be by either a delta-gap voltage generator or a plane wave. The thin wires may have finite conductivity and may also contain lumped loads. The code computes most of the usual quantities of interest, such as current distribution, input impedance, radiation efficiency, mutual coupling, far-zone gain patterns (both polarizations) and radar cross section (co- and cross-polarization).

  4. High Resolution Air Quality Forecasts in the Western Mediterranean area within the MACC, MACC-II and MACC-III European projects

    Energy Technology Data Exchange (ETDEWEB)

    Cansado, A.; Martinez, I.; Morales, T.

    2015-07-01

The European Earth observation programme Copernicus, formerly known as GMES (Global Monitoring for Environment and Security), is establishing a core global and regional environmental atmospheric service as a component of Europe's Copernicus/GMES initiative through successive R&D projects led by ECMWF (European Centre for Medium-Range Weather Forecasts) and funded by the 6th and 7th European Framework Programmes for Research and the Horizon 2020 Programme: GEMS, MACC, MACC-II and MACC-III. AEMET (Spanish State Meteorological Agency) participated in the projects MACC and MACC-II and continues participating in MACC-III (http://atmosphere.copernicus.eu). AEMET has contributed to those projects by generating high-resolution (0.05 degrees) daily air-quality forecasts for the Western Mediterranean up to 48 hours, aiming to analyse the dependence of forecast quality on resolution. We monitor the evolution of different chemical species such as NO2, O3, CO and SO2 at the surface and at different vertical levels, using the global model MOCAGE and the MACC Regional Ensemble forecasts as chemical boundary conditions. We show different case studies in which the considered chemical species present high values, and show a validation of the air quality by comparison with some of the available air-quality observations (EMEP/GAW, regional (autonomous communities) and local (city councils) air-quality monitoring networks) over the forecast domain. The aim of our participation in these projects is to help improve the understanding of the processes involved in air-quality forecasting in the Mediterranean, where special factors such as highly populated areas together with intense solar radiation make air-quality forecasting particularly challenging. (Author)

  6. A new version of the full-wave ICRH code FISIC for plasmas with noncircular flux surfaces

    Science.gov (United States)

    Kruecken, T.

    1988-12-01

    A user manual for a new version of the FISIC code which is now applicable to arbitrary (toroidal) geometry is presented. It describes the input parameters and quantities of all subroutines and contains a list of all common blocks.

  7. Milagro Version 2: An Implicit Monte Carlo Code for Thermal Radiative Transfer: Capabilities, Development, and Usage

    Energy Technology Data Exchange (ETDEWEB)

    T.J. Urbatsch; T.M. Evans

    2006-02-15

We have released Version 2 of Milagro, an object-oriented C++ code that performs radiative transfer using Fleck and Cummings' Implicit Monte Carlo method. Milagro, a part of the Jayenne program, is a stand-alone driver code used as a methods-research vehicle and to verify its underlying classes. These underlying classes are used to construct Implicit Monte Carlo packages for external customers. Milagro-2 represents a design overhaul that allows better parallelism and extensibility. New features in Milagro-2 include verified momentum deposition, restart capability, graphics capability, exact energy conservation, and improved load balancing and parallel efficiency. A users' guide also describes how to configure, make, and run Milagro-2.

  8. Experimental verification of the linear and non-linear versions of a panel code

    Science.gov (United States)

    Grigoropoulos, G. J.; Katsikis, C.; Chalkias, D. S.

    2011-03-01

In this paper, numerical calculations are carried out using two versions of a three-dimensional, time-domain panel method developed by the group of Prof. P. Sclavounos at MIT: the linear code SWAN2, which can optionally use the instantaneous non-linear Froude-Krylov and hydrostatic forces, and the fully non-linear SWAN4. The numerical results are compared with experimental results for three hull forms of increasing geometrical complexity: the Series 60, a reefer vessel with a stern bulb, and a modern fast ROPAX hull form with a hollow bottom in the stern region. The details of the geometrical modeling of the hull forms are discussed. In addition, since SWAN4 does not support transom sterns, only the two versions of SWAN2 were evaluated against experimental results for the parent hull form of the NTUA double-chine, wide-transom, high-speed monohull series. The effect of speed on the numerical predictions was investigated. It is concluded that both versions of SWAN2, the linear one and the one with the non-linear Froude-Krylov and hydrostatic forces, provide a more robust tool for predicting the dynamic response of the vessels than the non-linear SWAN4 code. In general, their results are close to what was expected on the basis of experience. Furthermore, using the option of non-linear Froude-Krylov and hydrostatic forces improves the accuracy of the predictions. The content of the paper is based on the Diploma thesis of the second author, supervised by the first author and further refined by the third.

  9. V.S.O.P. (99/09) Computer Code System for Reactor Physics and Fuel Cycle Simulation; Version 2009

    OpenAIRE

    Rütten, H.-J.; Haas, K. A.; Brockmann, H.; Ohlig, U.; Pohl, C.; Scherer, W.

    2010-01-01

V.S.O.P. (99/09) represents the further development of V.S.O.P. (99/05). Compared to its precursor, the code system has again been improved in many details. The main motivation for this new code version was to update the basic nuclear libraries used by the code system; thus, all cross-section libraries involved in the code are now based on ENDF/B-VII. V.S.O.P. is a computer code system for the comprehensive numerical simulation of the physics of thermal reactors. It implies the setup of...

  10. Sodium combustion computer code ASSCOPS version 2.0 user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Ishikawa, Hiroyasu; Futagami, Satoshi; Ohno, Shuji; Seino, Hiroshi; Miyake, Osamu [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1997-12-01

ASSCOPS (Analysis of Simultaneous Sodium Combustion in Pool and Spray) has been developed for analysing the thermal consequences of sodium leak and fire accidents in LMFBRs. This report presents a description of the computational models, input, and output as the user's manual for ASSCOPS version 2.0. ASSCOPS is an integrated code based on the sodium pool fire code SOFIRE II, developed by the Atomics International Division of Rockwell International, and the sodium spray fire code SPRAY, developed by the Hanford Engineering Development Laboratory in the U.S. The experimental studies conducted at PNC have been reflected in improvements to ASSCOPS. Users of ASSCOPS need to specify the sodium leak conditions (leak flow rate and temperature, etc.), the cell geometries (volume, structure surface area and thickness, etc.), and the initial atmospheric conditions, such as gas temperature, pressure, and gas composition. ASSCOPS calculates the time histories of atmospheric pressure and temperature changes along with those of the structural temperatures. (author)

  11. Development of a GPU Compatible Version of the Fast Radiation Code RRTMG

    Science.gov (United States)

    Iacono, M. J.; Mlawer, E. J.; Berthiaume, D.; Cady-Pereira, K. E.; Suarez, M.; Oreopoulos, L.; Lee, D.

    2012-12-01

    The absorption of solar radiation and emission/absorption of thermal radiation are crucial components of the physics that drive Earth's climate and weather. Therefore, accurate radiative transfer calculations are necessary for realistic climate and weather simulations. Efficient radiation codes have been developed for this purpose, but their accuracy requirements still necessitate that as much as 30% of the computational time of a GCM is spent computing radiative fluxes and heating rates. The overall computational expense constitutes a limitation on a GCM's predictive ability if it becomes an impediment to adding new physics to or increasing the spatial and/or vertical resolution of the model. The emergence of Graphics Processing Unit (GPU) technology, which will allow the parallel computation of multiple independent radiative calculations in a GCM, will lead to a fundamental change in the competition between accuracy and speed. Processing time previously consumed by radiative transfer will now be available for the modeling of other processes, such as physics parameterizations, without any sacrifice in the accuracy of the radiative transfer. Furthermore, fast radiation calculations can be performed much more frequently and will allow the modeling of radiative effects of rapid changes in the atmosphere. The fast radiation code RRTMG, developed at Atmospheric and Environmental Research (AER), is utilized operationally in many dynamical models throughout the world. We will present the results from the first stage of an effort to create a version of the RRTMG radiation code designed to run efficiently in a GPU environment. This effort will focus on the RRTMG implementation in GEOS-5. 
RRTMG has an internal pseudo-spectral vector of length of order 100 that, when combined with the much greater length of the global horizontal grid vector from which the radiation code is called in GEOS-5, makes RRTMG/GEOS-5 particularly well suited to achieving a significant speed improvement.
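The parallelization argument above amounts to fusing two independent dimensions (horizontal columns and pseudo-spectral g-points) into one long batch axis of independent work items. The following is a minimal NumPy sketch of that flattening idea; the kernel and all numbers are hypothetical stand-ins, not RRTMG code.

```python
import numpy as np

# Toy stand-in for a monochromatic flux kernel: each (column, g-point)
# pair is independent, so the two loops can be fused into one batch axis.
# The tau values and the kernel itself are illustrative, not RRTMG's.
def flux_kernel(tau):
    return np.exp(-tau)  # Beer-Lambert transmittance, elementwise

ncol, ngpt = 1000, 112          # horizontal columns x pseudo-spectral g-points
rng = np.random.default_rng(0)
tau = rng.uniform(0.01, 5.0, size=(ncol, ngpt))

# Loop version: one kernel call per column.
loop_result = np.stack([flux_kernel(tau[i]) for i in range(ncol)])

# Batched version: flatten to ncol*ngpt independent work items --
# the same flattening that exposes enough parallelism to fill a GPU.
batched = flux_kernel(tau.reshape(-1)).reshape(ncol, ngpt)

assert np.allclose(loop_result, batched)
```

The two forms are numerically identical; the point is that the flattened form presents roughly 10^5 independent items to schedule in parallel rather than 10^3 sequential column calls.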

  12. New Source Term Model for the RESRAD-OFFSITE Code Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States); Gnanapragasam, Emmanuel [Argonne National Lab. (ANL), Argonne, IL (United States); Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Shih-Yew [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-06-01

    This report documents the new source term model developed and implemented in Version 3 of the RESRAD-OFFSITE code. This new source term model includes: (1) "first order release with transport" option, in which the release of the radionuclide is proportional to the inventory in the primary contamination and the user-specified leach rate is the proportionality constant, (2) "equilibrium desorption release" option, in which the user specifies the distribution coefficient which quantifies the partitioning of the radionuclide between the solid and aqueous phases, and (3) "uniform release" option, in which the radionuclides are released from a constant fraction of the initially contaminated material during each time interval and the user specifies the duration over which the radionuclides are released.
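The three release options described above can be illustrated with a short sketch. This is not the RESRAD-OFFSITE implementation: the function names, the constant-diffusivity treatment of partitioning, and all parameter values (leach rate, Kd, bulk density, moisture content) are hypothetical and chosen only to show the form of each option.

```python
import numpy as np

# Illustrative sketch of the three source-term release options.
# All names and numbers are hypothetical, not RESRAD-OFFSITE's.

def first_order_release(I0, k, t):
    """First-order release: rate proportional to remaining inventory,
    with the leach rate k as the proportionality constant."""
    inventory = I0 * np.exp(-k * t)     # radioactive decay ignored for clarity
    return I0 - inventory               # cumulative amount released by time t

def uniform_release(I0, duration, t):
    """Uniform release: a constant fraction of the initially contaminated
    material is released per unit time over `duration`."""
    return I0 * np.clip(t / duration, 0.0, 1.0)

def dissolved_fraction(Kd, theta=0.3, rho_b=1.5):
    """Equilibrium desorption: Kd (L/kg) partitions the radionuclide
    between solid and pore water (rho_b in kg/L, theta dimensionless);
    returns the fraction present in the aqueous phase."""
    return theta / (theta + rho_b * Kd)

t = np.array([0.0, 10.0, 100.0])
print(first_order_release(1.0, 0.01, t))   # cumulative release grows toward I0
print(uniform_release(1.0, 50.0, t))       # 0, 0.2 and 1.0 respectively
print(dissolved_fraction(Kd=10.0))
```

The cumulative-release curves make the qualitative difference visible: exponential approach to full release for the first-order option versus a linear ramp that terminates at the user-specified duration for the uniform option.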

  13. Core 2D. A code for non-isothermal water flow and reactive solute transport. User's manual version 2

    Energy Technology Data Exchange (ETDEWEB)

    Samper, J.; Juncosa, R.; Delgado, J.; Montenegro, L. [Universidad de A Coruna (Spain)

    2000-07-01

Understanding natural groundwater quality patterns, quantifying groundwater pollution and assessing the effects of waste disposal require modeling tools accounting for water flow and for the transport of heat and dissolved species, as well as their complex interactions with solid and gas phases. This report contains the user's manual of CORE 2D Version 2.0, a COde for modeling water flow (saturated and unsaturated), heat transport and multicomponent Reactive solute transport under both local chemical equilibrium and kinetic conditions. It is an updated and improved version of CORE-LE-2D V0 (Samper et al., 1988), which in turn is an extended version of TRANQUI, a previous reactive transport code (ENRESA, 1995). All these codes were developed within the context of research projects funded by ENRESA and the European Commission. (Author)

  14. AUS98 - The 1998 version of the AUS modular neutronic code system

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, G.S.; Harrington, B.V

    1998-07-01

AUS is a neutronics code system which may be used for calculations of a wide range of fission reactors, fusion blankets and other neutron applications. The present version, AUS98, has a nuclear cross-section library based on ENDF/B-VI and includes modules which provide for reactor lattice calculations, one-dimensional transport calculations, multi-dimensional diffusion calculations, cell and whole-reactor burnup calculations, and flexible editing of results. Calculations of multi-region resonance shielding, coupled neutron and photon transport, energy deposition, fission product inventory and neutron diffusion are combined within the one code system. The major changes from the previous AUS publications are the inclusion of a cross-section library based on ENDF/B-VI, the addition of the MICBURN module for controlling whole-reactor burnup calculations, and changes to the system as a consequence of moving from IBM mainframe computers to UNIX workstations. This report gives details of all system aspects of AUS and all modules except the POW3D multi-dimensional diffusion module.

  15. Identification of MACC1 as a novel prognostic marker in hepatocellular carcinoma

    Directory of Open Access Journals (Sweden)

    Lu Canliang

    2011-09-01

Full Text Available Abstract Background Metastasis-associated in colon cancer-1 (MACC1) is a newly identified gene that plays a role in colon cancer metastasis through upregulation of the c-MET proto-oncogene (c-MET). However, the value of MACC1 as a potential biomarker for hepatocellular carcinoma (HCC) remains unknown. Methods MACC1 mRNA expression in 128 HCC tissues was examined by quantitative polymerase chain reaction. To show the potential correlation of MACC1 and c-MET, c-MET was also analysed. Results MACC1 was more highly expressed in HCC than in non-HCC tissues (P = 0.009). High MACC1 expression was significantly increased in cases with high alpha-fetoprotein (AFP) (P = 0.025). A positive correlation was found between MACC1 and c-MET mRNAs (r = 0.235, P = 0.009). Both univariate and multivariate analyses revealed that MACC1 expression was associated with overall survival (OS) and disease-free survival (DFS). Moreover, stratified analysis showed that tumour-node-metastasis (TNM) stage I patients with high MACC1 levels had shorter OS and DFS than those with low MACC1. Conclusions MACC1 may identify low- and high-risk individuals with HCC and be a valuable indicator for stratifying the prognosis of TNM stage I patients. MACC1 may serve as a novel biomarker for HCC.

  16. ITS version 5.0: the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    Energy Technology Data Exchange (ETDEWEB)

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2004-06-01

ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, and (3) parallel implementations of all ITS codes. Moreover, the general user-friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  17. V.S.O.P. (99/09) computer code system for reactor physics and fuel cycle simulation. Version 2009

    Energy Technology Data Exchange (ETDEWEB)

    Ruetten, H.J.; Haas, K.A.; Brockmann, H.; Ohlig, U.; Pohl, C.; Scherer, W.

    2010-07-15

V.S.O.P. (99/09) represents the further development of V.S.O.P. (99/05). Compared to its precursor, the code system has again been improved in many details. The main motivation for this new code version was to update the basic nuclear libraries used by the code system; thus, all cross-section libraries involved in the code are now based on ENDF/B-VII. V.S.O.P. is a computer code system for the comprehensive numerical simulation of the physics of thermal reactors. It covers the setup of the reactor and of the fuel element, processing of cross sections, neutron spectrum evaluation, neutron diffusion calculation in two or three dimensions, fuel burnup, fuel shuffling, reactor control, thermal hydraulics and fuel cycle costs. The thermal hydraulics part (steady-state and time-dependent) is restricted to gas-cooled reactors and to two spatial dimensions. The code can simulate the reactor operation from the initial core towards the equilibrium core. This latest code version was developed and tested under the Windows XP operating system. (orig.)

  18. Tripoli-3: Monte Carlo transport code for neutral particles - version 3.5 - user's manual; Tripoli-3: code de transport des particules neutres par la methode de monte carlo - version 3.5 - manuel d'utilisation

    Energy Technology Data Exchange (ETDEWEB)

    Vergnaud, Th.; Nimal, J.C.; Chiron, M

    2001-07-01

The TRIPOLI-3 code applies the Monte Carlo method to neutron, gamma-ray and coupled neutron/gamma-ray transport calculations in three-dimensional geometries, either in steady-state conditions or with a time dependence. It can be used to study problems with a high flux attenuation between the source zone and the result zone (studies of shielding configurations or source-driven sub-critical systems, with fission being taken into account), as well as problems with a low flux attenuation (neutronics calculations -- in a fuel lattice cell, for example -- where fission is taken into account, usually with calculation of the effective multiplication factor, fine-structure studies, numerical experiments to investigate method approximations, etc.). TRIPOLI-3 has been operational since 1995 and is the version of the TRIPOLI code that follows on from TRIPOLI-2; it can be used on SUN, RISC600 and HP workstations and on PCs under the Linux or Windows/NT operating systems. The code uses nuclear data libraries generated with the THEMIS/NJOY system. The current libraries were derived from ENDF/B6 and JEF2. There is also a response-function library based on a number of evaluations, notably the dosimetry libraries IRDF/85 and IRDF/90 and evaluations from JEF2. The treatment of particle transport is the same in version 3.5 as in version 3.4 of the TRIPOLI code, but version 3.5 is more convenient for preparing the input data and for reading the output. A French version of the user's manual also exists. (authors)
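The "high flux attenuation" regime mentioned above is easy to see in a toy analog Monte Carlo calculation: through a purely absorbing slab a few mean free paths thick, only a tiny fraction of sampled histories reach the far side, which is why shielding codes like TRIPOLI need variance-reduction techniques. This sketch is a deliberately minimal illustration with made-up parameters, not TRIPOLI physics (no scattering, no fission, single energy).

```python
import math
import random

# Toy analog Monte Carlo: mono-energetic particles through a purely
# absorbing slab, compared against the analytic attenuation exp(-sigma*t).
def transmission(sigma_t, thickness, n_histories, seed=1):
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_histories):
        # Distance to first collision, sampled from the exponential
        # free-flight distribution with total cross section sigma_t.
        s = -math.log(rng.random()) / sigma_t
        if s > thickness:
            transmitted += 1
    return transmitted / n_histories

sigma_t, thickness = 1.0, 5.0            # slab is 5 mean free paths thick
analytic = math.exp(-sigma_t * thickness)
estimate = transmission(sigma_t, thickness, 200_000)
print(analytic, estimate)
```

With only about 0.7% of histories surviving 5 mean free paths, the statistical cost of deep-penetration problems grows quickly with shield thickness; that is the practical motivation for the biased-sampling machinery in production shielding codes.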

  19. Meeting the requirements of specialists and generalists in Version 3 of the Read Codes: Two illustrative "Case Reports"

    Directory of Open Access Journals (Sweden)

    Fiona Sinclair

    1997-11-01

Full Text Available The Read Codes have been recognised as the standard for General Practice computing since 1988, and the original 4-byte set continues to be extensively used to record primary health care data. Read Version 3 (the Read Thesaurus) is an expanded clinical vocabulary with an enhanced file structure designed to meet the detailed requirements of specialist practitioners and to address some of the limitations of previous versions. A recent phase of integration of the still widely-used 4-byte set has highlighted the need to ensure that the new Thesaurus continues to support generalist requirements.

  20. Deconvolution of magnetic acoustic change complex (mACC).

    Science.gov (United States)

    Bardy, Fabrice; McMahon, Catherine M; Yau, Shu Hui; Johnson, Blake W

    2014-11-01

The aim of this study was to design a novel experimental approach to investigate the morphological characteristics of auditory cortical responses elicited by rapidly changing synthesized speech sounds. Six sound-evoked magnetoencephalographic (MEG) responses were measured to a synthesized train of speech sounds using the vowels /e/ and /u/ in 17 normal-hearing young adults. Responses were measured to: (i) the onset of the speech train, (ii) an F0 increment, (iii) an F0 decrement, (iv) an F2 decrement, (v) an F2 increment, and (vi) the offset of the speech train, using short (jittered around 135 ms) and long (1500 ms) stimulus onset asynchronies (SOAs). The least squares (LS) deconvolution technique was used to disentangle the overlapping MEG responses in the short SOA condition only. Comparison between the morphology of the recovered cortical responses in the short and long SOA conditions showed high similarity, suggesting that the LS deconvolution technique was successful in disentangling the MEG waveforms. Waveform latencies and amplitudes were different for the two SOA conditions and were influenced by the spectro-temporal properties of the sound sequence. The magnetic acoustic change complex (mACC) for the short SOA condition showed significantly lower amplitudes and shorter latencies compared to the long SOA condition. The F0 transition showed a larger reduction in amplitude from long to short SOA compared to the F2 transition. Lateralization of the cortical responses was observed under some stimulus conditions and appeared to be associated with the spectro-temporal properties of the acoustic stimulus. The LS deconvolution technique provides a new tool to study the properties of the auditory cortical response to rapidly changing sound stimuli. The presence of the cortical auditory evoked responses for rapid transitions of synthesized speech stimuli suggests that the temporal code is preserved at the level of the auditory cortex.
Further, the reduced amplitudes
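The core idea of least-squares deconvolution of overlapping evoked responses can be sketched on synthetic data: stack a shift matrix of event onsets into a design matrix and solve the resulting linear system. Everything below (waveform shape, SOA range, noise level) is invented for illustration; it is not the authors' MEG pipeline.

```python
import numpy as np

# Synthetic overlapped-response recovery in the spirit of LS deconvolution.
rng = np.random.default_rng(42)
L = 60                                   # response length, in samples
true_resp = np.sin(np.linspace(0, np.pi, L)) * np.exp(-np.linspace(0, 4, L))

# Jittered short SOAs (20-34 samples), much shorter than the response,
# so consecutive responses overlap heavily in the raw trace.
onsets = np.cumsum(rng.integers(20, 35, size=40))
n = onsets[-1] + L
y = np.zeros(n)
for o in onsets:
    y[o:o + L] += true_resp             # overlapping superposition
y += rng.normal(0, 0.05, size=n)        # additive measurement noise

# Design matrix: column j is 1 at sample (onset + j) for every onset.
# The least-squares solution of X @ r = y disentangles the waveforms;
# the onset jitter is what makes the system well conditioned.
X = np.zeros((n, L))
for o in onsets:
    X[o + np.arange(L), np.arange(L)] += 1.0

est_resp, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.max(np.abs(est_resp - true_resp)))   # small recovery error
```

The recovered waveform closely matches the true one despite the heavy overlap, which is the property the study relies on when comparing short-SOA (deconvolved) and long-SOA (directly measured) responses.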

  1. SACRD: a data base for fast reactor safety computer codes, contents and glossary of Version 1 of the system

    Energy Technology Data Exchange (ETDEWEB)

    Greene, N.M.; Forsberg, V.M.; Raiford, G.B.; Arwood, J.W.; Flanagan, G.F.

    1979-01-01

    SACRD is a data base of material properties and other handbook data needed in computer codes used for fast reactor safety studies. This document lists the contents of Version 1 and also serves as a glossary of terminology used in the data base. Data are available in the thermodynamics, heat transfer, fluid mechanics, structural mechanics, aerosol transport, meteorology, neutronics and dosimetry areas. Tabular, graphical and parameterized data are provided in many cases.

  2. A Spanish version for the new ERA-EDTA coding system for primary renal disease

    Directory of Open Access Journals (Sweden)

    Óscar Zurriaga

    2015-07-01

Conclusions: Translation and adaptation into Spanish represent an improvement that will help to introduce and use the new coding system for primary renal disease (PRD), as it can help reduce the time devoted to coding as well as the period of adaptation of health workers to the new codes.

  3. RAZORBACK - A Research Reactor Transient Analysis Code Version 1.0 - Volume 3: Verification and Validation Report.

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G.

    2017-04-01

    This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.
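The point reactor kinetics equations at the heart of the coupled system described above can be sketched with a single delayed-neutron group. This is only an illustration of the equations' steady-state behaviour with made-up constants; Razorback's actual solution uses full delayed-group data and is coupled to the heat transfer, expansion, and coolant equations.

```python
# Minimal point-reactor-kinetics sketch: one delayed-neutron group,
# explicit Euler integration. Constants are illustrative, not ACRR data.
beta, Lambda, lam = 0.0076, 1e-4, 0.08   # delayed fraction, generation time (s),
                                         # precursor decay constant (1/s)

def step(n, C, rho, dt):
    """One Euler step of dn/dt = ((rho-beta)/Lambda) n + lam C,
    dC/dt = (beta/Lambda) n - lam C."""
    dn = ((rho - beta) / Lambda) * n + lam * C
    dC = (beta / Lambda) * n - lam * C
    return n + dt * dn, C + dt * dC

# Start from the critical steady state: both derivatives vanish at rho = 0
# when the precursor concentration is C = beta * n / (lam * Lambda).
n, C = 1.0, beta / (lam * Lambda)
for _ in range(10_000):
    n, C = step(n, C, rho=0.0, dt=1e-5)
print(n)   # neutron level stays at its initial value: critical reactor
```

Holding the equilibrium precursor concentration at zero reactivity keeps the neutron population flat, which is the standard sanity check before driving the model with a reactivity transient.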

  4. Code Analysis and Refactoring with Clang Tools, Version 0.1

    Energy Technology Data Exchange (ETDEWEB)

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.

  5. The European Register of Specialists in Clinical Chemistry and Laboratory Medicine: Code of Conduct, Version 2--2008.

    LENUS (Irish Health Repository)

    McMurray, Janet

    2009-01-01

    In 1997, the European Communities Confederation of Clinical Chemistry and Laboratory Medicine (EC4) set up a Register for European Specialists in Clinical Chemistry and Laboratory Medicine. The operation of the Register is undertaken by a Register Commission (EC4RC). During the last 10 years, more than 2000 specialists in Clinical Chemistry and Laboratory Medicine have joined the Register. In 2007, EC4 merged with the Federation of European Societies of Clinical Chemistry and Laboratory Medicine (FESCC) to form the European Federation of Clinical Chemistry and Laboratory Medicine (EFCC). A Code of Conduct was adopted in 2003 and a revised and updated version, taking account particularly of the guidelines of the Conseil Européen des Professions Libérales (CEPLIS) of which EFCC is a member, is presented in this article. The revised version was approved by the EC4 Register Commission and by the EFCC Executive Board in Paris on 6 November, 2008.

  6. A new version of code Java for 3D simulation of the CCA model

    Science.gov (United States)

    Zhang, Kebo; Xiong, Hailing; Li, Chao

    2016-07-01

In this paper we present a new version of the program for the CCA model. To benefit from the advantages of the latest technologies, we migrated the running environment from JDK 1.6 to JDK 1.7, and the old program was optimized into a new framework, improving its extensibility.

  7. MACC1 as a prognostic biomarker for early-stage and AFP-normal hepatocellular carcinoma.

    Directory of Open Access Journals (Sweden)

    Chan Xie

Full Text Available BACKGROUND: The metastasis-associated in colon cancer 1 gene (MACC1) has been found to be associated with cancer development and progression. The aim of this study was to investigate the prognostic value of MACC1 in early-stage and AFP-normal hepatocellular carcinoma (HCC). METHODS: mRNA and protein levels of MACC1 expression in one normal liver epithelial cell line (THLE3) and 15 HCC cell lines were examined using reverse transcription-PCR and Western blot. MACC1 expression was also comparatively studied in 6 paired HCC lesions and the adjacent non-cancerous tissue samples. Immunohistochemistry was employed to analyze MACC1 expression in 308 clinicopathologically characterized HCC cases. Statistical analyses were applied to derive associations between MACC1 expression scores and clinical staging as well as patient survival. RESULTS: Levels of MACC1 mRNA and protein were higher in HCC cell lines and HCC lesions than in normal liver epithelial cells and the paired adjacent noncancerous tissues. A significant difference in MACC1 expression was found among patients of different TNM stages (P<0.001). Overall survival analysis showed that a high MACC1 expression level correlated with a lower survival rate (P = 0.001). Importantly, an inverse correlation between MACC1 level and patient survival remained significant in subjects with early-stage HCC or with normal serum AFP levels. CONCLUSIONS: MACC1 protein may represent a promising biomarker for predicting the prognosis of HCC, including in early-stage and AFP-normal patients.

  8. UNSAT-H Version 1. 0: unsaturated flow code documentation and applications for the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Fayer, M.J.; Gee, G.W.; Jones, T.L.

    1986-08-01

Waste management practices at the Hanford Site have relied heavily on near-surface burial. Predicting the future performance of any burial site in terms of the migration of buried contaminants requires a model capable of simulating water flow in the unsaturated soils above the buried waste. The model currently being developed to meet this need is UNSAT-H, which was developed at Pacific Northwest Laboratory for assessing the water dynamics of near-surface waste-disposal sites at the Hanford Site. The code will primarily be used to predict deep drainage (i.e., recharge) as a function of environmental conditions such as climate, soil type, and vegetation. UNSAT-H will also simulate various waste-management practices such as placing surface barriers over waste sites. UNSAT-H is a one-dimensional model that simulates the dynamic processes of infiltration, drainage, redistribution, surface evaporation, and uptake of water from soil by plants. UNSAT-H is designed to utilize two auxiliary codes: DATAINH, which is used to process the input data, and DATAOUT, which is used to process the UNSAT-H output. Operation of the code requires three separate steps. First, the problem to be simulated must be conceptualized in terms of boundary conditions, available data, and soil properties. Next, the data must be correctly formatted for input. Finally, the input data must be processed, UNSAT-H run, and the output data processed for analysis. This report includes three examples of code use. In the first example, a benchmark test case is run in which the results of UNSAT-H simulations of infiltration are compared with an analytical solution and a numerical solution. The comparisons show excellent agreement for the specific test case, and this agreement provides verification of the infiltration portion of the UNSAT-H code. The other two examples of code use are a simulation of a layered soil and one of plant transpiration.

  9. Application of the Finite Orbit Width Version of the CQL3D Code to Transport of Fast Ions

    Science.gov (United States)

    Petrov, Yu. V.; Harvey, R. W.

    2016-10-01

The CQL3D bounce-averaged Fokker-Planck (FP) code now includes a ``fully'' neoclassical version in which the diffusion and advection processes are averaged over actual drift orbits, rather than using a first-order expansion. Incorporation of finite-orbit-width (FOW) effects results in neoclassical radial transport caused by collisions, RF wave heating, and the toroidal electric field (radial pinch). We apply the CQL3D-full-FOW code to study the thermalization and radial transport of high-energy particles, such as alpha particles produced by fusion in ITER or deuterons from NBI in NSTX, under the effect of their interaction with auxiliary RF waves. Particular attention is given to visualization of transport in the 3D space of velocity plus major-radius coordinates. Supported by USDOE Grants FC02-01ER54649, FG02-04ER54744, and SC0006614.

  10. Validation of a Subchannel Analysis Code MATRA Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae Hyun; Seo, Kyung Won; Kwon, Hyouk

    2008-10-15

A subchannel analysis code, MATRA, has been developed for the thermal-hydraulic analysis of the SMART core. The governing equations and important models were established, and validation calculations have been performed for subchannel flow and enthalpy distributions in rod bundles under steady-state conditions. The governing equations of MATRA are based on the integral balance equations of the two-phase mixture. The effects of non-homogeneous and non-equilibrium states were considered by employing a subcooled boiling model and a phasic slip model. The solution scheme and main structure of the MATRA code, as well as the differences between the MATRA and COBRA-IV-I codes, are summarized. Eight different test data sets were employed for the validation of the MATRA code. The collected data consisted of single-phase subchannel flow and temperature distribution data, single-phase inlet flow maldistribution data, single-phase partial flow blockage data, and two-phase subchannel flow and enthalpy distribution data. The prediction accuracy as well as the limitations of the MATRA code were evaluated from this analysis.
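The kind of subchannel enthalpy bookkeeping a code like MATRA performs can be illustrated with two coupled channels marched axially, with a turbulent-mixing term exchanging energy between them. The geometry, powers, and mixing coefficient below are invented for the sketch; this is not MATRA's two-phase formulation.

```python
# Minimal two-subchannel single-phase enthalpy balance with turbulent
# mixing between adjacent channels. All numbers are hypothetical.
m = [0.30, 0.30]           # axial mass flow per subchannel, kg/s
q = [50e3, 10e3]           # linear heat input per subchannel, W/m
w_mix = 0.02               # turbulent crossflow mixing coefficient, kg/(m*s)
h_in = 1300e3              # inlet enthalpy, J/kg
h = [h_in, h_in]
dz, n_steps = 0.01, 200    # 2 m heated length marched in 1 cm steps

for _ in range(n_steps):
    # Energy carried from the hotter to the colder channel by mixing (W).
    exch = w_mix * (h[0] - h[1]) * dz
    h[0] += (q[0] * dz - exch) / m[0]
    h[1] += (q[1] * dz + exch) / m[1]

total_rise = m[0] * (h[0] - h_in) + m[1] * (h[1] - h_in)
print(h[0], h[1], total_rise)   # hot channel > cold; total equals heat added
```

The mixing term only redistributes energy between channels, so the total enthalpy flow rise must equal the total heat input (here 120 kW over the 2 m length); checking that balance is a standard consistency test for subchannel solvers.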

  11. Development of MATRA-LMR code {alpha}-version for LMR subchannel analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Won Seok; Kim, Young Gyun; Kim, Young Gin

    1998-05-01

Since the sodium boiling point is very high, the maximum cladding and pin temperatures are used as design limit conditions in sodium-cooled liquid metal reactors. It is necessary to predict the core temperature distribution accurately to increase the sodium coolant efficiency. Based on the MATRA code, which was developed for PWR analysis, MATRA-LMR is being developed for LMRs. The major modifications are as follows: A) the sodium properties table is implemented as a subprogram in the code; B) the heat transfer coefficients are changed for LMRs; C) the pressure drop correlations are changed for more accurate calculations, using the Novendstern, Chiu-Rohsenow-Todreas, and Cheng-Todreas correlations. To assess the development status of the MATRA-LMR code, calculations have been performed for the ORNL 19-pin and EBR-II 61-pin tests. MATRA-LMR calculation results are also compared with the results obtained by the SLTHEN code, which uses a more simplified thermal-hydraulic model. The MATRA-LMR predictions are found to agree well with the measured values. The differences in results between MATRA-LMR and SLTHEN occur because the SLTHEN code uses a very simplified thermal-hydraulic model to reduce computing time. MATRA-LMR can currently be used only for single-assembly analysis, but it is planned to extend it to multi-assembly calculations. (author). 18 refs., 8 tabs., 14 figs.

  12. Object-Oriented Version of Glenn-HT Code Released: Glenn-HT2000

    Science.gov (United States)

    Heidmann, James D.; Ameri, Ali A.; Rigby, David I.; Garg, Vijay K.; Fabian, John C.; Lucci, Barbara L.; Steinthorsson, Erlendur

    2005-01-01

    NASA Glenn Research Center's General Multi-Block Navier-Stokes Convective Heat Transfer Code (Glenn-HT) has been used extensively to predict heat transfer and fluid flow for a variety of steady gas turbine engine problems. Efforts have focused on turbine heat transfer, where computations have modeled tip clearance, internal coolant, and film cooling flows. Excellent agreement has been achieved for a variety of experimental test cases, and results have been published in over 40 technical publications. The code is available to U.S. industry and has been used by several domestic gas turbine engine companies. The following figure shows a typical flow solution from the Glenn-HT code for a film-cooled turbine blade.

  13. SIERRA Code Coupling Module: Arpeggio User Manual Version 4.44

    Energy Technology Data Exchange (ETDEWEB)

    Sierra Thermal/Fluid Team

    2017-04-01

    The SNL Sierra Mechanics code suite is designed to enable simulation of complex multiphysics scenarios. The code suite is composed of several specialized applications which can operate either in standalone mode or coupled with each other. Arpeggio is a supported utility that enables loose coupling of the various Sierra Mechanics applications by providing access to Framework services that facilitate the coupling. More importantly, Arpeggio orchestrates the execution of the applications that participate in the coupling. This document describes the various components of Arpeggio and their operation. The intent of the document is to provide a fast path for analysts interested in coupled applications via simple examples of its usage.

  14. Enhanced code for the full space parameterization approach to solving underspecified systems of algebraic equations: Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Morgansen, K.A.; Pin, F.G.

    1995-03-01

    This paper describes an enhanced version of the code for the Full Space Parameterization (FSP) method, recently presented for determining optimized (and possibly constrained) solutions, x, to underspecified systems of algebraic equations b = Ax. The enhanced code uses the conditions necessary for linear independence of the m − n + 1 vectors forming the solution as the basis for an efficient search pattern to quickly find the full set of solution vectors. The complications that may arise from particular combinations of the matrix A and the vector b are discussed. The first part of the code implements the various methods needed to handle these particular cases before the solution vectors are calculated, so that computation time may be decreased. The second portion of the code implements the methods used to calculate the necessary solution vectors. The respective expressions of the full solution space, S, for the cases of the matrix A being full rank and rank deficient are given. Finally, examples of the resolution of particular cases are provided, and a sample application to the joint motion of a mobile manipulator for a given end-effector trajectory is presented.
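    The FSP method builds the full solution space from m − n + 1 linearly independent solution vectors. As a simpler, hedged illustration of one optimized solution to an underspecified system b = Ax, the minimum-norm solution can be obtained from the Moore-Penrose pseudoinverse; the sketch below uses a made-up system and is not the FSP algorithm itself.

```python
import numpy as np

# Underspecified system b = A x: more unknowns (4) than equations (2),
# so infinitely many solutions exist. The matrix and vector are
# illustrative, not taken from the paper.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 3.0]])
b = np.array([3.0, 2.0])

# One optimized (minimum-norm) solution via the pseudoinverse.
# The FSP method instead enumerates a full set of linearly
# independent solution vectors spanning the solution space.
x = np.linalg.pinv(A) @ b

# Any valid solution must reproduce b exactly.
assert np.allclose(A @ x, b)
print(x)
```

For a redundant manipulator, such an underspecified system arises at each time step, with A the Jacobian and b the commanded end-effector velocity.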

  15. Adaptive Network Coded Clouds: High Speed Downloads and Cost-Effective Version Control

    DEFF Research Database (Denmark)

    Sipos, Marton A.; Heide, Janus; Roetter, Daniel Enrique Lucani

    2017-01-01

    developed a novel scheme using recoding with limited packets to trade-off storage space, reliability, and data retrieval speed. Implementation and measurements with commercial cloud providers show that up to 9x less network use is needed compared to other network coding schemes, while maintaining similar...

  16. DIONISIO 2.0: New version of the code for simulating a whole nuclear fuel rod under extended irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Soba, Alejandro, E-mail: soba@cnea.gov.ar; Denis, Alicia

    2015-10-15

    Highlights: • A new version of the DIONISIO code is developed. • DIONISIO is devoted to simulating the behavior of a nuclear fuel rod in operation. • The formerly two-dimensional simulation of a pellet-cladding segment is now extended to the whole rod length. • An acceptable and more realistic agreement with experimental data is obtained. • The prediction range of the code is extended up to an average burnup of 60 MWd/kgU. - Abstract: Version 2.0 of the DIONISIO code, which incorporates diverse new aspects, has recently been developed. One of them concerns the code architecture, which now allows taking into account the axial variation of the conditions external to the rod. With this purpose, the rod is divided into a number of axial segments. In each one the program considers the system formed by a pellet and the corresponding cladding portion and solves the numerous phenomena that take place under the local conditions of linear power and coolant temperature, which are given as input parameters. To do this, a two-dimensional domain in the r–z plane is considered, where cylindrical symmetry and symmetry with respect to the pellet mid-plane are assumed. The results obtained for this representative system are assumed valid for the complete segment. The program thus produces in each rod section the values of the temperature, stress, and strain, among others, as functions of the local coordinates r and z. Then, the general rod parameters (internal rod pressure, amount of fission gas released, pellet stack elongation, etc.) are evaluated. Moreover, new calculation tools designed to extend the application range of the code to high burnup, which were reported elsewhere, have also been incorporated into DIONISIO 2.0. With these improvements, the code results are compared with some 33 experiments compiled in the IFPE database, covering more than 380 fuel rods irradiated up to average burnup levels of 40–60 MWd/kgU. The results of these

  17. Estimation of the Radiological Consequences of Fukushima Dai-ichi Nuclear Power Plant Accident using MACCS2

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sora; Min, Byung-Il; Park, Kihyun; Yang, Byung-Mo; Suh, Kyung-suk [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2015-10-15

    Three of them have undergone fuel melting and hydrogen explosions. A significant amount of radioactive material was released into the atmosphere from FDNPP and dispersed all over the world. In this study, we assessed the offsite consequences of the Fukushima disaster in the region within a 30-km radius of FDNPP using the MELCOR Accident Consequence Code System 2 (MACCS2), a U.S. Nuclear Regulatory Commission (NRC) code. Reflecting realistic regional characteristics, such as long-term meteorological data, site- and population-specific data, and radiation safety regulations, is essential to accurately analyze the off-site consequences. An assessment that reflects regional characteristics would contribute to identifying the main causes of exposure doses and to finding effective countermeasures for minimizing the off-site consequences of an accident.

  18. An upgraded version of the nucleon meson transport code: NMTC/JAERI97

    Energy Technology Data Exchange (ETDEWEB)

    Takada, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Yoshizawa, Nobuaki; Kosako, Kazuaki; Ishibashi, Kenji

    1998-02-01

    The nucleon-meson transport code NMTC/JAERI has been upgraded to NMTC/JAERI97, which has new features not only in its physics models and nuclear data but also in its computational procedures. NMTC/JAERI97 implements two new physics models: an intranuclear cascade model taking account of in-medium nuclear effects, and a preequilibrium calculation model based on the exciton model. For treating the nucleon transport process more accurately, the nucleon-nucleus cross sections are revised to those derived from the systematics of Pearlstein. Moreover, the level density parameter derived by Ignatyuk is included as a new option for the particle evaporation calculation. Beyond these physical aspects, a new geometry package based on Combinatorial Geometry with a multi-array system and an importance sampling technique are implemented in the code. A tally function is also employed for obtaining physical quantities such as neutron energy spectra, heat deposition, and nuclide yield without editing a history file. The resultant NMTC/JAERI97 is tuned to be executed on UNIX systems. This paper explains the functions, physics models, and geometry model adopted in NMTC/JAERI97 and provides guidance on how to use the code. (author)

  19. X-ray FEL Simulation with the MPP version of the GINGER Code

    Science.gov (United States)

    Fawley, William

    2001-06-01

    GINGER is a polychromatic, 2D (r-z) PIC code originally developed in the 1980s to examine sideband growth in FEL amplifiers. In the last decade, GINGER simulations have examined various aspects of x-ray and XUV FELs based upon initiation by self-amplified spontaneous emission (SASE). Recently, GINGER's source code has been substantially updated to adopt many modern features of the Fortran 90 language and extended to exploit multiprocessor hardware, with the result that the code now runs effectively on platforms ranging from single-processor workstations in serial mode to MPP hardware at NERSC, such as the Cray-T3E and IBM-SP, in full parallel mode. This poster discusses some of the numerical algorithms and structural details of GINGER which permitted relatively painless porting to parallel architectures. Examples of recent SASE FEL modeling with GINGER are given, including both existing experiments such as the LEUTL UV FEL at Argonne and proposed projects such as the LCLS x-ray FEL at SLAC.

  20. Comparison of TG-43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes.

    Science.gov (United States)

    Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S

    2016-03-01

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-section library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracy of the final outcomes of these simulations is very sensitive to the accuracy of the cross-section libraries. Several investigators have shown that inaccuracies in some of the cross-section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources calculated with three different versions of the MCNP code: MCNP4C, MCNP5, and MCNPX. In these simulations, for each source type the source and phantom geometries, as well as the number of photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and the other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher-energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes. PACS number(s): 87.56.bg.
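    The discrepancies quoted above are relative differences between radial dose function values gL(r) computed by two codes at the same distance. With hypothetical values (not the study's data), the comparison reduces to:

```python
# Percent discrepancy between radial dose function values gL(r) from
# two Monte Carlo codes at the same radial distance. The numbers are
# illustrative placeholders, not results from the study.
g_code_a = 0.087   # hypothetical gL(10 cm) for a low-energy source, code A
g_code_b = 0.068   # hypothetical value from code B

discrepancy = abs(g_code_a - g_code_b) / g_code_b * 100.0
print(f"{discrepancy:.1f}% discrepancy")
```

Because gL(r) falls off steeply for low-energy sources, even small cross-section differences translate into large relative discrepancies at distance.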

  1. Calculation of Sodium Fire Test-I (Run-E6) using sodium combustion analysis code ASSCOPS version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Nakagiri, Toshio; Ohno, Shuji; Miyake, Osamu [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1997-11-01

    The calculation of Sodium Fire Test-I (Run-E6) was performed using the ASSCOPS (Analysis of Simultaneous Sodium Combustions in Pool and Spray) code version 2.0, in order to determine the parameters used in the code for calculations of the sodium combustion behavior of small- or medium-scale sodium leaks, and to validate the applicability of the code. The parameters used in the code were determined, and the applicability of the code was confirmed, since the calculated temperatures, oxygen concentrations, and other calculated values agreed well with the test results. (author)

  2. Development of a version of the reactor dynamics code DYN3D applicable for High Temperature Reactors; Entwicklung einer Version des Reaktordynamikcodes DYN3D fuer Hochtemperaturreaktoren. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Rohde, Ulrich; Apanasevich, Pavel; Baier, Silvio; Duerigen, Susan; Fridman, Emil; Grahn, Alexander; Kliem, Soeren; Merk, Bruno

    2012-07-15

    Based on the reactor dynamics code DYN3D for the simulation of transient processes in Light Water Reactors, a code version DYN3D-HTR for application to graphite-moderated, gas-cooled block-type high temperature reactors has been developed. This development comprises: - the methodical improvement of the 3D steady-state neutron flux calculation for the hexagonal geometry of the HTR fuel element blocks - the development of methods for the generation of homogenised cross section data taking into account the double heterogeneity of the fuel element block structure - the implementation of a 3D model for heat conduction and heat transport in the graphite matrix. The nodal method for neutron flux calculation based on the SP3 transport approximation was extended to hexagonal fuel element geometry, where the hexagons are subdivided into triangles; thus the method finally had to be derived for triangular geometry. In triangular geometry, a subsequent subdivision of the hexagonal elements can be considered, and therefore the effect of systematic mesh refinement can be studied. The algorithm was verified by comparison with Monte Carlo reference solutions, on the node-wise as well as on the pin-wise level. New procedures were developed for the homogenization of the double-heterogeneous fuel element structures. On the one hand, the so-called Reactivity-equivalent Physical Transformation (RPT), a two-step homogenization method based on 2D deterministic lattice calculations, was extended to cells with different temperatures of the materials. On the other hand, progress in the development of Monte Carlo methods for spectral calculations, in particular the development of the code SERPENT, opened a new, fully consistent 3D approach, in which all details of the structures on the fuel particle, fuel compact, and fuel block levels can be taken into account within one step. Moreover, a 3D heat conduction and heat transport model was integrated into DYN3D to be able to simulate radial

  3. The TORT three-dimensional discrete ordinates neutron/photon transport code (TORT version 3)

    Energy Technology Data Exchange (ETDEWEB)

    Rhoades, W.A.; Simpson, D.B.

    1997-10-01

    TORT calculates the flux or fluence of neutrons and/or photons throughout three-dimensional systems due to particles incident upon the system's external boundaries, due to fixed internal sources, or due to sources generated by interaction with the system materials. The transport process is represented by the Boltzmann transport equation. The method of discrete ordinates is used to treat the directional variable, and a multigroup formulation treats the energy dependence. Anisotropic scattering is treated using a Legendre expansion. Various methods are used to treat spatial dependence, including nodal and characteristic procedures that have been especially adapted to resist numerical distortion. A method of body overlay assists in material zone specification, or the specification can be generated by an external code supplied by the user. Several special features are designed to concentrate machine resources where they are most needed. The directional quadrature and Legendre expansion can vary with energy group. A discontinuous mesh capability has been shown to reduce the size of large problems by a factor of roughly three in some cases. The emphasis in this code is a robust, adaptable application of time-tested methods, together with a few well-tested extensions.
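    As a toy illustration of the discrete ordinates method that TORT applies in three dimensions, the sketch below solves a one-group, one-dimensional slab problem with Gauss-Legendre angular quadrature, diamond differencing, and source iteration. All cross sections and dimensions are assumed for illustration; TORT adds multigroup energy treatment, anisotropic Legendre scattering, and 3-D spatial differencing.

```python
import numpy as np

# Toy 1-D slab, one-group discrete-ordinates (S_N) solver with diamond
# differencing and source iteration. Illustrative only; not TORT itself.
nx, n_ang = 50, 8
width = 10.0                            # slab width in mean free paths (assumed)
dx = width / nx
sigma_t, sigma_s, q = 1.0, 0.5, 1.0     # total, scattering, fixed source (assumed)

mu, w = np.polynomial.legendre.leggauss(n_ang)  # angular quadrature (sum of w = 2)
phi = np.zeros(nx)                               # scalar flux
for _ in range(500):                             # source iteration
    src = 0.5 * (sigma_s * phi + q)              # isotropic emission density
    phi_new = np.zeros(nx)
    for m in range(n_ang):
        psi_in = 0.0                             # vacuum boundary condition
        cells = range(nx) if mu[m] > 0 else range(nx - 1, -1, -1)
        a = 2.0 * abs(mu[m])
        for i in cells:                          # transport sweep along direction m
            psi_out = (2.0 * src[i] * dx + (a - sigma_t * dx) * psi_in) \
                      / (a + sigma_t * dx)
            phi_new[i] += w[m] * 0.5 * (psi_in + psi_out)  # diamond cell average
            psi_in = psi_out
    if np.max(np.abs(phi_new - phi)) < 1e-8:
        phi = phi_new
        break
    phi = phi_new
print(f"midplane scalar flux = {phi[nx // 2]:.3f}")
```

Deep inside the slab the flux approaches the infinite-medium value q/(sigma_t − sigma_s) = 2 for these assumed data, while leakage depresses it near the vacuum boundaries.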

  4. Micromechanics Analysis Code With Generalized Method of Cells (MAC/GMC): User Guide. Version 3

    Science.gov (United States)

    Arnold, S. M.; Bednarcyk, B. A.; Wilt, T. E.; Trowbridge, D.

    1999-01-01

    The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions, it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Here the user guide is presented for the recently developed, computationally efficient and comprehensive micromechanics analysis code, MAC, whose predictive capability rests entirely upon the fully analytical generalized method of cells, GMC, micromechanics model. MAC/GMC is a versatile form of research software that "drives" the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC/GMC enhances the basic capabilities of GMC by providing a modular framework wherein 1) various thermal, mechanical (stress or strain control) and thermomechanical load histories can be imposed, 2) different integration algorithms may be selected, 3) a variety of material constitutive models (both deformation and life) may be utilized and/or implemented, 4) a variety of fiber architectures (unidirectional, laminate, and woven) may be easily accessed through their corresponding representative volume elements contained within the supplied library of RVEs or input directly by the user, and 5) graphical post-processing of the macro and/or micro field quantities is made available.

  5. The SAS4A/SASSYS-1 Safety Analysis Code System, Version 5

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H. [Argonne National Lab. (ANL), Argonne, IL (United States); Brunett, A. J. [Argonne National Lab. (ANL), Argonne, IL (United States); Sumner, T. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-01-01

    The SAS4A/SASSYS-1 computer code is developed by Argonne National Laboratory for thermal, hydraulic, and neutronic analysis of power and flow transients in liquid-metal-cooled nuclear reactors (LMRs). SAS4A was developed to analyze severe core disruption accidents with coolant boiling and fuel melting and relocation, initiated by a very low probability coincidence of an accident precursor and failure of one or more safety systems. SASSYS-1, originally developed to address loss-of-decay-heat-removal accidents, has evolved into a tool for margin assessment in design basis accident (DBA) analysis and for consequence assessment in beyond-design-basis accident (BDBA) analysis. SAS4A contains detailed, mechanistic models of transient thermal, hydraulic, neutronic, and mechanical phenomena to describe the response of the reactor core, its coolant, fuel elements, and structural members to accident conditions. The core channel models in SAS4A provide the capability to analyze the initial phase of core disruptive accidents, through coolant heat-up and boiling, fuel element failure, and fuel melting and relocation. Originally developed to analyze oxide fuel clad with stainless steel, the models in SAS4A have been extended and specialized to metallic fuel with advanced alloy cladding. SASSYS-1 provides the capability to perform a detailed thermal/hydraulic simulation of the primary and secondary sodium coolant circuits and the balance-of-plant steam/water circuit. These sodium and steam circuit models include component models for heat exchangers, pumps, valves, turbines, and condensers, and thermal/hydraulic models of pipes and plena. SASSYS-1 also contains a plant protection and control system modeling capability, which provides digital representations of reactor, pump, and valve controllers and their response to input signal changes.

  6. The bidimensional neutron transport code TWOTRAN-GG. Users manual and input data TWOTRAN-TRACA version; El codigo de transporte bidimensional TWOTRAN-GG. Manual de usuario y datos de entrada version TWOTRAN-TRACA

    Energy Technology Data Exchange (ETDEWEB)

    Ahnert, C.; Aragones, J. M.

    1981-07-01

    This is a user's manual for the neutron transport code TWOTRAN-TRACA, which is a version of the original TWOTRAN-GG from the Los Alamos Laboratory with some modifications made at JEN. A detailed input data description is given, as well as the new modifications developed at JEN. (Author) 8 refs.

  7. MSTor version 2013: A new version of the computer code for the multi-structural torsional anharmonicity, now with a coupled torsional potential

    Science.gov (United States)

    Zheng, Jingjing; Meana-Pañeda, Rubén; Truhlar, Donald G.

    2013-08-01

    We present an improved version of the MSTor program package, which calculates partition functions and thermodynamic functions of complex molecules involving multiple torsions; the method is based on either a coupled torsional potential or an uncoupled torsional potential. The program can also carry out calculations in the multiple-structure local harmonic approximation. The program package also includes seven utility codes that can be used as stand-alone programs to calculate reduced moment of inertia matrices by the method of Kilpatrick and Pitzer, to generate conformational structures, to calculate, either analytically or by Monte Carlo sampling, volumes for torsional subdomains defined by Voronoi tessellation of the conformational subspace, to generate template input files for the MSTor calculation and Voronoi calculation, and to calculate one-dimensional torsional partition functions using the torsional eigenvalue summation method. Restrictions: There is no limit on the number of torsions that can be included in either the Voronoi calculation or the full MS-T calculation. In practice, the range of problems that can be addressed with the present method consists of all multitorsional problems for which one can afford to calculate all the conformational structures and their frequencies. Unusual features: The method can be applied to transition states as well as stable molecules. 
The program package also includes the hull program for the calculation of Voronoi volumes and the symmetry program for determining the point group symmetry of a molecule.
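    The torsional eigenvalue summation method mentioned above amounts to a Boltzmann sum over the energy levels of a hindered rotor. Below is a minimal sketch for a single cosine-hindered torsion, diagonalized in a plane-wave basis; all parameters are assumed for illustration and this is not MSTor's implementation.

```python
import numpy as np

# 1-D hindered-rotor partition function by eigenvalue summation (sketch).
# Hamiltonian: H = -B d^2/dphi^2 + (V/2)(1 - cos(m*phi)), in cm^-1.
# All parameter values below are assumed, not taken from MSTor.
B = 5.0       # rotational constant, cm^-1
V = 400.0     # barrier height, cm^-1
m = 3         # barrier periodicity (e.g. a methyl-like rotor)
kT = 208.5    # k_B * 300 K in cm^-1

# Plane-wave basis exp(i*k*phi), k = -K..K. Matrix elements:
# <k|H|k> = B*k^2 + V/2,  <k|H|k +/- m> = -V/4.
K = 50
ks = np.arange(-K, K + 1)
H = np.diag(B * ks.astype(float) ** 2 + V / 2.0)
for i in range(len(ks) - m):
    H[i, i + m] = H[i + m, i] = -V / 4.0

E = np.linalg.eigvalsh(H)                  # ascending eigenvalues
q = np.sum(np.exp(-(E - E[0]) / kT))       # zero-point-referenced sum
print(f"q_torsion(300 K) ~ {q:.2f}")
```

For a 3-fold barrier the low-lying levels come in near-degenerate tunneling triplets, so the summed q exceeds the single-well harmonic-oscillator value by roughly the well multiplicity.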

  8. A target site for the treatment of tumor metastasis: MACC1

    Institute of Scientific and Technical Information of China (English)

    贺志云; 白志刚; 张忠涛

    2014-01-01

    Metastasis-associated in colon cancer-1 (MACC1) is a recently discovered gene associated with colon cancer metastasis. Several studies have indicated a significant relationship between MACC1 and various malignant tumors, and it may play an important role in the regulation of tumor metastasis. This article reviews the expression and regulatory function of MACC1 in different cancers, including colorectal, hepatic, gastric, lung, ovarian, and breast cancer. It may offer clues for finding a new target for the targeted treatment of cancer metastasis.

  9. Implementation of a 3D version of ponderomotive guiding center solver in particle-in-cell code OSIRIS

    Science.gov (United States)

    Helm, Anton; Vieira, Jorge; Silva, Luis; Fonseca, Ricardo

    2016-10-01

    Laser-driven accelerators have gained increased attention over the past decades. Typical modeling techniques for laser wakefield acceleration (LWFA) are based on particle-in-cell (PIC) simulations. PIC simulations, however, are very computationally expensive due to the disparity of the relevant scales, ranging from the laser wavelength, in the micrometer range, to the acceleration length, currently beyond the ten-centimeter range. To bridge the gap between these disparate scales, the ponderomotive guiding center (PGC) algorithm is a promising approach. By describing the evolution of the laser pulse envelope separately, only the scales larger than the plasma wavelength need to be resolved in the PGC algorithm, leading to speedups of several orders of magnitude. Previous work was limited to two dimensions. Here we present the implementation of the 3D version of a PGC solver in the massively parallel, fully relativistic PIC code OSIRIS. We extended the solver to include periodic boundary conditions and parallelization in all spatial dimensions. We present benchmarks for distributed and shared memory parallelization. We also discuss the stability of the PGC solver.

  10. Prognostic Value of MACC1 and c-met Expressions in Non-small Cell Lung Cancer

    Directory of Open Access Journals (Sweden)

    Xingsheng HU

    2012-07-01

    Background and objective: It has been proven that metastasis-associated in colon cancer 1 (MACC1) is a new gene related to the invasion and metastasis of tumors. MACC1 also regulates c-met expression. The aim of this study is to explore the expressions of MACC1 and the hepatocyte growth factor receptor (c-met), and their relationship with invasion, metastasis, and prognosis of non-small cell lung cancer (NSCLC). Methods: MACC1 and c-met expressions were detected in 103 cases of NSCLC and 40 cases of neighboring normal lung tissue using immunohistochemistry. Results: MACC1 and c-met expressions were significantly higher in lung cancer tissues than in neighboring normal tissue (P<0.001). MACC1 and c-met expressions were associated with poor differentiation, advanced T stages, lymph node metastasis, and advanced TNM stages (P<0.05) of NSCLC, but not with sex, age, smoking, or histological classification (P>0.05). In addition, a positive correlation between MACC1 and c-met expressions was observed (r=0.403, P<0.001). The Kaplan-Meier survival analysis showed that the five-year survival rate in patients with positive MACC1 and c-met expressions was remarkably lower than that in patients with negative expressions (P<0.05). The Cox regression analysis showed that MACC1 expression was an independent prognostic factor for NSCLC (P=0.026). Conclusion: MACC1 and c-met play an important role in the differentiation, invasion, and metastasis of NSCLC, and their positive expression indicates a poor prognosis in patients with NSCLC. Moreover, MACC1 expression is an independent prognostic factor for NSCLC.
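    The Kaplan-Meier analysis cited above estimates a survival curve as a running product over event times. A minimal sketch with hypothetical censored follow-up data (not the study's patient records):

```python
# Minimal Kaplan-Meier survival estimator (sketch).
# Hypothetical follow-up times in months; event = 1 means death observed,
# event = 0 means the patient was censored. Not the study's data.
data = [(12, 1), (20, 0), (24, 1), (36, 1), (48, 0), (60, 1), (60, 0), (72, 0)]

surv = 1.0
at_risk = len(data)
curve = []  # (time, S(t)) recorded at each event time
# Sort by time; at tied times process events before censorings.
for t, event in sorted(data, key=lambda d: (d[0], -d[1])):
    if event:
        surv *= (at_risk - 1) / at_risk   # step down at each observed death
        curve.append((t, surv))
    at_risk -= 1                          # both deaths and censorings leave the risk set

for t, s in curve:
    print(f"S({t}) = {s:.3f}")
```

Comparing two such curves (e.g. MACC1-positive vs. MACC1-negative patients) is what a log-rank test or the quoted five-year survival comparison formalizes.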

  11. Improvements in the parallel version of the thermal-hydraulic subchannel code COBRA-TF; Mejoras en la Version Paralela del Codigo Termohidraulico de Subcanal COBRA-TF

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, E.; Abarca, A.; Roman, J. E.; Miro, R.

    2014-07-01

    Nuclear safety analysis at the fuel rod level requires the execution of coupled neutronic/thermal-hydraulic codes that allow the simulation of large physical domains in a reasonable amount of time. For this it is essential to use numerous processors (or cores) that work together on a single problem, using the memory and computational power available in a cluster. This document presents improvements to the coupled code, centered on memory optimization and parallelism; in addition, PVM (Parallel Virtual Machine) and MPI technologies have been combined to enable the use of the coupled code CTF/PARCSv2.7. (Author)

  12. An interactive code (NETPATH) for modeling NET geochemical reactions along a flow PATH, version 2.0

    Science.gov (United States)

    Plummer, L. Niel; Prestemon, Eric C.; Parkhurst, David L.

    1994-01-01

    NETPATH is an interactive Fortran 77 computer program used to interpret net geochemical mass-balance reactions between an initial and final water along a hydrologic flow path. Alternatively, NETPATH computes the mixing proportions of two to five initial waters and the net geochemical reactions that can account for the observed composition of a final water. The program utilizes previously defined chemical and isotopic data for waters from a hydrochemical system. For a set of mineral and (or) gas phases hypothesized to be the reactive phases in the system, NETPATH calculates the mass transfers in every possible combination of the selected phases that accounts for the observed changes in the selected chemical and (or) isotopic compositions observed along the flow path. The calculations are of use in interpreting geochemical reactions, mixing proportions, evaporation and (or) dilution of waters, and mineral mass transfer in the chemical and isotopic evolution of natural and environmental waters. Rayleigh distillation calculations are applied to each mass-balance model that satisfies the constraints to predict carbon, sulfur, nitrogen, and strontium isotopic compositions at the end point, including radiocarbon dating. DB is an interactive Fortran 77 computer program used to enter analytical data into NETPATH and calculate the distribution of species in aqueous solution. This report describes the types of problems that can be solved, the methods used to solve problems, and the features available in the program to facilitate these solutions. Examples are presented to demonstrate most of the applications and features of NETPATH. The codes DB and NETPATH can be executed in the UNIX or DOS environment. This report replaces U.S. Geological Survey Water-Resources Investigations Report 91-4078, by Plummer and others, which described the original release of NETPATH, version 1.0 (dated December 1991), and documents revisions and enhancements that are included in version 2.0.
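    The net mass-balance calculation at the heart of NETPATH can be written as a linear system S·α = Δm, where S holds the elemental stoichiometry of the candidate phases, α their mass transfers (positive for dissolution), and Δm the observed concentration changes between the initial and final waters. A minimal sketch for a hypothetical two-phase, three-element system (not NETPATH's actual input format or solution strategy):

```python
import numpy as np

# Net geochemical mass balance: S @ alpha = dm, where S[i, j] is moles of
# element i per mole of phase j, alpha is the mass transfer of each phase
# (mmol/kg water, + = dissolution), and dm is the observed change in
# concentration along the flow path. Hypothetical phases and data:
#                calcite  gypsum
S = np.array([[1.0, 1.0],     # Ca
              [1.0, 0.0],     # C (carbon)
              [0.0, 1.0]])    # S (sulfur)
dm = np.array([2.5, 1.0, 1.5])  # observed changes, mmol/kg

# Solve the balance in a least-squares sense (exact here, since the
# hypothetical data are consistent).
alpha, residuals, rank, sv = np.linalg.lstsq(S, dm, rcond=None)
print(alpha)  # mass transfer of each phase
```

NETPATH itself enumerates every combination of the user-selected phases that balances the chosen constraints, rather than solving a single least-squares system.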

  13. Influence of MACC1 gene siRNA on the proliferation and migration of SBC-5 lung cancer cells

    Institute of Scientific and Technical Information of China (English)

    杨淑慧; 龙敏; 王希; 林芳; 张惠中

    2011-01-01

    Objective: To investigate the influence of siRNA targeting the metastasis-associated in colon cancer 1 (MACC1) gene on the proliferation and migration of the human lung cancer cell line SBC-5 in vitro. Methods: siRNA targeting the MACC1 gene and a negative control siRNA were chemically synthesized and transiently transfected into SBC-5 cells with Lipofectamine 2000. Expression of MACC1 mRNA and protein in siRNA-transfected cells was examined by RT-PCR and Western blotting. Proliferation and migration of siRNA-transfected SBC-5 cells were examined by MTT and wound healing assays. Results: Transfection of the specific siRNA targeting MACC1 into SBC-5 cells inhibited MACC1 gene expression; both mRNA and protein expression levels declined markedly. siRNA-transfected SBC-5 cells showed decreased ability of proliferation and migration. Conclusion: The specific siRNA targeting the MACC1 gene can significantly inhibit the proliferation and migration of SBC-5 cells; MACC1 may serve as a potential target for gene therapy of lung cancer.

  14. Prognostic Value of MACC1 in Digestive System Neoplasms: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Wu, Zhenzhen; Zhou, Rui; Su, Yuqi; Sun, Li; Liao, Yulin; Liao, Wangjun

    2015-01-01

    Metastasis associated in colon cancer 1 (MACC1), a newly identified oncogene, has been associated with poor survival of cancer patients in multiple studies. However, the prognostic value of MACC1 in digestive system neoplasms requires systematic verification. We therefore aimed to provide further evidence on this topic by systematic review and meta-analysis. A literature search was conducted in multiple databases, and eligible studies analyzing survival data and MACC1 expression were included for meta-analysis. The hazard ratio (HR) for clinical outcome was chosen as the effect measure of interest. According to our inclusion criteria, 18 studies with a total of 2,948 patients were identified. Pooled HRs indicated that high MACC1 expression significantly correlates with poorer overall survival (OS) in patients with digestive system neoplasms (HR = 1.94; 95% CI: 1.49-2.53) as well as poorer relapse-free survival (HR = 1.94; 95% CI: 1.33-2.82). Subgroup analyses categorized by methodology, anatomic structure, and cancer subtype were all consistent with the overall pooled HR for OS. No publication bias was detected according to tests of funnel plot asymmetry and Egger's test. In conclusion, high MACC1 expression may serve as a prognostic biomarker to guide individualized management in clinical practice for digestive system neoplasms.
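Pooled HRs like those quoted above are typically obtained by inverse-variance weighting on the log scale, recovering each study's standard error from its confidence interval. A minimal fixed-effect sketch with hypothetical study values (not the review's data):

```python
import math

# Fixed-effect inverse-variance pooling of hazard ratios on the log
# scale. The study values below are hypothetical, not the review's data.
studies = [   # (HR, 95% CI lower, 95% CI upper)
    (1.8, 1.2, 2.7),
    (2.3, 1.4, 3.8),
    (1.6, 1.0, 2.6),
]

z = 1.96  # normal quantile for a 95% interval
log_hrs, weights = [], []
for hr_i, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * z)  # SE recovered from the CI
    log_hrs.append(math.log(hr_i))
    weights.append(1.0 / se ** 2)                 # inverse-variance weight

pooled_log = sum(w * lh for w, lh in zip(weights, log_hrs)) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5
hr = math.exp(pooled_log)
ci = (math.exp(pooled_log - z * pooled_se),
      math.exp(pooled_log + z * pooled_se))
print(f"pooled HR = {hr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

A random-effects model (as often used when study heterogeneity is present) would additionally inflate each weight's variance by a between-study component; the fixed-effect version above is the simplest case.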

  15. ITS version 5.0: the integrated TIGER series of coupled electron/photon Monte Carlo transport codes with CAD geometry.

    Energy Technology Data Exchange (ETDEWEB)

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2005-09-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.
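At the heart of any Monte Carlo transport code such as ITS is sampling the distance to the next interaction from an exponential free-path distribution. The toy below (illustrative only; it implements none of ITS's physics or cross-section handling) estimates uncollided transmission through a slab and compares it with the analytic answer:

```python
import math
import random

# Toy Monte Carlo transmission through a slab: sample the distance to
# the first interaction from an exponential free-path distribution.
# Illustrative only; none of ITS's physics or cross sections appear here.
random.seed(1)

mu = 0.2          # total attenuation coefficient, 1/cm (hypothetical)
thickness = 5.0   # slab thickness, cm
n = 100_000

transmitted = 0
for _ in range(n):
    path = -math.log(1.0 - random.random()) / mu   # sampled free path
    if path > thickness:
        transmitted += 1

estimate = transmitted / n
analytic = math.exp(-mu * thickness)   # uncollided fraction, e^(-mu*t)
print(f"MC: {estimate:.3f}  analytic: {analytic:.3f}")
```

A production code like ITS replaces the single constant `mu` with energy-dependent cross sections and follows the full electron/photon cascade through secondary-particle production, but the sampling step is the same idea.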

  16. Revision, uptake and coding issues related to the open access Orchard Sports Injury Classification System (OSICS) versions 8, 9 and 10.1

    Directory of Open Access Journals (Sweden)

    John Orchard

    2010-10-01

    Full Text Available John Orchard1, Katherine Rae1, John Brooks2, Martin Hägglund3, Lluis Til4, David Wales5, Tim Wood6. 1Sports Medicine at Sydney University, Sydney, NSW, Australia; 2Rugby Football Union, Twickenham, England, UK; 3Department of Medical and Health Sciences, Linköping University, Linköping, Sweden; 4FC Barcelona, Barcelona, Catalonia, Spain; 5Arsenal FC, Highbury, England, UK; 6Tennis Australia, Melbourne, Vic, Australia. Abstract: The Orchard Sports Injury Classification System (OSICS) is one of the world’s most commonly used systems for coding injury diagnoses in sports injury surveillance systems. Its major strengths are that it has wide usage, has codes specific to sports medicine and that it is free to use. Literature searches and stakeholder consultations were made to assess the uptake of OSICS and to develop new versions. OSICS was commonly used in the sports of football (soccer), Australian football, rugby union, cricket and tennis. It is referenced in international papers in three sports and used in four commercially available computerised injury management systems. Suggested injury categories for the major sports are presented. New versions OSICS 9 (three digit codes) and OSICS 10.1 (four digit codes) are presented. OSICS is a potentially helpful component of a comprehensive sports injury surveillance system, but many other components are required. Choices made in developing these components should ideally be agreed upon by groups of researchers in consensus statements. Keywords: sports injury classification, epidemiology, surveillance, coding

  17. User manual for version 4.3 of the Tripoli-4 Monte-Carlo method particle transport computer code; Notice d'utilisation du code Tripoli-4, version 4.3: code de transport de particules par la methode de Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Both, J.P.; Mazzolo, A.; Peneliau, Y.; Petit, O.; Roesslinger, B

    2003-07-01

    This manual relates to Version 4.3 TRIPOLI-4 code. TRIPOLI-4 is a computer code simulating the transport of neutrons, photons, electrons and positrons. It can be used for radiation shielding calculations (long-distance propagation with flux attenuation in non-multiplying media) and neutronic calculations (fissile medium, criticality or sub-criticality basis). This makes it possible to calculate k{sub eff} (for criticality), flux, currents, reaction rates and multi-group cross-sections. TRIPOLI-4 is a three-dimensional code that uses the Monte-Carlo method. It allows for point-wise description in terms of energy of cross-sections and multi-group homogenized cross-sections and features two modes of geometrical representation: surface and combinatorial. The code uses cross-section libraries in ENDF/B format (such as JEF2-2, ENDF/B-VI and JENDL) for point-wise description cross-sections in APOTRIM format (from the APOLLO2 code) or a format specific to TRIPOLI-4 for multi-group description. (authors)

  18. Single particle calculations for a Woods-Saxon potential with triaxial deformations, and large Cartesian oscillator basis (TRIAXIAL 2014, Third version of the code Triaxial)

    Science.gov (United States)

    Mohammed-Azizi, B.; Medjadi, D. E.

    2014-11-01

    Theory and the FORTRAN program of the first version of this code (TRIAXIAL) have already been described in detail in Computer Physics Comm. 156 (2004) 241-282. A second version (TRIAXIAL 2007) was given in CPC 176 (2007) 634-635. The present FORTRAN program is the third version (TRIAXIAL 2014) of the same code, now written in free format. Like the former versions, it solves the same Schrödinger equation of the independent-particle model of the atomic nucleus with the same method. The present version is, however, much more convenient: the eigenvalues and eigenfunctions can now be obtained through dedicated subroutines, which did not exist in the 2004 and 2007 versions. In addition, in the previous versions the eigenfunctions were given only by the coefficients of their expansion onto the harmonic oscillator basis. That representation is needed in some cases, but in others it is preferable to treat the eigenfunctions directly in configuration space, so an additional subroutine has been implemented for this task, along with several other practical subroutines. Moreover, eigenvalues and eigenfunctions are recorded in several files. All these new features of the code and some important aspects of its structure are explained in the document ‘Triaxial2014 use.pdf’. Catalogue identifier: ADSK_v3_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADSK_v3_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 13672 No. of bytes in distributed program, including test data, etc.: 217598 Distribution format: tar.gz Programming language: FORTRAN 77/90 (double precision). Computer: PC, Pentium 4, 2600 MHz and beyond. Operating system: WINDOWS XP

  19. Mg/O2 Battery Based on the Magnesium-Aluminum Chloride Complex (MACC) Electrolyte

    DEFF Research Database (Denmark)

    Vardar, Galin; Smith, Jeffrey G.; Thomson, Travis

    2016-01-01

    Mg/O2 cells employing a MgCl2/AlCl3/DME (MACC/DME) electrolyte are cycled and compared to cells with modified Grignard electrolytes, showing that performance of magnesium/oxygen batteries depends strongly on electrolyte composition. Discharge capacity is far greater for MACC/DME-based cells, while...... substantially and likely explains the poor rechargeability. An additional impedance rise consistent with film formation on the Mg negative electrode suggests the presence of detrimental O2 crossover. Minimizing O2 crossover and bypassing charge transfer through the discharge product would improve battery...

  20. Revision, uptake and coding issues related to the open access Orchard Sports Injury Classification System (OSICS) versions 8, 9 and 10.1.

    Science.gov (United States)

    Orchard, John; Rae, Katherine; Brooks, John; Hägglund, Martin; Til, Lluis; Wales, David; Wood, Tim

    2010-01-01

    The Orchard Sports Injury Classification System (OSICS) is one of the world's most commonly used systems for coding injury diagnoses in sports injury surveillance systems. Its major strengths are that it has wide usage, has codes specific to sports medicine and that it is free to use. Literature searches and stakeholder consultations were made to assess the uptake of OSICS and to develop new versions. OSICS was commonly used in the sports of football (soccer), Australian football, rugby union, cricket and tennis. It is referenced in international papers in three sports and used in four commercially available computerised injury management systems. Suggested injury categories for the major sports are presented. New versions OSICS 9 (three digit codes) and OSICS 10.1 (four digit codes) are presented. OSICS is a potentially helpful component of a comprehensive sports injury surveillance system, but many other components are required. Choices made in developing these components should ideally be agreed upon by groups of researchers in consensus statements.

  1. Solution of the Skyrme HF+BCS equation on a 3D mesh. II. A new version of the Ev8 code

    CERN Document Server

    Ryssens, W; Bender, M; Heenen, P -H

    2014-01-01

    We describe a new version of the EV8 code that solves the nuclear Skyrme-Hartree-Fock+BCS problem using a 3-dimensional Cartesian mesh. Several new features have been implemented with respect to the earlier version published in 2005. In particular, the numerical accuracy for a given mesh size has been improved by (i) implementing a new solver to determine the Coulomb potential for protons and (ii) implementing a more precise method to calculate derivatives on a mesh, which had already been used in our beyond-mean-field codes. The code has been made very flexible to enable the use of the large variety of Skyrme energy density functionals introduced in recent years. Finally, the treatment of the constraints that can be introduced in the mean-field equations has been improved. The code Ev8 is today the tool of choice to study the variation of the energy of a nucleus from its ground state to very elongated or triaxial deformations with well-controlled accuracy.
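The accuracy gain from a more precise mesh derivative can be illustrated by comparing second- and fourth-order central-difference stencils (a generic finite-difference illustration, not Ev8's actual discretization):

```python
import numpy as np

# Second- vs fourth-order central differences for d/dx sin(x) on a mesh.
# A generic finite-difference illustration, not Ev8's actual scheme.
h = 0.1
x = np.arange(-1.0, 1.0 + h / 2, h)
f = np.sin(x)
exact = np.cos(x)

d2 = (np.roll(f, -1) - np.roll(f, 1)) / (2 * h)
d4 = (-np.roll(f, -2) + 8 * np.roll(f, -1)
      - 8 * np.roll(f, 1) + np.roll(f, 2)) / (12 * h)

interior = slice(2, -2)   # drop points polluted by np.roll wrap-around
err2 = np.max(np.abs(d2[interior] - exact[interior]))
err4 = np.max(np.abs(d4[interior] - exact[interior]))
print(f"2nd order: {err2:.2e}   4th order: {err4:.2e}")
```

The higher-order stencil cuts the error by several orders of magnitude at the same mesh spacing, which is why a better derivative method improves accuracy "for a given mesh size".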

  2. Targeting MACC1 by RNA interference inhibits proliferation and invasion of bladder urothelial carcinoma in T24 cells.

    Science.gov (United States)

    Xu, Song-Tao; Ding, Xiang; Ni, Qing-Feng; Jin, Shao-Ju

    2015-01-01

    The purpose of this article is to investigate whether MACC1 can serve as a potential target for gene therapy of human bladder urothelial carcinoma (BUC). In this study, expression of the MACC1 gene was knocked down by RNA interference (RNAi) in T24 cells (a human BUC cell line). The transcription level of MACC1 was detected by RT-PCR. Levels of MACC1, caspase-3, caspase-8, Bax and Met (mesenchymal-epithelial transition factor) proteins were measured by Western blot. Cell proliferation and apoptosis were detected by MTT assay and flow cytometry, and cell invasion was assessed by Matrigel transwell assay. MMP-2 (matrix metalloproteinase-2) protein was also measured by ELISA. The results showed that the levels of MACC1 mRNA and protein were significantly reduced after RNAi. The MTT assay showed that proliferation of T24 cells was decreased by RNA interference, and apoptosis studies showed that T24 cells lost their anti-apoptotic capacity after MACC1 interference. Expression of the apoptosis proteins caspase-3, caspase-8 and Bax increased significantly after MACC1 RNAi, and the level of Met protein was markedly down-regulated. The transwell assay showed that the invasion ability of T24 cells was clearly reduced by MACC1 RNAi, and further studies showed that secretion of MMP-2 was also reduced. We conclude that the proliferation and invasion of T24 cells can be inhibited by RNAi targeting MACC1; thus, MACC1 can serve as a potential target for gene therapy of human bladder urothelial carcinoma.

  3. MIG version 0.0 model interface guidelines: Rules to accelerate installation of numerical models into any compliant parent code

    Energy Technology Data Exchange (ETDEWEB)

    Brannon, R.M.; Wong, M.K.

    1996-08-01

    A set of model interface guidelines, called MIG, is presented as a means by which any compliant numerical material model can be rapidly installed into any parent code without having to modify the model subroutines. Here, "model" usually means a material model such as one that computes stress as a function of strain, though the term may be extended to any numerical operation. "Parent code" means a hydrocode, finite element code, etc. which uses the model and enforces, say, the fundamental laws of motion and thermodynamics. MIG requires the model developer (who creates the model package) to specify model needs in a standardized but flexible way. MIG includes a dictionary of technical terms that allows developers and parent code architects to share a common vocabulary when specifying field variables. For portability, database management is the responsibility of the parent code. Input/output occurs via structured calling arguments. As much model information as possible (such as the lists of required inputs, as well as lists of precharacterized material data and special needs) is supplied by the model developer in an ASCII text file. Every MIG-compliant model also has three required subroutines to check data, to request extra field variables, and to perform model physics. To date, the MIG scheme has proven flexible in beta installations of a simple yield model, plus a more complicated viscodamage yield model, three electromechanical models, and a complicated anisotropic microcrack constitutive model. The MIG yield model has been successfully installed using identical subroutines in three vectorized parent codes and one parallel C++ code, all predicting comparable results. By maintaining one model for many codes, MIG facilitates code-to-code comparisons and reduces duplication of effort, thereby reducing the cost of installing and sharing models in diverse new codes.
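The three required subroutines of a MIG-compliant model can be pictured as a fixed interface that any parent code drives. The sketch below uses Python and hypothetical names (`YieldModel`, `check_data`, `request_extra_variables`, `update`); the real MIG targets Fortran/C parent codes:

```python
import math

# Sketch of a MIG-style model interface (hypothetical Python names;
# the real MIG targets Fortran/C parent codes such as hydrocodes).
class YieldModel:
    """A "model" in MIG's sense: here, stress as a function of strain."""

    def check_data(self, params):
        # Required subroutine 1: validate user-supplied material data.
        if params.get("yield_stress", 0.0) <= 0.0:
            raise ValueError("yield_stress must be positive")
        return params

    def request_extra_variables(self):
        # Required subroutine 2: ask the parent code to allocate extra
        # field variables (the database itself stays parent-side).
        return ["equivalent_plastic_strain"]

    def update(self, strain, params, state):
        # Required subroutine 3: the model physics. A 1D elastic-
        # perfectly-plastic stress update, for illustration only.
        eps_p = state["equivalent_plastic_strain"]
        trial = params["youngs_modulus"] * (strain - eps_p)
        if abs(trial) <= params["yield_stress"]:
            return trial, state
        stress = math.copysign(params["yield_stress"], trial)
        state["equivalent_plastic_strain"] = (
            eps_p + (trial - stress) / params["youngs_modulus"])
        return stress, state

# The parent code drives any compliant model through the same calls:
model = YieldModel()
params = model.check_data({"youngs_modulus": 200.0, "yield_stress": 1.0})
state = {name: 0.0 for name in model.request_extra_variables()}
stress, state = model.update(0.02, params, state)
print(stress, state["equivalent_plastic_strain"])
```

Note how the parent code, not the model, owns the state dictionary, mirroring MIG's rule that database management is the parent code's responsibility.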

  4. Calculations of reactor-accident consequences, Version 2. CRAC2: computer code user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    1983-02-01

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.

  5. The relationship development assessment - research version: preliminary validation of a clinical tool and coding schemes to measure parent-child interaction in autism.

    Science.gov (United States)

    Larkin, Fionnuala; Guerin, Suzanne; Hobson, Jessica A; Gutstein, Steven E

    2015-04-01

    The aim of this project was to replicate and extend findings from two recent studies on parent-child relatedness in autism (Beurkens, Hobson, & Hobson, 2013; Hobson, Tarver, Beurkens, & Hobson, 2013, under review) by adapting an observational assessment and coding schemes of parent-child relatedness for the clinical context and examining their validity and reliability. The coding schemes focussed on three aspects of relatedness: joint attentional focus (Adamson, Bakeman, & Deckner, 2004), the capacity to co-regulate an interaction and the capacity to share emotional experiences. The participants were 40 children (20 with autism, 20 without autism) aged 6-14, and their parents. Parent-child dyads took part in the observational assessment and were coded on these schemes. Comparisons were made with standardised measures of autism severity (Autism Diagnostic Observation Schedule, ADOS: Lord, Rutter, DiLavore, & Risi, 2001; Social Responsiveness Scale, SRS: Constantino & Gruber, 2005), relationship quality (Parent Child Relationship Inventory, PCRI: Gerard, 1994) and quality of parent-child interaction (Dyadic Coding Scales, DCS: Humber & Moss, 2005). Inter-rater reliability was very good and, as predicted, codes both diverged from the measure of parent-child relationship and converged with a separate measure of parent-child interaction quality. A detailed profile review revealed nuanced areas of group and individual differences which may be specific to verbally-able school-age children. The results support the utility of the Relationship Development Assessment - Research Version for clinical practice. © The Author(s) 2013.

  6. Over-expression of Metastasis-associated in Colon Cancer-1 (MACC1) Associates with Better Prognosis of Gastric Cancer Patients

    Institute of Scientific and Technical Information of China (English)

    Shao-hua Ge; Jia-fu Ji; Xiao-jiang Wu; Xiao-hong Wang; Xiao-fang Xing; Lian-hai Zhang; Yu-bing Zhu; Hong Du; Bin Dong; Ying Hu

    2011-01-01

    Objective: The aim of this study was to detect metastasis-associated in colon cancer-1 (MACC1) expression in Chinese gastric cancer and analyze the relationship between MACC1 expression and postoperative survival. Methods: The expression of MACC1 and c-MET protein in a sample of 128 gastric cancer tissues was detected by immunohistochemistry. A retrospective cohort study on the prognosis was carried out and data were collected from medical records. Results: The positive rate of MACC1 protein expression in gastric cancer was 47.66%, higher than that in adjacent noncancerous mucosa (P<0.001). MACC1 protein expression was not related to the clinicopathological variables involved. Kaplan-Meier analysis revealed that the survival of MACC1 positive group tended to be better than that of MACC1 negative group, particularly in patients with stage Ⅲ carcinoma (P=0.032). Cox regression analysis revealed that MACC1 protein over-expression in gastric cancer tended to be a protective factor with hazard ratio of 0.621 (P=0.057). Immunohistochemical analysis showed that the positive rate of c-MET protein expression was much higher in cases with positive MACC1 expression in gastric cancer (P=0.002), but P53 expression was not associated with MACC1 expression. Conclusion: MACC1 over-expression implies better survival and may be an independent prognostic factor for gastric cancer in Chinese patients.

  7. Prognostic significance of metastasis associated in colon cancer 1 (MACC1) expression in patients with gallbladder cancer

    Directory of Open Access Journals (Sweden)

    Lijian Chen

    2014-01-01

    Full Text Available Background: The clinical significance of metastasis associated in colon cancer 1 (MACC1) in human gallbladder cancer is not yet established. This study was performed to assess the expression of MACC1 in benign and malignant gallbladder lesions and to assess its clinicopathological significance. Materials and Methods: Tissue samples from resected gallbladder cancer (n = 70) and cholelithiasis (n = 70) were evaluated for MACC1 expression by immunohistochemical staining. Expression was correlated with different clinicopathological parameters. Results: Cytoplasmic MACC1 expression was significantly higher (58.6%) in gallbladder cancer than in chronic cholecystitis (27.1%, P 0.05). The univariate Kaplan-Meier analysis showed that positive MACC1 expression was associated with decreased overall survival (P < 0.001). The multivariate Cox regression analysis showed that MACC1 expression and the histopathological subtypes were independent risk factors for disease-free survival. Conclusion: The expression of MACC1 might be closely related to carcinogenesis, clinical biological behaviors, and prognosis of gallbladder adenocarcinoma.

  8. The Eyjafjallajökull volcanic eruption from the MACC perspective

    Science.gov (United States)

    Engelen, Richard; Flemming, Johannes; Benedetti, Angela; Kaiser, Johannes W.; Morcrette, Jean-Jacques; Simmons, Adrian; MACC Consortium

    2010-05-01

    The recent eruption of the Eyjafjallajökull volcano on Iceland triggered a strong response from many modelling and observation groups around Europe. MACC (Monitoring Atmospheric Composition and Climate) is building the atmospheric component of Europe's GMES (Global Monitoring for Environment and Security) initiative and has used its pre-operational global assimilation and forecasting system to provide simulations of the development of the volcanic ash plume. Some basic assumptions were made about the height of the injection and the lifetime of the tracer. These simulations have been provided on a daily basis on the MACC web site. At the same time, MACC has also tried to gather relevant observations on top of those that are already assimilated in the pre-operational system. This will allow validation of our plume forecasts as well as assessment of the potential of assimilating these additional observations. The main aim now is to further develop the MACC system to a stage where it can adequately respond to similar events in the future when GMES becomes operational. MACC will then be able to offer support to the official Volcanic Ash Advisory Centres in their task of advising the aviation authorities. In this presentation we will present our plume simulations as well as some initial validation. We will also present some preliminary data assimilation experiments to show the potential and difficulties of data assimilation in the case of volcanic eruptions. Finally, we will try to make a first assessment of what is needed in the near future in terms of model development and observations to be fully prepared for events like the eruption of the Eyjafjallajökull volcano.

  9. Does geographical variability influence five-year MACCE rates in the multicentre SYNTAX revascularisation trial?

    Science.gov (United States)

    Roy, Andrew K; Chevalier, Bernard; Lefèvre, Thierry; Louvard, Yves; Segurado, Ricardo; Sawaya, Fadi; Spaziano, Marco; Neylon, Antoinette; Serruys, Patrick A; Dawkins, Keith D; Kappetein, Arie Pieter; Mohr, Friedrich-Wilhelm; Colombo, Antonio; Feldman, Ted; Morice, Marie-Claude

    2017-09-20

    The use of multiple geographical sites for randomised cardiovascular trials may lead to important heterogeneity in treatment effects. This study aimed to determine whether treatment effects from different geographical recruitment regions impacted significantly on five-year MACCE rates in the SYNTAX trial. Five-year SYNTAX results (n=1,800) were analysed for geographical variability by site and country for the effect of treatment (CABG vs. PCI) on MACCE rates. Fixed, random, and linear mixed models were used to test clinical covariate effects, such as diabetes, lesion characteristics, and procedural factors. Comparing five-year MACCE rates, the pooled odds ratio (OR) between study sites was 0.58 (95% CI: 0.47-0.71), and countries 0.59 (95% CI: 0.45-0.73). By homogeneity testing, no individual site (X2=93.8, p=0.051) or country differences (X2=25.7, p=0.080) were observed. For random effects models, the intraclass correlation was minimal (ICC site=5.1%, ICC country=1.5%, pfive-year MACCE outcomes (ICC 1.3%-5.2%), nor did revascularisation of the left main vs. three-vessel disease (p=0.241), across site or country subgroups. For CABG patients, the number of arterial (p=0.49) or venous (p=0.38) conduits used also made no difference. Geographic variability has no significant treatment effect on MACCE rates at five years. These findings highlight the generalisability of the five-year outcomes of the SYNTAX study.

  10. Benchmarking a modified version of the civ3 nonrelativistic atomic-structure code within Na-like-tungsten R-matrix calculations

    Science.gov (United States)

    Turkington, M. D.; Ballance, C. P.; Hibbert, A.; Ramsbottom, C. A.

    2016-08-01

    In this work we explore the validity of employing a modified version of the nonrelativistic structure code civ3 for heavy, highly charged systems, using Na-like tungsten as a simple benchmark. Consequently, we present radiative and subsequent collisional atomic data compared with corresponding results from a fully relativistic structure and collisional model. Our motivation for this line of study is to benchmark civ3 against the relativistic grasp0 structure code. This is an important study as civ3 wave functions in nonrelativistic R -matrix calculations are computationally less expensive than their Dirac counterparts. There are very few existing data for the W LXIV ion in the literature with which we can compare except for an incomplete set of energy levels available from the NIST database. The overall accuracy of the present results is thus determined by the comparison between the civ3 and grasp0 structure codes alongside collisional atomic data computed by the R -matrix Breit-Pauli and Dirac codes. It is found that the electron-impact collision strengths and effective collision strengths computed by these differing methods are in good general agreement for the majority of the transitions considered, across a broad range of electron temperatures.

  11. Assimilation of atmospheric methane products in the MACC-II system: from SCIAMACHY to TANSO and IASI

    Directory of Open Access Journals (Sweden)

    S. Massart

    2014-01-01

    Full Text Available The Monitoring Atmospheric Composition and Climate Interim Implementation (MACC-II) delayed-mode (DM) system has been producing an atmospheric methane (CH4) analysis 6 months behind real time since June 2009. This analysis used to rely on the assimilation of the CH4 product from the SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY) instrument on board Envisat. Recently the Laboratoire de Météorologie Dynamique (LMD) CH4 products from the Infrared Atmospheric Sounding Interferometer (IASI) and the SRON Netherlands Institute for Space Research CH4 products from the Thermal And Near-infrared Sensor for carbon Observation (TANSO) were added to the DM system. With the loss of Envisat in April 2012, the DM system now has to rely on the assimilation of methane data from TANSO and IASI. This paper documents the impact of this change in the observing system on the methane tropospheric analysis. It is based on four experiments: one free run and three analyses from, respectively, the assimilation of SCIAMACHY, TANSO and a combination of TANSO and IASI CH4 products in the MACC-II system. The period between December 2010 and April 2012 is studied. This corresponds to a period during which the performance of SCIAMACHY was deteriorating. The SCIAMACHY experiment globally underestimates tropospheric methane by 35 parts per billion (ppb) compared to the HIAPER Pole-to-Pole Observations (HIPPO) data and the methane column by 23 ppb compared to the Total Carbon Column Observing Network (TCCON) data, whereas the global bias of the free run against the same HIPPO and TCCON data is respectively −5 ppb and 4 ppb. The assimilated TANSO product changed in October 2011 from version v.1 to version v.2.0. The analysis of version v.1 globally underestimates tropospheric methane by 18 ppb compared to the HIPPO data and the column by 11 ppb compared to the TCCON data. In contrast, the analysis of version v.2.0 globally overestimates the

  12. Verification of ECMWF and ECMWF/MACC's global and direct irradiance forecasts with respect to solar electricity production forecasts

    Directory of Open Access Journals (Sweden)

    M. Schroedter-Homscheidt

    2017-02-01

    Full Text Available The successful electricity grid integration of solar energy into day-ahead markets requires at least hourly resolved 48 h forecasts. Technologies such as photovoltaics and non-concentrating solar thermal technologies make use of global horizontal irradiance (GHI) forecasts, while all concentrating technologies, both from the photovoltaic and the thermal sector, require direct normal irradiance (DNI). The European Centre for Medium-Range Weather Forecasts (ECMWF) has recently changed towards providing direct as well as global irradiances. Additionally, the MACC (Monitoring Atmospheric Composition & Climate) near-real-time services provide daily analyses and forecasts of aerosol properties in preparation for the upcoming European Copernicus programme. The operational ECMWF/IFS (Integrated Forecast System) forecast system will in the medium term profit from the Copernicus service aerosol forecasts. Therefore, within the MACC-II project, specific experiment runs were performed allowing for the assessment of the performance gain of these potential future capabilities. The potential impact of providing forecasts with hourly output resolution compared to three-hourly resolved forecasts is also investigated. The inclusion of the new aerosol climatology in October 2003 improved both the GHI and DNI forecasts remarkably, while the change towards a new radiation scheme in 2007 had only minor and partly even unfavourable impacts on the performance indicators. For GHI, larger RMSE (root mean square error) values are found for broken/overcast conditions than for scattered cloud fields. For DNI, the findings are opposite, with larger RMSE values for scattered clouds compared to overcast/broken cloud situations. The introduction of direct irradiances as an output parameter in the operational IFS version has not resulted in a general performance improvement with respect to biases and RMSE compared to the widely used Skartveit et al. (1998) global to direct irradiance

  13. Construction of Efficient Version Management and Code Maintenance Mechanism

    Institute of Scientific and Technical Information of China (English)

    翟宏宇; 文大化; 徐春凤

    2012-01-01

    The article discusses the construction of efficient version management and code maintenance mechanisms for driver development on the Windows Mobile phone development platform. Version control checks and ensures that each integration of the project's code is sound, and that the location and purpose of each change are clear. Daily Build and BVT (Build Verification Test) compile-and-verify testing is used to check whether each version's changes introduce functional errors, allowing the entire project to be developed efficiently.
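
    The Daily Build plus BVT gating described above amounts to a pass/fail gate on each nightly build. A minimal sketch; the check names and the `run_bvt` helper are illustrative, not from the project:

```python
def run_bvt(build_artifacts, checks):
    """Run each verification check against the build; return (passed, failures)."""
    failures = [name for name, check in checks.items() if not check(build_artifacts)]
    return (not failures, failures)

# Illustrative BVT checks for a nightly driver build
checks = {
    "compiles": lambda b: b.get("compile_ok", False),
    "driver_loads": lambda b: b.get("driver_ok", False),
    "smoke_tests": lambda b: b.get("smoke_ok", False),
}

nightly = {"compile_ok": True, "driver_ok": True, "smoke_ok": False}
passed, failures = run_bvt(nightly, checks)
print(passed, failures)  # a single failing check blocks promotion of the build
```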

  14. A fully-neoclassical finite-orbit-width version of the CQL3D Fokker-Planck code

    Science.gov (United States)

    Petrov, Yu V.; Harvey, R. W.

    2016-11-01

    The time-dependent bounce-averaged CQL3D flux-conservative finite-difference Fokker-Planck equation (FPE) solver has been upgraded to include finite-orbit-width (FOW) capabilities which are necessary for an accurate description of neoclassical transport, losses to the walls, and transfer of particles, momentum, and heat to the scrape-off layer. The FOW modifications are implemented in the formulation of the neutral beam source, collision operator, RF quasilinear diffusion operator, and in synthetic particle diagnostics. The collisional neoclassical radial transport appears naturally in the FOW version due to the orbit-averaging of local collision coefficients coupled with transformation coefficients from local (R, Z) coordinates along each guiding-center orbit to the corresponding midplane computational coordinates, where the FPE is solved. In a similar way, the local quasilinear RF diffusion terms give rise to additional radial transport of orbits. We note that the neoclassical results are obtained for ‘full’ orbits, not dependent on a common small orbit-width approximation. Results of validation tests for the FOW version are also presented.

  15. MELCOR computer code manuals: Primer and user's guides, Version 1.8.3 September 1994. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L. [Sandia National Labs., Albuquerque, NM (United States); Hodge, S.A.; Hyman, C.R.; Sanders, R.L. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the US Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users' Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  16. ECMWF MACC-II evaluation of performances with MPLNET Lidar network at NASA Goddard Flight Center

    Science.gov (United States)

    Lolli, Simone; Welton, Ellsworth J.; Benedetti, Angela; Lewis, Jasper

    2016-04-01

    Aerosol vertical distribution is a critical parameter for most of the common aerosol forecast models. In this study, the performance of the MACC-II ECMWF aerosol model in forecasting aerosol extinction profiles and planetary boundary layer height is evaluated against the new V3 MPLNET lidar extinction retrievals, taken as reference, at the continuously operated Goddard Space Flight Center site, MD, USA. The model is evaluated at different assimilation stages: no assimilation, MODIS Aerosol Optical Depth (AOD) assimilation, and MODIS AOD plus CALIPSO lidar assimilation. The sensitivity of the model with respect to the assimilation process is also investigated. Assessing the model performance is the first step towards future near-real-time lidar data assimilation into the MACC-II aerosol forecast model.
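
    The bias and RMSE scores used in such model-versus-lidar evaluations can be computed straightforwardly; the extinction profile values below are made up for illustration, not MPLNET data:

```python
import math

def bias_and_rmse(model, obs):
    """Mean bias and root mean square error of model values
    against reference observations of equal length."""
    diffs = [m - o for m, o in zip(model, obs)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Illustrative extinction profiles (km^-1) at four heights
model = [0.10, 0.08, 0.05, 0.02]
obs   = [0.12, 0.07, 0.06, 0.02]
b, r = bias_and_rmse(model, obs)
print(round(b, 4), round(r, 4))
```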

  17. WSPEEDI (worldwide version of SPEEDI): A computer code system for the prediction of radiological impacts on Japanese due to a nuclear accident in foreign countries

    Energy Technology Data Exchange (ETDEWEB)

    Chino, Masamichi; Yamazawa, Hiromi; Nagai, Haruyasu; Moriuchi, Shigeru [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ishikawa, Hirohiko

    1995-09-01

    A computer code system has been developed for near real-time dose assessment during radiological emergencies. The system WSPEEDI, the worldwide version of SPEEDI (System for Prediction of Environmental Emergency Dose Information) aims at predicting the radiological impact on Japanese due to a nuclear accident in foreign countries. WSPEEDI consists of a mass-consistent wind model WSYNOP for large-scale wind fields and a particle random walk model GEARN for atmospheric dispersion and dry and wet deposition of radioactivity. The models are integrated into a computer code system together with a system control software, worldwide geographic database, meteorological data processor and graphic software. The performance of the models has been evaluated using the Chernobyl case with reliable source terms, well-established meteorological data and a comprehensive monitoring database. Furthermore, the response of the system has been examined by near real-time simulations of the European Tracer Experiment (ETEX), carried out over about 2,000 km area in Europe. (author).
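
    The particle random-walk idea behind a dispersion model like GEARN can be illustrated with a toy 2D Lagrangian scheme: mean-wind advection plus Gaussian turbulent displacements. All parameters are invented for illustration and unrelated to the actual WSPEEDI models:

```python
import random

def random_walk_plume(n_particles, n_steps, u=1.0, sigma=0.5, seed=42):
    """Toy 2D particle dispersion: each step advects a particle by the
    mean wind u and adds a Gaussian turbulent displacement of width sigma."""
    rng = random.Random(seed)
    positions = []
    for _ in range(n_particles):
        x = y = 0.0
        for _ in range(n_steps):
            x += u + rng.gauss(0.0, sigma)
            y += rng.gauss(0.0, sigma)
        positions.append((x, y))
    return positions

pts = random_walk_plume(500, 100)
mean_x = sum(p[0] for p in pts) / len(pts)
# The plume centre advects roughly u * n_steps downwind
print(abs(mean_x - 100.0) < 5.0)
```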

  18. An extended version of the SERPENT-2 code to investigate fuel burn-up and core material evolution of the Molten Salt Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Aufiero, M.; Cammi, A.; Fiorina, C. [Politecnico di Milano, Department of Energy, CeSNEF (Enrico Fermi Center for Nuclear Studies), via Ponzio, 34/3, I-20133 Milano (Italy); Leppänen, J. [VTT Technical Research Centre of Finland, P.O. Box 1000, FI-02044 VTT (Finland); Luzzi, L., E-mail: lelio.luzzi@polimi.it [Politecnico di Milano, Department of Energy, CeSNEF (Enrico Fermi Center for Nuclear Studies), via Ponzio, 34/3, I-20133 Milano (Italy); Ricotti, M.E. [Politecnico di Milano, Department of Energy, CeSNEF (Enrico Fermi Center for Nuclear Studies), via Ponzio, 34/3, I-20133 Milano (Italy)

    2013-10-15

    In this work, the Monte Carlo burn-up code SERPENT-2 has been extended and employed to study the material isotopic evolution of the Molten Salt Fast Reactor (MSFR). This promising GEN-IV nuclear reactor concept features peculiar characteristics such as the on-line fuel reprocessing, which prevents the use of commonly available burn-up codes. Besides, the presence of circulating nuclear fuel and radioactive streams from the core to the reprocessing plant requires a precise knowledge of the fuel isotopic composition during the plant operation. The developed extension of SERPENT-2 directly takes into account the effects of on-line fuel reprocessing on burn-up calculations and features a reactivity control algorithm. It is here assessed against a dedicated version of the deterministic ERANOS-based EQL3D procedure (PSI-Switzerland) and adopted to analyze the MSFR fuel salt isotopic evolution. Particular attention is devoted to study the effects of reprocessing time constants and efficiencies on the conversion ratio and the molar concentration of elements relevant for solubility issues (e.g., trivalent actinides and lanthanides). Quantities of interest for fuel handling and safety issues are investigated, including decay heat and activities of hazardous isotopes (neutron and high energy gamma emitters) in the core and in the reprocessing stream. The radiotoxicity generation is also analyzed for the MSFR nominal conditions. The production of helium and the depletion in tungsten content due to nuclear reactions are calculated for the nickel-based alloy selected as reactor structural material of the MSFR. These preliminary evaluations can be helpful in studying the radiation damage of both the primary salt container and the axial reflectors.
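
    The effect of on-line reprocessing on burn-up can be sketched with a single-nuclide balance, dN/dt = P − (λ + r)N, where the removal constant r encodes the reprocessing time constant and efficiency alongside radioactive decay λ. The numbers below are illustrative, not MSFR data or SERPENT-2 output:

```python
def deplete(production, lam, removal, dt, steps, n0=0.0):
    """Explicit-Euler integration of dN/dt = production - (lam + removal) * N.
    'removal' is the on-line reprocessing constant added to decay."""
    n = n0
    for _ in range(steps):
        n += (production - (lam + removal) * n) * dt
    return n

lam, removal, production = 1e-6, 1e-5, 1.0   # illustrative constants (1/s, atoms/s)
n_eq = production / (lam + removal)          # analytic equilibrium inventory
n_num = deplete(production, lam, removal, dt=1000.0, steps=20000)
# After many time constants the numerical solution sits at equilibrium
print(abs(n_num - n_eq) / n_eq < 0.01)
```

    Raising the removal constant lowers the equilibrium inventory, which is the qualitative effect reprocessing efficiency and time constants have on the salt composition.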

  19. An extended version of the SERPENT-2 code to investigate fuel burn-up and core material evolution of the Molten Salt Fast Reactor

    Science.gov (United States)

    Aufiero, M.; Cammi, A.; Fiorina, C.; Leppänen, J.; Luzzi, L.; Ricotti, M. E.

    2013-10-01

    In this work, the Monte Carlo burn-up code SERPENT-2 has been extended and employed to study the material isotopic evolution of the Molten Salt Fast Reactor (MSFR). This promising GEN-IV nuclear reactor concept features peculiar characteristics such as the on-line fuel reprocessing, which prevents the use of commonly available burn-up codes. Besides, the presence of circulating nuclear fuel and radioactive streams from the core to the reprocessing plant requires a precise knowledge of the fuel isotopic composition during the plant operation. The developed extension of SERPENT-2 directly takes into account the effects of on-line fuel reprocessing on burn-up calculations and features a reactivity control algorithm. It is here assessed against a dedicated version of the deterministic ERANOS-based EQL3D procedure (PSI-Switzerland) and adopted to analyze the MSFR fuel salt isotopic evolution. Particular attention is devoted to study the effects of reprocessing time constants and efficiencies on the conversion ratio and the molar concentration of elements relevant for solubility issues (e.g., trivalent actinides and lanthanides). Quantities of interest for fuel handling and safety issues are investigated, including decay heat and activities of hazardous isotopes (neutron and high energy gamma emitters) in the core and in the reprocessing stream. The radiotoxicity generation is also analyzed for the MSFR nominal conditions. The production of helium and the depletion in tungsten content due to nuclear reactions are calculated for the nickel-based alloy selected as reactor structural material of the MSFR. These preliminary evaluations can be helpful in studying the radiation damage of both the primary salt container and the axial reflectors.

  20. MACCS : Multi-Mission Atmospheric Correction and Cloud Screening tool for high-frequency revisit data processing

    Science.gov (United States)

    Petrucci, B.; Huc, M.; Feuvrier, T.; Ruffel, C.; Hagolle, O.; Lonjou, V.; Desjardins, C.

    2015-10-01

    For the production of Level-2A products during Sentinel-2 commissioning in the Technical Expertise Center Sentinel-2 at CNES, CESBIO proposed to adapt the Venμs Level-2 processor, taking advantage of the similarities between the two missions: image acquisition at a high frequency (2 days for Venμs, 5 days with the two Sentinel-2 satellites), high resolution (5 m for Venμs; 10, 20 and 60 m for Sentinel-2), and image acquisition under constant viewing conditions. Thus the Multi-Mission Atmospheric Correction and Cloud Screening (MACCS) tool was born: based on the CNES Orfeo Toolbox library, the Venμs processor, which was already able to process Formosat2 and VENμS data, was adapted to process Sentinel-2 and Landsat5-7 data. Since then, a great effort has been made in reviewing the MACCS software architecture in order to ease the addition of new missions that share the peculiarity of acquiring images at high resolution, high revisit rate and under constant viewing angles, such as Spot4/Take5 and Landsat8. The recursive, multi-temporal algorithm is implemented in a core that is the same for all sensors and that combines several processing steps: estimation of cloud, cloud shadow, water, snow and shadow masks, water vapor content, aerosol optical thickness, and atmospheric correction. This core is accessed via a number of plug-ins in which the specificities of the sensor and of the user project are taken into account: product formats, algorithmic processing chaining and parameters. After a presentation of the MACCS architecture and functionalities, the paper gives an overview of the production facilities integrating MACCS and the associated specificities: interest in this tool has grown worldwide, and MACCS will be used for extensive production within the THEIA land data center and the Agri-S2 project. Finally, the paper zooms in on the use of MACCS during the Sentinel-2 In-Orbit Test phase, showing the first Level-2A products.
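
    The core-plus-plugins architecture described above can be sketched as a registry of per-sensor plug-ins feeding one shared processing core. The sensor names come from the abstract; the API, class, and step names are hypothetical:

```python
class SensorPlugin:
    """Per-sensor specifics: product format, band count, resolution, etc."""
    def __init__(self, name, bands, resolution_m):
        self.name, self.bands, self.resolution_m = name, bands, resolution_m

REGISTRY = {}

def register(plugin):
    REGISTRY[plugin.name] = plugin

def process_l2a(sensor_name, scene):
    """Common core: the same processing steps run for every sensor,
    with sensor specifics looked up from its plug-in."""
    plugin = REGISTRY[sensor_name]
    steps = ["cloud_mask", "water_vapour", "aerosol_optical_thickness",
             "atmospheric_correction"]
    return {"sensor": plugin.name, "resolution_m": plugin.resolution_m,
            "steps": steps, "scene": scene}

register(SensorPlugin("Sentinel-2", bands=13, resolution_m=10))
register(SensorPlugin("Landsat8", bands=11, resolution_m=30))

out = process_l2a("Sentinel-2", scene="T31TCJ")
print(out["resolution_m"], out["steps"][0])
```

    Adding a new mission then only requires registering a new plug-in, leaving the core untouched, which is the maintainability argument the abstract makes.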

  1. GOME-2 total ozone columns from MetOp-A/MetOp-B and assimilation in the MACC system

    Directory of Open Access Journals (Sweden)

    N. Hao

    2014-03-01

    Full Text Available The two Global Ozone Monitoring Experiment-2 (GOME-2) sensors operated in tandem are flying onboard EUMETSAT's MetOp-A and MetOp-B satellites, launched in October 2006 and September 2012, respectively. This paper presents the operational GOME-2/MetOp-A (GOME-2A) and GOME-2/MetOp-B (GOME-2B) total ozone products provided by the EUMETSAT Satellite Application Facility on Ozone and Atmospheric Chemistry Monitoring (O3M-SAF). These products are generated using the latest version of the GOME Data Processor (GDP version 4.7). The enhancements in GDP 4.7, including the application of Brion–Daumont–Malicet ozone absorption cross-sections, are presented here. On a global scale, GOME-2B has the same high accuracy as the corresponding GOME-2A products. There is an excellent agreement between the ozone total columns from the two sensors, with GOME-2B values slightly lower, with a mean difference of only 0.55 ± 0.29%. First global validation results for 6 months of GOME-2B total ozone using ground-based measurements show that on average the GOME-2B total ozone data obtained with GDP 4.7 slightly overestimate Dobson observations by about 2.0 ± 1.0% and Brewer observations by about 1.0 ± 0.8%. It is concluded that the total ozone columns (TOCs) provided by GOME-2A and GOME-2B are consistent and may be used simultaneously without introducing trends or other systematic effects. GOME-2A total ozone data have been used operationally in the Copernicus atmospheric service project MACC-II (Monitoring Atmospheric Composition and Climate – Interim Implementation) near-real-time (NRT) system since October 2013. The magnitude of the bias correction needed for assimilating GOME-2A ozone is reduced (to about −6 DU in the global mean) when the GOME-2 ozone retrieval algorithm is changed to GDP 4.7.

  2. SU-E-T-212: Comparison of TG-43 Dosimetric Parameters of Low and High Energy Brachytherapy Sources Obtained by MCNP Code Versions of 4C, X and 5

    Energy Technology Data Exchange (ETDEWEB)

    Zehtabian, M; Zaker, N; Sina, S [Shiraz University, Shiraz, Fars (Iran, Islamic Republic of); Meigooni, A Soleimani [Comprehensive Cancer Center of Nevada, Las Vegas, Nevada (United States)

    2015-06-15

    Purpose: Different versions of MCNP code are widely used for dosimetry purposes. The purpose of this study is to compare different versions of the MCNP codes in dosimetric evaluation of different brachytherapy sources. Methods: The TG-43 parameters such as dose rate constant, radial dose function, and anisotropy function of different brachytherapy sources, i.e. Pd-103, I-125, Ir-192, and Cs-137 were calculated in water phantom. The results obtained by three versions of Monte Carlo codes (MCNP4C, MCNPX, MCNP5) were compared for low and high energy brachytherapy sources. Then the cross section library of MCNP4C code was changed to ENDF/B-VI release 8 which is used in MCNP5 and MCNPX codes. Finally, the TG-43 parameters obtained using the MCNP4C-revised code, were compared with other codes. Results: The results of these investigations indicate that for high energy sources, the differences in TG-43 parameters between the codes are less than 1% for Ir-192 and less than 0.5% for Cs-137. However for low energy sources like I-125 and Pd-103, large discrepancies are observed in the g(r) values obtained by MCNP4C and the two other codes. The differences between g(r) values calculated using MCNP4C and MCNP5 at the distance of 6cm were found to be about 17% and 28% for I-125 and Pd-103 respectively. The results obtained with MCNP4C-revised and MCNPX were similar. However, the maximum difference between the results obtained with the MCNP5 and MCNP4C-revised codes was 2% at 6cm. Conclusion: The results indicate that using MCNP4C code for dosimetry of low energy brachytherapy sources can cause large errors in the results. Therefore it is recommended not to use this code for low energy sources, unless its cross section library is changed. Since the results obtained with MCNP4C-revised and MCNPX were similar, it is concluded that the difference between MCNP4C and MCNPX is their cross section libraries.

  3. Mg/O2 Battery Based on the Magnesium-Aluminum Chloride Complex (MACC) Electrolyte

    DEFF Research Database (Denmark)

    Vardar, Galin; Smith, Jeffrey G.; Thomson, Travis

    2016-01-01

    Mg/O2 cells employing a MgCl2/AlCl3/DME (MACC/DME) electrolyte are cycled and compared to cells with modified Grignard electrolytes, showing that performance of magnesium/oxygen batteries depends strongly on electrolyte composition. Discharge capacity is far greater for MACC/DME-based cells, while rechargeability in these systems is severely limited. The Mg/O2-MACC/DME discharge product comprises a mixture of Mg(ClO4)2 and MgCl2, with the latter likely formed from slow decomposition of the former. The presence of Cl in these compounds suggests that the electrolyte participates in the cell reaction or reacts readily with the initial electrochemical products. A rate study suggests that O2 diffusion in the electrolyte limits discharge capacities at higher currents. Formation of an insulating product film on the positive electrodes of Mg/O2-MACC/DME cells following deep discharge increases cell impedance.

  4. The MACC reanalysis: an 8-yr data set of atmospheric composition

    Directory of Open Access Journals (Sweden)

    A. Inness

    2012-12-01

    Full Text Available An eight-year long reanalysis of atmospheric composition data covering the period 2003–2010 was constructed as part of the FP7 funded Monitoring Atmospheric Composition and Climate project by assimilating satellite data into a global model and data assimilation system. This reanalysis provides fields of chemically reactive gases, namely carbon monoxide, ozone, nitrogen oxides, and formaldehyde, as well as aerosols and greenhouse gases globally at a resolution of about 80 km for both the troposphere and the stratosphere. This paper describes the assimilation system for the reactive gases and presents validation results for the reactive gases analysis fields to document the dataset and to give a first indication of its quality.

    Tropospheric CO values from the MACC reanalysis are on average 10–20% lower than routine observations from commercial aircraft over airports through most of the troposphere, and have larger negative biases in the boundary layer at urban sites affected by air pollution, possibly due to an underestimation of CO or precursor emissions.

    Stratospheric ozone fields from the MACC reanalysis agree with ozone sondes and ACE-FTS data to within ±10% in most situations. In the troposphere the reanalysis shows biases of −5% to +10% with respect to ozone sondes and aircraft data in the extratropics, but has larger negative biases in the tropics. Area averaged total column ozone agrees with ozone fields from a multi sensor reanalysis data set to within a few percent.

    NO2 fields from the reanalysis show the right seasonality over polluted urban areas of the NH and over tropical biomass burning areas, but underestimate wintertime NO2 maxima over anthropogenic pollution regions and overestimate NO2 in Northern and Southern Africa during the tropical biomass burning seasons.

    Tropospheric HCHO is well simulated in the MACC reanalysis even though no satellite data are

  5. The MACC reanalysis: an 8 yr data set of atmospheric composition

    Directory of Open Access Journals (Sweden)

    A. Inness

    2013-04-01

    Full Text Available An eight-year long reanalysis of atmospheric composition data covering the period 2003–2010 was constructed as part of the FP7-funded Monitoring Atmospheric Composition and Climate project by assimilating satellite data into a global model and data assimilation system. This reanalysis provides fields of chemically reactive gases, namely carbon monoxide, ozone, nitrogen oxides, and formaldehyde, as well as aerosols and greenhouse gases globally at a horizontal resolution of about 80 km for both the troposphere and the stratosphere. This paper describes the assimilation system for the reactive gases and presents validation results for the reactive gas analysis fields to document the data set and to give a first indication of its quality. Tropospheric CO values from the MACC reanalysis are on average 10–20% lower than routine observations from commercial aircrafts over airports through most of the troposphere, and have larger negative biases in the boundary layer at urban sites affected by air pollution, possibly due to an underestimation of CO or precursor emissions. Stratospheric ozone fields from the MACC reanalysis agree with ozonesondes and ACE-FTS data to within ±10% in most seasons and regions. In the troposphere the reanalysis shows biases of −5% to +10% with respect to ozonesondes and aircraft data in the extratropics, but has larger negative biases in the tropics. Area-averaged total column ozone agrees with ozone fields from a multi-sensor reanalysis data set to within a few percent. NO2 fields from the reanalysis show the right seasonality over polluted urban areas of the NH and over tropical biomass burning areas, but underestimate wintertime NO2 maxima over anthropogenic pollution regions and overestimate NO2 in northern and southern Africa during the tropical biomass burning seasons. Tropospheric HCHO is well simulated in the MACC reanalysis even though no satellite data are assimilated. It shows good agreement with

  6. Parents' Assessments of Disability in Their Children Using World Health Organization International Classification of Functioning, Disability and Health, Child and Youth Version Joined Body Functions and Activity Codes Related to Everyday Life

    DEFF Research Database (Denmark)

    Illum, Niels Ove; Gradel, Kim Oren

    2017-01-01

    AIM: To help parents assess disability in their own children using World Health Organization (WHO) International Classification of Functioning, Disability and Health, Child and Youth Version (ICF-CY) code qualifier scoring and to assess the validity and reliability of the data sets obtained. METHOD: Parents of 162 children with spina bifida, spinal muscular atrophy, muscular disorders, cerebral palsy, visual impairment, hearing impairment, mental disability, or disability following brain tumours performed scoring for 26 body functions qualifiers (b codes) and activities and participation qualifiers … of 1.01 and 1.00. The mean corresponding outfit MNSQ was 1.05 and 1.01. The ICF-CY code τ thresholds and category measures were continuous when assessed and reassessed by parents. Participating children had a mean of 56 code scores (range: 26-130) before and a mean of 55.9 scores (range: 25-125) after

  7. SPON2, a newly identified target gene of MACC1, drives colorectal cancer metastasis in mice and is prognostic for colorectal cancer patient survival.

    Science.gov (United States)

    Schmid, F; Wang, Q; Huska, M R; Andrade-Navarro, M A; Lemm, M; Fichtner, I; Dahlmann, M; Kobelt, D; Walther, W; Smith, J; Schlag, P M; Stein, U

    2016-11-17

    MACC1 (metastasis associated in colon cancer 1) is a prognostic biomarker for tumor progression, metastasis and survival of a variety of solid cancers including colorectal cancer (CRC). Here we aimed to identify the MACC1-induced transcriptome and key players mediating the MACC1-induced effects in CRC. We performed microarray analyses using CRC cells ectopically overexpressing MACC1. We identified more than 1300 genes at least twofold differentially expressed, including the gene SPON2 (Spondin 2) as 90-fold upregulated transcriptional target of MACC1. MACC1-dependent SPON2 expression regulation was validated on mRNA and protein levels in MACC1 high (endogenously or ectopically) and low (endogenously or by knockdown) expressing cells. Chromatin immunoprecipitation analysis demonstrated the binding of MACC1 to the gene promoter of SPON2. In cell culture, ectopic SPON2 overexpression induced cell viability, migration, invasion and colony formation in endogenously MACC1 and SPON2 low expressing cells, whereas SPON2 knockdown reduced proliferative, migratory and invasive abilities in CRC cells with high endogenous MACC1 and SPON2 expression. In intrasplenically transplanted NOD/SCID mice, metastasis induction was analyzed with control or SPON2-overexpressing CRC cells. Tumors with SPON2 overexpression induced liver metastasis (vs control animals without any metastases, P=0.0036). In CRC patients, SPON2 expression was determined in primary tumors (stages I-III), and survival time was analyzed by Kaplan-Meier method. CRC patients with high SPON2 expressing primary tumors demonstrated 8 months shorter metastasis-free survival (MFS) compared with patients with low SPON2 levels (P=0.053). Combining high levels of SPON2 and MACC1 improved the identification of high-risk patients with a 20-month shorter MFS vs patients with low biomarker expression. In summary, SPON2 is a transcriptional target of the metastasis gene MACC1. SPON2 induces cell motility in vitro and CRC

  8. Consistent evaluation of GOSAT, SCIAMACHY, CarbonTracker, and MACC through comparisons to TCCON

    Science.gov (United States)

    Kulawik, S. S.; Wunch, D.; O'Dell, C.; Frankenberg, C.; Reuter, M.; Oda, T.; Chevallier, F.; Sherlock, V.; Buchwitz, M.; Osterman, G.; Miller, C.; Wennberg, P.; Griffith, D. W. T.; Morino, I.; Dubey, M.; Deutscher, N. M.; Notholt, J.; Hase, F.; Warneke, T.; Sussmann, R.; Robinson, J.; Strong, K.; Schneider, M.; Wolf, J.

    2015-06-01

    Consistent validation of satellite CO2 estimates is a prerequisite for using multiple satellite CO2 measurements for joint flux inversion, and for establishing an accurate long-term atmospheric CO2 data record. We focus on validating model and satellite observation attributes that impact flux estimates and CO2 assimilation, including accurate error estimates, correlated and random errors, overall biases, biases by season and latitude, the impact of coincidence criteria, validation of seasonal cycle phase and amplitude, yearly growth, and daily variability. We evaluate dry air mole fraction (XCO2) for GOSAT (ACOS b3.5) and SCIAMACHY (BESD v2.00.08) as well as the CarbonTracker (CT2013b) simulated CO2 mole fraction fields and the MACC CO2 inversion system (v13.1) and compare these to TCCON observations (GGG2014). We find standard deviations of 0.9, 0.9, 1.7, and 2.1 ppm versus TCCON for CT2013b, MACC, GOSAT, and SCIAMACHY, respectively, with the single target errors 1.9 and 0.9 times the predicted errors for GOSAT and SCIAMACHY, respectively. When satellite data are averaged and interpreted according to error² = a² + b²/n (where n is the number of observations averaged, a is the systematic (correlated) error, and b is the random (uncorrelated) error), we find that the correlated error term a = 0.6 ppm and the uncorrelated error term b = 1.7 ppm for GOSAT, and a = 1.0 ppm, b = 1.4 ppm for SCIAMACHY regional averages. Biases at individual stations have year-to-year variability of ~ 0.3 ppm, with biases larger than the TCCON predicted bias uncertainty of 0.4 ppm at many stations. Using fitting software, we find that GOSAT underpredicts the seasonal cycle amplitude in the Northern Hemisphere (NH) between 46-53° N. In the Southern Hemisphere (SH), CT2013b underestimates the seasonal cycle amplitude. Biases are calculated for 3-month intervals and indicate the months that contribute to the observed amplitude differences. The seasonal cycle phase indicates
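
    The quoted error model error² = a² + b²/n implies that averaging drives the total error down only to the correlated floor a. A quick numerical check using the GOSAT terms from the abstract (a = 0.6 ppm, b = 1.7 ppm); the helper name is ours:

```python
import math

def averaged_error(a, b, n):
    """Expected error of an n-observation average with a correlated
    (systematic) term a and an uncorrelated (random) term b:
    error^2 = a^2 + b^2 / n."""
    return math.sqrt(a * a + b * b / n)

print(round(averaged_error(0.6, 1.7, 1), 2))    # a single sounding
print(round(averaged_error(0.6, 1.7, 100), 2))  # a 100-sounding average
```

    The random term shrinks as 1/sqrt(n), so large averages approach the 0.6 ppm systematic floor rather than zero.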

  9. BIGFLOW: A numerical code for simulating flow in variably saturated, heterogeneous geologic media. Theory and user's manual, Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Ababou, R. [CEA Centre d`Etudes de Saclay, 91 - Gif-sur-Yvette (France); Bagtzoglou, A.C. [Southwest Research Inst., San Antonio, TX (United States). Center for Nuclear Waste Regulatory Analyses

    1993-06-01

    This report documents BIGFLOW 1.1, a numerical code for simulating flow in variably saturated heterogeneous geologic media. It contains the underlying mathematical and numerical models, test problems, benchmarks, and applications of the BIGFLOW code. The BIGFLOW software package is composed of a simulation code and an interactive data processing code (DATAFLOW). The simulation code solves linear and nonlinear porous media flow equations based on Darcy's law, appropriately generalized to account for 3D, deterministic, or random heterogeneity. A modified Picard scheme is used for linearizing unsaturated flow equations, and preconditioned iterative methods are used for solving the resulting matrix systems. The data processor (DATAFLOW) allows interactive data entry, manipulation, and analysis of 3D datasets. The report contains analyses of computational performance carried out using Cray-2 and Cray-Y/MP8 supercomputers. Benchmark tests include comparisons with other independently developed codes, such as PORFLOW and CMVSFS, and with analytical or semi-analytical solutions.

  10. Evaluation of severe accident risks: Quantification of major input parameters: MACCS (MELCOR Accident Consequence Code System) input

    Energy Technology Data Exchange (ETDEWEB)

    Sprung, J.L.; Jow, H-N (Sandia National Labs., Albuquerque, NM (USA)); Rollstin, J.A. (GRAM, Inc., Albuquerque, NM (USA)); Helton, J.C. (Arizona State Univ., Tempe, AZ (USA))

    1990-12-01

    Estimation of offsite accident consequences is the customary final step in a probabilistic assessment of the risks of severe nuclear reactor accidents. Recently, the Nuclear Regulatory Commission reassessed the risks of severe accidents at five US power reactors (NUREG-1150). Offsite accident consequences for NUREG-1150 source terms were estimated using the MELCOR Accident Consequence Code System (MACCS). Before these calculations were performed, most MACCS input parameters were reviewed, and for each parameter reviewed, a best-estimate value was recommended. This report presents the results of these reviews. Specifically, recommended values and the basis for their selection are presented for MACCS atmospheric and biospheric transport, emergency response, food pathway, and economic input parameters. Dose conversion factors and health effect parameters are not reviewed in this report. 134 refs., 15 figs., 110 tabs.

  11. Relationship between pathological characteristics of prostate cancer and MACC1, c-Met, Apaf-1 as well as Caspase-9 expression in tumor tissue

    Institute of Scientific and Technical Information of China (English)

    Hang-Yu Ai; Xue-De Qiu

    2016-01-01

    Objective: To study MACC1, c-Met, Apaf-1 and Caspase-9 expression in prostate cancer tissue and its relationship with the pathological characteristics of the tumor. Methods: Prostate cancer and benign prostatic hyperplasia (BPH) patients who received surgical treatment in our hospital from May 2015 to March 2016 were selected as the research subjects. Prostate cancer tissue and BPH tissue were collected during surgery to determine MACC1, c-Met, Apaf-1 and Caspase-9 expression, and serum specimens were collected to determine miR-let7i, miR-32, miR-128, miR-196a and miR-218 expression levels. Results: The mRNA content of MACC1 and c-Met in prostate cancer tissue was significantly higher, and that of Apaf-1 and Caspase-9 significantly lower, than in BPH tissue; the higher the Gleason grading and the Whitmore-Prout staging, the higher the MACC1 and c-Met mRNA content and the lower the Apaf-1 and Caspase-9 mRNA content in prostate cancer tissue. Serum miR-32, miR-128 and miR-196a expression levels in prostate cancer patients were significantly higher than in BPH patients and negatively correlated with the mRNA content of Apaf-1 and Caspase-9, while miR-let7i and miR-218 expression levels were significantly lower than in BPH patients and negatively correlated with MACC1 and c-Met. Conclusion: High MACC1 and c-Met expression and low Caspase-9 and Apaf-1 expression are related to the occurrence and progression of prostate cancer, and the expression of MACC1, c-Met, Apaf-1 and Caspase-9 in prostate cancer tissue is regulated by miRNAs.

  12. Rn3D: A finite element code for simulating gas flow and radon transport in variably saturated, nonisothermal porous media. User's manual, Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Holford, D.J.

    1994-01-01

    This document is a user's manual for the Rn3D finite element code. Rn3D was developed to simulate gas flow and radon transport in variably saturated, nonisothermal porous media. The Rn3D model is applicable to a wide range of problems involving radon transport in soil because it can simulate either steady-state or transient flow and transport in one, two, or three dimensions (including radially symmetric two-dimensional problems). The porous materials may be heterogeneous and anisotropic. This manual describes all pertinent mathematics related to the governing, boundary, and constitutive equations of the model, as well as the development of the finite element equations used in the code. Instructions are given for constructing Rn3D input files and executing the code, as well as a description of all output files generated by the code. Five verification problems are given that test various aspects of code operation, complete with example input files, FORTRAN programs for the respective analytical solutions, and plots of model results. An example simulation is presented to illustrate the type of problem Rn3D is designed to solve. Finally, instructions are given on how to convert Rn3D to simulate systems other than radon, air, and water.

  13. EGS code system: computer programs for the Monte Carlo simulation of electromagnetic cascade showers. Version 3. [EGS, PEGS, TESTSR, in MORTRAN

    Energy Technology Data Exchange (ETDEWEB)

    Ford, R.L.; Nelson, W.R.

    1978-06-01

    A code to simulate almost any conceivable electron-photon transport problem is described. The report begins with a lengthy historical introduction and a description of the shower generation process. Then the detailed physics of the shower processes and the methods used to simulate them are presented. Ideas of sampling theory, transport techniques, particle interactions in general, and programming details are discussed. Next, EGS calculations are compared with various experiments and with other Monte Carlo results. The remainder of the report consists of user manuals for the EGS, PEGS, and TESTSR codes; options, input specifications, and typical output are included. 38 figures, 12 tables. (RWR)
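As a flavor of the sampling theory such a report covers, here is a deliberately tiny Monte Carlo sketch (not EGS itself, and not its physics): free paths are drawn from the exponential attenuation law, and the uncollided transmission through a slab is compared against the analytic result. The attenuation coefficient and slab thickness are arbitrary illustrative values.

```python
import math
import random

def transmitted_fraction(mu, thickness, n=200_000, seed=1):
    """Analogue Monte Carlo estimate of uncollided photon transmission through
    a homogeneous slab: sample free paths s = -ln(u)/mu and count photons whose
    first interaction site lies beyond the slab. Converges to exp(-mu*L)."""
    rng = random.Random(seed)
    passed = sum(
        1 for _ in range(n)
        if -math.log(1.0 - rng.random()) / mu > thickness  # 1-u keeps log's arg > 0
    )
    return passed / n
```

For mu = 1.0 cm^-1 and a 2 cm slab the estimate should fall within statistical error of exp(-2) ≈ 0.135.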

  14. Chaos Many-Body Engine v03: A new version of code C# for chaos analysis of relativistic many-body systems with reactions

    Science.gov (United States)

    Grossu, I. V.; Besliu, C.; Jipa, Al.; Felea, D.; Esanu, T.; Stan, E.; Bordeianu, C. C.

    2013-04-01

    In this paper we present a new version of the Chaos Many-Body Engine C# application (Grossu et al. 2012 [1]). In order to benefit from the latest technological advantages, we migrated the application from .Net Framework 2.0 to .Net Framework 4.0. New tools were also implemented. Trying to estimate the dependence of particle interactions on initial conditions, we considered a new distance, which takes into account only the structural differences between two systems. We used this distance for implementing the “Structural Lyapunov” function. We also propose a new precision test based on temporally reversed simulations. New version program summary: Program title: Chaos Many-Body Engine v03; Catalogue identifier: AEGH_v3_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGH_v3_0.html; Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 214429; No. of bytes in distributed program, including test data, etc.: 9512380; Distribution format: tar.gz; Programming language: Visual C# .Net 2010; Computer: PC; Operating system: .Net Framework 4.0 running on MS Windows; RAM: 128 MB; Classification: 24.60.Lz, 05.45.a; Catalogue identifier of previous version: AEGH_v2_0; Journal reference of previous version: Computer Physics Communications 183 (2012) 1055-1059; Does the new version supersede the previous version?: Yes; Nature of problem: Chaos analysis of three-dimensional, relativistic many-body systems with reactions; Solution method: Second-order Runge-Kutta algorithm, with implementation of the temporally reversed simulation precision test and the “Structural Lyapunov” function. In order to benefit from the advantages involved in the latest technologies (e.g. LINQ Queries [2]), Chaos Many-Body Engine was migrated from .Net Framework 2.0 to .Net Framework 4.0. In addition to existing energy conservation
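The two numerical ingredients named in the summary, a second-order Runge-Kutta integrator and a precision test based on temporally reversed simulation, can be sketched together. This is a minimal Python illustration (the original engine is C#), using a harmonic oscillator as a stand-in for the many-body equations of motion:

```python
def rk2_step(f, t, y, h):
    """One explicit midpoint (second-order Runge-Kutta) step for y' = f(t, y)."""
    k1 = [h * fi for fi in f(t, y)]
    y_mid = [yi + 0.5 * ki for yi, ki in zip(y, k1)]
    k2 = [h * fi for fi in f(t + 0.5 * h, y_mid)]
    return [yi + ki for yi, ki in zip(y, k2)]

def reversal_error(f, y0, h=1e-3, steps=1000):
    """Temporal-reversal precision test in the spirit of the paper: integrate
    forward, then integrate back with step -h; the distance from the initial
    state measures the accumulated numerical error."""
    y, t = list(y0), 0.0
    for _ in range(steps):
        y = rk2_step(f, t, y, h)
        t += h
    for _ in range(steps):
        y = rk2_step(f, t, y, -h)
        t -= h
    return max(abs(a - b) for a, b in zip(y, y0))

# Harmonic oscillator y'' = -y as a first-order system (illustrative test body)
osc = lambda t, y: [y[1], -y[0]]
```

For this linear system the forward and backward midpoint maps commute, so the round-trip error is tiny (O(h^4) per step pair); for a chaotic many-body system it grows rapidly, which is what makes the test informative.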

  15. Analysis of system thermal hydraulic responses for passive safety injection experiment at ROSA-IV Large Scale Test Facility. Using JAERI modified version of RELAP5/MOD2 code

    Energy Technology Data Exchange (ETDEWEB)

    Asaka, Hideaki; Yonomoto, Taisuke; Kukita, Yutaka (Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment)

    1994-12-01

    An experiment was conducted at the ROSA-IV Large Scale Test Facility (LSTF) on the performance of a gravity-driven emergency core coolant (ECC) injection system attached to a pressurized water reactor (PWR). Such a gravity-driven injection system, though not used in current-generation PWRs, is proposed for future reactor designs. The experiment was performed to identify key phenomena peculiar to the operation of a gravity injection system and to provide a database for code assessment against such phenomena. The simulated injection system consisted of a tank which was initially filled with cold water at the same pressure as the primary system. The tank was connected at its top and bottom, respectively, to the cold leg and the vessel downcomer. The injection into the downcomer was driven primarily by the static head difference between the cold water in the tank and the hot water in the pressure balance line (PBL) connecting the cold leg to the tank top. The injection flow became oscillatory after the flow through the PBL became two-phase. The experiment was analyzed post-test using a JAERI modified version of the RELAP5/MOD2 code. The code calculation simulated reasonably well the system responses observed in the experiment, and suggested that the oscillations in the injection flow were caused by oscillatory liquid holdup in the PBL connecting the cold leg to the tank top. (author).
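The driving mechanism described above, injection sustained by the density difference between cold tank water and hot PBL water acting over an elevation span, can be sketched with back-of-the-envelope hydraulics. All parameters here (densities, height, flow area, discharge coefficient) are illustrative values, not LSTF geometry, and the orifice relation is a textbook simplification rather than the RELAP5 model:

```python
import math

def gravity_injection_flow(rho_cold, rho_hot, height, area, cd=0.6, g=9.81):
    """Volumetric injection flow (m^3/s) from a static-head-driven line:
    driving pressure = g * H * (rho_cold - rho_hot), converted to flow with a
    simple orifice relation Q = Cd * A * sqrt(2 * dp / rho)."""
    dp = g * height * (rho_cold - rho_hot)  # Pa, positive while tank water is denser
    if dp <= 0.0:
        return 0.0  # no density head, no injection
    return cd * area * math.sqrt(2.0 * dp / rho_cold)
```

When the PBL heats up to the tank temperature the density difference, and with it the injection flow, vanishes, which is consistent with the head-driven behavior the experiment probes.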

  16. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  17. Solving the heat transfer in the cold rain of a counter-flow cooling tower. N3S code - cooling tower release; Traitement thermique de la zone de pluie des aerorefrigerants a contre-courant. Code N3S - version aerorefrigerants

    Energy Technology Data Exchange (ETDEWEB)

    Grange, J.L.

    1996-09-01

    A simplified model for heat and mass transfer in the rain zone at the bottom of a counter-flow cooling tower had to be implemented in the N3S code, cooling tower release. It is built from an older code, ZOPLU. The air velocity field is calculated by N3S. The air and water temperature fields are solved by a Runge-Kutta method on a mesh in an adequate number of vertical planes. Heat exchange and drag correlations are given, and all the necessary parameters are specified. All the subroutines are described; they are taken from ZOPLU and modified in order to adapt their abilities to the N3S requirements. (author). 6 refs., 3 figs., 3 tabs., 3 appends.

  18. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  19. COMPARATIVE RESEARCH REGARDING THE DESIGN AND THE DEVELOPMENT OF THE G CODE PROGRAMMING LANGUAGE BETWEEN CASE R290-02IS AND THE CLASSIC VERSION, CONSTITUENTS OF THE GARLAND PRODUCT C3G 1200, 1400, 1600

    Directory of Open Access Journals (Sweden)

    Valeria Victoria IOVANOV

    2014-05-01

    A CNC machine makes use of mathematics and various coordinate systems to understand and process the information it receives, determining what to move, where, and how fast. The most important function of any CNC machine is precise and rigorous control of movement. All CNC equipment has two or more directions of motion, called axes. CNC machines are driven by computer-controlled servo motors and are generally guided by a stored program; the type of movement (fast, linear, circular), the moving axes, the distances of movement and the speed of movement (processing) are programmable on most CNC machines. This paper proposes the design and implementation of a G code programming language for the reference point Case R290-02IS, the short version, compared to the classical part of the garland product C3G 1200, 1400, 1600, reference points used in all fields that use conveyors.
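The movement types listed above map directly onto G-code words (G00 rapid, G01 linear at a programmed feed). A tiny illustrative Python emitter for two of them follows; the coordinate and feed values are made up, and this is not the paper's actual Case R290-02IS program:

```python
def rapid_move(x, y):
    """Emit a G00 rapid-positioning block (no feed word: moves at machine maximum)."""
    return f"G00 X{x:.3f} Y{y:.3f}"

def linear_move(x, y, feed):
    """Emit a G01 linear-interpolation block at a programmed feed rate."""
    return f"G01 X{x:.3f} Y{y:.3f} F{feed:.0f}"
```

A two-line fragment such as `rapid_move(0, 0)` followed by `linear_move(10, 5, 300)` positions the tool quickly, then cuts along a straight segment at 300 units/min.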

  20. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  1. The diurnal variation in stratospheric ozone from the MACC reanalysis, the ERA-Interim reanalysis, WACCM and Earth observation data: characteristics and intercomparison

    Directory of Open Access Journals (Sweden)

    A. Schanz

    2014-12-01

    In this study we compare the diurnal variation in stratospheric ozone derived from free-running simulations of the Whole Atmosphere Community Climate Model (WACCM) and from reanalysis data of the atmospheric service MACC (Monitoring Atmospheric Composition and Climate), which both use a similar stratospheric chemistry module. We find good agreement between WACCM and the MACC reanalysis for the diurnal ozone variation in the high-latitude summer stratosphere, which is driven by photochemistry. In addition, we consult the ozone data product of the ERA-Interim reanalysis. The ERA-Interim ozone system, with its long-term ozone parametrization, cannot capture the diurnal variations in the upper stratosphere that are due to photochemistry; its good representation of dynamics, however, reflects well the dynamically induced ozone variations in the lower stratosphere. For the high-latitude winter stratosphere we describe a novel feature of diurnal variation in ozone, where changes of up to 46.6% (3.3 ppmv) occur in monthly mean data. For this effect, good agreement between the ERA-Interim reanalysis and the MACC reanalysis suggests quite similar diurnal advection processes of ozone. The free-running WACCM model seriously underestimates the role of diurnal advection processes at the polar vortex at the two tested resolutions. The intercomparison of the MACC reanalysis and the ERA-Interim reanalysis demonstrates how global reanalyses can benefit from a chemical representation held by a chemical transport model. The MACC reanalysis provides an unprecedented description of the dynamics and photochemistry of the diurnal variation of stratospheric ozone, which is of high interest for ozone trend analysis and research on atmospheric tides.
    We confirm the diurnal variation in ozone at 5 hPa by observations of the Superconducting Submillimeter-Wave Limb-Emission Sounder (SMILES) experiment and selected sites of the Network for the Detection of Atmospheric Composition Change (NDACC).

  2. Lidar measurements during a haze episode in Penang, Malaysia and validation of the ECMWF MACC-II model

    Science.gov (United States)

    Khor, Wei Ying; Lolli, Simone; Hee, Wan Shen; Lim, Hwee San; Jafri, M. Z. Mat; Benedetti, Angela; Jones, Luke

    2015-04-01

    Haze is a phenomenon which occurs when a great amount of tiny particulates is suspended in the atmosphere. During March 2014, a prolonged haze event occurred in Penang, Malaysia. The haze conditions were measured and monitored using a ground-based Lidar system. Using these measurements, we evaluated the performance of the ECMWF MACC-II model. Lidar measurements showed that there was a thick aerosol layer confined in the planetary boundary layer (PBL), with extinction coefficients exceeding 0.3 km-1. The model, however, underestimated the aerosol conditions over Penang. Backward trajectory analysis was performed to identify aerosol sources and transport. It is speculated that the aerosols came from the north-east, influenced by the north-east monsoon wind, and that some originated from the central eastern coast of Sumatra along the Straits of Malacca.

  3. DOSFAC2 user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Young, M.L.; Chanin, D.

    1997-12-01

    This document describes the DOSFAC2 code, which is used for generating dose-to-source conversion factors for the MACCS2 code. DOSFAC2 is a revised and updated version of the DOSFAC code that was distributed with version 1.5.11 of the MACCS code. Included are (1) an overview and background of DOSFAC2, (2) a summary of two new functional capabilities, and (3) a user's guide. 20 refs., 5 tabs.

  4. GENII Version 2 Users’ Guide

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.

    2004-03-08

    The GENII Version 2 computer code was developed for the Environmental Protection Agency (EPA) at Pacific Northwest National Laboratory (PNNL) to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) and the radiological risk estimating procedures of Federal Guidance Report 13 into updated versions of existing environmental pathway analysis models. The resulting environmental dosimetry computer codes are compiled in the GENII Environmental Dosimetry System. The GENII system was developed to provide a state-of-the-art, technically peer-reviewed, documented set of programs for calculating radiation dose and risk from radionuclides released to the environment. The codes were designed with the flexibility to accommodate input parameters for a wide variety of generic sites. Operation of a new version of the codes, GENII Version 2, is described in this report. Two versions of the GENII Version 2 code system are available, a full-featured version and a version specifically designed for demonstrating compliance with the dose limits specified in 40 CFR 61.93(a), the National Emission Standards for Hazardous Air Pollutants (NESHAPS) for radionuclides. The only differences lie in the limitation of the capabilities of the user to change specific parameters in the NESHAPS version. This report describes the data entry, accomplished via interactive, menu-driven user interfaces. Default exposure and consumption parameters are provided for both the average (population) and maximum individual; however, these may be modified by the user. Source term information may be entered as radionuclide release quantities for transport scenarios, or as basic radionuclide concentrations in environmental media (air, water, soil). For input of basic or derived concentrations, decay of parent radionuclides and ingrowth of radioactive decay products prior to the start of the exposure scenario may be considered. A single code run can
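The decay-and-ingrowth handling mentioned above (decay of parent radionuclides and ingrowth of decay products prior to the exposure scenario) is classically described by the Bateman equations. A minimal two-member sketch follows; this is a textbook solution for illustration, not GENII's actual implementation, and the numbers in the usage below are arbitrary:

```python
import math

def parent_daughter(n1_0, lam1, lam2, t):
    """Two-member Bateman solution: atoms of a parent (decay constant lam1)
    and of its daughter (decay constant lam2) at time t, assuming no initial
    daughter inventory and lam1 != lam2."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2
```

A quick sanity check: with a stable daughter (lam2 = 0) every decayed parent atom must appear in the daughter inventory, so the total is conserved.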

  5. Effect of down-regulating MACC1 by shRNA on invasion and migration of gastric carcinoma cells

    Institute of Scientific and Technical Information of China (English)

    袁园; 朱正秋

    2017-01-01

    Objective: To explore, by in vitro experiment, the regulatory effect of MACC1 gene silencing in hepatic stellate cells (HSC) on the expression of α-SMA, MMP-2, MMP-9 and HGF, and its effects on the migration and invasion ability of co-cultured gastric carcinoma cells. Methods: A eukaryotic expression interference plasmid targeting the MACC1 gene (MACC1-shRNA) and a non-specific irrelevant interference plasmid (shNC) were constructed and transiently transfected into HSC with Lipofectamine 2000, giving a low-expression group (shRNA) and a negative control group (NC); non-transfected HSC served as a blank control group (Blank-Ctrl). The three groups of cells were cultured routinely, and the expression levels of MACC1, α-SMA, HGF, MMP-2 and MMP-9 in human hepatic stellate cells were detected by Western blot and RT-PCR. The three groups of cells were co-cultured with MGC803 gastric carcinoma cells in vitro, and the migration and invasion capabilities were measured by transwell assay. Results: Western blot results showed that the expression of MACC1, α-SMA, HGF, MMP-2 and MMP-9 protein in HSC was significantly lower after transfection of MACC1-shRNA than in the negative control and blank control groups (P<0.05). Conclusion: Silencing MACC1 in hepatic stellate cells can inhibit the expression of MMP-2, MMP-9 and HGF, and decrease the migration and invasion of co-cultured gastric cancer cells.

  6. Statistical comparison of properties of simulated and observed cumulus clouds in the vicinity of Houston during the Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS)

    OpenAIRE

    Jiang, Hongli; Feingold, Graham; Jonsson, Haflidi H.; Lu, Miao-Ling; Chuang, Patrick Y.; Flagan, Richard C.; Seinfeld, John H.

    2008-01-01

    Journal of Geophysical Research, Vol. 113, D13205 The article of record as published may be located at http://dx.doi.org/10.1029/2007JD009304. We present statistical comparisons of properties of clouds generated by Large Eddy Simulations (LES) with aircraft observations of nonprecipitating, warm cumulus clouds made in the vicinity of Houston, TX during the Gulf of Mexico Atmospheric Composition and Climate Study (GoMACCS), carried out in the summer of 2006. Aircraft data wer...

  7. Comparison of in situ and columnar aerosol spectral measurements during TexAQS-GoMACCS 2006: testing parameterizations for estimating aerosol fine mode properties

    OpenAIRE

    D. B. Atkinson; P. Massoli; N. T. O'Neill; P. K. Quinn; S. D. Brooks; Lefer, B.

    2009-01-01

    During the 2006 Texas Air Quality Study and Gulf of Mexico Atmospheric Composition and Climate Study (TexAQS-GoMACCS 2006), the optical, chemical and microphysical properties of atmospheric aerosols were measured on multiple mobile platforms and at ground based stations. In situ measurements of the aerosol light extinction coefficient (σep) were performed by two multi-wavelength cavity ring-down (CRD) instruments, one located on board the NO...

  8. A 3-D evaluation of the MACC reanalysis dust product over the greater European region using CALIOP/CALIPSO satellite observations

    Science.gov (United States)

    Georgoulias, Aristeidis K.; Tsikerdekis, Athanasios; Amiridis, Vassilis; Marinou, Eleni; Benedetti, Angela; Zanis, Prodromos; Kourtidis, Konstantinos

    2016-04-01

    Significant amounts of dust are transported on an annual basis over the Mediterranean Basin and continental Europe from Northern Africa (the Sahara Desert) and the Middle East (the Arabian Peninsula), as well as from other local sources. Dust affects a number of processes in the atmosphere, modulating weather and climate, and also has an impact on human health and the economy. Therefore, the ability to simulate adequately the amount and optical properties of dust is essential. This work focuses on the evaluation of the MACC reanalysis dust product over the regions mentioned above. The evaluation procedure is based on pure-dust satellite retrievals from CALIOP/CALIPSO that cover the period 2007-2012. The CALIOP/CALIPSO data utilized here come from an optimized retrieval scheme that was originally developed within the framework of the LIVAS (Lidar Climatology of Vertical Aerosol Structure for Space-Based LIDAR Simulation Studies) project. CALIOP/CALIPSO dust extinction coefficients and dust optical depth patterns at 532 nm are used for the validation of MACC natural-aerosol extinction coefficients and dust optical depth patterns at 550 nm. Overall, it is shown in this work that space-based lidars may play a major role in the improvement of the MACC aerosol product. This research has been financed under the FP7 Programme MarcoPolo (Grant Number 606953, Theme SPA.2013.3.2-01).
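The comparison above pairs vertically resolved extinction coefficients with columnar optical depth; the two are linked by a vertical integral of extinction over altitude. A sketch with simple trapezoidal quadrature (the profile values in the check below are illustrative, and this is not the LIVAS retrieval code):

```python
def optical_depth(z_km, ext_km):
    """Column optical depth from a vertical extinction-coefficient profile:
    trapezoidal integration of ext (km^-1) over altitude z (km).
    Both sequences must be the same length, z ascending."""
    return sum(
        0.5 * (ext_km[i] + ext_km[i + 1]) * (z_km[i + 1] - z_km[i])
        for i in range(len(z_km) - 1)
    )
```

For a uniform 0.1 km^-1 layer between 0 and 4 km the column optical depth is simply 0.1 × 4 = 0.4, which the quadrature reproduces.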

  9. Code Flows : Visualizing Structural Evolution of Source Code

    NARCIS (Netherlands)

    Telea, Alexandru; Auber, David

    2008-01-01

    Understanding detailed changes done to source code is of great importance in software maintenance. We present Code Flows, a method to visualize the evolution of source code geared to the understanding of fine and mid-level scale changes across several file versions. We enhance an existing visual met

  11. Tokamak Systems Code

    Energy Technology Data Exchange (ETDEWEB)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code, as long as the input to or output from the module remains unchanged.

  12. Distributed Version Control and Library Metadata

    Directory of Open Access Journals (Sweden)

    Galen M. Charlton

    2008-06-01

    Distributed version control systems (DVCSs) are effective tools for managing source code and other artifacts produced by software projects with multiple contributors. This article describes DVCSs and compares them with traditional centralized version control systems, then describes extending the DVCS model to improve the exchange of library metadata.
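What makes the DVCS model exchangeable in the way the article exploits is content addressing: a commit's identity is a hash over its content and ancestry, so two repositories that create the same history independently agree on the ids. A minimal Python sketch of that idea follows; it is illustrative only, not git's actual object format:

```python
import hashlib

def commit(store, parents, message, tree):
    """Minimal content-addressed commit: the id is a hash over the tree,
    parent ids and message, so identical histories hash identically and can
    be exchanged between repositories by id."""
    payload = repr((sorted(parents), message, tree)).encode()
    cid = hashlib.sha1(payload).hexdigest()
    store[cid] = {"parents": parents, "message": message, "tree": tree}
    return cid
```

Two independent "clones" that record the same change end up with the same commit id, while any change to the message, tree, or ancestry produces a new id, which is exactly what lets DVCSs merge divergent histories safely.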

  13. UPGRADES TO Monteburns, VERSION 3.0

    Energy Technology Data Exchange (ETDEWEB)

    Galloway, Jack D [Los Alamos National Laboratory; Trellue, Holly R [Los Alamos National Laboratory

    2012-06-22

    Monteburns VERSION 3.0 is an upgrade of the existing Monteburns code available through RSICC. The new version includes modern programming style, increased parallel computing, more accurate capture gamma calculations and an automated input generator. This capability was demonstrated through a small PWR core simulation.

  14. Version Description and Installation Guide Kernel Version 3.0

    Science.gov (United States)

    1989-12-01

    Approved for publication. For the Commander: Karl H. Shingler, SEI Joint Program Office. This work is sponsored by the U.S. Department of Defense.

  15. Evaluation of the MACC operational forecast system – potential and challenges of global near-real-time modelling with respect to reactive gases in the troposphere

    Directory of Open Access Journals (Sweden)

    A. Wagner

    2015-03-01

    Monitoring Atmospheric Composition and Climate (MACC/MACC-II) currently represents the European Union's Copernicus Atmosphere Monitoring Service (CAMS, http://www.copernicus.eu), which will become fully operational in the course of 2015. The global near-real-time MACC model production run for aerosol and reactive gases provides daily analyses and 5 day forecasts of atmospheric composition fields. It is the only assimilation system worldwide that operationally produces global analyses and forecasts of reactive gases and aerosol fields. We have investigated the ability of the MACC analysis system to simulate tropospheric concentrations of reactive gases (CO, O3 and NO2) covering the period between 2009 and 2012. A validation was performed based on CO and O3 surface observations from the Global Atmosphere Watch (GAW) network, O3 surface observations from the European Monitoring and Evaluation Programme (EMEP), and furthermore on NO2 tropospheric columns derived from the satellite sensors SCIAMACHY and GOME-2 and CO total columns derived from the satellite sensor MOPITT. The MACC system proved capable of reproducing reactive gas concentrations with consistent quality, however with a seasonally dependent bias compared to surface and satellite observations: for northern hemispheric surface O3 mixing ratios, positive biases appear during the warm seasons and negative biases during the cold parts of the year, with monthly Modified Normalised Mean Biases (MNMBs) ranging between −30 and 30% at the surface. Model biases are likely to result from difficulties in the simulation of vertical mixing at night and deficiencies in the model's dry deposition parameterization. Observed tropospheric columns of NO2 and CO could be reproduced correctly during the warm seasons, but are mostly underestimated by the model during the cold seasons, when anthropogenic emissions are at their highest, especially over the US, Europe and Asia. Monthly MNMBs of the satellite data
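The MNMB score quoted above is attractive for this kind of validation because it treats over- and under-prediction symmetrically and is bounded to ±200%. A sketch assuming the standard definition used in MACC/CAMS validation reports, MNMB = (2/N) Σ (f_i − o_i)/(f_i + o_i), expressed in percent:

```python
def mnmb_percent(model, obs):
    """Modified Normalised Mean Bias in percent: 200/N * sum((f-o)/(f+o)).
    Symmetric in model/observation and bounded to [-200, 200]; assumes
    positive-valued pairs (e.g. mixing ratios)."""
    n = len(model)
    return 200.0 / n * sum((f - o) / (f + o) for f, o in zip(model, obs))
```

A model that doubles every observation scores +66.7% regardless of the absolute concentration level, which is the scale-independence that makes MNMB useful across species with very different mixing ratios.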

  16. Inter-comparison between HERMESv2.0 and TNO-MACC-II emission data using the CALIOPE air quality system (Spain)

    Science.gov (United States)

    Guevara, Marc; Pay, María Teresa; Martínez, Francesc; Soret, Albert; Denier van der Gon, Hugo; Baldasano, José M.

    2014-12-01

    This work examines and compares the performance of two emission datasets in modelling air quality concentrations over Spain: (i) the High-Elective Resolution Modelling Emissions System (HERMESv2.0) and (ii) the TNO-MACC-II emission inventory. For this purpose, the air quality system CALIOPE-AQFS (WRF-ARW/CMAQ/BSC-DREAM8b) was run over Spain for February and June 2009 using the two emission datasets (4 km × 4 km and 1 h). Nitrogen dioxide (NO2), sulphur dioxide (SO2), ozone (O3) and particulate matter (PM10) modelled concentrations were compared with measurements at different types of air quality stations (i.e. rural background, urban, suburban, industrial). A preliminary emission comparison showed significant discrepancies between the two datasets, highlighting an overestimation of industrial emissions in urban areas when using TNO-MACC-II. However, simulations showed similar performance of both emission datasets in terms of air quality. Modelled NO2 concentrations were similar between the two datasets at the background stations, although TNO-MACC-II presented smaller underestimations due to differences in industrial, other mobile source and residential emissions. At Madrid urban stations NO2 was significantly underestimated in both cases, despite the fact that HERMESv2.0 estimates traffic emissions using more local information and a more detailed methodology. This NO2 underestimation problem was not found in Barcelona, due to the influence of international shipping emissions along the coastline. An inadequate characterization of some of TNO-MACC-II's point sources led to high SO2 biases at industrial stations, especially in northwest Spain where large facilities are grouped. In general, surface O3 was overestimated regardless of the emission dataset used, reflecting the known tendency of CMAQ to overestimate low ozone at night. On the other hand, modelled PM10 concentrations were less underestimated in urban areas when applying HERMESv2.0 due to the inclusion of road dust

  17. HEFF---A user's manual and guide for the HEFF code for thermal-mechanical analysis using the boundary-element method; Version 4.1: Yucca Mountain Site Characterization Project

    Energy Technology Data Exchange (ETDEWEB)

    St. John, C.M.; Sanjeevan, K. [Agapito (J.F.T.) and Associates, Inc., Grand Junction, CO (United States)

    1991-12-01

    The HEFF Code combines a simple boundary-element method of stress analysis with closed-form solutions for constant or exponentially decaying heat sources in an infinite elastic body to obtain an approximate method for the analysis of underground excavations in a rock mass with heat generation. This manual describes the theoretical basis for the code, the code structure, model preparation, and the steps taken to assure that the code correctly performs its intended functions. The material contained within the report addresses the Software Quality Assurance Requirements for the Yucca Mountain Site Characterization Project. 13 refs., 26 figs., 14 tabs.

  18. Classical Holographic Codes

    CERN Document Server

    Brehm, Enrico M

    2016-01-01

    In this work, we introduce classical holographic codes. These can be understood as concatenated probabilistic codes and can be represented as networks uniformly covering hyperbolic space. In particular, classical holographic codes can be interpreted as maps from bulk degrees of freedom to boundary degrees of freedom. Interestingly, they are shown to exhibit features similar to those expected from the AdS/CFT correspondence. Among these are a version of the Ryu-Takayanagi formula and intriguing properties regarding bulk reconstruction and boundary representations of bulk operations. We discuss the relation of our findings with expectations from AdS/CFT and, in particular, with recent results from quantum error correction.

  19. The effects of springtime mid-latitude storms on trace gas composition determined from the MACC reanalysis

    Directory of Open Access Journals (Sweden)

    K. E. Knowland

    2014-10-01

    Full Text Available The relationship between springtime air pollution transport of ozone (O3) and carbon monoxide (CO) and mid-latitude cyclones is explored for the first time using the Monitoring Atmospheric Composition and Climate (MACC) reanalysis for the period 2003–2012. In this study, the most intense spring storms (95th percentile) are selected for two regions, the North Pacific (NP) and the North Atlantic (NA). These storms (~60 storms over each region) often track over the major emission sources of East Asia and eastern North America. By compositing the storms, the distributions of O3 and CO within a "typical" intense storm are examined. We compare the storm-centered composite to background composites of "average conditions" created by sampling the reanalysis data of the previous year at the storm locations. Mid-latitude storms are found to redistribute concentrations of O3 and CO horizontally and vertically throughout the storm. This is clearly shown to occur through two main mechanisms: (1) vertical lifting of CO-rich and O3-poor air isentropically from near the surface to the mid- to upper troposphere in the region of the warm conveyor belt; and (2) descent of O3-rich and CO-poor air isentropically in the vicinity of the dry intrusion, from the stratosphere toward the mid-troposphere. This can be seen in the composite storm's life cycle as the storm intensifies, with area-averaged O3 (CO) increasing (decreasing) between 200 and 500 hPa. At the time of maximum intensity, area-averaged O3 around the storm center at 300 hPa is enhanced by 50 and 36% for the NP and NA regions respectively, compared to the background, and by 11 and 7.6% at 500 hPa. In contrast, area-averaged CO at 300 hPa decreases by 12% for NP and 5.5% for NA, and at 500 hPa area-averaged CO decreases by 2.4% for NP while there is little change over the NA region at 500 hPa. From the mid-troposphere, O3-rich air is clearly seen to be transported toward the surface but the downward transport of CO

  20. Development of the unified version of COBRA/RELAP5

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, J. J.; Ha, K. S.; Chung, B. D.; Lee, W. J.; Sim, S. K. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-12-31

    The COBRA/RELAP5 code, an integrated version of the COBRA-TF and RELAP5/MOD3 codes, has been developed for realistic simulations of complicated, multi-dimensional, two-phase, thermal-hydraulic system transients in light water reactors. Recently, KAERI developed a unified version of the COBRA/RELAP5 code, which can run in serial mode on both workstations and personal computers. This paper provides a brief overview of the code integration scheme, the recent code modifications, the developmental assessments, and the future development plan. 13 refs., 5 figs., 2 tabs. (Author)

  1. Interaction Between Translational Norms and Translators' Habitus: A Case Study of Two Chinese Versions of The Da Vinci Code

    Institute of Scientific and Technical Information of China (English)

    吴琼军

    2012-01-01

    Based on a descriptive study of the two Chinese versions of The Da Vinci Code - the Mainland version and the Taiwan version - this thesis verifies whether the translation strategies follow or violate translational norms, showing the variability and creativity in the translators' strategies. It can be concluded that the translation process is a highly dynamic one. The research into the interaction between translational norms and translators' habitus aims to reproduce this dynamic process so as to offer suggestions for future translation practice.

  2. Evaluation of the ability of the MACC-II Reanalysis to reproduce the distribution of O3 and CO in the UTLS as measured by MOZAIC-IAGOS

    Science.gov (United States)

    Gaudel, A.; Clark, H.; Thouret, V.; Eskes, H.; Huijnen, V.; Nedelec, P.

    2013-12-01

    Tropospheric ozone is probably one of the most important trace gases in the atmosphere. It plays a major role in the chemistry of the troposphere by exerting a strong influence on the concentrations of oxidants such as the hydroxyl radical (OH), and it is the third most important greenhouse gas after carbon dioxide and methane. Its radiative impact is of particular importance in the Upper Troposphere / Lower Stratosphere (UTLS), one of the most critical regions regarding climate change. Carbon monoxide (CO) is one of the major ozone precursors (originating from all types of combustion) in the troposphere. In the UTLS, it also has implications for stratospheric chemistry and indirect radiative forcing effects (as a chemical precursor of CO2 and O3). Assessing the global distribution (and possibly trends) of O3 and CO in this region of the atmosphere, combining high-resolution in situ data and the most appropriate global 3D model to further quantify the different sources and their origins, is then of particular interest. This is one of the objectives of the MOZAIC-IAGOS (http://www.iagos.fr) and MACC-II (http://www.gmes-atmosphere.eu) European programs. The aircraft of the MOZAIC program have collected O3 and CO data simultaneously and regularly all over the world since the end of 2001. Most of the data are recorded in northern mid-latitudes, in the UTLS region (as commercial aircraft cruise altitude is between 9 and 12 km). MACC-II aims at providing information services covering air quality, climate forcing, stratospheric ozone, UV radiation and solar-energy resources, using near-real-time analysis and forecasting products, and reanalysis. The validation reports of the MACC models are regularly published (http://www.gmes-atmosphere.eu/services/gac/nrt/ and http://www.gmes-atmosphere.eu/services/gac/reanalysis/). We will present and discuss the performance of the MACC reanalysis, including the ECMWF Integrated Forecasting System (IFS) coupled to the CTM MOZART with 4DVAR data assimilation, to

  3. Current status of the ability of the GEMS/MACC models to reproduce the tropospheric CO vertical distribution as measured by MOZAIC

    Directory of Open Access Journals (Sweden)

    N. Elguindi

    2010-04-01

    Full Text Available Vertical profiles of CO taken from the MOZAIC aircraft database are used to present (1) a global analysis of CO seasonal averages and interannual variability for the years 2002–2007 and (2) a global validation of CO estimates produced by the MACC models for 2004, including an assessment of their ability to transport pollutants originating from the Alaskan/Canadian wildfires. Seasonal averages and interannual variability from several MOZAIC sites representing different regions of the world show that CO concentrations are highest and most variable during the winter season. The inter-regional variability is significant, with concentrations increasing eastward from Europe to Japan. The impact of the intense boreal fires, particularly in Russia, during the fall of 2002 on Northern Hemisphere CO concentrations throughout the troposphere is well represented by the MOZAIC data.

    A global validation of the GEMS/MACC GRG models, which include three stand-alone CTMs (MOZART, MOCAGE and TM5) and the coupled ECMWF Integrated Forecasting System (IFS)/MOZART model with and without MOPITT CO data assimilation, shows that the models have a tendency to underestimate CO. The models perform best in Europe and the US, where biases range from 0 to −25% in the free troposphere and from 0 to −50% in the surface and boundary layers (BL). The biases are largest in the winter and during the daytime, when emissions are highest, indicating that current inventories are too low. Data assimilation is shown to reduce biases by up to 25% in some regions. The models are not able to reproduce well the CO plumes originating from the Alaskan/Canadian wildfires at downwind locations in the eastern US and Europe, not even with assimilation. Sensitivity tests reveal that this is mainly due to deficiencies in the fire emissions inventory and injection height.

  4. Current status of the ability of the GEMS/MACC models to reproduce the tropospheric CO vertical distribution as measured by MOZAIC

    Directory of Open Access Journals (Sweden)

    N. Elguindi

    2010-10-01

    Full Text Available Vertical profiles of CO taken from the MOZAIC aircraft database are used to globally evaluate the performance of the GEMS/MACC models, including the ECMWF Integrated Forecasting System (IFS) model coupled to the CTM MOZART-3 with 4DVAR data assimilation, for the year 2004. This study provides a unique opportunity to compare the performance of three offline CTMs (MOZART-3, MOCAGE and TM5) driven by the same meteorology, as well as one coupled atmosphere/CTM model run with data assimilation, enabling us to assess the potential gain brought by the combination of online transport and 4DVAR chemical satellite data assimilation.

    First we present a global analysis of observed CO seasonal averages and interannual variability for the years 2002–2007. Results show that despite the intense boreal forest fires that occurred during the summer in Alaska and Canada, the year 2004 had comparatively low tropospheric CO concentrations. Next we present a validation of CO estimates produced by the MACC models for 2004, including an assessment of their ability to transport pollutants originating from the Alaskan/Canadian wildfires. In general, all the models tend to underestimate CO. The coupled model and the CTMs perform best in Europe and the US, where biases range from 0 to −25% in the free troposphere and from 0 to −50% in the surface and boundary layers (BL). Using the 4DVAR technique to assimilate MOPITT V4 CO significantly reduces biases, by up to 50% in most regions. However, none of the models, even the IFS-MOZART-3 coupled model with assimilation, is able to reproduce well the CO plumes originating from the Alaskan/Canadian wildfires at downwind locations in the eastern US and Europe. Sensitivity tests reveal that deficiencies in the fire emissions inventory and injection height play a role.

  5. Decoding Generalized Concatenated Codes Using Interleaved Reed-Solomon Codes

    CERN Document Server

    Senger, Christian; Bossert, Martin; Zyablov, Victor

    2008-01-01

    Generalized Concatenated codes are a code construction consisting of a number of outer codes whose code symbols are protected by an inner code. As outer codes, we assume the most frequently used Reed-Solomon codes; as inner code, we assume some linear block code which can be decoded up to half its minimum distance. Decoding up to half the minimum distance of Generalized Concatenated codes is classically achieved by the Blokh-Zyablov-Dumer algorithm, which decodes iteratively by first using the inner decoder to get an estimate of the outer code words and then using an outer error/erasure decoder with a varying number of erasures determined by a set of pre-calculated thresholds. In this paper, a modified version of the Blokh-Zyablov-Dumer algorithm is proposed, which exploits the fact that a number of outer Reed-Solomon codes with average minimum distance d can be grouped into one single Interleaved Reed-Solomon code, which can be decoded beyond d/2. This allows skipping a number of decoding iterations on the one...
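For context, the gain alluded to in this record can be stated as follows (hedged: these radii are standard results from the interleaved Reed-Solomon literature, not given in the abstract itself). Bounded minimum distance decoding of a single code of minimum distance d corrects up to half the minimum distance, whereas collaborative decoding of ℓ interleaved Reed-Solomon codes succeeds, for most error patterns affecting the same positions in all ℓ words, up to roughly:

```latex
t_{\mathrm{BMD}} \le \left\lfloor \frac{d-1}{2} \right\rfloor
\qquad\text{vs.}\qquad
t_{\mathrm{IRS}} \lesssim \frac{\ell}{\ell+1}\,(d-1)
```

As ℓ grows, the interleaved radius approaches the full d − 1, which is what makes grouping the outer codes attractive in the modified algorithm.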

  6. BOOK REVIEW: Numerical Recipes in C++: The Art of Scientific Computing (2nd edn)1 Numerical Recipes Example Book (C++) (2nd edn)2 Numerical Recipes Multi-Language Code CD ROM with LINUX or UNIX Single-Screen License Revised Version3Numerical Recipes in C++: The Art of Scientific Computing (2nd edn) Numerical Recipes Example Book (C++) (2nd edn) Numerical Recipes Multi-Language Code CD ROM with LINUX or UNIX Single-Screen License Revised Version

    Science.gov (United States)

    Press, William H.; Teukolsky, Saul A.; Vetterling, William T.; Flannery, Brian P.

    2003-05-01

    The two Numerical Recipes books are marvellous. The principal book, The Art of Scientific Computing, contains program listings for almost every conceivable requirement, and it also contains a well written discussion of the algorithms and the numerical methods involved. The Example Book provides a complete driving program, with helpful notes, for nearly all the routines in the principal book. The first edition of Numerical Recipes: The Art of Scientific Computing was published in 1986 in two versions, one with programs in Fortran, the other with programs in Pascal. There were subsequent versions with programs in BASIC and in C. The second, enlarged edition was published in 1992, again in two versions, one with programs in Fortran (NR(F)), the other with programs in C (NR(C)). In 1996 the authors produced Numerical Recipes in Fortran 90: The Art of Parallel Scientific Computing as a supplement, called Volume 2, with the original (Fortran) version referred to as Volume 1. Numerical Recipes in C++ (NR(C++)) is another version of the 1992 edition. The numerical recipes are also available on a CD ROM: if you want to use any of the recipes, I would strongly advise you to buy the CD ROM. The CD ROM contains the programs in all the languages. When the first edition was published I bought it, and have also bought copies of the other editions as they have appeared. Anyone involved in scientific computing ought to have a copy of at least one version of Numerical Recipes, and there also ought to be copies in every library. If you already have NR(F), should you buy the NR(C++) and, if not, which version should you buy? In the preface to Volume 2 of NR(F), the authors say 'C and C++ programmers have not been far from our minds as we have written this volume, and we think that you will find that time spent in absorbing its principal lessons will be amply repaid in the future as C and C++ eventually develop standard parallel extensions'. In the preface and introduction to NR

  7. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover the "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
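The UD property that coding partitions relax can be tested, for a finite code, with the classical Sardinas-Patterson algorithm. A hedged sketch (the algorithm is standard; it is not part of this abstract):

```python
def is_uniquely_decipherable(code):
    """Sardinas-Patterson test: True iff no string has two distinct
    factorizations over the given finite set of nonempty codewords."""
    code = set(code)

    def dangling(u, v):
        # If u is a proper prefix of v (or vice versa), yield the leftover suffix.
        if u != v:
            if v.startswith(u):
                yield v[len(u):]
            if u.startswith(v):
                yield u[len(v):]

    # S_1: suffixes obtained from pairs of distinct codewords.
    current = {s for u in code for v in code for s in dangling(u, v)}
    seen = set()
    while current:
        if current & code:   # some S_i contains a codeword -> not UD
            return False
        seen |= current
        current = {s for w in current for c in code
                   for s in dangling(w, c)} - seen
    return True

# {"0", "01", "11"} is a suffix code, hence uniquely decipherable;
# {"a", "ab", "ba"} is not: "aba" = a|ba = ab|a.
print(is_uniquely_decipherable({"0", "01", "11"}),
      is_uniquely_decipherable({"a", "ab", "ba"}))
```

The test terminates because every generated suffix is a suffix of some codeword, so the set of reachable states is finite.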

  8. Premixing of corium into water during a Fuel-Coolant Interaction. The models used in the 3 field version of the MC3D code and two examples of validation on Billeau and FARO experiments

    Energy Technology Data Exchange (ETDEWEB)

    Berthoud, G.; Crecy, F. de; Duplat, F.; Meignen, R.; Valette, M. [CEA/Grenoble, DRN/DTP, 17 Avenue des Martyrs, 38054 Grenoble Cedex 9 (France)

    1998-01-01

    This paper presents the premixing application of the multiphase 3D computer code MC3D. This application is devoted to the premixing phase of a Fuel-Coolant Interaction (FCI), when large amounts of molten corium flow into water and interact with it. A description of the new features of the model is given (a more complete description of the full model is given in an annex). Calculations of the Billeau experiments (cold or hot spheres dropped into water) and of a FARO test (corium dropped into 5 MPa saturated water) are presented. (author)

  9. A new parallel TreeSPH code

    OpenAIRE

    Lia, Cesario; Carraro, Giovanni; Chiosi, Cesare; Voli, Marco

    1998-01-01

    In this report we describe a parallel implementation of a TreeSPH code realized using the SHMEM libraries on the Cray T3E supercomputer at CINECA. We show the results of a 3D test to check the code's performance against its scalar version. Finally, we compare the load balancing and scalability of the code with PTreeSPH (Davé et al. 1997), the only other parallel TreeSPH code present in the literature.

  10. brulilo, Version 0.x

    Energy Technology Data Exchange (ETDEWEB)

    2015-04-16

    effectively remove some of the stiffness and allow for efficient explicit integration techniques to be used. The original intent of brulilo was to implement these stiffness-alleviating techniques with explicit integrators and compare their performance to traditional implicit integrations of the full stiff system. This is still underway, as the code is very much in an alpha-release state. Furthermore, explicit integrators are often much easier to parallelize than their implicit counterparts. brulilo will implement parallelization of these techniques, leveraging both mpi4py, the Python implementation of MPI, and highly parallelized versions targeting GPUs with PyOpenCL and/or PyCUDA.

  11. Transformation of Mexican lime with an intron-hairpin construct expressing untranslatable versions of the genes coding for the three silencing suppressors of Citrus tristeza virus confers complete resistance to the virus.

    Science.gov (United States)

    Soler, Nuria; Plomer, Montserrat; Fagoaga, Carmen; Moreno, Pedro; Navarro, Luis; Flores, Ricardo; Peña, Leandro

    2012-06-01

    Citrus tristeza virus (CTV), the causal agent of the most devastating viral disease of citrus, has evolved three silencing suppressor proteins acting at the intracellular (p23 and p20) and/or intercellular level (p20 and p25) to overcome host antiviral defence. Previously, we showed that Mexican lime transformed with an intron-hairpin construct including part of the gene p23 and the adjacent 3' untranslated region displays partial resistance to CTV, with a fraction of the propagations from some transgenic lines remaining uninfected. Here, we transformed Mexican lime with an intron-hairpin vector carrying full-length, untranslatable versions of the genes p25, p20 and p23 from CTV strain T36 to silence the expression of these critical genes in CTV-infected cells. Three transgenic lines presented complete resistance to viral infection, with all their propagations remaining symptomless and virus-free after graft inoculation with CTV-T36, either in the nontransgenic rootstock or in the transgenic scion. Accumulation of transgene-derived siRNAs was necessary but not sufficient for CTV resistance. Inoculation with a divergent CTV strain partially broke the resistance, showing the role of sequence identity in the underlying mechanism. Our results are a step forward toward developing transgenic resistance to CTV and also show that simultaneously targeting the three viral silencing suppressors by RNA interference (RNAi) appears critical for this purpose, although the involvement of concurrent RNAi mechanisms cannot be excluded.

  12. TOUGH2-GRS version 1. User manual

    Energy Technology Data Exchange (ETDEWEB)

    Navarro, Martin; Eckel, Jens

    2016-07-15

    TOUGH2 is a code for the simulation of multi-phase flow processes in porous media that has been developed by the Lawrence Berkeley National Laboratory, California, USA. Since 1991, GRS has been using the code for process analyses and safety assessments for deep geological repositories and has extended the code by several processes that are relevant for repository systems. The TOUGH2 source code that has been developed further by GRS is referred to as TOUGH2-GRS. The present report presents code version 1.1.g, which was developed in project UM13 A 03400 sponsored by the German Federal Ministry for the Environment, Nature Conservation, Building and Nuclear Safety (BMUB).

  13. VH-1: Multidimensional ideal compressible hydrodynamics code

    Science.gov (United States)

    Hawley, John; Blondin, John; Lindahl, Greg; Lufkin, Eric

    2012-04-01

    VH-1 is a multidimensional ideal compressible hydrodynamics code written in FORTRAN for use on any computing platform, from desktop workstations to supercomputers. It uses a Lagrangian remap version of the Piecewise Parabolic Method developed by Paul Woodward and Phil Colella in their 1984 paper. VH-1 comes in a variety of versions, from a simple one-dimensional serial variant to a multi-dimensional version scalable to thousands of processors.

  14. Modified Mean-Pyramid Coding Scheme

    Science.gov (United States)

    Cheung, Kar-Ming; Romer, Richard

    1996-01-01

    The modified mean-pyramid coding scheme requires transmission of slightly fewer data: the data-expansion factor is reduced from 1/3 to 1/12. Such schemes provide progressive transmission of image data in a sequence of frames, such that a coarse version of the image is reconstructed after receipt of the first frame and an increasingly refined version is reconstructed after receipt of each subsequent frame.
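The record only summarizes the scheme. As a hedged sketch of the underlying (unmodified) mean-pyramid idea, assuming a square image with power-of-two side, 2x2 block means, and residual refinement frames (all names illustrative):

```python
import numpy as np

def build_mean_pyramid(img):
    """levels[0] is the full-resolution image; each further level is the
    2x2 block mean of the previous one, down to a single pixel."""
    levels = [np.asarray(img, dtype=float)]
    while levels[-1].shape[0] > 1:
        a = levels[-1]
        levels.append((a[0::2, 0::2] + a[0::2, 1::2] +
                       a[1::2, 0::2] + a[1::2, 1::2]) / 4.0)
    return levels

def progressive_frames(levels):
    """Coarse-to-fine frames: the 1x1 top mean first, then, per level,
    the residual against the upsampled coarser level."""
    frames = [levels[-1]]
    for fine, coarse in zip(levels[-2::-1], levels[::-1]):
        frames.append(fine - np.kron(coarse, np.ones((2, 2))))
    return frames

def reconstruct(frames):
    """Receiver side: each frame refines the upsampled running image."""
    img = frames[0]
    for residual in frames[1:]:
        img = np.kron(img, np.ones((2, 2))) + residual
    return img
```

Note that the four residuals in each 2x2 block sum to zero, so one of the four is redundant; exploiting redundancies of this kind is plausibly how the modified scheme reduces the data-expansion factor, though the record does not give the details.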

  16. Construction of TH code development and validation environment

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyungjun; Kim, Hee-Kyung; Bae, Kyoo-Hwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, each component of the code development and validation system, i.e. IVS and Mercurial, is introduced, and Redmine, the integrated platform for IVS and Mercurial, is explained. An integrated TH code validation system (IVS) and a code development and management environment have been constructed. Code validation is achieved by comparing results with corresponding experiments. The development of a thermal-hydraulic (TH) system code for nuclear reactors requires much time and effort, as does its validation and verification (V and V). Previously, the TASS/SMR-S code (hereafter TASS) for SMART was developed by KAERI through a V and V process. During code development, version control of the source code is of great importance. Also, during the V and V process, a way is required to reduce the repeated labor- and time-consuming work of running the code before releasing a new version of the TH code. Therefore, an integrated platform for the TH code development and validation environment was constructed: Redmine, a project management and issue tracking system, was selected as the platform; Mercurial (hg) is used for source version control; and IVS (Integrated Validation System) for TASS was constructed as a prototype for automated V and V. IVS is useful before releasing a new code version: the code developer can easily validate code results using IVS, and even during code development IVS can be used to validate code modifications. Using Redmine and Mercurial, users and developers can use IVS results more effectively.
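The paper describes IVS as an automated gate that compares code results with corresponding experiments before a release. A hedged sketch of such a pass/fail comparison (the function name, case names, and the 5% tolerance are illustrative, not from the paper):

```python
def validate_release(code_results, experiments, rel_tol=0.05):
    """Gate for releasing a new code version: every validation case must
    agree with its experimental value to within a relative tolerance."""
    report = {}
    for case, measured in experiments.items():
        err = abs(code_results[case] - measured) / abs(measured)
        report[case] = (err <= rel_tol, err)
    return all(ok for ok, _ in report.values()), report

ok, report = validate_release({"test_A": 10.2, "test_B": 4.9},
                              {"test_A": 10.0, "test_B": 5.0})
print(ok)  # both cases within 5% -> True
```

In a setup like the one described, a hook in the version-control system would run such a gate automatically on each candidate revision, so regressions are caught before a release rather than by hand.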

  17. Holographic codes

    CERN Document Server

    Latorre, Jose I

    2015-01-01

    There exists a remarkable four-qutrit state that carries absolute maximal entanglement in all its partitions. Employing this state, we construct a tensor network that delivers a holographic many-body state, the H-code, where the physical properties of the boundary determine those of the bulk. This H-code is made of an even superposition of states whose relative Hamming distances are exponentially large with the size of the boundary. This property makes H-codes natural states for a quantum memory. H-codes exist on tori of definite sizes and are classified into three different sectors characterized by the sum of their qutrits on cycles wrapped through the boundaries of the system. We construct a parent Hamiltonian for the H-code, which is highly non-local, and finally we compute the topological entanglement entropy of the H-code.

  18. Sharing code

    OpenAIRE

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks a focus on researchers. In comparison, OSF offers a one-stop solution for researchers, but a lot of its functionality is still under development. I conclude by listing alternative, lesser-known tools for code and materials sharing.

  19. Atlas C++ Coding Standard Specification

    CERN Document Server

    Albrand, S; Barberis, D; Bosman, M; Jones, B; Stavrianakou, M; Arnault, C; Candlin, D; Candlin, R; Franck, E; Hansl-Kozanecka, Traudl; Malon, D; Qian, S; Quarrie, D; Schaffer, R D

    2001-01-01

    This document defines the ATLAS C++ coding standard, which should be adhered to when writing C++ code. It has been adapted from the original "PST Coding Standard" document (http://pst.cern.ch/HandBookWorkBook/Handbook/Programming/programming.html), CERN-UCO/1999/207. The "ATLAS standard" comprises modifications, further justification and examples for some of the rules in the original PST document. All changes were discussed in the ATLAS Offline Software Quality Control Group, and feedback from the collaboration was taken into account in the "current" version.

  20. Versioning Complex Data

    Energy Technology Data Exchange (ETDEWEB)

    Macduff, Matt C.; Lee, Benno; Beus, Sherman J.

    2014-06-29

    Using the history of ARM data files, we designed and demonstrated a feasible data versioning paradigm. Assigning versions to sets of files that are modified, under some special assumptions and domain-specific rules, was effective in the case of ARM data, which comprises more than 5000 datastreams and 500 TB of data.
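The report does not spell out its versioning rules. As a hedged illustration of one simple way to assign a deterministic version to a set of files (content hashing; the function name and the 12-character truncation are illustrative assumptions, not ARM's actual scheme):

```python
import hashlib

def version_of_fileset(contents):
    """Deterministic version id for a set of (name, bytes) pairs:
    hash of the sorted per-file hashes, so modifying any member,
    in any order of listing, yields a new version id."""
    per_file = sorted(
        hashlib.sha256(name.encode() + b"\0" + data).hexdigest()
        for name, data in contents
    )
    return hashlib.sha256("".join(per_file).encode()).hexdigest()[:12]

v1 = version_of_fileset([("a.nc", b"data1"), ("b.nc", b"data2")])
v2 = version_of_fileset([("a.nc", b"data1-modified"), ("b.nc", b"data2")])
print(v1 != v2)  # True: changing one file changes the set's version
```

A scheme like this makes the version a pure function of content, which is convenient when, as in the ARM case, the same files can be regenerated or reordered by processing pipelines.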

  1. A Preliminary Comparison Between SuperDARN Flow Vectors and Equivalent Ionospheric Currents From the GIMA, Greenland, MACCS, THEMIS, CARISMA, and CANMOS Ground Magnetometer Arrays

    Science.gov (United States)

    Kivelson, M. G.; Amm, O.; Weygand, J. M.; Bristow, W. A.; Angelopoulos, V.; Beheshti, B.; Steinmetz, E. S.; Engebretson, M. J.; Murr, D.; Viljanen, A.; Pulkkinen, A.; Gleisner, H.; Mann, I.; Russell, C.

    2009-12-01

    With data from the GIMA, Greenland, MACCS, CARISMA, CANMOS, and THEMIS ground magnetometer arrays, we obtain maps of equivalent ionospheric currents (EIC) over North America using the state-of-the-art technique based on SECS (spherical elementary current systems) developed by Amm and Viljanen [1999]. The EIC maps can be calculated with the same time resolution as the magnetometer data, which is 10 s. The results thus show in detail the dynamic evolution of the currents over the whole THEMIS ground network. The EIC maps can further be compared and quantitatively combined with near-simultaneous images of the THEMIS all-sky imager mosaics, SuperDARN radar data, and THEMIS spacecraft data. Using 5 full days of SuperDARN flow vector data obtained during the northern hemisphere winter, we find that the flows, in general, are antiparallel to the EICs. The largest differences from the antiparallel direction appear to occur during moderate to quiet geomagnetic conditions in the midnight sector. These differences are most likely the result of non-uniform conductivity in the ionosphere that influences the EIC direction.

  2. Small marine craft emission factors observed from the NOAA R/V Ronald H. Brown during TexAQS/GoMACCS 2006

    Science.gov (United States)

    Lerner, B. M.; Lack, D.; Murphy, P. C.; Williams, E. J.

    2007-12-01

    During the TexAQS/GoMACCS 2006 field campaign, the NOAA R/V Ronald H. Brown often encountered small marine recreational craft and small fishing vessels while sailing close to the Texas coast, especially in Galveston Bay. Measurement of a suite of trace gases at high time resolution (1 Hz) allowed us to calculate emission factors (EFs), relative to carbon dioxide, for nitrogen oxides (NOx), sulfur dioxide (SO2) and carbon monoxide (CO) for distinct exhaust plumes from these sources. Photoacoustic aerosol absorption spectroscopy (PAS) measurements made concurrently allowed for the first quantification of mass EFs for light-absorbing particles from fishing craft. As previously observed along the New England coast, gasoline-powered recreational vessels showed significantly higher NOx/CO2 and lower CO/CO2 EFs than current emissions inventories predict, although in agreement with the most recent published literature of laboratory studies. These findings imply lower volatile organic compound emissions from these vessels, although this was not directly measured.
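
The plume-based emission-factor calculation described above reduces to integrating the excess trace-gas mixing ratio across a plume crossing and dividing by the excess CO2. The sketch below uses synthetic numbers and a simplified molar ratio; the function name, background subtraction, and units are illustrative assumptions, not the campaign's processing code:

```python
import numpy as np

def molar_emission_ratio(trace_ppb, co2_ppm, bg_trace, bg_co2):
    """Plume-integrated molar emission ratio dX/dCO2 (mol/mol).

    Both series are background-subtracted, then summed across the
    plume crossing; CO2 is converted from ppm to ppb before the ratio.
    """
    d_trace = np.asarray(trace_ppb, float) - bg_trace
    d_co2 = (np.asarray(co2_ppm, float) - bg_co2) * 1000.0  # ppm -> ppb
    return d_trace.sum() / d_co2.sum()

# Synthetic 1 Hz plume crossing: NOx and CO2 enhanced above background.
nox = [10, 30, 60, 30, 10]       # ppb; background 10 ppb
co2 = [380, 382, 384, 382, 380]  # ppm; background 380 ppm
er = molar_emission_ratio(nox, co2, bg_trace=10, bg_co2=380)
```

For these synthetic values the excess NOx sums to 90 ppb against 8000 ppb of excess CO2, giving a molar ratio of 0.01125.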

  3. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  4. Polar Codes

    Science.gov (United States)

    2014-12-01

Polar codes were introduced by E. Arikan in [1]. This report, prepared under the authority of C. A. Wilgenbusch, Head, ISR Division, describes the results of the project "More reliable wireless" and covers forward error correction (FEC) with polar codes over BSC, AWGN, and QPSK Gaussian channels.

  5. 75 FR 54131 - Updating State Residential Building Energy Efficiency Codes

    Science.gov (United States)

    2010-09-03

...The Department of Energy (DOE or Department) has preliminarily determined that the 2009 version of the International Code Council (ICC) International Energy Conservation Code (IECC) would achieve greater energy efficiency in low-rise residential buildings than the 2006 IECC. Also, DOE has preliminarily determined that the 2006 version of the IECC would achieve greater energy efficiency than …

  6. Confocal coded aperture imaging

    Energy Technology Data Exchange (ETDEWEB)

    Tobin, Jr., Kenneth William (Harriman, TN); Thomas, Jr., Clarence E. (Knoxville, TN)

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.
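
The correlation step in the final reconstruction can be illustrated compactly: each point source casts a shifted shadow of the aperture pattern, and cross-correlating the recorded shadow with a stored copy of the aperture recovers the source position as a correlation peak. The random binary mask and circular shifts below are assumptions for illustration, not the patented aperture design:

```python
import numpy as np

rng = np.random.default_rng(0)
aperture = rng.integers(0, 2, (16, 16)).astype(float)  # binary coded aperture

# A single point source casts a shifted copy of the aperture on the detector.
shift = (3, 5)
shadow = np.roll(aperture, shift, axis=(0, 1))

# Decode: circular cross-correlation of the shadow with the stored aperture,
# computed in Fourier space; the peak lands at the source's shift.
corr = np.real(np.fft.ifft2(np.fft.fft2(shadow) * np.conj(np.fft.fft2(aperture))))
peak = tuple(int(i) for i in np.unravel_index(np.argmax(corr), corr.shape))
```

The autocorrelation of a random binary mask is sharply peaked at zero lag, which is why the correlation maximum recovers the (3, 5) shift.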

  7. GCFM Users Guide Revision for Model Version 5.0

    Energy Technology Data Exchange (ETDEWEB)

    Keimig, Mark A.; Blake, Coleman

    1981-08-10

    This paper documents alterations made to the MITRE/DOE Geothermal Cash Flow Model (GCFM) in the period of September 1980 through September 1981. Version 4.0 of GCFM was installed on the computer at the DOE San Francisco Operations Office in August 1980. This Version has also been distributed to about a dozen geothermal industry firms, for examination and potential use. During late 1980 and 1981, a few errors detected in the Version 4.0 code were corrected, resulting in Version 4.1. If you are currently using GCFM Version 4.0, it is suggested that you make the changes to your code that are described in Section 2.0. User's manual changes listed in Section 3.0 and Section 4.0 should then also be made.

  8. Standard Preanalytical Coding for Biospecimens : Review and Implementation of the Sample PREanalytical Code (SPREC)

    NARCIS (Netherlands)

    Lehmann, Sabine; Guadagni, Fiorella; Moore, Helen; Ashton, Garry; Barnes, Michael; Benson, Erica; Clements, Judith; Koppandi, Iren; Coppola, Domenico; Demiroglu, Sara Yasemin; DeSouza, Yvonne; De Wilde, Annemieke; Duker, Jacko; Eliason, James; Glazer, Barbara; Harding, Keith; Jeon, Jae Pil; Kessler, Joseph; Kokkat, Theresa; Nanni, Umberto; Shea, Kathi; Skubitz, Amy; Somiari, Stella; Tybring, Gunnel; Gunter, Elaine; Betsou, Fotini

    2012-01-01

    The first version of the Standard PREanalytical Code (SPREC) was developed in 2009 by the International Society for Biological and Environmental Repositories (ISBER) Biospecimen Science Working Group to facilitate documentation and communication of the most important preanalytical quality parameters

  9. A Fortran 90 code for magnetohydrodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Walker, D.W.

    1992-03-01

    This report describes progress in developing a Fortran 90 version of the KITE code for studying plasma instabilities in Tokamaks. In particular, the evaluation of convolution terms appearing in the numerical solution is discussed, and timing results are presented for runs performed on an 8k processor Connection Machine (CM-2). Estimates of the performance on a full-size 64k CM-2 are given, and range between 100 and 200 Mflops. The advantages of having a Fortran 90 version of the KITE code are stressed, and the future use of such a code on the newly announced CM5 and Paragon computers, from Thinking Machines Corporation and Intel, is considered.
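
The convolution terms mentioned above are the classic candidate for Fourier-space evaluation: a direct convolution sum costs O(N²), while transforming, multiplying, and inverse-transforming costs O(N log N). A minimal sketch (in Python rather than Fortran 90, purely for illustration) comparing the two routes on a periodic sequence:

```python
import numpy as np

def convolve_direct(f, g):
    """Circular convolution by the direct O(N^2) sum."""
    n = len(f)
    return np.array([sum(f[m] * g[(k - m) % n] for m in range(n))
                     for k in range(n)])

def convolve_fft(f, g):
    """Circular convolution via the convolution theorem, O(N log N)."""
    return np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(g)))

f = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([0.5, 0.0, -0.5, 1.0])
match = bool(np.allclose(convolve_direct(f, g), convolve_fft(f, g)))
```

Both routes produce the same periodic convolution; only the operation count differs, which is what makes the spectral evaluation attractive for codes like KITE.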

  10. TRAC code development status and plans

    Energy Technology Data Exchange (ETDEWEB)

    Spore, J.W.; Liles, D.R.; Nelson, R.A.; Dotson, P.J.; Steinke, R.G.; Knight, T.D.; Henninger, R.J.; Martinez, V.; Jenks, R.P.; Cappiello, M.W.

    1986-01-01

    This report summarizes the characteristics and current status of the TRAC-PF1/MOD1 computer code. Recent error corrections and user-convenience features are described, and several user enhancements are identified. Current plans for the release of the TRAC-PF1/MOD2 computer code and some preliminary MOD2 results are presented. This new version of the TRAC code implements stability-enhancing two-step numerics into the 3-D vessel, using partial vectorization to obtain a code that has run 400% faster than the MOD1 code.

  11. GalPot: Galaxy potential code

    Science.gov (United States)

    McMillan, Paul J.

    2016-11-01

    GalPot finds the gravitational potential associated with axisymmetric density profiles. The package includes code that performs transformations between commonly used coordinate systems for both positions and velocities (the class OmniCoords), and that integrates orbits in the potentials. GalPot is a stand-alone version of Walter Dehnen's GalaxyPotential C++ code taken from the falcON code in the NEMO Stellar Dynamics Toolbox (ascl:1010.051).
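
The orbit-integration facility described above can be sketched with a symplectic leapfrog step in a simple axisymmetric potential. The flattened logarithmic potential below is a stand-in chosen for illustration, not one of GalPot's actual multi-component disc models:

```python
import numpy as np

# Phi(R, z) = 0.5 * V0^2 * ln(R^2 + (z/Q)^2): a flattened logarithmic
# potential standing in for an axisymmetric galaxy model.
V0, Q = 1.0, 0.9

def accel(pos):
    x, y, z = pos
    s = x * x + y * y + (z / Q) ** 2
    return -V0 ** 2 / s * np.array([x, y, z / Q ** 2])

def energy(pos, vel):
    x, y, z = pos
    return 0.5 * vel @ vel + 0.5 * V0 ** 2 * np.log(x * x + y * y + (z / Q) ** 2)

pos = np.array([1.0, 0.0, 0.1])
vel = np.array([0.0, 0.8, 0.0])
e0 = energy(pos, vel)

dt = 1e-3
for _ in range(20000):          # kick-drift-kick leapfrog
    vel += 0.5 * dt * accel(pos)
    pos += dt * vel
    vel += 0.5 * dt * accel(pos)

drift = abs(energy(pos, vel) - e0)
```

The leapfrog's bounded energy error is the usual check that an orbit integration in a static potential is behaving.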

  12. The limits of mathematics tutorial version

    CERN Document Server

    Chaitin, G J

    1995-01-01

    The latest in a series of reports presenting the information-theoretic incompleteness theorems of algorithmic information theory via algorithms written in specially designed versions of LISP. Previously in this LISP code only one-character identifiers were allowed, and arithmetic had to be programmed out. Now identifiers can be many characters long, and arithmetic with arbitrarily large unsigned decimal integers is built in. This and many other changes in the software have made this material much easier to understand and to use.

  13. A line-based visualization of code evolution

    OpenAIRE

Voinea, SL Lucian; Telea, AC Alexandru; van Wijk, JJ Jarke

    2005-01-01

    The source code of software systems changes many times during the system lifecycle. We study how developers can get insight in these changes in order to understand the project context and the product artifacts. For this we propose new techniques for code evolution representation and visualization interaction from a version-centric perspective. Central to our approach is a line-based display of the changing code, where each file version is shown as a column and the horizontal axis shows time. ...

  14. Version pressure feedback mechanisms for speculative versioning caches

    Science.gov (United States)

Eichenberger, Alexandre E.; Gara, Alan; O'Brien, Kathryn M.; Ohmacht, Martin; Zhuang, Xiaotong

    2013-03-12

    Mechanisms are provided for controlling version pressure on a speculative versioning cache. Raw version pressure data is collected based on one or more threads accessing cache lines of the speculative versioning cache. One or more statistical measures of version pressure are generated based on the collected raw version pressure data. A determination is made as to whether one or more modifications to an operation of a data processing system are to be performed based on the one or more statistical measures of version pressure, the one or more modifications affecting version pressure exerted on the speculative versioning cache. An operation of the data processing system is modified based on the one or more determined modifications, in response to a determination that one or more modifications to the operation of the data processing system are to be performed, to affect the version pressure exerted on the speculative versioning cache.
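
The feedback loop described above — collect raw pressure samples from threads, reduce them to a statistical measure, then decide whether to modify system behavior — can be sketched as follows. The capacity, high-water threshold, and throttling decision are illustrative assumptions, not the patent's mechanism:

```python
from statistics import mean

class VersionPressureMonitor:
    """Sketch of a version-pressure feedback loop for a speculative
    versioning cache: record raw pressure, reduce to a statistic,
    and report whether speculation should be throttled."""

    def __init__(self, capacity, high_water=0.8):
        self.capacity = capacity      # cache lines available for versions
        self.high_water = high_water  # threshold on mean pressure
        self.samples = []

    def record(self, versioned_lines):
        """Raw pressure = fraction of lines holding speculative versions."""
        self.samples.append(versioned_lines / self.capacity)

    def should_throttle(self):
        """Modify behavior when mean pressure crosses the high-water mark."""
        return bool(self.samples) and mean(self.samples) > self.high_water

mon = VersionPressureMonitor(capacity=100)
for lines in (60, 70, 95, 98):    # rising pressure from speculative threads
    mon.record(lines)

low = VersionPressureMonitor(capacity=100)
low.record(10)                    # lightly loaded cache
```

Here the mean of the four samples (0.8075) exceeds the 0.8 high-water mark, so `mon` reports that speculation should be throttled while `low` does not.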

  15. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk, and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk, and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
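
A classic instance of the waveform-coding family mentioned above is logarithmic companding, where the signal is compressed before quantization so that quiet passages keep more resolution. The sketch below implements the continuous μ=255 law used in North American 8-bit telephony, shown here as a general illustration rather than as part of the cited text:

```python
import numpy as np

MU = 255.0  # companding parameter of the North American 8-bit telephony law

def mulaw_compress(x):
    """Continuous mu-law compression of a waveform normalized to [-1, 1]."""
    return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

def mulaw_expand(y):
    """Inverse of mulaw_compress."""
    return np.sign(y) * ((1.0 + MU) ** np.abs(y) - 1.0) / MU

x = np.linspace(-1.0, 1.0, 101)          # synthetic waveform samples
roundtrip = mulaw_expand(mulaw_compress(x))
max_err = float(np.max(np.abs(roundtrip - x)))
```

Without a quantizer in the loop the expand step inverts the compress step exactly; the coding gain appears when the compressed value is quantized to 8 bits.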

  16. LFSC - Linac Feedback Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, Valentin; /Fermilab

    2008-05-01

The computer program LFSC (Linac Feedback Simulation Code) is a numerical tool for simulating beam-based feedback in high-performance linacs. The code LFSC is based on the earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulation of the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback on a timescale corresponding to 5-100 Hz and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab for the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format) and plain-text files with numerical parameters, wake fields, ground motion data, etc. The Matlab environment provides a flexible system for graphical output.

  17. The moving mesh code Shadowfax

    CERN Document Server

    Vandenbroucke, Bert

    2016-01-01

    We introduce the moving mesh code Shadowfax, which can be used to evolve a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. The code is written in C++ and its source code is made available to the scientific community under the GNU Affero General Public License. We outline the algorithm and the design of our implementation, and demonstrate its validity through the results of a set of basic test problems, which are also part of the public version. We also compare Shadowfax with a number of other publicly available codes using different hydrodynamical integration schemes, illustrating the advantages and disadvantages of the moving mesh technique.

  18. The moving mesh code SHADOWFAX

    Science.gov (United States)

    Vandenbroucke, B.; De Rijcke, S.

    2016-07-01

    We introduce the moving mesh code SHADOWFAX, which can be used to evolve a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. The code is written in C++ and its source code is made available to the scientific community under the GNU Affero General Public Licence. We outline the algorithm and the design of our implementation, and demonstrate its validity through the results of a set of basic test problems, which are also part of the public version. We also compare SHADOWFAX with a number of other publicly available codes using different hydrodynamical integration schemes, illustrating the advantages and disadvantages of the moving mesh technique.

  19. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

… alternatives to mainstream development, from performances of the live-coding scene to the organizational forms of commons-based peer production; the democratic promise of social media and their paradoxical role in suppressing political expression; and the market's emptying out of possibilities for free … development, Speaking Code unfolds an argument to undermine the distinctions between criticism and practice, and to emphasize the aesthetic and political aspects of software studies. Not reducible to its functional aspects, program code mirrors the instability inherent in the relationship of speech … expression in the public realm. The book's line of argument defends language against its invasion by economics, arguing that speech continues to underscore the human condition, however paradoxical this may seem in an era of pervasive computing.

  20. Performing aggressive code optimization with an ability to rollback changes made by the aggressive optimizations

    Science.gov (United States)

    Gschwind, Michael K

    2013-07-23

    Mechanisms for aggressively optimizing computer code are provided. With these mechanisms, a compiler determines an optimization to apply to a portion of source code and determines if the optimization as applied to the portion of source code will result in unsafe optimized code that introduces a new source of exceptions being generated by the optimized code. In response to a determination that the optimization is an unsafe optimization, the compiler generates an aggressively compiled code version, in which the unsafe optimization is applied, and a conservatively compiled code version in which the unsafe optimization is not applied. The compiler stores both versions and provides them for execution. Mechanisms are provided for switching between these versions during execution in the event of a failure of the aggressively compiled code version. Moreover, predictive mechanisms are provided for predicting whether such a failure is likely.
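
The switch-on-failure behavior can be sketched in a few lines. Here ordinary Python functions stand in for the aggressively and conservatively compiled versions, and an exception stands in for the failure that triggers rollback; all names are hypothetical:

```python
def run_with_rollback(aggressive, conservative, *args):
    """Execute the aggressively optimized version; on a failure introduced
    by the unsafe optimization, roll back and run the conservative version."""
    try:
        return aggressive(*args)
    except Exception:
        return conservative(*args)

# Hypothetical pair: the "aggressive" version hoists a division out of its
# guard and may raise; the "conservative" version keeps the guard.
def aggressive_mean(total, count):
    return total / count                      # unsafe: may divide by zero

def conservative_mean(total, count):
    return total / count if count else 0.0    # guarded, never raises

ok = run_with_rollback(aggressive_mean, conservative_mean, 10, 4)
fallback = run_with_rollback(aggressive_mean, conservative_mean, 10, 0)
```

On the common path the fast version runs; only the input that exposes the unsafe optimization pays for the rollback.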

  1. Enigma Version 12

    Science.gov (United States)

    Shores, David; Goza, Sharon P.; McKeegan, Cheyenne; Easley, Rick; Way, Janet; Everett, Shonn; Guerra, Mark; Kraesig, Ray; Leu, William

    2013-01-01

Enigma Version 12 software combines model building, animation, and engineering visualization into one concise software package. Enigma employs a versatile user interface to allow average users access to even the most complex pieces of the application. Using Enigma eliminates the need to buy and learn several software packages to create an engineering visualization. Models can be created and/or modified within Enigma down to the polygon level. Textures and materials can be applied for additional realism. Within Enigma, these models can be combined to create systems of models that have a hierarchical relationship to one another, such as a robotic arm. These systems can then be animated within the program or controlled by an external application programming interface (API). In addition, Enigma provides the ability to use plug-ins. Plug-ins allow the user to create custom code for a specific application and access the Enigma model and system data, but still use the Enigma drawing functionality. CAD files can be imported into Enigma and combined to create systems of computer graphics models that can be manipulated with constraints. An API is available so that an engineer can write a simulation and drive the computer graphics models with no knowledge of computer graphics. An animation editor allows an engineer to set up sequences of animations generated by simulations or by conceptual trajectories in order to record these to high-quality media for presentation.

  2. Comparative evaluation of the impact of WRF/NMM and WRF/ARW meteorology on CMAQ simulations for PM2.5 and its related precursors during the 2006 TexAQS/GoMACCS study

    OpenAIRE

    Yu, S.; R. Mathur; J. Pleim; G. Pouliot; Wong, D; Eder, B.; Schere, K.; R. Gilliam; S. T. Rao

    2012-01-01

    This study presents a comparative evaluation of the impact of WRF-NMM and WRF-ARW meteorology on CMAQ simulations of PM2.5, its composition and related precursors over the eastern United States with the intensive observations obtained by aircraft (NOAA WP-3), ship and surface monitoring networks (AIRNow, IMPROVE, CASTNet and STN) during the 2006 TexAQS/GoMACCS study. The results at the AIRNow surface sites show that both ARW-CMAQ and NMM-CMAQ reproduced day-to-day variations...

  3. The 2009 Version of the Aeroprediction Code: The AP09

    Science.gov (United States)

    2008-01-01

Aerodynamic Properties and Stability Behavior of the 155mm Howitzer Shell M107, BRL Report No. 2547, Oct 1975 (ARL, Aberdeen, MD). 16. Silton, S. I. …

  4. Modified NASA-Lewis Chemical Equilibrium Code for MHD applications

    Energy Technology Data Exchange (ETDEWEB)

    Sacks, R. A.; Geyer, H. K.; Grammel, S. J.; Doss, E. D.

    1979-12-01

A substantially modified version of the NASA-Lewis Chemical Equilibrium Code has recently been developed. The modifications were designed to extend the power and convenience of the Code as a tool for performing combustor analysis for MHD systems studies. This report describes the effects of the modifications from a user's point of view, but does not describe the Code in detail.

  5. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

In this article we study a class of graph codes with cyclic component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe the codes succinctly using Gröbner bases.
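
As a minimal illustration of the membership test implicit in any Tanner code — the bits incident to each check node must satisfy that node's component code — here is a toy single-parity check over GF(2). The small matrix is an arbitrary example for illustration, not a code from the article:

```python
import numpy as np

# Each row of H is one check node; the 1s mark the bit nodes it touches.
# With single-parity component codes, membership is H @ w = 0 over GF(2).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def is_codeword(word):
    """True iff every check node's parity constraint is satisfied."""
    return not np.any(H @ np.asarray(word) % 2)

good = is_codeword([1, 0, 1, 1, 1, 0])   # satisfies all three checks
bad = is_codeword([1, 1, 0, 0, 0, 0])    # violates the second check
```

Richer component codes (the cyclic affine variety codes of the article) replace the single-parity constraint per row with a full parity-check matrix per check node, but the membership test has the same shape.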

  6. Contact Control, Version 1

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-21

    The contact control code is a generalized force control scheme meant to interface with a robotic arm being controlled using the Robot Operating System (ROS). The code allows the user to specify a control scheme for each control dimension in a way that many different control task controllers could be built from the same generalized controller. The input to the code includes maximum velocity, maximum force, maximum displacement, and a control law assigned to each direction and the output is a 6 degree of freedom velocity command that is sent to the robot controller.
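
A minimal sketch of the per-dimension scheme described above, assuming a simple proportional force law per axis with a velocity clamp (the gain structure and names are illustrative, not the actual ROS node's interface):

```python
def clamp(x, limit):
    """Symmetric saturation to [-limit, limit]."""
    return max(-limit, min(limit, x))

def velocity_command(force, force_ref, gains, v_max):
    """Map per-axis force error through a per-axis proportional law,
    clamped to that axis's velocity limit; returns a 6-DOF command."""
    return [clamp(g * (fr - f), vm)
            for f, fr, g, vm in zip(force, force_ref, gains, v_max)]

wrench   = [2.0, 0.0, 12.0, 0.0, 0.0, 0.0]   # measured force/torque
setpoint = [0.0, 0.0, 5.0, 0.0, 0.0, 0.0]    # desired contact force
gains    = [0.01] * 6                        # one control law per axis
limits   = [0.05] * 6                        # max velocity per axis
cmd = velocity_command(wrench, setpoint, gains, limits)
```

The third axis's error (-7 N at gain 0.01) would command -0.07 m/s but is saturated at the -0.05 m/s limit, showing how the maximum-velocity input bounds the output command.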

  7. Transfinite Version of Welter's Game

    OpenAIRE

    Abuku, Tomoaki

    2017-01-01

We study the transfinite version of Welter's Game, a combinatorial game played on a belt divided into squares, with general ordinal numbers extending the natural numbers. In particular, we obtain a straightforward solution for the transfinite version based on those of the transfinite version of Nim and the original version of Welter's Game.
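
The solution of Nim that the abstract builds on is the nim-sum criterion: a position is a previous-player win exactly when the XOR of the heap sizes is zero. A small sketch of that criterion (plain finite Nim only; the Welter function refinement and the ordinal extension are beyond this illustration):

```python
from functools import reduce
from operator import xor

def nim_sum(heaps):
    """XOR (nim-sum) of the heap sizes."""
    return reduce(xor, heaps, 0)

def is_p_position(heaps):
    """True iff the player to move loses plain Nim with perfect play."""
    return nim_sum(heaps) == 0
```

For example, heaps (1, 2, 3) have nim-sum 0 and are a loss for the player to move, while (1, 2, 4) are a win.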

  8. A note on Type II convolutional codes

    OpenAIRE

    Johannesson, Rolf; Ståhl, Per; Wittenmark, Emma

    2000-01-01

    The result of a search for the world's second type II (doubly-even and self-dual) convolutional code is reported. A rate R=4/8, 16-state, time-invariant, convolutional code with free distance dfree=8 was found to be type II. The initial part of its weight spectrum is better than that of the Golay convolutional code (GCC). Generator matrices and path weight enumerators for some other type II convolutional codes are given. By the “wrap-around” technique tail-biting versions of (32, 18, 8) T...

  9. Review of AVS Audio Coding Standard

    Institute of Scientific and Technical Information of China (English)

    ZHANG Tao; ZHANG Caixia; ZHAO Xin

    2016-01-01

Audio Video Coding Standard (AVS) is a second-generation source coding standard and the first standard for audio and video coding in China with independent intellectual property rights. Its performance has reached the international standard. Its coding efficiency is 2 to 3 times greater than that of MPEG-2. This technical solution is simpler, and it can greatly save channel resources. After more than ten years' development, AVS has achieved great success. The latest version of the AVS audio coding standard is ongoing and mainly aims at the increasing demand for low-bitrate and high-quality audio services. The paper reviews the history and recent development of the AVS audio coding standard in terms of basic features, key techniques and performance. Finally, the future development of the AVS audio coding standard is discussed.

  10. An implicit Smooth Particle Hydrodynamic code

    Energy Technology Data Exchange (ETDEWEB)

    Knapp, Charles E. [Univ. of New Mexico, Albuquerque, NM (United States)

    2000-05-01

An implicit version of the Smooth Particle Hydrodynamic (SPH) code SPHINX has been written and is working. In conjunction with the SPHINX code, the new implicit code models fluids and solids under a wide range of conditions. SPH codes are Lagrangian and meshless, and use particles to model the fluids and solids. The implicit code makes use of Krylov iterative techniques for solving large linear systems and a Newton-Raphson method for non-linear corrections. It uses numerical derivatives to construct the Jacobian matrix, and sparse techniques to save on memory storage and to reduce the amount of computation. It is believed that this is the first implicit SPH code to use Newton-Krylov techniques, and also the first implicit SPH code to model solids. A description of SPH and the techniques used in the implicit code are presented. Then, the results of a number of test cases are discussed, which include a shock tube problem, a Rayleigh-Taylor problem, a breaking dam problem, and a single jet of gas problem. The results are shown to be in very good agreement with analytic solutions, experimental results, and the explicit SPHINX code. For the single jet of gas it has been demonstrated that the implicit code can do the problem in much shorter time than the explicit code. The problem was, however, very unphysical, but it does demonstrate the potential of the implicit code. It is a first step toward a useful implicit SPH code.
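
Two of the ingredients named above — numerical derivatives to build the Jacobian, and a linear solve for each Newton correction — can be sketched on a toy nonlinear system. The dense solve below stands in where the real code uses sparse Krylov methods; the system and all names are illustrative:

```python
import numpy as np

def newton_fd(f, x0, tol=1e-10, h=1e-7, max_iter=50):
    """Newton-Raphson iteration with a finite-difference Jacobian."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = f(x)
        if np.linalg.norm(fx) < tol:
            break
        jac = np.empty((x.size, x.size))
        for j in range(x.size):           # one perturbed residual per column
            xp = x.copy()
            xp[j] += h
            jac[:, j] = (f(xp) - fx) / h
        x = x - np.linalg.solve(jac, fx)  # Newton correction (dense solve)
    return x

# Toy nonlinear system with root (1, 2): x^2 + y = 3, x + y^2 = 5.
def residual(v):
    x, y = v
    return np.array([x ** 2 + y - 3.0, x + y ** 2 - 5.0])

root = newton_fd(residual, [2.0, 1.0])
```

Swapping `np.linalg.solve` for a Krylov solver (e.g. GMRES) with a sparse or matrix-free Jacobian turns this into the Newton-Krylov pattern the abstract describes.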

  11. Comparison of in situ and columnar aerosol spectral measurements during TexAQS-GoMACCS 2006: testing parameterizations for estimating aerosol fine mode properties

    Directory of Open Access Journals (Sweden)

    D. B. Atkinson

    2010-01-01

During the 2006 Texas Air Quality Study and Gulf of Mexico Atmospheric Composition and Climate Study (TexAQS-GoMACCS 2006), the optical, chemical and microphysical properties of atmospheric aerosols were measured on multiple mobile platforms and at ground-based stations. In situ measurements of the aerosol light extinction coefficient (σep) were performed by two multi-wavelength cavity ring-down (CRD) instruments, one located on board the NOAA R/V Ronald H. Brown (RHB) and the other located at the University of Houston, Moody Tower (UHMT). An AERONET sunphotometer was also located at the UHMT to measure the columnar aerosol optical depth (AOD). The σep data were used to extract the extinction Ångström exponent (åep), a measure of the wavelength dependence of σep. There was general agreement between the åep (and to a lesser degree σep) measurements by the two spatially separated CRD instruments during multi-day periods, suggesting a regional-scale consistency of the sampled aerosols. Two spectral models are applied to the σep and AOD data to extract the fine mode fraction of extinction (η) and the fine mode effective radius (Reff,f). These two parameters are robust measures of the fine mode contribution to total extinction and the fine mode size distribution, respectively. The results of the analysis are compared to Reff,f values extracted using AERONET V2 retrievals and calculated from in situ particle size measurements on the RHB and at UHMT. During a time period when fine mode aerosols dominated the extinction over a large area extending from Houston/Galveston Bay and out into the Gulf of Mexico, the various methods for obtaining Reff,f agree qualitatively (showing the same temporal trend) and quantitatively (pooled standard deviation = 28 nm).
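
The extinction Ångström exponent (åep) used above follows from the power-law assumption σep ∝ λ^(-å), so two wavelengths suffice to extract it. A minimal sketch with synthetic values (the wavelengths and magnitudes are illustrative, not campaign data):

```python
import numpy as np

def angstrom_exponent(sigma_1, sigma_2, lam_1, lam_2):
    """Extinction Angstrom exponent from extinction at two wavelengths:
    sigma ~ lambda**(-a)  =>  a = -ln(s1/s2) / ln(l1/l2)."""
    return -np.log(sigma_1 / sigma_2) / np.log(lam_1 / lam_2)

# Synthetic aerosol constructed with a = 1.5 exactly:
s2 = 100.0                                   # Mm^-1 at 355 nm
s1 = s2 * (532.0 / 355.0) ** -1.5            # Mm^-1 at 532 nm
a_exp = angstrom_exponent(s1, s2, lam_1=532.0, lam_2=355.0)
```

Larger åep indicates stronger wavelength dependence and hence a larger fine-mode contribution, which is why åep feeds the fine-mode-fraction retrievals described in the abstract.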

  12. Comparison of in situ and columnar aerosol spectral measurements during TexAQS-GoMACCS 2006: testing parameterizations for estimating aerosol fine mode properties

    Directory of Open Access Journals (Sweden)

    D. B. Atkinson

    2009-08-01

During the 2006 Texas Air Quality Study and Gulf of Mexico Atmospheric Composition and Climate Study (TexAQS-GoMACCS 2006), the optical, chemical and microphysical properties of atmospheric aerosols were measured on multiple mobile platforms and at ground-based stations. In situ measurements of the aerosol light extinction coefficient (σep) were performed by two multi-wavelength cavity ring-down (CRD) instruments, one located on board the NOAA R/V Ronald H. Brown (RHB) and the other located at the University of Houston, Moody Tower (UHMT). An AERONET sunphotometer was also located at the UHMT to measure the columnar aerosol optical depth (AOD). The σep data were used to extract the extinction Ångström exponent (åep), a measure of the wavelength dependence of σep. There was general agreement between the åep (and to a lesser degree σep) measurements by the two spatially separated CRD instruments during multi-day periods, suggesting a regional-scale consistency of the sampled aerosols. Two spectral models are applied to the σep and AOD data to extract the fine mode fraction of extinction (η) and the fine mode effective radius (Reff,f). These two parameters are robust measures of the fine mode contribution to total extinction and the fine mode size distribution, respectively. The results of the analysis are compared to Reff,f values extracted using AERONET V2 retrievals and calculated from in situ particle size measurements on the RHB and at UHMT. During a time period when fine mode aerosols dominated the extinction over a large area extending from Houston/Galveston Bay and out into the Gulf of Mexico, the various methods for obtaining Reff,f agree qualitatively (showing the same temporal trend) and quantitatively (pooled standard deviation

  13. Oxygenated fraction and mass of organic aerosol from direct emission and atmospheric processing measured on the R/V Ronald Brown during TEXAQS/GoMACCS 2006

    Science.gov (United States)

    Russell, L. M.; Takahama, S.; Liu, S.; Hawkins, L. N.; Covert, D. S.; Quinn, P. K.; Bates, T. S.

    2009-04-01

    Submicron particles collected on Teflon filters aboard the R/V Ronald Brown during the Texas Air Quality Study and Gulf of Mexico Atmospheric Composition and Climate Study (TexAQS/GoMACCS) 2006 in and around the port of Houston, Texas, were measured by Fourier transform infrared (FTIR) and X-ray fluorescence for organic functional groups and elemental composition. Organic mass (OM) concentrations (1-25 μg m-3) for ambient particle samples measured by FTIR showed good agreement with measurements made with an aerosol mass spectrometer. The fractions of organic mass identified as alkane and carboxylic acid groups were 47% and 32%, respectively. Three different types of air masses were identified on the basis of the air mass origin and the radon concentration, with significantly higher carboxylic acid group mass fractions in air masses from the north (35%) than the south (29%) or Gulf of Mexico (26%). Positive matrix factorization analysis attributed carboxylic acid fractions of 30-35% to factors with mild or strong correlations (r > 0.5) to elemental signatures of oil combustion and 9-24% to wood smoke, indicating that part of the carboxylic acid fraction of OM was formed by the same sources that controlled the metal emissions, namely the oil and wood combustion activities. The implication is that a substantial part of the measured carboxylic acid contribution was formed independently of traditionally "secondary" processes, which would be affected by atmospheric (both photochemical and meteorological) conditions and other emission sources. The carboxylic acid group fractions in the Gulf of Mexico and south air masses (GAM and SAM, respectively) were largely oil combustion emissions from ships as well as background marine sources, with only limited recent land influences (based on radon concentrations). Alcohol groups accounted for 14% of OM (mostly associated with oil combustion emissions and background sources), and amine groups accounted for 4% of OM in all air

  14. RRTMGP: A High-Performance Broadband Radiation Code for the Next Decade

    Science.gov (United States)

    2015-09-30

    the PI of this project, and his team at AER includes programmers with experience coding for modern computer architectures, including the recent GPU ... Supercomputer Center (CSCS) in Lugano will be developing a GPU version (OpenACC) of this code for use in the ICON LES model. This version will provide a ... significant foundation for the GPU version of our code that is a deliverable for this project. Andre Wehe of AER will spend the first week in November

  15. AERONET Version 3 processing

    Science.gov (United States)

    Holben, B. N.; Slutsker, I.; Giles, D. M.; Eck, T. F.; Smirnov, A.; Sinyuk, A.; Schafer, J.; Rodriguez, J.

    2014-12-01

    The Aerosol Robotic Network (AERONET) database has evolved in measurement accuracy, data quality, products, and availability to the scientific community over the course of 21 years with the support of NASA, PHOTONS and all federated partners. This evolution is periodically manifested as a new data version release, created by carefully reprocessing the entire database with the most current algorithms that fundamentally change the database and ultimately the data products used by the community. The newest processing, Version 3, will be released in 2015 after the entire database is reprocessed and real-time data processing becomes operational. All Version 3 algorithms have been developed and individually vetted, and represent four main categories: aerosol optical depth (AOD) processing, inversion processing, database management, and new products. The primary trigger for release of Version 3 lies with cloud screening of the direct sun observations and computation of AOD, which will fundamentally change all data available for analysis and all subsequent retrieval products. This presentation will illustrate the innovative approach used for cloud screening and assess the elements of Version 3 AOD relative to the current version. We will also present the advances in the inversion product processing, with emphasis on the random and systematic uncertainty estimates. This processing will be applied to the new hybrid measurement scenario intended to provide inversion retrievals for all solar zenith angles. We will introduce automatic quality assurance criteria that will allow near-real-time quality-assured aerosol products necessary for real-time satellite and model validation and assimilation. Lastly, we will introduce the new management structure that will improve access to the database. The current Version 2 will be supported for at least two years after the initial release of Version 3 to maintain continuity for ongoing investigations.

  16. NOVEL BIPHASE CODE -INTEGRATED SIDELOBE SUPPRESSION CODE

    Institute of Scientific and Technical Information of China (English)

    Wang Feixue; Ou Gang; Zhuang Zhaowen

    2004-01-01

    A kind of novel binary phase code, named the sidelobe suppression code, is proposed in this paper. It is defined as the code whose corresponding optimal sidelobe suppression filter outputs the minimum sidelobes. It is shown that there do exist sidelobe suppression codes better than the conventional optimal codes, the Barker codes. For example, the sidelobe suppression code of length 11 with a filter of length 39 achieves a sidelobe level up to 17 dB better than that of the Barker code with the same code length and filter length.
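As context for the sidelobe comparison above, the peak sidelobe level of the classical length-11 Barker code can be computed directly from its aperiodic autocorrelation. A minimal sketch (function names are illustrative, not from the paper):

```python
import math

def aperiodic_autocorr(code):
    """Aperiodic autocorrelation of a +/-1 sequence at all non-negative lags."""
    n = len(code)
    return [sum(code[i] * code[i + k] for i in range(n - k)) for k in range(n)]

# The length-11 Barker code: every aperiodic sidelobe has magnitude <= 1.
barker11 = [+1, +1, +1, -1, -1, -1, +1, -1, -1, +1, -1]
acf = aperiodic_autocorr(barker11)
mainlobe = acf[0]                              # 11
peak_sidelobe = max(abs(v) for v in acf[1:])   # 1
psl_db = 20 * math.log10(peak_sidelobe / mainlobe)  # about -20.8 dB
```

This -20.8 dB peak sidelobe level is the baseline that a sidelobe suppression filter, or a better code/filter pair, would improve on.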

  17. The NJOY Nuclear Data Processing System, Version 2016

    Energy Technology Data Exchange (ETDEWEB)

    Macfarlane, Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Muir, Douglas W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Boicourt, R. M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kahler, III, Albert Comstock [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-09

    The NJOY Nuclear Data Processing System, version 2016, is a comprehensive computer code package for producing pointwise and multigroup cross sections and related quantities from evaluated nuclear data in the ENDF-4 through ENDF-6 legacy card-image formats. NJOY works with evaluated files for incident neutrons, photons, and charged particles, producing libraries for a wide variety of particle transport and reactor analysis codes.

  18. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  19. Modifications to the FCHART/SLR version 2.0 program

    Energy Technology Data Exchange (ETDEWEB)

    Hill, J. M.

    1981-07-01

    A number of errors have been detected in the FCHART/SLR computer code as it pertains to the thermal performance of passive solar energy systems. Along with minor coding changes, major revisions have been made to improve the computer models used to predict the effects of overhangs on incident solar radiation and the radiation absorbed in attached sunspaces. Modifications were also made to improve the handling of mullions and to reduce the effort required to describe the placement of overhangs. The theoretical basis of these changes, along with the associated alterations to the code, is given. For the cases examined, the modified program now agrees to within 15% of published LANL passive system performance correlations. This new code has been designated as Version 2.1 and is presently operational at SSEC.

  20. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.

  1. Assessment of TRACE Code for MIT Pressurizer Tests to Review Industrial Code

    Energy Technology Data Exchange (ETDEWEB)

    Song, Chanyi; Bang, Young Seok; Shin, Andong; Woo, Sweng-Wong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-10-15

    The Korea Institute of Nuclear Safety (KINS) has initiated a review of the industrial code for safety analysis of nuclear power plants, in which the MARS-KS and TRACE codes are being used to support the understanding of specific phenomena and code predictions. To this end, the TRACE code was assessed against the MIT pressurizer tests. The TRACE code has been developed continuously, and the NRC recently released TRACE version 5.0 patch 4, which includes some improvements over version 5.0 patch 3. In this paper, TRACE calculations with version 5.0 patch 3 and patch 4 for three cases of the MIT pressurizer tests have been performed to assess the applicability of the TRACE code for verification of industrial codes. The MIT pressurizer test is one of the fundamental separate effect tests and is frequently simulated to verify safety analysis codes. The predictability of a system code for the behavior of the pressurizer in the plant is very important because it affects the progress of accidents such as loss of coolant, control rod withdrawal, and loss of feedwater flow. In the reactor protection system, the high pressurizer pressure trip signal provides assurance of the integrity of the RCS boundary for AOOs that could lead to an overpressurization of the RCS. Also, the low pressurizer pressure trip signal provides assistance for the ESF during system pressure reduction events and a LOCA. According to the results, the node effect was significantly reduced in patch 4 compared with patch 3 of TRACE version 5.0. Based on the predictions of Test ST4 and Test A, at least 20 cells are needed to predict pressurizer insurge behavior reasonably with patch 3, whereas the patch 4 results show that 10 cells are enough to simulate the transient behavior of the pressurizer. For outsurge case B, there was no major difference between patch 3 and patch 4, although this is not shown in this paper. Overall, the results of TRACE version 5.0 patch 4 fit well with those of the experiments.

  2. Good Codes From Generalised Algebraic Geometry Codes

    CERN Document Server

    Jibril, Mubarak; Ahmed, Mohammed Zaki; Tjhai, Cen

    2010-01-01

    Algebraic geometry codes or Goppa codes are defined with places of degree one. In constructing generalised algebraic geometry codes places of higher degree are used. In this paper we present 41 new codes over GF(16) which improve on the best known codes of the same length and rate. The construction method uses places of small degree with a technique originally published over 10 years ago for the construction of generalised algebraic geometry codes.

  3. Assessment of uncertainties in early off-site consequences from nuclear reactor accidents

    Energy Technology Data Exchange (ETDEWEB)

    Madni, I.K.; Cazzoli, E.G. (Brookhaven National Lab., Dept. of Nuclear Energy, Upton, NY (US)); Khatib-Rahbar, M. (Energy Research, Inc., Rockville, MD (US))

    1990-04-01

    A simplified approach has been developed to calculate uncertainties in early off-site consequences from nuclear reactor accidents. The consequence model (SMART) is based on a solution procedure that uses simplified meteorology and involves direct analytic integration of the air concentration equations over time and position. This is different from the discretization approach currently used in the CRAC2 and MACCS codes. The SMART code is fast running, thereby providing a valuable tool for sensitivity and uncertainty studies. The code was benchmarked against both MACCS version 1.4 and CRAC2. Results of the benchmarking and of detailed sensitivity and uncertainty analyses using SMART are presented.
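The air-concentration equations that SMART integrates analytically are of the standard Gaussian plume form used in consequence codes of this family. A minimal sketch of that underlying formula (not SMART's actual implementation; parameter names are illustrative):

```python
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Gaussian plume air concentration with ground reflection.

    Q: source strength, u: wind speed, y: crosswind offset, z: height,
    H: effective release height, sigma_y/sigma_z: dispersion parameters.
    """
    lateral = math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    # Image-source term reflects the plume at the ground (z = 0).
    vertical = (math.exp(-(z - H) ** 2 / (2.0 * sigma_z ** 2)) +
                math.exp(-(z + H) ** 2 / (2.0 * sigma_z ** 2)))
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

With simplified meteorology, sigma_y and sigma_z become analytic functions of downwind distance, which is what makes direct time-and-position integration tractable.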

  4. Assessment of TRACE Code for GE Level Swell Test to Review Industrial Code

    Energy Technology Data Exchange (ETDEWEB)

    Song, Chanyi; Cheng, Ae Ju; Bang, Young Seok; Hwang, Taesuk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-10-15

    The Korea Institute of Nuclear Safety (KINS) has reviewed the industrial code for safety analysis of nuclear power plants, in which the TRACE and MARS-KS codes are being used to support the understanding of specific phenomena and code predictions. To this end, the TRACE code was assessed against the GE Level Swell experiments. General Electric (GE) performed a series of experiments to investigate thermal-hydraulic phenomena such as critical flow, void distribution, and liquid-vapor mixture swell during blowdown conditions. These GE Level Swell experiments are frequently simulated as separate effect tests to verify safety analysis codes. TRACE calculations with version 5.0 patch 4 for GE Level Swell experiment 1004-3 have been performed to assess the applicability of the TRACE code for verification of industrial codes, with the results compared against the SPACE code. Overall, TRACE predicted the pressure and axial void fractions at different times reasonably well for the 1004-3 blowdown test, while SPACE tends to underestimate the pressure. It was also found that the void fraction distributions should be compared at different times to assess the accuracy of the SPACE code against this test.

  5. PVWatts Version 5 Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dobos, A. P.

    2014-09-01

    The NREL PVWatts calculator is a web application developed by the National Renewable Energy Laboratory (NREL) that estimates the electricity production of a grid-connected photovoltaic system based on a few simple inputs. PVWatts combines a number of sub-models to predict overall system performance, and includes several built-in parameters that are hidden from the user. This technical reference describes the sub-models, documents assumptions and hidden parameters, and explains the sequence of calculations that yield the final system performance estimate. This reference is applicable to the significantly revised version of PVWatts released by NREL in 2014.
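The core of the sub-model chain described above is a DC power model that scales the nameplate rating linearly with plane-of-array irradiance and derates it with cell temperature. A hedged sketch of that calculation (the temperature coefficient value used here is an assumption; the manual documents the exact defaults):

```python
def pvwatts_dc(poa_wm2, t_cell_c, pdc0_w, gamma_per_c=-0.0047, t_ref_c=25.0):
    """PVWatts-style DC output: linear in irradiance, temperature-derated.

    poa_wm2:  plane-of-array irradiance [W/m^2]
    t_cell_c: cell temperature [deg C]
    pdc0_w:   nameplate DC rating at 1000 W/m^2 and 25 deg C [W]
    """
    return pdc0_w * (poa_wm2 / 1000.0) * (1.0 + gamma_per_c * (t_cell_c - t_ref_c))
```

At reference conditions (1000 W/m^2, 25 deg C) the model returns the nameplate rating; hotter cells or lower irradiance reduce the output.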

  6. Nuflood, Version 1.x

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-29

    NUFLOOD Version 1.x is a surface-water hydrodynamic package designed for the simulation of overland flow of fluids. It consists of various routines to address a wide range of applications (e.g., rainfall-runoff, tsunami, storm surge) and real-time, interactive visualization tools. NUFLOOD has been designed for general-purpose computers and workstations containing multi-core processors and/or graphics processing units. The software is easy to use and extensible, designed with instructors, students, and practicing engineers in mind. NUFLOOD is intended to assist the water resource community in planning against water-related natural disasters.

  7. TOGAF version 9

    CERN Document Server

    Group, The Open

    2010-01-01

    This is the official Open Group Pocket Guide for TOGAF Version 9 Enterprise Edition, published by Van Haren Publishing on behalf of The Open Group. TOGAF, The Open Group Architecture Framework, is a fast-growing, worldwide accepted standard that can help organisations build their own Enterprise Architecture in a standardised way. This book explains the ins and outs of TOGAF in a concise manner, and shows how TOGAF can help in creating an Enterprise Architecture. Enterprise Architecture is an approach that can help management to understand this growing complexity.

  8. Polynomial theory of error correcting codes

    CERN Document Server

    Cancellieri, Giovanni

    2015-01-01

    The book offers an original view on channel coding, based on a unitary approach to block and convolutional codes for error correction. It presents both new concepts and new families of codes. For example, lengthened and modified lengthened cyclic codes are introduced as a bridge towards time-invariant convolutional codes and their extension to time-varying versions. The novel families of codes include turbo codes and low-density parity check (LDPC) codes, the features of which are justified from the structural properties of the component codes. Design procedures for regular LDPC codes are proposed, supported by the presented theory. Quasi-cyclic LDPC codes, in block or convolutional form, represent one of the most original contributions of the book. The use of more than 100 examples allows the reader gradually to gain an understanding of the theory, and the provision of a list of more than 150 definitions, indexed at the end of the book, permits rapid location of sought information.

  9. CERN access cards - Introduction of a bar code (Reminder)

    CERN Multimedia

    Relations with the Host States Service

    2004-01-01

    In accordance with the latest revised version of the implementation measures relating to Operational Circular No. 2, CERN access cards may bear a bar code transcribing the holder's identification number (the revised version of this subsidiary document to the aforementioned Circular will be published shortly). Relations with the Host States Service http://www.cern.ch/relations/ relations.secretariat@cern.ch Tel. 72848

  10. Code Development and Analysis Program: developmental checkout of the BEACON/MOD2A code. [PWR

    Energy Technology Data Exchange (ETDEWEB)

    Ramsthaler, J. A.; Lime, J. F.; Sahota, M. S.

    1978-12-01

    A best-estimate transient containment code, BEACON, is being developed by EG and G Idaho, Inc. for the Nuclear Regulatory Commission's reactor safety research program. This is an advanced, two-dimensional fluid flow code designed to predict temperatures and pressures in a dry PWR containment during a hypothetical loss-of-coolant accident. The most recent version of the code, MOD2A, is presently in the final stages of production prior to being released to the National Energy Software Center. As part of the final code checkout, seven sample problems were selected to be run with BEACON/MOD2A.

  11. EOSlib, Version 3

    Energy Technology Data Exchange (ETDEWEB)

    2017-02-03

    Equilibrium thermodynamics underpins many of the technologies used throughout theoretical physics, yet verification of the various theoretical models in the open literature remains challenging. EOSlib provides a single, consistent, verifiable implementation of these models, in a single, easy-to-use software package. It consists of three parts: a software library implementing various published equation-of-state (EOS) models; a database of fitting parameters for various materials for these models; and a number of useful utility functions for simplifying thermodynamic calculations such as computing Hugoniot curves or Riemann problem solutions. Ready availability of this library will enable reliable code-to-code testing of equation-of-state implementations, as well as a starting point for more rigorous verification work. EOSlib also provides a single, consistent API for its analytic and tabular EOS models, which simplifies the process of comparing models for a particular application.

  12. EMERGENCIES NEW VERSION

    CERN Multimedia

    Medical Service

    2002-01-01

    The table of emergency numbers that appeared in Bulletin 10/2002 is out of date. The updated version provided by the Medical Service appears on the following page. Please disregard the previous version.
    URGENT NEED OF A DOCTOR
    GENEVA
    - Patient not fit to be moved: call your family doctor, or SOS MEDECINS (24h/24h) 748 49 50, or ASSOC. OF GENEVA DOCTORS (7h-23h) 322 20 20
    - Patient can be moved: HOPITAL CANTONAL, 24 Micheli du Crest, 372 33 11 / 382 33 11; CHILDREN'S HOSPITAL, 6 rue Willy Donzé, 382 68 18 / 382 45 55; MATERNITY, 24 Micheli du Crest, 382 68 16 / 382 33 11; OPHTALMOLOGY, 22 Alcide Jentzer, 382 84 00; HOPITAL DE LA TOUR, Meyrin, 719 61 11; CENTRE MEDICAL DE MEYRIN, Champs Fréchets, 719 74 00
    EMERGENCIES
    - Fire brigade: 118; CERN fire brigade: 767 44 44
    - Urgent need of an ambulance (Geneva and Vaud): 144
    - Police: 117
    - Anti-poison centre (24h/24h): 01 251 51 510
    - European emergency call: 112
    FRANCE
    - Patient not fit to be moved: call your family doctor
    - Patient can be moved: ST. JULIE...

  13. DF-224 Version 7.3 FS Installation

    Science.gov (United States)

    Boyce, Leslye

    1990-12-01

    Version 7.3 will implement a three logical memory unit contingency plan. It removes underutilized DF-224 code to recover memory margins and restructures code. The installation of DF-224 version 7.3 will be broken into installation, regression test, and contingency back-out periods. In addition, it eliminates the 10- and 300-second processing rates. It is highly desirable to implement this version of the flight software as soon as possible. Due to the complexity of the installation, dedicated spacecraft time is required to minimize spacecraft risk. In summary, this proposal requests 14 orbits of dedicated spacecraft time for installation, regression test, and contingency back-out time for DF-224 FS V7.3, to be scheduled in the first few days of SMS 054.

  14. Correction, improvement and model verification of CARE 3, version 3

    Science.gov (United States)

    Rose, D. M.; Manke, J. W.; Altschul, R. E.; Nelson, D. L.

    1987-01-01

    An independent verification of the CARE 3 mathematical model and computer code was conducted and reported in NASA Contractor Report 166096, Review and Verification of CARE 3 Mathematical Model and Code: Interim Report. The study uncovered some implementation errors that were corrected and are reported in this document. The corrected CARE 3 program is called version 4. The document Correction, Improvement, and Model Verification of CARE 3, Version 3 was written in April 1984. It is being published now as it has been determined to contain a more accurate representation of CARE 3 than the preceding document of April 1983. This edition supersedes NASA-CR-166122, entitled 'Correction and Improvement of CARE 3, Version 3,' April 1983.

  15. MC3, Version 1

    Energy Technology Data Exchange (ETDEWEB)

    2016-09-09

    The MC3 code is used to perform Monte Carlo simulations in the isothermal-isobaric ensemble (constant number of particles, temperature, and pressure) on molecular crystals. The molecules within the periodic simulation cell are treated as rigid bodies, alleviating the requirement for a complex interatomic potential. Intermolecular interactions are described using generic, atom-centered pair potentials whose parameterization is taken from the literature [D. E. Williams, J. Comput. Chem., 22, 1154 (2001)] and electrostatic interactions arising from atom-centered, fixed, point partial charges. The primary uses of the MC3 code are the computation of (i) the temperature and pressure dependence of lattice parameters and thermal expansion coefficients, (ii) tensors of elastic constants and compliances via Parrinello and Rahman's fluctuation formula [M. Parrinello and A. Rahman, J. Chem. Phys., 76, 2662 (1982)], and (iii) the investigation of polymorphic phase transformations. The MC3 code is written in Fortran90 and requires the LAPACK and BLAS linear algebra libraries to be linked during compilation. Computationally expensive loops are accelerated using OpenMP.
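For a sense of what sampling the isothermal-isobaric ensemble involves, the Metropolis acceptance rule for a trial volume change can be sketched as follows (a generic NPT acceptance criterion in reduced units, not MC3's actual code):

```python
import math
import random

def accept_volume_move(delta_u, pressure, v_old, v_new, n_particles, beta):
    """Metropolis acceptance test for an NPT volume move (reduced units).

    Accept with probability min(1, exp(-beta*(dU + P*dV) + N*ln(V_new/V_old))),
    where the N*ln(V_new/V_old) term accounts for the rescaling of coordinates.
    """
    arg = (-beta * (delta_u + pressure * (v_new - v_old))
           + n_particles * math.log(v_new / v_old))
    return random.random() < math.exp(min(arg, 0.0))
```

Particle (here: rigid-molecule) translation and rotation moves use the ordinary Metropolis criterion; only volume moves carry the pressure-volume and logarithmic terms.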

  16. SCDAP/RELAP5 code development and assessment

    Energy Technology Data Exchange (ETDEWEB)

    Allison, C.M.; Hohorst, J.K. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1996-03-01

    The SCDAP/RELAP5 computer code is designed to describe the overall reactor coolant system thermal-hydraulic response, core damage progression, and fission product release during severe accidents. The code is being developed at the Idaho National Engineering Laboratory under the primary sponsorship of the Office of Nuclear Regulatory Research of the U.S. Nuclear Regulatory Commission. The current version of the code is SCDAP/RELAP5/MOD3.1e. Although MOD3.1e contains a number of significant improvements since the initial version of MOD3.1 was released, new models to treat the behavior of the fuel and cladding during reflood have had the most dramatic impact on the code's calculations. This paper provides a brief description of the new reflood models, presents highlights of the assessment of the current version of MOD3.1, and discusses future SCDAP/RELAP5/MOD3.2 model development activities.

  17. FLUKA: A Multi-Particle Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Ferrari, A.; Sala, P.R.; /CERN /INFN, Milan; Fasso, A.; /SLAC; Ranft, J.; /Siegen U.

    2005-12-14

    This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner's guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.

  18. Decoding Algorithms for Random Linear Network Codes

    DEFF Research Database (Denmark)

    Heide, Janus; Pedersen, Morten Videbæk; Fitzek, Frank

    2011-01-01

    achieve a high coding throughput, and reduce energy consumption.We use an on-the-fly version of the Gauss-Jordan algorithm as a baseline, and provide several simple improvements to reduce the number of operations needed to perform decoding. Our tests show that the improvements can reduce the number...
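The on-the-fly Gauss-Jordan baseline mentioned above can be illustrated for the binary field GF(2), where coding vectors and payloads are bitmasks and every row operation is an XOR (a simplified sketch, not the authors' implementation):

```python
class OnTheFlyGF2Decoder:
    """On-the-fly Gauss-Jordan decoder for random linear network coding over GF(2).

    Each received packet carries a coding vector (bitmask over the generation's
    source symbols) and a payload (here also an int bitmask). Rows are kept in
    reduced row echelon form, so decoding finishes as soon as full rank is reached.
    """

    def __init__(self, generation_size):
        self.g = generation_size
        self.pivots = {}  # leading-bit position -> (coeff_bitmask, payload)

    def receive(self, coeffs, payload):
        # Forward-reduce the incoming packet against the existing pivot rows.
        for lead, (c, p) in self.pivots.items():
            if coeffs >> lead & 1:
                coeffs ^= c
                payload ^= p
        if coeffs == 0:
            return False  # linearly dependent packet, discard
        lead = coeffs.bit_length() - 1
        # Back-substitute into stored rows: this is the Gauss-Jordan step that
        # keeps the matrix in reduced form at all times.
        for other in list(self.pivots):
            c, p = self.pivots[other]
            if c >> lead & 1:
                self.pivots[other] = (c ^ coeffs, p ^ payload)
        self.pivots[lead] = (coeffs, payload)
        return True

    def decoded(self):
        return len(self.pivots) == self.g
```

Once full rank is reached the pivot rows are the identity, so `pivots[i]` holds the decoded payload of source symbol `i` directly.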

  19. CLES, Code Liegeois d'Evolution Stellaire

    CERN Document Server

    Scuflaire, R; Montalban, J; Miglio, A; Bourge, P -O; Godart, M; Thoul, A; Noels, A

    2007-01-01

    Cles is an evolution code recently developed to produce stellar models meeting the specific requirements of studies in asteroseismology. It offers users a wide choice of input physics for their models, and its versatility allows them to tailor the code to their needs and easily implement new features. We describe the features implemented in the current version of the code and the techniques used to solve the equations of stellar structure and evolution. A brief account is given of the use of the program and of a solar calibration realized with it.

  20. Space Time Codes from Permutation Codes

    CERN Document Server

    Henkel, Oliver

    2006-01-01

    A new class of space time codes with high performance is presented. The code design utilizes tailor-made permutation codes, which are known to have large minimal distances as spherical codes. A geometric connection between spherical and space time codes has been used to translate them into the final space time codes. Simulations demonstrate that the performance increases with the block length, a result that had already been conjectured in previous work. Further, the connection to permutation codes allows for moderately complex encoding/decoding algorithms.

  1. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field:
    * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding
    * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes
    * Distance properties of convolutional codes
    * Includes a downloadable solutions manual
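As a small taste of the subject, a rate-1/2 convolutional encoder with constraint length 3 and the classic octal generator pair (7, 5) fits in a few lines (an illustrative sketch, not taken from the book):

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder, constraint length 3, generators (7,5) octal."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111           # shift the new input bit in
        out.append(bin(state & g1).count("1") % 2)   # parity of the g1 taps
        out.append(bin(state & g2).count("1") % 2)   # parity of the g2 taps
    return out

# Each input bit produces two output bits (rate 1/2).
encoded = conv_encode([1, 0, 1, 1])
```

A Viterbi decoder for this code would search the 4-state trellis defined by the same generators; distance properties of the code determine how many channel errors that search can correct.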

  2. On the use of MOZAIC-IAGOS data to assess the ability of the MACC reanalysis to reproduce the distribution of ozone and CO in the UTLS over Europe

    Directory of Open Access Journals (Sweden)

    Audrey Gaudel

    2015-12-01

    MOZAIC-IAGOS data are used to assess the ability of the MACC reanalysis (REAN) to reproduce distributions of ozone (O3) and carbon monoxide (CO), along with their vertical and inter-annual variability, in the upper troposphere/lower stratosphere (UTLS) region over Europe for the period 2003–2010. A control run (CNTRL, without assimilation) is compared with the MACC reanalysis (REAN, with assimilation) to assess the impact of assimilation. On average over the period, REAN underestimates ozone by 60 ppbv in the lower stratosphere (LS), whilst CO is overestimated by 20 ppbv. In the upper troposphere (UT), ozone is overestimated by 50 ppbv, while CO is partly over- or underestimated by up to 20 ppbv. As expected, assimilation generally improves model results, but there are some exceptions. Assimilation leads to increased CO mixing ratios in the UT, which reduce the biases of the model in this region, but the difference in CO mixing ratios between the LS and UT has not changed and remains underestimated after assimilation. Therefore, this leads to a significant positive bias of CO in the LS after assimilation. Assimilation improves estimates of the amplitude of the seasonal cycle for both species. Additionally, the observations clearly show a general negative trend of CO in the UT which is rather well reproduced by REAN. However, REAN misses the observed inter-annual variability in summer. The O3–CO correlation in the Ex-UTLS is rather well reproduced by the CNTRL and REAN, although REAN tends to miss the lowest CO mixing ratios in all four seasons and tends to oversample the extra-tropical transition layer (ExTL) region in spring. This evaluation stresses the importance of the model gradients for a good description of the mixing in the Ex-UTLS region, which is inherently difficult to observe from satellite instruments.

  3. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  4. Nuclear Criticality Safety Handbook, Version 2. English translation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-08-01

    The Nuclear Criticality Safety Handbook, Version 2 essentially incorporates the Supplement Report to the Nuclear Criticality Safety Handbook, released in 1995, into the first version of the Nuclear Criticality Safety Handbook, published in 1988. The following two points are new: (1) exemplifying safety margins related to modeled dissolution and extraction processes, and (2) describing evaluation methods and alarm systems for criticality accidents. Revisions have been made, based on previous studies, to the chapter that treats modeling of the fuel system: e.g., the fuel grain size below which the system can be regarded as homogeneous, the non-uniformity effect of fuel solution, and burnup credit. This revision has resolved the inconsistencies found in the first version between the evaluation of errors in the JACS code system and the criticality condition data that were calculated based on that evaluation. This report is an English translation of the Nuclear Criticality Safety Handbook, Version 2, originally published in Japanese as JAERI 1340 in 1999. (author)

  5. Git version control cookbook

    CERN Document Server

    Olsson, Aske

    2014-01-01

    This practical guide contains a wide variety of recipes, taking you through all the topics you need to know about to fully utilize the most advanced features of the Git system. If you are a software developer or a build and release engineer who uses Git in your daily work and wants to take your Git knowledge to the next level, then this book is for you. To understand and follow the recipes included in this book, basic knowledge of the Git command line is mandatory.

  6. NODC Standard Product: NODC Taxonomic Code on CD-ROM (NODC Accession 0050418)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The content of the NODC Taxonomic Code, Version 8 CD-ROM (CD-ROM NODC-68) distributed by NODC is archived in this accession. Version 7 of the NODC Taxonomic Code...

  7. Strong Trinucleotide Circular Codes

    Directory of Open Access Journals (Sweden)

    Christian J. Michel

    2011-01-01

    Recently, we identified a hierarchy relation between trinucleotide comma-free codes and trinucleotide circular codes (see our previous works). Here, we extend our hierarchy with two new classes of codes, called DLD and LDL codes, which are stronger than the comma-free codes. We also prove that no circular code with 20 trinucleotides is a DLD code and that a circular code with 20 trinucleotides is comma-free if and only if it is an LDL code. Finally, we point out the possible role of the symmetric group Σ4 in the mathematical study of trinucleotide circular codes.
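The comma-free property at the base of this hierarchy has a simple operational statement: no codeword may straddle the boundary in any concatenation of two codewords. A small check for trinucleotide codes (an illustrative sketch, not from the paper):

```python
from itertools import product

def is_comma_free(code):
    """True if no codeword occurs at frame offsets 1 or 2 inside any
    concatenation of two codewords from the trinucleotide code."""
    words = set(code)
    for w1, w2 in product(words, repeat=2):
        s = w1 + w2  # six letters; offsets 1 and 2 are the out-of-frame reads
        if s[1:4] in words or s[2:5] in words:
            return False
    return True
```

For example, {"AAA"} is not comma-free because "AAAAAA" contains "AAA" out of frame, while {"AAC"} is.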

  8. 20-SIM code generation for PC/104 target

    NARCIS (Netherlands)

    Groothuis, Marcel

    2001-01-01

    From version 3.2, 20-Sim will contain a new tool, called C-code generation. With this tool it will be possible to generate C code from a 20-Sim model. This tool works on the basis of templates. For each target, a target-specific template has to be made. The goal of this project was to write a new 20-Sim

  9. Zgoubi user's guide. Version 4

    Energy Technology Data Exchange (ETDEWEB)

    Meot, F. [Fermi National Accelerator Lab., Batavia, IL (United States). Dept. of Physics]; Valero, S. [CEA, Gif-sur-Yvette (France)]

    1997-10-15

    The computer code Zgoubi calculates trajectories of charged particles in magnetic and electric fields. Originally adapted to the definition and adjustment of beam lines and magnetic spectrometers, it has since evolved so that it allows the study of systems including complex sequences of optical elements such as dipoles, quadrupoles, arbitrary multipoles and other magnetic or electric devices, and is able as well to handle periodic structures. Compared to other codes, it presents several peculiarities: (1) a numerical method for integrating the Lorentz equation, based on Taylor series, which optimizes computing time and provides high accuracy and strong symplecticity, (2) spin tracking, using the same numerical method as for the Lorentz equation, (3) calculation of the synchrotron radiation electric field and spectra in arbitrary magnetic fields, from the ray-tracing outcomes, (4) the possibility of using a mesh, which allows ray-tracing from simulated or measured (1-D, 2-D or 3-D) field maps, (5) Monte Carlo procedures: unlimited number of trajectories, in-flight decay, etc., (6) built-in fitting procedure, (7) multiturn tracking in circular accelerators including many features proper to machine parameter calculation and survey, and also the simulation of time-varying power supplies. The initial version of the code, dedicated to ray-tracing in magnetic fields, was developed by D. Garreta and J.C. Faivre at CEN-Saclay in the early 1970s. It was perfected for the purpose of studying the four spectrometers (SPES I, II, III, IV) at the Laboratoire National Saturne (CEA-Saclay, France), and SPEG at Ganil (Caen, France). It is now in use in several national and foreign laboratories. This manual is intended only to describe the details of the most recent version of Zgoubi, which is far from being a "finished product".
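
    The Taylor-series integration of the Lorentz equation credited above for Zgoubi's accuracy can be illustrated with a minimal second-order step in a static magnetic field. This is a sketch of the general idea only, far cruder than Zgoubi's actual high-order scheme, and the function name is ours.

```python
def taylor_step(r, v, q_over_m, B, h):
    """One second-order Taylor step of dv/dt = (q/m) v x B for a static,
    uniform field B (illustration only; Zgoubi uses higher orders)."""
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    acc = tuple(q_over_m * c for c in cross(v, B))     # dv/dt
    jerk = tuple(q_over_m * c for c in cross(acc, B))  # d2v/dt2 (B constant)
    r_new = tuple(r[i] + v[i] * h + 0.5 * acc[i] * h * h for i in range(3))
    v_new = tuple(v[i] + acc[i] * h + 0.5 * jerk[i] * h * h for i in range(3))
    return r_new, v_new
```

    In a uniform field the exact orbit is a circle and the particle speed is conserved; the truncated series violates this only at O(h^3) per step, which is one way to see why a series method can be both cheap and nearly symplectic.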

  10. Joint source channel coding using arithmetic codes

    CERN Document Server

    Bi, Dongsheng

    2009-01-01

    Based on the encoding process, arithmetic codes can be viewed as tree codes and current proposals for decoding arithmetic codes with forbidden symbols belong to sequential decoding algorithms and their variants. In this monograph, we propose a new way of looking at arithmetic codes with forbidden symbols. If a limit is imposed on the maximum value of a key parameter in the encoder, this modified arithmetic encoder can also be modeled as a finite state machine and the code generated can be treated as a variable-length trellis code. The number of states used can be reduced and techniques used fo

  11. Cryptanalysis of Achterbahn-Version 1 and -Version 2

    Institute of Scientific and Technical Information of China (English)

    Xiao-Li Huang; Chuan-Kun Wu

    2007-01-01

    Achterbahn is one of the candidate stream ciphers submitted to eSTREAM, the ECRYPT Stream Cipher Project. The cipher Achterbahn uses a new structure which is based on several nonlinear feedback shift registers (NLFSRs) and a nonlinear combining output Boolean function. This paper proposes distinguishing attacks on Achterbahn-Version 1 and -Version 2 in the reduced mode and the full mode. These distinguishing attacks are based on linear approximations of the output functions. On the basis of these linear approximations and the periods of the registers, parity checks with noticeable biases are found. Distinguishing attacks can then be achieved through these biased parity checks. For Achterbahn-Version 1, the three possible cases of the output function are analyzed. Achterbahn-Version 2, the modified version of Achterbahn-Version 1, is designed to avert attacks based on approximations of the output Boolean function. Our attack on Achterbahn-Version 2, with even much lower complexity, shows that Achterbahn-Version 2 cannot prevent attacks based on linear approximations.
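
    The bias of a linear approximation, the quantity that drives such distinguishing attacks, is easy to compute exhaustively for a small combining function. The sketch below is generic (not Achterbahn's actual output function), and the names are ours.

```python
from itertools import product

def bias(f, mask, n):
    """Pr[f(x) = <mask, x>] - 1/2 over all n-bit inputs, where <mask, x>
    is the parity of the input bits selected by mask."""
    agree = 0
    for x in product((0, 1), repeat=n):
        lin = sum(m & xi for m, xi in zip(mask, x)) % 2
        agree += int(f(x) == lin)
    return agree / 2 ** n - 0.5
```

    For f(x) = x0 XOR (x1 AND x2), the approximation l(x) = x0 agrees with f on 6 of 8 inputs, a bias of 1/4; a distinguisher accumulates such biases over many parity checks of the keystream.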

  12. Diversity, Coding, and Multiplexing Trade-Off of Network-Coded Cooperative Wireless Networks

    CERN Document Server

    Iezzi, Michela; Graziosi, Fabio

    2012-01-01

    In this paper, we study the performance of network-coded cooperative diversity systems with practical communication constraints. More specifically, we investigate the interplay between diversity, coding, and multiplexing gain when the relay nodes do not act as dedicated repeaters, which only forward data packets transmitted by the sources, but they attempt to pursue their own interest by forwarding packets which contain a network-coded version of received and their own data. We provide a very accurate analysis of the Average Bit Error Probability (ABEP) for two network topologies with three and four nodes, when practical communication constraints, i.e., erroneous decoding at the relays and fading over all the wireless links, are taken into account. Furthermore, diversity and coding gain are studied, and advantages and disadvantages of cooperation and binary Network Coding (NC) are highlighted. Our results show that the throughput increase introduced by NC is offset by a loss of diversity and coding gain. It i...

  13. Versions Of Care Technology

    Directory of Open Access Journals (Sweden)

    Sampsa Hyysalo

    2007-01-01

    Full Text Available The importance of users for innovation has been increasingly emphasized in the literatures on design and management of technology. However, less attention has been given to how people shape technology-in-use. This paper first provides a review of literature on technology use in the social and cultural studies of technology. It then moves to examine empirically how a novel alarm and monitoring appliance was appropriated in the work of home-care nurses and in the everyday living of elderly people. Analysis shows that even these technically unsavvy users shaped the technology considerably by various, even if mundane, acts of adapting it materially, as well as by attributing different meanings to it. However, the paper goes on to argue that such commonplace phrasing of the findings obscures their significance and interrelations. Consequently, the final section of the paper reframes the key findings of this study using the concepts of practice, enactment, and versions of technology to reach a more adequate description.

  14. A Fortran 90 code for magnetohydrodynamics. Part 1, Banded convolution

    Energy Technology Data Exchange (ETDEWEB)

    Walker, D.W.

    1992-03-01

    This report describes progress in developing a Fortran 90 version of the KITE code for studying plasma instabilities in Tokamaks. In particular, the evaluation of convolution terms appearing in the numerical solution is discussed, and timing results are presented for runs performed on an 8k processor Connection Machine (CM-2). Estimates of the performance on a full-size 64k CM-2 are given, and range between 100 and 200 Mflops. The advantages of having a Fortran 90 version of the KITE code are stressed, and the future use of such a code on the newly announced CM5 and Paragon computers, from Thinking Machines Corporation and Intel, is considered.
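
    The convolution terms mentioned above are, in spectral codes, sums of the form c_k = sum_m a_m b_{k-m} truncated to the retained modes. The direct evaluation below is a hypothetical sketch to fix the notation, not the KITE algorithm or its Fortran 90 data layout.

```python
def truncated_convolution(a, b, nmodes):
    """Direct O(nmodes^2) evaluation of c_k = sum_m a[m] * b[k - m],
    keeping only the first nmodes output modes."""
    c = [0.0] * nmodes
    for k in range(nmodes):
        for m in range(len(a)):
            j = k - m
            if 0 <= j < len(b):
                c[k] += a[m] * b[j]
    return c
```

    Exploiting bandedness means restricting the inner loop to the few m for which a[m] is nonzero, which is where the machine-specific optimization effort goes.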

  15. Subsystem codes with spatially local generators

    CERN Document Server

    Bravyi, Sergey

    2010-01-01

    We study subsystem codes whose gauge group has local generators in the 2D geometry. It is shown that there exists a family of such codes defined on lattices of size LxL with the number of logical qubits k and the minimum distance d both proportional to L. The gauge group of these codes involves only two-qubit generators of type XX and ZZ coupling nearest neighbor qubits (and some auxiliary one-qubit generators). Our proof is not constructive as it relies on a certain version of the Gilbert-Varshamov bound for classical codes. Along the way we introduce and study properties of generalized Bacon-Shor codes which might be of independent interest. Secondly, we prove that any 2D subsystem [n,k,d] code with spatially local generators obeys upper bounds kd=O(n) and d^2=O(n). The analogous upper bound proved recently for 2D stabilizer codes is kd^2=O(n). Our results thus demonstrate that subsystem codes can be more powerful than stabilizer codes under the spatial locality constraint.
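
    The Gilbert-Varshamov bound invoked in the proof has a simple classical counting form: a binary linear [n, k, d] code exists whenever the Hamming balls are sparse enough. The condition can be written down directly; this is a sketch of the classical bound, not of the authors' subsystem-code variant.

```python
from math import comb

def gv_condition(n, k, d):
    """Classical Gilbert-Varshamov condition for binary linear codes:
    an [n, k, d] code exists if sum_{i=0}^{d-2} C(n-1, i) < 2^(n-k)."""
    return sum(comb(n - 1, i) for i in range(d - 1)) < 2 ** (n - k)
```

    For example, gv_condition(7, 4, 3) holds (7 < 8), consistent with the existence of the [7,4,3] Hamming code.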

  16. Recent developments in the Los Alamos radiation transport code system

    Energy Technology Data Exchange (ETDEWEB)

    Forster, R.A.; Parsons, K. [Los Alamos National Lab., NM (United States)

    1997-06-01

    A brief progress report on updates to the Los Alamos Radiation Transport Code System (LARTCS) for solving criticality and fixed-source problems is provided. LARTCS integrates the Diffusion Accelerated Neutral Transport (DANT) discrete ordinates codes with the Monte Carlo N-Particle (MCNP) code. The LARTCS code is being developed with a graphical user interface for problem setup and analysis. Progress in the DANT system for criticality applications includes a two-dimensional module which can be linked to a mesh-generation code, and a faster iteration scheme. Updates to MCNP Version 4A allow statistical checks of calculated Monte Carlo results.

  17. Predictive Bias and Sensitivity in NRC Fuel Performance Codes

    Energy Technology Data Exchange (ETDEWEB)

    Geelhood, Kenneth J.; Luscher, Walter G.; Senor, David J.; Cunningham, Mitchel E.; Lanning, Donald D.; Adkins, Harold E.

    2009-10-01

    The latest versions of the fuel performance codes, FRAPCON-3 and FRAPTRAN, were examined to determine whether the codes are intrinsically conservative. Each individual model and type of code prediction was examined and compared to the data that were used to develop the model. In addition, a brief literature search was performed to determine whether more recent data have become available for model comparison since the original model development.

  18. Oil and gas field code master list, 1993

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-16

    This document contains data collected through October 1993 and provides standardized field name spellings and codes for all identified oil and/or gas fields in the United States. Other Federal and State government agencies, as well as industry, use the EIA Oil and Gas Field Code Master List as the standard for field identification. A machine-readable version of the Oil and Gas Field Code Master List is available from the National Technical Information Service.

  19. Model Children's Code.

    Science.gov (United States)

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  20. Code conversion for system design and safety analysis of NSSS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hae Cho; Kim, Young Tae; Choi, Young Gil; Kim, Hee Kyung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-01-01

    This report describes the overall project work related to the conversion, installation, and validation of computer codes used in NSSS design and safety analysis of nuclear power plants. Domain/OS computer codes for system safety analysis are installed and validated on the Apollo DN10000, and the Apollo versions are then converted and installed on the HP9000/700 series, with appropriate validation. Also, COOLII and COAST, which are Cyber-version computer codes, are converted into Apollo DN10000 and HP9000/700 versions and installed with validation. This report details the whole process of computer code conversion and installation, as well as the software verification and validation results, which are attached to this report. 12 refs., 8 figs. (author)

  1. GNU Octave Manual Version 3

    DEFF Research Database (Denmark)

    W. Eaton, John; Bateman, David; Hauberg, Søren

    This manual is the definitive guide to GNU Octave, an interactive environment for numerical computation. The manual covers the new version 3 of GNU Octave.

  2. Data calculation program for RELAP 5 code

    Energy Technology Data Exchange (ETDEWEB)

    Silvestre, Larissa J.B.; Sabundjian, Gaiane, E-mail: larissajbs@usp.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    As the criteria and requirements for a nuclear power plant are extremely rigid, computer programs for simulation and safety analysis are required for certifying and licensing a plant. Based on this scenario, some sophisticated computational tools have been used, such as the Reactor Excursion and Leak Analysis Program (RELAP5), which is the most widely used code for the thermal-hydraulic analysis of accidents and transients in nuclear reactors. A major difficulty in simulation with the RELAP5 code is the amount of information required for the simulation of thermal-hydraulic accidents or transients. The preparation of the input data leads to a very large number of mathematical operations for calculating the geometry of the components. Therefore, a user-friendly mathematical preprocessor was developed in order to perform these calculations and prepare RELAP5 input data. Visual Basic for Applications (VBA) combined with Microsoft EXCEL proved to be an efficient tool for performing a number of tasks in the development of the program. Due to the absence of necessary information about some RELAP5 components, this work aims to make improvements to the Mathematic Preprocessor for the RELAP5 code (PREREL5). For the new version of the preprocessor, screens for some components that were not programmed in the original version were designed; moreover, screens of pre-existing components were redesigned to improve the program. In addition, an English version was provided for the new version of PREREL5. The new design of PREREL5 saves time and minimizes mistakes made by users of the RELAP5 code. The final version of this preprocessor will be applied to Angra 2. (author)
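
    The geometric bookkeeping such a preprocessor automates is of this flavor: from a pipe's diameter, length, and nodalization, derive the flow area, hydraulic diameter, and per-volume quantities the input deck needs. The sketch below uses illustrative field names of our own invention, not actual RELAP5 card formats.

```python
import math

def pipe_volumes(diameter, length, n_volumes):
    """Hydraulic geometry of a circular pipe split into equal control
    volumes (illustrative names, not actual RELAP5 input fields)."""
    area = math.pi * diameter ** 2 / 4.0  # flow area of the pipe
    dz = length / n_volumes               # axial length of one volume
    return {
        "flow_area": area,
        "hydraulic_diameter": diameter,   # 4*area/perimeter for a circle
        "cell_length": dz,
        "cell_volume": area * dz,
    }
```

    Automating even this trivial arithmetic pays off because a full plant deck repeats it for hundreds of components, which is exactly where manual input errors accumulate.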

  3. What to do with a Dead Research Code

    Science.gov (United States)

    Nemiroff, Robert J.

    2016-01-01

    The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states to the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  4. AWESOME 1.1: A code for the calculation of phase and group velocities of acoustic waves in homogeneous solids

    Science.gov (United States)

    Muñoz-Santiburcio, Daniel; Hernández-Laguna, Alfonso

    2017-08-01

    We present an improved version of the code AWESOME, capable of computing phase and group velocities, power flow angles and enhancement factors of acoustic waves in homogeneous solids. In this version, some algorithms are improved and the code provides a better estimation of the enhancement factor compared to the previous version. In addition, we include a quadruple-precision version of the code which, even though it uses the same numerical approach as the double-precision version, is able to calculate the exact values of the enhancement factor. The standard, double-precision version of the code has been interfaced and merged with the development version of CRYSTAL and will be available as part of its next stable release. Finally, we have improved the scripts for visualizing the results, which are now compatible with GNUPLOT 5.X.X, including new scripts for the visualization of the normal and ray surfaces.
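
    For the isotropic special case, the Christoffel eigenproblem that codes of this kind solve collapses to closed-form phase velocities, which makes a handy sanity check. This is the generic textbook formula, not AWESOME's implementation.

```python
import math

def isotropic_wave_speeds(lam, mu, rho):
    """Phase velocities of the P and S acoustic waves in an isotropic
    solid with Lame parameters lam, mu and density rho."""
    vp = math.sqrt((lam + 2.0 * mu) / rho)  # longitudinal wave
    vs = math.sqrt(mu / rho)                # transverse wave (degenerate)
    return vp, vs
```

    In the isotropic case, phase and group velocities coincide and are direction-independent; anisotropy is what makes the general eigenproblem, the power flow angles, and the enhancement factors nontrivial.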

  5. New developments in the Saphire computer codes

    Energy Technology Data Exchange (ETDEWEB)

    Russell, K.D.; Wood, S.T.; Kvarfordt, K.J. [Idaho Engineering Lab., Idaho Falls, ID (United States)] [and others]

    1996-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a suite of computer programs that were developed to create and analyze a probabilistic risk assessment (PRA) of a nuclear power plant. Many recent enhancements to this suite of codes have been made. This presentation will provide an overview of these features and capabilities. The presentation will include a discussion of the new GEM module. This module greatly reduces and simplifies the work necessary to use the SAPHIRE code in event assessment applications. An overview of the features provided in the new Windows version will also be provided. This version is a full Windows 32-bit implementation and offers many new and exciting features. [A separate computer demonstration was held to allow interested participants to get a preview of these features.] The new capabilities that have been added since version 5.0 will be covered. Some of these major new features include the ability to store an unlimited number of basic events, gates, systems, sequences, etc.; the addition of improved reporting capabilities to allow the user to generate and "scroll" through custom reports; the addition of multi-variable importance measures; and the simplification of the user interface. Although originally designed as a PRA Level 1 suite of codes, capabilities have recently been added to SAPHIRE to allow the user to apply the code in Level 2 analyses. These features will be discussed in detail during the presentation. The modifications and capabilities added to this version of SAPHIRE significantly extend the code in many important areas. Together, these extensions represent a major step forward in PC-based risk analysis tools. This presentation provides a current up-to-date status of these important PRA analysis tools.
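
    The importance measures mentioned above are simple functions of the minimal-cut-set probabilities; for example, the Fussell-Vesely importance of a basic event is the fraction of the top-event probability carried by cut sets containing that event. The sketch below uses the rare-event approximation and names of our own choosing; it is illustrative, not SAPHIRE's implementation.

```python
def cutset_prob(cut_set, p):
    """Probability of one minimal cut set: the product of the
    probabilities of its basic events."""
    prob = 1.0
    for event in cut_set:
        prob *= p[event]
    return prob

def fussell_vesely(cut_sets, p, event):
    """Top-event probability (rare-event approximation: sum over cut
    sets) and the Fussell-Vesely importance of one basic event."""
    top = sum(cutset_prob(cs, p) for cs in cut_sets)
    contrib = sum(cutset_prob(cs, p) for cs in cut_sets if event in cs)
    return top, contrib / top
```

    With cut sets {A,B} and {C} and p(A) = p(B) = 0.1, p(C) = 0.01, both cut sets contribute 0.01, so event C carries half the risk.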

  6. ORNL ALICE: a statistical model computer code including fission competition. [In FORTRAN]

    Energy Technology Data Exchange (ETDEWEB)

    Plasil, F.

    1977-11-01

    A listing of the computer code ORNL ALICE is given. This code is a modified version of computer codes ALICE and OVERLAID ALICE. It allows for higher excitation energies and for a greater number of evaporated particles than the earlier versions. The angular momentum removal option was made more general and more internally consistent. Certain roundoff errors are avoided by keeping a strict accounting of partial probabilities. Several output options were added.

  7. Coding of hyperspectral imagery using adaptive classification and trellis-coded quantization

    Science.gov (United States)

    Abousleman, Glen P.

    1997-08-01

    A system is presented for compression of hyperspectral imagery. Specifically, DPCM is used for spectral decorrelation, while an adaptive 2D discrete cosine transform coding scheme is used for spatial decorrelation. Trellis coded quantization is used to encode the transform coefficients. Side information and rate allocation strategies are discussed. Entropy-constrained codebooks are designed using a modified version of the generalized Lloyd algorithm. This entropy constrained system achieves a compression ratio of greater than 70:1 with an average PSNR of the coded hyperspectral sequence approaching 41 dB.
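
    The spectral DPCM stage described above transmits the first band intact and only band-to-band residuals thereafter; the residuals are what the transform and TCQ stages then encode. A lossless-roundtrip sketch of that stage alone (quantization omitted, function names ours):

```python
def dpcm_encode(bands):
    """Spectral DPCM: keep band 0, then per-pixel residuals between
    each band and its predecessor."""
    out = [list(bands[0])]
    for prev, cur in zip(bands, bands[1:]):
        out.append([c - p for c, p in zip(cur, prev)])
    return out

def dpcm_decode(enc):
    """Invert the DPCM stage by accumulating the residuals."""
    rec = [list(enc[0])]
    for res in enc[1:]:
        rec.append([p + r for p, r in zip(rec[-1], res)])
    return rec
```

    Adjacent hyperspectral bands are highly correlated, so the residuals have much lower energy than the bands themselves, which is what makes the subsequent transform coding efficient.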

  8. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback into LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.
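
    Degree distributions are the design knob the authors redesign. The classical starting point for LT codes is the ideal soliton distribution, shown here for reference; the paper's feedback-adapted distributions differ from it.

```python
def ideal_soliton(k):
    """Ideal soliton degree distribution for LT codes over k input
    symbols: rho(1) = 1/k, rho(d) = 1/(d*(d-1)) for d = 2..k."""
    rho = [0.0] * (k + 1)  # rho[0] unused; index by degree d
    rho[1] = 1.0 / k
    for d in range(2, k + 1):
        rho[d] = 1.0 / (d * (d - 1))
    return rho
```

    Since 1/(d(d-1)) = 1/(d-1) - 1/d, the terms telescope and the distribution sums to one for any k; in practice the robust soliton variant is used because the ideal one is fragile to variance in the decoding ripple.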

  9. CBP PHASE I CODE INTEGRATION

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-09-30

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. 
As part of the CBP project, a general Dynamic Link Library (DLL) interface was

  10. An introduction to using QR codes in scholarly journals

    Directory of Open Access Journals (Sweden)

    Jae Hwa Chang

    2014-08-01

    Full Text Available The Quick Response (QR) code was first developed in 1994 by Denso Wave Incorporated, Japan. From that point on, it came into general use as an identification mark for all kinds of commercial products, advertisements, and other public announcements. In scholarly journals, the QR code is used to provide immediate direction to the journal homepage or to specific content such as figures or videos. Producing a QR code and printing it in the print version or uploading it to the web is very simple: using a QR code generator, an editor enters the information to be encoded, and a QR code is produced. A QR code is very stable, such that it can be used for a long time without loss of quality. Producing and adding QR codes to a journal costs nothing; therefore, to increase the visibility of their journals, it is time for editors to add QR codes to their journals.

  11. SYMTRAN - A Time-dependent Symmetric Tandem Mirror Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Hua, D; Fowler, T

    2004-06-15

    A time-dependent version of the steady-state radial transport model in symmetric tandem mirrors in Ref. [1] has been coded up and first tests performed. Our code, named SYMTRAN, is an adaptation of the earlier SPHERE code for spheromaks, now modified for tandem mirror physics. Motivated by Post's new concept of kinetic stabilization of symmetric mirrors, it is an extension of the earlier TAMRAC rate-equation code omitting radial transport [2], which successfully accounted for experimental results in TMX. The SYMTRAN code differs from the earlier tandem mirror radial transport code TMT in that our code is focused on axisymmetric tandem mirrors and classical diffusion, whereas TMT emphasized non-ambipolar transport in TMX and MFTF-B due to yin-yang plugs and non-symmetric transitions between the plugs and axisymmetric center cell. Both codes exhibit interesting but different non-linear behavior.

  12. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  13. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  14. Development of N-version software samples for an experiment in software fault tolerance

    Science.gov (United States)

    Lauterbach, L.

    1987-01-01

    The report documents the task planning and software development phases of an effort to obtain twenty versions of code independently designed and developed from a common specification. These versions were created for use in future experiments in software fault tolerance, in continuation of the experimental series underway at the Systems Validation Methods Branch (SVMB) at NASA Langley Research Center. The 20 versions were developed under controlled conditions at four U.S. universities, by 20 teams of two researchers each. The versions process raw data from a modified Redundant Strapped Down Inertial Measurement Unit (RSDIMU). The specifications, and over 200 questions submitted by the developers concerning the specifications, are included as appendices to this report. Design documents, and design and code walkthrough reports for each version, were also obtained in this task for use in future studies.
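
    In the fault-tolerance experiments such versions feed, the N outputs are typically combined by a voter; the simplest is an exact-match majority vote. A minimal sketch of that idea (ours, not the RSDIMU experiment's actual voter, which must also tolerate small numerical disagreements):

```python
from collections import Counter

def majority_vote(outputs):
    """Return the value produced by a strict majority of the N version
    outputs, or raise if no value reaches a majority."""
    value, count = Counter(outputs).most_common(1)[0]
    if count > len(outputs) // 2:
        return value
    raise ValueError("no majority among versions")
```

    The experimental interest is in whether independently developed versions fail on the same inputs: correlated faults defeat exactly this kind of voter.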

  15. Guidelines for Sandia ASCI Verification and Validation Plans - Content and Format: Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    TRUCANO,TIMOTHY G.; MOYA,JAIME L.

    1999-12-01

    This report summarizes general guidelines for the development of Verification and Validation (V and V) plans for ASCI code projects at Sandia National Laboratories. The main content categories recommended by these guidelines for explicit treatment in Sandia V and V plans are (1) the stockpile drivers influencing the code development project; (2) the key phenomena to be modeled by the individual code; (3) software verification strategy and test plan; and (4) code validation strategy and test plans. The authors of this document anticipate that the needed content of the V and V plans for the Sandia ASCI codes will evolve as time passes. These needs will be reflected by future versions of this document.

  16. Coded Modulation in C and MATLAB

    Science.gov (United States)

    Hamkins, Jon; Andrews, Kenneth S.

    2011-01-01

    This software, written separately in C and MATLAB as stand-alone packages with equivalent functionality, implements encoders and decoders for a set of nine error-correcting codes and modulators and demodulators for five modulation types. The software can be used as a single program to simulate the performance of such coded modulation. The error-correcting codes implemented are the nine accumulate repeat-4 jagged accumulate (AR4JA) low-density parity-check (LDPC) codes, which have been approved for international standardization by the Consultative Committee for Space Data Systems, and which are scheduled to fly on a series of NASA missions in the Constellation Program. The software implements the encoder and decoder functions, and contains compressed versions of generator and parity-check matrices used in these operations.
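
    Decoding aside, the defining operation of any LDPC code is the parity check H·c = 0 (mod 2) against a sparse parity-check matrix. The sketch below uses the small [7,4] Hamming matrix as a stand-in, since reproducing an actual AR4JA matrix here is impractical.

```python
def syndrome(H, c):
    """Syndrome H*c mod 2; an all-zero syndrome means the word c
    satisfies every parity check of H."""
    return [sum(h * x for h, x in zip(row, c)) % 2 for row in H]

# [7,4] Hamming parity-check matrix (toy stand-in for an AR4JA matrix)
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
```

    An iterative LDPC decoder repeatedly flips or re-weights bits until the syndrome is all zero; the sparsity of H is what makes this tractable at the block lengths flown on spacecraft.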

  17. A cybernetic approach to the origin of the genetic coding mechanism. II. Formation of the code series.

    Science.gov (United States)

    Batchinsky, A G; Ratner, V A

    1976-08-01

    The sequential fulfillment of the principle of succession necessarily guides the main steps of genetic code evolution to be reflected in its structure. The general scheme of code series formation is proposed based on the idea of "group coding" (Woese, 1970). The genetic code supposedly evolved by means of successive divergence of pra-ARS's loci, accompanied by increasing specification of the recognition capacity of amino acids and triplets. The sense of codons was not changed at any step of stochastic code evolution. The formulated rules for code series formation produce a code version similar to the contemporary one. Based on these rules, a scheme of pra-ARS's divergence is proposed, resulting in the grouping of amino acids by their polarity and size. Later steps in the evolution of the genetic code were probably based on more detailed features of the amino acids (for example, on their functional similarities, such as their interchangeability in isofunctional proteins).

  18. Recent enhancements to the MARS15 code

    Energy Technology Data Exchange (ETDEWEB)

    Nikolai V. Mokhov et al.

    2004-05-12

    The MARS code is under continuous development and has recently undergone substantial improvements that further increase its reliability and predictive power in numerous shielding, accelerator, detector and space applications. The major developments and new features of the MARS15 (2004) version described in this paper concern an extended list of elementary particles and arbitrary heavy ions and their interaction cross sections, inclusive and exclusive nuclear event generators, a module for modeling particle electromagnetic interactions, enhanced geometry and histogramming options, an improved MAD-MARS Beam Line Builder, an enhanced graphical user interface, and an MPI-based parallelization of the code.

  19. Locally Orderless Registration Code

    DEFF Research Database (Denmark)

    2012-01-01

    This is the code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided as 64-bit builds for Mac, Linux, and Windows.

  20. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is the code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided as 64-bit builds for Mac, Linux, and Windows.

  1. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION)

    Science.gov (United States)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. 
The PC version and the Macintosh

  2. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (MACINTOSH VERSION)

    Science.gov (United States)

    Culbert, C.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. 
The PC version and the Macintosh

  3. User's Manual for LEWICE Version 3.2

    Science.gov (United States)

    Wright, William

    2008-01-01

    A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 3.2 of this software, which is called LEWICE. This version differs from release 2.0 due to the addition of advanced thermal analysis capabilities for de-icing and anti-icing applications using electrothermal heaters or bleed air applications, the addition of automated Navier-Stokes analysis, an empirical model for supercooled large droplets (SLD) and a pneumatic boot option. An extensive effort was also undertaken to compare the results against the database of electrothermal results which have been generated in the NASA Glenn Icing Research Tunnel (IRT) as was performed for the validation effort for version 2.0. This report will primarily describe the features of the software related to the use of the program. Appendix A has been included to list some of the inner workings of the software or the physical models used. This information is also available in the form of several unpublished documents internal to NASA. This report is intended as a replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this software.

  4. OSIRIS 4.0: new version of the OSIRIS framework

    Science.gov (United States)

    Fonseca, Ricardo; Tableman, Adam; Vieira, Jorge; Decyk, Viktor; Mori, Warren; Silva, Luís

    2016-10-01

    OSIRIS is a state-of-the-art, fully relativistic, massively parallel particle-in-cell code that is widely used in kinetic plasma modeling for many astrophysical and laboratory scenarios. Over the years the code has been continuously improved with new features and algorithms, resulting in a large and complex code base with the inherent maintenance and development difficulties. We report on the new version of the OSIRIS framework, focusing on the new structure of the code, which leverages the object-oriented features of Fortran 2003 that are now widely supported by available compilers. Details are given on the new object-oriented structure, which allows for the encapsulation of specific features and better collaboration within the development team. We also focus on the new strategy for run-time selection of the simulation mode, which allows a single binary to be used with all code features, and report on template-based code generation for multiple interpolation levels. Finally, we report on our experience implementing these features with multiple compilers, and the code changes required to ensure wide compiler support. This work was partially supported by NSF ACI 1339893 and PTDC/FIS-PLA/2940/2014.

  5. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  6. Constructing quantum codes

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Quantum error-correcting codes are indispensable for quantum information processing and quantum computation. In 1995 and 1996, Shor and Steane gave the first examples of quantum codes constructed from classical error-correcting codes. The construction of efficient quantum codes is now an active multi-disciplinary research field. In this paper we review the known constructions of quantum codes and present some examples.
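
As a minimal, hedged illustration of how a quantum code can arise from a classical one (the pattern behind the Shor and Steane constructions, though far simpler than either), the 3-qubit bit-flip code corrects a single X error: measuring the stabilizers Z1Z2 and Z2Z3 yields exactly the classical 3-bit repetition-code syndrome. The sketch below models only that classical syndrome logic.

```python
# Bit-flip (3-qubit repetition) code: for single X errors, the stabilizer
# measurements Z1Z2 and Z2Z3 give a syndrome identical to the classical
# repetition-code syndrome, so correction is a two-bit table lookup.
def syndrome(bits):
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

CORRECTION = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}  # qubit to flip

def correct(bits):
    flip = CORRECTION[syndrome(bits)]
    if flip is not None:
        bits[flip] ^= 1
    return bits

assert correct([0, 1, 0]) == [0, 0, 0]   # flip on the middle qubit repaired
```

The quantum versions of these codes do strictly more: they also protect superpositions and (in the CSS construction) phase errors, which this classical caricature cannot show.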

  7. Combination of Endothelial-Monocyte-Activating Polypeptide-II with Temozolomide Suppress Malignant Biological Behaviors of Human Glioblastoma Stem Cells via miR-590-3p/MACC1 Inhibiting PI3K/AKT/mTOR Signal Pathway

    Science.gov (United States)

    Zhou, Wei; Liu, Libo; Xue, Yixue; Zheng, Jian; Liu, Xiaobai; Ma, Jun; Li, Zhen; Liu, Yunhui

    2017-01-01

    This study aims to investigate the effect of Endothelial-Monocyte-Activating Polypeptide-II (EMAP-II) combined with temozolomide (TMZ) on glioblastoma stem cells (GSCs) and its possible molecular mechanisms. The combination of EMAP-II with TMZ inhibited cell viability, migration and invasion in GSCs, and the autophagy inhibitors 3-methyladenine (3-MA) and chloroquine (CQ) partly reversed the anti-proliferative effect of the combination treatment. Autophagic vacuoles were formed in GSCs after the combination therapy, accompanied by up-regulation of LC3-II and Beclin-1 as well as down-regulation of p62/SQSTM1. Further, miR-590-3p was up-regulated and metastasis-associated in colon cancer 1 (MACC1) was down-regulated by the combination treatment in GSCs; miR-590-3p overexpression and MACC1 knockdown up-regulated LC3-II and Beclin-1 and down-regulated p62/SQSTM1 in GSCs; MACC1 was identified as a direct target of miR-590-3p, mediating the effects of miR-590-3p in the combination treatment. Furthermore, the combination treatment and MACC1 knockdown decreased p-PI3K, p-Akt, p-mTOR, p-S6 and p-4EBP in GSCs; the PI3K/Akt agonist insulin-like growth factor-1 (IGF-1) partly blocked the effect of the combination treatment. Moreover, in in vivo xenograft models, mice given cells stably overexpressing miR-590-3p and treated with EMAP-II and TMZ had the smallest tumors; in addition, miR-590-3p + EMAP-II + TMZ up-regulated the expression of miR-590-3p, LC3-II and Beclin-1 and down-regulated p62/SQSTM1. In conclusion, these results elucidate a novel molecular mechanism by which EMAP-II combined with TMZ suppresses the malignant biological behaviors of GSCs via miR-590-3p/MACC1-mediated inhibition of the PI3K/AKT/mTOR signaling pathway, and they may point to potential therapeutic approaches for human GSCs.

  8. User's manual for SNL-SAND-II code

    Energy Technology Data Exchange (ETDEWEB)

    Griffin, P.J.; Kelly, J.G. [Sandia National Labs., Albuquerque, NM (United States); VanDenburg, J.W. [Science and Engineering Associates, Inc., Albuquerque, NM (United States)

    1994-04-01

    Sandia National Laboratories, in the process of characterizing the neutron environments at its reactor facilities, has developed an enhanced version of W. McElroy's original SAND-II code. The enhanced input, output, and plotting interfaces make the code much easier to use. The basic physics and operation of the code remain unchanged. Important code enhancements include interfaces to the latest ENDF/B-VI and IRDF-90 dosimetry-quality cross sections and the ability to use silicon displacement-sensitive devices as dosimetry sensors.

  9. Turbo Codes Extended with Outer BCH Code

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl

    1996-01-01

    The "error floor" observed in several simulations with the turbo codes is verified by calculation of an upper bound to the bit error rate for the ensemble of all interleavers. Also an easy way to calculate the weight enumerator used in this bound is presented. An extended coding scheme is proposed...

  10. Hybrid Noncoherent Network Coding

    CERN Document Server

    Skachek, Vitaly; Nedic, Angelia

    2011-01-01

    We describe a novel extension of subspace codes for noncoherent networks, suitable for use when the network is viewed as a communication system that introduces both dimension and symbol errors. We show that when symbol erasures occur in a sufficiently large number of different basis vectors transmitted through the network, and when the min-cut of the network is much smaller than the length of the transmitted codewords, the new family of codes outperforms its subspace code counterparts. For the proposed coding scheme, termed hybrid network coding, we derive two upper bounds on the size of the codes. These bounds represent variations of the Singleton and the sphere-packing bounds. We show that a simple concatenated scheme that combines subspace codes and Reed-Solomon codes is asymptotically optimal with respect to the Singleton bound. Finally, we describe two efficient decoding algorithms for concatenated subspace codes that in certain cases have smaller complexity than subspace decoder...

  11. A one-dimensional material transfer model for HECTR version 1. 5

    Energy Technology Data Exchange (ETDEWEB)

    Geller, A.S.; Wong, C.C.

    1991-08-01

    HECTR (Hydrogen Event Containment Transient Response) is a lumped-parameter computer code developed for calculating the pressure-temperature response to combustion in a nuclear power plant containment building. The code uses a control-volume approach and subscale models to simulate the mass, momentum, and energy transfer occurring in the containment during a loss-of-coolant accident (LOCA). This document describes one-dimensional subscale models for mass and momentum transfer, and the modifications to the code required to implement them. Two problems were analyzed: the first corresponding to a standard problem studied with previous HECTR versions, the second to experiments. The performance of the revised code relative to previous HECTR versions is discussed, as is the ability of the code to model the experiments. 8 refs., 5 figs., 3 tabs.

  12. CORTRAN code user manual. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Cheatham, R.L.; Crawford, S.L.; Khan, E.U.

    1981-02-01

    CORTRAN has been developed as a relatively fast running design code for core-wide steady-state and transient analysis of Liquid Metal Fast Breeder Reactor (LMFBR) cores. The preliminary version of this computer program uses subchannel analysis techniques to compute the velocity and temperature fields on a multiassembly basis for three types of transient forcing functions: total power, total flow, and inlet coolant temperature. Interassembly heat transfer, intra-assembly heat transfer, and intra-assembly flow redistribution due to buoyancy are taken into account. Heat generation within the fuel rods and assembly duct walls is also included. Individual pin radial peaking factors (peak to average for each assembly) can be either read in or calculated from specified normalized neutronic power densities (six per assembly).

  13. Advanced Code for Photocathode Design

    Energy Technology Data Exchange (ETDEWEB)

    Ives, Robert Lawrence [Calabazas Creek Research, Inc., San Mateo, CA (United States); Jensen, Kevin [Naval Research Lab. (NRL), Washington, DC (United States); Montgomery, Eric [Univ. of Maryland, College Park, MD (United States); Bui, Thuc [Calabazas Creek Research, Inc., San Mateo, CA (United States)

    2015-12-15

    The Phase I activity demonstrated that PhotoQE could be upgraded and modified to allow input through a graphical user interface. Specific calls to platform-dependent (e.g., IMSL) functions were removed, and Fortran77 components were rewritten for Fortran95 compliance. The subroutines, specifically the common block structures and shared data parameters, were reworked to allow the GUI to update material parameter data, and the system was targeted for desktop personal computer operation. The new structure overcomes the previously rigid and unmodifiable library structures by implementing new materials library data sets and moving the library values to external files. Material data may originate from published literature or experimental measurements. Further optimization and restructuring would allow custom and specific emission models for beam codes that rely on parameterized photoemission algorithms. These would be based on simplified and parametric representations updated and extended from previous versions (e.g., Modified Fowler-Dubridge, Modified Three-Step, etc.).

  14. Network coding for computing: Linear codes

    CERN Document Server

    Appuswamy, Rathinakumar; Karamchandani, Nikhil; Zeger, Kenneth

    2011-01-01

    In network coding it is known that linear codes are sufficient to achieve the coding capacity in multicast networks and that they are not sufficient in general to achieve the coding capacity in non-multicast networks. In network computing, Rai, Dey, and Shenvi have recently shown that linear codes are not sufficient in general for solvability of multi-receiver networks with scalar linear target functions. We study single-receiver networks where the receiver node demands a target function of the source messages. We show that linear codes may provide a computing capacity advantage over routing only when the receiver demands a 'linearly-reducible' target function. Many known target functions, including the arithmetic sum, minimum, and maximum, are not linearly-reducible. Thus, the use of non-linear codes is essential in order to obtain a computing capacity advantage over routing if the receiver demands a target function that is not linearly-reducible. We also show that if a target function is linearly-reducible,...
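
The multicast result mentioned at the start of this abstract has a textbook concrete instance (not taken from this paper): the butterfly network over GF(2), where a single coded bit on the bottleneck edge lets both sinks recover both source bits.

```python
# Classic butterfly network over GF(2): two sources a and b multicast to two
# sinks through a shared bottleneck edge that carries the coded bit a XOR b.
def butterfly(a, b):
    mid = a ^ b                  # the only coded packet; one bottleneck use
    sink1 = (a, a ^ mid)         # sink 1 sees a directly, recovers b = a^mid
    sink2 = (b ^ mid, b)         # sink 2 sees b directly, recovers a = b^mid
    return sink1, sink2

for a in (0, 1):
    for b in (0, 1):
        assert butterfly(a, b) == ((a, b), (a, b))   # both sinks get both bits
```

With routing alone, the bottleneck edge could carry only one of the two bits per use, so one sink would have to wait; the linear code achieves the multicast capacity in a single use.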

  15. Practices in Code Discoverability

    CERN Document Server

    Teuben, Peter; Nemiroff, Robert J; Shamir, Lior

    2012-01-01

    Much of scientific progress now hinges on the reliability, falsifiability and reproducibility of computer source codes. Astrophysics in particular is a discipline that today leads other sciences in making useful scientific components freely available online, including data, abstracts, preprints, and fully published papers, yet even today many astrophysics source codes remain hidden from public view. We review the importance and history of source codes in astrophysics and previous efforts to develop ways in which information about astrophysics codes can be shared. We also discuss why some scientist coders resist sharing or publishing their codes, the reasons for and importance of overcoming this resistance, and alert the community to a reworking of one of the first attempts at sharing codes, the Astrophysics Source Code Library (ASCL). We discuss the implementation of the ASCL in an accompanying poster paper. We suggest that code could be given a similar level of referencing as data gets in repositories such ...

  16. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  17. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    Quick Response codes open the possibility to convey data in a unique way, yet insufficient prevention and protection might lead to QR codes being exploited on behalf of attackers. This thesis starts by presenting a general introduction of the background and stating two problems regarding QR code security, followed by comprehensive research on both QR codes themselves and related issues. From the research, a solution taking advantage of cloud computing and cryptography together with an implementation come af...

  18. Quantitative code accuracy evaluation of ISP33

    Energy Technology Data Exchange (ETDEWEB)

    Kalli, H.; Miwrrin, A. [Lappeenranta Univ. of Technology (Finland); Purhonen, H. [VTT Energy, Lappeenranta (Finland)] [and others

    1995-09-01

    Aiming at quantifying code accuracy, a methodology based on the Fast Fourier Transform has been developed at the University of Pisa, Italy. The paper gives a short presentation of the methodology and its application to pre-test and post-test calculations submitted to the International Standard Problem ISP33. This was a double-blind natural circulation exercise with a stepwise reduced primary coolant inventory, performed in the PACTEL facility in Finland. PACTEL is a 1/305 volumetrically scaled, full-height simulator of the Russian-type VVER-440 pressurized water reactor, with horizontal steam generators and loop seals in both cold and hot legs. Fifteen foreign organizations participated in ISP33, with 21 blind calculations and 20 post-test calculations; altogether, 10 different thermal-hydraulic codes and code versions were used. The results of applying the methodology to nine selected measured quantities are summarized.
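
In its commonly cited form, the Pisa FFT-based method summarizes accuracy with an average amplitude AA: the ratio of the summed spectral magnitudes of the calculation-minus-experiment error to those of the experimental signal, with smaller AA meaning better accuracy. The sketch below is an assumed, simplified rendering of that figure of merit (the function name and toy signals are illustrative), not the actual ISP33 tooling.

```python
import numpy as np

def average_amplitude(exp_signal, calc_signal):
    """AA = sum|FFT(calc - exp)| / sum|FFT(exp)| (dimensionless)."""
    err_spec = np.abs(np.fft.rfft(calc_signal - exp_signal))
    exp_spec = np.abs(np.fft.rfft(exp_signal))
    return err_spec.sum() / exp_spec.sum()

t = np.linspace(0.0, 10.0, 512)
meas = np.exp(-0.2 * t)                 # stand-in "experimental" trend
close = meas + 0.01 * np.sin(5 * t)     # accurate calculation
rough = meas + 0.20 * np.sin(5 * t)     # less accurate calculation
assert average_amplitude(meas, close) < average_amplitude(meas, rough)
```

Because the measure is computed in the frequency domain, it penalizes both amplitude and timing discrepancies between calculation and experiment, which is why it suits transient thermal-hydraulic comparisons.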

  19. A diffuser heat transfer and erosion code

    Science.gov (United States)

    Buzzard, G. H.

    1985-10-01

    A computer code for diffuser heat transfer and erosion analysis (DHTE) has been developed which improves upon the earlier Rocket Engine Diffuser Thermal Analysis Program (REDTAP). Improvements contained within DHTE include provision for a radial temperature gradient within the diffuser wall, an improved model for the particle impingement accommodation coefficient, a model for particle debris shielding, and a model for wall erosion by particle impact. DHTE differs from an earlier diffuser heat transfer code (DHT) to the extent that it incorporates a simple erosion model and utilizes a more recent diffuser version of the JANNAF Standardized Plume Flow Field Model (SCP2ND). The 77-inch diffuser was instrumented to record the water side wall temperature and water jacket temperature at selected sites along the initial seven feet of the diffuser during routine test firings. Data is presented that supports the predictions of DHTE but is inadequate to validate the code.

  20. [Software version and medical device software supervision].

    Science.gov (United States)

    Peng, Liang; Liu, Xiaoyan

    2015-01-01

    The importance of the software version in medical device software supervision does not currently receive enough attention. First, the role of the software version in medical device software supervision is discussed; then the necessity of tracking software versions is analyzed, based on a discussion of common misunderstandings of software versioning. Finally, concrete suggestions are proposed on software version naming rules, supervision of the software versions used in medical devices, and a software version supervision scheme.

  1. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    textabstractTwo key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  2. Informal Control code logic

    CERN Document Server

    Bergstra, Jan A

    2010-01-01

    General definitions as well as rules of reasoning regarding control code production, distribution, deployment, and usage are described. The role of testing, trust, confidence and risk analysis is considered. A rationale for control code testing is sought and found for the case of safety critical embedded control code.

  3. Refactoring test code

    NARCIS (Netherlands)

    Deursen, A. van; Moonen, L.M.F.; Bergh, A. van den; Kok, G.

    2001-01-01

    Two key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from refactoring product

  4. Codes and standards and other guidance cited in regulatory documents

    Energy Technology Data Exchange (ETDEWEB)

    Nickolaus, J.R.; Bohlander, K.L.

    1996-08-01

    As part of the U.S. Nuclear Regulatory Commission (NRC) Standard Review Plan Update and Development Program (SRP-UDP), Pacific Northwest National Laboratory developed a listing of industry consensus codes and standards and other government and industry guidance referred to in regulatory documents. The SRP-UDP has been completed and the SRP-Maintenance Program (SRP-MP) is now maintaining this listing. Besides updating previous information, Revision 3 adds approximately 80 citations. This listing identifies the version of the code or standard cited in the regulatory document, the regulatory document, and the current version of the code or standard. It also provides a summary characterization of the nature of the citation. This listing was developed from electronic searches of the Code of Federal Regulations and the NRC's Bulletins, Information Notices, Circulars, Enforcement Manual, Generic Letters, Inspection Manual, Policy Statements, Regulatory Guides, Standard Technical Specifications and the Standard Review Plan (NUREG-0800).

  5. ARC Code TI: CODE Software Framework

    Data.gov (United States)

    National Aeronautics and Space Administration — CODE is a software framework for control and observation in distributed environments. The basic functionality of the framework allows a user to observe a distributed...

  6. ARC Code TI: ROC Curve Code Augmentation

    Data.gov (United States)

    National Aeronautics and Space Administration — ROC (Receiver Operating Characteristic) curve Code Augmentation was written by Rodney Martin and John Stutz at NASA Ames Research Center and is a modification of ROC...

  7. Fountain Codes: LT And Raptor Codes Implementation

    Directory of Open Access Journals (Sweden)

    Ali Bazzi, Hiba Harb

    2017-01-01

    Full Text Available Digital fountain codes are a new class of random error-correcting codes designed for efficient and reliable data delivery over erasure channels such as the Internet. These codes were developed to provide robustness against erasures in a way that resembles a fountain of water. A digital fountain is rateless in the sense that the sender can send a limitless number of encoded packets. The receiver does not care which packets are received or lost, as long as it receives enough packets to recover the original data. In this paper, the design of fountain codes is explored together with an implementation of the encoding and decoding algorithms, and the performance in terms of encoded/decoded symbols, reception overhead, data length, and failure probability is studied.
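
The encode/peel-decode cycle behind LT codes can be sketched compactly. The Python below is illustrative only: it uses a naive uniform degree distribution rather than the robust soliton distribution used in practice, and integer blocks stand in for packets.

```python
import random

def lt_encode(blocks, n_symbols, rng):
    """Each encoded symbol is the XOR of a random subset of source blocks."""
    symbols = []
    for _ in range(n_symbols):
        degree = rng.randint(1, len(blocks))            # toy degree choice
        idx = set(rng.sample(range(len(blocks)), degree))
        value = 0
        for i in idx:
            value ^= blocks[i]
        symbols.append((idx, value))
    return symbols

def lt_decode(symbols, k):
    """Peeling decoder: repeatedly resolve symbols reduced to degree one."""
    decoded = [None] * k
    work = [[set(idx), val] for idx, val in symbols]
    changed = True
    while changed:
        changed = False
        for sym in work:
            idx = sym[0]
            for i in [j for j in idx if decoded[j] is not None]:
                sym[1] ^= decoded[i]                    # subtract known block
                idx.discard(i)
                changed = True
            if len(idx) == 1:
                decoded[idx.pop()] = sym[1]             # block recovered
                changed = True
    return decoded

blocks = [0x03, 0x0E, 0x0F, 0x5C]
demo = [({0}, blocks[0]), ({0, 1}, blocks[0] ^ blocks[1]),
        ({1, 2}, blocks[1] ^ blocks[2]), ({2, 3}, blocks[2] ^ blocks[3])]
assert lt_decode(demo, 4) == blocks                     # chain of peels
```

With randomly generated symbols from `lt_encode`, decoding succeeds with high probability once slightly more than k symbols arrive; the handcrafted `demo` makes the peeling chain deterministic.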

  8. Cross-Platform JavaScript Coding: Shifting Sand Dunes and Shimmering Mirages.

    Science.gov (United States)

    Merchant, David

    1999-01-01

    Most libraries don't have the resources to cross-platform and cross-version test all of their JavaScript coding. Many turn to WYSIWYG; however, WYSIWYG editors don't generally produce optimized coding. Web developers should: test their coding on at least one 3.0 browser, code by hand using tools to help speed that process up, and include a simple…

  9. ALEGRA : version 4.6.

    Energy Technology Data Exchange (ETDEWEB)

    Wong, Michael K. W.; Summers, Randall M.; Petney, Sharon Joy Victor; Luchini, Christopher Bernard; Drake, Richard Roy; Carroll, Susan K.; Hensinger, David M.; Garasi, Christopher Joseph; Robinson, Allen Conrad; Voth, Thomas Eugene; Haill, Thomas A.; Mehlhorn, Thomas Alan; Robbins, Joshua H.; Brunner, Thomas A.

    2005-01-01

    ALEGRA is an arbitrary Lagrangian-Eulerian multi-material finite element code used for modeling solid dynamics problems involving large distortion and shock propagation. This document describes the basic user input language and instructions for using the software.

  10. Converge, Version 3.0

    Directory of Open Access Journals (Sweden)

    Lotfi Tadj

    1993-01-01

    Full Text Available Although intended for college teachers/students, Converge presents a feature that may interest all scientists: it allows an easy export of graphic files to most known word processors, specifically to the ℙ, Version 2.1, a powerful WYSIWYG mathematical word processor.

  11. TOUGH2 User's Guide Version 2

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Oldenburg, C.M.; Moridis, G.J.

    1999-11-01

    TOUGH2 is a numerical simulator for nonisothermal flows of multicomponent, multiphase fluids in one, two, and three-dimensional porous and fractured media. The chief applications for which TOUGH2 is designed are in geothermal reservoir engineering, nuclear waste disposal, environmental assessment and remediation, and unsaturated and saturated zone hydrology. TOUGH2 was first released to the public in 1991; the 1991 code was updated in 1994 when a set of preconditioned conjugate gradient solvers was added to allow a more efficient solution of large problems. The current Version 2.0 features several new fluid property modules and offers enhanced process modeling capabilities, such as coupled reservoir-wellbore flow, precipitation and dissolution effects, and multiphase diffusion. Numerous improvements in previously released modules have been made and new user features have been added, such as enhanced linear equation solvers, and writing of graphics files. The T2VOC module for three-phase flows of water, air and a volatile organic chemical (VOC), and the T2DM module for hydrodynamic dispersion in 2-D flow systems have been integrated into the overall structure of the code and are included in the Version 2.0 package. Data inputs are upwardly compatible with the previous version. Coding changes were generally kept to a minimum, and were only made as needed to achieve the additional functionalities desired. TOUGH2 is written in standard FORTRAN77 and can be run on any platform, such as workstations, PCs, Macintosh, mainframe and supercomputers, for which appropriate FORTRAN compilers are available. This report is a self-contained guide to application of TOUGH2 to subsurface flow problems. It gives a technical description of the TOUGH2 code, including a discussion of the physical processes modeled, and the mathematical and numerical methods used. Illustrative sample problems are presented along with detailed instructions for preparing input data.

  12. Universal Rateless Codes From Coupled LT Codes

    CERN Document Server

    Aref, Vahid

    2011-01-01

    It was recently shown that spatial coupling of individual low-density parity-check codes improves the belief-propagation threshold of the coupled ensemble essentially to the maximum a posteriori threshold of the underlying ensemble. We study the performance of spatially coupled low-density generator-matrix ensembles when used for transmission over binary-input memoryless output-symmetric channels. We show by means of density evolution that the threshold saturation phenomenon also takes place in this setting. Our motivation for studying low-density generator-matrix codes is that they can easily be converted into rateless codes. Although there are already several classes of excellent rateless codes known to date, rateless codes constructed via spatial coupling might offer some additional advantages. In particular, by the very nature of the threshold phenomenon one expects that codes constructed on this principle can be made to be universal, i.e., a single construction can uniformly approach capacity over the cl...
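
    Rateless (fountain) codes of the LT type, which the coupled low-density generator-matrix construction builds on, are easy to sketch: each coded symbol is the XOR of a randomly chosen subset of source symbols, and decoding proceeds by iteratively "peeling" degree-one symbols. A minimal illustration (the degree distribution and peeling decoder here are generic stand-ins, not the spatially coupled construction of the paper):

```python
import random

def lt_encode_symbol(source, degree_weights, rng):
    """One LT-coded symbol: pick a degree from the given distribution,
    then XOR that many distinct source symbols together."""
    degree = rng.choices(range(1, len(degree_weights) + 1),
                         weights=degree_weights)[0]
    idx = rng.sample(range(len(source)), degree)
    value = 0
    for i in idx:
        value ^= source[i]
    return set(idx), value

def peel_decode(coded, k):
    """Iterative 'peeling' decoder: repeatedly resolve degree-one symbols
    and substitute them back into the remaining equations."""
    decoded = {}
    work = [[set(s), v] for s, v in coded]
    progress = True
    while progress and len(decoded) < k:
        progress = False
        for eq in work:
            s = eq[0]
            for i in list(s):           # substitute already-known symbols
                if i in decoded:
                    s.discard(i)
                    eq[1] ^= decoded[i]
            if len(s) == 1:             # degree one: symbol recovered
                (i,) = s
                if i not in decoded:
                    decoded[i] = eq[1]
                    progress = True
                s.clear()
    return decoded
```

    The threshold behavior studied in the paper concerns exactly when this peeling/belief-propagation process succeeds as the code length grows.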

  13. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  14. TAE+ 5.1 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.1 (SUN3 VERSION)

    Science.gov (United States)

    TAE SUPPORT OFFICE

    1994-01-01

    workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. The Silicon Graphics version of TAE Plus now has a font caching scheme and a color caching scheme to make color allocation more efficient. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides an extremely powerful means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System, Version 11 Release 4, and the Open Software Foundation's Motif Toolkit 1.1 or 1.1.1. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus comes with InterViews and idraw, two software packages developed by Stanford University and integrated in TAE Plus. TAE Plus was developed in 1989 and version 5.1 was released in 1991. 
TAE Plus is currently available on media suitable for eight different machine platforms: 1) DEC VAX computers running VMS 5.3 or higher (TK

  15. TAE+ 5.1 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.1 (DEC VAX ULTRIX VERSION)

    Science.gov (United States)

    TAE SUPPORT OFFICE

    1994-01-01

    workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. The Silicon Graphics version of TAE Plus now has a font caching scheme and a color caching scheme to make color allocation more efficient. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides an extremely powerful means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System, Version 11 Release 4, and the Open Software Foundation's Motif Toolkit 1.1 or 1.1.1. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus comes with InterViews and idraw, two software packages developed by Stanford University and integrated in TAE Plus. TAE Plus was developed in 1989 and version 5.1 was released in 1991. 
TAE Plus is currently available on media suitable for eight different machine platforms: 1) DEC VAX computers running VMS 5.3 or higher (TK

  16. Bi-level image compression with tree coding

    DEFF Research Database (Denmark)

    Martins, Bo; Forchhammer, Søren

    1996-01-01

    Presently, tree coders are the best bi-level image coders. The current ISO standard, JBIG, is a good example. By organising code length calculations properly a vast number of possible models (trees) can be investigated within reasonable time prior to generating code. Three general-purpose coders...... version that without sacrificing speed brings it close to the multi-pass coders in compression performance...
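
    The key enabler mentioned above, cheap code-length calculation across many candidate models, rests on the fact that an adaptive model's ideal code length is simply the sum of -log2 of its sequential predictions. A generic sketch (a fixed-order context model with a Laplace prior; the JBIG-style coders use richer pixel templates and arithmetic coding):

```python
import math

def adaptive_code_length(bits, order):
    """Ideal adaptive code length (in bits) of a binary sequence under a
    fixed-order context model with a Laplace (+1/+1) prior.  Comparing
    this quantity across candidate models selects the best tree without
    actually generating code."""
    counts = {}
    total = 0.0
    ctx = (0,) * order                      # pad the initial context with zeros
    for b in bits:
        c0, c1 = counts.get(ctx, (1, 1))    # prior pseudo-counts
        total += -math.log2((c1 if b else c0) / (c0 + c1))
        counts[ctx] = (c0 + (b == 0), c1 + (b == 1))
        if order:
            ctx = ctx[1:] + (b,)
    return total
```

    For an alternating sequence, an order-1 model prices the data well below an order-0 model; ranking such totals over many candidate contexts is exactly the comparison a tree coder performs before committing to a model.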

  17. DISFRAC Version 2.0 Users Guide

    Energy Technology Data Exchange (ETDEWEB)

    Cochran, Kristine B [ORNL; Erickson, Marjorie A [ORNL; Williams, Paul T [ORNL; Klasky, Hilda B [ORNL; Bass, Bennett Richard [ORNL

    2013-01-01

    DISFRAC is the implementation of a theoretical, multi-scale model for the prediction of fracture toughness in the ductile-to-brittle transition temperature (DBTT) region of ferritic steels. Empirically derived models of the DBTT region cannot legitimately be extrapolated beyond the range of existing fracture toughness data. DISFRAC requires only tensile properties and microstructural information as input, and thus allows for a wider range of application than empirical, toughness-data-dependent models. DISFRAC is also a framework for investigating the roles of various microstructural and macroscopic effects on fracture behavior, including carbide particle sizes, grain sizes, strain rates, and material condition. DISFRAC's novel approach is to assess the interaction effects of macroscopic conditions (geometry, loading conditions) with variable microstructural features on cleavage crack initiation and propagation. The model addresses all stages of the fracture process, from microcrack initiation within a carbide particle, to propagation of that crack through grains and across grain boundaries, finally to catastrophic failure of the material. The DISFRAC procedure repeatedly performs a deterministic analysis of microcrack initiation and propagation within a macroscopic crack plastic zone to calculate a critical fracture toughness value for each microstructural geometry set. The current version of DISFRAC, version 2.0, is a research code for developing and testing models related to cleavage fracture and transition toughness. The various models and computations have evolved significantly over the course of development and are expected to continue to evolve as testing and data collection continue. This document serves as a guide to the usage and theoretical foundations of DISFRAC v2.0. Feedback is welcomed and encouraged.

  18. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth the current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams are sent virtually instantaneously, between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages is improved over messages transmitted by conventional coding. Coding scheme is compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme is automatically translated to word-processor form.

  19. The Clawpack Community of Codes

    Science.gov (United States)

    Mandli, K. T.; LeVeque, R. J.; Ketcheson, D.; Ahmadia, A. J.

    2014-12-01

    Clawpack, the Conservation Laws Package, has long been one of the standards for solving hyperbolic conservation laws but over the years has extended well beyond this role. Today a community of open-source codes has been developed that addresses a multitude of different needs including non-conservative balance laws, high-order accurate methods, and parallelism while remaining extensible and easy to use, largely by the judicious use of Python and the original Fortran codes that it wraps. This talk will present some of the recent developments in projects under the Clawpack umbrella, notably the GeoClaw and PyClaw projects. GeoClaw was originally developed as a tool for simulating tsunamis using adaptive mesh refinement but has since encompassed a large number of other geophysically relevant flows including storm surge and debris flows. PyClaw originated as a Python version of the original Clawpack algorithms but has since been both a testing ground for new algorithmic advances in the Clawpack framework and an easily extensible framework for solving hyperbolic balance laws. Some of these extensions include the addition of WENO high-order methods, massively parallel capabilities, and adaptive mesh refinement technologies, made possible largely by the flexibility of the Python language and community libraries such as NumPy and PETSc. Because of the tight integration with Python technologies, both packages have also benefited from the focus on reproducibility in the Python community, notably IPython notebooks.
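
    At their core, the Clawpack solvers advance cell averages of a hyperbolic conservation law using flux differences (wave propagation). A deliberately minimal stand-in, not the actual Clawpack/PyClaw API: first-order upwind advection on a periodic 1-D grid:

```python
import numpy as np

def advect_upwind(q, u, dx, dt, steps):
    """First-order upwind finite-volume update for q_t + u q_x = 0, u > 0,
    on a periodic 1-D grid; stable for CFL number u*dt/dx in [0, 1]."""
    nu = u * dt / dx
    for _ in range(steps):
        q = q - nu * (q - np.roll(q, 1))    # flux difference with left neighbor
    return q
```

    With u*dt/dx = 1 the update reduces to an exact one-cell shift, so advecting once around a periodic domain returns the initial data, a standard sanity check for solvers of this type.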

  20. Noisy Network Coding

    CERN Document Server

    Lim, Sung Hoon; Gamal, Abbas El; Chung, Sae-Young

    2010-01-01

    A noisy network coding scheme for sending multiple sources over a general noisy network is presented. For multi-source multicast networks, the scheme naturally extends both network coding over noiseless networks by Ahlswede, Cai, Li, and Yeung, and compress-forward coding for the relay channel by Cover and El Gamal to general discrete memoryless and Gaussian networks. The scheme also recovers as special cases the results on coding for wireless relay networks and deterministic networks by Avestimehr, Diggavi, and Tse, and coding for wireless erasure networks by Dana, Gowaikar, Palanki, Hassibi, and Effros. The scheme involves message repetition coding, relay signal compression, and simultaneous decoding. Unlike previous compress-forward schemes, where independent messages are sent over multiple blocks, the same message is sent multiple times using independent codebooks as in the network coding scheme for cyclic networks. Furthermore, the relays do not use Wyner-Ziv binning as in previous compress-forward sch...
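
    The noiseless ancestor of these schemes is easiest to see on the Ahlswede-Cai-Li-Yeung butterfly network, where a single XOR at the bottleneck edge lets both sinks recover both messages. A toy sketch:

```python
def butterfly(b1, b2):
    """Butterfly-network coding: the bottleneck edge carries b1 XOR b2;
    each sink combines it with the message it receives on a direct edge
    to recover the other message."""
    coded = b1 ^ b2
    sink1 = (b1, coded ^ b1)   # sink 1 sees b1 directly, solves for b2
    sink2 = (coded ^ b2, b2)   # sink 2 sees b2 directly, solves for b1
    return sink1, sink2
```

    Without the XOR, the single bottleneck edge could carry only one of the two bits per use; coding doubles the multicast throughput, which is the effect the noisy scheme generalizes to noisy links.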

  1. Testing algebraic geometric codes

    Institute of Scientific and Technical Information of China (English)

    CHEN Hao

    2009-01-01

    Property testing was initially studied from various motivations in the 1990's. A code C ⊆ GF(r)^n is locally testable if there is a randomized algorithm which can distinguish with high probability the codewords from a vector essentially far from the code by only accessing a very small (typically constant) number of the vector's coordinates. The problem of testing codes was first studied by Blum, Luby and Rubinfeld and is closely related to probabilistically checkable proofs (PCPs). How to characterize locally testable codes is a complex and challenging problem. Local tests have been studied for Reed-Solomon (RS), Reed-Muller (RM), cyclic, dual of BCH and the trace subcode of algebraic geometric codes. In this paper we give testers for algebraic geometric codes with linear parameters (as functions of dimensions). We also give a moderate condition under which the family of algebraic geometric codes cannot be locally testable.
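
    The flavor of a local test is easiest to see in the classic 3-query Blum-Luby-Rubinfeld linearity test, which tests the Hadamard code rather than the algebraic geometric codes treated in this paper: probe f at x, y and x XOR y, and check consistency.

```python
import random

def blr_test(f, n, trials, rng):
    """3-query BLR linearity test for f: {0,1}^n -> {0,1}, given as a
    table indexed by integer bitmasks.  Accepts iff every sampled triple
    satisfies f(x) XOR f(y) == f(x XOR y); a function far from linear is
    rejected with probability growing in the number of trials."""
    for _ in range(trials):
        x = rng.randrange(1 << n)
        y = rng.randrange(1 << n)
        if (f[x] ^ f[y]) != f[x ^ y]:
            return False
    return True
```

    Each trial reads only three coordinates of the table, which is the defining feature of a local tester.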

  2. Chinese remainder codes

    Institute of Scientific and Technical Information of China (English)

    ZHANG Aili; LIU Xiufeng

    2006-01-01

    Chinese remainder codes are constructed by applying weak block designs and the Chinese remainder theorem of ring theory. The new type of linear codes takes the congruence class in the congruence class ring R/I1 ∩ I2 ∩ … ∩ In for the information bit, embeds R/Ji into R/I1 ∩ I2 ∩ … ∩ In, and assigns the cosets of R/Ji as the subring of R/I1 ∩ I2 ∩ … ∩ In and the cosets of R/Ji in R/I1 ∩ I2 ∩ … ∩ In as check lines. Many code classes in the Chinese remainder codes have high code rates. Chinese remainder codes are the essential generalization of Sun Zi codes.
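
    The underlying mechanism, in its simplest integer form, is the redundant residue number system: encode a message as its residues modulo pairwise-coprime moduli, and note that any subset of moduli whose product exceeds the message range already determines the message, so the remaining residues act as checks. A sketch in integers rather than the ring-ideal setting of the paper:

```python
from math import prod

def rrns_encode(msg, moduli):
    """Residue codeword of msg for pairwise-coprime moduli."""
    return [msg % m for m in moduli]

def crt_reconstruct(residues, moduli):
    """Chinese Remainder reconstruction: the unique x in [0, prod(moduli))
    with x = r_i (mod m_i) for each i."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(..., -1, m): modular inverse
    return x % M
```

    For example, encoding 52 over moduli (3, 5, 7, 11) yields four residues, yet the pair of residues mod 7 and 11 alone already recovers 52 because 7 * 11 > 52; the residues mod 3 and 5 are redundancy.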

  3. Chinese Remainder Codes

    Institute of Scientific and Technical Information of China (English)

    张爱丽; 刘秀峰; 靳蕃

    2004-01-01

    Chinese Remainder Codes are constructed by applying weak block designs and the Chinese Remainder Theorem of ring theory. The new type of linear codes takes the congruence class in the congruence class ring R/I1∩I2∩…∩In for the information bit, embeds R/Ji into R/I1∩I2∩…∩In, and assigns the cosets of R/Ji as the subring of R/I1∩I2∩…∩In and the cosets of R/Ji in R/I1∩I2∩…∩In as check lines. There exist many code classes in Chinese Remainder Codes which have high code rates. Chinese Remainder Codes are the essential generalization of Sun Zi Codes.

  4. Code of Ethics

    DEFF Research Database (Denmark)

    Adelstein, Jennifer; Clegg, Stewart

    2016-01-01

    Ethical codes have been hailed as an explicit vehicle for achieving more sustainable and defensible organizational practice. Nonetheless, when legal compliance and corporate governance codes are conflated, codes can be used to define organizational interests ostentatiously by stipulating norms...... for employee ethics. Such codes have a largely cosmetic and insurance function, acting subtly and strategically to control organizational risk management and protection. In this paper, we conduct a genealogical discourse analysis of a representative code of ethics from an international corporation...... to understand how management frames expectations of compliance. Our contribution is to articulate the problems inherent in codes of ethics, and we make some recommendations to address these to benefit both an organization and its employees. In this way, we show how a code of ethics can provide a foundation...

  5. Defeating the coding monsters.

    Science.gov (United States)

    Colt, Ross

    2007-02-01

    Accuracy in coding is rapidly becoming a required skill for military health care providers. Clinic staffing, equipment purchase decisions, and even reimbursement will soon be based on the coding data that we provide. Learning the complicated myriad of rules to code accurately can seem overwhelming. However, the majority of clinic visits in a typical outpatient clinic generally fall into two major evaluation and management codes, 99213 and 99214. If health care providers can learn the rules required to code a 99214 visit, then this will provide a 90% solution that can enable them to accurately code the majority of their clinic visits. This article demonstrates a step-by-step method to code a 99214 visit, by viewing each of the three requirements as a monster to be defeated.
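
    The "90% solution" described above boils down to checking the visit's key components against the 99214 thresholds. As a loose illustration only: the 2-of-3 rule below is the commonly cited CPT convention for established-patient visits, an assumption made here, not a summary of the article's step-by-step method.

```python
def qualifies_99214(detailed_history, detailed_exam, moderate_mdm):
    """Assumed rule of thumb: an established-patient 99214 visit needs at
    least two of three key components at the required level (detailed
    history, detailed examination, moderate-complexity medical decision
    making).  Each argument is a bool for one 'monster' defeated."""
    return sum([detailed_history, detailed_exam, moderate_mdm]) >= 2
```

    The point of the article is precisely that each boolean above is earned by satisfying a checklist of documentation elements; the final combination step is the easy part.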

  6. Testing algebraic geometric codes

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Property testing was initially studied from various motivations in the 1990's. A code C ⊆ GF(r)^n is locally testable if there is a randomized algorithm which can distinguish with high probability the codewords from a vector essentially far from the code by only accessing a very small (typically constant) number of the vector's coordinates. The problem of testing codes was first studied by Blum, Luby and Rubinfeld and is closely related to probabilistically checkable proofs (PCPs). How to characterize locally testable codes is a complex and challenging problem. Local tests have been studied for Reed-Solomon (RS), Reed-Muller (RM), cyclic, dual of BCH and the trace subcode of algebraic geometric codes. In this paper we give testers for algebraic geometric codes with linear parameters (as functions of dimensions). We also give a moderate condition under which the family of algebraic geometric codes cannot be locally testable.

  7. Serially Concatenated IRA Codes

    CERN Document Server

    Cheng, Taikun; Belzer, Benjamin J

    2007-01-01

    We address the error floor problem of low-density parity check (LDPC) codes on the binary-input additive white Gaussian noise (AWGN) channel, by constructing a serially concatenated code consisting of two systematic irregular repeat accumulate (IRA) component codes connected by an interleaver. The interleaver is designed to prevent stopping-set error events in one of the IRA codes from propagating into stopping set events of the other code. Simulations with two 128-bit rate 0.707 IRA component codes show that the proposed architecture achieves a much lower error floor at higher SNRs, compared to a 16384-bit rate 1/2 IRA code, but incurs an SNR penalty of about 2 dB at low to medium SNRs. Experiments indicate that the SNR penalty can be reduced at larger blocklengths.
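
    The "repeat accumulate" part of an IRA component code is simple enough to sketch: repeat each information bit, pass the result through an interleaver, then run it through an accumulator (a running XOR). A toy systematic encoder; the interleaver design that prevents stopping-set propagation is the paper's contribution and is not modeled here:

```python
def ra_encode(bits, repeat, perm):
    """Toy systematic repeat-accumulate encoder: repeat each info bit
    'repeat' times, interleave with permutation 'perm', then accumulate."""
    repeated = [b for b in bits for _ in range(repeat)]
    interleaved = [repeated[p] for p in perm]
    acc, parity = 0, []
    for b in interleaved:
        acc ^= b               # accumulator: running XOR
        parity.append(acc)
    return bits + parity       # systematic codeword: info bits + parity
```

    In a serial concatenation like the one proposed, two such encoders are chained through a second, carefully designed interleaver.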

  8. SPLICER - A GENETIC ALGORITHM TOOL FOR SEARCH AND OPTIMIZATION, VERSION 1.0 (MACINTOSH VERSION)

    Science.gov (United States)

    Wang, L.

    1994-01-01

    representation scheme. The SPLICER tool provides representation libraries for binary strings and for permutations. These libraries contain functions for the definition, creation, and decoding of genetic strings, as well as multiple crossover and mutation operators. Furthermore, the SPLICER tool defines the appropriate interfaces to allow users to create new representation libraries. Fitness modules are the only component of the SPLICER system a user will normally need to create or alter to solve a particular problem. Fitness functions are defined and stored in interchangeable fitness modules which must be created using C language. Within a fitness module, a user can create a fitness (or scoring) function, set the initial values for various SPLICER control parameters (e.g., population size), create a function which graphically displays the best solutions as they are found, and provide descriptive information about the problem. The tool comes with several example fitness modules, while the process of developing a fitness module is fully discussed in the accompanying documentation. The user interface is event-driven and provides graphic output in windows. SPLICER is written in Think C for Apple Macintosh computers running System 6.0.3 or later and Sun series workstations running SunOS. The UNIX version is easily ported to other UNIX platforms and requires MIT's X Window System, Version 11 Revision 4 or 5, MIT's Athena Widget Set, and the Xw Widget Set. Example executables and source code are included for each machine version. The standard distribution media for the Macintosh version is a set of three 3.5 inch Macintosh format diskettes. The standard distribution medium for the UNIX version is a .25 inch streaming magnetic tape cartridge in UNIX tar format. For the UNIX version, alternate distribution media and formats are available upon request. SPLICER was developed in 1991.
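
    SPLICER's architecture separates the search engine from user-supplied fitness modules. That division is easy to mirror in miniature: a generic GA over binary strings that accepts any scoring function. A sketch whose parameter choices are illustrative, not SPLICER's defaults:

```python
import random

def genetic_search(fitness, n_bits, pop_size=40, generations=60,
                   p_mut=0.02, seed=0):
    """Tiny GA: tournament selection, one-point crossover, bit-flip
    mutation.  'fitness' is a user-supplied scoring function, playing the
    role of SPLICER's interchangeable fitness modules."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)        # binary tournament
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)   # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```

    On the "one-max" problem (score = number of ones), the search drives strings toward all ones; swapping in a different fitness function changes the problem without touching the engine, which is the design SPLICER institutionalizes.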

  10. APEX version 2.0: latest version of the cross-platform analysis program for EXAFS.

    Science.gov (United States)

    Dimakis, N; Bunker, G

    2001-03-01

    This report describes recent progress on APEX, a free, open-source, cross-platform set of EXAFS data analysis software. In a previous report we described APEX 1.0 (Dimakis, N. and Bunker, G., 1999), a free and open-source suite of basic X-Ray Absorption Fine Structure (XAFS) data analysis programs for classical data reduction and single-scattering analysis. The first version of APEX was, to our knowledge, the only cross-platform (Linux/IRIX/Windows/MacOS) EXAFS analysis program, but it lacked important features such as multiple-scattering fitting, generic format conversion from ASCII to University of Washington (UW) binary-type files, and user-friendly interactive graphics. In the enhanced version described here we have added cross-platform interactive graphics based on the BLT package, an extension to Tcl/Tk. Some of the utilities have been rewritten in native Tcl/Tk, allowing for faster and more integrated functionality with the main package. The package has also been ported to SunOS. APEX 2.0 in its current form is suitable for routine data analysis and training. Addition of more advanced methods of data analysis is planned.

  11. Development of Advanced In core Management Codes for Ulchin Unit 1, 2

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Hee; Park, Moon Gyu [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1996-12-31

    As the first case among FRAMATOME's plants, the Ulchin Unit 1 Cycle 8 core was loaded with V5H fuel. Because of the heterogeneity in the axial enrichment of the V5H fuel, FRAMATOME's 2-D in-core management codes, CEDRIC-CARIN-ESTHER, are no longer valid for in-core management. In order to analyze the Ulchin Unit 1 and 2 cores loaded with V5H fuel exactly, this study substituted them with WH's IN CORE-3D and Tote codes. The previous IN CORE-Tote codes have been utilized on the HP workstation or the IBM mainframe, which are not easily accessible by the site engineers and require complicated manipulation of the computer network system. This study developed a PC version of the IN CORE-3D and Tote codes usable in the plants, including an interface code linking the data measured by the in-core instrument system (RIC-KIT system) to the IN CORE code. These codes reduce the in-core management time and increase the economic benefits. We installed the developed codes in Ulchin Units 1 and 2 and applied them to the core power distribution measurements performed during the Cycle 8 power escalation tests. The results satisfied all limits of the Technical Specifications very well. The major contents of this study can be categorized as follows. 1. Analysis of the in-core management codes: (a) analysis of the flux mapping system and the measurement reduction algorithm; (b) analysis of the methodology of the in-core management codes. 2. Development and verification of the PC-version in-core management codes: (a) development of the measured-data processing code (C2I); (b) development of the PC-version IN CORE code; (c) development of the PC-version Tote code; (d) verification of the developed codes. 3. Application to the core physics tests of Ulchin Unit 1 Cycle 8: (a) power distribution measurement at 75% and 100% power. (author). 14 refs., figs., tabs.

  12. GENII Version 2 Software Design Document

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.; Strenge, Dennis L.; Ramsdell, James V.; Eslinger, Paul W.; Fosmire, Christian J.

    2004-03-08

    This document describes the architectural design for the GENII-V2 software package. It defines details of the overall structure of the software, the major software components, their data file interfaces, and specific mathematical models to be used. The design represents a translation of the requirements into a description of the software structure, software components, interfaces, and necessary data. The design focuses on the major components and data communication links that are key to the implementation of the software within the operating framework. The purpose of the GENII-V2 software package is to provide the capability to perform dose and risk assessments of environmental releases of radionuclides. The software also has the capability of calculating environmental accumulation and radiation doses from surface water, groundwater, and soil (buried waste) media when an input concentration of radionuclide in these media is provided. This report represents a detailed description of the capabilities of the software product with exact specifications of mathematical models that form the basis for the software implementation and testing efforts. This report also presents a detailed description of the overall structure of the software package, details of main components (implemented in the current phase of work), details of data communication files, and content of basic output reports. The GENII system includes the capabilities for calculating radiation doses following chronic and acute releases. Radionuclide transport via air, water, or biological activity may be considered. Air transport options include both puff and plume models, each allowing use of an effective stack height or calculation of plume rise from buoyant or momentum effects (or both). Building wake effects can be included in acute atmospheric release scenarios. The code provides risk estimates for health effects to individuals or populations; these can be obtained using the code by applying
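
    The plume option mentioned above is, in its textbook form, the Gaussian plume equation with ground reflection. A generic sketch of that formula, not GENII's implementation; the dispersion parameters sigma_y and sigma_z would come from stability-class correlations evaluated at the downwind distance:

```python
import math

def gaussian_plume(Q, u, y, z, sigma_y, sigma_z, H):
    """Gaussian plume concentration with ground reflection: release rate Q
    (e.g. Bq/s), wind speed u (m/s), effective release height H (m),
    crosswind offset y and receptor height z (m).  Returns concentration
    in Q-units per cubic meter."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-((z - H) ** 2) / (2 * sigma_z ** 2))
                + math.exp(-((z + H) ** 2) / (2 * sigma_z ** 2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

    Two standard sanity checks: the field is symmetric in the crosswind coordinate, and for a ground-level release the reflection term doubles the centerline concentration.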

  13. Constructive version of Boolean algebra

    CERN Document Server

    Ciraulo, Francesco; Toto, Paola

    2012-01-01

    The notion of overlap algebra introduced by G. Sambin provides a constructive version of complete Boolean algebra. Here we first show some properties concerning overlap algebras: we prove that the notion of overlap morphism corresponds classically to that of map preserving arbitrary joins; we provide a description of atomic set-based overlap algebras in the language of formal topology, thus giving a predicative characterization of discrete locales; we show that the power-collection of a set is the free overlap algebra join-generated from the set. Then, we generalize the concept of overlap algebra and overlap morphism in various ways to provide constructive versions of the category of Boolean algebras with maps preserving arbitrary existing joins.

  14. Online Cake Cutting (published version)

    CERN Document Server

    Walsh, Toby

    2011-01-01

    We propose an online form of the cake cutting problem. This models situations where agents arrive and depart during the process of dividing a resource. We show that well known fair division procedures like cut-and-choose and the Dubins-Spanier moving knife procedure can be adapted to apply to such online problems. We propose some fairness properties that online cake cutting procedures can possess like online forms of proportionality and envy-freeness. We also consider the impact of collusion between agents. Finally, we study theoretically and empirically the competitive ratio of these online cake cutting procedures. Based on its resistance to collusion, and its good performance in practice, our results favour the online version of the cut-and-choose procedure over the online version of the moving knife procedure.
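
    The offline cut-and-choose procedure that the online version adapts can be sketched directly: the cutter bisects the interval to a point she values at exactly half, and the chooser takes his preferred piece. A sketch assuming valuation functions over subintervals of [0, 1] that are continuous and monotone in the right endpoint:

```python
def cut_and_choose(value_a, value_b, lo=0.0, hi=1.0, tol=1e-9):
    """Two-agent cut-and-choose on [lo, hi]: agent A cuts at a point she
    values as exactly half of the whole (found by bisection on her
    valuation), then agent B picks the piece he values weakly more.
    value_a, value_b map (left, right) -> that agent's worth of the piece."""
    total = value_a(lo, hi)
    a, b = lo, hi
    while b - a > tol:                      # bisection for A's half-point
        mid = (a + b) / 2
        if value_a(lo, mid) < total / 2:
            a = mid
        else:
            b = mid
    cut = (a + b) / 2
    left, right = (lo, cut), (cut, hi)
    if value_b(*left) >= value_b(*right):   # B chooses his preferred piece
        return {"A": right, "B": left}
    return {"A": left, "B": right}
```

    Both agents end up with a piece worth at least half by their own measure (proportionality), which is the guarantee the paper asks its online procedures to preserve as agents arrive and depart.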

  15. The EGS5 Code System

    Energy Technology Data Exchange (ETDEWEB)

    Hirayama, Hideo; Namito, Yoshihito; /KEK, Tsukuba; Bielajew, Alex F.; Wilderman, Scott J.; U., Michigan; Nelson, Walter R.; /SLAC

    2005-12-20

    In the nineteen years since EGS4 was released, it has been used in a wide variety of applications, particularly in medical physics, radiation measurement studies, and industrial development. Every new user and every new application bring new challenges for Monte Carlo code designers, and code refinements and bug fixes eventually result in a code that becomes difficult to maintain. Several of the code modifications represented significant advances in electron and photon transport physics, and required a more substantial invocation than code patching. Moreover, the arcane MORTRAN3[48] computer language of EGS4, was highest on the complaint list of the users of EGS4. The size of the EGS4 user base is difficult to measure, as there never existed a formal user registration process. However, some idea of the numbers may be gleaned from the number of EGS4 manuals that were produced and distributed at SLAC: almost three thousand. Consequently, the EGS5 project was undertaken. It was decided to employ the FORTRAN 77 compiler, yet include as much as possible, the structural beauty and power of MORTRAN3. This report consists of four chapters and several appendices. Chapter 1 is an introduction to EGS5 and to this report in general. We suggest that you read it. Chapter 2 is a major update of similar chapters in the old EGS4 report[126] (SLAC-265) and the old EGS3 report[61] (SLAC-210), in which all the details of the old physics (i.e., models which were carried over from EGS4) and the new physics are gathered together. The descriptions of the new physics are extensive, and not for the faint of heart. Detailed knowledge of the contents of Chapter 2 is not essential in order to use EGS, but sophisticated users should be aware of its contents. In particular, details of the restrictions on the range of applicability of EGS are dispersed throughout the chapter. First-time users of EGS should skip Chapter 2 and come back to it later if necessary. 
With the release of the EGS4 version

  16. TART97 a coupled neutron-photon 3-D, combinatorial geometry Monte Carlo transport code

    Energy Technology Data Exchange (ETDEWEB)

    Cullen, D.E.

    1997-11-22

    TART97 is a coupled neutron-photon, 3-dimensional, combinatorial geometry, time-dependent Monte Carlo transport code. This code can run on any modern computer. It is a complete system to assist you with input preparation, running Monte Carlo calculations, and analysis of output results. TART97 is also incredibly fast; if you have used similar codes, you will be amazed at how fast this code is by comparison. Use of the entire system can save you a great deal of time and energy. TART97 is distributed on CD. This CD contains on-line documentation for all codes included in the system, the codes configured to run on a variety of computers, and many example problems that you can use to familiarize yourself with the system. TART97 completely supersedes all older versions of TART, and it is strongly recommended that users only use the most recent version of TART97 and its data files.

  17. Methodology, status and plans for development and assessment of Cathare code

    Energy Technology Data Exchange (ETDEWEB)

    Bestion, D.; Barre, F.; Faydide, B. [CEA - Grenoble (France)

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for developing and assessing the code. Analytical experiments, with separate effect tests and component tests, are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of each Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future developments of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time, full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the modeling of low-pressure transients and in the 3-D model.

  18. Efficient coding of wavelet trees and its applications in image coding

    Science.gov (United States)

    Zhu, Bin; Yang, En-hui; Tewfik, Ahmed H.; Kieffer, John C.

    1996-02-01

    We propose in this paper a novel lossless tree coding algorithm. The technique is a direct extension of the bisection method, the simplest case of the complexity reduction method proposed recently by Kieffer and Yang, that has been used for lossless data string coding. A reduction rule is used to obtain the irreducible representation of a tree, and this irreducible tree is entropy-coded instead of the input tree itself. This reduction is reversible, and the original tree can be fully recovered from its irreducible representation. More specifically, we search for equivalent subtrees from top to bottom. When equivalent subtrees are found, a special symbol is appended to the value of the root node of the first equivalent subtree, the root node of the second subtree is assigned the index which points to the first subtree, and all other nodes in the second subtree are removed. This procedure is repeated until the tree cannot be reduced further. This yields the irreducible tree or irreducible representation of the original tree. The proposed method can effectively remove the redundancy in an image, and results in more efficient compression. It is proved that when the tree size approaches infinity, the proposed method offers the optimal compression performance. It is generally more efficient in practice than direct coding of the input tree. The proposed method can be directly applied to code wavelet trees in non-iterative wavelet-based image coding schemes. A modified method is also proposed for coding wavelet zerotrees in embedded zerotree wavelet (EZW) image coding. Although its coding efficiency is slightly reduced, the modified version maintains exact control of bit rate and the scalability of the bit stream in EZW coding.
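    The core reduction step (storing each distinct subtree once and replacing later occurrences by an index to the first one) can be sketched in a few lines of Python. This is a simplified hash-consing illustration of the idea, not the authors' exact bisection-based algorithm; the tree representation as `(value, child, ...)` tuples is an assumption made for the sketch.

```python
def reduce_tree(tree):
    """Share equivalent subtrees: each distinct subtree is stored once in a
    node table; repeated occurrences are replaced by the index of the first.
    The 'irreducible representation' is the node table plus the root index."""
    table = {}   # canonical key of a subtree -> node index
    nodes = []   # node table: leaves as values, internal nodes as (value, child_indices)

    def visit(t):
        if isinstance(t, tuple):                         # internal node: (value, child, ...)
            key = (t[0], tuple(visit(c) for c in t[1:]))
        else:                                            # leaf value
            key = t
        if key not in table:                             # first occurrence: new table entry
            table[key] = len(nodes)
            nodes.append(key)
        return table[key]                                # repeat: reuse the existing index

    return nodes, visit(tree)

def expand_tree(nodes, idx):
    """Inverse mapping: the reduction is reversible, so the original tree
    is fully recovered from the node table and the root index."""
    n = nodes[idx]
    if isinstance(n, tuple):
        value, kids = n
        return (value,) + tuple(expand_tree(nodes, k) for k in kids)
    return n
```

    For a tree with two equivalent subtrees, e.g. `('a', ('b', 1, 2), ('b', 1, 2))`, the node table stores the shared subtree only once, and `expand_tree` recovers the input exactly.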

  19. Rewriting the Genetic Code.

    Science.gov (United States)

    Mukai, Takahito; Lajoie, Marc J; Englert, Markus; Söll, Dieter

    2017-09-08

    The genetic code-the language used by cells to translate their genomes into proteins that perform many cellular functions-is highly conserved throughout natural life. Rewriting the genetic code could lead to new biological functions such as expanding protein chemistries with noncanonical amino acids (ncAAs) and genetically isolating synthetic organisms from natural organisms and viruses. It has long been possible to transiently produce proteins bearing ncAAs, but stabilizing an expanded genetic code for sustained function in vivo requires an integrated approach: creating recoded genomes and introducing new translation machinery that function together without compromising viability or clashing with endogenous pathways. In this review, we discuss design considerations and technologies for expanding the genetic code. The knowledge obtained by rewriting the genetic code will deepen our understanding of how genomes are designed and how the canonical genetic code evolved.

  20. On Polynomial Remainder Codes

    CERN Document Server

    Yu, Jiun-Hung

    2012-01-01

    Polynomial remainder codes are a large class of codes derived from the Chinese remainder theorem that includes Reed-Solomon codes as a special case. In this paper, we revisit these codes and study them more carefully than in previous work. We explicitly allow the code symbols to be polynomials of different degrees, which leads to two different notions of weight and distance. Algebraic decoding is studied in detail. If the moduli are not irreducible, the notion of an error locator polynomial is replaced by an error factor polynomial. We then obtain a collection of gcd-based decoding algorithms, some of which are not quite standard even when specialized to Reed-Solomon codes.
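    The Chinese-remainder construction behind such codes can be sketched with an integer analogue (integer arithmetic standing in for polynomial arithmetic; the moduli, message range, and erasure pattern below are illustrative assumptions, not taken from the paper). A message smaller than the product of the first k moduli is encoded as its residues modulo all n moduli; the n − k redundant residues allow recovery from erasures by applying the CRT to the surviving symbols.

```python
from math import prod

MODULI = [7, 11, 13, 17, 19]   # pairwise coprime; the last two provide redundancy
K = 3                          # message range: 0 .. 7*11*13 - 1

def crt(residues, moduli):
    """Reconstruct x mod prod(moduli) from its residues (moduli pairwise coprime)."""
    M = prod(moduli)
    x = sum(r * (M // m) * pow(M // m, -1, m) for r, m in zip(residues, moduli))
    return x % M

def encode(msg):
    """Codeword = one residue symbol per modulus."""
    assert 0 <= msg < prod(MODULI[:K])
    return [msg % m for m in MODULI]

def decode_erasures(codeword, erased):
    """Recover the message from the surviving residues, provided the product
    of the surviving moduli still covers the message range."""
    keep = [i for i in range(len(MODULI)) if i not in erased]
    return crt([codeword[i] for i in keep], [MODULI[i] for i in keep])
```

    Since any three of the five moduli have a product of at least 7·11·13, every pattern of up to two erasures is correctable in this toy configuration.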

  1. Kodo: An Open and Research Oriented Network Coding Library

    DEFF Research Database (Denmark)

    Pedersen, Morten Videbæk; Heide, Janus; Fitzek, Frank

    2011-01-01

    achieve a high coding throughput, and reduce energy consumption.We use an on-the-fly version of the Gauss-Jordan algorithm as a baseline, and provide several simple improvements to reduce the number of operations needed to perform decoding. Our tests show that the improvements can reduce the number...
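    The on-the-fly Gauss-Jordan decoding mentioned above can be sketched for the binary field GF(2), where combining packets reduces to element-wise XOR. This is a generic illustration of incremental Gauss-Jordan elimination, not Kodo's actual implementation (Kodo also supports larger fields and the further operation-count reductions the paper describes).

```python
class OnTheFlyDecoder:
    """Incrementally reduce received coded packets over GF(2).
    Each packet carries (coding coefficients, payload); decoding completes
    once there is one pivot row for every original symbol."""

    def __init__(self, n):
        self.n = n
        self.rows = {}                     # pivot position -> (coeffs, payload)

    @staticmethod
    def _xor(a, b):
        return [x ^ y for x, y in zip(a, b)]

    def receive(self, coeffs, payload):
        coeffs, payload = list(coeffs), list(payload)
        # Forward step: eliminate all existing pivots from the new packet.
        for p, (pc, pp) in self.rows.items():
            if coeffs[p]:
                coeffs, payload = self._xor(coeffs, pc), self._xor(payload, pp)
        pivot = next((i for i, c in enumerate(coeffs) if c), None)
        if pivot is None:
            return False                   # linearly dependent: adds no information
        # Backward step: eliminate the new pivot from all stored rows,
        # keeping the matrix fully reduced at all times (Gauss-Jordan).
        for p, (pc, pp) in self.rows.items():
            if pc[pivot]:
                self.rows[p] = (self._xor(pc, coeffs), self._xor(pp, payload))
        self.rows[pivot] = (coeffs, payload)
        return True

    def decode(self):
        if len(self.rows) < self.n:
            return None                    # not yet full rank
        return [self.rows[i][1] for i in range(self.n)]
```

    Because every stored row stays fully reduced, each received packet is processed once; a dependent packet reduces to the zero vector and is discarded.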

  2. 20-Sim ANSI-C code on a 8051 target

    NARCIS (Netherlands)

    Geerlings, Joël

    2001-01-01

    In the forthcoming version of 20-sim the option of code generation for targets will be available. After selection of a template, it is filled in with model-specific information. Then this adapted template can be compiled and linked such that it can be run on the target. Theo Lammerink designed around t

  3. CERN access card: Introduction of a bar code

    CERN Multimedia

    Relations with the Host States Service

    2004-01-01

    Before the latest version of the implementation measures relating to Operational Circular No. 2 comes into force, we would like to inform you that, in future, CERN access cards may bear a bar code encoding the holder's identification number. Relations with the Host States Service http://www.cern.ch/relations/ Tel. 72848

  4. Generating code adapted for interlinking legacy scalar code and extended vector code

    Science.gov (United States)

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  5. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  6. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    We welcome Tanya Stivers’s discussion (Stivers, 2015/this issue) of coding social interaction and find that her descriptions of the processes of coding open up important avenues for discussion, among other things of the precise ad hoc considerations that researchers need to bear in mind, both when....... Instead we propose that the promise of coding-based research lies in its ability to open up new qualitative questions....

  7. Industrial Computer Codes

    Science.gov (United States)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife-to-knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. The KTK labyrinth seal code handles straight or stepped seals. And DYSEAL provides dynamics for the seal geometry.

  8. RELAP5-3D Developmental Assessment: Comparison of Version 4.2.1i on Linux and Windows

    Energy Technology Data Exchange (ETDEWEB)

    Paul D. Bayless

    2014-06-01

    Figures have been generated comparing the parameters used in the developmental assessment of the RELAP5-3D code, version 4.2.1i, compiled on Linux and Windows platforms. The figures, which are the same as those used in Volume III of the RELAP5-3D code manual, compare calculations using the semi-implicit solution scheme with available experiment data. These figures provide a quick, visual indication of how the code predictions differ between the Linux and Windows versions.

  9. RELAP5-3D developmental assessment: Comparison of version 4.2.1i on Linux and Windows

    Energy Technology Data Exchange (ETDEWEB)

    Bayless, Paul D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-06-01

    Figures have been generated comparing the parameters used in the developmental assessment of the RELAP5-3D code, version 4.2.1i, compiled on Linux and Windows platforms. The figures, which are the same as those used in Volume III of the RELAP5-3D code manual, compare calculations using the semi-implicit solution scheme with available experiment data. These figures provide a quick, visual indication of how the code predictions differ between the Linux and Windows versions.

  10. RELAP5-3D Developmental Assessment. Comparison of Version 4.3.4i on Linux and Windows

    Energy Technology Data Exchange (ETDEWEB)

    Bayless, Paul David [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-10-01

    Figures have been generated comparing the parameters used in the developmental assessment of the RELAP5-3D code, version 4.3.4i, compiled on Linux and Windows platforms. The figures, which are the same as those used in Volume III of the RELAP5-3D code manual, compare calculations using the semi-implicit solution scheme with available experiment data. These figures provide a quick, visual indication of how the code predictions differ between the Linux and Windows versions.

  11. SRAC95; general purpose neutronics code system

    Energy Technology Data Exchange (ETDEWEB)

    Okumura, Keisuke; Tsuchihashi, Keichiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kaneko, Kunio

    1996-03-01

    SRAC is a general purpose neutronics code system applicable to core analyses of various types of reactors. Since the publication of JAERI-1302 for the revised SRAC in 1986, a number of additions and modifications have been made for nuclear data libraries and programs. Thus, the new version SRAC95 has been completed. The system consists of six kinds of nuclear data libraries (ENDF/B-IV, -V, -VI, JENDL-2, -3.1, -3.2) and five modular codes integrated into SRAC95: a collision probability calculation module (PIJ) for 16 types of lattice geometries, Sn transport calculation modules (ANISN, TWOTRAN), diffusion calculation modules (TUD, CITATION) and two optional codes for fuel assembly and core burn-up calculations (newly developed ASMBURN, revised COREBN). In this version, many new functions and data are implemented to support nuclear design studies of advanced reactors, especially for burn-up calculations. SRAC95 is available not only on conventional IBM-compatible computers but also on scalar or vector computers with the UNIX operating system. This report is the SRAC95 users manual, which contains a general description, contents of revisions, input data requirements, detailed information on usage, sample input data and a list of available libraries. (author).

  12. ARC Code TI: ACCEPT

    Data.gov (United States)

    National Aeronautics and Space Administration — ACCEPT consists of an overall software infrastructure framework and two main software components. The software infrastructure framework consists of code written to...

  13. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown in.

  14. MORSE Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  15. The commerce of professional psychology and the new ethics code.

    Science.gov (United States)

    Koocher, G P

    1994-11-01

    The 1992 version of the American Psychological Association's Ethical Principles of Psychologists and Code of Conduct brings some changes in requirements and new specificity to the practice of psychology. The impact of the new code on therapeutic contracts, informed consent to psychological services, advertising, financial aspects of psychological practice, and other topics related to the commerce of professional psychology are discussed. The genesis of many new thrusts in the code is reviewed from the perspective of psychological service provider. Specific recommendations for improved attention to ethical matters in professional practice are made.

  16. Recent Improvements in the SHIELD-HIT Code

    DEFF Research Database (Denmark)

    Hansen, David Christoffer; Lühr, Armin Christian; Herrmann, Rochus

    2012-01-01

    Purpose: The SHIELD-HIT Monte Carlo particle transport code has previously been used to study a wide range of problems for heavy-ion treatment and has been benchmarked extensively against other Monte Carlo codes and experimental data. Here, an improved version of SHIELD-HIT is developed...... of using accelerator control files as a basis for the primaries. Furthermore, the code has been parallelized and efficiency is improved. The physical description of inelastic ion collisions has been modified. Results: The simulation of an experimental depth-dose distribution including a ripple filter...

  17. Recent Advances in the HELIOS-2 Lattice Physics Code

    Energy Technology Data Exchange (ETDEWEB)

    Wemple, C.A. [Studsvik Scandpower, Inc., Idaho Falls, ID (United States); Gheorghiu, H.N.M. [Studsvik Scandpower, Inc., Boston, MA (United States); Stamm'ler, R.J.J. [Studsvik Scandpower AS, Kjeller (Norway); Villarino, E.A. [INVAP S.E., Bariloche (Argentina)

    2008-07-01

    Major advances have been made in the HELIOS code, resulting in the impending release of a new version, HELIOS-2. The new code includes a method of characteristics (MOC) transport solver to supplement the existing collision probabilities (CP) solver. A 177-group, ENDF/B-VII nuclear data library has been developed for inclusion with the new code package. Computational tests have been performed to verify the performance of the MOC solver against the CP solver, and validation testing against computational and measured benchmarks is underway. Results to date of the verification and validation testing are presented, demonstrating the excellent performance of the new transport solver and nuclear data library. (authors)

  18. CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Smith, III, F. G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-07-29

    One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop an interface, using the GoldSim software, to the STADIUM® code developed by SIMCO Technologies, Inc. and LeachXS/ORCHESTRA developed by the Energy research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As a part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM® code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM® code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results (the code developers have provided validation test results as part of their code QA documentation); and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes. Therefore, it is concluded that

  19. Minimum cost content distribution using network coding: Replication vs. coding at the source nodes

    CERN Document Server

    Huang, Shurui; Medard, Muriel

    2009-01-01

    Consider a large file that needs to be multicast over a network to a given set of terminals. Storing the file at a single server may result in server overload. Accordingly, there are distributed storage solutions that operate by dividing the file into pieces and placing copies of the pieces (replication) or coded versions of the pieces (coding) at multiple source nodes. Suppose that the cost of a given network coding based solution to this problem is defined as the sum of the storage cost and the cost of the flows required to support the multicast. In this work, we consider a network with a set of source nodes that can either contain subsets or coded versions of the pieces of the file and are interested in finding the storage capacities and flows at minimum cost. We provide succinct formulations of the corresponding optimization problems by using information measures. In particular, we show that when there are two source nodes, there is no loss in considering subset sources. For three source nodes, we derive ...

  20. Pathways of Genetic Code Evolution in Ancient and Modern Organisms.

    Science.gov (United States)

    Sengupta, Supratim; Higgs, Paul G

    2015-06-01

    There have been two distinct phases of evolution of the genetic code: an ancient phase--prior to the divergence of the three domains of life, during which the standard genetic code was established--and a modern phase, in which many alternative codes have arisen in specific groups of genomes that differ only slightly from the standard code. Here we discuss the factors that are most important in these two phases, and we argue that these are substantially different. In the modern phase, changes are driven by chance events such as tRNA gene deletions and codon disappearance events. Selection acts as a barrier to prevent changes in the code. In contrast, in the ancient phase, selection for increased diversity of amino acids in the code can be a driving force for addition of new amino acids. The pathway of code evolution is constrained by avoiding disruption of genes that are already encoded by earlier versions of the code. The current arrangement of the standard code suggests that it evolved from a four-column code in which Gly, Ala, Asp, and Val were the earliest encoded amino acids.

  1. The Numerical Electromagnetics Code (NEC) - A Brief History

    Energy Technology Data Exchange (ETDEWEB)

    Burke, G J; Miller, E K; Poggio, A J

    2004-01-20

    The Numerical Electromagnetics Code, NEC as it is commonly known, continues to be one of the more widely used antenna modeling codes in existence. With several versions in use that reflect different levels of capability and availability, there are now 450 copies of NEC4 and 250 copies of NEC3 that have been distributed by Lawrence Livermore National Laboratory to a limited class of qualified recipients, and several hundred copies of NEC2 that had a recorded distribution by LLNL. These numbers do not account for numerous copies (perhaps 1000s) that were acquired through other means capitalizing on the open source code, the absence of distribution controls prior to NEC3 and the availability of versions on the Internet. In this paper we briefly review the history of the code that is concisely displayed in Figure 1. We will show how it capitalized on the research of prominent contributors in the early days of computational electromagnetics, how a combination of events led to the tri-service-supported code development program that ultimately led to NEC and how it evolved to the present day product. The authors apologize that space limitations do not allow us to provide a list of references or to acknowledge the numerous contributors to the code both of which can be found in the code documents.

  2. Clustal W and Clustal X version 2.0.

    Science.gov (United States)

    Larkin, M A; Blackshields, G; Brown, N P; Chenna, R; McGettigan, P A; McWilliam, H; Valentin, F; Wallace, I M; Wilm, A; Lopez, R; Thompson, J D; Gibson, T J; Higgins, D G

    2007-11-01

    The Clustal W and Clustal X multiple sequence alignment programs have been completely rewritten in C++. This will facilitate the further development of the alignment algorithms in the future and has allowed proper porting of the programs to the latest versions of Linux, Macintosh and Windows operating systems. The programs can be run on-line from the EBI web server: http://www.ebi.ac.uk/tools/clustalw2. The source code and executables for Windows, Linux and Macintosh computers are available from the EBI ftp site ftp://ftp.ebi.ac.uk/pub/software/clustalw2/

  3. Research on universal combinatorial coding.

    Science.gov (United States)

    Lu, Jun; Zhang, Zhuo; Mo, Juan

    2014-01-01

    The concept of universal combinatorial coding is proposed. Relations exist, to a greater or lesser degree, among many coding methods, which suggests that a universal coding method objectively exists and can serve as a bridge connecting them. Universal combinatorial coding is lossless and is based on combinatorics theory. Its combinatorial and exhaustive properties make it closely related to existing coding methods. Universal combinatorial coding does not depend on the probability statistics of the information source, and it has characteristics spanning all three coding branches. The relationship between universal combinatorial coding and a variety of coding methods is analyzed, and several application technologies of this coding method are investigated. In addition, the efficiency of universal combinatorial coding is analyzed theoretically. The multiple characteristics and applications of universal combinatorial coding are unique among existing coding methods. Universal combinatorial coding has both theoretical research and practical application value.
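    One concrete, classical instance of combinatorics-based lossless coding is enumerative coding with the combinatorial number system: a binary word of fixed weight k is represented by its rank among all words of that weight. The sketch below is a standard textbook illustration of this idea, not the authors' universal combinatorial code.

```python
from math import comb

def rank(positions):
    """Rank of a combination (sorted positions of the 1-bits) in the
    combinatorial number system: rank = sum over i of C(c_i, i+1)."""
    return sum(comb(c, i + 1) for i, c in enumerate(positions))

def unrank(r, k):
    """Inverse mapping: recover the k sorted 1-bit positions from the rank
    by greedily taking the largest binomial coefficient not exceeding r."""
    positions = []
    for i in range(k, 0, -1):
        c = i - 1
        while comb(c + 1, i) <= r:     # largest c with C(c, i) <= r
            c += 1
        r -= comb(c, i)
        positions.append(c)
    return positions[::-1]             # ascending order
```

    An n-bit word of weight k is thus coded losslessly in about log2 C(n, k) bits instead of n, with no dependence on source probability statistics.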

  4. Neutrino Mass Seesaw Version 3: Recent Developments

    CERN Document Server

    Ma, Ernest

    2009-01-01

    The origin of neutrino mass is usually attributed to a seesaw mechanism, either through a heavy Majorana fermion singlet (version 1) or a heavy scalar triplet (version 2). Recently, the idea of using a heavy Majorana fermion triplet (version 3) has gained some attention. This is a review of the basic idea involved, its U(1) gauge extension, and some recent developments.

  5. Safety Code A12

    CERN Multimedia

    SC Secretariat

    2005-01-01

    Please note that the Safety Code A12 (Code A12) entitled "THE SAFETY COMMISSION (SC)" is available on the web at the following url: https://edms.cern.ch/document/479423/LAST_RELEASED Paper copies can also be obtained from the SC Unit Secretariat, e-mail: sc.secretariat@cern.ch SC Secretariat

  6. Dress Codes for Teachers?

    Science.gov (United States)

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  7. Nuremberg code turns 60

    OpenAIRE

    Thieren, Michel; Mauron, Alex

    2007-01-01

    This month marks sixty years since the Nuremberg code – the basic text of modern medical ethics – was issued. The principles in this code were articulated in the context of the Nuremberg trials in 1947. We would like to use this anniversary to examine its ability to address the ethical challenges of our time.

  8. Pseudonoise code tracking loop

    Science.gov (United States)

    Laflame, D. T. (Inventor)

    1980-01-01

    A delay-locked loop is presented for tracking a pseudonoise (PN) reference code in an incoming communication signal. The loop is less sensitive to gain imbalances, which can otherwise introduce timing errors in the PN reference code formed by the loop.
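    The tracking described above relies on the sharp periodic autocorrelation of PN sequences: the loop's early and late correlators straddle the single correlation peak, and their difference drives the timing error toward zero. A small sketch shows this two-valued autocorrelation for a 31-chip m-sequence generated by a 5-stage LFSR; the tap choice (polynomial x^5 + x^3 + 1, a standard primitive trinomial) and seed are assumptions for illustration.

```python
def lfsr_msequence(taps=(4, 1), length=31, state=None):
    """Fibonacci LFSR. With feedback s[4] XOR s[1] (polynomial x^5 + x^3 + 1),
    a 5-stage register produces a maximal-length 31-chip PN sequence."""
    s = list(state or [1, 0, 0, 0, 0])    # any nonzero seed works
    out = []
    for _ in range(length):
        out.append(s[-1])                  # output the oldest bit
        fb = s[taps[0]] ^ s[taps[1]]       # feedback from the tapped stages
        s = [fb] + s[:-1]                  # shift, newest bit enters at the front
    return out

def autocorr(code, lag):
    """Periodic autocorrelation of the +/-1-mapped chip sequence."""
    n = len(code)
    chips = [1 if c else -1 for c in code]
    return sum(chips[i] * chips[(i + lag) % n] for i in range(n))
```

    For an m-sequence the autocorrelation equals N at zero lag and -1 at every other lag, so an early-late discriminator sees a single, unambiguous lock point per code period.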

  9. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Jan; Pries-Heje, Lene; Dahlgaard, Bente

    2013-01-01

    is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  11. READING A NEURAL CODE

    NARCIS (Netherlands)

    BIALEK, W; RIEKE, F; VANSTEVENINCK, RRD; WARLAND, D

    1991-01-01

    Traditional approaches to neural coding characterize the encoding of known stimuli in average neural responses. Organisms face nearly the opposite task - extracting information about an unknown time-dependent stimulus from short segments of a spike train. Here the neural code was characterized from

  12. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    , Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...

  13. Embrittlement data base, version 1

    Energy Technology Data Exchange (ETDEWEB)

    Wang, J.A.

    1997-08-01

    The aging and degradation of light-water-reactor (LWR) pressure vessels is of particular concern because of their relevance to plant integrity and the magnitude of the expected irradiation embrittlement. The radiation embrittlement of reactor pressure vessel (RPV) materials depends on many different factors such as flux, fluence, fluence spectrum, irradiation temperature, and preirradiation material history and chemical compositions. These factors must be considered to reliably predict pressure vessel embrittlement and to ensure the safe operation of the reactor. Based on embrittlement predictions, decisions must be made concerning operating parameters and issues such as low-leakage-fuel management, possible life extension, and the need for annealing the pressure vessel. Large amounts of data from surveillance capsules and test reactor experiments, comprising many different materials and different irradiation conditions, are needed to develop generally applicable damage prediction models that can be used for industry standards and regulatory guides. Version 1 of the Embrittlement Data Base (EDB) is such a comprehensive collection of data resulting from merging version 2 of the Power Reactor Embrittlement Data Base (PR-EDB). Fracture toughness data were also integrated into Version 1 of the EDB. For power reactor data, the current EDB lists the 1,029 Charpy transition-temperature shift data points, which include 321 from plates, 125 from forgings, 115 from correlation monitor materials, 246 from welds, and 222 from heat-affected-zone (HAZ) materials that were irradiated in 271 capsules from 101 commercial power reactors. For test reactor data, information is available for 1,308 different irradiated sets (352 from plates, 186 from forgings, 303 from correlation monitor materials, 396 from welds and 71 from HAZs) and 268 different irradiated plus annealed data sets.

  14. Transformation invariant sparse coding

    DEFF Research Database (Denmark)

    Mørup, Morten; Schmidt, Mikkel Nørgaard

    2011-01-01

    Sparse coding is a well established principle for unsupervised learning. Traditionally, sparse coding extracts features at specific locations; often, however, an invariant representation is preferable. This paper introduces a general transformation invariant sparse coding (TISC) model. The model decomposes images into features invariant to location and general transformation by a set of specified operators, as well as a sparse coding matrix indicating where, and to what degree, these features are present in the original image. The TISC model is in general overcomplete, and we therefore invoke sparse coding to estimate its parameters. We demonstrate how the model can correctly identify components of non-trivial artificial as well as real image data. Thus, the model is capable of reducing feature redundancies in terms of pre-specified transformations, improving component identification.
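
    The sparse coding step invoked for parameter estimation is, in its plainest form, an l1-regularized reconstruction. A minimal sketch (plain ISTA on a toy dictionary, not the TISC model itself):

```python
import numpy as np

def ista(D, x, lam=0.1, steps=200):
    """Sparse code x ~ D @ s via iterative soft-thresholding (ISTA)."""
    L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
    s = np.zeros(D.shape[1])
    for _ in range(steps):
        g = D.T @ (D @ s - x)                  # gradient of 0.5*||x - D s||^2
        s = s - g / L
        s = np.sign(s) * np.maximum(np.abs(s) - lam / L, 0.0)  # soft threshold
    return s

D = np.eye(4)                                  # trivial dictionary: identity
x = np.array([1.0, 0.0, 0.0, 0.5])
s = ista(D, x, lam=0.1)                        # shrinks each coefficient by lam
```

With the identity dictionary the solution is just the soft-thresholded input, which makes the sparsifying effect easy to inspect.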

  15. The SIFT Code Specification

    Science.gov (United States)

    1983-01-01

    The specification of Software Implemented Fault Tolerance (SIFT) consists of two parts: the specifications of the SIFT models and the specifications of the SIFT PASCAL program which actually implements the SIFT system. The code specifications are the last of a hierarchy of models describing the operation of the SIFT system and are related to the SIFT models as well as the PASCAL program. These specifications serve to link the SIFT models to the running program. The specifications are very large and detailed and closely follow the form and organization of the PASCAL code. In addition to describing each of the components of the SIFT code, the code specifications describe the assumptions of the upper SIFT models which are required to actually prove that the code will work as specified. These constraints are imposed primarily on the schedule tables.

  16. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated code, etc.). The presentation discusses code as the artist’s material and, further, formulates a critique of Cramer: the seductive magic in computer-generated art does not lie in the magical expression, but nor does it lie in the code/material/text itself; it lies in the nature of code to do something – as if it were magic. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” The presentation analyses the Live Coding performances of Slub, where they program computer music live.

  17. Combustion chamber analysis code

    Science.gov (United States)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-05-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  18. Astrophysics Source Code Library

    CERN Document Server

    Allen, Alice; Berriman, Bruce; Hanisch, Robert J; Mink, Jessica; Teuben, Peter J

    2012-01-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, is a free on-line registry for source codes of interest to astronomers and astrophysicists. The library is housed on the discussion forum for Astronomy Picture of the Day (APOD) and can be accessed at http://ascl.net. The ASCL has a comprehensive listing that covers a significant number of the astrophysics source codes used to generate results published in or submitted to refereed journals and continues to grow. The ASCL currently has entries for over 500 codes; its records are citable and are indexed by ADS. The editors of the ASCL and members of its Advisory Committee were on hand at a demonstration table in the ADASS poster room to present the ASCL, accept code submissions, show how the ASCL is starting to be used by the astrophysics community, and take questions on and suggestions for improving the resource.

  19. User's manual for the CDC-6600 version of AMPX

    Energy Technology Data Exchange (ETDEWEB)

    Pickard, P. S.; Vandevender, W. H.

    1977-11-01

    A CDC-6600 version of AMPX-I, the Oak Ridge National Laboratory's modular code system for generating coupled multigroup neutron-gamma libraries from ENDF/B data, is described. The AMPX modules can generate multigroup neutron cross sections; generate multigroup gamma cross sections; generate gamma yields for gamma-producing neutron interactions; combine neutron cross sections, gamma cross sections, and gamma yields into ''coupled sets;'' perform one-dimensional discrete-ordinates-transport or diffusion-theory calculations for neutrons and gammas and, on option, collapse the cross sections to broad-group structure, with the one-dimensional results used as weighting functions; and output multigroup cross sections in convenient formats for other codes. 7 figures, 4 tables.
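
    The broad-group collapse mentioned above is a flux-weighted average over the fine groups g belonging to each broad group G: sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g). A hedged sketch with toy numbers (not AMPX data formats):

```python
def collapse(sigma, flux, groups):
    """Flux-weighted collapse of fine-group cross sections.
    `groups` lists the fine-group index range (lo, hi) of each broad group."""
    out = []
    for lo, hi in groups:
        num = sum(sigma[g] * flux[g] for g in range(lo, hi))
        den = sum(flux[g] for g in range(lo, hi))
        out.append(num / den)
    return out

sigma = [10.0, 8.0, 2.0, 1.0]   # fine-group cross sections (barns)
flux  = [1.0, 3.0, 2.0, 2.0]    # weighting flux (here: a 1-D solution)
print(collapse(sigma, flux, [(0, 2), (2, 4)]))  # → [8.5, 1.5]
```

This is why the one-dimensional transport or diffusion results are computed first: they supply the weighting flux.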

  20. The use of diagnostic coding in chiropractic practice

    DEFF Research Database (Denmark)

    Testern, Cecilie D; Hestbæk, Lise; French, Simon D

    2015-01-01

    …of chiropractors about diagnostic coding and to explore its use in a chiropractic setting. A secondary aim was to compare the diagnostic coding undertaken by chiropractors and an independent coder. METHOD: A coding exercise based on the International Classification of Primary Care version 2, PLUS extension (ICPC-2 PLUS)… The level of agreement between the chiropractors and the coder was determined, and Cohen's Kappa was used to determine the agreement beyond that expected by chance. RESULTS: From the interviews, the three emerging themes were: 1) advantages and disadvantages of using a clinical coding system in chiropractic practice, 2) ICPC-2 PLUS terminology issues for chiropractic practice and 3) implementation of a coding system into chiropractic practice. The participating chiropractors did not uniformly support or condemn the idea of using diagnostic coding. However, there was a strong agreement that the terminology in ICPC-2…
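
    Cohen's Kappa, used here to measure agreement beyond chance, is kappa = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e the agreement expected from each coder's marginal code frequencies. A small sketch (the code labels are made up for illustration, not actual ICPC-2 PLUS terms):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement beyond chance between two coders over the same items."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # chance agreement from the product of marginal frequencies per code
    p_exp = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

a = ["L84", "L03", "L84", "A01", "L84", "L03"]   # chiropractor's codes
b = ["L84", "L03", "L03", "A01", "L84", "L84"]   # independent coder's codes
print(round(cohens_kappa(a, b), 3))  # → 0.455
```

Kappa of 1 is perfect agreement, 0 is chance-level agreement.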

  1. Validation of the G-PASS code : status report.

    Energy Technology Data Exchange (ETDEWEB)

    Vilim, R. B.; Nuclear Engineering Division

    2009-03-12

    Validation is the process of determining whether the models in a computer code can describe the important phenomena in applications of interest. This report describes past work and proposed future work for validating the Gas Plant Analyzer and System Simulator (G-PASS) code. The G-PASS code was developed for simulating gas reactor and chemical plant system behavior during operational transients and upset events. Results are presented comparing code properties, individual component models, and integrated system behavior against results from four other computer codes. Also identified are two experiment facilities nearing completion that will provide additional data for individual component and integrated system model validation. The main goal of the validation exercise is to ready a version of G-PASS for use as a tool in evaluating vendor designs and providing guidance to vendors on design directions in nuclear-hydrogen applications.

  2. Error-Correcting Codes for Reliable Communications in Microgravity Platforms

    CERN Document Server

    Filho, Décio L Gazzoni; Tosin, Marcelo C; Granziera, Francisco

    2012-01-01

    The PAANDA experiment was conceived to characterize the acceleration environment of a rocket-launched microgravity platform, especially the microgravity phase. The recorded data were transmitted to ground stations, with loss of the telemetry information sent during the reentry period. Traditionally, an error-correcting code for this channel consists of a block code with very large block size to protect against long periods of data loss. Instead, we propose the use of digital fountain codes along with conventional Reed-Solomon block codes to protect against long and short burst error periods, respectively. Aiming to use this approach in a second version of PAANDA to prevent data corruption, we propose a model for the communication channel based on information extracted from Cumã II's telemetry data, and simulate the performance of our proposed error-correcting code under this channel model. Simulation results show that nearly all telemetry data can be recovered, including data from the reentry period.
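
    The fountain-code half of that pairing can be sketched at toy scale: each encoded packet is the XOR of a subset of source blocks, and a "peeling" decoder repeatedly resolves any packet whose set of still-unknown blocks has size one. This illustrates the principle only, not the PAANDA design:

```python
def peel_decode(packets, n_blocks):
    """Fountain-style peeling decoder.
    Each packet is (list_of_source_indices, xor_of_those_blocks)."""
    out = [None] * n_blocks
    progress = True
    while progress:
        progress = False
        for idx, val in packets:
            unknown = [i for i in idx if out[i] is None]
            if len(unknown) != 1:
                continue                    # not yet resolvable (or already done)
            for i in idx:                   # strip already-recovered blocks
                if out[i] is not None:
                    val ^= out[i]
            out[unknown[0]] = val
            progress = True
    return out

# Three source blocks, three received XOR combinations.
packets = [([0], 3), ([0, 1], 3 ^ 5), ([0, 1, 2], 3 ^ 5 ^ 9)]
print(peel_decode(packets, 3))  # → [3, 5, 9]
```

Because any sufficiently large set of packets will do, lost packets simply do not matter, which is what makes the rateless approach attractive for long outage periods such as reentry.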

  3. NAEYC Code of Ethical Conduct. Revised = Codigo de Conducta Etica. Revisada

    Science.gov (United States)

    National Association of Elementary School Principals (NAESP), 2005

    2005-01-01

    This document presents a code of ethics for early childhood educators that offers guidelines for responsible behavior and sets forth a common basis for resolving ethical dilemmas encountered in early education. It represents the English and Spanish versions of the revised code. Its contents were approved by the NAEYC Governing Board in April 2005…

  4. Structural Analysis and Visualization of C++ Code Evolution using Syntax Trees

    NARCIS (Netherlands)

    Chevalier, Fanny; Auber, David; Telea, Alexandru

    2007-01-01

    We present a method to detect and visualize evolution patterns in C++ source code. Our method consists of three steps. First, we extract an annotated syntax tree (AST) from each version of a given C++ source code. Next, we hash the extracted syntax nodes based on a metric combining structure and type…
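
    The node-hashing idea can be sketched compactly. Python's ast module stands in here for a C++ parser, and the fingerprint (node type plus child node types) is an assumption for illustration, not the paper's metric:

```python
import ast
import hashlib

def node_fingerprint(node):
    """Hash a syntax node on its own type and its children's types, so
    structurally identical fragments collide across versions even when
    identifiers differ."""
    parts = [type(node).__name__]
    parts += [type(c).__name__ for c in ast.iter_child_nodes(node)]
    return hashlib.sha1("|".join(parts).encode()).hexdigest()[:12]

v1 = ast.parse("x = a + b")
v2 = ast.parse("y = c + d")   # same shape, different identifiers
f1 = {node_fingerprint(n) for n in ast.walk(v1)}
f2 = {node_fingerprint(n) for n in ast.walk(v2)}
print(f1 == f2)  # → True: identical structural fingerprints
```

Matching fingerprints between two versions mark unchanged subtrees; the remaining nodes are where the evolution happened.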

  6. VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system

    Energy Technology Data Exchange (ETDEWEB)

    Shapiro, A.; Huria, H.C.; Cho, K.W. (Cincinnati Univ., OH (United States))

    1991-12-01

    VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.
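
    The multigroup diffusion solutions VENTURE/PC produces rest on inner flux iterations and an outer eigenvalue (power) iteration. A one-group, 1-D slab toy version, hedged as a sketch of the iteration structure rather than anything in the code itself:

```python
import numpy as np

def slab_k_eff(n=50, length=100.0, D=1.0, sig_a=0.01, nu_sig_f=0.012):
    """Power iteration for the one-group 1-D slab diffusion eigenvalue,
    zero-flux boundaries, uniform mesh with n interior nodes."""
    h = length / (n + 1)
    # Loss operator -D d2/dx2 + sig_a, finite-differenced (tridiagonal)
    A = (np.diag(np.full(n, 2.0 * D / h**2 + sig_a))
         + np.diag(np.full(n - 1, -D / h**2), 1)
         + np.diag(np.full(n - 1, -D / h**2), -1))
    phi = np.ones(n) / n
    k = 1.0
    for _ in range(300):
        phi_new = np.linalg.solve(A, nu_sig_f * phi / k)   # "inner" solve
        k *= phi_new.sum() / phi.sum()                     # "outer" k update
        phi = phi_new / phi_new.sum()                      # normalize flux
    return k

k = slab_k_eff()   # ~ nu_sig_f / (sig_a + D * (pi/length)^2) ≈ 1.09
```

Keeping the flux iterations in memory, as Version 2 does, speeds up exactly this inner-solve loop.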

  7. UNICOS Evolution: CPC Version 6

    CERN Document Server

    Blanco Vinuela, E; Bradu, B; Durand, Ph; Fernandez Adiego, B; Izquierdo Rosas, S; Merezhin, A; Ortola Vidal, J; Rochez, J; Willeman, D

    2011-01-01

    The UNICOS (UNified Industrial COntrol System) framework was created back in 1998. Since then a noticeable number of applications in different domains have used this framework. Furthermore, UNICOS has been formalized and its supervision layer has been reused in other kinds of applications (e.g. monitoring or supervisory tasks) where a control layer is not necessarily UNICOS oriented. The process control package has been reformulated as the UNICOS CPC package (Continuous Process Control) and a reengineering process has been followed. The drivers behind these noticeable changes were (1) being able to upgrade to newer, more performant IT technologies in the automatic code generation, (2) being flexible enough to create new additional device types to cope with other needs (e.g. vacuum or cooling and ventilation applications) without major impact on the framework or the PLC code baselines, and (3) enhancing the framework with new functionalities (e.g. recipes). This publication addresses the motivation, changes, new functionalities…

  8. Self-Inverse Interleavers for Turbo Codes

    CERN Document Server

    Sakzad, Amin; Panario, Daniel; Eshghi, Nasim

    2010-01-01

    In this work we introduce and study a set of new interleavers based on permutation polynomials and functions with known inverses over a finite field $\mathbb{F}_q$ for use in turbo code structures. We use Monomial, Dickson, Möbius and Rédei functions in order to get new interleavers. In addition we employ Skolem sequences in order to find new interleavers with known cycle structure. As a byproduct we give an exact formula for the inverse of every Rédei function. The cycle structure of Rédei functions is also investigated. Finally, self-inverse versions of permutation functions are used to construct interleavers. These interleavers are their own de-interleavers and are useful for turbo coding and turbo decoding. Experiments carried out for self-inverse interleavers constructed using these kinds of permutation polynomials and functions show excellent agreement with our theoretical results.
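
    The self-inverse construction can be illustrated with the simplest family, monomial permutations x → x^e over a prime field: the map is its own inverse whenever e² ≡ 1 (mod p − 1), since then π(π(x)) = x^(e²) = x by Fermat's little theorem. A toy check (p = 17, e = 7; not a code-grade interleaver length):

```python
def monomial_interleaver(p, e):
    """Permutation pi(x) = x^e mod p over {0, ..., p-1}.
    Self-inverse when e*e ≡ 1 (mod p-1)."""
    assert (e * e) % (p - 1) == 1, "exponent must be self-inverse mod p-1"
    return [pow(x, e, p) for x in range(p)]

pi = monomial_interleaver(17, 7)                 # 7*7 = 49 ≡ 1 (mod 16)
deinterleaved = [pi[pi[i]] for i in range(17)]   # apply it twice
print(deinterleaved == list(range(17)))          # → True: its own de-interleaver
```

Using the same table for interleaving and de-interleaving halves the memory and lookup logic in a turbo codec, which is the practical appeal of the self-inverse families.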

  9. Vector lifting schemes for stereo image coding.

    Science.gov (United States)

    Kaaniche, Mounir; Benazza-Benyahia, Amel; Pesquet-Popescu, Béatrice; Pesquet, Jean-Christophe

    2009-11-01

    Many research efforts have been devoted to the improvement of stereo image coding techniques for storage or transmission. In this paper, we are mainly interested in lossy-to-lossless coding schemes for stereo images allowing progressive reconstruction. The most commonly used approaches for stereo compression are based on disparity compensation techniques. The basic principle involved in this technique first consists of estimating the disparity map. Then, one image is considered as a reference and the other is predicted in order to generate a residual image. In this paper, we propose a novel approach, based on vector lifting schemes (VLS), which offers the advantage of generating two compact multiresolution representations of the left and the right views. We present two versions of this new scheme. A theoretical analysis of the performance of the considered VLS is also conducted. Experimental results indicate a significant improvement using the proposed structures compared with conventional methods.
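
    The lifting machinery underneath such schemes is compact: split into even/odd samples, predict the odd samples from the even ones, update the even ones, and recover the signal exactly by running the steps backwards. A scalar Haar-style sketch (the paper's vector lifting across the two stereo views is more elaborate):

```python
def lifting_forward(x):
    """One level of a Haar-style lifting scheme:
    split into even/odd, predict odd from even, update even."""
    even, odd = x[0::2], x[1::2]
    detail = [o - e for e, o in zip(even, odd)]          # predict step
    approx = [e + d / 2 for e, d in zip(even, detail)]   # update step
    return approx, detail

def lifting_inverse(approx, detail):
    """Undo the steps in reverse order: perfect reconstruction."""
    even = [a - d / 2 for a, d in zip(approx, detail)]
    odd = [e + d for e, d in zip(even, detail)]
    out = []
    for e, o in zip(even, odd):
        out += [e, o]
    return out

x = [4.0, 6.0, 5.0, 9.0]
a, d = lifting_forward(x)
print(lifting_inverse(a, d) == x)  # → True
```

Because every lifting step is trivially invertible, lossy-to-lossless coding with progressive reconstruction comes almost for free, which is what the vector lifting scheme exploits jointly for the left and right views.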

  10. Comparative evaluation of the impact of WRF/NMM and WRF/ARW meteorology on CMAQ simulations for PM2.5 and its related precursors during the 2006 TexAQS/GoMACCS study

    Directory of Open Access Journals (Sweden)

    S. T. Rao

    2012-05-01

    This study presents a comparative evaluation of the impact of WRF-NMM and WRF-ARW meteorology on CMAQ simulations of PM₂.₅, its composition and related precursors over the eastern United States, using the intensive observations obtained by aircraft (NOAA WP-3), ship and surface monitoring networks (AIRNow, IMPROVE, CASTNet and STN) during the 2006 TexAQS/GoMACCS study. The results at the AIRNow surface sites show that both ARW-CMAQ and NMM-CMAQ reproduced day-to-day variations of observed PM₂.₅ and captured the majority of observed PM₂.₅ within a factor of 2, with a NMB value of −0.4% for ARW-CMAQ and −18% for NMM-CMAQ. Both models performed much better at the urban sites than at the rural sites, with greater underpredictions at the rural sites. Both models consistently underestimated the observed PM₂.₅ at the rural IMPROVE sites, by −1% for the ARW-CMAQ and −19% for the NMM-CMAQ. The greater underestimations of SO₄²⁻, OC and EC by the NMM-CMAQ contributed to increased underestimation of PM₂.₅ at the IMPROVE sites. The NMB values for PM₂.₅ at the STN urban sites are 15% and −16% for the ARW-CMAQ and NMM-CMAQ, respectively. The underestimation of PM₂.₅ at the STN sites by the NMM-CMAQ mainly results from the underestimations of the SO₄²⁻, NH₄⁺ and TCM components, whereas the overestimation of PM₂.₅ at the STN sites by the ARW-CMAQ results from the overestimations of SO₄²⁻, NO₃⁻, and NH₄⁺. The comparison with WP-3 aircraft measurements reveals that both ARW-CMAQ and NMM-CMAQ have very similar model performance for vertical profiles of PM₂.₅ chemical components (SO₄²⁻, NH₄⁺) and related gaseous species (HNO₃, SO₂, NH₃, isoprene, toluene, terpenes), as both models used the same chemical mechanisms and emissions. The ship results along the coast of southeastern Texas over the Gulf of Mexico show that both models captured the temporal variations and broad synoptic change seen in the observed HCHO and acetaldehyde, with mean NMB…
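
    The normalized mean bias (NMB) quoted throughout is NMB = Σ(model − obs) / Σ(obs), usually reported in percent. A quick sketch with made-up numbers:

```python
def normalized_mean_bias(model, obs):
    """NMB = sum(model - obs) / sum(obs); negative means underprediction."""
    return sum(m - o for m, o in zip(model, obs)) / sum(obs)

obs   = [10.0, 20.0, 30.0]   # e.g. observed PM2.5 (ug/m^3), illustrative only
model = [9.0, 18.0, 27.0]    # simulated values
print(round(100 * normalized_mean_bias(model, obs), 1))  # → -10.0 (%)
```

Unlike a mean of ratios, this statistic weights each pair by the magnitude of the observation, so high-concentration episodes dominate the score.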

  11. Comparative evaluation of the impact of WRF/NMM and WRF/ARW meteorology on CMAQ simulations for PM2.5 and its related precursors during the 2006 TexAQS/GoMACCS study

    Directory of Open Access Journals (Sweden)

    S. T. Rao

    2011-12-01

    This study presents a comparative evaluation of the impact of WRF-NMM and WRF-ARW meteorology on CMAQ simulations of PM₂.₅, its composition and related precursors over the eastern United States, using the intensive observations obtained by aircraft (NOAA P-3), ship and surface monitoring networks (AIRNow, IMPROVE, CASTNet and STN) during the 2006 TexAQS/GoMACCS study. The results at the AIRNow surface sites show that both ARW-CMAQ and NMM-CMAQ reproduced day-to-day variations of observed PM₂.₅ and captured the majority of observed PM₂.₅ within a factor of 2, with a NMB value of −0.4% for ARW-CMAQ and −18% for NMM-CMAQ. Both models performed much better at the urban sites than at the rural sites, with greater underpredictions at the rural sites. Both models consistently underestimated the observed PM₂.₅ at the rural IMPROVE sites, by −1% for the ARW-CMAQ and −19% for the NMM-CMAQ. The greater underestimations of SO₄²⁻, OC and EC by the NMM-CMAQ contributed to increased underestimation of PM₂.₅ at the IMPROVE sites. The NMB values for PM₂.₅ at the STN urban sites are 15% and −16% for the ARW-CMAQ and NMM-CMAQ, respectively. The underestimation of PM₂.₅ at the STN sites by the NMM-CMAQ mainly results from the underestimations of the SO₄²⁻, NH₄⁺ and TCM components, whereas the overestimation of PM₂.₅ at the STN sites by the ARW-CMAQ results from the overestimations of SO₄²⁻, NO₃⁻, and NH₄⁺. The comparison with P-3 aircraft measurements reveals that both ARW-CMAQ and NMM-CMAQ have very similar model performance for vertical profiles of PM₂.₅ chemical components (SO₄²⁻, NH₄⁺) and related gaseous species (HNO₃, SO₂, NH₃, isoprene, toluene, terpenes), as both models used the same chemical mechanisms and emissions. The ship results along the coast of southeastern Texas over the Gulf of Mexico show that both models captured the temporal variations and broad synoptic change seen in the observed HCHO and acetaldehyde, with mean NMB…

  12. SPAM- SPECTRAL ANALYSIS MANAGER (UNIX VERSION)

    Science.gov (United States)

    Solomon, J. E.

    1994-01-01

    machine environments. There is a DEC VAX/VMS version with a central memory requirement of approximately 242K of 8 bit bytes and a machine independent UNIX 4.2 version. The display device currently supported is the Raster Technologies display processor. Other 512 x 512 resolution color display devices, such as De Anza, may be added with minor code modifications. This program was developed in 1986.

  13. User's Guide for REFoffSpec Version 1.5.4

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Richard C [ORNL; Bilheux, Jean-Christophe [ORNL; Lauter, Valeria [ORNL; Ambaye, Haile Arena [ORNL

    2012-09-01

    This document is a user's guide for the IDL software REFoffSpec version 1.5.4, whose purpose is to aggregate for analysis NeXus data files from the magnetism and liquids reflectometer experiments at the Oak Ridge National Laboratory Spallation Neutron Source. The software is used to scale and align multiple data files that constitute a continuous set for an experimental run. The user's guide explains the process step by step using a specific example run. Output screens are provided to orient the user at each step. The guide documents in detail changes made to the original REFoffSpec code between November 2009 and January 2011. At the time of the completion of this version of the code, it was accessible from the sns_tools interface as a beta version.

  14. HPSim, Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    2016-08-23

    HPSim is a GPU-accelerated online multi-particle beam dynamics simulation tool for ion linacs. It was originally developed for use on the Los Alamos 800-MeV proton linac. It is a “z-code” that contains typical linac beam transport elements. The linac RF-gap transformation utilizes transit-time factors to calculate the beam acceleration therein. The space-charge effects are computed using the 2D SCHEFF (Space CHarge EFFect) algorithm, which calculates the radial and longitudinal space charge forces for cylindrically symmetric beam distributions. Other space-charge routines to be incorporated include the 3D PICNIC and a 3D Poisson solver. HPSim can simulate beam dynamics in drift tube linacs (DTLs) and coupled cavity linacs (CCLs). Elliptical superconducting cavity (SC) structures will also be incorporated into the code. The computational core of the code is written in C++ and accelerated using the NVIDIA CUDA technology. Users access the core code, which is wrapped in Python/C APIs, via Python scripts that enable ease-of-use and automation of the simulations. The overall linac description, including the EPICS PV machine control parameters, is kept in an SQLite database that also contains calibration and conversion factors required to transform the machine set points into model values used in the simulation.

  15. Efficient Management of Biomedical Ontology Versions

    Science.gov (United States)

    Kirsten, Toralf; Hartung, Michael; Groß, Anika; Rahm, Erhard

    Ontologies have become very popular in life sciences and other domains. They mostly undergo continuous changes and new ontology versions are frequently released. However, current analysis studies do not consider the ontology changes reflected in different versions but typically limit themselves to a specific ontology version which may quickly become obsolete. To allow applications easy access to different ontology versions we propose a central and uniform management of the versions of different biomedical ontologies. The proposed database approach takes concept and structural changes of succeeding ontology versions into account thereby supporting different kinds of change analysis. Furthermore, it is very space-efficient by avoiding redundant storage of ontology components which remain unchanged in different versions. We evaluate the storage requirements and query performance of the proposed approach for the Gene Ontology.
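
    The space-saving idea, keeping each unchanged component once with a validity interval rather than copying it into every released version, can be sketched as follows (a toy in-memory stand-in for the paper's database schema; the accession strings are illustrative):

```python
class VersionedOntology:
    """Space-efficient version store: each concept is held once with a
    [first_version, last_version] validity interval instead of being
    duplicated into every ontology release."""

    def __init__(self):
        self.rows = []  # [concept, first_version, last_version or None]

    def add(self, concept, version):
        self.rows.append([concept, version, None])   # open-ended validity

    def retire(self, concept, version):
        for row in self.rows:
            if row[0] == concept and row[2] is None:
                row[2] = version - 1   # last release it was still valid in

    def snapshot(self, version):
        """Reconstruct the concept set of any past release on demand."""
        return {c for c, first, last in self.rows
                if first <= version and (last is None or version <= last)}

o = VersionedOntology()
o.add("GO:0001", 1)
o.add("GO:0002", 1)
o.retire("GO:0002", 3)        # concept removed in release 3
o.add("GO:0003", 3)
print(sorted(o.snapshot(2)))  # → ['GO:0001', 'GO:0002']
print(sorted(o.snapshot(3)))  # → ['GO:0001', 'GO:0003']
```

Change analysis between releases then reduces to comparing interval endpoints rather than diffing full copies.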

  16. Embedded foveation image coding.

    Science.gov (United States)

    Wang, Z; Bovik, A C

    2001-01-01

    The human visual system (HVS) is highly space-variant in sampling, coding, processing, and understanding. The spatial resolution of the HVS is highest around the point of fixation (foveation point) and decreases rapidly with increasing eccentricity. By taking advantage of this fact, it is possible to remove considerable high-frequency information redundancy from the peripheral regions and still reconstruct a perceptually good quality image. Great success has been obtained previously by a class of embedded wavelet image coding algorithms, such as the embedded zerotree wavelet (EZW) and the set partitioning in hierarchical trees (SPIHT) algorithms. Embedded wavelet coding not only provides very good compression performance, but also has the property that the bitstream can be truncated at any point and still be decoded to recreate a reasonably good quality image. In this paper, we propose an embedded foveation image coding (EFIC) algorithm, which orders the encoded bitstream to optimize foveated visual quality at arbitrary bit-rates. A foveation-based image quality metric, namely, foveated wavelet image quality index (FWQI), plays an important role in the EFIC system. We also developed a modified SPIHT algorithm to improve the coding efficiency. Experiments show that EFIC integrates foveation filtering with foveated image coding and demonstrates very good coding performance and scalability in terms of foveated image quality measurement.

  17. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase the number of dimensions seen by the network using a linear mapping. Receivers can trade off computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof.

  18. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  19. NEW BINARY USER CODES FOR DS CDMA COMMUNICATION

    Directory of Open Access Journals (Sweden)

    NEELAM SRIVASTAVA

    2011-12-01

    Spread spectrum (SS) is a modulation technique in which the signal occupies a bandwidth much larger than the minimum necessary to send the information. A synchronized reception with the code at the receiver is used for despreading the information before data recovery. For a long period, Walsh codes and Gold codes have been used as spread spectrum codes in Code Division Multiple Access (CDMA) communications because of their ease of generation rather than their efficiency. Walsh codes are perfectly orthogonal binary user codes that have many popular applications in synchronous multicarrier communications, although they perform poorly for asynchronous multi-user communications. Therefore, the nearly orthogonal Gold codes, with their superior performance, are the preferred user codes in asynchronous CDMA communications with a small number of simultaneous users in the system, due to their good auto-correlation (intra-code) and cross-correlation (inter-code) properties. A major drawback of these codes is that they are limited in number and in their lengths. In this paper, we implemented a MATLAB (version 7.1) algorithm to obtain new orthogonal sets of binary user codes for multiuser spread-spectrum communications. We compared their performance with existing code families such as the Gold and Walsh codes. Our comparisons include their time-domain properties, such as auto- and cross-correlations, along with bit error rate (BER) performances in additive white Gaussian noise (AWGN) and Rayleigh channels for synchronous and asynchronous DS-CDMA communications. It is shown that these codes outperform the Walsh codes significantly and closely match the performance of the popular nearly orthogonal Gold codes for asynchronous multiuser communications in AWGN noise. It is also shown that all of the binary code families considered performed comparably for Rayleigh flat-fading channels. So these new codes can be used both for asynchronous and synchronous direct sequence…
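
    The "perfectly orthogonal at zero lag" property of Walsh codes is easy to verify from the Sylvester-Hadamard construction; it is the shifted (asynchronous) correlations that break down. A quick check:

```python
import numpy as np

def walsh(n):
    """Sylvester-Hadamard construction: 2^n Walsh codes of length 2^n."""
    H = np.array([[1]])
    for _ in range(n):
        H = np.block([[H, H], [H, -H]])
    return H

W = walsh(3)                                 # 8 codes of length 8
corr = W @ W.T                               # all pairwise zero-lag correlations
print(np.array_equal(corr, 8 * np.eye(8)))   # → True: perfectly orthogonal
print(int(np.roll(W[1], 1) @ W[1]))          # → -8: poor shifted autocorrelation
```

The second line shows why Walsh codes fail in asynchronous use: a one-chip offset can turn zero correlation into the maximum possible magnitude, whereas Gold codes keep all shifted correlations small.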

  20. Application of RS Codes in Decoding QR Code

    Institute of Scientific and Technical Information of China (English)

    Zhu Suxia(朱素霞); Ji Zhenzhou; Cao Zhiyan

    2003-01-01

    The QR Code is a 2-dimensional matrix code with high error correction capability. It employs RS codes to generate error correction codewords in encoding and recover errors and damages in decoding. This paper presents several QR Code's virtues, analyzes RS decoding algorithm and gives a software flow chart of decoding the QR Code with RS decoding algorithm.

  1. Evaluation Codes from an Affine Variety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particular nice examples of affine variety codes. Our study...

  2. Technical Support Document for Version 3.9.1 of the COMcheck Software

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan; Halverson, Mark A.; Lucas, Robert G.; Richman, Eric E.; Schultz, Robert W.; Winiarski, David W.

    2012-09-01

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC and version 3.9.0 support for 2000 and 2001 IECC are no longer included, but those sections remain in this document for reference purposes.

  3. Phase II evaluation of clinical coding schemes: completeness, taxonomy, mapping, definitions, and clarity. CPRI Work Group on Codes and Structures.

    Science.gov (United States)

    Campbell, J R; Carpenter, P; Sneiderman, C; Cohn, S; Chute, C G; Warren, J

    1997-01-01

    To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for "parent" and "child" codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mapping from the published scheme were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56, UMLS 3.17; READ 2.14, *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. 
READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%, *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%, *p < .004) associated with a loss of clarity

  4. Distributed multiple description coding

    CERN Document Server

    Bai, Huihui; Zhao, Yao

    2011-01-01

    This book examines distributed video coding (DVC) and multiple description coding (MDC), two novel techniques designed to address the problems of conventional image and video compression coding. Covering all fundamental concepts and core technologies, the chapters can also be read as independent and self-sufficient, describing each methodology in sufficient detail to enable readers to repeat the corresponding experiments easily. Topics and features: provides a broad overview of DVC and MDC, from the basic principles to the latest research; covers sub-sampling based MDC, quantization based MDC,

  5. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  6. Coded MapReduce

    OpenAIRE

    Li, Songze; Maddah-Ali, Mohammad Ali; Avestimehr, A. Salman

    2015-01-01

    MapReduce is a commonly used framework for executing data-intensive jobs on distributed server clusters. We introduce a variant implementation of MapReduce, namely "Coded MapReduce", to substantially reduce the inter-server communication load for the shuffling phase of MapReduce, and thus accelerating its execution. The proposed Coded MapReduce exploits the repetitive mapping of data blocks at different servers to create coding opportunities in the shuffling phase to exchange (key,value) pair...
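
The coding opportunity described above can be shown on a toy instance (my own hypothetical 3-server setup, not the paper's general scheme): when each file is mapped at r = 2 servers, one XOR-coded broadcast delivers a missing intermediate value to two receivers at once.

```python
# Intermediate values v[(key, file)] produced in the map phase (made-up bytes).
v = {(2, "A"): 0x5A, (3, "B"): 0xC3}

# Server 1 mapped files A and B, so it can compute both values locally and
# multicast ONE coded packet to servers 2 and 3 instead of two unicasts.
coded = v[(2, "A")] ^ v[(3, "B")]

# Server 2 (mapped B, missing v[(2, "A")]) cancels the part it already knows:
recovered_by_2 = coded ^ v[(3, "B")]
# Server 3 (mapped A, missing v[(3, "B")]) does the same:
recovered_by_3 = coded ^ v[(2, "A")]

print(recovered_by_2 == v[(2, "A")], recovered_by_3 == v[(3, "B")])  # True True
```

Scaled up with the appropriate file splitting, this is how the repetitive mapping cuts the shuffle-phase communication load.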

  7. ABAREX -- A neutron spherical optical-statistical-model code -- A user`s manual

    Energy Technology Data Exchange (ETDEWEB)

    Smith, A.B. [ed.; Lawson, R.D.

    1998-06-01

    The contemporary version of the neutron spherical optical-statistical-model code ABAREX is summarized with the objective of providing detailed operational guidance for the user. The physical concepts involved are very briefly outlined. The code is described in some detail and a number of explicit examples are given. With this document one should very quickly become fluent with the use of ABAREX. While the code has operated on a number of computing systems, this version is specifically tailored for the VAX/VMS work station and/or the IBM-compatible personal computer.

  8. On the structure of Lattice code WIMSD-5B

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Won Young; Min, Byung Joo

    2004-03-15

    The WIMS-D code is a freely available thermal reactor physics lattice code widely used for thermal research and power reactor calculations. The code WIMS-AECL, developed on the basis of WIMS-D, has been used as one of the lattice codes for cell calculations in Canada, and in 1998 the latest version, WIMSD-5B, was released to the OECD/NEA Data Bank. WIMS-KAERI, also derived from WIMS-D, was developed and has been used in Korea, but it was tailored to cell calculations for the HANARO research reactor and therefore does not conform to CANDU reactors. Developing a code applicable to CANDU cell calculations is thus necessary, not only for technological independence but also for establishing a CANDU safety analysis system. The WIMSD-5B lattice code was analyzed in order to establish the system of reactor physics computer codes to be used in assessing the void reactivity effect. To improve and validate WIMSD-5B, its structure, algorithms, and subroutines were analyzed and presented for the cluster geometry and the Pij method used to model CANDU-6 fuel.

  9. Recent developments in KTF. Code optimization and improved numerics

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Javier; Avramova, Maria; Sanchez, Victor Hugo; Ivanov, Kostadin [Karlsruhe Institute of Technology (KIT) (Germany). Inst. for Neutron Physics and Reactor Technology (INR)

    2012-11-01

    The rapid increase of computer power in the last decade has facilitated the development of high-fidelity simulations in nuclear engineering, allowing more realistic and accurate optimization as well as safety assessment of reactor cores and power plants compared to the legacy codes. Thermal-hydraulic subchannel codes together with time-dependent neutron transport codes are the options of choice for an accurate prediction of local safety parameters. Moreover, fast-running codes with the best physical models are needed for high-fidelity coupled thermal-hydraulic / neutron-kinetic solutions. Hence at KIT, different subchannel codes such as SUBCHANFLOW and KTF are being improved, validated and coupled with different neutron kinetics solutions. KTF is a subchannel code developed for best-estimate analysis of both Pressurized Water Reactors (PWR) and Boiling Water Reactors (BWR). It is based on the Pennsylvania State University (PSU) version of COBRA-TF (Coolant Boiling in Rod Arrays Two Fluids), named CTF. In this paper, the investigations devoted to enhancing the code's numerics and software structure are presented and discussed. Selected examples demonstrate the gain in code speed-up, and finally an outlook on further activities concentrating on code improvements is given. (orig.)

  10. THE MCNPX MONTE CARLO RADIATION TRANSPORT CODE

    Energy Technology Data Exchange (ETDEWEB)

    WATERS, LAURIE S. [Los Alamos National Laboratory; MCKINNEY, GREGG W. [Los Alamos National Laboratory; DURKEE, JOE W. [Los Alamos National Laboratory; FENSIN, MICHAEL L. [Los Alamos National Laboratory; JAMES, MICHAEL R. [Los Alamos National Laboratory; JOHNS, RUSSELL C. [Los Alamos National Laboratory; PELOWITZ, DENISE B. [Los Alamos National Laboratory

    2007-01-10

    MCNPX (Monte Carlo N-Particle eXtended) is a general-purpose Monte Carlo radiation transport code with three-dimensional geometry and continuous-energy transport of 34 particles and light ions. It contains flexible source and tally options, interactive graphics, and support for both sequential and multi-processing computer platforms. MCNPX is based on MCNP4B, and has been upgraded to most MCNP5 capabilities. MCNP is a highly stable code tracking neutrons, photons and electrons, and using evaluated nuclear data libraries for low-energy interaction probabilities. MCNPX has extended this base to a comprehensive set of particles and light ions, with heavy ion transport in development. Models have been included to calculate interaction probabilities when libraries are not available. Recent additions focus on the time evolution of residual nuclei decay, allowing calculation of transmutation and delayed particle emission. MCNPX is now a code of great dynamic range, and the excellent neutronics capabilities allow new opportunities to simulate devices of interest to experimental particle physics, particularly calorimetry. This paper describes the capabilities of the current MCNPX version 2.6.C, and also discusses ongoing code development.

  11. TRACK The New Beam Dynamics Code

    CERN Document Server

    Mustapha, Brahim; Ostroumov, Peter; Schnirman-Lessner, Eliane

    2005-01-01

    The new ray-tracing code TRACK was developed* to fulfill the special requirements of the RIA accelerator systems. The RIA lattice includes an ECR ion source, a LEBT containing a MHB and a RFQ followed by three SC linac sections separated by two stripping stations with appropriate magnetic transport systems. No available beam dynamics code meets all the necessary requirements for an end-to-end simulation of the RIA driver linac. The latest version of TRACK was used for end-to-end simulations of the RIA driver including errors and beam loss analysis.** In addition to the standard capabilities, the code includes the following new features: i) multiple charge states; ii) a realistic stripper model; iii) static and dynamic errors; iv) automatic steering to correct for misalignments; v) detailed beam-loss analysis; vi) parallel computing to perform large-scale simulations. Although primarily developed for simulations of the RIA machine, TRACK is a general beam dynamics code. Currently it is being used for the design and ...

  12. DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (MACINTOSH VERSION)

    Science.gov (United States)

    Rogers, J. L.

    1994-01-01

    effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.

  13. DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SUN VERSION)

    Science.gov (United States)

    Rogers, J. L.

    1994-01-01

    effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.

  14. New version: GRASP2K relativistic atomic structure package

    Science.gov (United States)

    Jönsson, P.; Gaigalas, G.; Bieroń, J.; Fischer, C. Froese; Grant, I. P.

    2013-09-01

    A revised version of GRASP2K [P. Jönsson, X. He, C. Froese Fischer, I.P. Grant, Comput. Phys. Commun. 177 (2007) 597] is presented. It supports earlier non-block and block versions of codes as well as a new block version in which the njgraf library module [A. Bar-Shalom, M. Klapisch, Comput. Phys. Commun. 50 (1988) 375] has been replaced by the librang angular package developed by Gaigalas based on the theory of [G. Gaigalas, Z.B. Rudzikas, C. Froese Fischer, J. Phys. B: At. Mol. Phys. 30 (1997) 3747, G. Gaigalas, S. Fritzsche, I.P. Grant, Comput. Phys. Commun. 139 (2001) 263]. Tests have shown that errors encountered by njgraf do not occur with the new angular package. The three versions are denoted v1, v2, and v3, respectively. In addition, in v3, the coefficients of fractional parentage have been extended to j=9/2, making calculations feasible for the lanthanides and actinides. Changes in v2 include minor improvements. For example, the new version of rci2 may be used to compute quantum electrodynamic (QED) corrections only from selected orbitals. In v3, a new program, jj2lsj, reports the percentage composition of the wave function in LSJ and the program rlevels has been modified to report the configuration state function (CSF) with the largest coefficient of an LSJ expansion. The bioscl2 and bioscl3 application programs have been modified to produce a file of transition data with one record for each transition in the same format as in ATSP2K [C. Froese Fischer, G. Tachiev, G. Gaigalas, M.R. Godefroid, Comput. Phys. Commun. 176 (2007) 559], which identifies each atomic state by the total energy and a label for the CSF with the largest expansion coefficient in LSJ intermediate coupling. All versions of the codes have been adapted for 64-bit computer architecture. Program Summary: Program title: GRASP2K, version 1_1 Catalogue identifier: ADZL_v1_1 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZL_v1_1.html Program obtainable from: CPC Program Library

  15. Structural Analysis and Visualization of C++ Code Evolution using Syntax Trees

    OpenAIRE

    Chevalier, Fanny; Auber, David; Telea, Alexandru

    2007-01-01

    We present a method to detect and visualize evolution patterns in C++ source code. Our method consists of three steps. First, we extract an annotated syntax tree (AST) from each version of a given C++ source code. Next, we hash the extracted syntax nodes based on a metric combining structure and type information, and construct matches (correspondences) between subtrees with similar hashes. Our technique detects code fragments which have not changed, or changed little, during ...
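
The hash-and-match step can be illustrated in miniature. This sketch is my own (Python's ast module standing in for a C++ parser, and a hash over node types only, a simplification of the authors' structure-and-type metric); it shows why structurally identical fragments match even after wholesale renaming:

```python
import ast

def subtree_hashes(tree: ast.AST) -> dict:
    """Map a structural hash (node types only, identifiers ignored) to AST nodes."""
    table = {}
    def visit(node):
        key = (type(node).__name__,
               tuple(visit(child) for child in ast.iter_child_nodes(node)))
        digest = hash(key)
        table.setdefault(digest, []).append(node)
        return digest
    visit(tree)
    return table

v1 = ast.parse("def f(a, b):\n    return a + b")
v2 = ast.parse("def g(x, y):\n    return x + y")   # renamed, same structure

# Subtrees with equal hashes are matching candidates across the two versions;
# identical structure matches even though every identifier was renamed.
common = subtree_hashes(v1).keys() & subtree_hashes(v2).keys()
print(len(common) > 0)  # True
```

A production tool would fold identifier and type information into the hash and tolerate small differences, but the bucket-by-hash idea is the same.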

  16. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)

  17. VT ZIP Code Areas

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  18. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase the number of dimensions seen by the network using a linear mapping. Receivers can trade off computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof.

  19. GOOGLE SUMMER OF CODE

    National Research Council Canada - National Science Library

    Leslie Hawthorn

    2008-01-01

      This article examines the Google Summer of Code (GSoC) program, the world's first global initiative to introduce College and University students to free/libre open source software (F/LOSS) development...

  20. Importance of Building Code

    Directory of Open Access Journals (Sweden)

    Reshmi Banerjee

    2015-06-01

    Full Text Available A building code, or building control, is a set of rules that specify the minimum standards for constructed objects such as buildings and non-building structures. The main purpose of building codes is to protect public health, safety and general welfare as they relate to the construction and occupancy of buildings and structures. The building code becomes law of a particular jurisdiction when formally enacted by the appropriate governmental or private authority. Building codes are generally intended to be applied by architects, engineers, constructors and regulators but are also used for various purposes by safety inspectors, environmental scientists, real estate developers, subcontractors, manufacturers of building products and materials, insurance companies, facility managers, tenants and others.

  1. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  2. Coded Random Access

    DEFF Research Database (Denmark)

    Paolini, Enrico; Stefanovic, Cedomir; Liva, Gianluigi

    2015-01-01

    The rise of machine-to-machine communications has rekindled the interest in random access protocols as a support for a massive number of uncoordinatedly transmitting devices. The legacy ALOHA approach is developed under a collision model, where slots containing collided packets are considered as waste. However, if the common receiver (e.g., base station) is capable of storing the collision slots and using them in a transmission recovery process based on successive interference cancellation, the design space for access protocols is radically expanded. We present the paradigm of coded random access, in which the structure of the access protocol can be mapped to the structure of an erasure-correcting code defined on a graph. This opens the possibility of using coding theory and tools for designing efficient random access protocols, offering markedly better performance than ALOHA. Several instances of coded...
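
The successive interference cancellation process can be sketched in a few lines (a toy frame of my own construction, not an example from the paper): a slot containing a single packet is decoded, and that user's replicas are then cancelled from every other slot, possibly exposing new singleton slots.

```python
# slot -> set of users whose packet replicas landed in that slot
slots = {0: {"u1", "u2"}, 1: {"u2"}, 2: {"u1", "u3"}, 3: {"u3", "u2"}}

decoded = set()
progress = True
while progress:
    progress = False
    for users in slots.values():
        if len(users) == 1:                  # singleton slot: decodable
            (u,) = users
            decoded.add(u)
            for other in slots.values():     # cancel u's replicas everywhere,
                other.discard(u)             # possibly creating new singletons
            progress = True
            break

print(sorted(decoded))  # ['u1', 'u2', 'u3']
```

This peeling process is exactly iterative erasure decoding on the bipartite user-slot graph, which is what lets coding theory predict and optimize the protocol's throughput.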

  3. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
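
The levelization property described here is just acyclicity of the "uses" graph, which is cheap to check mechanically. A minimal sketch with hypothetical package names (using Python's standard graphlib, not anything from the EAP tooling):

```python
from graphlib import CycleError, TopologicalSorter

# package -> packages it uses; the set is levelized iff this graph is a DAG
uses = {
    "app":     {"physics", "io"},
    "physics": {"mesh", "utils"},
    "io":      {"utils"},
    "mesh":    {"utils"},
    "utils":   set(),
}

try:
    order = list(TopologicalSorter(uses).static_order())
    print("levelized; build order:", order)   # 'utils' first, 'app' last
except CycleError as err:
    print("not levelized; cycle:", err.args[1])
```

The topological order is also a valid build order, which is why a levelized package set supports independent, parallel builds.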

  4. Annotated Raptor Codes

    CERN Document Server

    Mahdaviani, Kaveh; Tellambura, Chintha

    2011-01-01

    In this paper, an extension of raptor codes is introduced which keeps all the desirable properties of raptor codes, including the linear complexity of encoding and decoding per information bit, unchanged. The new design, however, improves the performance in terms of the reception rate. Our simulations show a 10% reduction in the needed overhead at the benchmark block length of 64,520 bits and with the same complexity per information bit.

  5. Methods, algorithms and computer codes for calculation of electron-impact excitation parameters

    CERN Document Server

    Bogdanovich, P; Stonys, D

    2015-01-01

    We describe the computer codes, developed at Vilnius University, for the calculation of electron-impact excitation cross sections, collision strengths, and excitation rates in the plane-wave Born approximation. These codes utilize the multireference atomic wavefunctions which are also adopted to calculate radiative transition parameters of complex many-electron ions. This leads to consistent data sets suitable for plasma modelling codes. Two versions of the electron scattering codes are considered in the present work, both of them employing the configuration interaction method for the inclusion of correlation effects and the Breit-Pauli approximation to account for relativistic effects. These versions differ only in their one-electron radial orbitals: the first employs non-relativistic numerical radial orbitals, while the other uses quasirelativistic radial orbitals. The accuracy of the produced results is assessed by comparing radiative transition and electron-impact excitation data for neutral hydrogen, helium...

  6. Space Images for NASA JPL Android Version

    Science.gov (United States)

    Nelson, Jon D.; Gutheinz, Sandy C.; Strom, Joshua R.; Arca, Jeremy M.; Perez, Martin; Boggs, Karen; Stanboli, Alice

    2013-01-01

    This software addresses the demand for easily accessible NASA JPL images and videos by providing a user friendly and simple graphical user interface that can be run via the Android platform from any location where Internet connection is available. This app is complementary to the iPhone version of the application. A backend infrastructure stores, tracks, and retrieves space images from the JPL Photojournal and Institutional Communications Web server, and catalogs the information into a streamlined rating infrastructure. This system consists of four distinguishing components: image repository, database, server-side logic, and Android mobile application. The image repository contains images from various JPL flight projects. The database stores the image information as well as the user rating. The server-side logic retrieves the image information from the database and categorizes each image for display. The Android mobile application is an interfacing delivery system that retrieves the image information from the server for each Android mobile device user. Also created is a reporting and tracking system for charting and monitoring usage. Unlike other Android mobile image applications, this system uses the latest emerging technologies to produce image listings based directly on user input. This allows for countless combinations of images returned. The backend infrastructure uses industry-standard coding and database methods, enabling future software improvement and technology updates. The flexibility of the system design framework permits multiple levels of display possibilities and provides integration capabilities. Unique features of the software include image/video retrieval from a selected set of categories, image Web links that can be shared among e-mail users, sharing to Facebook/Twitter, marking as user's favorites, and image metadata searchable for instant results.

  7. Methodology for Developing the REScheckTM Software through Version 4.2

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Connell, Linda M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gowri, Krishnan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lucas, R. G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schultz, Robert W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Taylor, Zachary T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wiberg, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2009-08-01

    This report explains the methodology used to develop Version 4.2 of the REScheck software developed for the 1992, 1993, and 1995 editions of the MEC; the 1998, 2000, 2003, and 2006 editions of the IECC; and the 2006 edition of the International Residential Code (IRC). Although some requirements contained in these codes have changed, the methodology used to develop the REScheck software for these editions is similar. REScheck assists builders in meeting the most complicated part of the code: the building envelope Uo-, U-, and R-value requirements in Section 502 of the code. This document details the calculations and assumptions underlying the treatment of the code requirements in REScheck, with a major emphasis on the building envelope requirements.

  8. SLR-PLUS version 1.0 user's manual

    Science.gov (United States)

    Hill, J. M.

    1982-11-01

    Version 1.0 of Solar Load Ratio heating plus cooling (SLR-PLUS), developed as an advanced passive solar system design and evaluation tool, is discussed. SLR-PLUS maintains the friendly user interface structure developed for the active solar system FCHART program. Users familiar with the FCHART programs and the FCHART/SLR program will find the operation of the SLR-PLUS program very familiar. SLR-PLUS differs significantly from its parent program in three major ways. First, SLR-PLUS is strictly for the evaluation of passive solar energy systems. Second, the latest correlations from the Los Alamos National Laboratory serve as the basis for the passive heating analysis used by SLR-PLUS. Finally, SLR-PLUS includes cooling loads imposed by passive systems in the form of an annual cooling load for the building modelled and for the individual passive systems. The present version was developed on a Hewlett-Packard 1000 minicomputer using an RTE-IVB operating system. The present version requires approximately 22K 16-bit words of core with overlays to run. The FORTRAN source code will compile with minor changes on any FORTRAN 77 compiler.

  9. Robust Nonlinear Neural Codes

    Science.gov (United States)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. Yet despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even when that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal, or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.

  10. Scalable motion vector coding

    Science.gov (United States)

    Barbarien, Joeri; Munteanu, Adrian; Verdicchio, Fabio; Andreopoulos, Yiannis; Cornelis, Jan P.; Schelkens, Peter

    2004-11-01

    Modern video coding applications require transmission of video data over variable-bandwidth channels to a variety of terminals with different screen resolutions and available computational power. Scalable video coding is needed to optimally support these applications. Recently proposed wavelet-based video codecs employing spatial domain motion compensated temporal filtering (SDMCTF) provide quality, resolution and frame-rate scalability while delivering compression performance comparable to that of the state-of-the-art non-scalable H.264-codec. These codecs require scalable coding of the motion vectors in order to support a large range of bit-rates with optimal compression efficiency. Scalable motion vector coding algorithms based on the integer wavelet transform followed by embedded coding of the wavelet coefficients were recently proposed. In this paper, a new and fundamentally different scalable motion vector codec (MVC) using median-based motion vector prediction is proposed. Extensive experimental results demonstrate that the proposed MVC systematically outperforms the wavelet-based state-of-the-art solutions. To be able to take advantage of the proposed scalable MVC, a rate allocation mechanism capable of optimally dividing the available rate among texture and motion information is required. Two rate allocation strategies are proposed and compared. The proposed MVC and rate allocation schemes are incorporated into an SDMCTF-based video codec and the benefits of scalable motion vector coding are experimentally demonstrated.
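
    The median-based prediction at the heart of the proposed MVC can be illustrated with a small sketch. The particular neighbor choice and residual coding below are illustrative assumptions, not the paper's exact algorithm:

    ```python
    def median_predict(left, top, top_right):
        """Component-wise median of three neighboring motion vectors."""
        med = lambda a, b, c: sorted((a, b, c))[1]
        return (med(left[0], top[0], top_right[0]),
                med(left[1], top[1], top_right[1]))

    def encode_residual(mv, prediction):
        """The encoder transmits only the prediction residual."""
        return (mv[0] - prediction[0], mv[1] - prediction[1])

    # Neighbors (2,1), (3,4), (2,2) predict (2,2); the actual MV (3,2)
    # is then coded as the small residual (1,0).
    pred = median_predict((2, 1), (3, 4), (2, 2))
    residual = encode_residual((3, 2), pred)
    ```

    Because neighboring motion vectors are strongly correlated, the residuals cluster near zero and entropy-code cheaply, which is what makes prediction-based MVCs competitive with transform-based ones.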

  11. On Expanded Cyclic Codes

    CERN Document Server

    Wu, Yingquan

    2008-01-01

    The paper has a threefold purpose. The first purpose is to present an explicit description of expanded cyclic codes defined in $\\GF(q^m)$. The proposed explicit construction of expanded generator matrix and expanded parity check matrix maintains the symbol-wise algebraic structure and thus keeps many important original characteristics. The second purpose of this paper is to identify a class of constant-weight cyclic codes. Specifically, we show that a well-known class of $q$-ary BCH codes excluding the all-zero codeword are constant-weight cyclic codes. Moreover, we show this class of codes achieve the Plotkin bound. The last purpose of the paper is to characterize expanded cyclic codes utilizing the proposed expanded generator matrix and parity check matrix. We analyze the properties of component codewords of a codeword and particularly establish the precise conditions under which a codeword can be represented by a subbasis. With the new insights, we present an improved lower bound on the minimum distance of...

  12. Non-Binary Polar Codes using Reed-Solomon Codes and Algebraic Geometry Codes

    CERN Document Server

    Mori, Ryuhei

    2010-01-01

    Polar codes, introduced by Arikan, achieve the symmetric capacity of any discrete memoryless channel under low encoding and decoding complexity. Recently, non-binary polar codes have been investigated. In this paper, we calculate the error probability of non-binary polar codes constructed on the basis of Reed-Solomon matrices by numerical simulations. It is confirmed that 4-ary polar codes have significantly better performance than binary polar codes on the binary-input AWGN channel. We also discuss an interpretation of polar codes in terms of algebraic geometry codes, and further show that polar codes using Hermitian codes have asymptotically good performance.
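
    The binary baseline in this comparison rests on the recursive Arikan transform; a minimal sketch of binary polar encoding (the non-binary, Reed-Solomon-based construction studied in the paper is not shown):

    ```python
    def polar_encode(u):
        """Apply the Arikan polarization transform x = u * F^{(tensor)n}
        over GF(2), where F = [[1,0],[1,1]] and len(u) is a power of two."""
        n = len(u)
        if n == 1:
            return u[:]
        half = n // 2
        # XOR the two halves, then transform each half recursively.
        upper = polar_encode([u[i] ^ u[i + half] for i in range(half)])
        lower = polar_encode(u[half:])
        return upper + lower

    x = polar_encode([1, 0, 1, 1])  # -> [1, 1, 0, 1]
    ```

    Over GF(2) the transform is its own inverse, so applying `polar_encode` twice recovers the input; a full polar code additionally freezes the low-capacity input positions, which this sketch omits.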

  13. Distributed Video Coding: Iterative Improvements

    DEFF Research Database (Denmark)

    Luong, Huynh Van

    Nowadays, emerging applications such as wireless visual sensor networks and wireless video surveillance require lightweight video encoding with high coding efficiency and error resilience. Distributed Video Coding (DVC) is a new coding paradigm which exploits the source statistics...

  14. Polynomial weights and code constructions

    DEFF Research Database (Denmark)

    Massey, J; Costello, D; Justesen, Jørn

    1973-01-01

    polynomial included. This fundamental property is then used as the key to a variety of code constructions including 1) a simplified derivation of the binary Reed-Muller codes and, for any prime p greater than 2, a new extensive class of p-ary "Reed-Muller codes," 2) a new class of "repeated-root" cyclic codes that are subcodes of the binary Reed-Muller codes and can be very simply instrumented, 3) a new class of constacyclic codes that are subcodes of the p-ary "Reed-Muller codes," 4) two new classes of binary convolutional codes with large "free distance" derived from known binary cyclic codes, 5) two new classes of long constraint length binary convolutional codes derived from 2^r-ary Reed-Solomon codes, and 6) a new class of q-ary "repeated-root" constacyclic codes with an algebraic decoding algorithm.

  15. Atmospheric Dispersion Analysis using MACCS2

    Energy Technology Data Exchange (ETDEWEB)

    Glaser, R; Yang, J M

    2004-02-02

    The Nuclear Regulatory Commission (NRC) Regulatory Guide 1.145 requires evaluation of the offsite atmospheric dispersion coefficient, {Chi}/Q, as part of the acceptance criteria in accident analysis. It requires, in sequence, computation of (1) the overall site 95th-percentile {Chi}/Q, (2) the maximum of the sixteen sector 99.5th-percentile {Chi}/Q values, and (3) selection of the worse of the two values for reporting in the safety analysis report (SAR). In all cases, the site-specific meteorology and sector-specific site boundary distances are employed in the evaluation. There are sixteen 22.5-degree sectors; the nearest site boundary for each is determined within the 45-degree arc centered on each of the sixteen compass directions.
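
    The three-step selection can be sketched as follows. The percentile convention and the toy data are simplifying assumptions; an actual evaluation uses hourly site meteorology and the joint frequency distribution of wind speed, direction, and stability:

    ```python
    import math

    def percentile(values, p):
        """Rank-based percentile: the smallest value with at least a
        fraction p of the data at or below it (a simplified convention)."""
        s = sorted(values)
        return s[max(0, math.ceil(p * len(s)) - 1)]

    def worst_case_chi_q(overall, per_sector):
        """Select the worse of the overall-site 95th percentile and the
        largest per-sector 99.5th percentile, per the RG 1.145 sequence."""
        site_value = percentile(overall, 0.95)
        sector_value = max(percentile(v, 0.995) for v in per_sector.values())
        return max(site_value, sector_value)

    overall = [float(i) for i in range(1, 101)]
    per_sector = {"N": [10.0, 20.0, 30.0], "NNE": [5.0, 6.0, 7.0]}
    chi_q = worst_case_chi_q(overall, per_sector)
    ```

    The conservative `max` in the final step is the point of the procedure: the reported {Chi}/Q can never understate either the site-wide or the worst-sector statistic.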

  16. Product Codes for Optical Communication

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl

    2002-01-01

    Many optical communication systems might benefit from forward error correction. We present a hard-decision decoding algorithm for the "Block Turbo Codes," suitable for optical communication, which makes this coding scheme an alternative to Reed-Solomon codes.

  17. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$, and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper, 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
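
    Verifying the minimum distance of a small code of this kind is straightforward by exhaustive search over all nonzero messages. The $[4,2,3]_3$ generator below is an illustrative example, not one of the paper's 22 codes:

    ```python
    from itertools import product

    def min_distance(G, q=3):
        """Brute-force minimum Hamming distance of the linear code
        generated by the rows of G over GF(q)."""
        k, n = len(G), len(G[0])
        best = n
        for msg in product(range(q), repeat=k):
            if all(m == 0 for m in msg):
                continue  # skip the all-zero codeword
            cw = [sum(m * g for m, g in zip(msg, col)) % q
                  for col in zip(*G)]
            best = min(best, sum(c != 0 for c in cw))
        return best

    # A [4,2] ternary code with generator rows (1,0,1,1) and (0,1,1,2)
    d = min_distance([[1, 0, 1, 1], [0, 1, 1, 2]])  # -> 3
    ```

    The search enumerates $q^k - 1$ codewords, so it is only practical for small $k$; the new codes in papers like this one are instead found and certified with structured constructions and bounds.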

  18. Algebraic geometric codes with applications

    Institute of Scientific and Technical Information of China (English)

    CHEN Hao

    2007-01-01

    The theory of linear error-correcting codes from algebraic geometric curves (algebraic geometric (AG) codes, or geometric Goppa codes) has been well developed since the work of Goppa and of Tsfasman, Vladut, and Zink in 1981-1982. In this paper we introduce to readers some recent progress in algebraic geometric codes and their applications in quantum error-correcting codes, secure multi-party computation, and the construction of good binary codes.

  19. Modular ORIGEN-S for multi-physics code systems

    Energy Technology Data Exchange (ETDEWEB)

    Yesilyurt, Gokhan; Clarno, Kevin T.; Gauld, Ian C., E-mail: yesilyurtg@ornl.gov, E-mail: clarnokt@ornl.gov, E-mail: gauldi@ornl.gov [Oak Ridge National Laboratory, TN (United States); Galloway, Jack, E-mail: jack@galloways.net [Los Alamos National Laboratory, Los Alamos, NM (United States)

    2011-07-01

    The ORIGEN-S code in the SCALE 6.0 nuclear analysis code suite is a well-validated tool to calculate the time-dependent concentrations of nuclides due to isotopic depletion, decay, and transmutation for many systems in a wide range of time scales. Application areas include nuclear reactor and spent fuel storage analyses, burnup credit evaluations, decay heat calculations, and environmental assessments. Although simple to use within the SCALE 6.0 code system, especially with the ORIGEN-ARP graphical user interface, it is generally complex to use as a component within an externally developed code suite because of its tight coupling within the infrastructure of the larger SCALE 6.0 system. The ORIGEN2 code, which has been widely integrated within other simulation suites, is no longer maintained by Oak Ridge National Laboratory (ORNL), has obsolete data, and has a relatively small validation database. Therefore, a modular version of the SCALE/ORIGEN-S code was developed to simplify its integration with other software packages to allow multi-physics nuclear code systems to easily incorporate the well-validated isotopic depletion, decay, and transmutation capability to perform realistic nuclear reactor and fuel simulations. SCALE/ORIGEN-S was extensively restructured to develop a modular version that allows direct access to the matrix solvers embedded in the code. Problem initialization and the solver were segregated to provide a simple application program interface and fewer input/output operations for the multi-physics nuclear code systems. Furthermore, new interfaces were implemented to access and modify the ORIGEN-S input variables and nuclear cross-section data through external drivers. Three example drivers were implemented, in the C, C++, and Fortran 90 programming languages, to demonstrate the modular use of the new capability. This modular version of SCALE/ORIGEN-S has been embedded within several multi-physics software development projects at ORNL, including

  20. A Distinguisher-Based Attack of a Homomorphic Encryption Scheme Relying on Reed-Solomon Codes

    CERN Document Server

    Gauthier, Valérie; Tillich, Jean-Pierre

    2012-01-01

    Bogdanov and Lee suggested a homomorphic public-key encryption scheme based on error correcting codes. The underlying public code is a modified Reed-Solomon code obtained by inserting a zero submatrix in the Vandermonde generating matrix defining it. The columns that define this submatrix are kept secret and form a set $L$. We give here a distinguisher that detects whether one or several columns belong to $L$. This distinguisher is obtained by considering the code generated by component-wise products of codewords of the public code (the so-called "square code"). This operation is applied to punctured versions of this square code obtained by picking a subset $I$ of the whole set of columns. It turns out that the dimension of the punctured square code is directly related to the cardinality of the intersection of $I$ with $L$. This allows an attack which recovers the full set $L$ and can then decrypt any ciphertext.
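
    The dimension argument behind such square-code distinguishers can be reproduced in a few lines: the square of an $[n,k]$ Reed-Solomon code has dimension only $2k-1$, far below the $\min(n, k(k+1)/2)$ expected of a random code. A sketch over a small prime field, with toy parameters assumed for illustration:

    ```python
    def rank_mod_p(rows, p):
        """Row rank of an integer matrix over GF(p), by Gaussian elimination."""
        rows = [r[:] for r in rows]
        rank, n = 0, len(rows[0])
        for col in range(n):
            piv = next((i for i in range(rank, len(rows)) if rows[i][col] % p), None)
            if piv is None:
                continue
            rows[rank], rows[piv] = rows[piv], rows[rank]
            inv = pow(rows[rank][col] % p, p - 2, p)  # Fermat inverse, p prime
            rows[rank] = [(x * inv) % p for x in rows[rank]]
            for i in range(len(rows)):
                if i != rank and rows[i][col] % p:
                    f = rows[i][col]
                    rows[i] = [(x - f * y) % p for x, y in zip(rows[i], rows[rank])]
            rank += 1
        return rank

    def square_code_dim(G, p):
        """Dimension of the span of all component-wise products of rows of G."""
        prods = [[(a * b) % p for a, b in zip(G[i], G[j])]
                 for i in range(len(G)) for j in range(i, len(G))]
        return rank_mod_p(prods, p)

    # A [7,3] Reed-Solomon code over GF(11): rows evaluate 1, x, x^2.
    # Its square evaluates 1, x, ..., x^4, so the dimension is 2k-1 = 5.
    G_rs = [[1] * 7, list(range(7)), [(i * i) % 11 for i in range(7)]]
    dim = square_code_dim(G_rs, 11)  # -> 5
    ```

    The attack measures this dimension on punctured versions of the public code; positions in the secret set $L$ shift the dimension, which is what makes them detectable.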

  1. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, until now no book has been specifically dedicated to optical coding theory. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  2. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.
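
    The Hamming codes the book introduces can be made concrete with a systematic Hamming(7,4) encoder and single-error corrector. The particular parity arrangement below is one common choice, assumed here for illustration:

    ```python
    # Hamming(7,4) in systematic form: G = [I4 | P], H = [P^T | I3].
    G = [[1, 0, 0, 0, 1, 1, 0],
         [0, 1, 0, 0, 1, 0, 1],
         [0, 0, 1, 0, 0, 1, 1],
         [0, 0, 0, 1, 1, 1, 1]]
    H = [[1, 1, 0, 1, 1, 0, 0],
         [1, 0, 1, 1, 0, 1, 0],
         [0, 1, 1, 1, 0, 0, 1]]

    def encode(msg):
        """Multiply the 4-bit message by G over GF(2)."""
        return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

    def correct(word):
        """Compute the syndrome and flip the single erroneous bit, if any.
        The nonzero syndrome equals the column of H at the error position."""
        syn = [sum(h * w for h, w in zip(row, word)) % 2 for row in H]
        if any(syn):
            cols = [[H[r][c] for r in range(3)] for c in range(7)]
            word = word[:]
            word[cols.index(syn)] ^= 1
        return word

    cw = encode([1, 0, 1, 1])  # -> [1, 0, 1, 1, 0, 1, 0]
    ```

    Flipping any single bit of `cw` and calling `correct` recovers the codeword, since the seven columns of H are exactly the seven nonzero syndromes.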

  3. The limits of mathematics alternative version

    CERN Document Server

    Chaitin, G J

    1994-01-01

    This is an alternative version of the course notes in chao-dyn/9407003. The previous version is based on measuring the size of lisp s-expressions. This version is based on measuring the size of what I call lisp m-expressions, which are lisp s-expressions with most parentheses omitted. This formulation of algorithmic information theory is harder to understand than the one that was presented in chao-dyn/9407003, but the constants obtained in all theorems are now less than half the size that they were before. It is not clear to me which version of algorithmic information theory is to be preferred.

  4. Golden Coded Multiple Beamforming

    CERN Document Server

    Li, Boyu

    2010-01-01

    The Golden Code is a full-rate, full-diversity space-time code, which achieves the maximum coding gain for Multiple-Input Multiple-Output (MIMO) systems with two transmit and two receive antennas. Since four information symbols taken from an M-QAM constellation are selected to construct one Golden Code codeword, a maximum-likelihood decoder using sphere decoding has a worst-case complexity of O(M^4) when the Channel State Information (CSI) is available at the receiver. Previously, this worst-case complexity was reduced to O(M^(2.5)) without performance degradation. When the CSI is known by the transmitter as well as the receiver, beamforming techniques that employ singular value decomposition are commonly used in MIMO systems. In the absence of channel coding, when a single symbol is transmitted, these systems achieve the full diversity order provided by the channel. However, this property is lost when multiple symbols are transmitted simultaneously. Nevertheless, uncoded multiple beamforming can achieve the full div...

  5. Coded source neutron imaging

    Energy Technology Data Exchange (ETDEWEB)

    Bingham, Philip R [ORNL; Santos-Villalobos, Hector J [ORNL

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot-size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits, and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system, with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects, followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.
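
    The final step, obtaining the MTF from a line spread function, reduces to a normalized Fourier magnitude. A minimal sketch using a direct DFT on synthetic data (not the McStas workflow):

    ```python
    import cmath

    def mtf_from_lsf(lsf):
        """MTF = magnitude of the DFT of the line spread function,
        normalized to unity at zero spatial frequency."""
        n = len(lsf)
        mags = [abs(sum(lsf[t] * cmath.exp(-2j * cmath.pi * f * t / n)
                        for t in range(n)))
                for f in range(n // 2 + 1)]
        return [m / mags[0] for m in mags]

    # An ideal (delta) line spread function passes all spatial frequencies,
    # while a broad LSF attenuates the high ones.
    sharp = mtf_from_lsf([1.0, 0.0, 0.0, 0.0])
    blurred = mtf_from_lsf([0.25, 0.25, 0.25, 0.25])
    ```

    In practice the LSF is obtained by differentiating the tilted-edge response, and the reported resolution is the frequency at which the MTF falls below a chosen threshold.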

  6. Coded source neutron imaging

    Science.gov (United States)

    Bingham, Philip; Santos-Villalobos, Hector; Tobin, Ken

    2011-03-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100μm and 10μm aperture hole diameters show resolutions matching the hole diameters.

  7. Fusion safety codes International modeling with MELCOR and ATHENA- INTRA

    CERN Document Server

    Marshall, T; Topilski, L; Merrill, B

    2002-01-01

    For a number of years, the world fusion safety community has been involved in benchmarking their safety analysis codes against experimental data to support regulatory approval of a next-step fusion device. This paper discusses the benchmarking of two prominent fusion safety thermal-hydraulic computer codes. The MELCOR code was developed in the US for fission severe accident safety analyses and has been modified for fusion safety analyses. The ATHENA code is a multifluid version of the US-developed RELAP5 code that is also widely used for fusion safety analyses. The ENEA Fusion Division uses ATHENA in conjunction with the INTRA code for its safety analyses. The INTRA code was developed in Germany and predicts containment building pressures, temperatures, and fluid flow. ENEA employs the French-developed ISAS system to couple ATHENA and INTRA. This paper provides a brief introduction to the MELCOR and ATHENA-INTRA codes and presents their modeling results for the following breaches of a water cooling line into the...

  8. Wyner-Ziv Coding Based on Multidimensional Nested Lattices

    CERN Document Server

    Ling, Cong; Belfiore, Jean-Claude

    2011-01-01

    Distributed source coding (DSC) addresses the compression of correlated sources without communication links among them. This paper is concerned with the Wyner-Ziv problem: coding of an information source with side information available only at the decoder in the form of a noisy version of the source. Both the theoretical analysis and code design are addressed in the framework of multi-dimensional nested lattice coding (NLC). For theoretical analysis, accurate computation of the rate-distortion function is given under the high-resolution assumption, and a new upper bound using the derivative of the theta series is derived. For practical code design, several techniques with low complexity are proposed. Compared to the existing Slepian-Wolf coded nested quantization (SWC-NQ) for Wyner-Ziv coding based on one or two-dimensional lattices, our proposed multi-dimensional NLC can offer better performance at arguably lower complexity, since it does not require the second stage of Slepian-Wolf coding.
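
    The nesting idea can be illustrated in one dimension, with the fine lattice Z nested in the coarse lattice qZ: the encoder sends only the coset index of its quantized sample, and the decoder resolves the ambiguity with the side information. This is a scalar toy model; the paper's gains come precisely from going beyond the one-dimensional case:

    ```python
    def wz_encode(x, q):
        """Quantize to the fine lattice Z, but transmit only the coset
        index mod q (log2(q) bits instead of a full description)."""
        return round(x) % q

    def wz_decode(index, side_info, q):
        """Pick the point of the received coset of qZ closest to the
        decoder's side information."""
        base = round(side_info)
        c1 = base + ((index - base) % q)
        return min(c1, c1 - q, key=lambda c: abs(c - side_info))

    # Source 13.2, side information 12.7, coarse lattice 4Z:
    x_hat = wz_decode(wz_encode(13.2, 4), 12.7, 4)  # -> 13
    ```

    Decoding succeeds as long as the side information lies within half a coarse-lattice cell of the quantized source; weaker correlation forces a larger q and hence a higher rate, which is the rate-distortion trade-off the paper analyzes.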

  9. Speech coding code- excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides a scientific understanding of the most central techniques used in speech coding, both for advanced students and for professionals with a background in speech, audio, and/or digital signal processing. It provides a clear connection between the whys, hows, and whats, thus enabling a clear view of the necessity, purpose, and solutions provided by various tools, as well as their strengths and weaknesses in each respect. Equivalently, this book sheds light on the following perspectives for each technology presented. Objective: What do we want to achieve, and especially why is this goal important? Resource/Information: What information is available, and how can it be useful? Resource/Platform: What kinds of platforms are we working with, and what are their capabilities and restrictions? This includes computational, memory, and acoustic properties and the transmission capacity of devices used. The book goes on to address Solutions: Which solutions have been proposed, and how can they be used to reach the stated goals, and ...

  10. NASA Glenn Steady-State Heat Pipe Code GLENHP: Compilation for 64- and 32-Bit Windows Platforms

    Science.gov (United States)

    Tower, Leonard K.; Geng, Steven M.

    2016-01-01

    A new version of the NASA Glenn Steady State Heat Pipe Code, designated "GLENHP," is introduced here. This represents an update to the disk operating system (DOS) version LERCHP reported in NASA/TM-2000-209807. The new code operates on 32- and 64-bit Windows-based platforms from within the 32-bit command prompt window. An additional evaporator boundary condition and other features are provided.

  11. Phase-coded pulse aperiodic transmitter coding

    Directory of Open Access Journals (Sweden)

    I. I. Virtanen

    2009-07-01

    Both ionospheric and weather radar communities have already adopted the method of transmitting radar pulses in an aperiodic manner when measuring moderately overspread targets. Among the users of the ionospheric radars, this method is called Aperiodic Transmitter Coding (ATC), whereas the weather radar users have adopted the term Simultaneous Multiple Pulse-Repetition Frequency (SMPRF). When probing the ionosphere at the carrier frequencies of the EISCAT Incoherent Scatter Radar facilities, the range extent of the detectable target is typically of the order of one thousand kilometers – about seven milliseconds – whereas the characteristic correlation time of the scattered signal varies from a few milliseconds in the D-region to only tens of microseconds in the F-region. If one is interested in estimating the scattering autocorrelation function (ACF) at time lags shorter than the F-region correlation time, the D-region must be considered a moderately overspread target, whereas the F-region is a severely overspread one. Given the technical restrictions of the radar hardware, a combination of ATC and phase-coded long pulses is advantageous for this kind of target. We evaluate such an experiment under infinitely low signal-to-noise ratio (SNR) conditions using lag profile inversion. In addition, a qualitative evaluation under high-SNR conditions is performed by analysing simulated data. The results show that an acceptable estimation accuracy and a very good lag resolution in the D-region can be achieved with a pulse length long enough for simultaneous E- and F-region measurements with a reasonable lag extent. The new experiment design is tested with the EISCAT Tromsø VHF (224 MHz) radar. An example of a full D/E/F-region ACF from the test run is shown at the end of the paper.

  12. Modernization of the graphics post-processors of the Hamburg German Climate Computer Center Carbon Cycle Codes

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, E.J.; McNeilly, G.S.

    1994-03-01

    The existing National Center for Atmospheric Research (NCAR) code in the Hamburg Oceanic Carbon Cycle Circulation Model and the Hamburg Large-Scale Geostrophic Ocean General Circulation Model was modernized and reduced in size while still producing an equivalent end result. A reduction in the size of the existing code from more than 50,000 lines to approximately 7,500 lines has made the new code much easier to maintain. The existing code in the Hamburg models uses legacy NCAR graphics (including even emulated CALCOMP subroutines) to display graphical output; the new code uses only current (version 3.1) NCAR subroutines.

  13. TECATE - a code for anisotropic thermoelasticity in high-average-power laser technology. Phase 1 final report

    Energy Technology Data Exchange (ETDEWEB)

    Gelinas, R.J.; Doss, S.K.; Carlson, N.N.

    1985-01-01

    This report describes a totally Eulerian code for anisotropic thermoelasticity (code name TECATE) which may be used in evaluations of prospective crystal media for high-average-power lasers. The present TECATE code version computes steady-state distributions of material temperatures, stresses, strains, and displacement fields in 2-D slab geometry. Numerous heat source and coolant boundary condition options are available in the TECATE code for laser design considerations. Anisotropic analogues of plane stress and plane strain evaluations can be executed for any and all crystal symmetry classes. As with all new and/or large physics codes, it is likely that some code imperfections will emerge at some point in time.

  15. MCNP(TM) Version 5.

    Energy Technology Data Exchange (ETDEWEB)

    Cox, L. J. (Lawrence J.); Barrett, R. F. (Richard F.); Booth, Thomas Edward; Briesmeister, Judith F.; Brown, F. B. (Forrest B.); Bull, J. S. (Jeffrey S.); Giesler, G. C. (Gregg Carl); Goorley, J. T. (John T.); Mosteller, R. D. (Russell D.); Forster, R. A. (R. Arthur); Post, S. E. (Susan E.); Prael, R. E. (Richard E.); Selcow, Elizabeth Carol,; Sood, A. (Avneet)

    2002-01-01

    The Monte Carlo transport workhorse, MCNP, is undergoing a massive renovation at Los Alamos National Laboratory (LANL) in support of the Eolus Project of the Advanced Simulation and Computing (ASCI) Program. MCNP Version 5 (V5) (expected to be released to RSICC in Spring, 2002) will consist of a major restructuring from FORTRAN-77 (with extensions) to ANSI-standard FORTRAN-90 with support for all of the features available in the present release (MCNP-4C2/4C3). To most users, the look-and-feel of MCNP will not change much except for the improvements (improved graphics, easier installation, better online documentation). For example, even with the major format change, full support for incremental patching will still be provided. In addition to the language and style updates, MCNP V5 will have various new user features. These include improved photon physics, neutral particle radiography, enhancements and additions to variance reduction methods, new source options, and improved parallelism support (PVM, MPI, OpenMP).

  16. Nested Quantum Error Correction Codes

    CERN Document Server

    Wang, Zhuo; Fan, Hen; Vedral, Vlatko

    2009-01-01

    The theory of quantum error correction was established more than a decade ago as the primary tool for fighting decoherence in quantum information processing. Although great progress has already been made in this field, few methods are available for constructing new quantum error correction codes from old codes. Here we exhibit a simple and general method to construct new quantum error correction codes by nesting certain quantum codes together. The problem of finding long quantum error correction codes is reduced to that of searching for several short quantum codes with certain properties. Our method works for codes of all lengths and distances, and is quite efficient for constructing optimal or near-optimal codes. The two main known methods for constructing new codes from old codes in quantum error-correction theory, concatenating and pasting, can be understood in the framework of nested quantum error correction codes.

  17. MHD Generation Code

    CERN Document Server

    Frutos-Alfaro, Francisco

    2015-01-01

    A program to generate codes in Fortran and C for the full magnetohydrodynamic equations is shown. The program uses the free computer algebra system REDUCE. This software has a package called EXCALC, which is an exterior calculus program. The advantage of this program is that it can be modified to include another complex metric or spacetime. The output of this program is modified by means of a Linux script, which creates a new REDUCE program to manipulate the MHD equations to obtain a code that can be used as a seed for an MHD code for numerical applications. As an example, we present part of the output of our programs for Cartesian coordinates and show how to do the discretization.

  18. Autocatalysis, information and coding.

    Science.gov (United States)

    Wills, P R

    2001-01-01

    Autocatalytic self-construction in macromolecular systems requires the existence of a reflexive relationship between structural components and the functional operations they perform to synthesise themselves. The possibility of reflexivity depends on formal, semiotic features of the catalytic structure-function relationship, that is, the embedding of catalytic functions in the space of polymeric structures. Reflexivity is a semiotic property of some genetic sequences. Such sequences may serve as the basis for the evolution of coding as a result of autocatalytic self-organisation in a population of assignment catalysts. Autocatalytic selection is a mechanism whereby matter becomes differentiated in primitive biochemical systems. In the case of coding self-organisation, it corresponds to the creation of symbolic information. Prions are present-day entities whose replication through autocatalysis reflects aspects of biological semiotics less obvious than genetic coding.

  19. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each...... instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...... as possible. Evaluations show that the proposed protocol provides considerable gains over the standard tree splitting protocol applying SIC. The improvement comes at the expense of an increased feedback and receiver complexity....

  20. Adjoint code generator

    Institute of Scientific and Technical Information of China (English)

    CHENG Qiang; CAO JianWen; WANG Bin; ZHANG HaiBin

    2009-01-01

    The adjoint code generator (ADG) is developed to produce adjoint codes, which are used to analytically calculate gradients and Hessian-vector products at a cost independent of the number of independent variables. Different from other automatic differentiation tools, the implementation of ADG has the advantages of using the least-program-behavior-decomposition method and several static dependence analysis techniques. In this paper we first address the relevant concepts and fundamentals, and then introduce the functionality and features of ADG. In particular, we also discuss the design architecture of ADG and implementation details, including the recomputation and storing strategy and several techniques for code optimization. Some experimental results from several applications are presented at the end.

  1. Code query by example

    Science.gov (United States)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  2. Spread codes and spread decoding in network coding

    OpenAIRE

    Manganiello, F; Gorla, E.; Rosenthal, J.

    2008-01-01

    In this paper we introduce the class of spread codes for the use in random network coding. Spread codes are based on the construction of spreads in finite projective geometry. The major contribution of the paper is an efficient decoding algorithm of spread codes up to half the minimum distance.

  3. Graph Codes with Reed-Solomon Component Codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Justesen, Jørn

    2006-01-01

    We treat a specific case of codes based on bipartite expander graphs coming from finite geometries. The code symbols are associated with the branches and the symbols connected to a given node are restricted to be codewords in a Reed-Solomon code. We give results on the parameters of the codes...

  4. A prescription and fast code for the long-term evolution of star clusters - II. Unbalanced and core evolution

    NARCIS (Netherlands)

    M. Gieles; P.E.R. Alexander; H.J.G.L.M. Lamers; H. Baumgardt

    2013-01-01

    We introduce version two of the fast star cluster evolution code Evolve Me A Cluster of StarS (emacss). The first version (Alexander and Gieles) assumed that cluster evolution is balanced for the majority of the life cycle, meaning that the rate of energy generation in the core of the cluster equals

  6. Principles of speech coding

    CERN Document Server

    Ogunfunmi, Tokunbo

    2010-01-01

    It is becoming increasingly apparent that all forms of communication-including voice-will be transmitted through packet-switched networks based on the Internet Protocol (IP). Therefore, the design of modern devices that rely on speech interfaces, such as cell phones and PDAs, requires a complete and up-to-date understanding of the basics of speech coding. Outlines key signal processing algorithms used to mitigate impairments to speech quality in VoIP networksOffering a detailed yet easily accessible introduction to the field, Principles of Speech Coding provides an in-depth examination of the

  7. Securing mobile code.

    Energy Technology Data Exchange (ETDEWEB)

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that it can issue functions that move it from one computing platform to another, the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinion regarding how to secure mobile code: those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates in decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation that render an entire program, or a data segment on which a program depends, incomprehensible. The hope is to prevent, or at least slow down, reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development, the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in-depth analysis of a technique called ...

  8. TART98 a coupled neutron-photon 3-D, combinatorial geometry time dependent Monte Carlo Transport code

    Energy Technology Data Exchange (ETDEWEB)

    Cullen, D E

    1998-11-22

    TART98 is a coupled neutron-photon, 3 Dimensional, combinatorial geometry, time dependent Monte Carlo radiation transport code. This code can run on any modern computer. It is a complete system to assist you with input preparation, running Monte Carlo calculations, and analysis of output results. TART98 is also incredibly FAST; if you have used similar codes, you will be amazed at how fast this code is compared to other similar codes. Use of the entire system can save you a great deal of time and energy. TART98 is distributed on CD. This CD contains on-line documentation for all codes included in the system, the codes configured to run on a variety of computers, and many example problems that you can use to familiarize yourself with the system. TART98 completely supersedes all older versions of TART, and it is strongly recommended that users only use the most recent version of TART98 and its data files.

  9. Global Historical Climatology Network - Daily (GHCN-Daily), Version 2 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  10. Hybrid Parallel Contour Trees, Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    2017-01-03

    A common operation in scientific visualization is to compute and render a contour of a data set. Given a function of the form f : R^d -> R, a level set is defined as an inverse image f^-1(h) for an isovalue h, and a contour is a single connected component of a level set. The Reeb graph can then be defined to be the result of contracting each contour to a single point, and is well defined for Euclidean spaces or for general manifolds. For simple domains, the graph is guaranteed to be a tree, and is called the contour tree. Analysis can then be performed on the contour tree in order to identify isovalues of particular interest, based on various metrics, and render the corresponding contours, without having to know such isovalues a priori. This code is intended to be the first data-parallel algorithm for computing contour trees. Our implementation will use the portable data-parallel primitives provided by Nvidia’s Thrust library, allowing us to compile our same code for both GPUs and multi-core CPUs. Native OpenMP and purely serial versions of the code will likely also be included. It will also be extended to provide a hybrid data-parallel / distributed algorithm, allowing scaling beyond a single GPU or CPU.
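
    The level-set definition above can be made concrete with a toy computation (an illustrative serial sketch on hypothetical data, unrelated to the data-parallel implementation described): counting the connected components, i.e. contours, of a superlevel set on a small grid using union-find.

```python
# Toy illustration: count the connected components of a superlevel set
# {f >= h} on a 2-D grid with 4-connectivity; each component
# corresponds to one contour at isovalue h.

def count_contours(grid, h):
    parent = {}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    # every cell at or above the isovalue starts as its own component
    for r, row in enumerate(grid):
        for c, v in enumerate(row):
            if v >= h:
                parent[(r, c)] = (r, c)
    # merge 4-connected neighbours
    for r, c in list(parent):
        for nb in ((r + 1, c), (r, c + 1)):
            if nb in parent:
                ra, rb = find((r, c)), find(nb)
                if ra != rb:
                    parent[ra] = rb
    return len({find(x) for x in parent})

values = [
    [0, 0, 0, 0, 0],
    [0, 9, 0, 9, 0],
    [0, 9, 0, 9, 0],
    [0, 0, 0, 0, 0],
]
assert count_contours(values, 5) == 2   # two separate "peaks"
assert count_contours(values, 10) == 0  # isovalue above all data
```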

  11. DoD Planewave: A General Scalable Density Functional Code For Solids And Clusters

    Science.gov (United States)

    Kim, Seong-Gon; Singh, D. J.; Kajihara, S. A.; Woodward, C.

    2000-03-01

    We will present our latest version of the DoD Planewave code, a general purpose scalable planewave basis density functional code. DoD Planewave is written in highly portable Fortran 90 and runs on many high-performance parallel machines including IBM SP2, SGI Origin 2000 clusters and Pentium machines running Linux. The package, including the complete source code and example runs, is freely available. The code is capable of treating clusters or bulk structures of insulators, semiconductors, metals and magnetic materials, with general symmetry. The present version performs self-consistent electronic structure, total energy and force calculations within the Local Density Approximation (LDA) and Generalized Gradient Approximation (GGA). It also does automatic structure optimization and ab initio molecular dynamics. Calculations demonstrating the capabilities of the code are presented. Further information may be found on our website (http://cst-www.nrl.navy.mil/people/singh/planewave/).

  12. Development of a subchannel analysis code MATRA (Ver. {alpha})

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Y. J.; Hwang, D. H

    1998-04-01

    A subchannel analysis code, MATRA-{alpha}, an interim version of MATRA, has been developed to run on an IBM PC or HP workstation, based on the existing CDC CYBER mainframe version of COBRA-IV-I. MATRA is a thermal-hydraulic analysis code based on the subchannel approach for calculating the enthalpy and flow distribution in fuel assemblies and reactor cores under both steady-state and transient conditions. MATRA-{alpha} has been provided with an improved structure and various functions and models to give a more convenient user environment and to increase the code's accuracy. Among them, the pressure drop model has been improved to apply to non-square-lattice rod arrays, and the lateral transport models between adjacent subchannels have been improved to increase accuracy in predicting two-phase flow phenomena. Also included in this report are detailed instructions for input data preparation and for auxiliary pre-processors, to serve as a guide for those who want to use MATRA-{alpha}. In addition, we compared the predictions of MATRA-{alpha} with experimental data on the flow and enthalpy distribution in three sample rod-bundle cases to evaluate its performance. All the results revealed that the predictions of MATRA-{alpha} were better than those of COBRA-IV-I. (author). 16 refs., 1 tab., 13 figs.

  13. Texts in multiple versions: histories of editions

    NARCIS (Netherlands)

    Giuliani, L.; Brinkman, H.; Lernout, G.; Mathijsen, M.

    2006-01-01

    Texts in multiple versions constitute the core problem of textual scholarship. For texts from antiquity and the medieval period, the many versions may be the result of manuscript transmission, requiring editors and readers to discriminate between levels of authority in variant readings produced

  14. Schema Versioning for Multitemporal Relational Databases.

    Science.gov (United States)

    De Castro, Cristina; Grandi, Fabio; Scalas, Maria Rita

    1997-01-01

    Investigates new design options for extended schema versioning support for multitemporal relational databases. Discusses the improved functionalities they may provide. Outlines options and basic motivations for the new design solutions, as well as techniques for the management of proposed schema versioning solutions, includes algorithms and…

  16. Low Delay Wyner-Ziv Coding Using Optical Flow

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Forchhammer, Søren

    2014-01-01

    Distributed Video Coding (DVC) is a video coding paradigm that exploits the source statistics at the decoder based on the availability of the Side Information (SI). The SI can be seen as a noisy version of the source, and the lower the noise the higher the RD performance of the decoder. The SI...... on preceding frames for the generation of the SI by means of Optical Flow (OF), which is also used in the refinement step of the SI for enhanced RD performance. Compared with a state-of-the-art extrapolation-based decoder the proposed solution achieves RD Bjontegaard gains up to 1.3 dB....

  17. TRIPOLI-3: a neutron/photon Monte Carlo transport code

    Energy Technology Data Exchange (ETDEWEB)

    Nimal, J.C.; Vergnaud, T. [Commissariat a l' Energie Atomique, Gif-sur-Yvette (France). Service d' Etudes de Reacteurs et de Mathematiques Appliquees

    2001-07-01

    The present version of TRIPOLI-3 solves the transport equation for coupled neutron and gamma-ray problems in three-dimensional geometries using the Monte Carlo method. The code is devoted to both shielding and criticality problems. Its most important feature for solving the particle transport equation is the fine treatment of the physical phenomena and the sophisticated biasing techniques useful for deep penetration. The code is used either for shielding design studies or as a reference and benchmark to validate cross sections. Neutronic studies are essentially cell or small-core calculations and criticality problems. TRIPOLI-3 has been used as a reference method, for example, for resonance self-shielding qualification. (orig.)
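
    As a minimal illustration of the Monte Carlo transport idea behind codes of this kind (a toy sketch with made-up parameters, not TRIPOLI-3's physics): sample exponential free-flight distances for particles entering a purely absorbing slab and estimate the transmitted fraction, which should approach exp(-mu * t).

```python
import math
import random

def slab_transmission(mu, thickness, n=200_000, seed=1):
    """Toy Monte Carlo: fraction of particles that cross a purely
    absorbing slab. Free-flight distances are exponential with mean
    1/mu, so the estimate should approach exp(-mu * thickness)."""
    rng = random.Random(seed)
    crossed = 0
    for _ in range(n):
        if rng.expovariate(mu) >= thickness:
            crossed += 1
    return crossed / n

mu, t = 0.5, 2.0
estimate = slab_transmission(mu, t)
# analytic answer for this trivial geometry: exp(-mu * t) ~ 0.368
assert abs(estimate - math.exp(-mu * t)) < 0.01
```

Real transport codes add scattering, energy dependence, geometry tracking, and the biasing techniques mentioned above; this sketch only shows the core sampling step.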

  18. Min-Max decoding for non binary LDPC codes

    CERN Document Server

    Savin, Valentin

    2008-01-01

    Iterative decoding of non-binary LDPC codes is currently performed using either the Sum-Product or the Min-Sum algorithms or slightly different versions of them. In this paper, several low-complexity quasi-optimal iterative algorithms are proposed for decoding non-binary codes. The Min-Max algorithm is one of them and it has the benefit of two possible LLR domain implementations: a standard implementation, whose complexity scales as the square of the Galois field's cardinality and a reduced complexity implementation called selective implementation, which makes the Min-Max decoding very attractive for practical purposes.

  19. PCS a code system for generating production cross section libraries

    Energy Technology Data Exchange (ETDEWEB)

    Cox, L.J.

    1997-04-01

    This document outlines the use of the PCS code system. It summarizes the execution process for generating FORMAT2000 production cross section files from FORMAT2000 reaction cross section files. It also describes the process of assembling the ASCII versions of the high energy production files made from ENDL and Mark Chadwick's calculations. Descriptions of the function of each code, along with its input, output, and use, are given. This document is under construction; please submit entries, suggestions, questions, and corrections to ljc@llnl.gov. 3 tabs.

  20. New code match strategy for wideband code division multiple access code tree management

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Orthogonal variable spreading factor (OVSF) channelization codes are widely used to provide variable data rates to support different bandwidth requirements in wideband code division multiple access (WCDMA) systems. A new code match scheme for WCDMA code tree management is proposed. The scheme is similar to the existing crowded-first scheme, but when choosing a code for a user it compares only the layer immediately above the allocated codes, whereas the crowded-first scheme may compare all layers above. The operation of the code match scheme is therefore simple, and the average time delay is decreased by 5.1%. The simulation results also show that the code match strategy can decrease the average code blocking probability by 8.4%.
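
    The OVSF codes managed by such schemes can be generated recursively: each code c at one layer spawns children (c, c) and (c, -c), and codes on the same layer are mutually orthogonal while a code is never orthogonal to its own ancestor. A small sketch of this structure (illustrative only, not the proposed code-match scheme):

```python
# Hypothetical toy: recursive generation of OVSF channelization codes.
# Each code c spawns children (c, c) and (c, -c), doubling the
# spreading factor at every layer of the code tree.

def ovsf_children(code):
    return [code + code, code + [-x for x in code]]

def ovsf_layer(sf):
    """All OVSF codes with spreading factor sf (a power of two)."""
    layer = [[1]]
    while len(layer[0]) < sf:
        layer = [c for code in layer for c in ovsf_children(code)]
    return layer

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

codes = ovsf_layer(4)
# Codes on the same layer are mutually orthogonal, so they can be
# allocated to different users simultaneously...
assert all(dot(codes[i], codes[j]) == 0
           for i in range(4) for j in range(4) if i != j)
# ...but a code is not orthogonal to its ancestor [1, 1], which is why
# allocating a code blocks its whole subtree and its ancestors.
assert dot(codes[0], [1, 1] + [1, 1]) != 0
```

The allocation schemes compared in the record differ only in how they search this tree for a free, non-blocked code.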

  1. Reed-Solomon convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, H; Schmale, W

    2005-01-01

    In this paper we will introduce a specific class of cyclic convolutional codes. The construction is based on Reed-Solomon block codes. The algebraic parameters as well as the distance of these codes are determined. This shows that some of these codes are optimal or near optimal.

  2. New code of conduct

    CERN Multimedia

    Laëtitia Pedroso

    2010-01-01

    During his talk to the staff at the beginning of the year, the Director-General mentioned that a new code of conduct was being drawn up. What exactly is it and what is its purpose? Anne-Sylvie Catherin, Head of the Human Resources (HR) Department, talked to us about the whys and wherefores of the project.   Drawing by Georges Boixader from the cartoon strip “The World of Particles” by Brian Southworth. A code of conduct is a general framework laying down the behaviour expected of all members of an organisation's personnel. “CERN is one of the very few international organisations that don’t yet have one", explains Anne-Sylvie Catherin. “We have been thinking about introducing a code of conduct for a long time but lacked the necessary resources until now”. The call for a code of conduct has come from different sources within the Laboratory. “The Equal Opportunities Advisory Panel (read also the "Equal opportuni...

  3. Physical layer network coding

    DEFF Research Database (Denmark)

    Fukui, Hironori; Popovski, Petar; Yomo, Hiroyuki

    2014-01-01

    Physical layer network coding (PLNC) has been proposed to improve throughput of the two-way relay channel, where two nodes communicate with each other, being assisted by a relay node. Most of the works related to PLNC are focused on a simple three-node model and they do not take into account...

  4. Corporate governance through codes

    NARCIS (Netherlands)

    Haxhi, I.; Aguilera, R.V.; Vodosek, M.; den Hartog, D.; McNett, J.M.

    2014-01-01

    The UK's 1992 Cadbury Report defines corporate governance (CG) as the system by which businesses are directed and controlled. CG codes are a set of best practices designed to address deficiencies in the formal contracts and institutions by suggesting prescriptions on the preferred role and composition

  5. Polar Code Validation

    Science.gov (United States)

    1989-09-30

    SUMMARY OF POLAR ACHIEVEMENTS ... 3. POLAR CODE PHYSICAL MODELS ... 3.1 Plasma ... of this problem. 1.1. The Charge-2 Rocket: The Charge-2 payload was launched on a Black Brant VB from White Sands Missile Range in New Mexico in

  7. (Almost) practical tree codes

    KAUST Repository

    Khina, Anatoly

    2016-08-15

    We consider the problem of stabilizing an unstable plant driven by bounded noise over a digital noisy communication link, a scenario at the heart of networked control. To stabilize such a plant, one needs real-time encoding and decoding with an error probability profile that decays exponentially with the decoding delay. The works of Schulman and Sahai over the past two decades have developed the notions of tree codes and anytime capacity, and provided the theoretical framework for studying such problems. Nonetheless, there has been little practical progress in this area due to the absence of explicit constructions of tree codes with efficient encoding and decoding algorithms. Recently, linear time-invariant tree codes were proposed to achieve the desired result under maximum-likelihood decoding. In this work, we take one more step towards practicality, by showing that these codes can be efficiently decoded using sequential decoding algorithms, up to some loss in performance (and with some practical complexity caveats). We supplement our theoretical results with numerical simulations that demonstrate the effectiveness of the decoder in a control system setting.

  8. Corner neutronic code

    Directory of Open Access Journals (Sweden)

    V.P. Bereznev

    2015-10-01

    An iterative solution process is used, including external iterations for the fission source and internal iterations for the scattering source. The paper presents the results of a cross-verification against the Monte Carlo MMK code [3] and on a model of the BN-800 reactor core.

  9. Ready, steady… Code!

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    This summer, CERN took part in the Google Summer of Code programme for the third year in succession. Open to students from all over the world, this programme leads to very successful collaborations for open source software projects.   Image: GSoC 2013. Google Summer of Code (GSoC) is a global programme that offers student developers grants to write code for open-source software projects. Since its creation in 2005, the programme has brought together some 6,000 students from over 100 countries worldwide. The students selected by Google are paired with a mentor from one of the participating projects, which can be led by institutes, organisations, companies, etc. This year, CERN PH Department’s SFT (Software Development for Experiments) Group took part in the GSoC programme for the third time, submitting 15 open-source projects. “Once published on the Google Summer of Code website (in April), the projects are open to applications,” says Jakob Blomer, one of the o...

  10. Focusing Automatic Code Inspections

    NARCIS (Netherlands)

    Boogerd, C.J.

    2010-01-01

    Automatic Code Inspection tools help developers in early detection of defects in software. A well-known drawback of many automatic inspection approaches is that they yield too many warnings and require a clearer focus. In this thesis, we provide such focus by proposing two methods to prioritize

  11. Integrated burnup calculation code system SWAT

    Energy Technology Data Exchange (ETDEWEB)

    Suyama, Kenya; Hirakawa, Naohiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Iwasaki, Tomohiko

    1997-11-01

    SWAT is an integrated burnup code system developed for the analysis of post-irradiation examinations, transmutation of radioactive waste, and burnup credit problems. It enables us to analyze burnup problems using a neutron spectrum that depends on the irradiation environment, combining SRAC, the Japanese standard thermal reactor analysis code system, with ORIGEN2, a burnup code widely used all over the world. SWAT builds an effective cross section library based on results from SRAC and performs the burnup analysis with ORIGEN2 using that library. SRAC and ORIGEN2 can be called as external modules. SWAT has an original cross section library based on JENDL-3.2 and libraries of fission yield and decay data prepared from the JNDC FP Library, second version. Using these libraries, the user can use the latest data in SWAT calculations besides the effective cross sections prepared by SRAC. The user can also make an original ORIGEN2 library using the output file of SWAT. This report presents the concept and user's manual of SWAT. (author)

  12. Prospective Coding by Spiking Neurons.

    Directory of Open Access Journals (Sweden)

    Johanni Brea

    2016-06-01

    Animals learn to make predictions, such as associating the sound of a bell with upcoming feeding or predicting a movement that a motor command is eliciting. How predictions are realized on the neuronal level and what plasticity rule underlies their learning is not well understood. Here we propose a biologically plausible synaptic plasticity rule to learn predictions on a single-neuron level on a timescale of seconds. The learning rule allows a spiking two-compartment neuron to match its current firing rate to its own expected future discounted firing rate. For instance, if an originally neutral event is repeatedly followed by an event that elevates the firing rate of a neuron, the originally neutral event will eventually also elevate the neuron's firing rate. The plasticity rule is a form of spike-timing-dependent plasticity in which a presynaptic spike followed by a postsynaptic spike leads to potentiation. Even if the plasticity window has a width of 20 milliseconds, associations on the time scale of seconds can be learned. We illustrate prospective coding with three examples: learning to predict a time-varying input, learning to predict the next stimulus in a delayed paired-associate task, and learning with a recurrent network to reproduce a temporally compressed version of a sequence. We discuss the potential role of the learning mechanism in classical trace conditioning. In the special case that the signal to be predicted encodes reward, the neuron learns to predict the discounted future reward, and learning is closely related to the temporal difference learning algorithm TD(λ).
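
    The temporal-difference connection mentioned above can be illustrated with a classical tabular TD(0) toy (an illustrative analogue, not the spiking two-compartment rule proposed in the record): states visited in sequence learn to predict a discounted future event.

```python
# Classical tabular TD(0) toy (an analogue of the temporal-difference
# connection, not the spiking-neuron rule itself). A chain of states
# s0 -> s1 -> s2 -> s3 is traversed repeatedly; an "elevating" event of
# magnitude 1 occurs on entering s3. Earlier states learn to predict
# the discounted future event.
gamma, alpha = 0.9, 0.1
V = [0.0, 0.0, 0.0, 0.0]          # predictions for states s0..s3
signal = [0.0, 0.0, 0.0, 1.0]     # event received on entering each state

for _ in range(2000):             # repeated presentations of the sequence
    for s in range(3):
        # TD(0) update: move V[s] toward signal(s+1) + gamma * V[s+1]
        target = signal[s + 1] + gamma * V[s + 1]
        V[s] += alpha * (target - V[s])

# s2 predicts the event itself; s0 predicts it discounted twice.
assert abs(V[2] - 1.0) < 1e-3
assert abs(V[0] - 0.9 ** 2) < 1e-3
```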

  13. Fundamentals of coding and reimbursement.

    Science.gov (United States)

    Price, Paula

    2002-01-01

    After completing this introduction to radiology coding and reimbursement, readers will: Understand how health care reimbursement evolved over the past 50 years. Know the importance of documenting the patient's history. Have an overall picture of the standardized numerical coding system. Understand how accurate coding affects reimbursement. Understand coding functions as they pertain to regulatory compliance in the radiology department. Be familiar with the U.S. Justice Department's use of coding in tracking health care fraud.

  14. On Asymmetric Quantum MDS Codes

    CERN Document Server

    Ezerman, Martianus Frederic; Ling, San

    2010-01-01

    Assuming the validity of the MDS Conjecture, the weight distribution of all MDS codes is known. Using a recently-established characterization of asymmetric quantum error-correcting codes, linear MDS codes can be used to construct asymmetric quantum MDS codes with $d_{z} \geq d_{x} \geq 2$ for all possible values of the length $n$ for which linear MDS codes over $\mathbb{F}_{q}$ are known to exist.

  15. ParSplice, Version 1

    Energy Technology Data Exchange (ETDEWEB)

    2017-01-05

    The ParSplice code implements the Parallel Trajectory Splicing algorithm described in [1]. This method is part of the Accelerated Molecular Dynamics family of techniques developed in Los Alamos National Laboratory over the last 16 years. These methods aim at generating high-quality trajectories of ensembles of atoms in materials. ParSplice uses multiple independent replicas of the system in order to parallelize the generation of such trajectories in the time domain, enabling simulations of systems of modest size over very long timescales. ParSplice includes capabilities to store configurations of the system, to generate and distribute tasks across a large number of processors, and to harvest the results of these tasks to generate long trajectories. ParSplice is a management layer that orchestrate large number of calculations, but it does not perform the actual molecular dynamics itself; this is done by external molecular dynamics engines. [1] Danny Perez, Ekin D Cubuk, Amos Waterland, Efthimios Kaxiras, Arthur F Voter, Long-time dynamics through parallel trajectory splicing, Journal of chemical theory and computation 12, 18 (2015)

  16. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its...... strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...... correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking...

  17. A user's guide to Sandia's latin hypercube sampling software : LHS UNIX library/standalone version.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Wyss, Gregory Dane

    2004-07-01

    This document is a reference guide for the UNIX Library/Standalone version of the Latin Hypercube Sampling Software, which has been developed to generate Latin hypercube multivariate samples. This version runs on Linux or UNIX platforms. The manual covers the use of the LHS code in a UNIX environment, run either as a standalone program or as a callable library (for example, from the DAKOTA environment). The underlying code is almost identical to the updated Windows version of LHS released in 1998 (SAND98-0210), with some modifications to customize it for a UNIX environment and for use as a callable library.
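
    Latin hypercube sampling itself is simple to sketch: split each dimension of the unit cube into N equal strata, draw one point per stratum, and shuffle the strata independently per dimension, so every one-dimensional projection of the sample is stratified. The function below is a minimal pure-Python illustration and does not reproduce the LHS code's input format, distribution options, or correlation control.

```python
import random

def latin_hypercube(n_samples, n_dims, rng=None):
    """Draw a Latin hypercube sample from the unit hypercube: each of the
    n_samples equal strata of [0, 1) contains exactly one point in every
    dimension."""
    rng = rng or random.Random()
    columns = []
    for _ in range(n_dims):
        # one uniform draw per stratum, then shuffle the strata so the
        # pairing across dimensions is random
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)
        columns.append(strata)
    # transpose columns into points
    return [tuple(col[i] for col in columns) for i in range(n_samples)]

points = latin_hypercube(5, 2, random.Random(42))
```

    With 5 samples in 2 dimensions, each axis ends up with exactly one point in each of the intervals [0, 0.2), [0.2, 0.4), and so on, which is the defining property of a Latin hypercube design.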

  18. TERS v2.0: An improved version of TERS

    Science.gov (United States)

    Nath, S.

    2009-11-01

    We present a new version of the semimicroscopic Monte Carlo code "TERS". The procedure for calculating multiple small angle Coulomb scattering of the residues in the target has been modified. Target-backing and residue charge-reset foils, which are often used in heavy ion-induced complete fusion reactions, are included in the code. New version program summary: Program title: TERS v2.0 Catalogue identifier: AEBD_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBD_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 7309 No. of bytes in distributed program, including test data, etc.: 1 219 555 Distribution format: tar.gz Programming language: C Computer: The code has been developed and tested on a PC with Intel Pentium IV processor. Operating system: Linux RAM: About 8 Mbytes Classification: 17.7 External routines: pgplot graphics subroutine library [1] should be installed in the system for generating residue trajectory plots. (The library is included in the CPC distribution file.) Catalogue identifier of previous version: AEBD_v1_0 Journal reference of previous version: Comput. Phys. Comm. 179 (2008) 492 Does the new version supersede the previous version?: Yes Nature of problem: Recoil separators are employed to select and identify nuclei of interest, produced in a nuclear reaction, rejecting unreacted beam and other undesired reaction products. It is important to know what fraction of the selected nuclei, leaving the target, reach the detection system. This information is crucial for determining absolute cross section of the studied reaction. Solution method: Interaction of projectiles with target nuclei is treated event by event, semimicroscopically. Position and angle (with respect to beam direction), energy and charge state of the reaction products are
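
    The event-by-event Monte Carlo idea behind such transmission-efficiency estimates can be sketched with a toy model: give each residue a random transverse scattering angle picked up in the target and foils, and count the fraction that falls inside the separator's angular acceptance. The Gaussian scattering width, the circular acceptance, and the numbers below are illustrative assumptions only; TERS's actual semimicroscopic treatment is far more detailed.

```python
import math
import random

def transmission_fraction(n_events, sigma_mrad, acceptance_mrad, rng):
    """Event-by-event toy estimate: each residue leaves the target with a
    random scattering angle (Gaussian of width sigma_mrad in each transverse
    direction); return the fraction inside a circular angular acceptance."""
    accepted = 0
    for _ in range(n_events):
        tx = rng.gauss(0.0, sigma_mrad)  # angle picked up in target/foils
        ty = rng.gauss(0.0, sigma_mrad)
        if math.hypot(tx, ty) < acceptance_mrad:
            accepted += 1
    return accepted / n_events

rng = random.Random(7)
eff = transmission_fraction(100_000, sigma_mrad=5.0, acceptance_mrad=10.0, rng=rng)
```

    For these assumed numbers the radial angle follows a Rayleigh distribution, so the exact answer is 1 - exp(-2) ≈ 0.865; the Monte Carlo estimate converges to it as the event count grows.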

  19. Fountain Codes with Multiplicatively Repeated Non-Binary LDPC Codes

    CERN Document Server

    Kasai, Kenta

    2010-01-01

    We study fountain codes transmitted over binary-input symmetric-output channels. For channels with small capacity, the receiver needs to collect many channel outputs to recover the information bits. Since each collected channel output yields a check node in the decoding Tanner graph, a channel with small capacity leads to large decoding complexity. In this paper, we introduce a novel fountain coding scheme based on non-binary LDPC codes. The decoding complexity of the proposed fountain code does not depend on the channel. Numerical experiments show that the proposed codes exhibit better performance than conventional fountain codes, especially for small numbers of information bits.
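
    A minimal binary fountain code makes the check-node picture above concrete: every received output symbol is the XOR of a random subset of source bits, i.e. one check equation, and a peeling decoder repeatedly resolves equations that involve a single unknown bit. The toy degree distribution and block size below are illustrative assumptions; the paper's multiplicatively repeated non-binary LDPC construction is not reproduced here.

```python
import random

def encode_symbol(source, rng):
    """One fountain output symbol: the XOR of a random nonempty subset of
    the source bits (a toy degree distribution, not an optimized one)."""
    deg = rng.randint(1, 3)
    idx = rng.sample(range(len(source)), deg)
    val = 0
    for i in idx:
        val ^= source[i]
    return idx, val

def peel(received):
    """Peeling decoder: resolve check equations with a single still-unknown
    source bit, substitute the recovered value, and repeat to convergence."""
    eqs = [[set(idx), val] for idx, val in received]
    known = {}
    changed = True
    while changed:
        changed = False
        for eq in eqs:
            for i in [j for j in eq[0] if j in known]:
                eq[0].discard(i)          # substitute already-recovered bits
                eq[1] ^= known[i]
            if len(eq[0]) == 1:
                i = eq[0].pop()
                if i not in known:
                    known[i] = eq[1]
                    changed = True
    return known

rng = random.Random(3)
source = [rng.randint(0, 1) for _ in range(8)]
received, recovered = [], {}
for _ in range(200):  # rateless: collect outputs until decoding succeeds
    received.append(encode_symbol(source, rng))
    recovered = peel(received)
    if len(recovered) == len(source):
        break
```

    Each extra output symbol adds one check node, which is exactly why a low-capacity channel (many symbols needed) inflates the Tanner graph and the decoding cost of a conventional fountain code.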

  20. Quantum codes from linear codes over finite chain rings

    Science.gov (United States)

    Liu, Xiusheng; Liu, Hualu

    2017-10-01

    In this paper, we provide two methods of constructing quantum codes from linear codes over finite chain rings. The first is derived from the Calderbank-Shor-Steane (CSS) construction applied to self-dual codes over finite chain rings. The second is derived from the CSS construction applied to Gray images of linear codes over the finite chain ring \mathbb{F}_{p^{2m}}+u\mathbb{F}_{p^{2m}}. Quantum codes with good parameters are obtained from cyclic codes over finite chain rings.
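
    In the simplest binary case, the CSS recipe that such constructions build on can be checked directly: if a classical [n, k] code contains its dual (equivalently, its parity-check matrix satisfies H·H^T = 0 over GF(2)), the CSS construction yields an [[n, 2k - n]] quantum code. The sketch below verifies this for the [7, 4] Hamming code, giving the [[7, 1]] Steane code; the chain-ring and Gray-image machinery of the paper is not modeled here.

```python
# Parity-check matrix H of the [7, 4] Hamming code; its rows generate the
# dual code, the [7, 3] simplex code.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def gf2_dot(u, v):
    """Inner product over GF(2)."""
    return sum(a & b for a, b in zip(u, v)) % 2

# C contains its dual  <=>  every pair of rows of H is orthogonal,
# i.e. H * H^T = 0 over GF(2) (the dual code is self-orthogonal).
dual_containing = all(gf2_dot(r, s) == 0 for r in H for s in H)

n, k = 7, 4
# CSS applied to a dual-containing [n, k] code gives an [[n, 2k - n]] code.
qubits, logical = n, 2 * k - n
```

    For self-dual codes, as in the paper's first construction, k = n/2 and the same formula gives [[n, 0]] stabilizer states; dual-containing codes like the Hamming code are the case that yields codes with logical qubits.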