WorldWideScience

Sample records for performance code conceptual

  1. Fuel performance analysis code 'FAIR'

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1994-01-01

    For modelling the behaviour of nuclear reactor fuel rods in water cooled reactors under severe power maneuvering and high burnups, a mechanistic fuel performance analysis code FAIR has been developed. The code incorporates a finite element based thermomechanical module, a physically based fission gas release module and relevant models for fuel related phenomena such as pellet cracking, densification and swelling, radial flux redistribution across the pellet due to the build up of plutonium near the pellet surface, and pellet clad mechanical interaction/stress corrosion cracking (PCMI/SCC) failure of the sheath. The code follows the established principles of fuel rod analysis programmes, such as coupling of the thermal and mechanical solutions along with the fission gas release calculations, analysing the different axial segments of a fuel rod simultaneously, and providing means for local analyses such as clad ridging analysis. The modular nature of the code offers flexibility in easily modifying the code to model MOX fuels and thorium based fuels. To analyse fuel rods subjected to very long power histories within a reasonable amount of time, the code has been parallelised and commissioned on the ANUPAM parallel processing system developed at Bhabha Atomic Research Centre (BARC). (author). 37 refs

  2. Conceptual frameworks of individual work performance: A systematic review

    NARCIS (Netherlands)

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Schaufeli, W.B.; Vet, H.C.W. de; Beek, A.J. van der

    2011-01-01

    Objective: Individual work performance is differently conceptualized and operationalized in different disciplines. The aim of the current review was twofold: (1) identifying conceptual frameworks of individual work performance and (2) integrating these to reach a heuristic conceptual framework.

  3. Conceptual frameworks of individual work performance a systematic review

    NARCIS (Netherlands)

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Schaufeli, W.B.; de Vet, H.C.W.; van der Beek, A.J.

    2011-01-01

    Objective: Individual work performance is differently conceptualized and operationalized in different disciplines. The aim of the current review was twofold: (1) identifying conceptual frameworks of individual work performance and (2) integrating these to reach a heuristic conceptual framework.

  4. The fuel performance code future

    International Nuclear Information System (INIS)

    Ronchi, C.; Van de Laar, J.

    1988-01-01

    The paper describes the LWR version of the fuel performance code FUTURE, which was recently developed to calculate the fuel response (swelling, cladding deformation, release) to reactor transient conditions, starting from a broad-based description of the processes of major concern. The main physical models assumed are presented together with the scheme of the computer program

  5. Conceptual adsorption models and open issues pertaining to performance assessment

    International Nuclear Information System (INIS)

    Serne, R.J.

    1992-01-01

    Recently several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept to quantify radionuclide migration. Several distinct issues are raised by various critics. In this paper I provide some perspective on issues surrounding the modeling of nuclide retardation. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing which attributes that affect adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass-action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors, and the status of incorporating sophisticated adsorption models into transport codes. 86 refs., 1 fig., 1 tab
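
    In transport calculations the constant Rd model enters through the retardation factor R = 1 + (ρb/θ)Rd, by which a sorbing nuclide travels more slowly than the groundwater. A minimal sketch, with illustrative parameter values not taken from the report:

```python
def retardation_factor(rd_ml_per_g, bulk_density_g_per_cm3, porosity):
    """Retardation factor R = 1 + (rho_b / theta) * Rd for the constant-Rd model.

    rd_ml_per_g: distribution coefficient Rd in mL/g (= cm^3/g).
    bulk_density_g_per_cm3: dry bulk density of the porous medium.
    porosity: water-filled porosity (dimensionless).
    """
    return 1.0 + (bulk_density_g_per_cm3 / porosity) * rd_ml_per_g

def nuclide_travel_time(water_travel_time_yr, r):
    """A retarded nuclide moves R times more slowly than the groundwater."""
    return water_travel_time_yr * r

# Illustrative values: Rd = 10 mL/g, bulk density 1.6 g/cm^3, porosity 0.3
r = retardation_factor(10.0, 1.6, 0.3)
print(f"R = {r:.1f}")                                      # R = 54.3
print(f"travel time = {nuclide_travel_time(1000.0, r):.0f} yr")
```

    The critiques the paper surveys stem from this model's assumption that Rd is a single constant, independent of water chemistry and solid-phase properties.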

  6. Conceptual adsorption models and open issues pertaining to performance assessment

    International Nuclear Information System (INIS)

    Serne, R.J.

    1991-10-01

    Recently several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept to quantify radionuclide migration. Several distinct issues are raised by various critics. In this paper I provide some perspective on issues surrounding the modeling of nuclide retardation. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing which attributes that affect adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass-action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors, and the status of incorporating sophisticated adsorption models into transport codes

  7. Driver Performance Model: 1. Conceptual Framework

    National Research Council Canada - National Science Library

    Heimerl, Joseph

    2001-01-01

    ...'. At the present time, no such comprehensive model exists. This report discusses a conceptual framework designed to encompass the relationships, conditions, and constraints related to direct, indirect, and remote modes of driving and thus provides a guide or 'road map' for the construction and creation of a comprehensive driver performance model.

  8. Safety performance of preliminary KALIMER conceptual design

    Energy Technology Data Exchange (ETDEWEB)

    Hahn Dohee; Kim Kyoungdoo; Kwon Youngmin; Chang Wonpyo; Suk Soodong [Korea Atomic Energy Research Inst., Taejon (Korea)

    1999-07-01

    The Korea Atomic Energy Research Institute (KAERI) is developing KALIMER (Korea Advanced Liquid Metal Reactor), which is a sodium cooled, 150 MWe pool-type reactor. The safety design of KALIMER emphasizes accident prevention by using passive processes, which can be accomplished by the safety design objectives, including the utilization of inherent safety features. In order to assess the effectiveness of the inherent safety features in achieving the safety design objectives, a preliminary evaluation of ATWS performance for the KALIMER design has been performed with the SSC-K code, which is a modified version of the SSC-L code. KAERI's modification of the code includes the development of reactivity feedback models for the core and a pool model for the KALIMER reactor vessel. This paper describes the models for control rod driveline expansion and the gas expansion module, the thermal-hydraulic model for the reactor pool, and the results of preliminary analyses for unprotected loss of flow and loss of heat sink. (author)

  9. Safety performance of preliminary KALIMER conceptual design

    International Nuclear Information System (INIS)

    Hahn Dohee; Kim Kyoungdoo; Kwon Youngmin; Chang Wonpyo; Suk Soodong

    1999-01-01

    The Korea Atomic Energy Research Institute (KAERI) is developing KALIMER (Korea Advanced Liquid Metal Reactor), which is a sodium cooled, 150 MWe pool-type reactor. The safety design of KALIMER emphasizes accident prevention by using passive processes, which can be accomplished by the safety design objectives, including the utilization of inherent safety features. In order to assess the effectiveness of the inherent safety features in achieving the safety design objectives, a preliminary evaluation of ATWS performance for the KALIMER design has been performed with the SSC-K code, which is a modified version of the SSC-L code. KAERI's modification of the code includes the development of reactivity feedback models for the core and a pool model for the KALIMER reactor vessel. This paper describes the models for control rod driveline expansion and the gas expansion module, the thermal-hydraulic model for the reactor pool, and the results of preliminary analyses for unprotected loss of flow and loss of heat sink. (author)

  10. Nuclear-thermal-coupled optimization code for the fusion breeding blanket conceptual design

    International Nuclear Information System (INIS)

    Li, Jia; Jiang, Kecheng; Zhang, Xiaokang; Nie, Xingchen; Zhu, Qinjun; Liu, Songlin

    2016-01-01

    Highlights: • A nuclear-thermal-coupled predesign code has been developed for optimizing the radial build arrangement of the fusion breeding blanket. • The coupling module aims at speeding up the design progress by coupling the neutronics calculation code with the thermal-hydraulic analysis code. • The radial build optimization algorithm aims at an optimal arrangement of the breeding blanket considering one or multiple specified objectives subject to design criteria such as the material temperature limit and the available TBR. - Abstract: The fusion breeding blanket, as one of the key in-vessel components, performs the functions of breeding tritium, removing the nuclear heat and the heat flux from the plasma chamber, and acting as part of the shielding system. The radial build design, which determines the arrangement of function zones and material properties in the radial direction, is the basis of the detailed design of the fusion breeding blanket. To facilitate the radial build design, this study aims to develop a pre-design code to optimize the radial build of the blanket while considering the nuclear and thermal-hydraulic performance simultaneously. Two main features of this code are: (1) coupling of the neutronics analysis with the thermal-hydraulic analysis to speed up the analysis progress; (2) a preliminary optimization algorithm using one or multiple specified objectives subject to the design criteria, in the form of constraints imposed on design variables and performance parameters within possible engineering ranges. This pre-design code has been applied to the conceptual design of the water-cooled ceramic breeding blanket in the China Fusion Engineering Testing Reactor (CFETR) project.
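
    The constrained-optimization idea in the abstract above, objectives subject to design criteria such as a temperature limit and a minimum TBR, can be sketched with toy surrogate models standing in for the coupled neutronics and thermal-hydraulic solves. Every function shape and constant below is hypothetical, not taken from the paper:

```python
import math

# Hypothetical surrogates -- stand-ins for the coupled neutronics (TBR)
# and thermal-hydraulic (peak temperature) codes; purely illustrative.
def tbr(breeder_cm):
    """Toy tritium breeding ratio: saturates with breeder-zone thickness."""
    return 1.4 * (1.0 - math.exp(-breeder_cm / 25.0))

def peak_temp_c(breeder_cm):
    """Toy peak breeder temperature: rises linearly with thickness."""
    return 400.0 + 12.0 * breeder_cm

def optimize_radial_build(t_limit_c=900.0, tbr_min=1.05):
    """Single-objective search over the design variable (breeder thickness
    in cm): the thinnest zone meeting both design criteria, or None."""
    feasible = [t for t in range(1, 61)
                if tbr(t) >= tbr_min and peak_temp_c(t) <= t_limit_c]
    return min(feasible) if feasible else None

print(optimize_radial_build())   # thinnest feasible breeder zone, cm -> 35
```

    The real code replaces these one-line surrogates with full transport and thermal-hydraulic evaluations, which is why coupling them efficiently matters.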

  11. Nuclear-thermal-coupled optimization code for the fusion breeding blanket conceptual design

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jia, E-mail: lijia@ustc.edu.cn [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230027, Anhui (China); Jiang, Kecheng; Zhang, Xiaokang [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031, Anhui (China); Nie, Xingchen [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230027, Anhui (China); Zhu, Qinjun; Liu, Songlin [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031, Anhui (China)

    2016-12-15

    Highlights: • A nuclear-thermal-coupled predesign code has been developed for optimizing the radial build arrangement of the fusion breeding blanket. • The coupling module aims at speeding up the design progress by coupling the neutronics calculation code with the thermal-hydraulic analysis code. • The radial build optimization algorithm aims at an optimal arrangement of the breeding blanket considering one or multiple specified objectives subject to design criteria such as the material temperature limit and the available TBR. - Abstract: The fusion breeding blanket, as one of the key in-vessel components, performs the functions of breeding tritium, removing the nuclear heat and the heat flux from the plasma chamber, and acting as part of the shielding system. The radial build design, which determines the arrangement of function zones and material properties in the radial direction, is the basis of the detailed design of the fusion breeding blanket. To facilitate the radial build design, this study aims to develop a pre-design code to optimize the radial build of the blanket while considering the nuclear and thermal-hydraulic performance simultaneously. Two main features of this code are: (1) coupling of the neutronics analysis with the thermal-hydraulic analysis to speed up the analysis progress; (2) a preliminary optimization algorithm using one or multiple specified objectives subject to the design criteria, in the form of constraints imposed on design variables and performance parameters within possible engineering ranges. This pre-design code has been applied to the conceptual design of the water-cooled ceramic breeding blanket in the China Fusion Engineering Testing Reactor (CFETR) project.

  12. Integrative and distinctive coding of visual and conceptual object features in the ventral visual stream

    Science.gov (United States)

    Douglas, Danielle; Newsome, Rachel N; Man, Louisa LY

    2018-01-01

    A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. PMID:29393853

  13. Integrative and distinctive coding of visual and conceptual object features in the ventral visual stream.

    Science.gov (United States)

    Martin, Chris B; Douglas, Danielle; Newsome, Rachel N; Man, Louisa Ly; Barense, Morgan D

    2018-02-02

    A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. © 2018, Martin et al.

  14. Blood and Books: Performing Code Switching

    Directory of Open Access Journals (Sweden)

    Jeff Friedman

    2008-05-01

    Code switching is a linguistic term that identifies ways individuals use communication modes and registers to negotiate difference in social relations. This essay suggests that arts-based inquiry, in the form of choreography and performance, provides a suitable and efficacious location within which both verbal and nonverbal channels of code switching can be investigated. Blood and Books, a case study of dance choreography within the context of post-colonial Maori performance in Aotearoa/New Zealand, is described and analyzed for its performance of code switching. The essay is framed by a discussion of how arts-based research within tertiary higher education requires careful negotiation in the form of code switching, as performed by the author's reflexive use of vernacular and formal registers in the essay. URN: urn:nbn:de:0114-fqs0802462

  15. Performance measures for transform data coding.

    Science.gov (United States)

    Pearl, J.; Andrews, H. C.; Pratt, W. K.

    1972-01-01

    This paper develops performance criteria for evaluating transform data coding schemes under computational constraints. Computational constraints that conform with the proposed basis-restricted model give rise to suboptimal coding efficiency characterized by a rate-distortion relation R(D) similar in form to the theoretical rate-distortion function. Numerical examples of this performance measure are presented for Fourier, Walsh, Haar, and Karhunen-Loeve transforms.
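
    The transform comparison described above can be illustrated numerically: under high-rate assumptions, a transform's coding performance on a stationary source is summarized by its coding gain, the ratio of the arithmetic to the geometric mean of the transform-coefficient variances, with the Karhunen-Loeve transform (KLT) as the optimum. A sketch for an AR(1) source; the correlation value and block size are illustrative choices, not from the paper:

```python
import numpy as np

def ar1_covariance(n, rho):
    """Covariance R[i, j] = rho^|i-j| of a unit-variance AR(1) source."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

def dct_matrix(n):
    """Orthonormal DCT-II basis (rows are basis vectors)."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    c = np.cos(np.pi * k * (2 * m + 1) / (2 * n))
    c[0, :] *= 1.0 / np.sqrt(2.0)
    return c * np.sqrt(2.0 / n)

def coding_gain(basis, cov):
    """Coding gain: arithmetic / geometric mean of coefficient variances.

    Higher gain means better energy compaction and, under high-rate
    assumptions, lower distortion at the same bit rate."""
    var = np.diag(basis @ cov @ basis.T)
    return var.mean() / np.exp(np.log(var).mean())

cov = ar1_covariance(8, 0.95)
print(f"identity gain: {coding_gain(np.eye(8), cov):.3f}")   # 1.000
print(f"DCT gain:      {coding_gain(dct_matrix(8), cov):.3f}")
# The KLT (eigenvectors of the covariance) is the optimum:
w, v = np.linalg.eigh(cov)
print(f"KLT gain:      {coding_gain(v.T, cov):.3f}")
```

    For highly correlated sources the DCT gain comes close to the KLT optimum, which is one reason fixed transforms are attractive under computational constraints.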

  16. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    FRAPCON fuel performance code is being modified to be able to model performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.
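
    The verification approach described above, checking a fuel-code thermal solution against an independent reference, can be illustrated on a far simpler problem than FRAPCON's. The sketch below solves steady radial conduction in a solid cylindrical pellet with uniform heating and constant conductivity by finite differences and verifies it against the closed-form solution T(r) = Ts + q'''(R^2 - r^2)/(4k); real fuel codes use temperature-dependent conductivity, and all numbers are illustrative:

```python
import numpy as np

def pellet_temperature_fd(n, radius, k, q_vol, t_surface):
    """Steady 1D radial conduction in a solid cylinder with uniform heating.

    Solves (1/r) d/dr(k r dT/dr) + q''' = 0 with T(R) = t_surface and a
    symmetry condition at r = 0, using second-order central differences.
    Constant conductivity k is a simplification of real fuel behaviour.
    """
    r = np.linspace(0.0, radius, n)
    h = r[1] - r[0]
    a = np.zeros((n, n))
    b = np.full(n, -q_vol / k * h * h)
    # Axis (r = 0): the limit of the equation is 2*T'' = -q'''/k, and with
    # T'(0) = 0 the second derivative is approximated by 2*(T1 - T0)/h^2.
    a[0, 0], a[0, 1] = -4.0, 4.0
    for i in range(1, n - 1):
        a[i, i - 1] = 1.0 - h / (2.0 * r[i])
        a[i, i] = -2.0
        a[i, i + 1] = 1.0 + h / (2.0 * r[i])
    a[n - 1, n - 1] = 1.0              # Dirichlet surface temperature
    b[n - 1] = t_surface
    return r, np.linalg.solve(a, b)

# Verify against the analytic solution T(r) = Ts + q'''*(R^2 - r^2)/(4k)
R, k, q, Ts = 0.004, 3.0, 3.0e8, 700.0         # m, W/m-K, W/m^3, K
r, t_num = pellet_temperature_fd(41, R, k, q, Ts)
t_exact = Ts + q * (R**2 - r**2) / (4.0 * k)
print(f"max |error| = {np.abs(t_num - t_exact).max():.3e} K")
```

    Because the exact solution is quadratic in r, the second-order scheme reproduces it to machine precision, which makes this a clean code-to-analytic verification case in the spirit of the report.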

  17. The NMC code: conduct, performance and ethics.

    Science.gov (United States)

    Goldsmith, Jan

    The Code: Standards of Conduct, Performance and Ethics for Nurses and Midwives is a set of key principles that should underpin the practice of all nurses and midwives, and remind them of their professional responsibilities. It is not just a tool used in fitness-to-practise cases--it should be used to guide daily practice for all nurses and midwives. Alongside other standards, guidance and advice from the NMC, the code should be used to support professional development.

  18. Performance analysis of conceptual waste package designs in salt repositories

    International Nuclear Information System (INIS)

    Jansen, G. Jr.; Raines, G.E.; Kircher, J.F.

    1984-01-01

    A performance analysis of commercial high-level waste and spent fuel conceptual package designs in reference repositories in three salt formations was conducted with the WAPPA waste package code. Expected conditions for temperature, stress, brine composition, radiation level, and brine flow rate were used as boundary conditions to compute expected corrosion of a thick-walled overpack of 1025 wrought steel. In all salt formations corrosion by low Mg salt-dissolution brines typical of intrusion scenarios was too slow to cause the package to fail for thousands of years after burial. In high Mg brines judged typical of thermally migrating brines in bedded salt formations, corrosion rates which would otherwise have caused the packages to fail within a few hundred years were limited by brine availability. All of the brine reaching the package was consumed by reaction with the iron in the overpack, thus preventing further corrosion. Uniform brine distribution over the package surface was an important factor in predicting long package lifetimes for the high Mg brines. 14 references, 15 figures

  19. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  20. The METEOR/TRANSURANUS fuel performance code

    International Nuclear Information System (INIS)

    Struzik, C.; Guerin, Y.

    1996-01-01

    The first calculations for the FUMEX exercise were performed using version 1.1 of the METEOR/TRANSURANUS code. Since then, important improvements have been implemented in several models. In its present state, the code describes fuel rod behaviour in standard PWR conditions. Its validity extends to UO2 and MOX fuels clad in Zircaloy-4. Power transient calculations for UO2 and Gd-doped fuels are possible, but further developments are in progress, and the applications will be fully qualified in version 2.0. A considerable effort is being made to replace semi-empirical models with models that have a sounder physical basis. (authors). 14 refs

  21. The UK core performance code package

    International Nuclear Information System (INIS)

    Hutt, P.K.; Gaines, N.; McEllin, M.; White, R.J.; Halsall, M.J.

    1991-01-01

    Over the last few years work has been co-ordinated by Nuclear Electric, originally part of the Central Electricity Generating Board, with contributions from the United Kingdom Atomic Energy Authority and British Nuclear Fuels Limited, to produce a generic, easy-to-use and integrated package of core performance codes able to perform a comprehensive range of calculations for fuel cycle design, safety analysis and on-line operational support for Light Water Reactor and Advanced Gas Cooled Reactor plant. The package consists of modern rationalized generic codes for lattice physics (WIMS), whole reactor calculations (PANTHER), thermal hydraulics (VIPRE) and fuel performance (ENIGMA). These codes, written in FORTRAN77, are highly portable and new developments have followed modern quality assurance standards. These codes can all be run "stand-alone" but they are also being integrated within a new UNIX-based interactive system called the Reactor Physics Workbench (RPW). The RPW provides an interactive user interface and a sophisticated data management system. It offers quality assurance features to the user and has facilities for defining complex calculational sequences. The paper reviews the current capabilities of these components and their integration within the package, and outlines future developments underway. Finally, the paper describes the development of an on-line version of this package which is now being commissioned on UK AGR stations. (author)

  22. Alternative conceptual models and codes for unsaturated flow in fractured tuff: Preliminary assessments for GWTT-95

    International Nuclear Information System (INIS)

    Ho, C.K.; Altman, S.J.; Arnold, B.W.

    1995-09-01

    Groundwater travel time (GWTT) calculations will play an important role in addressing site-suitability criteria for the potential high-level nuclear waste repository at Yucca Mountain, Nevada. In support of these calculations, preliminary assessments of the candidate codes and models are presented in this report. A series of benchmark studies has been designed to address important aspects of modeling flow through fractured media representative of flow at Yucca Mountain. Three codes (DUAL, FEHMN, and TOUGH2) are compared in these benchmark studies. DUAL is a single-phase, isothermal, two-dimensional flow simulator based on the dual mixed finite element method. FEHMN is a nonisothermal, multiphase, multidimensional simulator based primarily on the finite element method. TOUGH2 is a nonisothermal, multiphase, multidimensional simulator based on the integral finite difference method. Alternative conceptual models of fracture flow consisting of the equivalent continuum model (ECM) and the dual permeability (DK) model are used in the different codes

  23. Neutron transport-burnup code MCORGS and its application in fusion fission hybrid blanket conceptual research

    Science.gov (United States)

    Shi, Xue-Ming; Peng, Xian-Jue

    2016-09-01

    Fusion science and technology has made progress in recent decades. However, commercialization of fusion reactors still faces challenges relating to higher fusion energy gain, irradiation-resistant materials, and tritium self-sufficiency. Fusion Fission Hybrid Reactors (FFHR) can be introduced to accelerate the early application of fusion energy. Traditionally, FFHRs have been classified as either breeders or transmuters. Both need partition of plutonium from spent fuel, which poses nuclear proliferation risks. A conceptual design of a Fusion Fission Hybrid Reactor for Energy (FFHR-E), which can make full use of natural uranium with lower nuclear proliferation risk, is presented. The fusion core parameters are similar to those of the International Thermonuclear Experimental Reactor. An alloy of natural uranium and zirconium is adopted in the fission blanket, which is cooled by light water. In order to model blanket burnup problems, a linkage code MCORGS, which couples MCNP4B and ORIGEN-S, has been developed and validated through several typical benchmarks. The average blanket energy multiplication and tritium breeding ratio can be maintained at 10 and 1.15, respectively, over tens of years of continuous irradiation. If simple reprocessing without separation of plutonium from uranium is adopted every few years, FFHR-E can achieve better neutronic performance. MCORGS has also been used to analyze the ultra-deep burnup model of Laser Inertial Confinement Fusion Fission Energy (LIFE) from LLNL, and a new blanket design that uses Pb instead of Be as the neutron multiplier is proposed. In addition, MCORGS has been used to simulate the fluid transmuter model of the In-Zinerator from Sandia. A brief comparison of LIFE, In-Zinerator, and FFHR-E is given.
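
    The coupling scheme a linkage code of this kind implements, alternating a transport solve with a depletion solve over each irradiation interval, can be sketched with a two-nuclide chain. The transport step below is a stand-in stub, not MCNP, and the one-group cross sections are rough illustrative values:

```python
# Minimal sketch of transport/depletion operator splitting: a transport
# solve yields a one-group flux, a depletion (Bateman) solve advances the
# nuclide densities, and the loop repeats with the updated composition.

BARN = 1e-24  # cm^2

def transport_stub(n):
    """Placeholder for the transport step: a one-group flux (n/cm^2/s)
    that scales with the remaining U238 density (purely illustrative)."""
    return 1.0e14 * n["U238"] / 1.0e22

def depletion_step(n, flux, dt, sig_c_u238=2.7, sig_a_pu239=1300.0):
    """Explicit-Euler Bateman step for the lumped two-nuclide chain
    U238 --(n,gamma + 2 beta)--> Pu239 over dt seconds.
    Cross sections (barns) are rough one-group assumptions."""
    capture = sig_c_u238 * BARN * flux * n["U238"]
    removal = sig_a_pu239 * BARN * flux * n["Pu239"]
    return {"U238": n["U238"] - capture * dt,
            "Pu239": n["Pu239"] + (capture - removal) * dt}

n = {"U238": 1.0e22, "Pu239": 0.0}     # atom densities, atoms/cm^3
dt = 30 * 24 * 3600.0                  # one-month coupling interval
for step in range(12):                 # one year of irradiation
    flux = transport_stub(n)           # (1) transport solve
    n = depletion_step(n, flux, dt)    # (2) depletion solve
print(f"Pu239 inventory after 1 y: {n['Pu239']:.3e} atoms/cm^3")
```

    A production code replaces the stub with a full transport calculation and the two-nuclide chain with hundreds of nuclides, but the alternating structure of the loop is the same.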

  24. SCANAIR: A transient fuel performance code

    International Nuclear Information System (INIS)

    Moal, Alain; Georgenthum, Vincent; Marchand, Olivier

    2014-01-01

    Highlights: • Since the early 1990s, the SCANAIR code has been developed at IRSN. • The software focuses on studying fast transients such as RIA in light water reactors. • The fuel rod modelling is based on a 1.5D approach. • The thermal and thermal-hydraulics, mechanical and gas behaviour resolutions are coupled. • The code is used for safety assessment and integral tests analysis. - Abstract: Since the early 1990s, the French “Institut de Radioprotection et de Sûreté Nucléaire” (IRSN) has developed the SCANAIR computer code with a view to analysing pressurised water reactor (PWR) safety. This software specifically focuses on studying fast transients such as reactivity-initiated accidents (RIA) caused by the possible ejection of control rods. The code aims at improving the global understanding of the physical mechanisms governing the thermal-mechanical behaviour of a single rod. It is currently used to analyse integral tests performed in the CABRI and NSRR experimental reactors. The resulting validated code is used to carry out studies required to evaluate margins in relation to criteria for the different types of fuel rods used in nuclear power plants. Because the phenomena occurring during fast power transients are complex, the simulation in SCANAIR is based on a close coupling between several modules aimed at modelling thermal, thermal-hydraulics, mechanical and gas behaviour. During the first stage of fast power transients, clad deformation is mainly governed by the pellet–clad mechanical interaction (PCMI). At a later stage, heat transfers from pellet to clad bring the cladding material to such high temperatures that the boiling crisis might occur. The significant over-pressurisation of the rod and the maintaining of the cladding material at elevated temperatures during a fairly long period can lead to ballooning and possible clad failure. A brief introduction describes the context, the historical background and recalls the main phenomena involved under

  25. SCANAIR: A transient fuel performance code

    Energy Technology Data Exchange (ETDEWEB)

    Moal, Alain, E-mail: alain.moal@irsn.fr; Georgenthum, Vincent; Marchand, Olivier

    2014-12-15

    Highlights: • Since the early 1990s, the SCANAIR code has been developed at IRSN. • The software focuses on studying fast transients such as RIA in light water reactors. • The fuel rod modelling is based on a 1.5D approach. • The thermal and thermal-hydraulics, mechanical and gas behaviour resolutions are coupled. • The code is used for safety assessment and integral tests analysis. - Abstract: Since the early 1990s, the French “Institut de Radioprotection et de Sûreté Nucléaire” (IRSN) has developed the SCANAIR computer code with a view to analysing pressurised water reactor (PWR) safety. This software specifically focuses on studying fast transients such as reactivity-initiated accidents (RIA) caused by the possible ejection of control rods. The code aims at improving the global understanding of the physical mechanisms governing the thermal-mechanical behaviour of a single rod. It is currently used to analyse integral tests performed in the CABRI and NSRR experimental reactors. The resulting validated code is used to carry out studies required to evaluate margins in relation to criteria for the different types of fuel rods used in nuclear power plants. Because the phenomena occurring during fast power transients are complex, the simulation in SCANAIR is based on a close coupling between several modules aimed at modelling thermal, thermal-hydraulics, mechanical and gas behaviour. During the first stage of fast power transients, clad deformation is mainly governed by the pellet–clad mechanical interaction (PCMI). At a later stage, heat transfers from pellet to clad bring the cladding material to such high temperatures that the boiling crisis might occur. The significant over-pressurisation of the rod and the maintaining of the cladding material at elevated temperatures during a fairly long period can lead to ballooning and possible clad failure. A brief introduction describes the context, the historical background and recalls the main phenomena involved under

  26. A Constructive Conceptual Approach to Strategic Performance Measurement

    DEFF Research Database (Denmark)

    Mitchell, Falconer; Nielsen, Lars Bråd; Nørreklit, Hanne

    This paper focuses on identifying the key characteristics of a good strategic performance. It does this from a conceptual base founded in the paradigm of pragmatic constructivism. This involves analysing real world activities such as strategy setting and implementation in terms of the facts on wh...

  27. Performance in Public Organizations: Clarifying the Conceptual Space

    DEFF Research Database (Denmark)

    Andersen, Lotte Bøgh; Boesen, Andreas; Holm Pedersen, Lene

    2016-01-01

    's perspective is performance being assessed? Are the criteria formal or informal? Are the criteria subjective? Which process focus and product focus do they have, if any? What is the unit of analysis? Based on these distinctions, the performance criteria of existing studies used in an empirical review...... of management and performance are classified. The results illustrate how a systematization of the conceptual space of performance in public organizations can help researchers select what to study and what to leave out with greater accuracy while also bringing greater clarity to public debates about performance....

  8. Conceptual Approach to Forming the Basic Code of Neo-Industrial Development of a Region

    Directory of Open Access Journals (Sweden)

    Elena Leonidovna Andreeva

    2017-09-01

    In this article, the authors propose the conceptual fundamentals of the “code approach” to regional neo-industrial development. The purpose of the research is to reveal the essence of the transition to a new type of industrial and economic relations through the prism of the “genetic codes” of the region. We consider these codes as a system of the “racial memory” of a territory, which determines the specificity and features of how neo-industrialization is realized. We substantiate the hypothesis that the “genetic codes” of the region influence the effectiveness of neo-industrialization. We define the participants, that is, the carriers of the codes, in the transformation of regional inheritance for the stimulation of the neo-industrial development of the region’s economy. The subject matter of the research is the distinctive features of the functioning of the region’s determinative codes. Their content determines the socio-economic specificity of the region and the features of innovative, informational, value-based and competence-based development of the territory. The determinative codes generate the dynamic codes of the region, understood as their derivatives. They have a high probability of occurrence, a higher speed of development and distribution, and internal forces that make possible the self-development of the region. The scientific contribution is the substantiation of the basic code of regional neo-industrial development. It represents the evolutionary accumulation of rapid changes of the region’s innovative, informational, value-based and competence-based codes, stimulating the generation and implementation of new ideas by economic entities adapted to the historical and cultural conditions. The article presents the code model of neo-industrial development of the region, described by formulas. We applied system analysis methods, historical and civilization approaches, evolutionary and…

  9. Middle school teachers' familiarity with, interest in, performance on, and conceptual and pedagogical knowledge of light

    Science.gov (United States)

    Mbewe, Simeon

    The purpose of this study was threefold: to examine middle school teachers' familiarity with, interest in, and conceptual knowledge of and performance on light; to examine their ability to identify misconceptions on light and their suggested pedagogical ideas to address the identified misconceptions; and to establish the relationship between the middle school teachers' interest, familiarity, conceptual understanding, performance, misconception identification, and pedagogical ideas for light. Sixty-six (66) middle school science teachers enrolled in three math and science teacher professional development projects at Southern Illinois University Carbondale participated in this study. This study used a mixed-methods approach to collect and analyze data. The participants responded in writing to four different instruments: a Familiarity and Interest Questionnaire, a Conceptual Knowledge Test, a Two-tier Performance Test, and a Misconceptions Identification Questionnaire. Data were analyzed quantitatively by conducting non-parametric (Wilcoxon, Mann-Whitney U, and Kruskal-Wallis) and parametric (paired samples, independent samples, and One-Way ANOVA) tests. Qualitative data were analyzed using thematic analysis and open coding to identify emerging themes and categories. The results showed that the teachers reported high levels of familiarity with and interest in learning more about light concepts. However, they had low conceptual knowledge of and performance on light concepts. As such, middle school teachers' perceived knowledge of light concepts was not consistent with their actual knowledge of light. To some extent, the teachers identified students' misconceptions expressed in some scenarios on light and also suggested pedagogical ideas for addressing such misconceptions in middle school science classrooms. However, most teachers did not provide details on their pedagogical ideas for light. Correlations among the four constructs (familiarity, interest, conceptual understanding, and performance…

  10. Performance studies of the parallel VIM code

    International Nuclear Information System (INIS)

    Shi, B.; Blomquist, R.N.

    1996-01-01

    In this paper, the authors evaluate the performance of the parallel version of the VIM Monte Carlo code on the IBM SPx at the High Performance Computing Research Facility at ANL. Three test problems with contrasting computational characteristics were used to assess effects on performance. A statistical method for estimating the inefficiencies due to load imbalance and communication is also introduced. VIM is a large-scale continuous energy Monte Carlo radiation transport program and was parallelized using history partitioning, the master/worker approach, and the p4 message-passing library. Dynamic load balancing is accomplished by having the master processor assign chunks of histories to workers that have completed a previously assigned task, accommodating variations in the lengths of histories, processor speeds, and worker loads. At the end of each batch (generation), the fission sites and tallies are sent from each worker to the master process, contributing to the parallel inefficiency. All communications are between master and workers, and are serial. The SPx is a scalable 128-node parallel supercomputer with high-performance Omega switches of 63 microsec latency and 35 MBytes/sec bandwidth. For uniform and reproducible performance, the authors used only the 120 identical regular processors (IBM RS/6000) and excluded the remaining eight planet nodes, which may be loaded by others' jobs.
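
    The master/worker history partitioning described above can be sketched in a few lines. This is a toy illustration, not VIM code: the worker count, chunk size, and the simulated per-history work are invented, and the p4 message passing is replaced by a shared in-process queue.

```python
import queue
import random
import threading

def run_batch(total_histories, n_workers, chunk=100):
    """Toy master/worker history partitioning (not VIM itself): the
    master enqueues fixed-size chunks of histories and each worker pulls
    a new chunk as soon as it finishes its last one, so faster workers
    naturally end up tracking more histories."""
    tasks = queue.Queue()
    for start in range(0, total_histories, chunk):
        tasks.put(min(chunk, total_histories - start))
    tallies = {w: 0 for w in range(n_workers)}

    def worker(wid, speed):
        while True:
            try:
                n = tasks.get_nowait()
            except queue.Empty:
                return  # no work left: worker idles until the batch ends
            # stand-in for tracking n histories; `speed` mimics the
            # heterogeneous history lengths and processor speeds
            _ = sum(random.random() for _ in range(n * speed))
            tallies[wid] += n

    threads = [threading.Thread(target=worker, args=(w, random.choice([1, 2, 3])))
               for w in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return tallies  # the master would merge tallies and fission sites here

counts = run_batch(10_000, n_workers=4)
assert sum(counts.values()) == 10_000  # every history assigned exactly once
```

    The pull-based queue is what gives the dynamic balancing: a slow worker simply requests fewer chunks, and only the end-of-batch merge is serial, mirroring the inefficiency source noted in the abstract.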

  11. Conceptual Design and Performance Analysis for a Large Civil Compound Helicopter

    Science.gov (United States)

    Russell, Carl; Johnson, Wayne

    2012-01-01

    A conceptual design study of a large civil compound helicopter is presented. The objective is to determine how a compound helicopter performs when compared to both a conventional helicopter and a tiltrotor using a design mission that is shorter than optimal for a tiltrotor and longer than optimal for a helicopter. The designs are generated and analyzed using conceptual design software and are further evaluated with a comprehensive rotorcraft analysis code. Multiple metrics are used to determine the suitability of each design for the given mission. Plots of various trade studies and parameter sweeps as well as comprehensive analysis results are presented. The results suggest that the compound helicopter examined for this study would not be competitive with a tiltrotor or conventional helicopter, but multiple possibilities are identified for improving the performance of the compound helicopter in future research.

  12. High performance APCS conceptual design and evaluation scoping study

    International Nuclear Information System (INIS)

    Soelberg, N.; Liekhus, K.; Chambers, A.; Anderson, G.

    1998-02-01

    This Air Pollution Control System (APCS) Conceptual Design and Evaluation study was conducted to evaluate a high-performance APC system for minimizing air emissions from mixed waste thermal treatment systems. Seven variations of high-performance APCS designs were conceptualized using several design objectives. One of the system designs was selected for detailed process simulation using ASPEN PLUS to determine material and energy balances and evaluate performance. Installed system capital costs were also estimated. Sensitivity studies were conducted to evaluate the incremental cost and benefit of added carbon adsorber beds for mercury control, selective catalytic reduction for NOx control, and offgas retention tanks for holding the offgas until sample analysis verifies that the offgas meets emission limits. Results show that the high-performance dry-wet APCS can easily meet all expected emission limits except possibly for mercury. The capability to achieve high levels of mercury control (potentially necessary for thermally treating some DOE mixed waste streams) could not be validated using current performance data for mercury control technologies. The engineering approach and ASPEN PLUS modeling tool developed and used in this study identified APC equipment and system performance, size, cost, and other issues that are not yet resolved. These issues need to be addressed in feasibility studies and conceptual designs for new facilities, or for determining how to modify existing facilities to meet expected emission limits. The ASPEN PLUS process simulation, with current and refined input assumptions and calculations, can be used to provide system performance information for decision-making, identifying the best options, estimating costs, reducing the potential for emission violations, providing information needed for waste flow analysis, incorporating new APCS technologies in existing designs, or performing facility design and permitting activities.

  13. Performance Analysis of CRC Codes for Systematic and Nonsystematic Polar Codes with List Decoding

    Directory of Open Access Journals (Sweden)

    Takumi Murata

    2018-01-01

    Successive cancellation list (SCL) decoding of polar codes is an effective approach that can significantly outperform the original successive cancellation (SC) decoding, provided that proper cyclic redundancy-check (CRC) codes are employed at the stage of candidate selection. Previous studies on CRC-assisted polar codes mostly focus on improvement of the decoding algorithms as well as their implementation, and little attention has been paid to the CRC code structure itself. For CRC-concatenated polar codes with a CRC code as their outer code, the use of a longer CRC code leads to a reduction of the information rate, whereas the use of a shorter CRC code may reduce the error detection probability, thus degrading the frame error rate (FER) performance. Therefore, a CRC code of proper length should be employed in order to optimize the FER performance for a given signal-to-noise ratio (SNR) per information bit. In this paper, we investigate the effect of CRC codes on the FER performance of polar codes with list decoding in terms of the CRC code length as well as its generator polynomials. Both the original nonsystematic and systematic polar codes are considered, and we also demonstrate that different behaviors of CRC codes are observed depending on whether the inner polar code is systematic or not.
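
    The candidate-selection step the paper builds on can be illustrated with a small sketch. The 3-bit generator polynomial and the 5-bit message are hypothetical, and the candidate list stands in for the L surviving SCL paths (assumed sorted by decreasing path metric); real polar encoding and decoding are not implemented here.

```python
def crc_remainder(bits, poly):
    """Divide the word (with len(poly)-1 zeros appended) by the
    generator polynomial over GF(2); return the remainder."""
    bits = bits + [0] * (len(poly) - 1)
    for i in range(len(bits) - len(poly) + 1):
        if bits[i]:
            for j, p in enumerate(poly):
                bits[i + j] ^= p
    return bits[-(len(poly) - 1):]

def crc_append(msg, poly):
    """Append the CRC so the transmitted word leaves zero remainder."""
    return msg + crc_remainder(list(msg), poly)

def select_candidate(candidates, poly):
    """CRC-aided selection: return the most likely candidate whose CRC
    checks; fall back to the best path if none passes."""
    for cand in candidates:  # assumed sorted by decreasing path metric
        if not any(crc_remainder(list(cand), poly)):
            return cand
    return candidates[0]

POLY = [1, 0, 1, 1]                      # hypothetical CRC-3, x^3 + x + 1
tx = crc_append([1, 0, 1, 1, 0], POLY)   # transmitted word with CRC
bad = tx[:]
bad[0] ^= 1                              # a wrong decoding path
picked = select_candidate([bad, tx], POLY)
assert picked == tx                      # CRC rejects the corrupted path
```

    The trade-off the paper studies is visible here: a longer `POLY` catches more wrong paths but costs information rate, while a shorter one may let a wrong path's CRC check by chance.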

  14. Word meaning in the ventral visual path: a perceptual to conceptual gradient of semantic coding.

    Science.gov (United States)

    Borghesani, Valentina; Pedregosa, Fabian; Buiatti, Marco; Amadon, Alexis; Eger, Evelyn; Piazza, Manuela

    2016-12-01

    The meaning of words referring to concrete items is thought of as a multidimensional representation that includes both perceptual (e.g., average size, prototypical color) and conceptual (e.g., taxonomic class) dimensions. Are these different dimensions coded in different brain regions? In healthy human subjects, we tested the presence of a mapping between the implied real object size (a perceptual dimension) and the taxonomic categories at different levels of specificity (conceptual dimensions) of a series of words, and the patterns of brain activity recorded with functional magnetic resonance imaging in six areas along the ventral occipito-temporal cortical path. Combining multivariate pattern classification and representational similarity analysis, we found that the real object size implied by a word appears to be primarily encoded in early visual regions, while the taxonomic category and sub-categorical cluster in more anterior temporal regions. This anteroposterior gradient of information content indicates that different areas along the ventral stream encode complementary dimensions of the semantic space. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Methodology for thermal hydraulic conceptual design and performance analysis of KALIMER core

    International Nuclear Information System (INIS)

    Young-Gyun Kim; Won-Seok Kim; Young-Jin Kim; Chang-Kue Park

    2000-01-01

    This paper summarizes the methodology for thermal hydraulic conceptual design and performance analysis used for the KALIMER core, in particular the preliminary methodology for flow grouping and peak pin temperature calculation. The major technical results of the conceptual design for the KALIMER 98.03 core are presented and compared with those of the KALIMER 97.07 design core. The KALIMER 98.03 design core proved to be better optimized than the 97.07 design core: the number of flow groups was reduced from 16 to 11, and the equalized peak cladding midwall temperature from 654 deg. C to 628 deg. C. This was achieved through a nuclear and thermal hydraulic design optimization study, i.e. core power flattening and an increase of the radial blanket power fraction. Coolant flow distribution to the assemblies and core coolant/component temperatures must be determined in core thermal hydraulic analysis. Sodium flow is distributed to core assemblies with the overall goal of equalizing the peak cladding midwall temperatures for the peak temperature pin of each bundle, and thus equalizing pin cladding damage accumulation and pin reliability. The flow grouping and the peak pin temperature calculation for the preliminary conceptual design are performed with the modules ORFCE-F60 and ORFCE-T60, respectively. The basic subchannel analysis will be performed with the SLTHEN code, and the detailed subchannel analysis with the MATRA-LMR code, which is under development for the K-Core system. This methodology proved practical for KALIMER core thermal hydraulic design in the related benchmark calculation studies, and it is used in the KALIMER core thermal hydraulic conceptual design. (author)
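
    The flow-grouping idea — equalize peak temperatures by giving each group the flow its hottest member needs — can be sketched from a simple energy balance. This is an illustration only, not the ORFCE-F60 algorithm: the assembly powers, sodium heat capacity, and group count are invented, and a single-channel balance m = Q/(cp·ΔT) stands in for the full pin temperature calculation.

```python
def flow_groups(powers_w, n_groups, t_in=360.0, t_peak=628.0, cp=1300.0):
    """Toy flow grouping (not ORFCE-F60): from the single-channel energy
    balance m = Q / (cp * dT), compute each assembly's ideal sodium flow
    for the target peak temperature, bin the sorted assemblies into
    n_groups, and give every member of a group the largest ideal flow in
    that group so no assembly exceeds the target."""
    dT = t_peak - t_in                         # allowed coolant heat-up, K
    ideal = sorted((q / (cp * dT), q) for q in powers_w)
    size = -(-len(ideal) // n_groups)          # ceil division: group size
    assigned = []
    for i in range(0, len(ideal), size):
        group = ideal[i:i + size]
        m_grp = max(m for m, _ in group)       # group flow = hottest member's need
        for _, q in group:
            assigned.append((q, m_grp, t_in + q / (cp * m_grp)))
    return assigned  # (assembly power W, flow kg/s, outlet temperature deg C)

result = flow_groups([5.2e6, 4.8e6, 6.1e6, 3.9e6, 5.5e6], n_groups=2)
assert all(t <= 628.0 + 1e-9 for _, _, t in result)  # target respected
```

    Fewer groups simplify the orificing hardware but over-cool the low-power members of each group — the same trade-off behind reducing KALIMER's groups from 16 to 11 while holding the equalized peak temperature.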

  16. Development of intelligent code system to support conceptual design of nuclear reactor core

    International Nuclear Information System (INIS)

    Kugo, Teruhiko; Nakagawa, Masayuki; Tsuchihashi, Keichiro

    1997-01-01

    An intelligent reactor design system, IRDS, has been developed to support conceptual design of new types of reactor cores in the fields of neutronics, thermal-hydraulics and fuel behavior. The features of IRDS are summarized as follows: 1) a variety of computer codes covering the various design tasks relevant to 'static' and 'burnup' problems are implemented; 2) all the information needed by the implemented codes is unified in a data base; 3) several data and knowledge bases are consulted so that non-expert users can carry the design process forward efficiently; 4) an advanced man-machine interface for communicating with the system through an interactive and graphical user interface is provided; and 5) a function to automatically search for a design window, defined as the feasible parameter range satisfying the design requirements and criteria, is employed to support the optimization or satisficing process. The applicability and productivity of the system are demonstrated by a design study of fuel pins for new types of FBR cores. (author)
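
    The design-window search mentioned above — finding the feasible parameter range that satisfies all design criteria — can be sketched as a grid scan. The parameter names, ranges, and criteria below are invented for illustration and are not taken from IRDS.

```python
import itertools

def design_window(ranges, criteria):
    """Grid-scan 'design window' search: keep every parameter
    combination that satisfies all design criteria."""
    names = list(ranges)
    feasible = []
    for combo in itertools.product(*ranges.values()):
        design = dict(zip(names, combo))
        if all(check(design) for check in criteria):
            feasible.append(design)
    return feasible

# invented fuel-pin parameters and criteria, for illustration only
window = design_window(
    {"pin_diameter_mm": [6.0, 6.5, 7.0, 7.5],
     "pitch_mm": [8.0, 8.5, 9.0]},
    [lambda d: d["pitch_mm"] / d["pin_diameter_mm"] >= 1.15,  # coolability proxy
     lambda d: d["pitch_mm"] - d["pin_diameter_mm"] <= 2.2])  # compactness proxy
assert window, "the design window is non-empty for these ranges"
```

    In a real system each criterion would invoke a neutronics or thermal-hydraulics code rather than a closed-form check, which is why narrowing the scan intelligently matters.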

  17. CONCEPTUAL FRAMEWORK FOR ESTIMATING THE PERFORMANCE OF INTERORGANIZATIONAL COLLABORATIVE INNOVATION

    Directory of Open Access Journals (Sweden)

    Dan SERGHIE

    2013-10-01

    The construction of a sequential performance model based on indicators broken down by factors and variables is relevant because it provides a complete picture of the effectiveness of collaborative structures built and operated on the basis of policies induced by the organizations involved. It also provides a longitudinal analysis of the effectiveness of collaboration for innovation. I will define the performance analysis model of collaborative innovation as a conceptual tool consisting of a set of elements and the relationships between them, allowing the quantification of innovation performance as a result of the interaction of several organizations. Applying such a model involves the analysis and estimation of the added value of each segment of the interorganizational innovation cycle as part of the overall performance obtained by combining existing or created knowledge. From this point of view, it is necessary to develop an ontology, a common ground on which this model can be built.

  18. Conceptual-driven classification for coding advise in health insurance reimbursement.

    Science.gov (United States)

    Li, Sheng-Tun; Chen, Chih-Chuan; Huang, Fernando

    2011-01-01

    With the non-stop increases in medical treatment fees, the economic survival of a hospital in Taiwan relies on the reimbursements received from the Bureau of National Health Insurance, which in turn depend on the accuracy and completeness of the content of the discharge summaries as well as the correctness of their International Classification of Diseases (ICD) codes. The purpose of this research is to reinforce the entire disease classification framework by supporting disease classification specialists in the coding process. This study developed an ICD code advisory system (ICD-AS) that performed knowledge discovery from discharge summaries and suggested ICD codes. Natural language processing and information retrieval techniques based on Zipf's Law were applied to process the content of discharge summaries, and fuzzy formal concept analysis was used to analyze and represent the relationships between the medical terms identified by MeSH. In addition, a certainty factor used as reference during the coding process was calculated to account for uncertainty and strengthen the credibility of the outcome. Two sets of 360 and 2579 textual discharge summaries of patients suffering from cerebrovascular disease were processed to build ICD-AS and to evaluate the prediction performance. A number of experiments were conducted to investigate the impact of system parameters on accuracy and to compare the proposed model to traditional classification techniques, including linear-kernel support vector machines. The comparison results showed that the proposed system achieves better overall performance in terms of several measures. In addition, some useful implication rules were obtained, which improve comprehension of the field of cerebrovascular disease and give insights into the relationships between relevant medical terms. Our system contributes valuable guidance to disease classification specialists in the process of coding discharge summaries, which consequently brings benefits in…
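
    The Zipf's-law term selection mentioned above can be sketched as follows. This is a generic illustration, not the paper's implementation: the cut fractions and the sample summaries are invented, and the MeSH mapping and fuzzy formal concept analysis steps are omitted.

```python
from collections import Counter

def zipf_filter(documents, low_cut=0.1, high_cut=0.5):
    """Rough Zipf's-law term selection: rank terms by frequency, then
    drop the head (too common to discriminate between summaries) and the
    tail (too rare to be reliable), keeping mid-rank candidate index
    terms. The cut fractions are illustrative, not from the paper."""
    counts = Counter(w for doc in documents for w in doc.lower().split())
    ranked = [w for w, _ in counts.most_common()]
    lo, hi = int(len(ranked) * low_cut), int(len(ranked) * high_cut)
    return ranked[lo:hi]

docs = ["the patient suffered cerebral infarction",          # toy summaries
        "the patient recovered after infarction treatment",
        "the scan showed cerebral hemorrhage"]
terms = zipf_filter(docs)
assert "the" not in terms  # head term dropped as non-discriminative
```

    The surviving mid-rank terms would then be matched against MeSH and fed into the concept analysis stage.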

  19. A conceptual model to improve performance in virtual teams

    Directory of Open Access Journals (Sweden)

    Shopee Dube

    2016-09-01

    Background: The vast improvement in communication technologies and sophisticated project management tools, methods and techniques has allowed geographically and culturally diverse groups to operate and function in a virtual environment. To succeed in this virtual environment, where time and space are becoming increasingly irrelevant, organisations must define new ways of implementing initiatives. This virtual environment phenomenon has brought about the formation of virtual project teams that allow organisations to harness the skills and know-how of the best resources, irrespective of their location. Objectives: The aim of this article was to investigate performance criteria and develop a conceptual model which can be applied to enhance the success of virtual project teams. There are no clear guidelines on the performance criteria for managing virtual project teams. Method: A qualitative research methodology was used in this article. The purpose of the content analysis was to explore the literature to understand the concept of performance in virtual project teams and to summarise the findings of the literature reviewed. Results: The research identified a set of performance criteria for virtual project teams: leadership, trust, communication, team cooperation, reliability, motivation, comfort and social interaction. These were used to conceptualise the model. Conclusion: The conceptual model can be used in a holistic way to determine the overall performance of the virtual project team, but each factor can also be analysed individually to determine its impact on the overall performance. Knowledge of performance criteria for virtual project teams could aid project managers in enhancing the success of these teams and in taking a different approach to better manage and coordinate them.

  20. Performance testing of thermal analysis codes for nuclear fuel casks

    International Nuclear Information System (INIS)

    Sanchez, L.C.

    1987-01-01

    In 1982 Sandia National Laboratories held the First Industry/Government Joint Thermal and Structural Codes Information Exchange and presented the initial stages of an investigation of thermal analysis computer codes for use in the design of nuclear fuel shipping casks. The objective of the investigation was to (1) document publicly available computer codes, (2) assess code capabilities as determined from their user's manuals, and (3) assess code performance on cask-like model problems. Computer codes are required to handle the thermal phenomena of conduction, convection and radiation. Several of the available thermal computer codes were tested on a set of model problems to assess performance on cask-like problems. Solutions obtained with the computer codes for steady-state thermal analysis were in good agreement and the solutions for transient thermal analysis differed slightly among the computer codes due to modeling differences
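
    The kind of steady-state model problem on which the surveyed codes agreed can be reproduced with a minimal solver: 1-D conduction through a slab with fixed wall temperatures, iterated to convergence and checked against the analytic flux q = kΔT/L. The material properties and dimensions below are invented; this is a sketch of the benchmark idea, not one of the surveyed codes.

```python
def steady_slab(k, t_hot, t_cold, length, n=51, iters=20000):
    """Jacobi iteration for steady 1-D conduction with fixed end
    temperatures; the converged profile is linear, so the computed
    wall flux should match q = k * (t_hot - t_cold) / length."""
    dx = length / (n - 1)
    T = [t_cold] * n
    T[0] = t_hot
    for _ in range(iters):
        # the comprehension is evaluated before assignment -> Jacobi sweep
        T[1:-1] = [(T[i - 1] + T[i + 1]) / 2 for i in range(1, n - 1)]
    return T, k * (T[0] - T[1]) / dx

T, q = steady_slab(k=45.0, t_hot=200.0, t_cold=40.0, length=0.2)
assert abs(q - 45.0 * 160.0 / 0.2) < 36.0  # within 0.1% of the analytic flux
```

    Agreement on such problems is expected of any correct code; as the abstract notes, it is the transient cases, where time discretization and material models differ, that separate the codes.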

  1. Performance analysis of WS-EWC coded optical CDMA networks with/without LDPC codes

    Science.gov (United States)

    Huang, Chun-Ming; Huang, Jen-Fa; Yang, Chao-Chin

    2010-10-01

    An extended Welch-Costas (EWC) code family for wavelength-division-multiplexing/spectral-amplitude-coding (WDM/SAC; WS) optical code-division multiple-access (OCDMA) networks is proposed. This system offers superior performance compared to the previously proposed modified quadratic congruence (MQC) coded OCDMA networks. However, since the performance of such a network is unsatisfactory at higher data bit rates, a class of quasi-cyclic low-density parity-check (QC-LDPC) codes is adopted to improve it. Simulation results show that the performance of the high-speed WS-EWC coded OCDMA network can be greatly improved by using the LDPC codes.

  2. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    Fuel performance codes approximate this complex behavior using an axisymmetric, axially stacked, one-dimensional radial representation to save computation cost. However, the need for improved modeling of PCMI and, in particular, the importance of multidimensional capability for accurate fuel performance simulation have been identified as safety margins decrease. The finite element (FE) method, a reliable and proven solution technique in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in FE-based fuel performance simulation predominantly involves 2-D axisymmetric models and 3-D volumetric models. FRAPCON and FRAPTRAN include 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ) fully thermal-mechanically coupled steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and typically used to investigate localized behavior. Since 2008, the Idaho National Laboratory (INL) has been developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on this investigation, the requirements and direction of development for a new FE-based fuel performance code can be discussed, and a comparison of the models in the codes shows the state of the art. A new FE-based fuel performance code should include the typical pellet and cladding models that all codes share. In particular, specific pellet and cladding models, such as gaseous swelling and high burnup structure (HBS) models, should be developed to improve the accuracy of the code as well as to cover accident conditions. To reduce computation cost, an approximated gap model and an optimized contact model should also be developed.

  3. A fuel performance code TRUST VIc and its validation

    Energy Technology Data Exchange (ETDEWEB)

    Ishida, M; Kogai, T [Nippon Nuclear Fuel Development Co. Ltd., Oarai, Ibaraki (Japan)

    1997-08-01

    This paper describes a fuel performance code TRUST V1c developed to analyze thermal and mechanical behavior of LWR fuel rod. Submodels in the code include FP gas models depicting gaseous swelling, gas release from pellet and axial gas mixing. The code has FEM-based structure to handle interaction between thermal and mechanical submodels brought by the gas models. The code is validated against irradiation data of fuel centerline temperature, FGR, pellet porosity and cladding deformation. (author). 9 refs, 8 figs.

  4. Performance evaluation based on data from code reviews

    OpenAIRE

    Andrej, Sekáč

    2016-01-01

    Context. Modern code review tools such as Gerrit have made available great amounts of code review data from different open source projects as well as other commercial projects. Code reviews are used to keep the quality of produced source code under control, but the stored data could also be used for evaluation of the software development process. Objectives. This thesis uses machine learning methods to approximate a review expert’s performance evaluation function. Due to limitations in ...

  5. A fuel performance code TRUST VIc and its validation

    International Nuclear Information System (INIS)

    Ishida, M.; Kogai, T.

    1997-01-01

    This paper describes a fuel performance code TRUST V1c developed to analyze thermal and mechanical behavior of LWR fuel rod. Submodels in the code include FP gas models depicting gaseous swelling, gas release from pellet and axial gas mixing. The code has FEM-based structure to handle interaction between thermal and mechanical submodels brought by the gas models. The code is validated against irradiation data of fuel centerline temperature, FGR, pellet porosity and cladding deformation. (author). 9 refs, 8 figs

  6. Input data required for specific performance assessment codes

    International Nuclear Information System (INIS)

    Seitz, R.R.; Garcia, R.S.; Starmer, R.J.; Dicke, C.A.; Leonard, P.R.; Maheras, S.J.; Rood, A.S.; Smith, R.W.

    1992-02-01

    The Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory generated this report on input data requirements for computer codes to assist States and compacts in their performance assessments. This report gives generators, developers, operators, and users guidelines on the input data required by 22 common performance assessment codes. Each of the codes is summarized, and a matrix table is provided to allow comparison of the various inputs required by the codes. This report does not determine or recommend which codes are preferable.

  7. Student performance on conceptual questions: Does instruction matter?

    Science.gov (United States)

    Heron, Paula R. L.

    2013-01-01

    As part of the tutorial component of introductory calculus-based physics at the University of Washington, students take weekly pretests that consist of conceptual questions. Pretests are so named because they precede each tutorial, but they are frequently administered after lecture instruction. Many variables associated with class composition and prior instruction (if any) could, in principle, affect student performance on these questions. Nonetheless, the results are often found to be "essentially the same" in all classes. With data available from a large number of classes, it is possible to characterize the typical variation quantitatively. In this paper three questions for which we have accumulated thousands of responses, from dozens of classes representing different conditions with respect to the textbook in use, the amount of prior instruction, etc., serve as examples. For each question, we examine the variation in student performance across all classes. We also compare subsets categorized according to the amount of relevant prior instruction each class had received. A preliminary analysis suggests that the variation in performance is essentially random. No statistically significant difference is observed between results obtained before relevant instruction begins and after it has been completed. The results provide evidence that exposure to concepts in lecture and textbook is not sufficient to ensure an improvement in performance on questions that require qualitative reasoning.
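
    The before/after comparison described above can be illustrated with a standard two-proportion z-test on correct-response rates. The response counts below are made up; this is a sketch of the statistical check, not the authors' actual analysis.

```python
import math

def two_proportion_z(correct_a, n_a, correct_b, n_b):
    """Two-proportion z-test: is the correct-response rate of group A
    (say, classes tested after relevant instruction) different from
    group B (classes tested before it)?"""
    p_a, p_b = correct_a / n_a, correct_b / n_b
    p = (correct_a + correct_b) / (n_a + n_b)          # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error
    return (p_a - p_b) / se

# hypothetical pretest counts: 312/900 correct after instruction,
# 298/880 correct before instruction
z = two_proportion_z(312, 900, 298, 880)
assert abs(z) < 1.96  # |z| < 1.96: no significant difference at the 5% level
```

    A non-significant z on counts like these is the statistical form of the paper's observation that lecture and textbook exposure alone did not improve performance.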

  8. Predictive Bias and Sensitivity in NRC Fuel Performance Codes

    Energy Technology Data Exchange (ETDEWEB)

    Geelhood, Kenneth J.; Luscher, Walter G.; Senor, David J.; Cunningham, Mitchel E.; Lanning, Donald D.; Adkins, Harold E.

    2009-10-01

    The latest versions of the fuel performance codes FRAPCON-3 and FRAPTRAN were examined to determine whether the codes are intrinsically conservative. Each individual model and type of code prediction was examined and compared to the data that were used to develop the model. In addition, a brief literature search was performed to determine whether more recent data for model comparison have become available since the original model development.

  9. Performance of code 'FAIR' in IAEA CRP on FUMEX

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Kakodkar, A.

    1996-01-01

    A modern fuel performance analysis code, FAIR, has been developed for analysing high burnup fuel pins of water/heavy-water cooled reactors. The code employs the finite element method for modelling the thermo-mechanical behaviour of fuel pins, and mechanistic models for the various physical and chemical phenomena affecting the behaviour of nuclear reactor fuel pins. High burnup effects such as pellet thermal conductivity degradation, enhanced fission gas release and radial flux redistribution are incorporated in the code FAIR. The code FAIR is capable of performing statistical analysis of fuel pins using the Monte Carlo technique. The code is implemented on the BARC parallel processing system ANUPAM. The code has recently participated in an International Atomic Energy Agency (IAEA) coordinated research program (CRP) on fuel modelling at extended burnups (FUMEX). Nineteen agencies from different countries participated in this exercise. In this CRP, spread over a period of three years, a number of high burnup fuel pins irradiated in the Halden reactor were analysed. The first phase of the CRP was a blind code comparison exercise, in which the computed results were compared with experimental results. The second phase consisted of modifications to the code based on the experimental results of the first phase, and statistical analysis of fuel pins. The performance of the code FAIR in this CRP has been very good. The present report highlights the main features of code FAIR and its performance in the IAEA CRP on FUMEX. 14 refs., 5 tabs., ills

  10. Performance Evaluation of Spectral Amplitude Codes for OCDMA PON

    DEFF Research Database (Denmark)

    Binti Othman, Maisara; Jensen, Jesper Bevensee; Zhang, Xu

    2011-01-01

    the MAI effects in OCDMA. The performance has been characterized through received optical power (ROP) sensitivity and dispersion tolerance assessments. The numerical results show that the ZCC code has a slightly better performance compared to the other two codes for the ROP and similar behavior against...

  11. Developments of fuel performance analysis codes in KEPCO NF

    International Nuclear Information System (INIS)

    Han, H. T.; Choi, J. M.; Jung, C. D.; Yoo, J. S.

    2012-01-01

    KEPCO NF has developed a fuel performance analysis and design code named ROPER, and the utility codes XGCOL and XDNB, in order to perform fuel rod design evaluation for Korean nuclear power plants. The ROPER code is intended to cover the full range of fuel performance evaluation; the XGCOL code is for clad flattening evaluation and the XDNB code is for extensive DNB propagation evaluation. In addition, KEPCO NF is now developing a next-generation 3-dimensional fuel performance analysis code, named OPER3D, using 3-dimensional FEM within a joint project with CANDU ENERGY, in order to analyze PCMI behavior and fuel performance under load following operation. Of these, the ROPER code is now undergoing licensing review by the Korean regulatory body, and the other two are almost in the final development stage; once development is finished, licensing activities will be performed. These activities are intended to secure competitiveness, originality and vendor-free ownership of fuel performance codes at KEPCO NF

  12. Verification of the CONPAS (CONtainment Performance Analysis System) code package

    International Nuclear Information System (INIS)

    Kim, See Darl; Ahn, Kwang Il; Song, Yong Man; Choi, Young; Park, Soo Yong; Kim, Dong Ha; Jin, Young Ho.

    1997-09-01

    CONPAS is a computer code package that automatically integrates the numerical, graphical, and results-oriented aspects of Level 2 probabilistic safety assessment (PSA) for nuclear power plants under a PC window environment. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor, (2) Computer, (3) Text Editor, and (4) Mechanistic Code Plotter. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty analysis, importance analysis, sensitivity analysis and data interpretation; reporting aspects including tables and graphics; and a user-friendly interface. The computational performance of CONPAS has been verified through a Level 2 PSA of a reference plant. The results of the CONPAS code were compared with those of an existing Level 2 PSA code (NUCAP+), and the comparison proves that CONPAS is appropriate for Level 2 PSA. (author). 9 refs., 8 tabs., 14 figs

  13. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    The finite element (FE) method, a reliable and proven solution in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in FE-based fuel performance simulation predominantly involves 2-D axisymmetric models and 3-D volumetric models. FRAPCON and FRAPTRAN include 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ) fully thermo-mechanically coupled steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and typically used to investigate localized behavior. Since 2008, the Idaho National Laboratory (INL) has been developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on an investigation and comparison of these codes and their models, the requirements for and direction of development of a new FE-based fuel performance code can be discussed. A new FE-based fuel performance code should include the typical pellet and cladding models that all the codes share. In particular, specialized pellet and cladding models, such as gaseous swelling and high burnup structure (HBS) models, should be developed to improve the accuracy of the code and to cover AC conditions. To reduce computation cost, an approximated gap model and an optimized contact model should also be developed. Nuclear fuel operates in an extreme environment that induces complex multiphysics phenomena, occurring over distances ranging from inter-atomic spacing to meters, and time scales ranging from microseconds to years. This multiphysics behavior is often tightly coupled, a well known example being the thermomechanical behavior. Adding to this complexity, important aspects of fuel behavior are inherently

  14. Performance of JPEG Image Transmission Using Proposed Asymmetric Turbo Code

    Directory of Open Access Journals (Sweden)

    Siddiqi Mohammad Umar

    2007-01-01

    Full Text Available This paper gives the results of a simulation study on the performance of JPEG image transmission over AWGN and Rayleigh fading channels using typical and proposed asymmetric turbo codes for error control coding. The baseline JPEG algorithm is used to compress a QCIF "Suzie" image. The recursive systematic convolutional (RSC) encoder with generator polynomials (13/11 in decimal) and a 3G interleaver are used for the typical WCDMA and CDMA2000 turbo codes. The proposed asymmetric turbo code uses generator polynomials (13/11; 13/9 in decimal) and a code-matched interleaver. The effect of the interleaver in the proposed asymmetric turbo code is studied using the weight distribution and simulation. The simulation results and performance bound for the proposed asymmetric turbo code, for the given frame length and code rate, with a Log-MAP decoder over an AWGN channel, are compared with those of the typical system. From the simulation results, it is observed that image transmission using the proposed asymmetric turbo code performs better than that with the typical system.

  15. IS ENVIRONMENTAL ALIGNMENT AND BUSINESS PERFORMANCE: A CONCEPTUAL APPROACH

    Directory of Open Access Journals (Sweden)

    K. Garg

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: This paper proposes a conceptual model termed "IS environmental alignment" that focuses on the support provided by IS strategy to minimize the gap between perceived environmental uncertainty and realized/objective environmental conditions. The model uses the Chan et al [9] alignment measurement method to measure IS strategic alignment, as it provides a quantitative measure. In due course the proposed model will be tested in industry to examine the effect of IS environmental alignment on business performance. The implication of the model lies in the effective use of deployed IS systems by organizations.

    AFRIKAANSE OPSOMMING (translated): A conceptual model is presented that deals with "IS environmental alignment": the support of IS strategy in minimizing the gap between perceived environmental uncertainty and realized/objective environmental conditions. The model uses the Chan et al [9] alignment measurement method to determine IS strategic alignment on a quantitative basis. In due course the conceptual model will be tested in practice to show how IS strategic alignment affects business performance. The effective deployment of IS systems in enterprises is advanced.

  16. Presenting a conceptual pattern of HSE performance of oil trucks.

    Science.gov (United States)

    Ghaleh, Sahar; Omidvari, Manouchehr; Nassiri, Parvin; Momeni, Mansour; Lavasani, Seyed Mohammadreza Miri

    2018-01-25

    Accidents are among the main problems in the oil product supply chain. The most important factors in these events are the kind of trucks used and their health, safety, and environment (HSE) condition. The aim of this study was to present a conceptual pattern of the HSE performance of oil trucks in the oil industry. In this study, 20 truck models (with fixed tanks), in use over different periods of time, were investigated. The criteria and sub-criteria were first determined in two parts, carrier and tank, and then weighted by the fuzzy analytical hierarchy process (FAHP). The results showed that the most important sub-criteria regarding the HSE factors of the carrier were the resistance and strength of the front and rear shields, the brake system, and the ventilation system. The most important sub-criteria regarding the HSE factors of the tank were tank shell thickness and a good tank design shape with respect to the portable material. It should be noted that the weights of the criteria, and of the sub-criteria, are not equal to one another; this is important for decision-making. The main reason for the use of trucks with the lowest scores in developing countries is the lack of attention by managers to safety issues and to international standards and agreements such as the ADR.

  17. On the performance of diagonal lattice space-time codes

    KAUST Repository

    Abediseid, Walid; Alouini, Mohamed-Slim

    2013-01-01

    There has been tremendous work done on designing space-time codes for the quasi-static multiple-input multiple output (MIMO) channel. All the coding design up-to-date focuses on either high-performance, high rates, low complexity encoding

  18. Performance of Product Codes and Related Structures with Iterated Decoding

    DEFF Research Database (Denmark)

    Justesen, Jørn

    2011-01-01

    Several modifications of product codes have been suggested as standards for optical networks. We show that the performance exhibits a threshold that can be estimated from a result about random graphs. For moderate input bit error probabilities, the output error rates for codes of finite length can...

  19. PAPIRUS - a computer code for FBR fuel performance analysis

    International Nuclear Information System (INIS)

    Kobayashi, Y.; Tsuboi, Y.; Sogame, M.

    1991-01-01

    The FBR fuel performance analysis code PAPIRUS has been developed to design fuels for demonstration and future commercial reactors. A pellet structural model was developed to describe the generation, depletion and transport of vacancies and atomic elements in a unified fashion. Comparison of PAPIRUS results with the power-to-melt test data from HEDL demonstrated the validity of the code at initial reactor startup. (author)

  20. Performance Analysis of Optical Code Division Multiplex System

    Science.gov (United States)

    Kaur, Sandeep; Bhatia, Kamaljit Singh

    2013-12-01

    This paper presents a pseudo-orthogonal code generator for an Optical Code Division Multiple Access (OCDMA) system, which helps to reduce the need for bandwidth expansion and improves spectral efficiency. In this paper we investigate the performance of a multi-user OCDMA system in achieving data rates of more than 1 Tbit/s.

  1. On the Performance of the Cache Coding Protocol

    DEFF Research Database (Denmark)

    Maboudi, Behnaz; Sehat, Hadi; Pahlevani, Peyman

    2018-01-01

    Network coding approaches typically consider an unrestricted recoding of coded packets in the relay nodes to increase performance. However, this can expose the system to pollution attacks that cannot be detected during transmission, until the receivers attempt to recover the data. To prevent thes...

  2. Performance Assessment Strategies: A computational framework for conceptual design of large roofs

    Directory of Open Access Journals (Sweden)

    Michela Turrin

    2014-01-01

    Full Text Available Using engineering performance evaluations to explore design alternatives during the conceptual phase of architectural design helps in understanding the relationships between form and performance, and is crucial for developing well-performing final designs. Computer aided conceptual design has the potential to aid the design team in discovering and highlighting these relationships, especially by means of procedural and parametric geometry to support the generation of geometric designs, and building performance simulation tools to support performance assessments. However, current tools and methods for computer aided conceptual design in architecture neither explicitly reveal nor allow backtracking of the relationships between the performance and the geometry of the design; they currently support post-engineering rather than early design decisions and the design exploration process. Focusing on large roofs, this research aims at developing a computational design approach to support designers in performance-driven explorations. The approach is meant to facilitate multidisciplinary integration and the learning process of the designer, not to constrain the process in precompiled procedures or hard engineering formulations, nor to automate it by delegating design creativity to computational procedures. PAS (Performance Assessment Strategies), a method, is the main output of the research. It consists of a framework including guidelines and an extensible library of procedures for parametric modelling, structured in three parts. Pre-PAS provides guidelines for design strategy definition, toward the parameterization process. Model-PAS provides guidelines, procedures and scripts for building the parametric models. Explore-PAS supports solution assessment based on numeric evaluations and performance simulations, until a suitable design solution is identified. PAS has been developed based on action research.
Several case studies
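    The generate-assess-rank loop that such a parametric exploration relies on can be sketched minimally. The roof parameters (span, rise) and the scalar performance proxy below are hypothetical placeholders, not part of PAS:

```python
import itertools

# Toy parametric design exploration: enumerate geometric alternatives
# and rank them by a performance proxy (hypothetical throughout).
def performance_proxy(span, rise):
    # Hypothetical proxy: penalize flat, long-span roofs (smaller is better).
    return span / max(rise, 1e-9) + 0.01 * span

def explore(spans, rises):
    candidates = []
    for span, rise in itertools.product(spans, rises):
        candidates.append((performance_proxy(span, rise), span, rise))
    return sorted(candidates)  # best alternative first

ranked = explore(spans=[30, 60, 90], rises=[3, 6, 9])
best_score, best_span, best_rise = ranked[0]
```

In a real workflow the proxy would be replaced by a building performance simulation call, and the parameter sweep by the procedural geometry generated from the parametric model.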

  3. On the Performance of the Cache Coding Protocol

    Directory of Open Access Journals (Sweden)

    Behnaz Maboudi

    2018-03-01

    Full Text Available Network coding approaches typically consider an unrestricted recoding of coded packets in the relay nodes to increase performance. However, this can expose the system to pollution attacks that cannot be detected during transmission, until the receivers attempt to recover the data. To prevent these attacks while allowing for the benefits of coding in mesh networks, the cache coding protocol was proposed. This protocol allows recoding at a relay only when the relay has received enough coded packets to decode an entire generation of packets. At that point, the relay node recodes and signs the recoded packets with its own private key, allowing the system to detect and minimize the effect of pollution attacks and making the relays accountable for changes to the data. This paper analyzes the delay performance of cache coding to understand the security-performance trade-off of this scheme. We introduce an analytical model for the case of two relays in an erasure channel relying on an absorbing Markov chain, and an approximate model to estimate the performance in terms of the number of transmissions before successful decoding at the receiver. We confirm our analysis using simulation results. We show that cache coding can overcome the security issues of unrestricted recoding with only a moderate decrease in system performance.
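    A single-hop simplification of the delay analysis described above can be sketched as follows. The paper's actual model is a two-relay absorbing Markov chain; this toy version only captures the geometric waiting times for collecting a full generation over an erasure channel:

```python
import random

# Expected transmissions to deliver a generation of g coded packets over an
# erasure channel with loss probability e (single-hop simplification).
# Each delivery is a geometric random variable with success probability 1 - e,
# so the expectation is g / (1 - e).
def expected_transmissions(g, e):
    return g / (1.0 - e)

def simulate(g, e, trials=20000, seed=7):
    # Monte Carlo cross-check of the closed form above.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        received, sent = 0, 0
        while received < g:
            sent += 1
            if rng.random() >= e:   # packet survives the erasure channel
                received += 1
        total += sent
    return total / trials
```

The relay can only start recoding once all g packets have arrived, which is exactly the delay cost the cache coding protocol trades for pollution resistance.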

  4. User manual for the probabilistic fuel performance code FRP

    International Nuclear Information System (INIS)

    Friis Jensen, J.; Misfeldt, I.

    1980-10-01

    This report describes the use of the probabilistic fuel performance code FRP. Detailed descriptions of both the input to and the output from the program are given. The use of the program is illustrated by an example. (author)

  5. On the performance of diagonal lattice space-time codes

    KAUST Repository

    Abediseid, Walid

    2013-11-01

    There has been tremendous work done on designing space-time codes for the quasi-static multiple-input multiple-output (MIMO) channel. All the coding design to date focuses on high performance, high rates, low complexity encoding and decoding, or a combination of these criteria [1]-[9]. In this paper, we analyze in detail the performance limits of diagonal lattice space-time codes under lattice decoding. We present both lower and upper bounds on the average decoding error probability. We first derive a new closed-form expression for the lower bound using the so-called sphere lower bound. This bound presents the ultimate performance limit a diagonal lattice space-time code can achieve at any signal-to-noise ratio (SNR). The upper bound is then derived using the union bound, which demonstrates how the average error probability can be minimized by maximizing the minimum product distance of the code. Combining both the lower and the upper bounds on the average error probability yields a simple upper bound on the minimum product distance that any (complex) lattice code can achieve. In the high-SNR regime, we discuss the outage performance of such codes and provide the achievable diversity-multiplexing tradeoff under lattice decoding. © 2013 IEEE.
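    The minimum product distance that the upper bound above is expressed in can be computed directly for any finite set of codewords. The small real-valued codebook below is a hypothetical illustration, not an actual diagonal lattice space-time code:

```python
import itertools

# Minimum product distance over all codeword pairs:
# d_p,min = min over x != y of the product of |x_i - y_i|.
# For a full-diversity code no coordinate difference is zero,
# so every pairwise product is strictly positive.
def min_product_distance(codewords):
    best = float("inf")
    for x, y in itertools.combinations(codewords, 2):
        d = 1.0
        for xi, yi in zip(x, y):
            d *= abs(xi - yi)
        best = min(best, d)
    return best

# Hypothetical 4-point, 2-dimensional codebook for illustration.
code = [(0.0, 0.0), (1.0, 2.0), (2.0, 1.0), (3.0, 3.0)]
dmin = min_product_distance(code)  # limited by the pair (1,2)-(2,1): 1*1 = 1
```

Maximizing this quantity over the code design is what drives down the union-bound estimate of the average error probability.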

  6. Performance analysis of LDPC codes on OOK terahertz wireless channels

    International Nuclear Information System (INIS)

    Liu Chun; Wang Chang; Cao Jun-Cheng

    2016-01-01

    Atmospheric absorption, scattering, and scintillation are the major causes of deterioration in the transmission quality of terahertz (THz) wireless communications. An error control coding scheme based on low density parity check (LDPC) codes with a soft decision decoding algorithm is proposed to improve the bit-error-rate (BER) performance of an on-off keying (OOK) modulated THz signal transmitted through an atmospheric channel. The THz wave propagation characteristics and a channel model for the atmosphere are set up. Numerical simulations validate the strong performance of LDPC codes against atmospheric fading and demonstrate their huge potential for future ultra-high-speed (beyond Gbps) THz communications. (paper)

  7. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    Science.gov (United States)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). The performance of a SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis, by referring to the bit error rate (BER), signal to noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and provides better performance than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.
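    As a rough illustration of the BER figure quoted above, the Gaussian approximation commonly used in SAC-OCDMA analyses maps SNR to BER as BER = (1/2)·erfc(√(SNR/8)). The exact SNR expression for the EMD code is in the paper and is not reproduced here:

```python
import math

# Gaussian-approximation BER for an OCDMA receiver given a linear SNR
# (generic mapping only; the code-specific SNR formula is not shown).
def ber_from_snr(snr_linear):
    return 0.5 * math.erfc(math.sqrt(snr_linear / 8.0))

# Under this approximation, a BER near 1e-9 requires a linear SNR of
# roughly 144 (about 21.6 dB).
ber = ber_from_snr(144.0)
```

The minimum-acceptable-BER threshold of 10^-9 in the abstract therefore corresponds to a concrete SNR target that the code design must clear for all 64 subscribers.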

  8. Preserving Envelope Efficiency in Performance Based Code Compliance

    Energy Technology Data Exchange (ETDEWEB)

    Thornton, Brian A. [Thornton Energy Consulting (United States); Sullivan, Greg P. [Efficiency Solutions (United States); Rosenberg, Michael I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Baechler, Michael C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-06-20

    The City of Seattle 2012 Energy Code (Seattle 2014), one of the most progressive in the country, is under revision for its 2015 edition. Additionally, city personnel participate in the development of the next generation of the Washington State Energy Code and the International Energy Code. Seattle has pledged carbon neutrality by 2050, including buildings, transportation and other sectors. The United States Department of Energy (DOE), through Pacific Northwest National Laboratory (PNNL), provided technical assistance to Seattle in order to understand the implications of one potential direction for its code development: limiting trade-offs in which long-lived building envelope components less stringent than the prescriptive code envelope requirements are offset by better-than-code but shorter-lived lighting and heating, ventilation, and air-conditioning (HVAC) components through the total building performance modeled energy compliance path. Weaker building envelopes can permanently limit building energy performance even as lighting and HVAC components are upgraded over time, because retrofitting the envelope is less likely and more expensive. Weaker building envelopes may also increase the required size, cost and complexity of HVAC systems and may adversely affect occupant comfort. This report presents the results of this technical assistance. The use of modeled energy code compliance to trade off envelope components against shorter-lived building components is not unique to Seattle, and the lessons and possible solutions described in this report have implications for other jurisdictions and energy codes.

  9. Conceptual Designs for the Performance Improvement of APR1400 SIT and Preliminary Performance Evaluation

    International Nuclear Information System (INIS)

    Chu, In-Cheol; Kwon, Tae-Soon; Song, Chul-Hwa

    2008-01-01

    Some evolutionary type PWRs, such as the APR1400 and APWR, adopt an advanced safety injection tank (SIT). The SIT of the APR1400 has a fluidic device (FD) which passively controls the ECC water injection flow rate into the reactor coolant system during the refill and reflood phases of a LB-LOCA (i.e., a high injection flow rate during the refill phase and a low injection flow rate during the reflood phase). The benefit of the FD is the elimination of the low pressure safety injection pump function from the safety injection system. The flow-control performance of the APR1400 FD was evaluated using a prototypical full-scale test facility called VAPER (Valve Performance Evaluation Rig). Even though the performance of the APR1400 FD satisfied the major design and licensing requirements, further improvements are expected, such as extension of the total injection period and delay of the nitrogen gas discharge. Several conceptual designs have been drawn up in order to improve the performance of the APR1400 SIT, and the performance of some of these designs was evaluated using a small-scale SIT test rig. The present paper introduces some of the conceptual designs and presents the experimental results of the performance evaluation

  10. Development of LWR fuel performance code FEMAXI-6

    International Nuclear Information System (INIS)

    Suzuki, Motoe

    2006-01-01

    The LWR fuel performance code FEMAXI-6 (Finite Element Method in AXIs-symmetric system) is a representative fuel analysis code in Japan. Its development history, background, design philosophy, model features, and future directions are described, covering the characteristic behaviour of LWR fuel and analysis codes, the nature of the models, the development history of FEMAXI, the use of the FEMAXI code, the fuel models, and special features of the FEMAXI models. As examples of analysis, PCMI (Pellet-Clad Mechanical Interaction), fission gas release, gap bonding, and fission gas bubble swelling are reported. The thermal analysis and dynamic analysis system of FEMAXI-6, the function block at one time step of FEMAXI-6, an analytical example of PCMI in a power-increase test by FEMAXI-III, an analysis of fission gas release in the Halden reactor by FEMAXI-V, a comparison of fuel centre temperatures in the Halden reactor, and an analysis of the change in fuel rod diameter for high burnup BWR fuel are shown. (S.Y.)

  11. Development and validation of a fuel performance analysis code

    International Nuclear Information System (INIS)

    Majalee, Aaditya V.; Chaturvedi, S.

    2015-01-01

    CAD has been developing a computer code, 'FRAVIZ', for calculation of the steady-state thermomechanical behaviour of nuclear reactor fuel rods. It contains four major modules: a thermal module, a fission gas release module, a material properties module and a mechanical module. All four modules are coupled to each other, and feedback from each module is fed back to the others to obtain a self-consistent evolution in time. The computer code has been checked against two FUMEX benchmarks. Modelling fuel performance in the Advanced Heavy Water Reactor would require additional inputs related to the fuel and some modifications to the code. (author)

  12. An Assessment of Some Design Constraints on Heat Production of a 3D Conceptual EGS Model Using an Open-Source Geothermal Reservoir Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Yidong Xia; Mitch Plummer; Robert Podgorney; Ahmad Ghassemi

    2016-02-01

    The performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km of depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desired output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that 1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, 2) the downward deviation angle of the parallel doublet wells may help overcome the difficulty of vertical drilling in reaching a favorable production temperature, and 3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are chosen under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite element based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercial, this new open-source code demonstrates a code development strategy that aims to provide exceptional ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.
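    The headline quantities in the abstract can be sanity-checked with back-of-envelope relations: reservoir temperature from the stated 65 K/km gradient, and thermal power drawn by the circulating water. The surface temperature, flow rate, and injection/production temperatures below are hypothetical illustrations, not values from the study:

```python
# Back-of-envelope EGS estimates (hypothetical inputs except the 65 K/km gradient).
def reservoir_temperature(depth_km, surface_c=15.0, gradient_k_per_km=65.0):
    # Linear geotherm: T(z) = T_surface + gradient * depth.
    return surface_c + gradient_k_per_km * depth_km

def thermal_power_mw(m_dot_kg_s, t_prod_c, t_inj_c, cp=4186.0):
    # P = m_dot * cp * (T_prod - T_inj), converted from W to MW.
    return m_dot_kg_s * cp * (t_prod_c - t_inj_c) / 1e6

t3km = reservoir_temperature(3.0)            # 210 degC at 3 km depth
power = thermal_power_mw(50.0, 200.0, 70.0)  # about 27 MW thermal
```

Estimates like these show why the abstract singles out mass flow rate: thermal output scales linearly with it, while the reservoir cools (and the lifespan shortens) as more heat is drawn.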

  13. Performance enhancement of optical code-division multiple-access systems using transposed modified Walsh code

    Science.gov (United States)

    Sikder, Somali; Ghosh, Shila

    2018-02-01

    This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.

  14. Conceptual OOP design of Pilot Code for Two-Fluid, Three-field Model with C++ 6.0

    International Nuclear Information System (INIS)

    Chung, B. D.; Lee, Y. J.

    2006-09-01

    To establish the concept of object-oriented programming (OOP) design for a reactor safety analysis code, a preliminary OOP design for the PILOT code, which is based on a one-dimensional, two-fluid, three-field model, has been attempted using C++ language features. Microsoft C++ has been used since it is available as groupware within KAERI, and it can be merged with Compaq Visual Fortran 6.6 in the Visual Studio platform. In the development platform, C++ has been used as the main language and Fortran as a mixed language in connection with the C++ main driver program; the mixed-language environment is a specific feature provided in Visual Studio. Existing Fortran source was utilized for the input routine that reads the steam table from a generated file and for the steam property calculation routine, and the calling convention and argument passing from the C++ driver were corrected. Mathematical routines, such as inverse matrix conversion and the tridiagonal matrix solver, have been used as PILOT Fortran routines. The simple volumes and junctions utilized in the PILOT code can be treated as objects, since they are the basic construction elements of the code system; other routines of the overall solution scheme have been realized as procedural C functions. The conceptual design, which consists of hydraulic loop, component, volume, and junction classes, is described in the appendix in order to give the essential OOP structure of a system safety analysis code. The attempt shows that many parts of a system analysis code can be expressed as objects, although the overall structure should be maintained as procedural functions. The encapsulation of data and functions within an object can provide many beneficial aspects in the programming of a system code
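    The tridiagonal matrix solver kept among the PILOT Fortran routines is conventionally the Thomas algorithm; a minimal sketch follows (in Python rather than the Fortran of the actual routine, whose interface the abstract does not show):

```python
# Thomas algorithm for a tridiagonal system A x = d.
# a: sub-diagonal (length n-1), b: diagonal (length n),
# c: super-diagonal (length n-1), d: right-hand side (length n).
def thomas_solve(a, b, c, d):
    n = len(b)
    cp = [0.0] * (n - 1)
    dp = [0.0] * n
    # Forward sweep: eliminate the sub-diagonal.
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / m
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    # Back substitution.
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Solve [[2,1,0],[1,2,1],[0,1,2]] x = [4, 8, 8]  ->  x = [1, 2, 3]
x = thomas_solve([1.0, 1.0], [2.0, 2.0, 2.0], [1.0, 1.0], [4.0, 8.0, 8.0])
```

Such solvers appear naturally in one-dimensional two-fluid codes, where the discretized pressure or energy equations couple each volume only to its neighbours.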

  15. Conceptual OOP design of Pilot Code for Two-Fluid, Three-field Model with C++ 6.0

    Energy Technology Data Exchange (ETDEWEB)

    Chung, B. D.; Lee, Y. J

    2006-09-15

    To establish the concept of object-oriented programming (OOP) design for a reactor safety analysis code, a preliminary OOP design for the PILOT code, which is based on a one-dimensional, two-fluid, three-field model, has been attempted using C++ language features. Microsoft C++ has been used since it is available as groupware within KAERI, and it can be merged with Compaq Visual Fortran 6.6 in the Visual Studio platform. In the development platform, C++ has been used as the main language and Fortran as a mixed language in connection with the C++ main driver program; the mixed-language environment is a specific feature provided in Visual Studio. Existing Fortran source was utilized for the input routine that reads the steam table from a generated file and for the steam property calculation routine, and the calling convention and argument passing from the C++ driver were corrected. Mathematical routines, such as inverse matrix conversion and the tridiagonal matrix solver, have been used as PILOT Fortran routines. The simple volumes and junctions utilized in the PILOT code can be treated as objects, since they are the basic construction elements of the code system; other routines of the overall solution scheme have been realized as procedural C functions. The conceptual design, which consists of hydraulic loop, component, volume, and junction classes, is described in the appendix in order to give the essential OOP structure of a system safety analysis code. The attempt shows that many parts of a system analysis code can be expressed as objects, although the overall structure should be maintained as procedural functions. The encapsulation of data and functions within an object can provide many beneficial aspects in the programming of a system code.

  16. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    Energy Technology Data Exchange (ETDEWEB)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K. [Cray Inc., St. Paul, MN 55101 (United States); Porter, D. [Minnesota Supercomputing Institute for Advanced Computational Research, Minneapolis, MN USA (United States); O’Neill, B. J.; Nolting, C.; Donnert, J. M. F.; Jones, T. W. [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States); Edmon, P., E-mail: pjm@cray.com, E-mail: nradclif@cray.com, E-mail: kkandalla@cray.com, E-mail: oneill@astro.umn.edu, E-mail: nolt0040@umn.edu, E-mail: donnert@ira.inaf.it, E-mail: twj@umn.edu, E-mail: dhp@umn.edu, E-mail: pedmon@cfa.harvard.edu [Institute for Theory and Computation, Center for Astrophysics, Harvard University, Cambridge, MA 02138 (United States)

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  17. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    International Nuclear Information System (INIS)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.; Porter, D.; O’Neill, B. J.; Nolting, C.; Donnert, J. M. F.; Jones, T. W.; Edmon, P.

    2017-01-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  18. Survey of computer codes applicable to waste facility performance evaluations

    International Nuclear Information System (INIS)

    Alsharif, M.; Pung, D.L.; Rivera, A.L.; Dole, L.R.

    1988-01-01

    This study is an effort to review existing information that is useful for developing an integrated model to predict the performance of a radioactive waste facility. A summary description of 162 computer codes is given. The identified computer programs address the performance of waste packages, waste transport and equilibrium geochemistry, hydrological processes in unsaturated and saturated zones, and general waste facility performance assessment. Some programs also deal with thermal analysis, structural analysis, and special purposes. A number of these computer programs are being used by the US Department of Energy, the US Nuclear Regulatory Commission, and their contractors to analyze various aspects of waste package performance. Fifty-five of these codes were identified as being potentially useful in the analysis of low-level radioactive waste facilities located above the water table. The code summaries include authors, identification data, model types, and pertinent references. 14 refs., 5 tabs

  19. To master or perform? Exploring relations between achievement goals and conceptual change learning.

    Science.gov (United States)

    Ranellucci, John; Muis, Krista R; Duffy, Melissa; Wang, Xihui; Sampasivam, Lavanya; Franco, Gina M

    2013-09-01

    Research is needed to explore conceptual change in relation to achievement goal orientations and depth of processing. To address this need, we examined relations between achievement goals, use of deep versus shallow processing strategies, and conceptual change learning using a think-aloud protocol. Seventy-three undergraduate students were assessed on their prior knowledge and misconceptions about Newtonian mechanics, and then reported their achievement goals and participated in think-aloud protocols while reading Newtonian physics texts. A mastery-approach goal orientation positively predicted deep processing strategies, shallow processing strategies, and conceptual change. In contrast, a performance-approach goal orientation did not predict either of the processing strategies, but negatively predicted conceptual change. A performance-avoidance goal orientation negatively predicted deep processing strategies and conceptual change. Moreover, deep and shallow processing strategies positively predicted conceptual change as well as recall. Finally, both deep and shallow processing strategies mediated relations between mastery-approach goals and conceptual change. Results provide some support for Dole and Sinatra's (1998) Cognitive Reconstruction of Knowledge Model of conceptual change but also challenge specific facets with regard to the role of depth of processing in conceptual change. © 2012 The British Psychological Society.

  20. The JAERI code system for evaluation of BWR ECCS performance

    International Nuclear Information System (INIS)

    Kohsaka, Atsuo; Akimoto, Masayuki; Asahi, Yoshiro; Abe, Kiyoharu; Muramatsu, Ken; Araya, Fumimasa; Sato, Kazuo

    1982-12-01

    Development of separate computer code systems for evaluation of BWR and PWR ECCS performance has been conducted since 1973, taking into account the differences in reactor cooling system, core structure, and ECCS. The first version of the BWR code system, whose development started earlier than that of the PWR, has been completed. The BWR code system is designed to provide computational tools to analyze all phases of LOCAs and to evaluate the performance of the ECCS, including an ''Evaluation Model (EM)'' feature in compliance with the requirements of the current Japanese Evaluation Guideline of ECCS. The BWR code system can be used for licensing purposes, i.e. for ECCS performance evaluation or audit calculations to cross-examine the methods and results of applicants or vendors. The BWR code system presented in this report comprises several computer codes, each of which analyzes a particular phase of a LOCA or a system blowdown depending on the range of LOCAs, i.e. large and small breaks in a variety of locations in the reactor system. The system includes ALARM-B1, HYDY-B1 and THYDE-B1 for analysis of the system blowdown for various break sizes, THYDE-B-REFLOOD for analysis of the reflood phase, and SCORCH-B2 for the calculation of the fuel assembly hot plane temperature. When multiple codes are used to analyze a broad range of LOCAs as stated above, it is very important to evaluate the adequacy of and consistency between the codes used to cover the entire break spectrum. The system consistency together with the system performance is discussed for a large commercial BWR. (author)

  1. Deconstructing Masculinity: A Qualitative Study of College Men's Masculine Conceptualizations and Gender Performance

    Science.gov (United States)

    Harris, Frank, III

    2008-01-01

    Informed by the constructionist epistemological perspective, the purpose of this study was to examine socially constructed conceptualizations of masculinity and gender performance among 12 culturally diverse undergraduate men. The participants espoused seemingly productive conceptualizations of masculinity, yet their gendered behaviors were…

  2. Iterative optimization of performance libraries by hierarchical division of codes

    International Nuclear Information System (INIS)

    Donadio, S.

    2007-09-01

    The increasing complexity of hardware features incorporated in modern processors makes high-performance code generation very challenging. Library generators such as ATLAS, FFTW and SPIRAL overcome this issue by empirically searching the space of possible program versions for the one that performs best. This thesis explores a fully automatic solution for adapting a compute-intensive application to the target architecture. By mimicking the complex sequences of transformations useful for optimizing real codes, we show that generative programming is a practical tool for implementing a new hierarchical compilation approach to the generation of high-performance code relying on the use of state-of-the-art compilers. As opposed to ATLAS, this approach is not application-dependent but can be applied to fairly generic loop structures. Our approach relies on the decomposition of the original loop nest into simpler kernels. These kernels are much simpler to optimize, and furthermore, using such codes makes the performance trade-off problem much simpler to express and to solve. Finally, we propose a new approach to the generation of performance libraries based on this decomposition method. We show that our method generates high-performance libraries, in particular for BLAS. (author)
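    The empirical-search strategy attributed here to generators like ATLAS and FFTW can be caricatured as timing several candidate kernel variants on the target machine and keeping the fastest. The variants and harness below are a minimal illustration under that assumption, not the thesis's actual generator:

```python
import timeit

def dot_plain(a, b):
    """Straightforward dot product."""
    s = 0.0
    for x, y in zip(a, b):
        s += x * y
    return s

def dot_unrolled(a, b):
    """Same computation, manually unrolled by 4 (assumes len divisible by 4)."""
    s0 = s1 = s2 = s3 = 0.0
    for i in range(0, len(a), 4):
        s0 += a[i] * b[i]
        s1 += a[i + 1] * b[i + 1]
        s2 += a[i + 2] * b[i + 2]
        s3 += a[i + 3] * b[i + 3]
    return s0 + s1 + s2 + s3

def autotune(variants, a, b, repeats=20):
    """Empirically pick the fastest variant for this input on this machine."""
    best, best_t = None, float("inf")
    for f in variants:
        t = timeit.timeit(lambda: f(a, b), number=repeats)
        if t < best_t:
            best, best_t = f, t
    return best

a = [1.0] * 1024
b = [2.0] * 1024
winner = autotune([dot_plain, dot_unrolled], a, b)
```

    Which variant wins depends on the interpreter and hardware; the essential point is that the selection is made by measurement, not by a static cost model.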

  3. Analysis of parallel computing performance of the code MCNP

    International Nuclear Information System (INIS)

    Wang Lei; Wang Kan; Yu Ganglin

    2006-01-01

    Parallel computing can reduce the running time of the MCNP code effectively. With MPI message-passing software, MCNP5 can perform parallel computing on a PC cluster running the Windows operating system. The parallel computing performance of MCNP is influenced by factors such as the type, complexity level, and parameter configuration of the computing problem. This paper analyzes the parallel computing performance of MCNP with respect to these factors and gives measures to improve it. (authors)
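    The abstract does not give a speedup model, but Amdahl's law is the standard first-order way to reason about how the serial fraction of a run caps parallel gains on a cluster; a minimal sketch:

```python
def amdahl_speedup(serial_fraction, n_workers):
    """Ideal speedup when a fixed fraction of the work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_workers)

# e.g. with 5% serial work, 16 nodes give well under 16x
speedups = {n: amdahl_speedup(0.05, n) for n in (1, 4, 16, 64)}
```

    With a 5% serial fraction, 16 workers yield roughly 9x rather than 16x, which is why problem type and configuration matter so much for observed MCNP scaling.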

  4. Structure of fuel performance audit code for SFR metal fuel

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yong Sik; Kim, Hyo Chan [KAERI, Daejeon (Korea, Republic of); Jeong, Hye Dong; Shin, An Dong; Suh, Nam Duk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2012-10-15

    A Sodium-cooled Fast Reactor (SFR) is a promising option for solving the spent fuel problem, but there are still many technical issues to resolve before an SFR can be commercialized. One of these issues is the development of an advanced fuel that can address the safety and economic concerns at the same time. Since the nuclear fuel is the first barrier against radioactive isotope release, the fuel's integrity must be secured. At the Korea Institute of Nuclear Safety (KINS), a new project has been started to develop regulatory technology for the SFR system, including the fuel area. To evaluate fuel integrity and safety during irradiation, a fuel performance code must be used for audit calculations. To develop the new code system, the code structure design and its requirements need to be studied. Various performance models and code systems are reviewed and their characteristics are analyzed in this paper. Based on this study, the fundamental performance models are deduced and the basic code requirements and structure are established.

  5. Performance of FSO-OFDM based on BCH code

    Directory of Open Access Journals (Sweden)

    Jiao Xiao-lu

    2016-01-01

    As contrasted with the traditional OOK (on-off keying) system, an FSO-OFDM system can resist atmospheric scattering and improve the spectrum utilization rate effectively. Due to the instability of the atmospheric channel, the system is affected by various factors, resulting in a high BER. BCH codes have good error-correcting ability, particularly at short and medium code lengths, where their performance is close to the theoretical value. They can not only detect burst errors but also correct random errors. Therefore, a BCH code is applied to the system to reduce the system BER. Finally, a semi-physical simulation was conducted with MATLAB. The simulation results show that when the BER is 10{sup -2}, the performance of OFDM is superior by 4 dB compared with OOK. In different weather conditions (extension rain, advection fog, dust days), when the BER is 10{sup -5}, the performance of BCH (255,191) channel coding is superior by 4-5 dB compared with the uncoded system. All in all, OFDM technology and BCH coding can reduce the system BER.
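    Implementing BCH(255,191) is beyond a short sketch, but Hamming(7,4), the simplest single-error-correcting member of the BCH family, shows the same encode/syndrome/correct cycle. This is a swapped-in illustration of the block-code principle, not the paper's code:

```python
def hamming74_encode(d):
    """d: list of 4 data bits -> 7-bit codeword laid out as p1 p2 d1 p3 d2 d3 d4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4   # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4   # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Correct up to one flipped bit; returns (data bits, 1-based error position, 0 if clean)."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # binary syndrome = error position
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]], syndrome
```

    A longer BCH code works the same way with polynomial arithmetic over GF(2{sup m}) in place of these three parity checks, which is what buys multi-error correction.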

  6. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartmental containment performance analysis. With the developed GOTHIC methodology, its applicability is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of just three compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses when the inherent conservatism of the CONTEMPT-LT code is considered.

  7. Fusion PIC code performance analysis on the Cori KNL system

    Energy Technology Data Exchange (ETDEWEB)

    Koskela, Tuomas S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Deslippe, Jack [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Friesen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Raman, Karthic [INTEL Corp. (United States)

    2017-05-25

    We study the attainable performance of particle-in-cell codes on the Cori KNL system by analyzing a miniature particle push application based on the fusion PIC code XGC1. We start from the most basic building blocks of a PIC code and build up the complexity to identify the kernels that cost the most in performance, and focus optimization efforts there. Particle push kernels operate at high arithmetic intensity and are not likely to be memory-bandwidth or even cache-bandwidth bound on KNL. Therefore, we see only minor benefits from the high-bandwidth memory available on KNL, and achieving good vectorization is shown to be the most beneficial optimization path, with a theoretical yield of up to 8x speedup on KNL. In practice we are able to obtain up to a 4x gain from vectorization due to limitations set by the data layout and memory latency.
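    The kernel under study can be illustrated by a toy particle push. The kick-drift update below (uniform field, hypothetical names) is only a shape-of-the-loop sketch, not XGC1's actual kernel:

```python
def push_particles(x, v, E, q_over_m, dt):
    """One kick-drift step per particle; E is the field sampled at the
    particle position (here a toy uniform field)."""
    for i in range(len(x)):
        v[i] += q_over_m * E * dt   # kick: accelerate in the field
        x[i] += v[i] * dt           # drift: advance the position
    return x, v

# Two particles, uniform unit field.
x, v = push_particles([0.0, 1.0], [0.0, -0.5], E=1.0, q_over_m=1.0, dt=0.5)
```

    The loop body is a handful of fused multiply-adds per particle, which is why, as the abstract argues, vectorizing it matters far more than memory bandwidth.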

  8. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartmental containment performance analysis. With the developed GOTHIC methodology, its applicability is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of just three compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses when the inherent conservatism of the CONTEMPT-LT code is considered.

  9. SURE: a system of computer codes for performing sensitivity/uncertainty analyses with the RELAP code

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1983-02-01

    A package of computer codes has been developed to perform a nonlinear uncertainty analysis on transient thermal-hydraulic systems which are modeled with the RELAP computer code. The methodology was developed around the analyses of experiments in the PWR-BDHT Separate Effects Program at Oak Ridge National Laboratory. The use of FORTRAN programs running interactively on the PDP-10 computer has made the system very easy to use and provided great flexibility in the choice of processing paths. Several experiments simulating a loss-of-coolant accident in a nuclear reactor have been successfully analyzed. It has been shown that the system can easily be automated to further simplify its use, and that conversion of the entire system to a base code other than RELAP is possible.

  10. Inter Organizational Relationships Performance in Third Party Logistics: conceptual framework and case study

    NARCIS (Netherlands)

    Aziz, Romana; Aziz, R.; van Hillegersberg, Jos; Kumar, Kuldeep; Kersten, W.; Blecker, T.; Luthje, C.

    2010-01-01

    Supplier relationship management is an important challenge for shippers in logistics outsourcing. This paper attempts to understand the factors which affect inter organizational relationships performance in third party logistics and proposes a conceptual framework specifically for inter

  11. NWTS conceptual reference repository description (CRRD). Volume III. Criteria, bases, special studies, and codes

    International Nuclear Information System (INIS)

    1981-05-01

    This volume documents the criteria, design bases, and special studies and provides the backup for the design presented in Volume II. The criteria presented here were developed by ONWI as a draft version for consideration in this conceptual report. Since these criteria were developed subsequent to preparation of the reports used as a basis for the CRRD, not all of them could be fully considered in its preparation. However, they were used as guidelines wherever possible. The criteria for terminal storage of waste are still in development. The chapter on the design bases identifies the important design considerations and provides the justification for their selection. The design bases were developed not so much to give exact values for parameters as to identify the parameters that are significant to the design. They also serve as a common basis for coordinating analysis and design studies until the next design phase is completed. Some of the design bases presented here were taken directly from the Stearns-Roger NWTS-R1 Conceptual Design Report. The special studies document technical aspects of the design that are of particular importance or that furnish additional information pertaining to the design

  12. Sensitivity assessment of fuel performance codes for LOCA accident scenario

    Energy Technology Data Exchange (ETDEWEB)

    Abe, Alfredo; Gomes, Daniel; Silva, Antonio Teixeira e; Muniz, Rafael O.R. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Giovedi, Claudia; Martins, Marcelo, E-mail: ayabe@ipen.br, E-mail: claudia.giovedi@labrisco.usp.br [Universidade de Sao Paulo (LABRISCO/USP), Sao Paulo, SP (Brazil). Lab. de Análise, Avaliação e Gerenciamento de Risco

    2017-07-01

    The FRAPCON code predicts fuel rod performance in LWRs (Light Water Reactors) by modeling fuel responses under normal operating conditions and anticipated operational occurrences; the FRAPTRAN code is applied to fuel behavior under fast transient and accident conditions. The codes are well known and applied for different purposes, and one of their uses is sensitivity analysis considering fuel design parameters associated with fabrication; they can moreover address the effect of physical model bias. The objective of this work was to perform an assessment of fuel manufacturing parameter tolerances and fuel model bias using the FRAPCON and FRAPTRAN codes for a Loss of Coolant Accident (LOCA) scenario. The preliminary analysis used a direct approach, taking into account the most relevant manufacturing tolerances (lower and upper bounds) of the design parameters and the physical model bias without considering their statistical distributions. The simulations were carried out using data available in the open literature on the series of LOCA experiments performed at the Halden reactor (specifically IFA-650.5). The manufacturing tolerances associated with design parameters considered in this paper were: enrichment, cladding thickness, pellet diameter, pellet density, and fill gas pressure. The physical models considered were: fuel thermal expansion, fission gas release, fuel swelling, irradiation creep, cladding thermal expansion, cladding corrosion, and cladding hydrogen pickup. The results obtained from the sensitivity analysis addressed the impact of manufacturing tolerances and physical models on the fuel cladding burst time observed in the IFA-650.5 experiment. (author)
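    The direct approach of evaluating the lower and upper bound of each parameter one at a time can be sketched around a stand-in response function. The toy model and parameter names below are hypothetical, not FRAPCON/FRAPTRAN:

```python
def one_at_a_time(model, nominal, bounds):
    """Evaluate the response at each parameter's lower/upper bound,
    holding the others at nominal; returns per-parameter swings vs the base case."""
    base = model(nominal)
    swings = {}
    for name, (lo, hi) in bounds.items():
        lo_case = dict(nominal, **{name: lo})
        hi_case = dict(nominal, **{name: hi})
        swings[name] = (model(lo_case) - base, model(hi_case) - base)
    return swings

# Stand-in response: a toy "burst time" depending on two normalized parameters.
def toy_burst_time(p):
    return 100.0 - 20.0 * p["gas_pressure"] + 5.0 * p["clad_thickness"]

nominal = {"gas_pressure": 1.0, "clad_thickness": 1.0}
bounds = {"gas_pressure": (0.9, 1.1), "clad_thickness": (0.95, 1.05)}
swings = one_at_a_time(toy_burst_time, nominal, bounds)
```

    The resulting swings rank which tolerances dominate the response, which is the kind of screening the paper performs before any statistical treatment.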

  13. Sensitivity assessment of fuel performance codes for LOCA accident scenario

    International Nuclear Information System (INIS)

    Abe, Alfredo; Gomes, Daniel; Silva, Antonio Teixeira e; Muniz, Rafael O.R.; Giovedi, Claudia; Martins, Marcelo

    2017-01-01

    The FRAPCON code predicts fuel rod performance in LWRs (Light Water Reactors) by modeling fuel responses under normal operating conditions and anticipated operational occurrences; the FRAPTRAN code is applied to fuel behavior under fast transient and accident conditions. The codes are well known and applied for different purposes, and one of their uses is sensitivity analysis considering fuel design parameters associated with fabrication; they can moreover address the effect of physical model bias. The objective of this work was to perform an assessment of fuel manufacturing parameter tolerances and fuel model bias using the FRAPCON and FRAPTRAN codes for a Loss of Coolant Accident (LOCA) scenario. The preliminary analysis used a direct approach, taking into account the most relevant manufacturing tolerances (lower and upper bounds) of the design parameters and the physical model bias without considering their statistical distributions. The simulations were carried out using data available in the open literature on the series of LOCA experiments performed at the Halden reactor (specifically IFA-650.5). The manufacturing tolerances associated with design parameters considered in this paper were: enrichment, cladding thickness, pellet diameter, pellet density, and fill gas pressure. The physical models considered were: fuel thermal expansion, fission gas release, fuel swelling, irradiation creep, cladding thermal expansion, cladding corrosion, and cladding hydrogen pickup. The results obtained from the sensitivity analysis addressed the impact of manufacturing tolerances and physical models on the fuel cladding burst time observed in the IFA-650.5 experiment. (author)

  14. WIPP conceptual design report. Addendum L. Mine safety code review for Waste Isolation Pilot Plant (WIPP)

    International Nuclear Information System (INIS)

    1977-06-01

    An initial review of New Mexico and Federal mining standards and regulations has been made to determine their applicability to the WIPP conceptual design. These standards and regulations are reviewed point by point, and the enclosed listing includes comments and recommendations for those which will affect the design and/or eventual operations of WIPP. The majority of the standards, both federal and state, reflect standard safe mining practice. Standards that should be emphasized in developing the design are listed, as are those for which strict compliance would increase the hazard risk. Because the WIPP facility differs in many respects from the mines for which the regulations were intended, strict compliance would in some respects increase hazards, while in other instances the regulations are less strict than is desirable. These cases are noted in the attached review

  15. Neutronic performance of Indian LLCB TBM set conceptual design in ITER

    Energy Technology Data Exchange (ETDEWEB)

    Swami, H.L., E-mail: hswami@ipr.res.in; Shaw, A.K.; Mistry, A.N.; Danani, C.

    2016-12-15

    Highlights: • Neutronic analyses of the conceptual design of the LLCB test blanket module in ITER have been performed. • The estimated total tritium production rate in the LLCB TBM is 1.66E+17 tritons/s. • The total heat deposited in the LLCB TBM is 0.46 MW, and the highest power density, at the TBM first wall, is 5.2 W/cc. • The estimation shows a maximum DPA of 2.72 at the TBM FW. - Abstract: The tritium breeding blanket testing program in ITER is an important milestone towards the development of fusion reactors. The ITER Organization is providing partner countries an opportunity to test their breeding blanket concepts. A mock-up of the Indian Lead Lithium Ceramic Breeder (LLCB) tritium breeding blanket, known as the LLCB Test Blanket Module (TBM), will be tested in ITER equatorial port no. 2. The LLCB blanket consists of lead lithium (PbLi) as a neutron multiplier and tritium breeder, a ceramic breeder (Li{sub 2}TiO{sub 3}) as a tritium breeder, and India-specific Reduced Activation Ferritic Martensitic Steel (IN-RAFMS) as the structural material. A stainless steel block cooled by water, called the shield block, is attached to the TBM to provide a neutron shield for the ITER TBM port. A comprehensive neutronic performance evaluation is required for the design of the LLCB TBM set (TBM + shield block) and associated ancillary systems in ITER. The neutronic performance of the conceptual design of the TBM set in ITER has been evaluated and is reported here. To carry out this evaluation, neutronic models of the LLCB TBM set along with the TBM frame have been constructed and inserted in the equatorial port of the ITER reference neutronic model C-lite. Neutronic responses such as tritium production rate, nuclear heating, neutron flux and spectra, and gas production and DPA in the LLCB TBM set are calculated considering 500 MW fusion power and a fluence level of 0.3 MWa/m{sup 2}. Radiation transport code MCNP6 and FENDL 2.1 nuclear cross-section data library are used to perform the neutronic

  16. High performance computer code for molecular dynamics simulations

    International Nuclear Information System (INIS)

    Levay, I.; Toekesi, K.

    2007-01-01

    Complete text of publication follows. Molecular Dynamics (MD) simulation is a widely used technique for modeling complicated physical phenomena. Since 2005 we have been developing an MD simulation code for PC computers. The computer code is written in the C++ object-oriented programming language. The aim of our work is twofold: (a) to develop a fast computer code for the study of the random walk of guest atoms in a Be crystal, and (b) 3-dimensional (3D) visualization of the particles' motion. We mimic the motion of the guest atoms in the crystal (diffusion-type motion) and the motion of atoms in the crystal lattice (crystal deformation). Nowadays it is common to use graphics devices for computationally intensive problems. There are several ways to tap this extreme processing performance, but never before was it so easy to program these devices. The CUDA (Compute Unified Device Architecture) introduced by nVidia Corporation in 2007 is very useful for every processor-hungry application. A unified-architecture GPU includes 96-128 or more stream processors, so the raw calculation performance is 576(!) GFLOPS, ten times faster than the fastest dual-core CPU [Fig.1]. Our improved MD simulation software uses this new technology, which speeds up the code so that the critical calculation segment runs 10 times faster. Although the GPU is a very powerful tool, it has a strongly parallel structure. This means that we have to create an algorithm that works on several processors without deadlock. Our code currently uses 256 threads and shared and constant on-chip memory instead of global memory, which is 100 times slower. It is possible to implement the total algorithm on the GPU, so we do not need to download and upload the data in every iteration. For maximal throughput, every thread runs with the same instructions
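    Independent of the CUDA implementation, the diffusion-type motion of guest atoms can be illustrated with a lattice random walk, whose mean squared displacement grows linearly with the number of steps. A minimal sketch (toy model, not the authors' code):

```python
import random

def random_walk_3d(n_steps, rng):
    """Simple cubic lattice walk: one axis-aligned hop per step."""
    pos = [0, 0, 0]
    moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    for _ in range(n_steps):
        dx, dy, dz = rng.choice(moves)
        pos[0] += dx
        pos[1] += dy
        pos[2] += dz
    return pos

def mean_squared_displacement(n_walkers, n_steps, seed=0):
    """Average squared end-to-end distance over many independent walkers."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_walkers):
        x, y, z = random_walk_3d(n_steps, rng)
        total += x * x + y * y + z * z
    return total / n_walkers

# For an unbiased walk, MSD is approximately n_steps (diffusive scaling).
msd = mean_squared_displacement(2000, 100)
```

    Each walker is independent, which is exactly why this workload maps so well onto hundreds of GPU threads running the same instructions.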

  17. The Performative Gift: A Feminist Materialist Conceptual Model

    Directory of Open Access Journals (Sweden)

    Jörgen Skågeby

    2013-09-01

    Gifting (or gift-giving) is a particularly interesting form of communication that envelops both material and social dimensions. Objects are transformed into gifts through particular socio-material practices. While these practices are, of course, interesting in themselves, this paper will take a step back and revisit attempts to define and theorize the gift as a concept. In a time when the gift economy is often called upon as a potential candidate for more “participatory alternatives to capitalist totality”, particularly in relation to theorizing of labour on and through the Internet, theories of gifting provide an important foundation for discussing the boundaries of alternative futures and economies. So far, little effort has been taken to advance gift theory into a new materialist or posthumanist thinking. In an attempt to take that first step, this paper provides two contributions. First, it highlights how feminist theorizing of the gift comprises interesting forerunners in a new materialist conception of the gift. Second, it explores the analytical traction that can be gained from interlocking theories of the gift, feminist materialism and digital media, the result being a conceptual model that addresses the gift as a form of virtual-digital-material communication.

  18. Validating the BISON fuel performance code to integral LWR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gamble, K.A., E-mail: Kyle.Gamble@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Pastore, G., E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gardner, R.J., E-mail: Russell.Gardner@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Liu, W., E-mail: Wenfeng.Liu@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States); Mai, A., E-mail: Anh.Mai@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States)

    2016-05-15

    Highlights: • The BISON multidimensional fuel performance code is being validated to integral LWR experiments. • Code and solution verification are necessary prerequisites to validation. • Fuel centerline temperature comparisons through all phases of fuel life are very reasonable. • Accuracy in predicting fission gas release is consistent with state-of-the-art modeling and the involved uncertainties. • Rod diameter comparisons are not satisfactory and further investigation is underway. - Abstract: BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Results demonstrate that (1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments, (2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties and (3) comparison
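    The quoted validation metric (predictions within ±10% of measurements) amounts to a per-rod relative-deviation check; a minimal sketch with made-up numbers, not BISON data:

```python
def within_band(predicted, measured, tolerance=0.10):
    """True if the relative deviation |pred - meas| / |meas| is inside the band."""
    return abs(predicted - measured) / abs(measured) <= tolerance

# Illustrative centerline temperatures in K: (predicted, measured).
cases = [(1205.0, 1180.0), (905.0, 1010.0)]
results = [within_band(p, m) for p, m in cases]
```

    Consolidating such pass/fail flags over the 35 rods gives the kind of overall picture the abstract describes.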

  19. Performance of Low-Density Parity-Check Coded Modulation

    Science.gov (United States)

    Hamkins, Jon

    2010-01-01

    This paper reports the simulated performance of each of the nine accumulate-repeat-4-jagged-accumulate (AR4JA) low-density parity-check (LDPC) codes [3] when used in conjunction with binary phase-shift keying (BPSK), quadrature PSK (QPSK), 8-PSK, 16-ary amplitude PSK (16-APSK), and 32-APSK. We also report the performance under various mappings of bits to modulation symbols, 16-APSK and 32-APSK ring scalings, log-likelihood ratio (LLR) approximations, and decoder variations. One of the simple and well-performing LLR approximations can be expressed in a general equation that applies to all of the modulation types.
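    For BPSK over an AWGN channel, the exact bit LLR has the closed form LLR = 2r/σ²; the sketch below (illustrative only, not the paper's general approximation) computes it and checks that hard decisions follow the LLR sign:

```python
import numpy as np

rng = np.random.default_rng(0)

# BPSK mapping: bit 0 -> +1, bit 1 -> -1, over an AWGN channel
bits = rng.integers(0, 2, size=8)
symbols = 1.0 - 2.0 * bits
sigma = 0.8                               # noise standard deviation
received = symbols + sigma * rng.normal(size=bits.size)

# Exact bit LLR for BPSK: log[P(b=0|r)/P(b=1|r)] = 2*r/sigma^2
llr = 2.0 * received / sigma**2

# Hard decision: negative LLR means bit 1 is more likely
hard = (llr < 0).astype(int)
```

    A soft-input LDPC decoder consumes `llr` directly; taking only its sign reproduces the hard decision.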

  20. Organizational Culture and Safety Performance in the Manufacturing Companies in Malaysia: A Conceptual Analysis

    OpenAIRE

    Ong Choon Hee; Lim Lee Ping

    2014-01-01

    The purpose of this paper is to provide a conceptual analysis of organizational culture and safety performance in the manufacturing companies in Malaysia. Our conceptual analysis suggests that manufacturing companies that adopt group culture or hierarchical culture are more likely to demonstrate safety compliance and safety participation. Manufacturing companies that adopt rational culture or developmental culture are less likely to demonstrate safety compliance and safety participation. Give...

  1. Interim performance specifications for conceptual waste-package designs for geologic isolation in salt repositories

    International Nuclear Information System (INIS)

    1983-06-01

    The interim performance specifications and data requirements presented apply to conceptual waste package designs for all waste forms which will be isolated in salt geologic repositories. The waste package performance specifications and data requirements respond to the waste package performance criteria. Subject areas treated include: containment and controlled release, operational period safety, criticality control, identification, and waste package performance testing requirements. This document was generated for use in the development of conceptual waste package designs in salt. It will be revised as additional data, analyses, and regulatory requirements become available.

  2. Setting live coding performance in wider historical contexts

    OpenAIRE

    Norman, Sally Jane

    2016-01-01

    This paper sets live coding in the wider context of performing arts, construed as the poetic modelling and projection of liveness. Concepts of liveness are multiple, evolving, and scale-dependent: entities considered live from different cultural perspectives range from individual organisms and social groupings to entire ecosystems, and consequently reflect diverse temporal and spatial orders. Concepts of liveness moreover evolve with our tools, which generate and reveal new senses and places ...

  3. Use of advanced simulations in fuel performance codes

    International Nuclear Information System (INIS)

    Van Uffelen, P.

    2015-01-01

    The simulation of cylindrical fuel rod behaviour in a reactor or in a spent fuel storage pool requires a fuel performance code. Such a tool solves the equations for heat transfer, the stresses and strains in the fuel and cladding, the evolution of several isotopes, and the behaviour of various fission products in the fuel rod. The main equations, along with their limitations, are briefly described. The current approaches adopted for overcoming these limitations and the perspectives are also outlined. (author)

  4. Modelling of LOCA Tests with the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L [Idaho National Laboratory; Pastore, Giovanni [Idaho National Laboratory; Novascone, Stephen Rhead [Idaho National Laboratory; Spencer, Benjamin Whiting [Idaho National Laboratory; Hales, Jason Dean [Idaho National Laboratory

    2016-05-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.

  5. ATES/heat pump simulations performed with ATESSS code

    Science.gov (United States)

    Vail, L. W.

    1989-01-01

    Modifications to the Aquifer Thermal Energy Storage System Simulator (ATESSS) allow simulation of aquifer thermal energy storage (ATES)/heat pump systems. The heat pump algorithm requires a coefficient of performance (COP) relationship of the form COP = COP_base + α·(T_ref − T_base). Initial applications of the modified ATESSS code to synthetic building load data for two sizes of buildings in two U.S. cities showed an insignificant performance advantage of a series ATES heat pump system over a conventional groundwater heat pump system. The addition of algorithms for a cooling tower and solar array improved performance slightly. Small values of α in the COP relationship are the principal reason for the limited improvement in system performance. Future studies at Pacific Northwest Laboratory (PNL) are planned to investigate methods to increase system performance using alternative system configurations and operations scenarios.
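    The linear COP relationship used by the modified code is simple enough to evaluate directly; the parameter values below are illustrative stand-ins, not figures from the study:

```python
def heat_pump_cop(t_ref, cop_base=3.0, t_base=10.0, alpha=0.05):
    """Linear COP model: COP = COP_base + alpha * (T_ref - T_base).
    The default values are hypothetical, chosen only to illustrate
    why a small alpha limits the benefit of a warmer source."""
    return cop_base + alpha * (t_ref - t_base)

# Raising the source temperature by 10 degC lifts COP by only 0.5,
# consistent with the limited system-level gain reported above.
print(heat_pump_cop(20.0))  # -> 3.5
print(heat_pump_cop(12.0))  # -> 3.1
```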

  6. Verona Coding Definitions of Emotional Sequences (VR-CoDES): Conceptual framework and future directions.

    Science.gov (United States)

    Piccolo, Lidia Del; Finset, Arnstein; Mellblom, Anneli V; Figueiredo-Braga, Margarida; Korsvold, Live; Zhou, Yuefang; Zimmermann, Christa; Humphris, Gerald

    2017-12-01

    To discuss the theoretical and empirical framework of the VR-CoDES and potential future directions in research based on the coding system. The paper is based on a selective review of papers relevant to the construction and application of the VR-CoDES. The VR-CoDES system is rooted in the patient-centered and biopsychosocial models of healthcare consultations and in a functional approach to emotion theory. According to the VR-CoDES, emotional interaction is studied in terms of sequences consisting of an eliciting event, an emotional expression by the patient, and the immediate response by the clinician. The rationale for the emphasis on sequences, on the detailed classification of cues and concerns, and on the choices of explicit vs. non-explicit responses and of providing vs. reducing room for further disclosure, as the basic categories of clinician responses, is described. Results from research on the VR-CoDES may help raise awareness of emotional sequences. Future directions in applying the VR-CoDES in research may include studies on predicting patient and clinician behavior within the consultation, qualitative analyses of longer sequences including several VR-CoDES triads, and studies of the effects of emotional communication on health outcomes. The VR-CoDES may be applied to develop interventions to promote good handling of patients' emotions in healthcare encounters. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Cost and performance analysis of conceptual designs of physical protection systems

    International Nuclear Information System (INIS)

    Hicks, M.J.; Snell, M.S.; Sandoval, J.S.; Potter, C.S.

    1998-01-01

    CPA -- Cost and Performance Analysis -- is a methodology that joins activity-based cost (ABC) estimation with performance-based analysis of physical protection systems. CPA offers system managers an approach that supports both tactical decision making and strategic planning. Current exploratory applications of the CPA methodology are addressing the analysis of alternative conceptual designs. To support these activities, the original architecture for CPA is being expanded to incorporate results from a suite of performance and consequence analysis tools such as JTS (Joint Tactical Simulation), ERAD (Explosive Release Atmospheric Dispersion) and blast effect models. The process flow for applying CPA to the development and analysis of conceptual designs is illustrated graphically.

  8. Performance analysis of LDPC codes on OOK terahertz wireless channels

    Science.gov (United States)

    Chun, Liu; Chang, Wang; Jun-Cheng, Cao

    2016-02-01

    Atmospheric absorption, scattering, and scintillation are the major causes of degraded transmission quality in terahertz (THz) wireless communications. An error control coding scheme based on low-density parity-check (LDPC) codes with a soft-decision decoding algorithm is proposed to improve the bit-error-rate (BER) performance of an on-off keying (OOK) modulated THz signal transmitted through the atmospheric channel. The THz wave propagation characteristics and the channel model in the atmosphere are set up. Numerical simulations validate the strong performance of LDPC codes against atmospheric fading and demonstrate their considerable potential for future ultra-high-speed (beyond Gbps) THz communications. Project supported by the National Key Basic Research Program of China (Grant No. 2014CB339803), the National High Technology Research and Development Program of China (Grant No. 2011AA010205), the National Natural Science Foundation of China (Grant Nos. 61131006, 61321492, and 61204135), the Major National Development Project of Scientific Instrument and Equipment (Grant No. 2011YQ150021), the National Science and Technology Major Project (Grant No. 2011ZX02707), the International Collaboration and Innovation Program on High Mobility Materials Engineering of the Chinese Academy of Sciences, and the Shanghai Municipal Commission of Science and Technology (Grant No. 14530711300).
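    As a baseline for such studies, the BER of uncoded OOK with hard-decision threshold detection over an AWGN channel can be estimated by Monte Carlo simulation (parameters below are illustrative; no LDPC decoding is included):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
bits = rng.integers(0, 2, size=n)

# OOK: '1' -> amplitude 1.0, '0' -> amplitude 0.0, plus Gaussian noise
sigma = 0.4
received = bits + sigma * rng.normal(size=n)

# Hard-decision detection with the threshold midway between the levels
decisions = (received > 0.5).astype(int)
ber = np.mean(decisions != bits)
# ber is close to Q(0.5/sigma), about 0.105 for this noise level
```

    A soft-decision LDPC decoder, as proposed in the paper, would operate on channel LLRs rather than on these hard decisions.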

  9. High-Performance Green Building: Towards a Conceptual Framework

    CSIR Research Space (South Africa)

    Van Wyk, Llewellyn

    2016-10-01

    Full Text Available consumption is building performance design targets (AIA 2005:2). Kibert notes that “a unique vocabulary is emerging to describe concepts related to sustainability” including concepts such as “Factor 4 and Factor 10, ecological footprint, ecological rucksack...

  10. Generating performance portable geoscientific simulation code with Firedrake (Invited)

    Science.gov (United States)

    Ham, D. A.; Bercea, G.; Cotter, C. J.; Kelly, P. H.; Loriant, N.; Luporini, F.; McRae, A. T.; Mitchell, L.; Rathgeber, F.

    2013-12-01

    This presentation will demonstrate how a change in simulation programming paradigm can be exploited to deliver sophisticated simulation capability which is far easier to programme than are conventional models, is capable of exploiting different emerging parallel hardware, and is tailored to the specific needs of geoscientific simulation. Geoscientific simulation represents a grand challenge computational task: many of the largest computers in the world are tasked with this field, and the requirements of resolution and complexity of scientists in this field are far from being sated. However, single thread performance has stalled, even sometimes decreased, over the last decade, and has been replaced by ever more parallel systems: both as conventional multicore CPUs and in the emerging world of accelerators. At the same time, the needs of scientists to couple ever-more complex dynamics and parametrisations into their models makes the model development task vastly more complex. The conventional approach of writing code in low level languages such as Fortran or C/C++ and then hand-coding parallelism for different platforms by adding library calls and directives forces the intermingling of the numerical code with its implementation. This results in an almost impossible set of skill requirements for developers, who must simultaneously be domain science experts, numericists, software engineers and parallelisation specialists. Even more critically, it requires code to be essentially rewritten for each emerging hardware platform. Since new platforms are emerging constantly, and since code owners do not usually control the procurement of the supercomputers on which they must run, this represents an unsustainable development load. The Firedrake system, conversely, offers the developer the opportunity to write PDE discretisations in the high-level mathematical language UFL from the FEniCS project (http://fenicsproject.org). Non-PDE model components, such as parametrisations

  11. University Library Strategy Development: A Conceptual Model of Researcher Performance to Inform Service Delivery

    Science.gov (United States)

    Maddox, Alexia; Zhao, Linlin

    2017-01-01

    This case study presents a conceptual model of researcher performance developed by Deakin University Library, Australia. The model aims to organize research performance data into meaningful researcher profiles, referred to as researcher typologies, which support the demonstration of research impact and value. Three dimensions shaping researcher…

  12. On the Performance of a Multi-Edge Type LDPC Code for Coded Modulation

    NARCIS (Netherlands)

    Cronie, H.S.

    2005-01-01

    We present a method to combine error-correction coding and spectral-efficient modulation for transmission over the Additive White Gaussian Noise (AWGN) channel. The code employs signal shaping which can provide a so-called shaping gain. The code belongs to the family of sparse graph codes for which

  13. Stereotype threat? Effects of inquiring about test takers' gender on conceptual test performance in physics

    Science.gov (United States)

    Maries, Alexandru; Singh, Chandralekha

    2015-12-01

    It has been found that activation of a stereotype, for example by indicating one's gender before a test, typically alters performance in a way consistent with the stereotype, an effect called "stereotype threat." On a standardized conceptual physics assessment, we found that asking test takers to indicate their gender right before taking the test did not deteriorate performance compared to an equivalent group who did not provide gender information. Although a statistically significant gender gap was present on the standardized test whether or not students indicated their gender, no gender gap was observed on the multiple-choice final exam students took, which included both quantitative and conceptual questions on similar topics.

  14. COMPASS: A source term code for investigating capillary barrier performance

    International Nuclear Information System (INIS)

    Zhou, Wei; Apted, J.J.

    1996-01-01

    A computer code, COMPASS, based on a compartment model approach has been developed to calculate the near-field source term of a high-level-waste repository under unsaturated conditions. COMPASS is applied to evaluate the expected performance of Richards (capillary) barriers as backfills to divert infiltrating groundwater at Yucca Mountain. Comparing the release rates of four typical nuclides with and without the Richards barrier shows that the barrier significantly decreases the peak release rates from the Engineered Barrier System (EBS) into the host rock.

  15. Performance analysis of simultaneous dense coding protocol under decoherence

    Science.gov (United States)

    Huang, Zhiming; Zhang, Cai; Situ, Haozhen

    2017-09-01

    The simultaneous dense coding (SDC) protocol is useful in designing quantum protocols. We analyze the performance of the SDC protocol under the influence of noisy quantum channels. Six kinds of paradigmatic Markovian noise along with one kind of non-Markovian noise are considered. The joint success probability of both receivers and the success probabilities of one receiver are calculated for three different locking operators. Some interesting properties have been found, such as invariance and symmetry. Among the three locking operators we consider, the SWAP gate is most resistant to noise and results in the same success probabilities for both receivers.
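    The SDC protocol builds on ordinary dense coding, in which the four Pauli encodings of one half of a Bell pair produce mutually orthogonal states; a small numpy check of that orthogonality (a generic illustration, not the SDC protocol itself):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Shared Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# The sender encodes two classical bits by applying I, X, Z, or XZ
# to her qubit (the first of the pair)
encodings = {"00": I2, "01": X, "10": Z, "11": X @ Z}
states = {b: np.kron(U, I2) @ phi_plus for b, U in encodings.items()}

# The four encoded states are orthogonal Bell states, hence perfectly
# distinguishable by a joint (Bell-basis) measurement at the receiver
gram = np.array([[abs(np.vdot(states[a], states[b]))
                  for b in states] for a in states])
```

    The Gram matrix of overlaps is the 4x4 identity, which is exactly what makes two classical bits per transmitted qubit recoverable.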

  16. Safety analysis of MOX fuels by fuel performance code

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2002-12-01

    The performance of plutonium-rich mixed oxide fuels specified for the Reduced-Moderation Water Reactor (RMWR) has been analysed with a modified fuel performance code. Thermodynamic properties of these fuels up to a burnup of 120 GWd/t have not been measured and are estimated using existing uranium fuel models. Fission product release, the pressure rise inside the fuel rods, and the mechanical loads on the fuel cans due to internal pressure have been preliminarily assessed based on an assumed axial power distribution history; the results support the integrity of fuel performance. Detailed evaluation of fuel-cladding interactions due to thermal expansion or swelling of the fuel pellets at high burnup will be required for the safety analysis of mixed oxide fuels. The thermal conductivity and swelling of plutonium-rich mixed oxide fuels shall be taken into consideration. (T. Tanaka)

  17. Automatic performance estimation of conceptual temperature control system design for rapid development of real system

    International Nuclear Information System (INIS)

    Jang, Yu Jin

    2013-01-01

    This paper presents an automatic performance estimation scheme of conceptual temperature control system with multi-heater configuration prior to constructing the physical system for achieving rapid validation of the conceptual design. An appropriate low-order discrete-time model, which will be used in the controller design, is constructed after determining several basic factors including the geometric shape of controlled object and heaters, material properties, heater arrangement, etc. The proposed temperature controller, which adopts the multivariable GPC (generalized predictive control) scheme with scale factors, is then constructed automatically based on the above model. The performance of the conceptual temperature control system is evaluated by using a FEM (finite element method) simulation combined with the controller.
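    The kind of low-order discrete-time model referred to above can be as simple as a first-order heat balance; the sketch below substitutes a saturated proportional controller for the paper's multivariable GPC, with purely illustrative parameters:

```python
# First-order discrete-time thermal model (illustrative, not the paper's):
#   T[k+1] = a*T[k] + b*u[k] + (1 - a)*T_amb
# a encodes heat retention, b the heater gain per sample.
a, b, T_amb = 0.95, 0.4, 25.0

def step(T, u):
    return a * T + b * u + (1 - a) * T_amb

# Saturated proportional control toward a 60 degC setpoint,
# standing in for the multivariable GPC scheme of the paper
setpoint, T, u_max = 60.0, 25.0, 10.0
for _ in range(200):
    u = min(max(2.0 * (setpoint - T), 0.0), u_max)
    T = step(T, u)
# T settles just below the setpoint: proportional control leaves a
# steady-state offset that a predictive controller would remove
```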

  18. Performance and Fabrication Status of TREAT LEU Conversion Conceptual Design Concepts

    Energy Technology Data Exchange (ETDEWEB)

    IJ van Rooyen; SR Morrell; AE Wright; E. P Luther; K Jamison; AL Crawford; HT III Hartman

    2014-10-01

    Resumption of transient testing at the TREAT facility was approved in February 2014 to meet U.S. Department of Energy (DOE) objectives. The National Nuclear Security Administration's Global Threat Reduction Initiative Convert Program is evaluating conversion of TREAT from its existing highly enriched uranium (HEU) core to a new core containing low enriched uranium (LEU). This paper briefly describes the initial pre-conceptual design screening decisions, with more detailed discussion of the current feasibility, qualification, and fabrication approaches. Feasible fabrication will be shown for a LEU fuel element assembly that can meet TREAT design, performance, and safety requirements. The statement of feasibility recognizes that further development, analysis, and testing must be completed to refine the conceptual design. Engineering challenges such as cladding oxidation, high temperature material properties, and fuel block fabrication, along with neutronics performance, will be highlighted. Preliminary engineering and supply chain evaluation provided confidence that the conceptual designs can be achieved.

  19. Automatic performance estimation of conceptual temperature control system design for rapid development of real system

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Yu Jin [Dongguk University, GyeongJu (Korea, Republic of)

    2013-07-15

    This paper presents an automatic performance estimation scheme of conceptual temperature control system with multi-heater configuration prior to constructing the physical system for achieving rapid validation of the conceptual design. An appropriate low-order discrete-time model, which will be used in the controller design, is constructed after determining several basic factors including the geometric shape of controlled object and heaters, material properties, heater arrangement, etc. The proposed temperature controller, which adopts the multivariable GPC (generalized predictive control) scheme with scale factors, is then constructed automatically based on the above model. The performance of the conceptual temperature control system is evaluated by using a FEM (finite element method) simulation combined with the controller.

  20. Performance Analysis for Cooperative Communication System with QC-LDPC Codes Constructed with Integer Sequences

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2015-01-01

    Full Text Available This paper presents four different integer sequences for constructing quasi-cyclic low-density parity-check (QC-LDPC) codes on a rigorous mathematical basis, and introduces the coding principle and encoding procedure. QC-LDPC codes constructed from the four integer sequences are compared with LDPC codes obtained using the PEG algorithm, with array codes, and with MacKay codes, respectively. The integer-sequence QC-LDPC codes are then used in coded cooperative communication. Simulation results show that the integer-sequence QC-LDPC codes are effective, and their overall performance is better than that of the other types of LDPC codes in coded cooperative communication. The QC-LDPC codes constructed from the Dayan integer sequence perform best.
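    A QC-LDPC parity-check matrix is assembled from circulant permutation blocks whose shift values come from the chosen integer sequence; the shift table below is a generic illustration, not one of the paper's four sequences:

```python
import numpy as np

def circulant(shift, size):
    """size x size identity matrix with columns cyclically shifted."""
    return np.roll(np.eye(size, dtype=int), shift, axis=1)

def qc_ldpc_h(shift_table, size):
    """Assemble the parity-check matrix H from a table of shift values."""
    return np.block([[circulant(s, size) for s in row]
                     for row in shift_table])

# Generic shift table s_ij = i*j mod p (illustrative only)
p = 5
shifts = [[(i * j) % p for j in range(4)] for i in range(2)]
H = qc_ldpc_h(shifts, p)
# H is a 10 x 20 binary matrix with row weight 4 and column weight 2,
# i.e. a regular sparse structure built entirely from circulants
```

    The block structure is what makes QC-LDPC codes attractive in practice: encoding and storage reduce to the small shift table rather than the full matrix.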

  1. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corp., Tokyo (Japan)

    2012-07-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of this description, gives detailed explanations of operation method of FEMAXI-7 code and its related codes, methods of Input/Output, methods of source code modification, features of subroutine modules, and internal variables in a specific manner in order to facilitate users to perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  2. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2012-07-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of this description, gives detailed explanations of operation method of FEMAXI-7 code and its related codes, methods of Input/Output, methods of source code modification, features of subroutine modules, and internal variables in a specific manner in order to facilitate users to perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  3. The error performance analysis over cyclic redundancy check codes

    Science.gov (United States)

    Yoon, Hee B.

    1991-06-01

    Burst errors are generated in digital communication networks by various unpredictable conditions; they occur at high error rates, for short durations, and can impact services. To completely describe a burst error one has to know the bit pattern, which is impossible in practice on working systems. Therefore, under memoryless binary symmetric channel (MBSC) assumptions, performance evaluation or estimation schemes for digital signal 1 (DS1) transmission systems carrying live traffic are an interesting and important problem. This study presents some analytical methods leading to efficient algorithms for detecting burst errors using cyclic redundancy check (CRC) codes. The definition of a burst error is introduced using three different models; among the three, the mathematical model is used in this study. A probability density function f(b) for burst errors of length b is proposed. The performance of CRC-n codes is evaluated and analyzed using f(b) in a computer simulation model of CRC-block burst errors. The simulation result shows that the mean block burst error tends to approach the pattern of burst errors generated by random bit errors.
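    The burst-detection property such analyses rely on is that a degree-r CRC generator detects every burst of length at most r; a small sketch with a hypothetical 12-bit message and the CRC-8 generator x^8 + x^2 + x + 1:

```python
def crc_remainder(bits, poly):
    """Remainder of the message (a bit list) divided by the generator
    polynomial `poly` (bit list, MSB first); returns len(poly)-1 bits."""
    bits = bits + [0] * (len(poly) - 1)   # make room for the check bits
    for i in range(len(bits) - len(poly) + 1):
        if bits[i]:
            for j, p in enumerate(poly):
                bits[i + j] ^= p
    return bits[-(len(poly) - 1):]

# CRC-8 generator x^8 + x^2 + x + 1
poly = [1, 0, 0, 0, 0, 0, 1, 1, 1]
data = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]   # arbitrary example message
check = crc_remainder(data, poly)

# A burst error of length 6 (<= 8) flips a contiguous run of bits...
corrupted = data[:]
for k in range(3, 9):
    corrupted[k] ^= 1

# ...and is guaranteed to be detected: the recomputed check differs
assert crc_remainder(corrupted, poly) != check
```

    The transmitted codeword `data + check` itself divides evenly by the generator, which is how the receiver verifies integrity.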

  4. Isotopic modelling using the ENIGMA-B fuel performance code

    International Nuclear Information System (INIS)

    Rossiter, G.D.; Cook, P.M.A.; Weston, R.

    2001-01-01

    A number of experimental programmes by BNFL and other MOX fabricators have now shown that the in-pile performance of MOX fuel is generally similar to that of conventional UO2 fuel. Models based on UO2 fuel experience form a good basis for a description of MOX fuel behaviour. However, an area where the performance of MOX fuel is sufficiently different from that of UO2 to warrant model changes is in the radial power and burnup profile. The differences in radial power and burnup profile arise from the presence of significant concentrations of plutonium in MOX fuel, at beginning of life, and their subsequent evolution with burnup. Amongst other effects, plutonium has a greater neutron absorption cross-section than uranium. This paper focuses on the development of a new model for the radial power and burnup profile within a UO2 or MOX fuel rod, in which the underlying fissile isotope concentration distributions are tracked during irradiation. The new model has been incorporated into the ENIGMA-B fuel performance code and has been extended to track the isotopic concentrations of the fission gases, xenon and krypton. The calculated distributions have been validated against results from rod puncture measurements and electron probe micro-analysis (EPMA) linescans, performed during the M501 post irradiation examination (PIE) programme. The predicted gas inventory of the fuel/clad gap is compared with the isotopic composition measured during rod puncture and the measured radial distributions of burnup (from neodymium measurements) and plutonium in the fuel are compared with the calculated distributions. It is shown that there is good agreement between the code predictions and the measurements. (author)

  5. The Effect of Contextual and Conceptual Rewording on Mathematical Problem-Solving Performance

    Science.gov (United States)

    Haghverdi, Majid; Wiest, Lynda R.

    2016-01-01

    This study shows how separate and combined contextual and conceptual problem rewording can positively influence student performance in solving mathematical word problems. Participants included 80 seventh-grade Iranian students randomly assigned in groups of 20 to three experimental groups involving three types of rewording and a control group. All…

  6. A Conceptual Framework of Corporate and Business Ethics across Organizations: Structures, Processes and Performance

    Science.gov (United States)

    Svensson, Goran; Wood, Greg

    2011-01-01

    Purpose: The objective of this paper is to introduce and describe a conceptual framework of corporate and business ethics across organizations in terms of ethical structures, ethical processes and ethical performance. Design/methodology/approach: A framework is outlined and positioned incorporating an ethical frame of reference in the field of…

  7. Does the new conceptual framework provide adequate concepts for reporting relevant information about performance?

    NARCIS (Netherlands)

    Brouwer, A.; Faramarzi, A; Hoogendoorn, M.

    2014-01-01

    The basic question we raise in this paper is whether the 2013 Discussion Paper (DP 2013) on the Conceptual Framework provides adequate principles for reporting an entity’s performance and what improvements could be made in light of both user needs and evidence from academic literature. DP 2013

  8. Development of Tokamak reactor system code and conceptual studies of DEMO with He Cooled Molten Li blanket

    International Nuclear Information System (INIS)

    Hong, B.G.; Lee, Dong Won; Kim, Yong Hi

    2007-01-01

    To develop concepts for fusion power plants and identify their design parameters, we have been developing a tokamak reactor system code. The system code can take into account a wide range of plasma physics and technology effects simultaneously, and it can be used to find design parameters that optimize a given figure of merit. The outcome of system studies using the code is to identify which areas of plasma physics and technology, and to what extent, should be developed for the realization of a given fusion power plant concept. As an application of the tokamak reactor system code, we investigate the performance of a DEMO for early realization with a limited extension from the plasma physics and technology used in the design of ITER. The main requirements for DEMO are selected as: 1) to demonstrate tritium self-sufficiency, 2) to generate net electricity, and 3) to operate in steady state. The size of the plasma is assumed to be the same as that of ITER, and the plasma parameters which characterize the performance, i.e. the normalized β value β_N, the confinement improvement factor for the H-mode H, and the ratio to the Greenwald density limit n/n_G, are assumed to be improved beyond those of ITER: β_N > 2.0, H > 1.0 and n/n_G > 1.0. Tritium self-sufficiency is provided by the He Cooled Molten Lithium (HCML) blanket with a total thickness of 2.5 m including the shield. With n/n_G > 1.2, a net electric power greater than 500 MW is possible with β_N > 4.0 and H > 1.2. In accessing the operation space for higher electric power, the main restrictions are the divertor heat load and the steady-state operation requirements. Developments in both plasma physics and technology are required to handle the high heat load and to increase the current drive efficiency. (orig.)

  9. Software Design Document for the AMP Nuclear Fuel Performance Code

    International Nuclear Information System (INIS)

    Philip, Bobby; Clarno, Kevin T.; Cochran, Bill

    2010-01-01

    The purpose of this document is to describe the design of the AMP nuclear fuel performance code. It provides an overview of the decomposition into separable components, an overview of what those components will do, and the strategic basis for the design. The primary components of a computational physics code include a user interface, physics packages, material properties, mathematics solvers, and computational infrastructure. Some capability from established off-the-shelf (OTS) packages will be leveraged in the development of AMP, but the primary physics components will be entirely new. The material properties required by these physics operators include many highly non-linear properties, which will be replicated from FRAPCON and LIFE where applicable, as well as some computationally-intensive operations, such as gap conductance, which depends upon the plenum pressure. Because there is extensive capability in off-the-shelf leadership class computational solvers, AMP will leverage the Trilinos, PETSc, and SUNDIALS packages. The computational infrastructure includes a build system, mesh database, and other building blocks of a computational physics package. The user interface will be developed through a collaborative effort with the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Capability Transfer program element as much as possible and will be discussed in detail in a future document.

  10. BER performance comparison of optical CDMA systems with/without turbo codes

    Science.gov (United States)

    Kulkarni, Muralidhar; Chauhan, Vijender S.; Dutta, Yashpal; Sinha, Ravindra K.

    2002-08-01

    In this paper, we analyze and simulate the BER performance of a turbo coded optical code-division multiple-access (TC-OCDMA) system. A performance comparison is made between uncoded OCDMA and TC-OCDMA systems employing various OCDMA address codes (optical orthogonal codes (OOCs), generalized multiwavelength prime codes (GMWPCs), and generalized multiwavelength Reed-Solomon codes (GMWRSCs)). The BER performance of TC-OCDMA systems is analyzed and simulated by varying the code weight of the address code employed by the system. From the simulation results, it is observed that TC-OCDMA systems can employ lower-weight address codes while achieving BER performance equivalent to that of uncoded systems employing higher-weight address codes, for a fixed number of active users.
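
    The coding-gain comparison described above can be conveyed with a toy Monte Carlo experiment. The sketch below uses a rate-1/3 repetition code with soft combining over an AWGN channel as a simple stand-in for the paper's turbo code (turbo encoding/decoding and the OCDMA address codes themselves are omitted); the SNR value and frame size are arbitrary choices, and the comparison is made at a fixed per-channel-symbol SNR:

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=1)

    n_bits = 20000
    es_n0_db = 2.0                       # per-channel-symbol SNR in dB (arbitrary choice)
    sigma = np.sqrt(1.0 / (2 * 10 ** (es_n0_db / 10)))

    bits = rng.integers(0, 2, n_bits)
    symbols = 1.0 - 2.0 * bits           # BPSK mapping: 0 -> +1, 1 -> -1

    # Uncoded transmission: one noisy symbol per information bit.
    rx_uncoded = symbols + sigma * rng.normal(size=n_bits)
    ber_uncoded = np.mean((rx_uncoded < 0) != (bits == 1))

    # Rate-1/3 repetition "code": three noisy copies, soft-combined at the receiver.
    rx_coded = symbols + sigma * rng.normal(size=(3, n_bits))
    ber_coded = np.mean((rx_coded.sum(axis=0) < 0) != (bits == 1))

    print(f"uncoded BER: {ber_uncoded:.4f}, repetition-coded BER: {ber_coded:.4f}")
    ```

    Soft combining triples the effective SNR, so the coded BER is well below the uncoded one at the same per-symbol SNR; a real turbo code achieves far larger gains at a smaller rate penalty.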

  11. Early Experiences Writing Performance Portable OpenMP 4 Codes

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, Wayne [ORNL; Hernandez, Oscar R [ORNL

    2016-01-01

    In this paper, we evaluate the recently available directives in OpenMP 4 to parallelize a computational kernel using both the traditional shared memory approach and the newer accelerator targeting capabilities. In addition, we explore various transformations that attempt to increase application performance portability, and examine the expressiveness and performance implications of using these approaches. For example, we want to understand if the target map directives in OpenMP 4 improve data locality when mapped to a shared memory system, as opposed to the first-touch policy approach of traditional OpenMP. To that end, we use recent Cray and Intel compilers to measure the performance variations of a simple application kernel when executed on the OLCF's Titan supercomputer with NVIDIA GPUs and the Beacon system with Intel Xeon Phi accelerators attached. To better understand these trade-offs, we compare our results from traditional OpenMP shared memory implementations to the newer accelerator programming model when it is used to target both the CPU and an attached heterogeneous device. We believe the results and lessons learned presented in this paper will be useful to the larger user community by providing guidelines that can assist programmers in the development of performance portable code.

  12. Test Code Quality and Its Relation to Issue Handling Performance

    NARCIS (Netherlands)

    Athanasiou, D.; Nugroho, A.; Visser, J.; Zaidman, A.

    2014-01-01

    Automated testing is a basic principle of agile development. Its benefits include early defect detection, defect cause localization and removal of fear to apply changes to the code. Therefore, maintaining high quality test code is essential. This study introduces a model that assesses test code

  13. Fire-safety engineering and performance-based codes

    DEFF Research Database (Denmark)

    Sørensen, Lars Schiøtt

    Fire-safety Engineering is written as a textbook for Engineering students at universities and other institutions of higher education that teach in the area of fire. The book can also be used as a work of reference for consulting engineers, building product manufacturers, contractors, building project administrators, etc. The book deals with the following topics: • Historical presentation on the subject of fire • Legislation and building project administration • European fire standardization • Passive and active fire protection • Performance-based codes • Fire-safety engineering • Fundamental thermodynamics • Heat exchange during the fire process • Skin burns • Burning rate, energy release rate and design fires • Proposal to risk-based design fires • Proposal to a fire scale • Material ignition and flame spread • Fire dynamics in buildings • Combustion products and toxic gases • Smoke inhalation...

  14. Assessment of MARMOT. A Mesoscale Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Tonks, M. R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Zhang, Y. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chakraborty, P. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bai, X. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Fromm, B. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yu, J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Teague, M. C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andersson, D. A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-04-01

    MARMOT is the mesoscale fuel performance code under development as part of the US DOE Nuclear Energy Advanced Modeling and Simulation Program. In this report, we provide a high-level summary of MARMOT, its capabilities, and its current state of validation. The purpose of MARMOT is to predict the coevolution of microstructure and material properties of nuclear fuel and cladding. It accomplishes this using the phase field method coupled to solid mechanics and heat conduction. MARMOT is based on the Multiphysics Object-Oriented Simulation Environment (MOOSE), and much of its basic capability in the areas of the phase field method, mechanics, and heat conduction comes directly from MOOSE modules. However, additional capability specific to fuel and cladding is available in MARMOT. While some validation of MARMOT has been completed in the areas of fission gas behavior and grain growth, much more validation remains to be conducted, and new mesoscale data need to be obtained in order to complete it.
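
    The phase field method named above evolves a continuous order parameter by relaxing a free-energy functional. As a loose, self-contained illustration of the idea (this is not MARMOT/MOOSE code; the double-well potential, mobility, and grid parameters are arbitrary choices), a 1D Allen-Cahn equation can be time-stepped explicitly:

    ```python
    import numpy as np

    # 1D Allen-Cahn: d(phi)/dt = kappa * d2(phi)/dx2 - f'(phi),
    # with the double-well free energy f(phi) = phi^2 (1 - phi)^2.
    n, dx, dt, kappa, steps = 100, 1.0, 0.2, 1.0, 500

    x = np.arange(n) * dx
    phi = 0.5 * (1.0 + np.tanh((x - x.mean()) / 4.0))  # diffuse interface at the center

    def dfdphi(p):
        # derivative of the double-well potential phi^2 (1 - phi)^2
        return 2.0 * p * (1.0 - p) * (1.0 - 2.0 * p)

    for _ in range(steps):
        padded = np.pad(phi, 1, mode="edge")            # zero-flux boundaries
        lap = (padded[2:] - 2.0 * phi + padded[:-2]) / dx**2
        phi = phi + dt * (kappa * lap - dfdphi(phi))

    # The two phases (phi ~ 0 and phi ~ 1) persist, separated by a smooth interface.
    print(phi[0], phi[-1])
    ```

    The two minima of the double well act as stable phases; MARMOT couples equations of this type to mechanics and heat conduction in multiple dimensions.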

  15. A Conceptual Framework for Evaluation of Public Health and Primary Care System Performance in Iran

    Science.gov (United States)

    Jahanmehr, Nader; Rashidian, Arash; Khosravi, Ardeshir; Farzadfar, Farshad; Shariati, Mohammad; Majdzadeh, Reza; Sari, Ali Akbari; Mesdaghinia, Alireza

    2015-01-01

    Introduction: The main objective of this study was to design a conceptual framework, aligned with the policies and priorities of the Ministry of Health, to evaluate provincial public health and primary care performance and to assess their share in the overall health impact on the community. Methods: We used several tools and techniques, including systems thinking, a literature review to identify relevant attributes of health system performance frameworks, and interviews with the key stakeholders. PubMed, Scopus, Web of Science, Google Scholar and two specialized databases of Persian-language literature (IranMedex and SID) were searched using the main terms and keywords. Following decision-making and collective agreement among the different stakeholders, 51 core indicators were chosen from among 602 candidate indicators, in a four-stage process, for monitoring and evaluation of the Health Deputies. Results: We propose a conceptual framework that identifies the performance area of the Health Deputies among the other determinants of health and introduces a results chain for performance consisting of input, process, output and outcome indicators. We also propose five dimensions for measuring the performance of the Health Deputies: efficiency, effectiveness, equity, access and improvement of health status. Conclusion: The proposed conceptual framework clearly illustrates the Health Deputies' success in achieving the best health results and outcomes in the country. The commitment of the Ministry of Health and of the Health Deputies at the Universities of Medical Sciences is essential for full implementation of this framework and for providing the annual performance report. PMID:25946937

  16. A conceptual framework for evaluation of public health and primary care system performance in iran.

    Science.gov (United States)

    Jahanmehr, Nader; Rashidian, Arash; Khosravi, Ardeshir; Farzadfar, Farshad; Shariati, Mohammad; Majdzadeh, Reza; Akbari Sari, Ali; Mesdaghinia, Alireza

    2015-01-26

    The main objective of this study was to design a conceptual framework, aligned with the policies and priorities of the Ministry of Health, to evaluate provincial public health and primary care performance and to assess their share in the overall health impact on the community. We used several tools and techniques, including systems thinking, a literature review to identify relevant attributes of health system performance frameworks, and interviews with the key stakeholders. PubMed, Scopus, Web of Science, Google Scholar and two specialized databases of Persian-language literature (IranMedex and SID) were searched using the main terms and keywords. Following decision-making and collective agreement among the different stakeholders, 51 core indicators were chosen from among 602 candidate indicators, in a four-stage process, for monitoring and evaluation of the Health Deputies. We propose a conceptual framework that identifies the performance area of the Health Deputies among the other determinants of health and introduces a results chain for performance consisting of input, process, output and outcome indicators. We also propose five dimensions for measuring the performance of the Health Deputies: efficiency, effectiveness, equity, access and improvement of health status. The proposed conceptual framework clearly illustrates the Health Deputies' success in achieving the best health results and outcomes in the country. The commitment of the Ministry of Health and of the Health Deputies at the Universities of Medical Sciences is essential for full implementation of this framework and for providing the annual performance report.

  17. Conceptualizing performance of nursing care as a prerequisite for better measurement: a systematic and interpretive review

    Science.gov (United States)

    2013-01-01

    Background Despite the critical role of nursing care in determining high-performing healthcare delivery, performance science in this area is still at an early stage of development and nursing’s contribution most often remains invisible to policy-makers and managers. The objectives of this study were: 1) to develop a theoretically based framework to conceptualize nursing care performance; 2) to analyze how the different components of the framework have been operationalized in the literature; and 3) to develop a pool of indicators sensitive to various aspects of nursing care that can be used as a basis for designing a performance measurement system. Methods We carried out a systematic review of published literature across three databases (MEDLINE, EMBASE and CINAHL), focusing on literature between 1990 and 2008. Screening of 2,103 papers resulted in final selection of 101 papers. A detailed template was used to extract the data. For the analysis, we used the method of interpretive synthesis, focusing first on 31 papers with theoretical or conceptual frameworks; the remaining 70 articles were used to strengthen and consolidate the findings. Results Current conceptualizations of nursing care performance mostly reflect a system perspective that builds on system theory, Donabedian’s earlier works on healthcare organization, and Parsons’ theory of social action. Drawing on these foundational works and the evidence collated, the Nursing Care Performance Framework (NCPF) we developed conceptualizes nursing care performance as resulting from three nursing subsystems that operate together to achieve three key functions: (1) acquiring, deploying and maintaining nursing resources, (2) transforming nursing resources into nursing services, and (3) producing changes in patients’ conditions. Based on the literature review, these three functions are operationalized through 14 dimensions that cover 51 variables. The NCPF not only specifies core aspects of nursing performance

  18. Performance Analysis of New Binary User Codes for DS-CDMA Communication

    Science.gov (United States)

    Usha, Kamle; Jaya Sankar, Kottareddygari

    2016-03-01

    This paper analyzes new binary spreading codes through their correlation properties and also presents their performance over the additive white Gaussian noise (AWGN) channel. The proposed codes are constructed using Gray and inverse Gray codes: an n-bit Gray codeword is appended with its n-bit inverse Gray codeword to construct binary user codes of length 2n. Like Walsh codes, these binary user codes are available in sizes that are powers of two; additionally, code sets of length 6 and its even multiples are available. The simple construction technique and the generation of code sets of different sizes are the salient features of the proposed codes. Walsh codes and Gold codes are considered for comparison in this paper, as these are popularly used for synchronous and asynchronous multi-user communications, respectively. In the current work, the auto- and cross-correlation properties of the proposed codes are compared with those of Walsh codes and Gold codes. The performance of the proposed binary user codes for both synchronous and asynchronous direct-sequence CDMA communication over the AWGN channel is also discussed in this paper. The proposed binary user codes are found to be suitable for both synchronous and asynchronous DS-CDMA communication.
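
    A minimal sketch of the concatenation idea described above. It assumes the standard binary-reflected Gray code g(i) = i XOR (i >> 1) and, as a guess at the paper's "inverse Gray code", simply uses the bitwise complement of the Gray codeword; the actual construction in the paper may differ. Under that assumption, every 2n-chip user code has constant weight n:

    ```python
    def gray(i: int) -> int:
        """Binary-reflected Gray code of index i."""
        return i ^ (i >> 1)

    def user_code(i: int, n: int) -> list[int]:
        """2n-chip binary user code: n-bit Gray codeword followed by its complement
        (the complement is an assumption standing in for the paper's inverse Gray code)."""
        g = gray(i)
        bits = [(g >> k) & 1 for k in reversed(range(n))]
        return bits + [1 - b for b in bits]

    n = 4
    codes = [user_code(i, n) for i in range(2 ** n)]

    # Every code in the set has the same weight n, a property shared with
    # constant-weight spreading-code families.
    weights = {sum(c) for c in codes}
    print(len(codes), weights)  # -> 16 {4}
    ```

    Because the Gray map is a bijection on n-bit indices, the 2^n user codes are all distinct, and the complement half guarantees the constant weight regardless of the index.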

  19. Conceptual Kinematic Design and Performance Evaluation of a Chameleon-Like Service Robot for Space Stations

    Directory of Open Access Journals (Sweden)

    Marco Ceccarelli

    2015-03-01

    In this paper, a conceptual kinematic design of a chameleon-like robot with suitable mobility capacity is presented for service applications in space stations, as the result of design considerations with biomimetic inspiration from chameleons. Requirements and characteristics are discussed with the aim of identifying design problems and operation features. A feasibility study is described through performance evaluation using simulations for a basic operation characterization.

  20. Impact of intra-flow network coding on the relay channel performance: an analytical study

    OpenAIRE

    Apavatjrut , Anya; Goursaud , Claire; Jaffrès-Runser , Katia; Gorce , Jean-Marie

    2012-01-01

    One of the most powerful ways to achieve transmission reliability over wireless links is to employ efficient coding techniques. This paper investigates the performance of a transmission over a relay channel where information is protected by two layers of coding. In the first layer, transmission reliability is ensured by fountain coding at the source. The second layer incorporates network coding at the relay node. Thus, fountain coded packets are re-encoded at the relay...

  1. Performance Evaluation of HARQ Technique with UMTS Turbo Code

    Directory of Open Access Journals (Sweden)

    S. S. Brkić

    2011-11-01

    The hybrid automatic repeat request (HARQ) technique represents an error control principle which combines an error correcting code with an automatic repeat request (ARQ) procedure within the same transmission system. In this paper, using a Monte Carlo simulation process, the characteristics of the HARQ technique are determined for the case of the Universal Mobile Telecommunication System (UMTS) turbo code.
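
    The Monte Carlo flavor of such a study can be conveyed with a toy stop-and-wait ARQ model (the UMTS turbo encoder/decoder itself is far beyond the scope of a sketch). Below, each packet fails independently with a fixed packet error rate and is retransmitted until received; the simulated mean number of transmissions should approach the geometric-distribution value 1/(1 - PER). The PER and packet count are arbitrary choices:

    ```python
    import random

    random.seed(7)

    per = 0.3            # packet error rate per transmission attempt (arbitrary)
    n_packets = 100_000

    total_tx = 0
    for _ in range(n_packets):
        attempts = 1
        while random.random() < per:   # failed attempt -> retransmit
            attempts += 1
        total_tx += attempts

    mean_tx = total_tx / n_packets
    print(f"simulated mean transmissions: {mean_tx:.3f} (theory: {1 / (1 - per):.3f})")
    ```

    In a real HARQ scheme the per-attempt error rate drops with each retransmission, because the turbo decoder combines the received copies (chase combining or incremental redundancy), so the mean transmission count is lower than this plain-ARQ estimate.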

  2. Environmental performance of green building code and certification systems.

    Science.gov (United States)

    Suh, Sangwon; Tomar, Shivira; Leighton, Matthew; Kneifel, Joshua

    2014-01-01

    We examined the potential life-cycle environmental impact reduction of three green building code and certification (GBCC) systems: LEED, ASHRAE 189.1, and IgCC. A recently completed whole-building life cycle assessment (LCA) database from NIST was applied to a prototype building model specification by NREL. EPA's TRACI 2.0 was used for life cycle impact assessment (LCIA). The results showed that the baseline building model generates about 18 thousand metric tons CO2-equivalent of greenhouse gases (GHGs) and consumes 6 terajoules (TJ) of primary energy and 328 million liters of water over its life cycle. Overall, GBCC-compliant building models generated 0% to 25% lower environmental impacts than the baseline case (average 14% reduction). The largest reductions were associated with acidification (25%), human health-respiratory (24%), and global warming (22%), while no reductions were observed for ozone layer depletion or land use. The performance of the three GBCC-compliant building models, measured as life-cycle impact reduction, was comparable. A sensitivity analysis showed that the comparative results were reasonably robust, although some results were relatively sensitive to behavioral parameters, including employee transportation and purchased electricity during the occupancy phase (average sensitivity coefficients 0.26-0.29).

  3. Validation of SCALE code package on high performance neutron shields

    International Nuclear Information System (INIS)

    Bace, M.; Jecmenica, R.; Smuc, T.

    1999-01-01

    The shielding ability and other properties of the new high-performance neutron shielding materials of the KRAFTON series have recently been published. A comparison of the published experimental and MCNP results for two materials of the KRAFTON series with our own calculations has been performed. Two control modules of the SCALE-4.4 code system have been used, one based on one-dimensional radiation transport analysis (SAS1) and the other based on the three-dimensional Monte Carlo method (SAS3). The comparison of the calculated neutron dose equivalent rates shows good agreement between experimental and calculated results for the KRAFTON-N2 material. Our results indicate that the N2-M-N2 sandwich type is approximately 10% inferior as a neutron shield to the KRAFTON-N2 material. All values of neutron dose equivalent obtained by SAS1 are approximately 25% lower in comparison with the SAS3 results, which indicates the magnitude of the discrepancies introduced by the one-dimensional geometry approximation. (author)

  4. Code structure for U-Mo fuel performance analysis in high performance research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Gwan Yoon; Cho, Tae Won; Lee, Chul Min; Sohn, Dong Seong [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of); Lee, Kyu Hong; Park, Jong Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    A performance analysis model applicable to research reactor fuel is being developed using available models describing the fuel performance phenomena observed in in-pile tests. We established a calculation algorithm and scheme to best predict fuel performance using a radio-thermo-mechanically coupled system that considers fuel swelling, interaction layer growth, pore formation in the fuel meat, creep-induced fuel deformation, mass relocation, etc. In this paper, we present the general structure of the performance analysis code for typical research reactor fuel, together with advanced features such as a model to predict fuel failure induced by the combination of breakaway swelling and pore growth in the fuel meat. A thermo-mechanical code dedicated to the modeling of U-Mo dispersion fuel plates is under development in Korea to satisfy the demand for advanced performance analysis and safety assessment of the plates. The major physical phenomena during irradiation are considered in the code, such as interaction layer formation by fuel-matrix interdiffusion, fission-induced swelling of the fuel particles, mass relocation by fission-induced stress, and pore formation at the interface between the reaction product and the Al matrix.

  5. Performance Analysis of Wavelength Multiplexed SAC OCDMA Codes in Beat Noise Mitigation in SAC OCDMA Systems

    Science.gov (United States)

    Alhassan, A. M.; Badruddin, N.; Saad, N. M.; Aljunid, S. A.

    2013-07-01

    In this paper, we investigate the use of wavelength multiplexed spectral amplitude coding (WM SAC) codes for beat noise mitigation in coherent source SAC OCDMA systems. A WM SAC code is a low-weight SAC code whose whole code structure is repeated diagonally (once or more) in the wavelength domain to achieve the same cardinality as a higher-weight SAC code. Results show that for highly populated networks, the WM SAC codes provide better performance than SAC codes. However, for a small number of active users the situation is reversed. Apart from their promising improvement in performance, these codes are more flexible and impose less complexity on the system design than their SAC counterparts.
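
    The "diagonal repetition in the wavelength domain" described above amounts to building a block-diagonal code matrix from a low-weight base SAC code. The sketch below uses a made-up weight-2 base code purely as a placeholder (it is not one of the SAC code families from the paper); it shows how r diagonal repetitions multiply the cardinality by r while users in different wavelength blocks remain orthogonal:

    ```python
    import numpy as np

    # Placeholder low-weight SAC base code: 3 users x 6 wavelengths, weight 2.
    # (Illustrative only; not an actual SAC code family from the paper.)
    base = np.array([
        [1, 1, 0, 0, 0, 0],
        [0, 0, 1, 1, 0, 0],
        [0, 0, 0, 0, 1, 1],
    ])

    def wavelength_multiplex(code: np.ndarray, r: int) -> np.ndarray:
        """Repeat the base code r times along the diagonal of the wavelength axis."""
        k, l = code.shape
        out = np.zeros((r * k, r * l), dtype=int)
        for b in range(r):
            out[b * k:(b + 1) * k, b * l:(b + 1) * l] = code
        return out

    wm = wavelength_multiplex(base, r=2)    # 6 users x 12 wavelengths
    print(wm.shape, wm.sum(axis=1))         # cardinality doubles, weight is unchanged

    # Users from different diagonal blocks never share a wavelength:
    print(int(wm[0] @ wm[3]))  # -> 0
    ```

    Keeping the weight low while restoring cardinality through extra wavelengths is what lets the WM SAC construction trade bandwidth for reduced beat noise.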

  6. HOW TO IMPROVE EMPLOYEE MOTIVATION AND GROUP PERFORMANCE THROUGH LEADERSHIP – CONCEPTUAL MODEL

    Directory of Open Access Journals (Sweden)

    Tebeian Adriana

    2012-07-01

    We live in times of intense change, driven by fast technological development, in which information and knowledge are definite factors of power. Organizations today must not underestimate the importance of capable employees, the owners of these factors. The key to success, as many organizations have shown, is a mix of attributes, but one of great importance is the quality of employees and their performance. How to motivate and retain such employees is a real challenge in the current business world. Employee fluctuation is a big risk, because it brings many losses: loss of expertise, of know-how, of time; it breaks relationships and teams. One solution to this problem is creating a good and attractive environment, so that employees are motivated enough to bring out the best in themselves and raise their performance to a high level. The main objective of this study is to present a new approach to motivating employees and raising their performance. In this approach, the leader is the main pawn, the generator and sustainer of a healthy environment. By reviewing the literature and the most recent works in this domain, two styles of leadership come into focus: transformational and servant leadership. Both are relatively new concepts, with common but also different attributes, and both have the same final goal, performance, though through different approaches. Based on extensive literature research, the conceptual model was built, considering the main attributes of the two leadership styles, their impact on motivational factors, and their influence on job satisfaction and ultimately on team performance. The intermediate results of the research support the conceptual framework and main hypothesis of the study. One important contribution of this research is introducing elements of servant leadership together with transformational leadership into the empirical research, and also creating an overall conceptual framework which can be applied

  7. Postclosure performance assessment of the SCP [Site Characterization Plan] conceptual design for horizontal emplacement: Revision 1

    International Nuclear Information System (INIS)

    1987-08-01

    This report is a preliminary postclosure performance assessment of the repository design specified in the Site Characterization Plan Conceptual Design Report (SCP-CDR) for horizontal emplacement of high-level nuclear waste. At the time that these analyses were done, horizontal emplacement was the preferred orientation for the waste packages but vertical emplacement is now the reference design. This assessment consists of (1) a review of the regulatory requirements and strategy to demonstrate compliance with these requirements, (2) an analysis of the performance of the total repository system, (3) an analysis of the thermomechanical behavior of the repository, (4) an analysis of brine mobility in the repository, (5) an analysis of the waste package performance, (6) an analysis of the performance of seals, and (7) comments on the sensitivity of the various performance measures to uncertainties in the data and models. These are preliminary analyses and, in most cases, involve bounding calculations of the repository behavior. They have several purposes including (1) assessing how well this conceptual design "measures up" against requirements, (2) gaining experience in implementing the performance assessment strategy and tools and thereby learning where improvements are needed, (3) helping to identify needed data, and (4) helping to indicate required design modifications. 26 refs., 40 figs., 20 tabs

  8. Performance Analysis of a Decoding Algorithm for Algebraic Geometry Codes

    DEFF Research Database (Denmark)

    Jensen, Helge Elbrønd; Nielsen, Rasmus Refslund; Høholdt, Tom

    1998-01-01

    We analyse the known decoding algorithms for algebraic geometry codes in the case where the number of errors is greater than or equal to ⌊(d_FR − 1)/2⌋ + 1, where d_FR is the Feng-Rao distance.

  9. Transient and fuel performance analysis with VTT's coupled code system

    International Nuclear Information System (INIS)

    Daavittila, A.; Hamalainen, A.; Raty, H.

    2005-01-01

    VTT (Technical Research Centre of Finland) maintains and further develops a comprehensive safety analysis code system, ranging from basic neutronics libraries to 3-dimensional transient analysis and fuel behaviour analysis codes. The code system is based on various types of couplings between the relevant physical phenomena. The main tools for analyses of reactor transients are presently the 3-dimensional reactor dynamics code HEXTRAN for cores with a hexagonal fuel assembly geometry, and TRAB-3D for cores with a quadratic fuel assembly geometry. HEXTRAN has been applied to safety analyses of VVER-type reactors since the early 1990s. TRAB-3D is the latest addition to the code system and has been applied to BWR and PWR analyses in recent years. In this paper it is shown that TRAB-3D accurately calculated the power distribution during the Olkiluoto-1 load rejection test. The results from the 3-dimensional analysis can be used as boundary conditions for more detailed fuel rod analysis. For this purpose, the general flow model GENFLO, developed at VTT, has been coupled with the USNRC's FRAPTRAN fuel accident behaviour model. The example case for FRAPTRAN-GENFLO is an ATWS at a BWR plant. The basis for the analysis is an oscillation incident in the Olkiluoto-1 BWR during reactor startup on February 22, 1987. It is shown that the new coupled code FRAPTRAN/GENFLO is quite a promising tool that can handle such flow situations and give a detailed analysis of reactor transients

  10. Performance Tuning of x86 OpenMP Codes with MAQAO

    Science.gov (United States)

    Barthou, Denis; Charif Rubial, Andres; Jalby, William; Koliai, Souad; Valensi, Cédric

    Failing to find the best optimization sequence for a given application can lead to compiler-generated code with poor performance or inappropriate code. It is necessary to analyze performance from the generated assembly code in order to improve on the compilation process. This paper presents a tool for the performance analysis of multithreaded codes (OpenMP programs are supported at the moment). MAQAO relies on static performance evaluation to identify compiler optimizations and assess the performance of loops. It exploits static binary rewriting for reading and instrumenting object files or executables. Static binary instrumentation allows the insertion of probes at the instruction level. Memory accesses can be captured to help tune the code, but such traces need to be compressed. MAQAO can analyze the results and provide hints for tuning the code. We show with some examples how this can help users improve their OpenMP applications.

  11. Current Status of the LIFE Fast Reactors Fuel Performance Codes

    International Nuclear Information System (INIS)

    Yacout, A.M.; Billone, M.C.

    2013-01-01

    The LIFE-4 (Rev. 1) code was calibrated and validated using data from (U,Pu)O2 mixed-oxide fuel pins and UO2 blanket rods which were irradiation-tested under steady-state and transient conditions. It integrates a broad material and fuel-pin irradiation database into a consistent framework for use and extrapolation of the database to reactor design applications. The code is available and runs on different computer platforms (UNIX and PC), and detailed documentation of the code's models, routines, and calibration and validation data sets is available. The LIFE-METAL code is based on LIFE-4, with modifications to include the key phenomena applicable to metallic fuel and metallic fuel properties. It was calibrated with a large database from irradiations in EBR-II; further effort is needed on calibration and detailed documentation. Recent activities with the codes are related to reactor design studies and support of licensing efforts for the 4S and KAERI SFR designs. Future activities are related to re-assessment of the codes' calibration and validation and to the inclusion of models for advanced fuels (transmutation fuels)

  12. Heterogeneous fuels for minor actinides transmutation: Fuel performance codes predictions in the EFIT case study

    Energy Technology Data Exchange (ETDEWEB)

    Calabrese, R., E-mail: rolando.calabrese@enea.i [ENEA, Innovative Nuclear Reactors and Fuel Cycle Closure Division, via Martiri di Monte Sole 4, 40129 Bologna (Italy); Vettraino, F.; Artioli, C. [ENEA, Innovative Nuclear Reactors and Fuel Cycle Closure Division, via Martiri di Monte Sole 4, 40129 Bologna (Italy); Sobolev, V. [SCK.CEN, Belgian Nuclear Research Centre, Boeretang 200, B-2400 Mol (Belgium); Thetford, R. [Serco Technical and Assurance Services, 150 Harwell Business Centre, Didcot OX11 0QB (United Kingdom)

    2010-06-15

    Plutonium recycling in new-generation fast reactors, coupled with minor actinides (MA) transmutation in dedicated nuclear systems, could achieve a decrease of nuclear waste long-term radiotoxicity by two orders of magnitude in comparison with the current once-through strategy. In a double-strata scenario, purpose-built accelerator-driven systems (ADS) could transmute minor actinides. The innovative nuclear fuel conceived for such systems demands significant R and D efforts in order to meet the safety and technical performance of current fuel systems. The Integrated Project EUROTRANS (EUROpean research programme for the TRANSmutation of high level nuclear waste in ADS), part of the EURATOM Framework Programme 6 (FP6), undertook some of this research. EUROTRANS developed from the FP5 research programmes on ADS (PDS-XADS) and on fuels dedicated to MA transmutation (FUTURE, CONFIRM). One of its main objectives is the conceptual design of a small sub-critical nuclear system loaded with uranium-free fuel to provide high MA transmutation efficiency. These principles guided the design of EFIT (European Facility for Industrial Transmutation) in the domain DESIGN of IP EUROTRANS. The domain AFTRA (Advanced Fuels for TRAnsmutation system) identified two composite fuel systems: a ceramic-ceramic (CERCER), where fuel particles are dispersed in a magnesia matrix, and a ceramic-metallic (CERMET), with a molybdenum matrix in place of the MgO matrix to host a ceramic fissile phase. The EFIT fuel is composed of plutonium and MA oxides in solid solution, with isotopic vectors typical of LWR spent fuel with 45 MWd/kgHM discharge burnup and 30 years of interim storage before reprocessing. This paper is focused on the thermomechanical state of the hottest fuel pins of two EFIT cores of 400 MW(th) loaded with either CERCER or CERMET fuels. For the calculations, three fuel performance codes were used: FEMALE, TRAFIC and TRANSURANUS. The analysis was performed at the beginning of fuel life

  13. Implementation of computer codes for performance assessment of the Republic repository of LLW/ILW Mochovce

    International Nuclear Information System (INIS)

    Hanusik, V.; Kopcani, I.; Gedeon, M.

    2000-01-01

    This paper describes the selection and adaptation of computer codes required to assess the effects of radionuclide release from the Mochovce Radioactive Waste Disposal Facility. The paper also demonstrates how these codes can be integrated into a performance assessment methodology. The considered codes include DUST-MS for source term release, MODFLOW for ground-water flow, and BS for transport through the biosphere and dose assessment. (author)

  14. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in current thermal-hydrologic-chemical (THC) codes, although no single code is able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow, transport, and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  15. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in current thermal-hydrologic-chemical (THC) codes, although no single code is able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow, transport, and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  16. Performance Analysis of Faulty Gallager-B Decoding of QC-LDPC Codes with Applications

    Directory of Open Access Journals (Sweden)

    O. Al Rasheed

    2014-06-01

    Full Text Available In this paper we evaluate the performance of the Gallager-B algorithm, used for decoding low-density parity-check (LDPC) codes, under unreliable message computation. Our analysis is restricted to LDPC codes constructed from circulant matrices (QC-LDPC codes). Using Monte Carlo simulation, we investigate the effects of different code parameters on coding-system performance over a binary symmetric channel with an independent transient-fault model. One possible application of the presented analysis, the design of memory architectures with unreliable components, is considered.
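The record above gives no code, so the following is only a sketch of the Monte Carlo methodology it describes: estimating bit error rate over a BSC when the decoder itself computes unreliably. A toy (3,1) repetition code with a majority-vote decoder stands in for the QC-LDPC code and Gallager-B algorithm, and the `fault_p` parameter (an assumption, not from the paper) flips the majority gate's output to model transient faults.

```python
import numpy as np

def mc_ber(eps, fault_p, n_words=20000, seed=1):
    """Monte Carlo bit-error-rate estimate for a (3,1) repetition code
    over a BSC(eps); the majority-gate output is flipped with
    probability fault_p to mimic transient computation faults."""
    rng = np.random.default_rng(seed)
    tx = rng.integers(0, 2, n_words)               # information bits
    code = np.repeat(tx[:, None], 3, axis=1)       # repetition encoding
    rx = code ^ (rng.random((n_words, 3)) < eps)   # BSC bit flips
    decided = (rx.sum(axis=1) >= 2).astype(np.int64)  # majority vote
    decided ^= rng.random(n_words) < fault_p       # faulty gate output
    return float(np.mean(decided != tx))

ber_reliable = mc_ber(eps=0.1, fault_p=0.0)   # analytically ~0.028
ber_faulty = mc_ber(eps=0.1, fault_p=0.05)
```

With a reliable decoder the estimate approaches the analytic value 3·eps²·(1−eps) + eps³; injecting gate faults raises the error floor, which is the qualitative effect the paper quantifies for Gallager-B.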

  17. Conceptual study on high performance blanket in a spherical tokamak fusion-driven transmuter

    International Nuclear Information System (INIS)

    Chen Yixue; Wu Yican

    2000-01-01

    A preliminary conceptual design of a high-performance dual-cooled blanket for a fusion-driven transmuter is presented, based on neutronic calculations. The dual-cooled system has some attractive advantages when used for transmutation of HLW (High Level Waste). The calculation results show that this kind of blanket could safely transmute about 6 tons of minor actinides (approximately the amount produced by 170 GW(e)·year of PWR operation) and 0.4 tons of fission products per year, while delivering 12 GW of thermal power. In addition, the power and criticality factor of this blanket vary relatively little during its 1-year operation period. The blanket is also tritium self-sustainable

  18. Performance Theories for Sentence Coding: Some Quantitative Models

    Science.gov (United States)

    Aaronson, Doris; And Others

    1977-01-01

    This study deals with the patterns of word-by-word reading times over a sentence when the subject must code the linguistic information sufficiently for immediate verbatim recall. A class of quantitative models is considered that would account for reading times at phrase breaks. (Author/RM)

  19. Performance Analysis of DPSK Signals with Selection Combining and Convolutional Coding in Fading Channel

    National Research Council Canada - National Science Library

    Ong, Choon

    1998-01-01

    The performance analysis of a differential phase shift keyed (DPSK) communications system, operating in a Rayleigh fading environment, employing convolutional coding and diversity processing is presented...

  20. Performance of automated and manual coding systems for occupational data: a case study of historical records.

    Science.gov (United States)

    Patel, Mehul D; Rose, Kathryn M; Owens, Cindy R; Bang, Heejung; Kaufman, Jay S

    2012-03-01

    Occupational data are a common source of workplace exposure and socioeconomic information in epidemiologic research. We compared the performance of two occupation coding methods, automated software and a manual coder, using occupation and industry titles from U.S. historical records. We collected parental occupational data from 1920-40s birth certificates, Census records, and city directories on 3,135 deceased individuals in the Atherosclerosis Risk in Communities (ARIC) study. Unique occupation-industry narratives were assigned codes by a manual coder and by the Standardized Occupation and Industry Coding software program. We calculated agreement between the coding methods in classification into major Census occupational groups. The automated coding software assigned codes to 71% of occupations and 76% of industries. Of this subset coded by software, 73% of occupation codes and 69% of industry codes matched between automated and manual coding. For major occupational groups, agreement improved to 89% (kappa = 0.86). Automated occupational coding is a cost-efficient alternative to manual coding; however, some manual coding is still required for incomplete information. We found substantial variability between coders in the assignment of specific occupations, although less so for major groups.
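The "89% agreement (kappa = 0.86)" reported above is a chance-corrected agreement statistic. As a minimal sketch of how such a value is computed, Cohen's kappa from a coder-by-coder confusion matrix (the matrix below is illustrative, not the ARIC study's data):

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix whose rows are one
    coder's category assignments and whose columns are the other's."""
    c = np.asarray(confusion, dtype=float)
    n = c.sum()
    po = np.trace(c) / n                       # observed agreement
    pe = (c.sum(axis=0) * c.sum(axis=1)).sum() / n**2  # chance agreement
    return (po - pe) / (1 - pe)

# Illustrative 2-category example: 90% raw agreement yields kappa = 0.8
kappa = cohens_kappa([[45, 5], [5, 45]])
```

The gap between raw agreement and kappa grows as the marginal category distributions become more lopsided, which is why the study reports both figures.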

  1. Solar thermal–photovoltaic powered potato cold storage – Conceptual design and performance analyses

    International Nuclear Information System (INIS)

    Basu, Dipankar N.; Ganguly, A.

    2016-01-01

    Highlights: • Loss of food crops is a huge problem in India due to the shortage of cold storage. • Conceptual design of a power system using solar energy for a potato cold storage. • Integration of a flat plate collector and SPV module with a suitable operating strategy. • System provides a net energy surplus of about 36 MW h over a calendar year. • A rudimentary economic analysis found a payback period of less than four years. - Abstract: Wastage of food crops due to the dearth of proper cold storage facilities is a huge problem in the underdeveloped and developing countries of the world. A conceptual design of a potato cold storage is presented here, along with a performance appraisal over a calendar year. The microclimate inside the cold storage is regulated using a water–lithium bromide absorption system. The proposed system utilizes both solar thermal energy and photovoltaic-generated electrical energy for its operation. A suitable operating strategy is devised, and the performance of the integrated system is analyzed from an energy and exergy point of view to identify the required numbers of thermal collectors and photovoltaic modules. The proposed system is found to provide a net surplus of about 36 MW h of energy over a calendar year, after meeting the in-house requirements. A rudimentary economic analysis is also performed to check the financial viability of the proposed system. Both the thermal and photovoltaic components are found to have payback periods of less than four years.

  2. Effects of donor proliferation in development aid for health on health program performance: A conceptual framework.

    Science.gov (United States)

    Pallas, Sarah Wood; Ruger, Jennifer Prah

    2017-02-01

    Development aid for health increased dramatically during the past two decades, raising concerns about inefficiency and lack of coherence among the growing number of global health donors. However, we lack a framework for how donor proliferation affects health program performance to inform theory-based evaluation of aid effectiveness policies. A review of academic and gray literature was conducted. Data were extracted from the literature sample on study design and evidence for hypothesized effects of donor proliferation on health program performance, which were iteratively grouped into categories and mapped into a new conceptual framework. In the framework, increases in the number of donors are hypothesized to increase inter-donor competition, transaction costs, donor poaching of recipient staff, recipient control over aid, and donor fragmentation, and to decrease donors' sense of accountability for overall development outcomes. There is mixed evidence on whether donor proliferation increases or decreases aid volume. These primary effects in turn affect donor innovation, information hoarding, and aid disbursement volatility, as well as recipient country health budget levels, human resource capacity, and corruption, and the determinants of health program performance. The net effect of donor proliferation on health will vary depending on the magnitude of the framework's competing effects in specific country settings. The conceptual framework provides a foundation for improving design of aid effectiveness practices to mitigate negative effects from donor proliferation while preserving its potential benefits. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Performance enhancement of successive interference cancellation scheme based on spectral amplitude coding for optical code-division multiple-access systems using Hadamard codes

    Science.gov (United States)

    Eltaif, Tawfig; Shalaby, Hossam M. H.; Shaari, Sahbudin; Hamarsheh, Mohammad M. N.

    2009-04-01

    A successive interference cancellation scheme is applied to optical code-division multiple-access (OCDMA) systems with spectral amplitude coding (SAC). A detailed analysis of this system, with Hadamard codes used as signature sequences, is presented. The system can easily remove the effect of the strongest signal at each stage of the cancellation process. In addition, a simulation of the proposed system is performed in order to validate the theoretical results. The system shows a small bit error rate at a large number of active users compared to the SAC OCDMA system. Our results reveal that the proposed system is efficient in eliminating the effect of multiple-user interference and in enhancing the overall performance.

  4. Conceptual Design of Object Oriented Program (OOP) for Pilot Code of Two-Fluid, Three-field Model with C++ 6.0

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Lee, Young Jin

    2006-01-01

    Engineering software for design purposes in the nuclear industry has been developed since the early 1970s and was well established by the 1980s. FORTRAN remained the most popular language for such software until more sophisticated GUIs and coupling between programs were needed. Advanced languages such as C and C++ were developed to ease GUI programming and the reuse of well-developed routines by adopting object-oriented programming. A recent trend in programming is toward object orientation, since the results are often more intuitive and easier to maintain than procedural programs. The main motivation of this work is to capture object-oriented concepts for conventional safety analysis programs, which consist of many functions and procedure-oriented structures. In this work, object-oriented programming in C++ 6.0 has been tried for the PILOT code, originally written in FORTRAN, and a conceptual OOP design of the system safety analysis code has been done

  5. Sensitivity of hydrological performance assessment analysis to variations in material properties, conceptual models, and ventilation models

    Energy Technology Data Exchange (ETDEWEB)

    Sobolik, S.R.; Ho, C.K.; Dunn, E. [Sandia National Labs., Albuquerque, NM (United States); Robey, T.H. [Spectra Research Inst., Albuquerque, NM (United States); Cruz, W.T. [Univ. del Turabo, Gurabo (Puerto Rico)

    1996-07-01

    The Yucca Mountain Site Characterization Project is studying Yucca Mountain in southwestern Nevada as a potential site for a high-level nuclear waste repository. Site characterization includes surface-based and underground testing. Analyses have been performed to support the design of an Exploratory Studies Facility (ESF) and of the tests performed as part of the characterization process, in order to ascertain that they have minimal impact on the natural ability of the site to isolate waste. The information in this report pertains to sensitivity studies evaluating previous hydrological performance assessment analyses with respect to variations in material properties, conceptual models, and ventilation models, and the implications of this sensitivity for previous recommendations supporting ESF design. This document contains information that has been used in preparing recommendations for Appendix I of the Exploratory Studies Facility Design Requirements document.

  6. Sensitivity of hydrological performance assessment analysis to variations in material properties, conceptual models, and ventilation models

    International Nuclear Information System (INIS)

    Sobolik, S.R.; Ho, C.K.; Dunn, E.; Robey, T.H.; Cruz, W.T.

    1996-07-01

    The Yucca Mountain Site Characterization Project is studying Yucca Mountain in southwestern Nevada as a potential site for a high-level nuclear waste repository. Site characterization includes surface-based and underground testing. Analyses have been performed to support the design of an Exploratory Studies Facility (ESF) and of the tests performed as part of the characterization process, in order to ascertain that they have minimal impact on the natural ability of the site to isolate waste. The information in this report pertains to sensitivity studies evaluating previous hydrological performance assessment analyses with respect to variations in material properties, conceptual models, and ventilation models, and the implications of this sensitivity for previous recommendations supporting ESF design. This document contains information that has been used in preparing recommendations for Appendix I of the Exploratory Studies Facility Design Requirements document

  7. Pre-conceptual-schema-based patterns for deriving key performance indicators from strategic objectives

    Directory of Open Access Journals (Sweden)

    Carlos Mario Zapata Jaramillo

    2017-05-01

    Full Text Available Performance measurement is crucial for achieving business success. Moreover, such success is also related to the fulfillment of the organizational strategic objectives. Hence, an adequate determination of relevant performance indicators—or key performance indicators (KPIs—and their relationships to organizational objectives is needed. Even though several approaches for treating KPIs and objective-KPI relationships have been proposed, they exhibit some drawbacks associated with the lack of reusability and traceability. We attempt to fill this gap by proposing a set of patterns based on pre-conceptual schemas for supporting the systematic derivation of KPIs and their relationships to organizational objectives. In this way, the proposed patterns guarantee a reusable and traceable derivation process of a set of candidate KPIs from organizational strategic objectives. Lastly, we provide a lab study in order to illustrate the usefulness of this proposal.

  8. A conceptual model of nurses' goal orientation, service behavior, and service performance.

    Science.gov (United States)

    Chien, Chun-Cheng; Chou, Hsin-Kai; Hung, Shuo-Tsung

    2008-01-01

    Based on the conceptual framework known as the "service triangle," the authors constructed a model of nurses' goal orientation, service behavior, and service performance to investigate the antecedents and consequences of the medical service behavior provided by nurses. This cross-sectional study collected data from 127 nurses in six hospitals using a mail-in questionnaire. Analysis of the model revealed that the customer-oriented behavior of nurses had a positive influence on organizational citizenship behavior; and both of these behaviors had a significant positive influence on service performance. The results also indicate that a higher learning goal orientation among nurses was associated with the performance of both observable customer-oriented behavior and organizational-citizenship behavior.

  9. PERCON: A flexible computer code for detailed thermal performance studies

    International Nuclear Information System (INIS)

    Boardman, F.B.; Collier, W.D.

    1975-07-01

    PERCON is a computer code which evaluates temperatures in three dimensions for a block containing heat sources and having coolant flow in one dimension. The solution is obtained at successive planes perpendicular to the coolant flow; the progression from one plane to the next uses the heat transferred to the coolant to set the convective boundary conditions at the next plane, after due allowance is made for any lateral mixing or mass transfer between coolants. It is also possible to calculate the diametral change along a radius as a function of irradiation shrinkage and thermal expansion. This is used in a 'through life' calculation which evaluates interaction pressure in tubular fuel elements. Physical property data used by the code may be specified as functions of temperature. The coolant flow may be specified, or alternatively derived by the program to satisfy either a specified overall pressure drop or a specified mixed mean temperature rise. The pressure drop through each coolant is calculated and the flow modified, followed by a repeat of the temperature calculation, until the pressure imbalance between chosen flow channels at chosen axial positions is less than the specified convergence limit. A detailed description of the facilities in the code is given and some cases which have been studied are discussed. (U.K.)
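The plane-by-plane marching and flow iteration described in the abstract can be sketched as a coolant energy balance plus a bisection on mass flow. This is a minimal illustration, not PERCON itself: it iterates on the mixed-mean temperature-rise option mentioned in the abstract rather than on pressure drop, and the property values and axial power profile are assumed.

```python
import numpy as np

def outlet_rise(mdot, q_lin, dz, cp):
    """March the coolant temperature plane by plane: each axial step
    adds the local heat input to the coolant energy balance."""
    t_rise = 0.0
    for q in q_lin:                      # W/m linear heating at each plane
        t_rise += q * dz / (mdot * cp)
    return t_rise

def flow_for_rise(target_rise, q_lin, dz, cp, lo=1e-3, hi=100.0, tol=1e-6):
    """Bisect on mass flow (kg/s) until the computed mixed-mean
    temperature rise matches the specified value."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if outlet_rise(mid, q_lin, dz, cp) > target_rise:
            lo = mid                     # coolant too hot -> need more flow
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Assumed chopped-cosine axial power profile over a 3 m channel
z = (np.arange(30) + 0.5) * 0.1                      # plane midpoints (m)
q_lin = 2.0e4 * np.sin(np.pi * (z + 0.15) / 3.3)     # W/m, illustrative
mdot = flow_for_rise(30.0, q_lin, dz=0.1, cp=4180.0)  # target rise 30 K
```

PERCON instead balances channel pressure drops, but the structure is the same: evaluate the marching solution, adjust the flow, and repeat until the imbalance falls below a convergence limit.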

  10. CONCEPTUAL APPROACH OF LEADERSHIP AND EMPLOYEES’ INDIVIDUAL PERFORMANCE RELATIONSHIP

    Directory of Open Access Journals (Sweden)

    Ispas Andreia

    2012-07-01

    Full Text Available There is a need for more leadership research in the hotel industry. One reason is the need to better understand the effects and influences that the manager-employee relationship can have. In turn, this can improve employees' performance at work and leave clients more satisfied. Because this field of research is still emerging, the leadership and employee performance relationship has considerable potential for investigation in the hotel industry context. Many leadership styles are recommended for managers' daily hotel activities, transformational leadership being the most frequently suggested. The hotel industry is part of the hospitality industry and provides different services such as accommodation, food and beverage, and entertainment. This makes the hotel industry very specific, and its products and services unique for each client. That is why high standards of organizational performance, a good manager-employee relationship, and high employee performance as perceived by clients are keys to a long and sustainable business. Leadership is important in getting people to do the right things in their jobs. Moreover, it helps employees to work better, to be motivated, to strive for excellence, and to perform beyond expectations. The first objective of this paper is to analyze the main approaches to leadership styles applied in hotel industry studies. The second objective is to explore, from a theoretical point of view, the direct and indirect relationships between leadership style and employee performance based on two mediating factors: job satisfaction and organizational commitment. These two mediators are the most used in organizational research and have a high degree of validity. The methodology applied is based on a documentary review and critical analysis of the concepts used in the literature. The results of the conceptual paper are: the construction of the conceptual model of scientific research

  11. Performance analysis of wavelength/spatial coding system with fixed in-phase code matrices in OCDMA network

    Science.gov (United States)

    Tsai, Cheng-Mu; Liang, Tsair-Chun

    2011-12-01

    This paper proposes a wavelength/spatial (W/S) coding system with fixed in-phase code (FIPC) matrices in the optical code-division multiple-access (OCDMA) network. A scheme is presented to form the FIPC matrix, which is applied to construct the W/S OCDMA network. The encoder/decoder in the W/S OCDMA network is fully able to eliminate the multiple-access interference (MAI) at the balanced photo-detectors (PDs), owing to the fixed in-phase cross-correlation. The phase-induced intensity noise (PIIN), which scales with the square of the received power, is markedly suppressed in the receiver by spreading the received power over the PDs while the net signal power is kept the same. Simulation results show that the W/S OCDMA network based on the FIPC matrices can not only completely remove the MAI but also effectively suppress the PIIN, upgrading the network performance.

  12. Embedded Sensors and Controls to Improve Component Performance and Reliability Conceptual Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Kisner, R.; Melin, A.; Burress, T.; Fugate, D.; Holcomb, D.; Wilgen, J.; Miller, J.; Wilson, D.; Silva, P.; Whitlow, L.; Peretz, F.

    2012-09-15

    The objective of this project is to demonstrate improved reliability and increased performance made possible by deeply embedding instrumentation and controls (I&C) in nuclear power plant (NPP) components and systems. The project is employing a highly instrumented canned rotor, magnetic bearing, fluoride salt pump as its I&C technology demonstration platform. I&C is an intimate part of the basic millisecond-by-millisecond functioning of the system; treating I&C as an integral part of the system design is innovative and will allow significant improvement in capabilities and performance. As systems become more complex and greater performance is required, traditional I&C design techniques become inadequate and more advanced I&C needs to be applied. New I&C techniques enable optimal and reliable performance and tolerance of noise and uncertainties in the system rather than merely monitoring quasi-stable performance. Traditionally, I&C has been incorporated in NPP components after the design is nearly complete; adequate performance was obtained through over-design. By incorporating I&C at the beginning of the design phase, the control system can provide superior performance and reliability and enable designs that are otherwise impossible. This report describes the progress and status of the project and provides a conceptual design overview for the platform to demonstrate the performance and reliability improvements enabled by advanced embedded I&C.

  13. Conceptual framework in creating and selecting the performance measurement system for marketing strategy control

    Directory of Open Access Journals (Sweden)

    Krstić Bojan

    2011-01-01

    Full Text Available Performance measurement in modern conditions is one of the most important business requirements, since enterprises must demonstrate returns for stockholders and investors, as well as the contribution of management to those returns and the contribution of individual business units, functional departments, and the activities within them. Hence, it is particularly important to assess the success of marketing as a business function according to the return on investment in marketing activities, but also according to a set of indicators from the following performance groups - marketing effectiveness and marketing efficiency. The core issue is which measures an enterprise should select and use in its marketing performance measurement system so that it can assess how effective and efficient its marketing is. In other words, an adequate performance measurement system ought to contain measures used to monitor effects and the marketing strategy implementation process (controlling while implementing), and measures that can be applied to monitoring overall effects after the strategy implementation period. Creating such a marketing performance measurement system is, however, a complex task for marketing managers. Therefore, the aim of this paper is to explore key principles and develop a conceptual framework for creating and selecting a performance measurement system for marketing strategy control, based on the characteristics and key success factors of the marketing strategy, that is, the activities and actions for operationalizing and effectively implementing it.

  14. An Investigation of Seventh Grade Students’ Performances on Conceptual, Procedural and Graphical Problems Regarding Circles

    Directory of Open Access Journals (Sweden)

    Lütfi İncikabı

    2015-04-01

    Full Text Available The purpose of this study is to determine seventh grade students’ preferences among procedural, conceptual, and graphical questions on the subject of circles, to define their success levels in their preferred types, and to compare students’ success in one question type with their performance in the other types. The methodology adopted was a case study. Based on a criterion-based purposive sampling strategy, 98 middle school students were selected as participants. Data were collected through an achievement test consisting of nine questions (three per question type). The results indicated that students mostly preferred graphical question types. However, the majority of students did not achieve high success levels in their preferred question types. In addition, the students performed better on graphical question types, while their failure on procedural question types was remarkable. Keywords: Multiple representations, middle school students, mathematics education, circles

  15. The Barrier code for predicting long-term concrete performance

    International Nuclear Information System (INIS)

    Shuman, R.; Rogers, V.C.; Shaw, R.A.

    1989-01-01

    There are numerous features incorporated into an LLW disposal facility that deal directly with critical safety objectives required by the NRC in 10 CFR 61. Engineered barriers or structures incorporating concrete are commonly being considered for waste disposal facilities. The Barrier computer code calculates the long-term degradation of concrete structures in LLW disposal facilities. It couples this degradation with water infiltration into the facility, nuclide leaching from the waste, contaminated water release from the facility, and associated doses to members of the critical population group. The concrete degradation methodology of Barrier is described

  16. A comparison of thermal algorithms of fuel rod performance code systems

    International Nuclear Information System (INIS)

    Park, C. J.; Park, J. H.; Kang, K. H.; Ryu, H. J.; Moon, J. S.; Jeong, I. H.; Lee, C. Y.; Song, K. C.

    2003-11-01

    The goal of fuel rod performance analysis is to assess the robustness of a fuel rod and its cladding material. Computer simulation of fuel rod performance has become an important part of designing and evaluating new nuclear fuels and claddings. As preliminary work toward constructing a computing code system for fuel rod performance, the algorithms of several existing fuel rod performance code systems are compared and summarized. Among these code systems, FRAPCON and FEMAXI for LWRs, ELESTRES for CANDU reactors, and LIFE for fast reactors are reviewed. The thermal algorithms of the above codes, including their methodologies and subroutines, are investigated. This work will be utilized to construct a computing code system for dry process fuel rod performance
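A basic building block shared by the thermal algorithms of such codes is the steady-state radial conduction solve in the fuel pellet. The sketch below (illustrative constant properties and dimensions, not taken from any of the codes named above) compares a simple finite-volume solution with the analytic parabolic profile T(r) = Ts + q(R² − r²)/(4k) for uniform volumetric heating.

```python
import numpy as np

# Illustrative pellet data (constant conductivity for simplicity)
R, k, q = 0.005, 3.0, 3.0e8       # m, W/(m.K), W/m^3 volumetric heating
Ts, N = 700.0, 50                 # surface temperature (K), radial cells
dr = R / N
r_c = (np.arange(N) + 0.5) * dr   # cell-centre radii (avoids r = 0)

# Finite-volume discretisation of (1/r) d/dr(r k dT/dr) + q = 0
A = np.zeros((N, N))
b = -q * r_c * dr                 # integrated source term per cell
for i in range(N):
    rw, re = i * dr, (i + 1) * dr           # west/east face radii
    if i > 0:                               # rw = 0 at the centre: no flux
        A[i, i - 1] += k * rw / dr
        A[i, i] -= k * rw / dr
    if i < N - 1:
        A[i, i + 1] += k * re / dr
        A[i, i] -= k * re / dr
    else:                                   # surface face held at Ts
        A[i, i] -= k * re / (dr / 2)
        b[i] -= k * re / (dr / 2) * Ts
T = np.linalg.solve(A, b)

# Analytic solution for constant q and k
T_exact = Ts + q * (R**2 - r_c**2) / (4 * k)
```

Production codes layer temperature-dependent conductivity, gap conductance, and cracking models on top of this kernel, but the radial conduction solve remains the core of the thermal module.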

  18. Construction and performance analysis of variable-weight optical orthogonal codes for asynchronous OCDMA systems

    Science.gov (United States)

    Li, Chuan-qi; Yang, Meng-jie; Zhang, Xiu-rong; Chen, Mei-juan; He, Dong-dong; Fan, Qing-bin

    2014-07-01

A construction scheme of variable-weight optical orthogonal codes (VW-OOCs) for asynchronous optical code division multiple access (OCDMA) systems is proposed. The code family can be obtained by programming in Matlab with the given code weight and corresponding capacity. The formula for bit error rate (BER) is derived by taking into account the effects of shot noise, thermal noise, and avalanche photodiode (APD) bulk and surface leakage currents. The OCDMA system with the VW-OOCs is designed and improved. The study shows that the VW-OOCs have excellent BER performance. Whether or not they come from the same code family, codes with larger weight have lower BER than the other codes under the same conditions. Simulation results are consistent with the theoretical BER analysis, and ideal eye diagrams are obtained with the optical hard limiter.

  19. New Technique for Improving Performance of LDPC Codes in the Presence of Trapping Sets

    Directory of Open Access Journals (Sweden)

    Mohamed Adnan Landolsi

    2008-06-01

Trapping sets are considered the primary factor for degrading the performance of low-density parity-check (LDPC) codes in the error-floor region. The effect of trapping sets on the performance of an LDPC code becomes worse as the code size decreases. One approach to tackle this problem is to minimize trapping sets during LDPC code design. However, while trapping sets can be reduced, their complete elimination is infeasible due to the presence of cycles in the underlying LDPC code bipartite graph. In this work, we introduce a new technique based on trapping sets neutralization to minimize the negative effect of trapping sets under belief propagation (BP) decoding. Simulation results for random, progressive edge growth (PEG) and MacKay LDPC codes demonstrate the effectiveness of the proposed technique. The hardware cost of the proposed technique is also shown to be minimal.

  20. High-Performance Java Codes for Computational Fluid Dynamics

    Science.gov (United States)

    Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2001-01-01

The computational science community is reluctant to write large-scale computationally intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.

  1. Approaching a Conceptual Framework for Research on Sustainability Performance in Corporate Value Chains

    DEFF Research Database (Denmark)

    Kjærgaard, Thomas

    variations of stakeholder engagement and adopt a value chain narrative in their sustainability reporting. Multi-stakeholder reporting standards like the Global Reporting Initiative (GRI) and the UN Global Compact (UNGC) are adopted by corporations across industries, but only target sustainability issues...... in supply- and value chains to a limited extent. Though, this article proposes that the ongoing work towards new standards for integrated sustainability reporting represents a unique opportunity for increasing the presence of supply- and value chain perspectives in reporting in a way that facilitates a more...... optimal use of sustainability reports as a primary data source in research. Hence, this article proposes a conceptual framework for research on sustainability performance in corporate value chains, which potentially increases the future contributions to both the literature let alone practice. Different...

  2. Process-based interpretation of conceptual hydrological model performance using a multinational catchment set

    Science.gov (United States)

    Poncelet, Carine; Merz, Ralf; Merz, Bruno; Parajka, Juraj; Oudin, Ludovic; Andréassian, Vazken; Perrin, Charles

    2017-08-01

Most previous assessments of hydrologic model performance are fragmented: they are based on small numbers of catchments, different methods or time periods, and do not link the results to landscape or climate characteristics. This study uses large-sample hydrology to identify major catchment controls on daily runoff simulations. It is based on a conceptual lumped hydrological model (GR6J), a collection of 29 catchment characteristics, a multinational set of 1103 catchments located in Austria, France, and Germany, and four runoff model efficiency criteria. Two analyses are conducted to assess how features and criteria are linked: (i) a one-dimensional analysis based on the Kruskal-Wallis test and (ii) a multidimensional analysis based on regression trees that investigates the interplay between features. The catchment features most affecting model performance are the flashiness of precipitation and streamflow (computed as the ratio of absolute day-to-day fluctuations to the total amount in a year), the seasonality of evaporation, the catchment area, and the catchment aridity. Nonflashy, nonseasonal, large, and nonarid catchments show the best performance for all the tested criteria. We argue that this higher performance is due to fewer nonlinear responses (higher correlation between precipitation and streamflow) and lower input and output variability for such catchments. Finally, we show that, compared to national sets, multinational sets increase the transferability of results because they explore a wider range of hydroclimatic conditions.
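The one-dimensional analysis described above can be sketched as follows; the efficiency values and the grouping are invented for illustration (and no tie correction is applied), not data from the study:

```python
# Hypothetical sketch of the one-dimensional analysis: group catchments by a
# feature class (e.g. flashy vs. non-flashy precipitation) and test whether
# model efficiency differs between the groups with the Kruskal-Wallis H test.
# Efficiency values are invented; the H statistic here assumes no tied values.

def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic for several samples (assumes distinct values)."""
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}   # ranks 1..N over the pooled sample
    n = len(pooled)
    rank_term = sum(sum(rank[v] for v in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * rank_term - 3.0 * (n + 1)

nse_flashy = [0.55, 0.48, 0.60, 0.52, 0.45]      # efficiencies, flashy catchments (invented)
nse_nonflashy = [0.78, 0.82, 0.75, 0.80, 0.84]   # efficiencies, non-flashy catchments (invented)

h_stat = kruskal_wallis_h(nse_flashy, nse_nonflashy)
feature_matters = h_stat > 3.84   # chi-square critical value, df = 1, 5% level
```

With two well-separated groups the test flags the feature as a performance control, which is the logic the study applies to each of its 29 catchment characteristics.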

  3. Embedded Sensors and Controls to Improve Component Performance and Reliability: Conceptual Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Kisner, Roger A [ORNL; Melin, Alexander M [ORNL; Burress, Timothy A [ORNL; Fugate, David L [ORNL; Holcomb, David Eugene [ORNL; Wilgen, John B [ORNL; Miller, John M [ORNL; Wilson, Dane F [ORNL; Silva, Pamela C [ORNL; Whitlow, Lynsie J [ORNL; Peretz, Fred J [ORNL

    2012-10-01

The overall project objective is to demonstrate improved reliability and increased performance made possible by deeply embedding instrumentation and controls (I&C) in nuclear power plant components. The project is employing a highly instrumented canned rotor, magnetic bearing, fluoride salt pump as its I&C technology demonstration vehicle. The project's focus is not primarily on pump design, but instead is on methods to deeply embed I&C within a pump system. However, because the I&C is intimately part of the basic millisecond-by-millisecond functioning of the pump, the I&C design cannot proceed in isolation from the other aspects of the pump. The pump will not function if the characteristics of the I&C are not embedded within the design because the I&C enables performance of the basic function rather than merely monitoring quasi-stable performance. Traditionally, I&C has been incorporated in nuclear power plant (NPP) components after their design is nearly complete; adequate performance was obtained through over-design. This report describes the progress and status of the project and provides a conceptual design overview for the embedded I&C pump.

  4. Performance analysis of multiple interference suppression over asynchronous/synchronous optical code-division multiple-access system based on complementary/prime/shifted coding scheme

    Science.gov (United States)

    Nieh, Ta-Chun; Yang, Chao-Chin; Huang, Jen-Fa

    2011-08-01

A complete complementary/prime/shifted prime (CPS) code family for optical code-division multiple-access (OCDMA) systems is proposed. Based on the properties of complete complementary (CC) codes, multiple-access interference (MAI) can be suppressed and eliminated via a spectral amplitude coding (SAC) OCDMA system under asynchronous/synchronous transmission. By utilizing the shifted prime (SP) code in the SAC scheme, the hardware implementation of the encoder/decoder can be simplified with a reduced number of optical components, such as arrayed waveguide gratings (AWGs) and fiber Bragg gratings (FBGs). This system has superior performance compared to previous bipolar-bipolar coding OCDMA systems.

  5. DEXTER: A one-dimensional code for calculating thermionic performance of long converters

    Science.gov (United States)

    Sawyer, C. D.

    1971-01-01

    A versatile code is described for computing the coupled thermionic electric-thermal performance of long thermionic converters in which the temperature and voltage variations cannot be neglected. The code is capable of accounting for a variety of external electrical connection schemes, coolant flow paths and converter failures by partial shorting. Example problem solutions are included along with a user's manual.

  6. Dexter - A one-dimensional code for calculating thermionic performance of long converters.

    Science.gov (United States)

    Sawyer, C. D.

    1971-01-01

    This paper describes a versatile code for computing the coupled thermionic electric-thermal performance of long thermionic converters in which the temperature and voltage variations cannot be neglected. The code is capable of accounting for a variety of external electrical connection schemes, coolant flow paths and converter failures by partial shorting. Example problem solutions are given.

  7. PERFORMANCE ANALYSIS OF OPTICAL CDMA SYSTEM USING VC CODE FAMILY UNDER VARIOUS OPTICAL PARAMETERS

    Directory of Open Access Journals (Sweden)

    HASSAN YOUSIF AHMED

    2012-06-01

The intent of this paper is to study the performance of spectral-amplitude coding optical code-division multiple-access (OCDMA) systems using the Vector Combinatorial (VC) code under various optical parameters. This code can be constructed in an algebraic way based on Euclidian vectors for any positive integer number. One of the important properties of this code is that the maximum cross-correlation is always one, which means that multi-user interference (MUI) and phase-induced intensity noise are reduced. Transmitter and receiver structures based on unchirped fiber Bragg gratings (FBGs) using the VC code, taking into account the effects of intensity, shot and thermal noise sources, are demonstrated. The impact of fiber distance on bit error rate (BER) is reported using a commercial optical systems simulator, virtual photonic instrument (VPI™). The VC code is compared mathematically with reported codes which use similar techniques. We analyzed and characterized the fiber link, received power, BER and channel spacing. The performance and optimization of the VC code in a SAC-OCDMA system is reported. By comparing the theoretical and simulation results taken from VPI™, we have demonstrated that, for a high number of users, the effective power source is adequate when the VC code is used, even at higher data rates. It is also found that as the channel spacing goes from very narrow to wider, the BER decreases, with the best performance occurring at a spacing bandwidth between 0.8 and 1 nm. We have shown that the SAC system utilizing the VC code significantly improves performance compared with the reported codes.
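The cross-correlation property quoted above (maximum cross-correlation of one) is easy to check numerically. The small codebook below is hypothetical, chosen only to exhibit the property, and is not generated by the Vector Combinatorial construction:

```python
# Illustrative check: in a code family with maximum cross-correlation of one,
# any two distinct codewords share at most one chip position, which bounds the
# multi-user interference seen by each correlator. The three weight-2
# codewords below are invented examples, not VC codewords.
from itertools import combinations

def cross_correlation(a, b):
    """In-phase cross-correlation of two binary (0/1) code sequences."""
    return sum(x * y for x, y in zip(a, b))

codebook = [
    (1, 1, 0, 0, 0, 0),
    (1, 0, 1, 0, 0, 0),
    (0, 1, 1, 0, 0, 0),
]

max_xc = max(cross_correlation(a, b) for a, b in combinations(codebook, 2))
# max_xc == 1: every pair of users overlaps in exactly one chip
```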

  8. Effect of alternative conceptual models in a preliminary performance assessment for the waste isolation pilot plant

    International Nuclear Information System (INIS)

    Helton, J.C.; Bean, J.E.; Berglund, J.W.; Beyeler, W.; Garner, J.W.; Iuzzolino, H.J.; Marietta, M.G.; Rudeen, D.K.; Schreiber, J.D.; Swift, P.N.; Tierney, M.S.; Vaughn, P.

    1995-01-01

The most appropriate conceptual model for performance assessment (PA) at the Waste Isolation Pilot Plant (WIPP) is believed to include gas generation resulting from corrosion and microbial action in the repository, and a dual-porosity (matrix and fracture porosity) representation for solute transport in the Culebra Dolomite Member of the Rustler Formation. Under these assumptions, complementary cumulative distribution functions (CCDFs) which summarize the radionuclide releases to the accessible environment, resulting from both cuttings removal and groundwater transport, fall substantially below the release limits promulgated by the US Environmental Protection Agency (EPA), with the releases being dominated by cuttings removal. To provide additional views, the following alternative conceptual models were considered as part of a preliminary PA for the WIPP: no gas generation in the repository and a dual-porosity transport model in the Culebra; gas generation in the repository and a single-porosity (fracture porosity) transport model in the Culebra; no gas generation in the repository and a single-porosity transport model in the Culebra; gas generation in the repository and a dual-porosity transport model in the Culebra, without chemical retardation; and gas generation in the repository, chemical retardation in the Culebra, and extremes of climatic variation. These variations relate to groundwater transport, so they do not affect the releases resulting from cuttings removal. Several of these variations substantially increase the importance of releases resulting from groundwater transport relative to releases resulting from cuttings removal. However, the total releases generally remained small, with the CCDFs which summarize the releases to the accessible environment falling below the EPA release limits.

  9. A good performance watermarking LDPC code used in high-speed optical fiber communication system

    Science.gov (United States)

    Zhang, Wenbo; Li, Chao; Zhang, Xiaoguang; Xi, Lixia; Tang, Xianfeng; He, Wenxue

    2015-07-01

A watermarking LDPC code, a strategy designed to improve the performance of the traditional LDPC code, is introduced. By inserting pre-defined watermarking bits into the original LDPC code, we can obtain a more accurate estimate of the noise level in the fiber channel. We then use this estimate to modify the probability distribution function (PDF) used in the initialization of the belief propagation (BP) decoding algorithm. This algorithm was tested in a 128 Gb/s PDM-DQPSK optical communication system, and results showed that the watermarking LDPC code had better tolerance to polarization mode dispersion (PMD) and nonlinearity than the traditional LDPC code. Also, at the cost of about 2.4% additional redundancy for the watermarking bits, the decoding efficiency of the watermarking LDPC code is about twice that of the traditional one.
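The core idea, estimating the channel noise level from known watermark bits and feeding it into the initial beliefs of BP decoding, can be sketched under a simple BPSK-over-AWGN assumption. The watermark pattern and noise level below are illustrative, not values from the paper:

```python
# Hedged sketch: known ("watermark") bits let the receiver estimate the noise
# level, which then parameterizes the Gaussian PDF behind the initial LLRs of
# BP decoding. BPSK over AWGN is assumed; all values here are invented.
import math
import random

random.seed(1)
sigma_true = 0.8
watermark = [1, 0, 1, 1, 0, 1, 0, 0] * 25            # 200 known bits
tx = [1.0 if b else -1.0 for b in watermark]          # BPSK mapping
rx = [x + random.gauss(0.0, sigma_true) for x in tx]  # simulated channel

# Noise estimate: RMS deviation of received samples from the known symbols.
sigma_hat = math.sqrt(sum((r - x) ** 2 for r, x in zip(rx, tx)) / len(tx))

def channel_llr(y, sigma):
    """Initial channel LLR for a received sample under the estimated noise level."""
    return 2.0 * y / sigma ** 2
```

A better sigma estimate sharpens or flattens the initial LLRs appropriately, which is what the paper exploits to improve BP convergence.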

  10. A Conceptual Framework to Measure Systems’ Performance during Emergency Preparedness Exercises

    Directory of Open Access Journals (Sweden)

    Elena Savoia

    2014-09-01

Large-scale public health emergencies require a sophisticated, coordinated response involving multiple entities to protect health and minimize suffering. However, the rarity of such emergencies presents a barrier to gathering observational data about the effectiveness of the public health response before such events occur. For this reason, public health practitioners increasingly have relied on simulated emergencies, known as "exercises," as proxies to test their emergency capabilities. However, the formal evaluation of performance in these exercises has historically been inconsistent, and there is little research to describe how data acquired from simulated emergencies actually support conclusions about the quality of the public health emergency response system. Over the past six years, we have designed and evaluated more than seventy public health emergency exercises, collaborating with public health agencies, hospitals and others to test a wide variety of systems and their capabilities. Using the data and experience that we gathered, we have developed a conceptual framework that describes the essential elements to consider when applying performance measurement science to public health emergency exercises. We suggest that this framework may assist practitioners and researchers who wish to better measure performance in exercises and to improve public health emergency preparedness.

  11. Do Performance-Based Codes Support Universal Design in Architecture?

    DEFF Research Database (Denmark)

    Grangaard, Sidse; Frandsen, Anne Kathrine

    2016-01-01

    – Universal Design (UD). The empirical material consists of input from six workshops to which all 700 Danish Architectural firms were invited, as well as eight group interviews. The analysis shows that the current prescriptive requirements are criticized for being too homogenous and possibilities...... for differentiation and zoning are required. Therefore, a majority of professionals are interested in a performance-based model because they think that such a model will support ‘accessibility zoning’, achieving flexibility because of different levels of accessibility in a building due to its performance. The common...... of educational objectives is suggested as a tool for such a boost. The research project has been financed by the Danish Transport and Construction Agency....

  12. Lysimeter data as input to performance assessment source term codes

    International Nuclear Information System (INIS)

    McConnell, J.W. Jr.; Rogers, R.D.; Sullivan, T.

    1992-01-01

The Field Lysimeter Investigation: Low-Level Waste Data Base Development Program is obtaining information on the performance of radioactive waste in a disposal environment. Waste forms fabricated using ion-exchange resins from EPICOR-II prefilters employed in the cleanup of the Three Mile Island (TMI) Nuclear Power Station are being tested to develop a low-level waste data base and to obtain information on the survivability of waste forms in a disposal environment. In this paper, radionuclide releases from waste forms in the first seven years of sampling are presented and discussed. The application of lysimeter data to performance assessment source term models is presented. Initial results from the use of the data in two models are discussed.

  13. To Master or Perform? Exploring Relations between Achievement Goals and Conceptual Change Learning

    Science.gov (United States)

    Ranellucci, John; Muis, Krista R.; Duffy, Melissa; Wang, Xihui; Sampasivam, Lavanya; Franco, Gina M.

    2013-01-01

    Background: Research is needed to explore conceptual change in relation to achievement goal orientations and depth of processing. Aims: To address this need, we examined relations between achievement goals, use of deep versus shallow processing strategies, and conceptual change learning using a think-aloud protocol. Sample and Method:…

  14. Optimizing fusion PIC code performance at scale on Cori Phase 2

    Energy Technology Data Exchange (ETDEWEB)

    Koskela, T. S.; Deslippe, J.

    2017-07-23

    In this paper we present the results of optimizing the performance of the gyrokinetic full-f fusion PIC code XGC1 on the Cori Phase Two Knights Landing system. The code has undergone substantial development to enable the use of vector instructions in its most expensive kernels within the NERSC Exascale Science Applications Program. We study the single-node performance of the code on an absolute scale using the roofline methodology to guide optimization efforts. We have obtained 2x speedups in single node performance due to enabling vectorization and performing memory layout optimizations. On multiple nodes, the code is shown to scale well up to 4000 nodes, near half the size of the machine. We discuss some communication bottlenecks that were identified and resolved during the work.
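The roofline methodology mentioned above bounds a kernel's attainable performance by the lesser of peak compute and arithmetic intensity times memory bandwidth. A minimal sketch follows; the peak and bandwidth figures are rough illustrative numbers for a Knights Landing node, not measurements from the paper:

```python
# Sketch of the roofline bound: attainable GFLOP/s = min(peak compute,
# arithmetic intensity * memory bandwidth). The two machine constants below
# are approximate, illustrative KNL-node figures, not measured values.

PEAK_GFLOPS = 3000.0     # rough double-precision peak for one KNL node
BANDWIDTH_GBS = 400.0    # rough MCDRAM bandwidth

def roofline(arithmetic_intensity):
    """Performance ceiling (GFLOP/s) for a kernel at the given flops/byte."""
    return min(PEAK_GFLOPS, arithmetic_intensity * BANDWIDTH_GBS)

low_ai = roofline(1.0)    # memory-bound kernel: ceiling set by bandwidth
high_ai = roofline(10.0)  # compute-bound kernel: capped at peak flops
```

Plotting measured kernel performance against this ceiling shows whether vectorization (raising achieved flops) or memory-layout work (raising effective intensity) is the profitable next step, which is how the optimization effort above was guided.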

  15. Conceptual structure of performance assessments conducted for the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    Helton, J.C.; Marietta, M.G.; Rechard, R.P.

    1993-04-01

    The Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico is being developed by the US Department of Energy as a disposal facility for transuranic waste. In support of this project, Sandia National Laboratories is conducting an ongoing performance assessment (PA) for the WIPP. The ordered triple representation for risk proposed by Kaplan and Garrick is used to provide a clear conceptual structure for this PA. This presentation describes how the preceding representation provides a basis in the WIPP PA for (1) the definition of scenarios and the calculation of scenario probabilities and consequences, (2) the separation of subjective and stochastic uncertainties, (3) the construction of the complementary cumulative distribution functions required in comparisons with the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (i.e., 40 CFR Part 191, Subpart B), and (4) the performance of uncertainty and sensitivity studies. Results obtained in a preliminary PA for the WIPP completed in December of 1991 are used for illustration

  16. Conceptual Design Optimization of an Augmented Stability Aircraft Incorporating Dynamic Response Performance Constraints

    Science.gov (United States)

    Welstead, Jason

    2014-01-01

This research focused on incorporating stability and control into a multidisciplinary design optimization on a Boeing 737-class advanced concept called the D8.2b. A new method of evaluating the aircraft handling performance using quantitative evaluation of the system response to disturbances, including perturbations, continuous turbulence, and discrete gusts, is presented. A multidisciplinary design optimization was performed using the D8.2b transport aircraft concept. The configuration was optimized for minimum fuel burn using a design range of 3,000 nautical miles. Optimization cases were run using fixed tail volume coefficients, static trim constraints, and static trim and dynamic response constraints. A Cessna 182T model was used to test the various dynamic analysis components, ensuring the analysis was behaving as expected. Results of the optimizations show that including stability and control in the design process drastically alters the optimal design, indicating that stability and control should be included in conceptual design to avoid system-level penalties later in the design process.

  17. Conceptual design and performance simulations of super-compact electromagnetic calorimeter

    Directory of Open Access Journals (Sweden)

    Skoda Libor

    2013-11-01

Measurements of particle production at forward rapidities in high energy p-p, p-A and A-A collisions provide access to physics processes at very low Bjorken x. These measurements will allow study of the gluon saturation scale and improve our knowledge of parton distributions in nuclei. Specific requirements must be fulfilled for a calorimeter to operate successfully in the high-multiplicity forward region within often stringent space limits. Here we present a study of a conceptual design of a super-compact electromagnetic calorimeter being developed at the Czech Technical University in Prague. The design of the sampling calorimeter is based on a sandwich structure of thin tungsten and scintillator layers oriented in parallel to the beam. The optical readout of individual scintillator pads guarantees the required high radiation hardness of the detector. We present simulation of the expected performance of the optical pad readout together with overall detector performance. The detector is intended to allow measurement of high energy photons (1

  18. Quantifying the performance of two conceptual models for snow dominated catchments in Austria and Turkey

    Science.gov (United States)

    Sensoy, Aynur; Parajka, Juraj; Coskun, Cihan; Sorman, Arda; Ertas, Cansaran

    2014-05-01

In many mountainous regions, snowmelt makes a significant contribution to streamflow, particularly during the spring and summer months. Understanding the magnitude and timing of this contribution, together with hydrological forecasts, is essential for a range of purposes concerning water resources management. Conceptual hydrological models have been widely applied to mountain catchments for both operational and scientific applications. The Hydrologiska Byråns Vattenbalansavdelning (HBV) model and the Snowmelt Runoff Model (SRM) are selected in this study as conceptual models commonly used in hydrological modeling and forecasting for a number of basins in several countries. Moreover, this selection is also supported by experience with the improvement and application of remote sensing techniques in snow dominated regions. The greatest similarity between the two models is that each uses a temperature index method to predict melt rate, whereas the greatest difference lies in the way snow cover is handled. In mountainous regions, data limitations prevent detailed understanding of the variability of snow cover and melt. In situ snowpack measurements are sparsely distributed relative to snowpack heterogeneity; therefore, to supplement ground measurements, remotely sensed images of snow covered area (SCA) provide useful information for runoff prediction during the snowmelt season. SCA has been used as a direct input to SRM and as a means of checking the internal validity of the HBV model. Moderate Resolution Imaging Spectroradiometer (MODIS) daily snow cover products with 500 m spatial resolution are used to derive SCA data in this study. A number of studies reported in the literature have indicated that model performance can vary depending on several factors, including the scale and characteristics of the catchment, the availability of the required data, and the runoff producing mechanism. Therefore, five different catchments, including data-scarce and data-rich basins, areas and reliefs
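The temperature index method that both models share can be sketched minimally. The degree-day factor and threshold below are placeholder values, not calibrated parameters from HBV or SRM:

```python
# Minimal sketch of the temperature-index (degree-day) melt rule shared by the
# two models compared above: melt is proportional to how far air temperature
# exceeds a threshold. Both parameter values here are illustrative only.

def daily_melt(temp_c, degree_day_factor=3.5, threshold_c=0.0):
    """Snowmelt depth (mm/day) from mean daily air temperature (deg C)."""
    return degree_day_factor * max(temp_c - threshold_c, 0.0)

season = [-2.0, 0.5, 3.0, 6.0]          # mean daily temperatures over four days
melt = [daily_melt(t) for t in season]  # [0.0, 1.75, 10.5, 21.0] mm/day
```

The models differ mainly in what multiplies this rule: SRM scales melt by the remotely sensed snow covered area directly, while HBV tracks an internal snowpack store that SCA imagery can then validate.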

  19. Code division multiple-access techniques in optical fiber networks. II - Systems performance analysis

    Science.gov (United States)

    Salehi, Jawad A.; Brackett, Charles A.

    1989-08-01

A technique based on optical orthogonal codes was presented by Salehi (1989) to establish a fiber-optic code-division multiple-access (FO-CDMA) communications system. The results are used to derive the bit error rate of the proposed FO-CDMA system as a function of data rate, code length, code weight, number of users, and receiver threshold. The performance characteristics for a variety of system parameters are discussed. A means of reducing the effective multiple-access interference signal by placing an optical hard-limiter at the front end of the desired optical correlator is presented. Performance calculations are shown for the FO-CDMA with an ideal optical hard-limiter, and it is shown that using an optical hard-limiter would, in general, improve system performance.

  20. Improving performance of DS-CDMA systems using chaotic complex Bernoulli spreading codes

    Science.gov (United States)

    Farzan Sabahi, Mohammad; Dehghanfard, Ali

    2014-12-01

The most important goal of a spread-spectrum communication system is to protect communication signals against interference and the exploitation of information by unintended listeners. In fact, low probability of detection and low probability of intercept are two important parameters for increasing the performance of the system. In Direct Sequence Code Division Multiple Access (DS-CDMA) systems, these properties are achieved by multiplying the data by spreading sequences. Chaotic sequences, with their particular properties, have numerous applications in constructing spreading codes. The use of a one-dimensional Bernoulli chaotic sequence as a spreading code has been proposed previously in the literature. The main feature of this sequence is its negative auto-correlation at a lag of 1, which, with proper design, leads to increased efficiency of communication systems based on these codes. On the other hand, employing complex chaotic sequences as spreading sequences has also been discussed in several papers. In this paper, the use of two-dimensional Bernoulli chaotic sequences as spreading codes is proposed. The performance of multi-user synchronous and asynchronous DS-CDMA systems is evaluated by applying these sequences under Additive White Gaussian Noise (AWGN) and fading channels. Simulation results indicate an improvement in performance in comparison with conventional spreading codes such as Gold codes, as well as with similar complex chaotic spreading sequences. Similar to one-dimensional Bernoulli chaotic sequences, the proposed sequences also have negative auto-correlation. Besides, construction of complex sequences with lower average cross-correlation is possible with the proposed method.
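As a hedged sketch of chaotic spreading in general (not the paper's two-dimensional complex construction), a chip sequence can be generated from a generalized Bernoulli shift map and binarized to antipodal chips. The map parameter, seed, and binarization rule below are illustrative choices:

```python
# Illustrative chaotic spreading-chip generator based on a generalized
# Bernoulli shift map, x_{n+1} = (beta * x_n) mod 1, binarized to +/-1 chips.
# beta is chosen non-integer so repeated doubling does not collapse to zero in
# binary floating point. This does not reproduce the paper's two-dimensional
# complex construction or its engineered negative lag-1 auto-correlation.

def bernoulli_chips(seed, n, beta=1.9):
    """Generate n antipodal (+1/-1) chips from the Bernoulli shift map."""
    x, chips = seed, []
    for _ in range(n):
        x = (beta * x) % 1.0              # chaotic iteration
        chips.append(1 if x >= 0.5 else -1)
    return chips

code_a = bernoulli_chips(0.1234, 64)
code_b = bernoulli_chips(0.1235, 64)  # tiny seed change yields a different code
```

Sensitive dependence on the seed is what gives each user a distinct, reproducible spreading code from a shared generator.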

  1. Conceptual analyses of neutronic and equilibrium refueling parameters to develop a cost-effective multi-purpose pool-type research reactor using WIMSD and CITVAP codes

    Energy Technology Data Exchange (ETDEWEB)

    Hedayat, Afshin, E-mail: ahedayat@aeoi.org.ir

    2016-12-01

Highlights: • Introducing a highly beneficial, low-cost multipurpose research reactor. • Technical documents and standard safety issues are introduced coherently. • Highly effective conceptual neutronic analyses and fuel management strategy. • Gaining high-scoring design criteria and safety margins via 3-D core modeling. • Capacity and capability to produce all medical and industrial radioisotopes. - Abstract: In this paper, neutronic and equilibrium refueling parameters of a multi-purpose, cost-effective research reactor have been studied and analyzed. An attempt has been made to provide the periodic and long-term requirements of irradiating applications coherently. The WIMSD5B and CITVAP codes are used to calculate neutronic parameters and simulate the fuel management strategy. The nuclear data, codes, and calculation methods used have been separately benchmarked and verified successfully. Fundamental concepts, design criteria, and safety issues are introduced and discussed coherently. Design criteria are selected to gain the most economic benefit per capital cost via the minimum required reactor power. Accurate, fast and simplified models have been tried for integrated decision making and analyses using deterministic codes. Core management, power effects, fuel consumption and burnup effects, and also a complete simulation of the fuel management strategy are presented and analyzed. Results show that the proposed reactor core design can be promisingly suitable for commercial multi-purpose irradiating applications. It also conservatively retains Operating Limits and Conditions (OLCs) in accordance with standard safety requirements, where safety parameters are calculated using best-estimate tools. Such a reactor core configuration and integrated refueling task can effectively enhance the Quality Assurance (QA) of the general irradiating applications of the current MTR within its power limits and corresponding OLCs.

  2. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    International Nuclear Information System (INIS)

    McGrail, B.P.; Mahoney, L.A.

    1995-10-01

    Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in each code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest-ranked computer code was found to be the ARES-CT code, developed at PNL for the US Department of Energy for the evaluation of land disposal sites.
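
The ranking procedure described above, scoring each candidate code by how well its feature set matches the needed capabilities, can be sketched as a simple weighted-scoring exercise. The capability names, weights, and the second candidate below are illustrative placeholders, not values taken from the report.

```python
# Capabilities needed by the engineered-system release calculations,
# with illustrative importance weights (names/weights are placeholders).
NEEDED = {
    "glass_corrosion_model": 3,
    "radionuclide_transport": 3,
    "unsaturated_flow": 2,
    "chemical_speciation": 2,
}

# Feature sets of candidate codes; ARES-CT's entry is illustrative,
# and "CODE-B" is a made-up comparison candidate.
CANDIDATES = {
    "ARES-CT": {"glass_corrosion_model", "radionuclide_transport",
                "unsaturated_flow", "chemical_speciation"},
    "CODE-B": {"radionuclide_transport", "unsaturated_flow"},
}

def score(features):
    """Sum the weights of the needed capabilities a code implements."""
    return sum(w for cap, w in NEEDED.items() if cap in features)

# Rank candidates by descending capability score.
ranking = sorted(CANDIDATES, key=lambda name: score(CANDIDATES[name]),
                 reverse=True)
```

In this toy setup the code covering all four needed capabilities naturally ranks first, mirroring the report's conclusion in structure only.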

  3. Conceptual structure of the 1996 performance assessment for the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    HELTON, JON CRAIG; ANDERSON, D. RICHARD; BASABILVAZO, G.; JOW, HONG-NIAN; MARIETTA, MELVIN G.

    2000-01-01

    The conceptual structure of the 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) is described. This structure involves three basic entities (EN1, EN2, EN3): (1) EN1, a probabilistic characterization of the likelihood of different futures occurring at the WIPP site over the next 10,000 yr; (2) EN2, a procedure for estimating the radionuclide releases to the accessible environment associated with each of the possible futures that could occur at the WIPP site over the next 10,000 yr; and (3) EN3, a probabilistic characterization of the uncertainty in the parameters used in the definition of EN1 and EN2. In the formal development of the 1996 WIPP PA, EN1 is characterized by a probability space (S_st, P_st, p_st) for stochastic (i.e., aleatory) uncertainty; EN2 is characterized by a function f that corresponds to the models and associated computer programs used to estimate radionuclide releases; and EN3 is characterized by a probability space (S_su, P_su, p_su) for subjective (i.e., epistemic) uncertainty. A high-level overview of the 1996 WIPP PA and references to additional sources of information are given in the context of (S_st, P_st, p_st), f, and (S_su, P_su, p_su).
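
The three entities map naturally onto a nested ("double-loop") Monte Carlo calculation: an outer loop samples the epistemically uncertain parameters (EN3), an inner loop samples aleatory futures (EN1), and the function f (EN2) converts each future into a release estimate, yielding one distribution of releases per parameter vector. The following is a minimal structural sketch; the sampling distributions and the release arithmetic are invented stand-ins, not the WIPP models.

```python
import random

def release_model(future, params):
    """Stand-in for the function f: radionuclide release for one
    sampled future under one sampled parameter vector (illustrative)."""
    return params["permeability"] * future["intrusions"]

def sample_future(rng):
    """EN1: aleatory sample of one 10,000-yr future (here just a
    count of inadvertent intrusions; purely illustrative)."""
    return {"intrusions": rng.randint(0, 3)}

def sample_params(rng):
    """EN3: epistemic sample of the uncertain model parameters."""
    return {"permeability": rng.uniform(0.1, 1.0)}

rng = random.Random(42)
ccdf_curves = []
for _ in range(10):                        # outer loop: epistemic (EN3)
    params = sample_params(rng)
    releases = sorted(release_model(sample_future(rng), params)
                      for _ in range(100))  # inner loop: aleatory (EN1)
    ccdf_curves.append(releases)           # one release distribution per
                                           # parameter vector
```

The family of sorted release samples is what a PA summarizes as a family of complementary cumulative distribution functions, one curve per epistemic sample.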

  4. The Relationship Between Work Experience and Job Performance: A Conceptual and Meta-Analytic Review

    National Research Council Canada - National Science Library

    Quinones, Miguel

    2001-01-01

    .... To this end, we conducted a conceptual review of the work experience literature, developed a framework for measuring different facets of work experience, and conducted a meta-analysis to examine...

  5. Verification testing of the compression performance of the HEVC screen content coding extensions

    Science.gov (United States)

    Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng

    2017-09-01

    This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and using similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements show a very substantial benefit in coding efficiency for the SCC extensions, and provide consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60% to 90% relative to the JM and from 40% to 80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.
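
The Bjøntegaard-delta (BD) bit-rate metric used above fits log-rate as a polynomial function of quality (PSNR) for each codec and averages the gap between the two fitted curves over the overlapping quality range; a negative value means the test codec needs fewer bits at equal quality. A common cubic-fit variant can be sketched as follows; the fit is done in pure Python here, and the four-point rate/PSNR lists in the usage note are synthetic.

```python
import math

def _solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def _cubic_fit(xs, ys):
    """Least-squares cubic via normal equations; returns c0..c3."""
    A = [[sum(x ** (i + j) for x in xs) for j in range(4)]
         for i in range(4)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(4)]
    return _solve(A, b)

def _integral(c, x):
    """Antiderivative of the cubic, evaluated at x."""
    return sum(c[i] * x ** (i + 1) / (i + 1) for i in range(4))

def bd_rate(rate_anchor, psnr_anchor, rate_test, psnr_test):
    """BD bit rate: average % bit-rate difference of the test codec
    vs. the anchor at equal quality (negative = savings)."""
    lo = max(min(psnr_anchor), min(psnr_test))
    hi = min(max(psnr_anchor), max(psnr_test))
    mid = 0.5 * (lo + hi)          # centre PSNR for numerical stability
    ca = _cubic_fit([q - mid for q in psnr_anchor],
                    [math.log(r) for r in rate_anchor])
    ct = _cubic_fit([q - mid for q in psnr_test],
                    [math.log(r) for r in rate_test])
    avg = (_integral(ct, hi - mid) - _integral(ct, lo - mid)
           - _integral(ca, hi - mid) + _integral(ca, lo - mid)) / (hi - lo)
    return (math.exp(avg) - 1.0) * 100.0
```

As a sanity check, if the test codec needs exactly half the bits of the anchor at every PSNR point, `bd_rate` returns -50%.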

  6. Stress, Cognition, and Human Performance: A Literature Review and Conceptual Framework

    Science.gov (United States)

    Staal, Mark A.

    2004-01-01

    The following literature review addresses the effects of various stressors on cognition. While attempting to be as inclusive as possible, the review focuses its examination on the relationships between cognitive appraisal, attention, memory, and stress as they relate to information processing and human performance. The review begins with an overview of constructs and theoretical perspectives, followed by an examination of effects across attention, memory, perceptual-motor functions, judgment and decision making, and putative stressors such as workload, thermal conditions, noise, and fatigue, and closes with a discussion of moderating variables and related topics. In summation of the review, a conceptual framework for cognitive processing under stress has been assembled. As one might imagine, the research literature that addresses stress, theories governing its effects on human performance, and experimental evidence that supports these notions is large and diverse. In attempting to organize and synthesize this body of work, I was guided by several earlier efforts (Bourne & Yaroush, 2003; Driskell, Mullen, Johnson, Hughes, & Batchelor, 1992; Driskell & Salas, 1996; Hancock & Desmond, 2001; Stokes & Kite, 1994). These authors should be credited with accomplishing the monumental task of providing focused reviews in this area, and their collective efforts laid the foundation for the present review. Similarly, the format of this review has been designed in accordance with these previous exemplars. However, each of these previous efforts either simply reported general findings without sufficient experimental illustration, or narrowed their scope of investigation to the extent that the breadth of such findings remained hidden from the reader. Moreover, none of these examinations yielded an architecture that adequately describes or explains the inter-relations between information-processing elements under stress conditions.

  7. Performance of Turbo Interference Cancellation Receivers in Space-Time Block Coded DS-CDMA Systems

    Directory of Open Access Journals (Sweden)

    Emmanuel Oluremi Bejide

    2008-07-01

    We investigate the performance of turbo interference cancellation receivers in space-time block coded (STBC) direct-sequence code division multiple access (DS-CDMA) systems. Depending on the concatenation scheme used, we divide these receivers into partitioned approach (PA) and iterative approach (IA) receivers. The performance of both the PA and IA receivers is evaluated in Rayleigh fading channels for the uplink scenario. Numerical results show that the MMSE front-end turbo space-time iterative approach (IA) receiver effectively combats the mixture of MAI and intersymbol interference (ISI). To further investigate the achievable data rates of turbo interference cancellation receivers, we introduce puncturing of the turbo code through the use of rate-compatible punctured turbo codes (RCPTCs). Simulation results suggest that combining interference cancellation, turbo decoding, STBC, and RCPTC can significantly improve the achievable data rates for a synchronous DS-CDMA system on the uplink in Rayleigh flat fading channels.
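
Rate-compatible punctured turbo codes raise the rate of a fixed mother encoder by transmitting every systematic bit but only the parity bits selected by a periodic puncturing pattern. A minimal sketch of that puncturing step follows; the pattern, framing, and stream offset are illustrative assumptions, not the scheme from the paper.

```python
def puncture(systematic, parity1, parity2, pattern):
    """Puncture a rate-1/3 mother turbo code: systematic bits are
    always sent, parity bits only where the periodic pattern has a 1
    (pattern and offset are illustrative)."""
    out = []
    period = len(pattern)
    for i, s in enumerate(systematic):
        out.append(s)                      # systematic bits always sent
        if pattern[i % period]:            # parity stream 1
            out.append(parity1[i])
        if pattern[(i + 1) % period]:      # parity stream 2, offset by 1
            out.append(parity2[i])
    return out

# Pattern [1, 0]: one parity bit per information bit -> rate 1/2.
punctured = puncture([1] * 8, [0] * 8, [1] * 8, [1, 0])
```

With the pattern [1, 0], 8 information bits produce 16 transmitted bits, i.e. the mother rate 1/3 is punctured up to rate 1/2; denser patterns trade rate for error protection.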

  8. Oxide fuel pin transient performance analysis and design with the TEMECH code

    International Nuclear Information System (INIS)

    Bard, F.E.; Dutt, S.P.; Hinman, C.A.; Hunter, C.W.; Pitner, A.L.

    1986-01-01

    The TEMECH code is a fast-running, thermal-mechanical-hydraulic, analytical program used to evaluate the transient performance of LMR oxide fuel pins. The code calculates pin deformation and failure probability due to fuel-cladding differential thermal expansion, expansion of fuel upon melting, and fission gas pressurization. The mechanistic fuel model in the code accounts for fuel cracking, crack closure, porosity decrease, and the temperature dependence of fuel creep through the course of the transient. Modeling emphasis has been placed on results obtained from Fuel Cladding Transient Test (FCTT) testing, Transient Fuel Deformation (TFD) tests and TREAT integral fuel pin experiments

  9. A probabilistic analysis of PWR and BWR fuel rod performance using the code CASINO-SLEUTH

    International Nuclear Information System (INIS)

    Bull, A.J.

    1987-01-01

    This paper presents a brief description of the Monte Carlo and response surface techniques used in the code, and a probabilistic analysis of fuel rod performance in PWR and BWR applications. The analysis shows that fission gas release predictions are very sensitive to changes in certain of the code's inputs, identifies the most dominant input parameters and compares their effects in the two cases. (orig./HP)

  10. Impact of the Revised Malaysian Code on Corporate Governance on Audit Committee Attributes and Firm Performance

    OpenAIRE

    KALLAMU, Basiru Salisu

    2016-01-01

    Abstract. Using a sample of 37 finance companies listed under the finance segment of Bursa Malaysia, we examined the impact of the revision to Malaysian code on corporate governance on audit committee attributes and firm performance. Our result suggests that audit committee attributes significantly improved after the Code was revised. In addition, the coefficient for audit committee and risk committee interlock has a significant negative relationship with Tobin’s Q in the period before the re...

  11. Performance Evaluation of a Novel Optimization Sequential Algorithm (SeQ) Code for FTTH Network

    Directory of Open Access Journals (Sweden)

    Fazlina C.A.S.

    2017-01-01

    The SeQ code has advantages such as a variable cross-correlation property at any given number of users and weights, and it effectively suppresses the impact of phase-induced intensity noise (PIIN) and provides multiple access interference (MAI) cancellation. The results revealed that, at a system performance of BER = 10^-9, the SeQ code is capable of achieving 1 Gbps at distances up to 60 km.

  12. Performance Comparison of Assorted Color Spaces for Multilevel Block Truncation Coding based Face Recognition

    OpenAIRE

    H.B. Kekre; Sudeep Thepade; Karan Dhamejani; Sanchit Khandelwal; Adnan Azmi

    2012-01-01

    The paper presents a performance analysis of Multilevel Block Truncation Coding based Face Recognition among widely used color spaces. In [1], Multilevel Block Truncation Coding was applied to the RGB color space at up to four levels for face recognition. Better results were obtained when the proposed technique was implemented using Kekre's LUV (K'LUV) color space [25]. This was the motivation to test the proposed technique using assorted color spaces. For experimental analysis, two face databas...

  13. Performance and complexity of tunable sparse network coding with gradual growing tuning functions over wireless networks

    OpenAIRE

    Garrido Ortiz, Pablo; Sørensen, Chres W.; Lucani Roetter, Daniel Enrique; Agüero Calvo, Ramón

    2016-01-01

    Random Linear Network Coding (RLNC) has been shown to be a technique with several benefits, in particular when applied over wireless mesh networks, since it provides robustness against packet losses. On the other hand, Tunable Sparse Network Coding (TSNC) is a promising concept, which leverages a trade-off between computational complexity and goodput. An optimal density tuning function has not been found yet, due to the lack of a closed-form expression that links density, performance and comp...

  14. UNA MIRADA CONCEPTUAL A LA GENERACIÓN AUTOMÁTICA DE CÓDIGO UMA ABORDAGEM CONCEITUAL À GERAÇÃO AUTOMÁTICA DE CÓDIGO A CONCEPTUAL APPROACH TO AUTOMATIC GENERATION OF CODE

    Directory of Open Access Journals (Sweden)

    Carlos Mario Zapata

    2010-07-01

    Automated code generation is fostered by several software development methods. This generation is often supplied by well-known CASE (Computer-Aided Software Engineering) tools. However, full automation remains distant, and some CASE tools must be complemented by non-standard modeling projects. In this paper, we conceptualize projects related to automated code generation, starting from discourse representations in either controlled or natural language, or in conceptual schemas. We present a graphical summary of the crucial concepts related to this issue by means of a state-of-the-art review. We conclude that automated code generation usually begins from solution-based representations of the problem instead of domain-based representations, and that these starting points are misunderstood by the client, leading to poor validation in the early stages of the software development lifecycle.

  15. An Examination of the Performance Based Building Code on the Design of a Commercial Building

    Directory of Open Access Journals (Sweden)

    John Greenwood

    2012-11-01

    The Building Code of Australia (BCA) is the principal code under which building approvals in Australia are assessed. The BCA adopted performance-based solutions for building approvals in 1996. Performance-based codes are based upon a set of explicit objectives, stated in terms of a hierarchy of requirements beginning with key general objectives. With this in mind, the research presented in this paper aims to analyse the impact of the introduction of the performance-based code within Western Australia, to gauge the effect and usefulness of alternative design solutions in commercial construction using a case study project. The research revealed that there are several advantages to the use of alternative designs and that all parties, in general, are in favour of the performance-based building code of Australia. It is suggested that a change in the assessment process to streamline the alternative design path is needed for greater use of the performance-based alternative. With appropriate quality control measures, minor variations to the deemed-to-satisfy provisions could easily be managed by the current and future building surveying profession.

  16. A Conceptual Framework for the Electronic Performance Support Systems within IBM Lotus Notes 6 (LN6) Example

    Directory of Open Access Journals (Sweden)

    Servet BAYRAM

    2005-10-01

    The concept of an Electronic Performance Support System (EPSS) refers to multimedia or computer-based instruction components that improve human performance by providing process simplification, performance information, and decision support. EPSS has become a hot topic for organizational development, human resources, performance technology, training, and educational development professionals. A conceptual framework for EPSS is constructed from five interrelated and interdependent domains with educational implications: online collaboration, cost-effectiveness, motivation, service management, and performance empowerment. IBM Lotus Notes 6 (LN6) is used as an example application tool to illustrate the power of this framework. The framework describes a set of relevant events, based upon deductive analyses, for improving our understanding of EPSS and its implications for education and training. The article also points out similarities between specific features of EPSS and LN6 within this conceptual framework, which can provide guidelines and benefits to researchers, educators, and designers.

  17. Heterogeneous performances of conceptual dis/continuity: a dialectic reading of Brown and Kloser's article

    Science.gov (United States)

    Hwang, Sungwon; Kim, Mijung

    2009-12-01

    We review Brown and Kloser's article, "Conceptual continuity and the science of baseball: using informal science literacy to promote students' science learning", from a Vygotskian cultural-historical and dialectic perspective. Brown and Kloser interpret interview data with student baseball players and claim that students' conceptual understanding articulated in vernacular genres involves continuities (similarities) with canonical scientific explanations. In this commentary, we suggest that the authors' approach presupposes a dichotomy of the formal and the informal, which turns the authors' attention to continuity into a separation of cognition from language. We propose a Vygotskian approach that points out the problem of theorizing cognition (conceptual understanding) by depending on specific forms of representation (e.g., scientific terms). As an alternative, we envision a Vygotskian cultural-historical approach to language, which considers different, irreducible modes of communication as an integrated whole and therefore allows theorizing cognition without dichotomizing it from the concrete ways in which human beings communicate. We provide an exemplary analysis of a lecture talk in a university physics classroom and exemplify dialectic theories that explain the development of conceptual understanding. We discuss how this Vygotskian dialectic approach shows that people communicate scientific concepts through hybridization, which does not reproduce a genre self-identically; the continuity of conceptual understanding involves dis/continuity.

  18. High performance mixed optical CDMA system using ZCC code and multiband OFDM

    Science.gov (United States)

    Nawawi, N. M.; Anuar, M. S.; Junita, M. N.; Rashidi, C. B. M.

    2017-11-01

    In this paper, we propose a high-performance network design based on a mixed optical Code Division Multiple Access (CDMA) system using a Zero Cross Correlation (ZCC) code and multiband Orthogonal Frequency Division Multiplexing (OFDM), called catenated OFDM. We also investigate the relevant design parameters: effective power, number of users, number of bands, code length, and code weight. We then analyze the system performance comprehensively for up to five OFDM bands. The feasibility of the proposed system architecture is verified via numerical analysis. The results demonstrate that the proposed modulation solution can significantly increase the total number of users, with an improvement of up to 80% for five catenated bands compared to a traditional optical CDMA system, for a code length of 80 at 622 Mbps. It is also demonstrated that the BER performance depends strongly on the code weight, especially with fewer users: as the weight increases, the BER performance improves.
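
The defining property of a ZCC code set is that the in-phase cross-correlation between any two users' codewords is exactly zero, i.e. no chip position is shared, which is what suppresses PIIN and MAI. That property is easy to check mechanically; the three-user, weight-2, length-6 code matrix below is a toy example for illustration, not the construction from the paper.

```python
import itertools

def cross_correlation(a, b):
    """In-phase cross-correlation of two binary (0/1) codewords."""
    return sum(x * y for x, y in zip(a, b))

def is_zcc(code_matrix):
    """True if every pairwise cross-correlation is zero, i.e. no chip
    position is shared by any two users."""
    return all(cross_correlation(a, b) == 0
               for a, b in itertools.combinations(code_matrix, 2))

# Toy ZCC set: 3 users, weight w = 2, code length 6, disjoint chips.
codes = [
    [1, 1, 0, 0, 0, 0],
    [0, 0, 1, 1, 0, 0],
    [0, 0, 0, 0, 1, 1],
]
```

Because the chips are disjoint, adding users costs code length (here 2 chips per user), which is the trade-off behind the code length/weight parameters studied above.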

  19. High performance mixed optical CDMA system using ZCC code and multiband OFDM

    Directory of Open Access Journals (Sweden)

    Nawawi N. M.

    2017-01-01

    In this paper, we propose a high-performance network design based on a mixed optical Code Division Multiple Access (CDMA) system using a Zero Cross Correlation (ZCC) code and multiband Orthogonal Frequency Division Multiplexing (OFDM), called catenated OFDM. We also investigate the relevant design parameters: effective power, number of users, number of bands, code length, and code weight. We then analyze the system performance comprehensively for up to five OFDM bands. The feasibility of the proposed system architecture is verified via numerical analysis. The results demonstrate that the proposed modulation solution can significantly increase the total number of users, with an improvement of up to 80% for five catenated bands compared to a traditional optical CDMA system, for a code length of 80 at 622 Mbps. It is also demonstrated that the BER performance depends strongly on the code weight, especially with fewer users: as the weight increases, the BER performance improves.

  20. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files

  1. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files.

  2. Reliability issues and solutions for coding social communication performance in classroom settings.

    Science.gov (United States)

    Olswang, Lesley B; Svensson, Liselotte; Coggins, Truman E; Beilinson, Jill S; Donaldson, Amy L

    2006-10-01

    To explore the utility of time-interval analysis for documenting the reliability of coding social communication performance of children in classroom settings. Of particular interest was finding a method for determining whether independent observers could reliably judge both occurrence and duration of ongoing behavioral dimensions describing social communication performance. Four coders participated in this study. They observed and independently coded 6 social communication behavioral dimensions using handheld computers. The dimensions were mutually exclusive and accounted for all verbal and nonverbal productions during a specified time frame. The technology allowed coding of frequency and duration for each entered code. Data were collected from 20 different 2-min video segments of children in kindergarten through 3rd-grade classrooms. Data were analyzed for interobserver and intraobserver agreement using time-interval sorting and Cohen's kappa. Further, interval size and total observation length were manipulated to determine their influence on reliability. The data revealed interval sorting and kappa to be a suitable method for examining reliability of occurrence and duration of ongoing social communication behavioral dimensions. Nearly all comparisons yielded medium to large kappa values; interval size and length of observation minimally affected results. Implications: The analysis procedure described in this research solves a challenge in reliability: comparing coding by independent observers of both occurrence and duration of behaviors. Results indicate the utility of a new coding taxonomy and technology for application in online observations of social communication in a classroom setting.
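
The core of the time-interval method is to resample each coder's continuous record into fixed-width intervals, assign each interval a label, and then compute chance-corrected agreement (Cohen's kappa) over the two interval sequences, which captures duration as well as occurrence. A simplified sketch follows; the event format and the dominant-code tie-breaking rule are assumptions, not the authors' exact procedure.

```python
from collections import Counter

def to_intervals(events, total_time, width):
    """Resample timestamped events [(start, end, code), ...] into
    fixed-width intervals, keeping the code with the largest overlap
    in each interval (ties broken by event order; an assumption)."""
    out = []
    for i in range(int(total_time / width)):
        lo, hi = i * width, (i + 1) * width
        overlap = Counter()
        for start, end, code in events:
            o = min(end, hi) - max(start, lo)
            if o > 0:
                overlap[code] += o
        out.append(overlap.most_common(1)[0][0] if overlap else None)
    return out

def cohens_kappa(a, b):
    """Chance-corrected agreement between two coders' interval labels."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    p_exp = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)
```

For example, two coders who place the same behavior change at 60 s vs. 54 s disagree on exactly one 10-s interval of a 120-s segment, and kappa reflects that single duration disagreement rather than treating the whole event as a mismatch.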

  3. The development of the fuel rod transient performance analysis code FTPAC

    International Nuclear Information System (INIS)

    Han Zhijie; Ji Songtao

    2014-01-01

    Fuel rod behavior, especially the integrity of the cladding, plays an important role in fuel safety research for reactor transient and hypothetical accident conditions. To study fuel rod performance under transient accidents, FTPAC (Fuel Transient Performance Analysis Code) has been developed to simulate light water reactor fuel rod behavior when the power or coolant boundary conditions change rapidly. It is composed of temperature, mechanical deformation, cladding oxidation, and gas pressure models. The code was assessed by comparing FTPAC results with experimental data and FRAPTRAN calculations. The comparison shows that FTPAC gives reasonable agreement in temperature, deformation, and gas pressure predictions, and that applying a slip coefficient is more suitable for simulating sliding between pellet and cladding when the gap is closed. (authors)
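
Of the four models listed, the gas pressure model is the most compact to illustrate: fuel performance codes commonly compute rod internal pressure from the ideal gas law applied to gas distributed over regions (plenum, gap, cracks) at different temperatures, P = nR / Σ(V_i / T_i). The sketch below shows only that generic textbook relation, not FTPAC's actual implementation; the volumes and temperatures in the test are illustrative.

```python
R = 8.314  # universal gas constant, J/(mol*K)

def rod_pressure(moles, regions):
    """Rod internal pressure (Pa) when a fixed gas inventory spans
    regions at different temperatures: P = n R / sum(V_i / T_i),
    with regions given as (volume_m3, temperature_K) pairs."""
    return moles * R / sum(v / t for v, t in regions)
```

Splitting the free volume this way matters during a transient: heating the gap regions while the plenum stays cool raises the pressure less than a single-temperature ideal-gas estimate would suggest.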

  4. Application and Analysis of Performance of DQPSK Advanced Modulation Format in Spectral Amplitude Coding OCDMA

    Directory of Open Access Journals (Sweden)

    Abdul Latif Memon

    2014-04-01

    SAC (Spectral Amplitude Coding) is a technique of OCDMA (Optical Code Division Multiple Access) to encode and decode data bits by utilizing spectral components of the broadband source. Usually the OOK (On-Off Keying) modulation format is used in this encoding scheme. To make the SAC OCDMA network spectrally efficient, the advanced modulation format DQPSK (Differential Quaternary Phase Shift Keying) is applied, simulated and analyzed. An m-sequence code is encoded in the simulated setup. Performance for various lengths of the m-sequence code is also analyzed and displayed in pictorial form. The results of the simulation are evaluated with the help of electrical constellation diagrams, eye diagrams and bit error rate graphs. All the graphs indicate better transmission quality with the advanced DQPSK modulation format in the SAC OCDMA network as compared with OOK.
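
The m-sequences used as spreading codes here are generated by a maximal-length linear-feedback shift register (LFSR): a degree-n primitive feedback polynomial yields a balanced pseudo-random sequence of period 2^n - 1. A minimal sketch with the classic degree-4 polynomial x^4 + x + 1 (period 15) follows; this tap set is a standard textbook choice, not necessarily the one used in the paper's simulation.

```python
def m_sequence(taps, length, state=1):
    """Fibonacci LFSR output stream. `taps` are 1-based bit positions
    of the feedback polynomial; taps=[4, 1] realises x^4 + x + 1,
    which is primitive and gives the maximal period 2^4 - 1 = 15."""
    n = max(taps)                     # register length
    out = []
    for _ in range(length):
        out.append(state & 1)         # output the low bit
        fb = 0
        for t in taps:                # XOR of the tapped bits
            fb ^= (state >> (t - 1)) & 1
        state = (state >> 1) | (fb << (n - 1))
    return out
```

One full period of this sequence contains 2^(n-1) = 8 ones and 7 zeros, the balance property that makes m-sequences attractive as spreading codes.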

  5. Implementation and Performance Evaluation of Distributed Cloud Storage Solutions using Random Linear Network Coding

    DEFF Research Database (Denmark)

    Fitzek, Frank; Toth, Tamas; Szabados, Áron

    2014-01-01

    This paper advocates the use of random linear network coding for storage in distributed clouds in order to reduce storage and traffic costs in dynamic settings, i.e. when adding and removing numerous storage devices/clouds on-the-fly and when the number of reachable clouds is limited. We introduce various network coding approaches that trade off reliability, storage and traffic costs, and system complexity, relying on probabilistic recoding for cloud regeneration. We compare these approaches with other approaches based on data replication and Reed-Solomon codes. A simulator has been developed to carry out a thorough performance evaluation of the various approaches under different system settings, e.g., finite fields, and network/storage conditions, e.g., storage space used per cloud, limited network use, and limited recoding capabilities. In contrast to standard coding approaches, our...
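
In RLNC, each stored or transmitted packet is a random linear combination of the source packets, and a receiver can decode once the coefficient vectors it has collected reach full rank, regardless of which clouds supplied them. A minimal GF(2) sketch follows (real deployments typically use larger fields such as GF(2^8) to reduce the chance of linearly dependent combinations; everything below, including the packet contents, is illustrative).

```python
import random

def encode(packets, coeffs):
    """XOR-combine the source packets selected by GF(2) coefficients."""
    out = [0] * len(packets[0])
    for c, p in zip(coeffs, packets):
        if c:
            out = [a ^ b for a, b in zip(out, p)]
    return out

def rank_gf2(rows):
    """Rank over GF(2) by Gaussian elimination; decoding succeeds once
    the collected coefficient vectors reach full rank."""
    rows = [r[:] for r in rows]
    rank = 0
    for col in range(len(rows[0])):
        piv = next((i for i in range(rank, len(rows)) if rows[i][col]),
                   None)
        if piv is None:
            continue
        rows[rank], rows[piv] = rows[piv], rows[rank]
        for i in range(len(rows)):
            if i != rank and rows[i][col]:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[rank])]
        rank += 1
    return rank

# A sender draws random GF(2) coefficients for each coded packet and
# keeps emitting until the receiver's coefficient matrix has full rank.
rng = random.Random(7)
packets = [[1, 0, 1, 1], [0, 1, 1, 0], [1, 1, 0, 0]]
coeff_vectors = []
while not coeff_vectors or rank_gf2(coeff_vectors) < len(packets):
    coeff_vectors.append([rng.randint(0, 1) for _ in packets])
coded = [encode(packets, c) for c in coeff_vectors]
```

The recoding idea mentioned in the abstract follows directly: a cloud holding coded packets can XOR random subsets of them to generate fresh combinations without ever decoding the originals.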

  6. Typical performance of regular low-density parity-check codes over general symmetric channels

    International Nuclear Information System (INIS)

    Tanaka, Toshiyuki; Saad, David

    2003-01-01

    Typical performance of low-density parity-check (LDPC) codes over a general binary-input output-symmetric memoryless channel is investigated using methods of statistical mechanics. The relationship between the free energy in the statistical-mechanics approach and the mutual information used in the information-theory literature is established within a general framework; Gallager and MacKay-Neal codes are studied as specific examples of LDPC codes. It is shown that basic properties of these codes known for particular channels, including their potential to saturate Shannon's bound, hold for general symmetric channels. The binary-input additive-white-Gaussian-noise channel and the binary-input Laplace channel are considered as specific channel models.
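
For readers less familiar with the object being analyzed: an LDPC code is the null space over GF(2) of a sparse parity-check matrix H, so a word is a codeword exactly when every parity constraint (row of H) is satisfied. The toy illustration below shows only that syndrome test; at this size H is of course neither low-density nor regular, and the matrix is invented for illustration.

```python
import itertools

# Toy parity-check matrix: each row is one even-parity constraint
# over the 6 code bits.
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]

def syndrome(word):
    """Mod-2 parity of each check; all-zero syndrome <=> codeword."""
    return [sum(h * x for h, x in zip(row, word)) % 2 for row in H]

def codewords():
    """Brute-force the code book: all 6-bit words satisfying H."""
    return [w for w in itertools.product([0, 1], repeat=6)
            if not any(syndrome(w))]
```

With 3 independent checks on 6 bits the code has 2^(6-3) = 8 codewords, and the set is closed under bitwise XOR, i.e. it is a linear code, which is the structure the statistical-mechanics treatment exploits.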

  7. Typical performance of regular low-density parity-check codes over general symmetric channels

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, Toshiyuki [Department of Electronics and Information Engineering, Tokyo Metropolitan University, 1-1 Minami-Osawa, Hachioji-shi, Tokyo 192-0397 (Japan); Saad, David [Neural Computing Research Group, Aston University, Aston Triangle, Birmingham B4 7ET (United Kingdom)

    2003-10-31

    Typical performance of low-density parity-check (LDPC) codes over a general binary-input output-symmetric memoryless channel is investigated using methods of statistical mechanics. The relationship between the free energy in the statistical-mechanics approach and the mutual information used in the information-theory literature is established within a general framework; Gallager and MacKay-Neal codes are studied as specific examples of LDPC codes. It is shown that basic properties of these codes known for particular channels, including their potential to saturate Shannon's bound, hold for general symmetric channels. The binary-input additive-white-Gaussian-noise channel and the binary-input Laplace channel are considered as specific channel models.

  8. Application and analysis of performance of DQPSK advanced modulation format in spectral amplitude coding OCDMA

    International Nuclear Information System (INIS)

    Memon, A.

    2015-01-01

    SAC (Spectral Amplitude Coding) is a technique of OCDMA (Optical Code Division Multiple Access) to encode and decode data bits by utilizing spectral components of the broadband source. Usually the OOK (On-Off Keying) modulation format is used in this encoding scheme. To make the SAC OCDMA network spectrally efficient, the advanced modulation format DQPSK (Differential Quaternary Phase Shift Keying) is applied, simulated and analyzed. An m-sequence code is encoded in the simulated setup. Performance for various lengths of the m-sequence code is also analyzed and displayed in pictorial form. The results of the simulation are evaluated with the help of the electrical constellation diagram, eye diagram and bit-error-rate graph. All the graphs indicate better transmission quality when the advanced DQPSK modulation format is used in the SAC OCDMA network, as compared with OOK. (author)
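    The record above uses m-sequence codes as the spreading codes in the SAC-OCDMA setup. As an illustrative aside (not taken from the paper), a maximal-length sequence can be generated with a linear-feedback shift register; the tap choice below corresponds to the primitive polynomial x^3 + x + 1 and yields the characteristic period 2^n - 1 = 7:

```python
def m_sequence(taps, n):
    """Generate one period of a maximal-length (m-)sequence from an
    n-stage LFSR with the given feedback taps (1-indexed stage numbers)."""
    state = [1] * n                # any nonzero seed works
    out = []
    for _ in range(2 ** n - 1):    # an m-sequence has period 2^n - 1
        out.append(state[-1])      # output the last stage
        fb = 0
        for t in taps:             # feedback = XOR of the tapped stages
            fb ^= state[t - 1]
        state = [fb] + state[:-1]  # shift the register
    return out

# 3-stage register, primitive polynomial x^3 + x + 1 (taps at stages 3 and 1)
seq = m_sequence([3, 1], 3)
print(seq)   # one full period of length 7
```

    Any nonzero seed produces a cyclic shift of the same sequence, and the sequence is balanced (exactly 2^(n-1) ones), the property that makes m-sequences attractive as spreading codes.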

  9. PORST: a computer code to analyze the performance of retrofitted steam turbines

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C.; Hwang, I.T.

    1980-09-01

    The computer code PORST was developed to analyze the performance of a retrofitted steam turbine that is converted from a single generating to a cogenerating unit for purposes of district heating. Two retrofit schemes are considered: one converts a condensing turbine to a backpressure unit; the other allows the crossover extraction of steam between turbine cylinders. The code can analyze the performance of a turbine operating at: (1) valve-wide-open condition before retrofit, (2) partial load before retrofit, (3) valve-wide-open after retrofit, and (4) partial load after retrofit.

  10. Introduction into scientific work methods-a necessity when performance-based codes are introduced

    DEFF Research Database (Denmark)

    Dederichs, Anne; Sørensen, Lars Schiøtt

    The introduction of performance-based codes in Denmark in 2004 requires new competences from people working with different aspects of fire safety in the industry and the public sector. This abstract presents an attempt at reducing problems with handling and analysing the mathematical methods...... and CFD models when applying performance-based codes. This is done within the educational program "Master of Fire Safety Engineering" at the department of Civil Engineering at the Technical University of Denmark. It was found that the students had general problems with academic methods. Therefore, a new...

  11. Performance Analysis of Spectral Amplitude Coding Based OCDMA System with Gain and Splitter Mismatch

    Science.gov (United States)

    Umrani, Fahim A.; Umrani, A. Waheed; Umrani, Naveed A.; Memon, Kehkashan A.; Kalwar, Imtiaz Hussain

    2013-09-01

    This paper presents a practical analysis of optical code-division multiple-access (O-CDMA) systems based on perfect difference codes. The work carried out uses an SNR criterion to select the optimal value of avalanche photodiode (APD) gain and shows how a mismatch in the splitters and the gains of the APDs used in the transmitters and receivers of the network can degrade the BER performance of the system. The investigations also reveal that higher APD gains are not suitable for such systems, even at higher powers. The system performance, with consideration of shot noise, thermal noise, and bulk and surface leakage currents, is also investigated.

  12. Development of a general coupling interface for the fuel performance code TRANSURANUS – Tested with the reactor dynamics code DYN3D

    International Nuclear Information System (INIS)

    Holt, L.; Rohde, U.; Seidl, M.; Schubert, A.; Van Uffelen, P.; Macián-Juan, R.

    2015-01-01

    Highlights: • A general coupling interface was developed for couplings of the TRANSURANUS code. • With this new tool simplified fuel behavior models in codes can be replaced. • Applicable e.g. for several reactor types and from normal operation up to DBA. • The general coupling interface was applied to the reactor dynamics code DYN3D. • The new coupled code system DYN3D–TRANSURANUS was successfully tested for RIA. - Abstract: A general interface is presented for coupling the TRANSURANUS fuel performance code with thermal hydraulics system, sub-channel thermal hydraulics, computational fluid dynamics (CFD) or reactor dynamics codes. As first application the reactor dynamics code DYN3D was coupled at assembly level in order to describe the fuel behavior in more detail. In the coupling, DYN3D provides process time, time-dependent rod power and thermal hydraulics conditions to TRANSURANUS, which in case of the two-way coupling approach transfers parameters like fuel temperature and cladding temperature back to DYN3D. Results of the coupled code system are presented for the reactivity transient scenario, initiated by control rod ejection. More precisely, the two-way coupling approach systematically calculates higher maximum values for the node fuel enthalpy. These differences can be explained thanks to the greater detail in fuel behavior modeling. The numerical performance for DYN3D–TRANSURANUS was proved to be fast and stable. The coupled code system can therefore improve the assessment of safety criteria, at a reasonable computational cost
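    The data exchange described above (the reactor dynamics code supplies process time, rod power and thermal-hydraulics conditions; the fuel performance code returns fuel and cladding temperatures) can be sketched as an explicit operator-splitting loop. All names and the toy physics below are illustrative assumptions, not the actual DYN3D or TRANSURANUS interfaces:

```python
from dataclasses import dataclass

@dataclass
class RodState:
    power: float         # linear rod power from the neutronics side (W/m)
    coolant_temp: float  # thermal-hydraulics boundary condition (K)
    fuel_temp: float     # returned by the fuel-performance side (K)
    clad_temp: float

def fuel_performance_step(state: RodState, dt: float) -> RodState:
    """Stand-in for a TRANSURANUS-like call: given power and coolant
    conditions, update fuel and cladding temperatures (toy model)."""
    state.fuel_temp = state.coolant_temp + 0.02 * state.power
    state.clad_temp = state.coolant_temp + 0.005 * state.power
    return state

def reactor_dynamics_step(state: RodState, t: float) -> RodState:
    """Stand-in for a DYN3D-like call: update rod power; in a two-way
    coupling the feedback term uses the fuel temperature sent back."""
    feedback = 1.0 - 1.0e-5 * (state.fuel_temp - 900.0)
    state.power = 20000.0 * (1.0 + 0.1 * t) * feedback
    return state

state = RodState(power=20000.0, coolant_temp=560.0, fuel_temp=900.0, clad_temp=600.0)
t, dt = 0.0, 0.1
for _ in range(10):                           # explicit coupling loop
    state = reactor_dynamics_step(state, t)   # neutronics -> power
    state = fuel_performance_step(state, dt)  # fuel code -> temperatures fed back
    t += dt
print(state.fuel_temp > state.clad_temp)
```

    The two-way variant differs from the one-way one only in that the temperatures computed by the fuel step feed the next dynamics step, which is why the abstract reports systematically higher fuel enthalpies for it.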

  13. Application of advanced validation concepts to oxide fuel performance codes: LIFE-4 fast-reactor and FRAPCON thermal-reactor fuel performance codes

    Energy Technology Data Exchange (ETDEWEB)

    Unal, C., E-mail: cu@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Williams, B.J. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Yacout, A. [Argonne National Laboratory, 9700 S. Cass Avenue, Lemont, IL 60439 (United States); Higdon, D.M. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States)

    2013-10-15

    Highlights: ► The application of advanced validation techniques (sensitivity, calibration and prediction) to nuclear performance codes FRAPCON and LIFE-4 is the focus of the paper. ► A sensitivity ranking methodology narrows down the number of selected modeling parameters from 61 to 24 for FRAPCON and from 69 to 35 for LIFE-4. ► Fuel creep, fuel thermal conductivity, fission gas transport/release, crack/boundary, and fuel gap conductivity models of LIFE-4 are identified for improvements. ► FRAPCON sensitivity results indicated the importance of the fuel thermal conduction and the fission gas release models. -- Abstract: Evolving nuclear energy programs expect to use enhanced modeling and simulation (M and S) capabilities, using multiscale, multiphysics modeling approaches, to reduce both cost and time from the design through the licensing phases. Interest in the development of the multiscale, multiphysics approach has increased in the last decade because of the need for predictive tools for complex interacting processes as a means of eliminating the limited use of empirically based model development. Complex interacting processes cannot be predicted by analyzing each individual component in isolation. In most cases, the mathematical models of complex processes and their boundary conditions are nonlinear. As a result, the solutions of these mathematical models often require high-performance computing capabilities and resources. The use of multiscale, multiphysics (MS/MP) models in conjunction with high-performance computational software and hardware introduces challenges in validating these predictive tools—traditional methodologies will have to be modified to address these challenges. The advanced MS/MP codes for nuclear fuels and reactors are being developed within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program of the US Department of Energy (DOE) – Nuclear Energy (NE). This paper does not directly address challenges in calibration
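    A sensitivity ranking like the one described above can be illustrated with a simple one-at-a-time scheme: perturb each parameter around its nominal value and rank parameters by the magnitude of the model response. This is a deliberately minimal sketch (the paper's actual methodology is statistical and far more elaborate), and the model and parameter names are invented for illustration:

```python
def rank_parameters(model, nominal, rel_perturbation=0.05):
    """One-at-a-time sensitivity ranking: perturb each parameter by a
    fixed relative amount and rank by the magnitude of the response change."""
    base = model(nominal)
    scores = {}
    for name, value in nominal.items():
        bumped = dict(nominal)
        bumped[name] = value * (1.0 + rel_perturbation)
        scores[name] = abs(model(bumped) - base)
    return sorted(scores, key=scores.get, reverse=True)

# Toy response: strongly driven by thermal conductivity, weakly by creep rate
def fuel_temp_model(p):
    return 1000.0 / p["conductivity"] + 2.0 * p["fgr_diffusivity"] + 0.1 * p["creep_rate"]

nominal = {"conductivity": 3.0, "fgr_diffusivity": 5.0, "creep_rate": 1.0}
ranking = rank_parameters(fuel_temp_model, nominal)
print(ranking)   # most influential parameter first
```

    Dropping the lowest-ranked parameters is how a list of 61 or 69 candidates gets narrowed to the couple of dozen retained for calibration.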

  14. Performance of the dot product function in radiative transfer code SORD

    Science.gov (United States)

    Korkin, Sergey; Lyapustin, Alexei; Sinyuk, Aliaksandr; Holben, Brent

    2016-10-01

    The successive orders of scattering radiative transfer (RT) codes frequently call the scalar (dot) product function. In this paper, we study the performance of several implementations of the dot product in the RT code SORD, using 50 scenarios for light scattering in the atmosphere-surface system. In the dot product function, we use the unrolled-loops technique with different unrolling factors. We also consider the intrinsic Fortran functions. We show results for two machines: the ifort compiler under Windows, and pgf90 under Linux. The intrinsic DOT_PRODUCT function showed the best performance for ifort. For pgf90, the dot product implemented with unrolling factor 4 was the fastest. The RT code SORD, together with the interface that runs all the mentioned tests, is publicly available from ftp://maiac.gsfc.nasa.gov/pub/skorkin/SORD_IP_16B (current release) or by email request from the corresponding (first) author.
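    The unrolled-loops technique mentioned in the abstract can be sketched as follows. In compiled Fortran or C the unrolling exposes independent accumulators to the instruction pipeline; the Python version below only illustrates the structure (a main body with unrolling factor 4 plus a cleanup loop for the remainder):

```python
def dot_unrolled4(a, b):
    """Dot product with the loop unrolled by a factor of 4, plus a
    cleanup pass for the remaining (< 4) elements."""
    n = len(a)
    s0 = s1 = s2 = s3 = 0.0   # four independent accumulators
    i = 0
    while i + 3 < n:          # main unrolled body
        s0 += a[i]     * b[i]
        s1 += a[i + 1] * b[i + 1]
        s2 += a[i + 2] * b[i + 2]
        s3 += a[i + 3] * b[i + 3]
        i += 4
    tail = sum(a[j] * b[j] for j in range(i, n))   # remainder
    return s0 + s1 + s2 + s3 + tail

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [5.0, 4.0, 3.0, 2.0, 1.0]
print(dot_unrolled4(x, y))   # 35.0
```

    Note that splitting the sum across accumulators can change floating-point rounding relative to a straight loop or to DOT_PRODUCT, which is one reason such codes benchmark the variants rather than assume equivalence.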

  15. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Science.gov (United States)

    DeTar, Carleton; Gottlieb, Steven; Li, Ruizi; Toussaint, Doug

    2018-03-01

    With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  16. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Directory of Open Access Journals (Sweden)

    DeTar Carleton

    2018-01-01

    Full Text Available With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  17. Improving 3D-Turbo Code's BER Performance with a BICM System over Rayleigh Fading Channel

    Directory of Open Access Journals (Sweden)

    R. Yao

    2016-12-01

    Full Text Available Classical Turbo code suffers from a high error floor due to its small Minimum Hamming Distance (MHD. Newly-proposed 3D-Turbo code can effectively increase the MHD and achieve a lower error floor by adding a rate-1 post encoder. In 3D-Turbo codes, part of the parity bits from the classical Turbo encoder are further encoded through the post encoder. In this paper, a novel Bit-Interleaved Coded Modulation (BICM system is proposed by combining rotated mapping Quadrature Amplitude Modulation (QAM and 3D-Turbo code to improve the Bit Error Rate (BER performance of 3D-Turbo code over the Rayleigh fading channel. A key-bit protection scheme and a Two-Dimension (2D iterative soft demodulating-decoding algorithm are developed for the proposed BICM system. Simulation results show that the proposed system can obtain about 0.8-1.0 dB gain at a BER of 10^{-6}, compared with the existing BICM system with Gray mapping QAM.

  18. Development and validation of the ENIGMA code for MOX fuel performance modelling

    International Nuclear Information System (INIS)

    Palmer, I.; Rossiter, G.; White, R.J.

    2000-01-01

    The ENIGMA fuel performance code has been under development in the UK since the mid-1980s with contributions made by both the fuel vendor (BNFL) and the utility (British Energy). In recent years it has become the principal code for UO2 fuel licensing for both PWR and AGR reactor systems in the UK and has also been used by BNFL in support of overseas UO2 and MOX fuel business. A significant new programme of work has recently been initiated by BNFL to further develop the code specifically for MOX fuel application. Model development is proceeding hand in hand with a major programme of MOX fuel testing and PIE studies, with the objective of producing a fuel modelling code suitable for mechanistic analysis, as well as for licensing applications. This paper gives an overview of the model developments being undertaken and of the experimental data being used to underpin and to validate the code. The paper provides a summary of the code development programme together with specific examples of new models produced. (author)

  19. Constraining performance assessment models with tracer test results: a comparison between two conceptual models

    Science.gov (United States)

    McKenna, Sean A.; Selroos, Jan-Olof

    Tracer tests are conducted to ascertain solute transport parameters of a single rock feature over a 5-m transport pathway. Two different conceptualizations of double-porosity solute transport provide estimates of the tracer breakthrough curves. One of the conceptualizations (single-rate) employs a single effective diffusion coefficient in a matrix with infinite penetration depth. However, the tracer retention between different flow paths can vary as the ratio of flow-wetted surface to flow rate differs between the path lines. The other conceptualization (multirate) employs a continuous distribution of multiple diffusion rate coefficients in a matrix with variable, yet finite, capacity. Application of these two models with the parameters estimated on the tracer test breakthrough curves produces transport results that differ by orders of magnitude in peak concentration and time to peak concentration at the performance assessment (PA) time and length scales (100,000 years and 1,000 m). These differences are examined by calculating the time limits for the diffusive capacity to act as an infinite medium. These limits are compared across both conceptual models and also against characteristic times for diffusion at both the tracer test and PA scales. Additionally, the differences between the models are examined by re-estimating parameters for the multirate model from the traditional double-porosity model results at the PA scale. Results indicate that for each model the amount of the diffusive capacity that acts as an infinite medium over the specified time scale explains the differences between the model results and that tracer tests alone cannot provide reliable estimates of transport parameters for the PA scale. Results of Monte Carlo runs of the transport models with varying travel times and path lengths show consistent results between models and suggest that the variation in flow-wetted surface to flow rate along path lines is insignificant relative to variability in

  20. Performance Comparison of Containment PT analysis between CAP and CONTEMPT Code

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Yeon Jun; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul [FNC Tech., Seoul (Korea, Republic of); Ha, Sang Jun; Choi, Hoon [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    CAP, in the form that is linked with SPACE, computed the containment back-pressure during a LOCA. In the previous SAR (safety analysis report) for Shin-Kori Units 3 and 4, the CONTEMPT series of codes (hereby referred to as just 'CONTEMPT') is used to evaluate the containment safety during the postulated loss-of-coolant accident (LOCA). In more detail, CONTEMPT-LT/028 was used to calculate the containment maximum PT, while CONTEMPT4/MOD5 calculated the minimum PT. In the minimum PT analysis, CONTEMPT4/MOD5, which provides the back-pressure condition of the containment, was linked with RELAP5/MOD3.3, which calculates the amount of blowdown into the containment. In this analysis, CONTEMPT4/MOD5 was modified based on KREM. The CONTEMPT code was developed to predict the long-term behavior of water-cooled nuclear reactor containment systems subjected to LOCA conditions. It calculates the time variation of compartment pressures, temperatures, mass and energy inventories, heat structure temperature distributions, energy exchange with adjacent compartments, and the effect of leakage on containment response. Models are provided for the fan cooler and cooling spray as engineered safety systems. Any compartment may have both a liquid pool region and an air-vapor atmosphere region above the pool. Each region is assumed to have a uniform temperature, but the temperatures of the two regions may be different. As mentioned above, CONTEMPT has similar code features and is therefore expected to show similar analysis performance to CAP. In this study, the differences between CAP and two CONTEMPT code versions (CONTEMPT-LT/028 for maximum PT and CONTEMPT4/MOD5 for minimum PT) are identified in detail and the code performances were compared for the same problem. A code-by-code comparison was carried out to identify the differences in LOCA analysis between the CONTEMPT series and the CAP code. With regard to important factors that affect the transient behavior of compartment thermodynamic

  1. Performance Comparison of Containment PT analysis between CAP and CONTEMPT Code

    International Nuclear Information System (INIS)

    Choo, Yeon Jun; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2013-01-01

    CAP, in the form that is linked with SPACE, computed the containment back-pressure during a LOCA. In the previous SAR (safety analysis report) for Shin-Kori Units 3 and 4, the CONTEMPT series of codes (hereby referred to as just 'CONTEMPT') is used to evaluate the containment safety during the postulated loss-of-coolant accident (LOCA). In more detail, CONTEMPT-LT/028 was used to calculate the containment maximum PT, while CONTEMPT4/MOD5 calculated the minimum PT. In the minimum PT analysis, CONTEMPT4/MOD5, which provides the back-pressure condition of the containment, was linked with RELAP5/MOD3.3, which calculates the amount of blowdown into the containment. In this analysis, CONTEMPT4/MOD5 was modified based on KREM. The CONTEMPT code was developed to predict the long-term behavior of water-cooled nuclear reactor containment systems subjected to LOCA conditions. It calculates the time variation of compartment pressures, temperatures, mass and energy inventories, heat structure temperature distributions, energy exchange with adjacent compartments, and the effect of leakage on containment response. Models are provided for the fan cooler and cooling spray as engineered safety systems. Any compartment may have both a liquid pool region and an air-vapor atmosphere region above the pool. Each region is assumed to have a uniform temperature, but the temperatures of the two regions may be different. As mentioned above, CONTEMPT has similar code features and is therefore expected to show similar analysis performance to CAP. In this study, the differences between CAP and two CONTEMPT code versions (CONTEMPT-LT/028 for maximum PT and CONTEMPT4/MOD5 for minimum PT) are identified in detail and the code performances were compared for the same problem. A code-by-code comparison was carried out to identify the differences in LOCA analysis between the CONTEMPT series and the CAP code. With regard to important factors that affect the transient behavior of the compartment thermodynamic state in

  2. Modification in the FUDA computer code to predict fuel performance at high burnup

    Energy Technology Data Exchange (ETDEWEB)

    Das, M; Arunakumar, B V; Prasad, P N [Nuclear Power Corp., Mumbai (India)

    1997-08-01

    The computer code FUDA (FUel Design Analysis) participated in the blind exercises organized by the IAEA CRP (Co-ordinated Research Programme) on FUMEX (Fuel Modelling at Extended Burnup). While the code prediction compared well with the experiments at Halden under various parametric and operating conditions, the fission gas release and fission gas pressure were found to be slightly over-predicted, particularly at high burnups. In view of the results of 6 FUMEX cases, the main models and submodels of the code were reviewed and necessary improvements were made. The new version of the code, FUDA MOD 2, is now able to predict fuel performance parameters for burn-ups up to 50000 MWD/TeU. The validation field of the code has been extended to prediction of thorium oxide fuel performance. An analysis of local deformations at pellet interfaces and near the end caps is carried out considering the hourglassing of the pellet by the finite element technique. (author). 15 refs, 1 fig.

  3. Steady State and Transient Fuel Rod Performance Analyses by Pad and Transuranus Codes

    International Nuclear Information System (INIS)

    Slyeptsov, O.; Slyeptsov, S.; Kulish, G.; Ostapov, A.; Chernov, I.

    2013-01-01

    The report, performed under IAEA research contract No. 15370/L2, describes the analysis results of WWER and PWR fuel rod performance at steady-state operation and during transients by means of the PAD and TRANSURANUS codes. The code TRANSURANUS v1m1j09, developed by the Institute for Transuranium Elements (ITU), was used based on Licensing Agreement N31302. The code PAD 4.0, developed by Westinghouse Electric Company, was utilized in the framework of the Ukraine Nuclear Fuel Qualification Project for the safety substantiation of the use of Westinghouse fuel assemblies in the mixed core of a WWER-1000 reactor. The experimental data for Russian fuel rod behavior, obtained during steady-state operation in the WWER-440 core of reactor Kola-3 and during power transients in the core of the MIR research reactor, were taken from the IFPE database of the OECD/NEA and utilized for assessing the codes themselves during simulation of such properties as fuel burnup, fuel centerline temperature (FCT), fuel swelling, cladding strain, fission gas release (FGR) and rod internal pressure (RIP) in the rod burnup range of (41 - 60) GWD/MTU. The experimental data on fuel behavior at steady-state operation during seven reactor cycles, presented by AREVA for the standard PWR fuel rod design, were used to examine the code FGR model in the fuel burnup range of (37 - 81) GWD/MTU. (author)

  4. Modification in the FUDA computer code to predict fuel performance at high burnup

    International Nuclear Information System (INIS)

    Das, M.; Arunakumar, B.V.; Prasad, P.N.

    1997-01-01

    The computer code FUDA (FUel Design Analysis) participated in the blind exercises organized by the IAEA CRP (Co-ordinated Research Programme) on FUMEX (Fuel Modelling at Extended Burnup). While the code prediction compared well with the experiments at Halden under various parametric and operating conditions, the fission gas release and fission gas pressure were found to be slightly over-predicted, particularly at high burnups. In view of the results of 6 FUMEX cases, the main models and submodels of the code were reviewed and necessary improvements were made. The new version of the code, FUDA MOD 2, is now able to predict fuel performance parameters for burn-ups up to 50000 MWD/TeU. The validation field of the code has been extended to prediction of thorium oxide fuel performance. An analysis of local deformations at pellet interfaces and near the end caps is carried out considering the hourglassing of the pellet by the finite element technique. (author). 15 refs, 1 fig

  5. Sensitivity Analysis of FEAST-Metal Fuel Performance Code: Initial Results

    International Nuclear Information System (INIS)

    Edelmann, Paul Guy; Williams, Brian J.; Unal, Cetin; Yacout, Abdellatif

    2012-01-01

    This memo documents the completion of the LANL milestone, M3FT-12LA0202041, describing methodologies and initial results using FEAST-Metal. The FEAST-Metal code calculations for this work are being conducted at LANL in support of on-going activities related to sensitivity analysis of fuel performance codes. The objective is to identify important macroscopic parameters of interest to modeling and simulation of metallic fuel performance. This report summarizes our preliminary results for the sensitivity analysis using 6 calibration datasets for metallic fuel developed at ANL for EBR-II experiments. Sensitivity ranking methodology was deployed to narrow down the selected parameters for the current study. There are approximately 84 calibration parameters in the FEAST-Metal code, of which 32 were ultimately used in Phase II of this study. Preliminary results of this sensitivity analysis led to the following ranking of FEAST models for future calibration and improvements: fuel conductivity, fission gas transport/release, fuel creep, and precipitation kinetics. More validation data is needed to validate calibrated parameter distributions for future uncertainty quantification studies with FEAST-Metal. Results of this study also served to point out some code deficiencies and possible errors, and these are being investigated in order to determine root causes and to improve upon the existing code models.

  6. Performance analysis of a decoding algorithm for algebraic-geometry codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Jensen, Helge Elbrønd; Nielsen, Rasmus Refslund

    1999-01-01

    The fast decoding algorithm for one-point algebraic-geometry codes of Sakata, Elbrønd Jensen, and Høholdt corrects all error patterns of weight less than half the Feng-Rao minimum distance. In this correspondence we analyze the performance of the algorithm for heavier error patterns. It turns out...

  7. Reliability in the performance-based concept of fib Model Code 2010

    NARCIS (Netherlands)

    Bigaj-van Vliet, A.; Vrouwenvelder, T.

    2013-01-01

    The design philosophy of the new fib Model Code for Concrete Structures 2010 represents the state of the art with regard to performance-based approach to the design and assessment of concrete structures. Given the random nature of quantities determining structural behaviour, the assessment of

  8. User's manual for the vertical axis wind turbine performance computer code DARTER

    Energy Technology Data Exchange (ETDEWEB)

    Klimas, P. C.; French, R. E.

    1980-05-01

    The computer code DARTER (DARrieus Turbine, Elemental Reynolds number) is an aerodynamic performance/loads prediction scheme based upon the conservation of momentum principle. It is the latest evolution in a sequence which began with a model developed by Templin of NRC, Canada and progressed through the Sandia National Laboratories-developed SIMOSS (SImple MOmentum, Single Streamtube) and DART (DARrieus Turbine) to DARTER.

  9. Performance of super-orthogonal space-time trellis code in a multipath environment

    CSIR Research Space (South Africa)

    Sokoya, OA

    2007-09-01

    Full Text Available This paper investigates the performance of a Super-Orthogonal Space-Time Trellis Code (SOSTTC) designed primarily for a non-frequency-selective (i.e. flat) fading channel but now applied to a frequency-selective fading channel. A new decoding trellis...

  10. Performance of asynchronous fiber-optic code division multiple access system based on three-dimensional wavelength/time/space codes and its link analysis.

    Science.gov (United States)

    Singh, Jaswinder

    2010-03-10

    A novel family of three-dimensional (3-D) wavelength/time/space codes for asynchronous optical code-division multiple-access (CDMA) systems with "zero" off-peak autocorrelation and "unity" cross-correlation is reported. Antipodal signaling and differential detection are employed in the system. A maximum of [(W x T + 1) x W] codes is generated for unity cross-correlation, where W and T are the number of wavelengths and time chips used in the code and are prime. The conditions for violation of the cross-correlation constraint are discussed. The expressions for the number of generated codes are determined for various code dimensions. It is found that the maximum number of codes is generated for S systems. The codes have a code-set-size to code-size ratio greater than W/S. For instance, with a code size of 2065 (59 x 7 x 5), a total of 12,213 users can be supported, with 130 simultaneous users at a bit-error rate (BER) of 10^(-9). An arrayed-waveguide-grating-based reconfigurable encoder/decoder design for 2-D implementation of the 3-D codes is presented, so that the need for multiple star couplers and fiber ribbons is eliminated. The hardware requirements of the coders used for various modulation/detection schemes are given. The effect of insertion loss in the coders is shown to be significantly reduced with loss compensation by using an amplifier after encoding. An optical CDMA system for four users is simulated and the results presented show the improvement in performance with the use of loss compensation.
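    As a small arithmetic check on the abstract's figures (our reading, not stated in the paper): the bound [(W x T + 1) x W] evaluated at the quoted dimensions W = 59, T = 7 gives 24,426, exactly twice the 12,213 users quoted; the halving plausibly reflects the antipodal signaling, which pairs each code with its negation.

```python
def max_codes(W, T):
    """Code-set-size bound quoted in the abstract: (W*T + 1) * W,
    with W (wavelengths) and T (time chips) prime."""
    return (W * T + 1) * W

W, T = 59, 7                 # dimensions of the quoted 59 x 7 x 5 example
print(max_codes(W, T))       # 24426
print(max_codes(W, T) // 2)  # 12213, the user count quoted in the abstract
```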

  11. Direct and Conceptual Replications of Burgmer & Englich (2012): Power May Have Little to No Effect on Motor Performance

    Directory of Open Access Journals (Sweden)

    Margaret Cusack

    Full Text Available Burgmer and Englich (2012) have reported that manipulating feelings of power can substantially improve performance on two motor tasks: golf and darts. We conducted two high-powered direct replications of the effects of power on golf, two online conceptual replications using mirror-tracing as a performance measure, and an additional conceptual replication using a cognitive performance measure (word search). Overall, we found little to no effect of power on motor skill (d = 0.09, 95% CI [-0.07, 0.22], n = 603). We varied task difficulty, re-analyzed data without participants showing weak responses on manipulation checks, and tried adjusting performance scores for age, gender, and initial task skill. None of these secondary analyses revealed a strong effect of power on performance. A meta-analysis integrating our data with Burgmer & Englich leaves open the possibility that manipulating power could provide a modest boost in motor skill (d = 0.19, 95% CI [0.001, 0.38], n = 685). Unfortunately, the pattern of performance changes we observed was unrelated to group differences in perceived and rated power, suggesting that what motor effects do occur with this protocol may not be directly related to the construct of power. [Burgmer, P., & Englich, B. (2012). Bullseye!: How Power Improves Motor Performance. Social Psychological and Personality Science, 4(2), 224-232.]
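    A meta-analysis of the kind reported above can be illustrated with inverse-variance (fixed-effect) pooling, recovering each study's standard error from its 95% confidence interval. This is a generic sketch with made-up study values, not the authors' actual analysis:

```python
def se_from_ci(lower, upper, z=1.96):
    """Approximate standard error of an effect size from its 95% CI."""
    return (upper - lower) / (2.0 * z)

def fixed_effect_pool(effects):
    """Inverse-variance weighted (fixed-effect) pooled effect size.
    `effects` is a list of (d, se) pairs; weights are 1/se^2."""
    weights = [1.0 / se ** 2 for _, se in effects]
    return sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)

# Two hypothetical studies (the second d and CI are invented for illustration)
studies = [(0.09, se_from_ci(-0.07, 0.22)),
           (0.45, se_from_ci(0.05, 0.85))]
print(round(fixed_effect_pool(studies), 3))
```

    The tighter CI of the first study gives it a much larger weight, so the pooled estimate lands close to 0.09 rather than midway between the two effects.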

  12. A fast and compact Fuel Rod Performance Simulator code for predictive, interpretive and educational purpose

    International Nuclear Information System (INIS)

    Lorenzen, J.

    1990-01-01

    A new Fuel Rod Performance Simulator code, FRPS, has been developed, tested and benchmarked, and is now available in different versions. The user may choose between the batch version INTERPIN, producing results in the form of listings or predefined plots, and the interactive simulator code SIMSIM, which steps through a power history under the control of the user. Both versions presently run on minicomputers and PCs using EGA graphics. A third version is the implementation in a Studsvik Compact Simulator, with FRPS being one of its various modules and receiving the dynamic inputs from the simulator.

  13. Reactivity Insertion Accident (RIA) Capability Status in the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Folsom, Charles Pearson [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pastore, Giovanni [Idaho National Lab. (INL), Idaho Falls, ID (United States); Veeraraghavan, Swetha [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-05-01

    One of the Challenge Problems being considered within CASL relates to modelling and simulation of Light Water Reactor (LWR) fuel under Reactivity Insertion Accident (RIA) conditions. BISON is the fuel performance code used within CASL for LWR fuel under both normal operating and accident conditions, and thus must be capable of addressing the RIA challenge problem. This report outlines the BISON capabilities required for RIAs and describes the current status of the code. Information on recent accident capability enhancements, application of BISON to a RIA benchmark exercise, and plans for validation against RIA behavior are included.

  14. System Performance of Concatenated STBC and Block Turbo Codes in Dispersive Fading Channels

    Directory of Open Access Journals (Sweden)

    Kam Tai Chan

    2005-05-01

    Full Text Available A new scheme concatenating the block turbo code (BTC) with the space-time block code (STBC) for an OFDM system in dispersive fading channels is investigated in this paper. The good error-correcting capability of BTC and the large diversity gain characteristics of STBC can be achieved simultaneously. The resulting receiver outperforms the iterative convolutional turbo receiver with the maximum-a-posteriori-probability expectation-maximization (MAP-EM) algorithm. Because of its ability to perform the encoding and decoding processes in parallel, the proposed system is easy to implement in real time.

  15. Performance of an Error Control System with Turbo Codes in Powerline Communications

    Directory of Open Access Journals (Sweden)

    Balbuena-Campuzano Carlos Alberto

    2014-07-01

    Full Text Available This paper reports the performance of turbo codes as an error control technique in PLC (Powerline Communications) data transmissions. Computer simulations are used to model data networks based on the channel model classified in the technical literature as indoor, with OFDM (Orthogonal Frequency Division Multiplexing) as the modulation technique. Taking into account the channel, the modulation and the turbo codes, we propose a methodology to minimize the bit error rate (BER) as a function of the average received signal-to-noise ratio (SNR).
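
As a much simplified illustration of the BER-versus-SNR methodology described above, the Monte Carlo loop below estimates the bit error rate of uncoded BPSK over an AWGN channel; it is a sketch only, with no turbo code, OFDM, or powerline channel model.

```python
import math
import random

def ber_bpsk_awgn(snr_db, n_bits=200_000, seed=1):
    """Monte Carlo BER estimate for uncoded BPSK over AWGN at a given Eb/N0 (dB)."""
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10.0)
    sigma = math.sqrt(1.0 / (2.0 * snr))  # noise standard deviation for unit-energy bits
    errors = 0
    for _ in range(n_bits):
        bit = rng.randrange(2)
        symbol = 1.0 if bit else -1.0      # BPSK mapping
        received = symbol + rng.gauss(0.0, sigma)
        errors += (received > 0) != bool(bit)  # hard-decision detection
    return errors / n_bits

ber_0db = ber_bpsk_awgn(0)
ber_6db = ber_bpsk_awgn(6)
```

As expected, the estimated BER falls sharply as the average received SNR increases; a coded system would trace a steeper curve shifted left.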

  16. Performance and Complexity of Tunable Sparse Network Coding with Gradual Growing Tuning Functions over Wireless Networks

    DEFF Research Database (Denmark)

    Garrido, Pablo; Sørensen, Chres Wiant; Roetter, Daniel Enrique Lucani

    2016-01-01

    Random Linear Network Coding (RLNC) has been shown to be a technique with several benefits, in particular when applied over wireless mesh networks, since it provides robustness against packet losses. On the other hand, Tunable Sparse Network Coding (TSNC) is a promising concept, which leverages...... a trade-off between computational complexity and goodput. An optimal density tuning function has not been found yet, due to the lack of a closed-form expression that links density, performance and computational cost. In addition, it would be difficult to implement, due to the feedback delay. In this work...
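
The density/decodability trade-off that TSNC tunes can be illustrated with a toy experiment (a sketch over GF(2), not the TSNC algorithm itself): generate square matrices of random sparse coding coefficients and measure how often they have full rank, i.e. how often a receiver holding that many coded packets could decode.

```python
import random

def gf2_rank(rows):
    """Rank of a binary matrix over GF(2) via Gaussian elimination."""
    rows = [r[:] for r in rows]
    rank = 0
    for col in range(len(rows[0])):
        pivot = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(len(rows)):
            if i != rank and rows[i][col]:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[rank])]
        rank += 1
    return rank

def decode_probability(n_packets, density, trials=600, seed=3):
    """Fraction of trials where n_packets random GF(2) coding vectors
    (each coefficient 1 with probability `density`) are jointly decodable,
    i.e. the coefficient matrix has full rank."""
    rng = random.Random(seed)
    success = 0
    for _ in range(trials):
        matrix = [[1 if rng.random() < density else 0 for _ in range(n_packets)]
                  for _ in range(n_packets)]
        if gf2_rank(matrix) == n_packets:
            success += 1
    return success / trials

dense = decode_probability(16, 0.5)    # dense coding: robust but costly to decode
sparse = decode_probability(16, 0.05)  # very sparse: cheap, but often undecodable
```

Sparse vectors are cheaper to generate and decode but fail to span the space far more often, which is exactly why TSNC grows the density as decoding progresses.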

  17. FEMAXI-III, a computer code for fuel rod performance analysis

    International Nuclear Information System (INIS)

    Ito, K.; Iwano, Y.; Ichikawa, M.; Okubo, T.

    1983-01-01

    This paper presents the method of fuel rod thermal-mechanical performance analysis used in the FEMAXI-III code. The code incorporates models describing thermal-mechanical processes such as pellet-cladding thermal expansion, pellet irradiation swelling, densification, relocation and fission gas release as they affect pellet-cladding gap thermal conductance. The code performs the thermal behavior analysis of a full-length fuel rod within the framework of one-dimensional multi-zone modeling. The mechanical effects, including ridge deformation, are rigorously analyzed by applying the axisymmetric finite element method. The finite element geometrical model is confined to a half-pellet-height region under the assumption that pellet-pellet interaction is symmetrical. Eight-node quadratic isoparametric ring elements are adopted to obtain accurate finite element solutions. Newton-Raphson iteration with an implicit algorithm is applied to analyze non-linear material behaviors accurately and stably. The pellet-cladding interaction mechanism is treated exactly using nodal continuity conditions. The code is applicable to the thermal-mechanical analysis of water reactor fuel rods experiencing variable power histories. (orig.)
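
The Newton-Raphson iteration mentioned above, which FEMAXI-III applies to full nonlinear finite element systems, reduces in the scalar case to the familiar update sketched below. The "hardening stiffness" equation is a made-up toy problem for illustration, not FEMAXI's constitutive model.

```python
def newton_raphson(residual, jacobian, u0, tol=1e-10, max_iter=50):
    """Scalar Newton-Raphson: find u such that residual(u) = 0."""
    u = u0
    for _ in range(max_iter):
        r = residual(u)
        if abs(r) < tol:
            return u
        u -= r / jacobian(u)  # Newton update: u_{k+1} = u_k - r / (dr/du)
    raise RuntimeError("Newton-Raphson did not converge")

# Toy nonlinear 'material': stiffness hardens with displacement.
# Solve (1 + u**2) * u = 2, i.e. residual r(u) = (1 + u**2) * u - 2.
res = lambda u: (1 + u**2) * u - 2.0
jac = lambda u: 1 + 3 * u**2   # analytic derivative dr/du
u = newton_raphson(res, jac, 0.0)
```

Starting from u = 0, the iteration converges quadratically to the root u = 1, where (1 + 1) * 1 = 2.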

  18. Assessment of stainless steel 348 fuel rod performance against literature available data using TRANSURANUS code

    Directory of Open Access Journals (Sweden)

    Giovedi Claudia

    2016-01-01

    Full Text Available Early pressurized water reactors were originally designed to operate with stainless steel as the cladding material, but during their lifetime this material was replaced by zirconium-based alloys. However, after the Fukushima Daiichi accident, the problems with zirconium-based alloys related to hydrogen production and explosion under severe accident conditions highlighted the importance of assessing different materials. In this sense, initiatives such as the ATF (Accident Tolerant Fuel) program are considering different materials as fuel cladding, and one candidate is an iron-based alloy. In order to assess the performance of fuel rods manufactured using an iron-based alloy as cladding material, it was necessary to select a specific stainless steel (type 348) and properly modify conventional fuel performance codes developed over the last decades. Accordingly, 348 stainless steel mechanical and physical properties were introduced into the TRANSURANUS code. The aim of this paper is to present the results obtained in verifying the modified TRANSURANUS code version against data collected from the open literature on reactors which operated with stainless steel cladding. Because some data were not available, some assumptions had to be made, and important differences relative to conventional fuel rods were taken into account. The results obtained for the cladding behavior are in agreement with the available information. This constitutes evidence of the modified TRANSURANUS code's capability to investigate fuel rods manufactured using 348 stainless steel as cladding material.

  19. Drop performance test of conceptually designed control rod assembly for prototype generation IV sodium-cooled fast reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Kyu; Lee, Jae Han; Kim, Hoe Woong; Kim, Sung Kyun; Kim, Jong Bum [Sodium-cooled Fast Reactor NSSS Design Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2017-06-15

    The control rod assembly controls reactor power by adjusting its position during normal operation and shuts down chain reactions by its free drop under scram conditions. Therefore, the drop performance of the control rod assembly is important for the safety of a nuclear reactor. In this study, the drop performance of the conceptually designed control rod assembly for the prototype generation IV sodium-cooled fast reactor that is being developed at the Korea Atomic Energy Research Institute as a next-generation nuclear reactor was experimentally investigated. For the performance test, the test facility and test procedure were established first, and several free drop performance tests of the control rod assembly under different flow rate conditions were then carried out. Moreover, performance tests under several types and magnitudes of seismic loading conditions were also conducted to investigate the effects of seismic loading on the drop performance of the control rod assembly. The drop time of the conceptually designed control rod assembly for 0% of the tentatively designed flow rate was measured to be 1.527 seconds, and this agrees well with the analytically calculated drop time. It was also observed that the effect of seismic loading on the drop time was not significant.

  20. Reflections on the conceptualization and operationalization of a set-theoretic approach to employee motivation and performance research

    Directory of Open Access Journals (Sweden)

    James Christopher Ryan

    2017-01-01

    Full Text Available The current commentary offers a reflection on the conceptualizations of Lee and Raschke's (2016) proposal for a set-theoretic approach to employee motivation and organizational performance. The commentary is informed by the current author's operationalization of set-theoretic research on employee motivation, which occurred contemporaneously with the work of Lee and Raschke. Observations on the state of current research on employee motivation, the development of motivation theory and future directions of set-theoretic approaches to employee motivation and performance are offered.

  1. CONCEPTUAL FRAMEWORK FOR IMPROVING BUSINESS PERFORMANCE WITH LEAN MANUFACTURING AND SUCCESSFUL HUMAN FACTORS INTERVENTIONS-A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Rajiv Sharm

    2012-09-01

    Full Text Available Nowadays organizations compete between themselves in various categories such as faster delivery, price tags, state-of-the-art technology and higher quality dimensions. A conceptual framework with lean manufacturing and human factors interventions for improving business performance in terms of improved quality, reduced cost and faster delivery is presented, and examples from the literature are given to illustrate the desired situation in which ergonomics is considered an integrated part of performance strategy. A case from an industry engaged in manufacturing shafts using lean manufacturing practices with successful ergonomic or human factors interventions is also investigated.

  2. A ''SuperCode'' for performing systems analysis of tokamak experiments and reactors

    International Nuclear Information System (INIS)

    Haney, S.W.; Barr, W.L.; Crotinger, J.A.; Perkins, L.J.; Solomon, C.J.; Chaniotakis, E.A.; Freidberg, J.P.; Wei, J.; Galambos, J.D.; Mandrekas, J.

    1992-01-01

    A new code, named the ''SUPERCODE,'' has been developed to fill the gap between currently available zero-dimensional systems codes and highly sophisticated, multidimensional plasma performance codes. The former are comprehensive in content and fast to execute, but rather simple in terms of the accuracy of their physics and engineering models. The latter contain state-of-the-art plasma physics modelling but are limited in engineering content and time consuming to run. The SUPERCODE upgrades the reliability and accuracy of systems codes by calculating the self-consistent 1 1/2-dimensional MHD-transport plasma evolution in a realistic engineering environment. By a combination of variational techniques and careful formulation, there is only a modest increase in CPU time over 0-D runs, thereby making the SUPERCODE suitable for use as a systems studies tool. In addition, considerable effort has been expended to make the code user- and programming-friendly, as well as operationally flexible, with the hope of encouraging wide usage throughout the fusion community.

  3. High performance optical encryption based on computational ghost imaging with QR code and compressive sensing technique

    Science.gov (United States)

    Zhao, Shengmei; Wang, Le; Liang, Wenqiang; Cheng, Weiwen; Gong, Longyan

    2015-10-01

    In this paper, we propose a high performance optical encryption (OE) scheme based on computational ghost imaging (GI) with a QR code and the compressive sensing (CS) technique, named the QR-CGI-OE scheme. N random phase screens, generated by Alice, constitute a secret key shared with her authorized user, Bob. The information is first encoded by Alice with a QR code, and the QR-coded image is then encrypted with the aid of a computational ghost imaging optical system. Here, the measurement results from the GI optical system's bucket detector are the encrypted information and are transmitted to Bob. With the key, Bob decrypts the encrypted information to obtain the QR-coded image using the GI and CS techniques, and further recovers the information by QR decoding. The experimental and numerically simulated results show that authorized users can recover the original image completely, whereas eavesdroppers cannot acquire any information about the image even when the eavesdropping ratio (ER) is up to 60% at the given number of measurements. For the proposed scheme, the number of bits sent from Alice to Bob is reduced considerably and the robustness is enhanced significantly. Meanwhile, the number of measurements in the GI system is reduced and the quality of the reconstructed QR-coded image is improved.
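
The correlation step at the heart of computational ghost imaging can be sketched on a 1-D toy object (no QR encoding, phase screens, or compressive sensing here; the object and uniform-random pattern model are invented for illustration): the image is recovered as the covariance between the bucket values and each speckle pixel.

```python
import random

def ghost_image(obj, n_patterns=5000, seed=7):
    """Correlation-based GI reconstruction: G(x) = <B*I(x)> - <B><I(x)>."""
    rng = random.Random(seed)
    n = len(obj)
    sum_b = 0.0
    sum_i = [0.0] * n
    sum_bi = [0.0] * n
    for _ in range(n_patterns):
        pattern = [rng.random() for _ in range(n)]          # computational speckle
        bucket = sum(p * o for p, o in zip(pattern, obj))   # single-pixel detector
        sum_b += bucket
        for k in range(n):
            sum_i[k] += pattern[k]
            sum_bi[k] += bucket * pattern[k]
    m = n_patterns
    return [sum_bi[k] / m - (sum_b / m) * (sum_i[k] / m) for k in range(n)]

obj = [0, 0, 1, 1, 0, 1, 0, 0]   # toy 1-D transmission mask standing in for a QR image
g = ghost_image(obj)
```

Pixels where the object transmits light correlate positively with the bucket signal, so the reconstruction separates transmitting from opaque pixels; CS post-processing, as in the scheme above, reduces the number of patterns this requires.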

  4. SCANAIR a transient fuel performance code Part two: Assessment of modelling capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Georgenthum, Vincent, E-mail: vincent.georgenthum@irsn.fr; Moal, Alain; Marchand, Olivier

    2014-12-15

    Highlights: • The SCANAIR code is devoted to the study of irradiated fuel rod behaviour during RIA. • The paper deals with the status of the code validation for PWR rods. • During the PCMI stage there is a good agreement between calculations and experiments. • The boiling crisis occurrence is rather well predicted. • The code assessment during the boiling crisis still has to be improved. - Abstract: In the frame of their research programmes on fuel safety, the French Institut de Radioprotection et de Sûreté Nucléaire develops the SCANAIR code devoted to the study of irradiated fuel rod behaviour during reactivity initiated accidents. A first paper focused on the detailed modellings and code description. This second paper deals with the status of the code validation for pressurised water reactor rods, performed using the available experimental results. About 60 integral tests carried out in the CABRI and NSRR experimental reactors and 24 separate-effects tests performed in the PATRICIA facility (devoted to the thermal-hydraulics study) have been recalculated and compared to experimental data. During the first stage of the transient, the pellet-clad mechanical interaction phase, there is a good agreement between calculations and experiments: the clad residual elongation and hoop strain of non-failed tests, as well as the failure occurrence and failure enthalpy of failed tests, are correctly calculated. After this first stage, the increase of cladding temperature can lead to the departure from nucleate boiling. During the film boiling regime, the clad can reach a very high temperature (>700 °C). While the boiling crisis occurrence is rather well predicted, the calculation of the clad temperature and the clad hoop strain during this stage still has to be improved.

  5. The grout/glass performance assessment code system (GPACS) with verification and benchmarking

    International Nuclear Information System (INIS)

    Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.

    1994-12-01

    GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) glass performance assessment and many other applications, including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway.

  6. 3D Analysis of Cooling Performance with Loss of Offsite Power Using GOTHIC Code

    International Nuclear Information System (INIS)

    Oh, Kye Min; Heo, Gyun Young; Na, In Sik; Choi, Yu Jung

    2010-01-01

    The GOTHIC code can analyze one-dimensional or multi-dimensional problems for evaluating the cooling performance after a loss of offsite power. The conventional GOTHIC analysis treats the heat transfer between the plant containment and the outside of the fan cooler tubes by modeling the fan cooler and the component cooling water inside the tubes separately in order to analyze the probability of boiling. In this paper, we suggest an approach that reduces this multi-step procedure for analyzing the cooling performance under loss of offsite power, or the heat transfer in geometrically complex structures, to a single step, and we verify its applicability to capturing the heat transfer differences caused by changes in the containment atmosphere humidity by using multiple nodes for the component cooling water in the tubes and the air in the Reactor Containment Fan Cooler, whereas the conventional component model uses only one node.

  7. Design and Performance Assessment of a Conceptual Cover Cap of Near Surface Repository for Short Lived Low and Intermediate Level Radioactive Waste

    Energy Technology Data Exchange (ETDEWEB)

    Asaba, Ruth; Kim, Changlak [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2014-05-15

    The safety of the overall waste disposal system is determined by the performance of its individual components: waste form, waste container, engineered barrier and the host environment. The design of the cover cap helps to prevent percolation of water and to retard radionuclide migration from the disposal facility to the biosphere. The cover of a disposal facility is usually a combination of materials such as sand, gravel, concrete, clay and soil conditioned for vegetation growth. The cover system will be designed using models such as the EPA's Hydrological Evaluation of Landfill Performance (HELP) code. This paper describes a conceptual design of a cover cap for a landfill as the preferred disposal facility for low-level and short-lived intermediate-level radioactive waste in Uganda. The majority of the waste is generated from disused sealed and unsealed radioactive sources arising from medical and industrial applications, and research. Radioactive waste management has raised much concern in both developed and developing countries, and each country has a responsibility to find a solution that prevents contamination of the environment and of humans by radioactive waste. It is important to have thicker soil layers in cap designs so as to support vegetation growth, since activities such as erosion and settlement are expected. The HELP simulations in this study will help demonstrate that it is possible to design a cover cap which can contain radioactive waste packages for hundreds of years, provided proper institutional and performance monitoring schemes are implemented.

  8. Conceptual performance model for deep in situ recycled pavements with cement and bitumen additives

    CSIR Research Space (South Africa)

    Steyn, WJvdM

    2001-10-01

    Full Text Available The objective of this paper is to provide information on a conceptual performance model for pavements subjected to deep in situ recycling, in which cement and foamed bitumen, or cement and bituminous emulsion, have been... performance aspects very different from those of conventional pavements. The description of the pavement structures, their construction and their environment is presented in this paper. The initial results of... are also shown and discussed.

  9. Development and application of the BISON fuel performance code to the analysis of fission gas behaviour

    International Nuclear Information System (INIS)

    Pastore, G.; Hales, J.D.; Novascone, S.R.; Perez, D.M.; Spencer, B.W.; Williamson, R.L.

    2014-01-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that has been under development at Idaho National Laboratory (USA) since 2009. The capabilities of BISON comprise implicit solution of the fully coupled thermo-mechanics and diffusion equations, applicability to a variety of fuel forms, and simulation of both steady-state and transient conditions. The code includes multiphysics constitutive behavior for both fuel and cladding materials, and is designed for efficient use on highly parallel computers. This paper describes the main features of BISON, with emphasis on recent developments in modelling of fission gas behaviour in LWR-UO2 fuel. The code is applied to the simulation of fuel rod irradiation experiments from the OECD/NEA International Fuel Performance Experiments Database. The comparison of the results with the available experimental data of fuel temperature, fission gas release, and cladding diametrical strain during pellet-cladding mechanical interaction is presented, pointing out a promising potential of the BISON code with the new fission gas behaviour model. (authors)

  10. How could the replica method improve accuracy of performance assessment of channel coding?

    Energy Technology Data Exchange (ETDEWEB)

    Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of technology, Yokohama 226-8502 (Japan)], E-mail: kaba@dis.titech.ac.jp

    2009-12-01

    We explore the relation between the techniques of statistical mechanics and information theory for assessing the performance of channel coding. We base our study on a framework developed by Gallager in IEEE Trans. Inform. Theory IT-11, 3 (1965), where the minimum decoding error probability is upper-bounded by an average of a generalized Chernoff's bound over a code ensemble. We show that the resulting bound in the framework can be directly assessed by the replica method, which has been developed in statistical mechanics of disordered systems, whereas in Gallager's original methodology further replacement by another bound utilizing Jensen's inequality is necessary. Our approach associates a seemingly ad hoc restriction with respect to an adjustable parameter for optimizing the bound with a phase transition between two replica symmetric solutions, and can improve the accuracy of performance assessments of general code ensembles including low density parity check codes, although its mathematical justification is still open.
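
For a concrete, textbook instance of the framework discussed above (Gallager's random-coding bound, not the paper's replica calculation), the upper bound P_e <= 2^(-n * max_rho [E0(rho) - rho R]) can be evaluated numerically for a binary symmetric channel with uniform inputs:

```python
import math

def gallager_e0(rho, p):
    """Gallager's E0(rho) for the BSC with crossover probability p, uniform inputs:
    E0 = rho - (1 + rho) * log2(p^(1/(1+rho)) + (1-p)^(1/(1+rho)))."""
    s = 1.0 / (1.0 + rho)
    return rho - (1.0 + rho) * math.log2(p ** s + (1.0 - p) ** s)

def random_coding_bound(n, rate, p, steps=1000):
    """P_e <= 2^(-n * exponent), optimizing rho over a grid on [0, 1]."""
    exponent = max(gallager_e0(k / steps, p) - (k / steps) * rate
                   for k in range(steps + 1))
    return 2.0 ** (-n * exponent)

# Illustrative setting: rate-1/2 ensemble on a BSC(0.05), below capacity,
# so the bound decays exponentially with block length n
bounds = [random_coding_bound(n, 0.5, 0.05) for n in (100, 200, 400)]
```

Since rate 1/2 is below the BSC(0.05) capacity of about 0.71, the optimized exponent is positive and the bound shrinks exponentially as the block length grows; the replica approach described above aims to tighten exactly this kind of ensemble-average bound.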

  11. Performance and Complexity Evaluation of Iterative Receiver for Coded MIMO-OFDM Systems

    Directory of Open Access Journals (Sweden)

    Rida El Chall

    2016-01-01

    Full Text Available Multiple-input multiple-output (MIMO) technology in combination with channel coding is a promising solution for reliable high data rate transmission in future wireless communication systems. However, these technologies pose significant challenges for the design of an iterative receiver. In this paper, an efficient receiver combining soft-input soft-output (SISO) detection based on a low-complexity K-Best (LC-K-Best) decoder with various forward error correction codes, namely an LTE turbo code and an LDPC code, is investigated. We first investigate the convergence behaviors of the iterative MIMO receivers to determine the required inner and outer iterations. The performance of the LC-K-Best based receiver is then evaluated in various LTE channel environments and compared with other MIMO detection schemes. Moreover, the computational complexity of the iterative receiver with different channel coding techniques is evaluated and compared for different modulation orders and coding rates. Simulation results show that the LC-K-Best based receiver achieves satisfactory performance-complexity trade-offs.
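
The breadth-limited tree search behind a K-Best detector can be sketched for a tiny real-valued system. This is a hard-decision toy on a hypothetical 2x2 upper-triangular model y = R s + noise; a practical LC-K-Best decoder additionally produces soft outputs and avoids full sorting.

```python
def k_best_detect(R, y, constellation, K=4):
    """Breadth-first K-Best search for y ~ R @ s, with R upper-triangular.
    Keeps the K lowest-metric partial paths at each tree level."""
    n = len(y)
    paths = [([], 0.0)]                     # (partial symbol vector, accumulated metric)
    for level in range(n - 1, -1, -1):      # detect the last antenna's symbol first
        candidates = []
        for tail, metric in paths:
            for s in constellation:
                symbols = [s] + tail        # symbols for levels level..n-1
                est = sum(R[level][j] * symbols[j - level] for j in range(level, n))
                candidates.append((symbols, metric + (y[level] - est) ** 2))
        candidates.sort(key=lambda c: c[1])
        paths = candidates[:K]              # prune to the K best partial paths
    return paths[0][0]

# Toy noiseless 2x2 real system with a BPSK constellation (illustrative only)
R = [[1.0, 0.4],
     [0.0, 0.9]]
true_s = [1.0, -1.0]
y = [R[0][0] * true_s[0] + R[0][1] * true_s[1], R[1][1] * true_s[1]]
s_hat = k_best_detect(R, y, [-1.0, 1.0], K=2)
```

With no noise the search recovers the transmitted vector exactly; the K parameter trades detection reliability against the complexity that the paper's convergence analysis balances against the outer decoder.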

  12. VINE-A NUMERICAL CODE FOR SIMULATING ASTROPHYSICAL SYSTEMS USING PARTICLES. II. IMPLEMENTATION AND PERFORMANCE CHARACTERISTICS

    International Nuclear Information System (INIS)

    Nelson, Andrew F.; Wetzstein, M.; Naab, T.

    2009-01-01

    We continue our presentation of VINE. In this paper, we begin with a description of relevant architectural properties of the serial and shared memory parallel computers on which VINE is intended to run, and describe their influences on the design of the code itself. We continue with a detailed description of a number of optimizations made to the layout of the particle data in memory and to our implementation of a binary tree used to access that data for use in gravitational force calculations and searches for smoothed particle hydrodynamics (SPH) neighbor particles. We describe the modifications to the code necessary to obtain forces efficiently from special purpose 'GRAPE' hardware, the interfaces required to allow transparent substitution of those forces in the code instead of those obtained from the tree, and the modifications necessary to use both tree and GRAPE together as a fused GRAPE/tree combination. We conclude with an extensive series of performance tests, which demonstrate that the code can be run efficiently and without modification in serial on small workstations or in parallel using the OpenMP compiler directives on large-scale, shared memory parallel machines. We analyze the effects of the code optimizations and estimate that they improve its overall performance by more than an order of magnitude over that obtained by many other tree codes. Scaled parallel performance of the gravity and SPH calculations, which together are the most costly components of most simulations, is nearly linear up to at least 120 processors on moderate sized test problems using the Origin 3000 architecture, and up to the maximum machine sizes available to us on several other architectures. At similar accuracy, performance of VINE in GRAPE-tree mode is approximately a factor of 2 slower than that of VINE in host-only mode. Further optimizations of the GRAPE/host communications could improve the speed by as much as a factor of 3, but have not yet been implemented in VINE.

  13. Validating the Japanese translation of the Force and Motion Conceptual Evaluation and comparing performance levels of American and Japanese students

    Directory of Open Access Journals (Sweden)

    Michi Ishimoto

    2014-08-01

    Full Text Available This study assesses the Japanese translation of the Force and Motion Conceptual Evaluation (FMCE). Researchers are often interested in comparing the conceptual ideas of students with different cultural backgrounds. The FMCE has been useful in identifying the concepts of English-speaking students from different backgrounds. To identify effectively the conceptual ideas of Japanese students and to compare them to those of their English-speaking counterparts, more work is required. Because of differences between the Japanese and English languages, and between the Japanese and American educational systems, it is important to assess the Japanese translation of the FMCE, a conceptual evaluation originally developed in English for American students. To assess its appropriateness, we examined the performance of a large sample of students on the translated version of the FMCE and then compared the results to those of English-speaking students. The data comprise the pretest results of 1095 students, most of whom were first-year students at a midlevel engineering school between 2003 and 2012. Basic statistics and the classical test theory indices of the translated FMCE indicate that its reliability and discrimination are appropriate to assess Japanese students’ concepts about force and motion. In general, the preconcepts of Japanese students assessed with the Japanese translation of the FMCE are quite similar to those of American students assessed with the FMCE, thereby supporting the validity of the translated version. However, our findings do show (1) that only a small percentage of Japanese students grasped Newtonian concepts and (2) that the percentage of Japanese students who used two different concept models together to answer some questions seems to be higher than that of American students.
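
One of the classical test theory indices alluded to above, KR-20 reliability for dichotomous (right/wrong) items, can be computed directly from a response matrix. The data below are a small invented example, not the FMCE sample.

```python
def kr20(responses):
    """Kuder-Richardson 20 reliability for 0/1 item responses.
    responses: one list per student, one 0/1 entry per item."""
    n_items = len(responses[0])
    n_students = len(responses)
    # item difficulty p_i and summed item variance p_i * (1 - p_i)
    p = [sum(r[i] for r in responses) / n_students for i in range(n_items)]
    item_var = sum(pi * (1 - pi) for pi in p)
    # variance of total scores (population variance)
    totals = [sum(r) for r in responses]
    mean_t = sum(totals) / n_students
    total_var = sum((t - mean_t) ** 2 for t in totals) / n_students
    return (n_items / (n_items - 1)) * (1 - item_var / total_var)

# Hypothetical 6 students x 4 items
data = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
]
alpha = kr20(data)
```

For this toy matrix KR-20 comes out at 5/6, i.e. about 0.83; values in this range are conventionally read as adequate reliability for a group-level instrument.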

  14. Validating the Japanese translation of the Force and Motion Conceptual Evaluation and comparing performance levels of American and Japanese students

    Science.gov (United States)

    Ishimoto, Michi; Thornton, Ronald K.; Sokoloff, David R.

    2014-12-01

    This study assesses the Japanese translation of the Force and Motion Conceptual Evaluation (FMCE). Researchers are often interested in comparing the conceptual ideas of students with different cultural backgrounds. The FMCE has been useful in identifying the concepts of English-speaking students from different backgrounds. To identify effectively the conceptual ideas of Japanese students and to compare them to those of their English-speaking counterparts, more work is required. Because of differences between the Japanese and English languages, and between the Japanese and American educational systems, it is important to assess the Japanese translation of the FMCE, a conceptual evaluation originally developed in English for American students. To assess its appropriateness, we examined the performance of a large sample of students on the translated version of the FMCE and then compared the results to those of English-speaking students. The data comprise the pretest results of 1095 students, most of whom were first-year students at a midlevel engineering school between 2003 and 2012. Basic statistics and the classical test theory indices of the translated FMCE indicate that its reliability and discrimination are appropriate to assess Japanese students' concepts about force and motion. In general, the preconcepts of Japanese students assessed with the Japanese translation of the FMCE are quite similar to those of American students assessed with the FMCE, thereby supporting the validity of the translated version. However, our findings do show (1) that only a small percentage of Japanese students grasped Newtonian concepts and (2) that the percentage of Japanese students who used two different concept models together to answer some questions seems to be higher than that of American students.

  15. Performance Measures of Diagnostic Codes for Detecting Opioid Overdose in the Emergency Department.

    Science.gov (United States)

    Rowe, Christopher; Vittinghoff, Eric; Santos, Glenn-Milo; Behar, Emily; Turner, Caitlin; Coffin, Phillip O

    2017-04-01

    Opioid overdose mortality has tripled in the United States since 2000 and opioids are responsible for more than half of all drug overdose deaths, which reached an all-time high in 2014. Opioid overdoses resulting in death, however, represent only a small fraction of all opioid overdose events, and efforts to improve surveillance of this public health problem should include tracking nonfatal overdose events. International Classification of Disease (ICD) diagnosis codes, increasingly used for the surveillance of nonfatal drug overdose events, have not been rigorously assessed for validity in capturing overdose events. The present study aimed to validate the use of ICD, 9th revision, Clinical Modification (ICD-9-CM) codes in identifying opioid overdose events in the emergency department (ED) by examining multiple performance measures, including sensitivity and specificity. Data on ED visits from January 1, 2012, to December 31, 2014, including clinical determination of whether the visit constituted an opioid overdose event, were abstracted from electronic medical records for patients prescribed long-term opioids for pain from any of six safety net primary care clinics in San Francisco, California. Combinations of ICD-9-CM codes were validated in the detection of overdose events as determined by medical chart review. Both the sensitivity and the specificity of different combinations of ICD-9-CM codes were calculated. Unadjusted logistic regression models with robust standard errors and accounting for clustering by patient were used to explore whether overdose ED visits with certain characteristics were more or less likely to be assigned an opioid poisoning ICD-9-CM code by the documenting physician. Forty-four (1.4%) of 3,203 ED visits among 804 patients were determined to be opioid overdose events. Opioid-poisoning ICD-9-CM codes (E850.0-E850.2, 965.00-965.09) identified overdose ED visits with a sensitivity of 25.0% (95% confidence interval [CI] = 13.6% to 37.8%) and
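
The performance measures reported can be reproduced from a 2x2 confusion table. The true-positive count below follows from the reported figures (25.0% sensitivity over 44 chart-confirmed events implies 11 true positives); the false-positive count among the remaining 3,159 visits is invented for illustration, since the abstract does not report specificity.

```python
def diagnostic_performance(tp, fp, fn, tn):
    """Sensitivity and specificity from 2x2 confusion-table counts."""
    sensitivity = tp / (tp + fn)   # flagged overdoses / all true overdoses
    specificity = tn / (tn + fp)   # unflagged non-overdoses / all non-overdoses
    return sensitivity, specificity

# 44 true overdose visits, 11 flagged by ICD-9-CM codes; fp is hypothetical
sens, spec = diagnostic_performance(tp=11, fp=5, fn=33, tn=3154)
```

With an outcome this rare (1.4% prevalence), specificity stays near 1.0 almost regardless of the false-positive count, which is why the abstract leads with the far more informative sensitivity figure.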

  16. The CMSSW benchmarking suite: Using HEP code to measure CPU performance

    International Nuclear Information System (INIS)

    Benelli, G

    2010-01-01

    The demanding computing needs of the CMS experiment require thoughtful planning and management of its computing infrastructure. A key factor in this process is the use of realistic benchmarks when assessing the computing power of the different architectures available. In recent years a discrepancy has been observed between the CPU performance estimates given by the reference benchmark for HEP computing (SPECint) and the actual performance of HEP code. Making use of the CPU performance tools from the CMSSW performance suite, comparative CPU performance studies have been carried out on several architectures. A benchmarking suite has been developed and integrated in the CMSSW framework to allow computing centers and interested third parties to benchmark architectures directly with CMSSW. The CMSSW benchmarking suite can be used out of the box to test and compare several machines in terms of CPU performance and to report the different benchmarking scores (e.g. by processing step) and results at the desired level of detail. In this talk we briefly describe the CMSSW software performance suite and, in more detail, the CMSSW benchmarking suite client/server design, the performance data analysis and the available CMSSW benchmark scores. The experience in the use of HEP code for benchmarking is discussed and CMSSW benchmark results are presented.

  17. KUGEL: a thermal, hydraulic, fuel performance, and gaseous fission product release code for pebble bed reactor core analysis

    International Nuclear Information System (INIS)

    Shamasundar, B.I.; Fehrenbach, M.E.

    1981-05-01

    The KUGEL computer code is designed to perform thermal/hydraulic analysis and coated-fuel particle performance calculations for axisymmetric pebble bed reactor (PBR) cores. This computer code was developed as part of a Department of Energy (DOE)-funded study designed to verify the published core performance data on PBRs. The KUGEL code is designed to interface directly with the 2DB code, a two-dimensional neutron diffusion code, to obtain distributions of thermal power, fission rate, fuel burnup, and fast neutron fluence, which are needed for thermal/hydraulic and fuel performance calculations. The code is variably dimensioned so that problem size can be easily varied. An interpolation routine allows variable mesh size to be used between the 2DB output and the two-dimensional thermal/hydraulic calculations

  18. Performance of Multilevel Coding Schemes with Different Decoding Methods and Mapping Strategies in Mobile Fading Channels

    Institute of Scientific and Technical Information of China (English)

    YUAN Dongfeng; WANG Chengxiang; YAO Qi; CAO Zhigang

    2001-01-01

    Based on the "capacity rule", the performance of multilevel coding (MLC) schemes with different set partitioning strategies and decoding methods in AWGN and Rayleigh fading channels is investigated, in which BCH codes are chosen as component codes and 8ASK modulation is used. Numerical results indicate that the MLC scheme with the UP strategy can obtain optimal performance in AWGN channels and BP is the best mapping strategy for Rayleigh fading channels. The BP strategy is of good robustness in both kinds of channels for realizing an optimum MLC system. Multistage decoding (MSD) is a sub-optimal decoding method of MLC for both channels. For the Ungerboeck partitioning (UP) and mixed partitioning (MP) strategies, MSD is strongly recommended for the MLC system, while for the BP strategy, PDL is suggested as a simple decoding method compared with MSD.

  19. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Science.gov (United States)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power the same way that increases in clock speed helped applications run faster. For Computational ElectroMagnetics (CEM) software developers, however, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared using benchmark software, with FLoating-point Operations Per Second (FLOPS) as the metric, which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the

  20. Extending the application range of a fuel performance code from normal operating to design basis accident conditions

    International Nuclear Information System (INIS)

    Van Uffelen, P.; Gyori, C.; Schubert, A.; Laar, J. van de; Hozer, Z.; Spykman, G.

    2008-01-01

    Two types of fuel performance codes are generally being applied, corresponding to the normal operating conditions and the design basis accident conditions, respectively. In order to simplify the code management and the interface between the codes, and to take advantage of the hardware progress it is favourable to generate a code that can cope with both conditions. In the first part of the present paper, we discuss the needs for creating such a code. The second part of the paper describes an example of model developments carried out by various members of the TRANSURANUS user group for coping with a loss of coolant accident (LOCA). In the third part, the validation of the extended fuel performance code is presented for LOCA conditions, whereas the last section summarises the present status and indicates needs for further developments to enable the code to deal with reactivity initiated accident (RIA) events

  1. Performance Based Plastic Design of Concentrically Braced Frame attuned with Indian Standard code and its Seismic Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Sejal Purvang Dalal

    2015-12-01

    In the Performance Based Plastic Design (PBPD) method the failure mechanism is predetermined, which has made the method popular throughout the world. Owing to the lack of proper guidelines and a simple stepwise methodology, however, it is not widely used in India. In this paper, a stepwise design procedure for Performance Based Plastic Design of a Concentrically Braced Frame attuned with the Indian Standard code is presented. A comparative seismic performance evaluation of a six-storey concentrically braced frame designed using the displacement-based PBPD method and the currently used force-based Limit State Design (LSD) method has also been carried out by nonlinear static pushover analysis and time history analysis under three different ground motions. Results show that the PBPD method is superior to the current design in terms of displacement and acceleration response. Total collapse of the frame is also prevented in the PBPD frame.

  2. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, Peter Andrew

    2011-12-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontiunuum scale and the constitutive models they inform or generate. This Report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  3. PERFORMANCE EVALUATION OF TURBO CODED OFDM SYSTEMS AND APPLICATION OF TURBO DECODING FOR IMPULSIVE CHANNEL

    Directory of Open Access Journals (Sweden)

    Savitha H. M.

    2010-09-01

    A comparison of the performance of hard- and soft-decision turbo coded Orthogonal Frequency Division Multiplexing systems with Quadrature Phase Shift Keying (QPSK) and 16-Quadrature Amplitude Modulation (16-QAM) is considered in the first section of this paper. The results show that the soft-decision method greatly outperforms the hard-decision method. The complexity of the demapper is reduced by using a simplified algorithm for 16-QAM demapping. In the later part of the paper, we consider the transmission of data over an additive white class A noise (AWAN) channel, using turbo coded QPSK and 16-QAM systems. We propose a novel turbo decoding scheme for the AWAN channel. We also compare the performance of turbo coded systems with QPSK and 16-QAM on the AWAN channel with two different channel values: one computed as per additive white Gaussian noise (AWGN) channel conditions and the other as per AWAN channel conditions. The results show that the use of the appropriate channel value in turbo decoding helps to combat the impulsive noise more effectively. The proposed model for the AWAN channel exhibits bit error rate (BER) performance comparable to that of the AWGN channel.
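The bit error rate (BER) used as the figure of merit above is simply the fraction of decoded bits that differ from the transmitted ones. A minimal illustrative sketch (no turbo coding here; a hypothetical 0.1% memoryless flip probability stands in for the channel plus decoder and is not a value from the paper):

```python
import random

random.seed(0)
n_bits = 100_000
tx_bits = [random.getrandbits(1) for _ in range(n_bits)]   # transmitted bits
# Hypothetical channel: each bit is flipped independently with p = 1e-3.
rx_bits = [b ^ (random.random() < 1e-3) for b in tx_bits]  # received bits

bit_errors = sum(t != r for t, r in zip(tx_bits, rx_bits))
ber = bit_errors / n_bits
print(f"estimated BER = {ber:.2e}")
```

In a real comparison like the one above, the same error count is taken over the decoder output for each channel model and plotted against signal-to-noise ratio.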

  4. High-performance computational fluid dynamics: a custom-code approach

    International Nuclear Information System (INIS)

    Fannon, James; Náraigh, Lennon Ó; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain

    2016-01-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier–Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing. (paper)
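The scalability study mentioned above (up to 1000 CPU cores) is conventionally summarised as strong-scaling speedup and parallel efficiency. A small sketch of that arithmetic; the core counts and wall-clock times below are made-up illustrative numbers, not measurements from the TPLS work:

```python
def scaling_metrics(cores, wall_times):
    """Speedup and parallel efficiency relative to the smallest core count."""
    t_ref, c_ref = wall_times[0], cores[0]
    rows = []
    for c, t in zip(cores, wall_times):
        speedup = t_ref / t                 # how much faster than the baseline
        efficiency = speedup / (c / c_ref)  # fraction of ideal linear speedup
        rows.append((c, speedup, efficiency))
    return rows

# Hypothetical timings for the same fixed problem on more and more cores.
for c, s, e in scaling_metrics([8, 64, 512, 1000],
                               [1000.0, 140.0, 22.0, 13.5]):
    print(f"{c:5d} cores: speedup {s:6.1f}x, efficiency {e:6.1%}")
```

Efficiency below 100% at high core counts reflects communication and load-imbalance overheads, which is what such a study probes.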

  5. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC)

    International Nuclear Information System (INIS)

    Schultz, Peter Andrew

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M and S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V and V) is required throughout the system to establish evidence-based metrics for the level of confidence in M and S codes and capabilities, including at the subcontiunuum scale and the constitutive models they inform or generate. This Report outlines the nature of the V and V challenge at the subcontinuum scale, an approach to incorporate V and V concepts into subcontinuum scale modeling and simulation (M and S), and a plan to incrementally incorporate effective V and V into subcontinuum scale M and S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  6. Source-term model for the SYVAC3-NSURE performance assessment code

    International Nuclear Information System (INIS)

    Rowat, J.H.; Rattan, D.S.; Dolinar, G.M.

    1996-11-01

    Radionuclide contaminants in wastes emplaced in disposal facilities will not remain in those facilities indefinitely. Engineered barriers will eventually degrade, allowing radioactivity to escape from the vault. The radionuclide release rate from a low-level radioactive waste (LLRW) disposal facility, the source term, is a key component in the performance assessment of the disposal system. This report describes the source-term model that has been implemented in Ver. 1.03 of the SYVAC3-NSURE (Systems Variability Analysis Code generation 3-Near Surface Repository) code. NSURE is a performance assessment code that evaluates the impact of near-surface disposal of LLRW through the groundwater pathway. The source-term model described here was developed for the Intrusion Resistant Underground Structure (IRUS) disposal facility, which is a vault that is to be located in the unsaturated overburden at AECL's Chalk River Laboratories. The processes included in the vault model are roof and waste package performance, and diffusion, advection and sorption of radionuclides in the vault backfill. The model presented here was developed for the IRUS vault; however, it is applicable to other near-surface disposal facilities. (author). 40 refs., 6 figs

  7. High-performance computational fluid dynamics: a custom-code approach

    Science.gov (United States)

    Fannon, James; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain; Náraigh, Lennon Ó.

    2016-07-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier-Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing.

  8. WWER-440 fuel rod performance analysis with PIN-Micro and TRANSURANUS codes

    International Nuclear Information System (INIS)

    Vitkova, M.; Manolova, M.; Stefanova, S.; Simeonova, V.; Passage, G.; Lassmann, K.

    1994-01-01

    PIN-micro and TRANSURANUS codes were used to analyse WWER-440 fuel rod behaviour at normal operating conditions. The two highest-loaded fuel rods of fuel assemblies irradiated in a WWER-440 with different power histories were selected. A set of the most probable average values of all geometrical and technological parameters was used. A comparison between the PIN-micro and TRANSURANUS codes was performed using identical input data. The results for inner gas pressure, gap size, local linear heat rate, fuel central temperature and fission gas release as a function of time, calculated for the selected fuel rods, are presented. The following conclusions were drawn: 1) The PIN-micro code adequately predicts the thermal and mechanical behaviour of the two fuel rods; 2) The comparison of the results obtained by PIN-micro and TRANSURANUS shows reasonable agreement, and the discrepancies can be explained by the lack of thorough WWER-oriented verification of TRANSURANUS; 3) The advanced TRANSURANUS code could be successfully applied to WWER fuel rod thermal and mechanical analysis after incorporation of all necessary WWER-specific material properties and models for the Zr+1%Nb cladding and for the fuel rod as a whole, and after validation against WWER experimental and operational data. 1 tab., 10 figs., 10 refs

  9. WWER-440 fuel rod performance analysis with PIN-Micro and TRANSURANUS codes

    Energy Technology Data Exchange (ETDEWEB)

    Vitkova, M; Manolova, M; Stefanova, S; Simeonova, V; Passage, G [Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. za Yadrena Izsledvaniya i Yadrena Energetika; Kharalampieva, Ts [Kombinat Atomna Energetika, Kozloduj (Bulgaria); Lassmann, K [European Atomic Energy Community, Karlsruhe (Germany). European Inst. for Transuranium Elements

    1994-12-31

    PIN-micro and TRANSURANUS codes were used to analyse WWER-440 fuel rod behaviour at normal operating conditions. The two highest-loaded fuel rods of fuel assemblies irradiated in a WWER-440 with different power histories were selected. A set of the most probable average values of all geometrical and technological parameters was used. A comparison between the PIN-micro and TRANSURANUS codes was performed using identical input data. The results for inner gas pressure, gap size, local linear heat rate, fuel central temperature and fission gas release as a function of time, calculated for the selected fuel rods, are presented. The following conclusions were drawn: (1) The PIN-micro code adequately predicts the thermal and mechanical behaviour of the two fuel rods; (2) The comparison of the results obtained by PIN-micro and TRANSURANUS shows reasonable agreement, and the discrepancies can be explained by the lack of thorough WWER-oriented verification of TRANSURANUS; (3) The advanced TRANSURANUS code could be successfully applied to WWER fuel rod thermal and mechanical analysis after incorporation of all necessary WWER-specific material properties and models for the Zr+1%Nb cladding and for the fuel rod as a whole, and after validation against WWER experimental and operational data. 1 tab., 10 figs., 10 refs.

  10. Technical and tactical soccer players' performance in conceptual small-sided games

    Directory of Open Access Journals (Sweden)

    Cristian Javier Ramirez Lizana

    2015-09-01

    Conceptual small-sided games (CSSGs) may be interesting as a training methodology for soccer players, given their connection to the unpredictability that is inherent to soccer. Our aim was to investigate, through videogrammetry, whether the technical and tactical principles promoted through the adoption of distinct rules from two distinct CSSGs (maintaining ball possession, and progression to the target) would actually be achieved. The study included 24 athletes assigned to 6-player teams. Our data showed that the CSSGs' organising principles create situations with differing levels of difficulty that obey the propositions of maintaining ball possession and progression to the target, i.e., CSSGs permit systematic training of technical and tactical components that emphasizes the concepts adopted in this study in a game context. Our data credit CSSGs with teaching technical and tactical lessons that, when coupled with adequate physical conditioning, can facilitate a player's capacity to merge thoughts and events in different situations.

  11. Influence of Code Size Variation on the Performance of 2D Hybrid ZCC/MD in OCDMA System

    Directory of Open Access Journals (Sweden)

    Matem Rima.

    2018-01-01

    Several two-dimensional OCDMA codes have been developed to overcome problems in optical networks: enhancing cardinality, suppressing Multiple Access Interference (MAI) and mitigating Phase Induced Intensity Noise (PIIN). This paper proposes a new 2D hybrid ZCC/MD code, combining 1D ZCC spectral encoding, where M is its code length, with 1D MD spatial spreading, where N is its code length. According to the numerical results, the spatial spreading code length N offers good cardinality and is therefore the main factor in enhancing the performance of the system, compared to the spectral code length M.

  12. Modification of fuel performance code to evaluate iron-based alloy behavior under LOCA scenario

    Energy Technology Data Exchange (ETDEWEB)

    Giovedi, Claudia; Martins, Marcelo Ramos, E-mail: claudia.giovedi@labrisco.usp.br, E-mail: mrmartin@usp.br [Laboratorio de Analise, Avaliacao e Gerenciamento de Risco (LabRisco/POLI/USP), São Paulo, SP (Brazil); Abe, Alfredo; Muniz, Rafael O.R.; Gomes, Daniel de Souza; Silva, Antonio Teixeira e, E-mail: ayabe@ipen.br, E-mail: dsgomes@ipen.br, E-mail: teixiera@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Accident tolerant fuels (ATF) have been studied since the Fukushima Daiichi accident in research efforts to develop new materials which, under accident scenarios, could maintain fuel rod integrity for a longer period than the cladding and fuel system usually utilized in Pressurized Water Reactors (PWR). The efforts have been focused on new materials applied as cladding, and iron-based alloys appear as possible candidates. The aim of this paper is to implement modifications in a fuel performance code to evaluate the behavior of iron-based alloys under a Loss-of-Coolant Accident (LOCA) scenario. For this, the properties related to the thermal and mechanical behavior of iron-based alloys were first obtained from the literature, appropriately adapted, and introduced in the fuel performance code subroutines. The adopted approach was step-by-step modification, in which different versions of the code were created. The implemented modifications were assessed by simulating an experiment available in the open literature (IFA-650.5) on zirconium-based alloy fuel rods submitted to LOCA conditions. The results obtained for the iron-based alloy were compared to those obtained using the regular version of the fuel performance code for Zircaloy-4. The results show that the most important properties to be changed are those in the subroutines related to the mechanical properties of the cladding. Burst is observed at a longer time for fuel rods with the iron-based alloy, indicating the potential of this material to be used as cladding for ATF purposes. (author)

  13. Modification of fuel performance code to evaluate iron-based alloy behavior under LOCA scenario

    International Nuclear Information System (INIS)

    Giovedi, Claudia; Martins, Marcelo Ramos; Abe, Alfredo; Muniz, Rafael O.R.; Gomes, Daniel de Souza; Silva, Antonio Teixeira e

    2017-01-01

    Accident tolerant fuels (ATF) have been studied since the Fukushima Daiichi accident in research efforts to develop new materials which, under accident scenarios, could maintain fuel rod integrity for a longer period than the cladding and fuel system usually utilized in Pressurized Water Reactors (PWR). The efforts have been focused on new materials applied as cladding, and iron-based alloys appear as possible candidates. The aim of this paper is to implement modifications in a fuel performance code to evaluate the behavior of iron-based alloys under a Loss-of-Coolant Accident (LOCA) scenario. For this, the properties related to the thermal and mechanical behavior of iron-based alloys were first obtained from the literature, appropriately adapted, and introduced in the fuel performance code subroutines. The adopted approach was step-by-step modification, in which different versions of the code were created. The implemented modifications were assessed by simulating an experiment available in the open literature (IFA-650.5) on zirconium-based alloy fuel rods submitted to LOCA conditions. The results obtained for the iron-based alloy were compared to those obtained using the regular version of the fuel performance code for Zircaloy-4. The results show that the most important properties to be changed are those in the subroutines related to the mechanical properties of the cladding. Burst is observed at a longer time for fuel rods with the iron-based alloy, indicating the potential of this material to be used as cladding for ATF purposes. (author)

  14. Better Measurement for Performance Improvement in Low- and Middle-Income Countries: The Primary Health Care Performance Initiative (PHCPI) Experience of Conceptual Framework Development and Indicator Selection.

    Science.gov (United States)

    Veillard, Jeremy; Cowling, Krycia; Bitton, Asaf; Ratcliffe, Hannah; Kimball, Meredith; Barkley, Shannon; Mercereau, Laure; Wong, Ethan; Taylor, Chelsea; Hirschhorn, Lisa R; Wang, Hong

    2017-12-01

    Policy Points: Strengthening accountability through better measurement and reporting is vital to ensure progress in improving quality primary health care (PHC) systems and achieving universal health coverage (UHC). The Primary Health Care Performance Initiative (PHCPI) provides national decision makers and global stakeholders with opportunities to benchmark and accelerate performance improvement through better performance measurement. Results from the initial PHC performance assessments in low- and middle-income countries (LMICs) are helping guide PHC reforms and investments and improve the PHCPI's instruments and indicators. Findings from future assessment activities will further amplify cross-country comparisons and peer learning to improve PHC. New indicators and sources of data are needed to better understand PHC system performance in LMICs. The Primary Health Care Performance Initiative (PHCPI), a collaboration between the Bill and Melinda Gates Foundation, The World Bank, and the World Health Organization, in partnership with Ariadne Labs and Results for Development, was launched in 2015 with the aim of catalyzing improvements in primary health care (PHC) systems in 135 low- and middle-income countries (LMICs), in order to accelerate progress toward universal health coverage. Through more comprehensive and actionable measurement of quality PHC, the PHCPI stimulates peer learning among LMICs and informs decision makers to guide PHC investments and reforms. Instruments for performance assessment and improvement are in development; to date, a conceptual framework and 2 sets of performance indicators have been released. The PHCPI team developed the conceptual framework through literature reviews and consultations with an advisory committee of international experts. We generated 2 sets of performance indicators selected from a literature review of relevant indicators, cross-referenced against indicators available from international sources, and evaluated through

  15. Application of the BISON Fuel Performance Code of the FUMEX-III Coordinated Research Project

    International Nuclear Information System (INIS)

    Williamson, R.L.; Novascone, S.R.

    2013-01-01

    Since 1981, the International Atomic Energy Agency (IAEA) has sponsored a series of Coordinated Research Projects (CRPs) in the area of nuclear fuel modeling. These projects have typically lasted 3-5 years and have had broad international participation. The objectives of the projects have been to assess the maturity and predictive capability of fuel performance codes, support interaction and information exchange between countries with code development and application needs, build a database of well-defined experiments suitable for code validation, transfer a mature fuel modeling code to developing countries, and provide guidelines for code quality assurance and code application to fuel licensing. The fourth and latest of these projects, known as FUMEX-III (FUel Modeling at EXtended Burnup-III), began in 2008 and ended in December of 2011. FUMEX-III was the first of this series of fuel modeling CRPs in which the INL participated. Participants met at the beginning of the project to discuss and select a set of experiments ('priority cases') for consideration during the project. These priority cases were of broad interest to the participants and included reasonably well-documented and reliable data. A meeting was held midway through the project for participants to present and discuss progress on modeling the priority cases. A final meeting was held at the close of the project to present and discuss final results and provide input for a final report. Also in 2008, the INL initiated development of a new multidimensional (2D and 3D) multiphysics nuclear fuel performance code called BISON, with code development progressing steadily during the three-year FUMEX-III project. Interactions with international fuel modeling researchers via FUMEX-III played a significant role in the BISON evolution, particularly influencing the selection of material and behavioral models which are now included in the code. The FUMEX-III cases are generally integral fuel rod experiments occurring

  16. The data requirements for the verification and validation of a fuel performance code - the TRANSURANUS perspective

    International Nuclear Information System (INIS)

    Schubert, A.; Di Marcello, V.; Rondinella, V.; Van De Laar, J.; Van Uffelen, P.

    2013-01-01

    In general, the verification and validation (V and V) of a fuel performance code like TRANSURANUS consists of three basic steps: a) verifying the correctness and numerical stability of the sub-models; b) comparing the sub-models with experimental data; c) comparing the results of the integral fuel performance code with experimental data. Only the second and third steps of the V and V rely on experimental information. This scheme can be further detailed according to the physical origin of the data: on one hand, in-reactor ('in-pile') experimental data are generated in the course of the irradiation; on the other hand, ex-reactor ('out-of-pile') experimental data are obtained for instance from various post-irradiation examinations (PIE) or dedicated experiments with fresh samples. For both categories, we will first discuss the V and V of sub-models of TRANSURANUS related to separate aspects of the fuel behaviour: this includes the radial variation of the composition and fissile isotopes, the thermal properties of the fuel (e.g. thermal conductivity, melting temperature, etc.), the mechanical properties of fuel and cladding (e.g. elastic constants, creep properties), as well as the models for the fission product behaviour. Secondly, the integral code verification will be addressed as it treats various aspects of the fuel behaviour, including the geometrical changes in the fuel and the gas pressure and composition of the free volume in the rod. (authors)

  17. IAMBUS, a computer code for the design and performance prediction of fast breeder fuel rods

    International Nuclear Information System (INIS)

    Toebbe, H.

    1990-05-01

    IAMBUS is a computer code for the thermal and mechanical design, in-pile performance prediction and post-irradiation analysis of fast breeder fuel rods. The code deals with steady, non-steady and transient operating conditions and enables prediction of the in-pile behavior of fuel rods in power reactors as well as in experimental rigs. Great effort went into the development of a realistic account of non-steady fuel rod operating conditions. The main emphasis is placed on characterizing the mechanical interaction taking place between the cladding tube and the fuel as a result of contact pressure and friction forces, with due consideration of the axial and radial crack configuration within the fuel as well as the gradual transition at the elastic/plastic interface with respect to fuel behavior. IAMBUS can be readily adapted to various fuel and cladding materials. The specific models and material correlations of the reference version deal with the actual in-pile behavior and physical properties of the KNK II and SNR 300 related fuel rod design, confirmed by comparison of the fuel performance model with post-irradiation data. The comparison comprises steady, non-steady and transient irradiation experiments within the German/Belgian fuel rod irradiation program. The code is further validated by comparison of model predictions with post-irradiation data of standard fuel and breeder rods of Phenix and PFR as well as selected LWR fuel rods in non-steady operating conditions.

  18. The Analysis and the Performance Simulation of the Capacity of Bit-interleaved Coded Modulation System

    Directory of Open Access Journals (Sweden)

    Hongwei ZHAO

    2014-09-01

    Full Text Available In this paper, the capacity of the BICM system over AWGN channels is first analyzed; the curves of BICM capacity versus SNR are also obtained by Monte Carlo simulations and compared with the curves of the CM capacity. Based on the analysis results, we simulate the error performance of the BICM system with LDPC codes. Simulation results show that the capacity of the BICM system with LDPC codes is strongly influenced by the mapping method. Given a certain modulation method, the BICM system can obtain about 2-3 dB of gain with Gray mapping compared with non-Gray mapping. Meanwhile, the simulation results also demonstrate the correctness of the theoretical analysis.
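    The Gray-mapping gain cited in this abstract comes from labeling adjacent constellation points with bit patterns that differ in exactly one bit, so the most likely symbol errors corrupt only a single coded bit. A minimal illustrative sketch (generic binary-reflected Gray code, not code from the paper):

    ```python
    def gray(b: int) -> int:
        """Binary-reflected Gray code of integer b."""
        return b ^ (b >> 1)

    # Under Gray labeling, adjacent points in one constellation dimension
    # differ in exactly one bit, so a nearest-neighbour symbol error
    # produces only one bit error at the BICM demapper.
    labels = [gray(b) for b in range(8)]  # one 8-ary dimension
    for a, b in zip(labels, labels[1:]):
        assert bin(a ^ b).count("1") == 1
    print(labels)  # [0, 1, 3, 2, 6, 7, 5, 4]
    ```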

  19. Analyses with the FSTATE code: fuel performance in destructive in-pile experiments

    International Nuclear Information System (INIS)

    Bauer, T.H.; Meek, C.C.

    1982-01-01

    Thermal-mechanical analysis of a fuel pin is an essential part of the evaluation of fuel behavior during hypothetical accident transients. The FSTATE code has been developed to provide this required computational ability in situations lacking azimuthal symmetry about the fuel-pin axis by performing 2-dimensional thermal, mechanical, and fission gas release and redistribution computations for a wide range of possible transient conditions. In this paper recent code developments are described and application is made to in-pile experiments undertaken to study fast-reactor fuel under accident conditions. Three accident simulations, including a fast and a slow ramp-rate overpower as well as a loss-of-cooling accident sequence, are used as representative examples, and the interpretation of FSTATE computations relative to experimental observations is made.

  20. A Linear Algebra Framework for Static High Performance Fortran Code Distribution

    Directory of Open Access Journals (Sweden)

    Corinne Ancourt

    1997-01-01

    Full Text Available High Performance Fortran (HPF) was developed to support data parallel programming for single-instruction multiple-data (SIMD) and multiple-instruction multiple-data (MIMD) machines with distributed memory. The programmer is provided a familiar uniform logical address space and specifies the data distribution by directives. The compiler then exploits these directives to allocate arrays in the local memories, to assign computations to elementary processors, and to migrate data between processors when required. We show here that linear algebra is a powerful framework to encode HPF directives and to synthesize distributed code with space-efficient array allocation, tight loop bounds, and vectorized communications for INDEPENDENT loops. The generated code includes traditional optimizations such as guard elimination, message vectorization and aggregation, and overlap analysis. The systematic use of an affine framework makes it possible to prove the compilation scheme correct.
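    As a concrete illustration of the index-to-processor mapping such a compiler must synthesize, here is a hypothetical owner function for an HPF-style BLOCK distribution (a simplification for intuition only, not the paper's affine formulation):

    ```python
    import math

    def block_owner(i: int, n: int, p: int) -> int:
        """Processor that owns element i of an n-element array distributed
        BLOCK-wise over p processors (0-based indices; illustrative only)."""
        block_size = math.ceil(n / p)  # contiguous elements per processor
        return i // block_size

    # n=10 elements over p=4 processors gives blocks of 3 elements:
    print([block_owner(i, 10, 4) for i in range(10)])
    # [0, 0, 0, 1, 1, 1, 2, 2, 2, 3]
    ```

    The compiler uses such closed-form owner expressions to derive tight local loop bounds and to decide which array references require communication.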

  1. An overview of the geochemical code MINTEQ: Applications to performance assessment for low-level wastes

    International Nuclear Information System (INIS)

    Peterson, S.R.; Opitz, B.E.; Graham, M.J.; Eary, L.E.

    1987-03-01

    The MINTEQ geochemical computer code, developed at the Pacific Northwest Laboratory (PNL), integrates many of the capabilities of its two immediate predecessors, MINEQL and WATEQ3. The MINTEQ code will be used in the Special Waste Form Lysimeters-Arid program to perform the calculations necessary to simulate (model) the contact of low-level waste solutions with heterogeneous sediments or the interaction of ground water with solidified low-level wastes. The code can calculate ion speciation/solubility, adsorption, oxidation-reduction, gas phase equilibria, and precipitation/dissolution of solid phases. Under the Special Waste Form Lysimeters-Arid program, the composition of effluents (leachates) from column and batch experiments, using laboratory-scale waste forms, will be used to develop a geochemical model of the interaction of ground water with commercial, solidified low-level wastes. The wastes being evaluated include power-reactor waste streams that have been solidified in cement, vinyl ester-styrene, and bitumen. The thermodynamic database for the code was upgraded preparatory to performing the geochemical modeling. Thermodynamic data for solid phases and aqueous species containing Sb, Ce, Cs, or Co were added to the MINTEQ database. The need to add these data was identified from the characterization of the waste streams. The geochemical model developed from the laboratory data will then be applied to predict the release from a field-lysimeter facility that contains full-scale waste samples. The contaminant concentrations migrating from the waste forms predicted using MINTEQ will be compared to the long-term lysimeter data. This comparison will constitute a partial field validation of the geochemical model

  2. Performance awareness execution performance of HEP codes on RISC platforms,issues and solutions

    CERN Document Server

    Yaari, R; Yaari, Refael; Jarp, Sverre

    1995-01-01

    The work described in this paper was started during the migration of Aleph's production jobs from the IBM mainframe/CRAY supercomputer to several RISC/Unix workstation platforms. The aim was to understand why Aleph did not obtain the performance on the RISC platforms that was "promised" after a CERN Unit comparison between these RISC platforms and the IBM mainframe. Remedies were also sought. Since the work with the Aleph jobs in turn led to the related tasks of understanding compilers and their options, the conditions under which the CERN benchmarks (and other benchmarks) were run, kernel routines, and frequently used CERNLIB routines, the whole undertaking expanded to examine all the factors that influence the performance of High Energy Physics (HEP) jobs in general. Finally, key performance issues were reviewed against the programs of one of the LHC collaborations (Atlas) with the hope that the conclusions would be of long-term interest during the establishment of their simulation, reconstruction and...

  3. What role does performance information play in securing improvement in healthcare? a conceptual framework for levers of change.

    Science.gov (United States)

    Levesque, Jean-Frederic; Sutherland, Kim

    2017-08-28

    Across healthcare systems, there is consensus on the need for independent and impartial assessment of performance. There is less agreement about how measuring and reporting performance improves healthcare. This paper draws on academic theories to develop a conceptual framework, one that classifies in an integrated manner the ways in which change can be leveraged by healthcare performance information. A synthesis of published frameworks. The framework identifies eight levers for change enabled by performance information, spanning internal and external drivers, and emergent and planned processes: (1) cognitive levers provide awareness and understanding; (2) mimetic levers inform about the performance of others to encourage emulation; (3) supportive levers provide facilitation, implementation tools or models of care to actively support change; (4) formative levers develop capabilities and skills through teaching, mentoring and feedback; (5) normative levers set performance against guidelines, standards, certification and accreditation processes; (6) coercive levers use policies, regulations, incentives and disincentives to force change; (7) structural levers modify the physical environment or professional cultures and routines; (8) competitive levers attract patients or funders. This framework highlights how performance measurement and reporting can contribute to eight different levers for change. It provides guidance on aligning performance measurement and reporting with quality improvement programmes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran.

    Science.gov (United States)

    Mohajjel-Aghdam, Alireza; Hassankhani, Hadi; Zamanzadeh, Vahid; Khameneh, Saied; Moghaddam, Sara

    2013-09-01

    The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care more than routine care. Today, a worldwide definition of professional ethic codes has been established based on human and ethical issues in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethic codes. The aim of this study is to assess knowledge and performance of nursing ethic codes from nurses' and patients' perspectives. A descriptive study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz, 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics, independent t-test, ANOVA and Pearson correlation coefficient, in SPSS 13. Most of the nurses were female, married and educated at the BS degree level; 86.4% of them were aware of ethic codes. In addition, 91.9% of nurses and 41.8% of patients reported that nurses respect ethic codes. Nurses' and patients' perspectives on ethic codes differed significantly. A significant relationship was found between nurses' knowledge of ethic codes and job satisfaction and complaints about ethical performance. According to the results, attention to teaching ethic codes in the nursing curriculum for students and continuous education for staff is proposed. On the other hand, recognizing failures of the health system, optimizing nursing care, attempting to inform patients about nursing ethic codes, promoting patient rights and achieving patient satisfaction can minimize the differences between the two perspectives.

  5. SNR and BER Models and the Simulation for BER Performance of Selected Spectral Amplitude Codes for OCDMA

    Directory of Open Access Journals (Sweden)

    Abdul Latif Memon

    2014-01-01

    Full Text Available Many encoding schemes are used in OCDMA (Optical Code Division Multiple Access) networks, but SAC (Spectral Amplitude Codes) are widely used. SAC is considered an effective arrangement to eliminate the dominant noise called MAI (Multiple Access Interference). Various codes are studied and evaluated with respect to their performance against three noises, namely shot noise, thermal noise and PIIN (Phase Induced Intensity Noise). Various mathematical models for SNR (Signal to Noise Ratio) and BER (Bit Error Rate) are discussed, where the SNRs are calculated and BERs are computed using the Gaussian distribution assumption. After analyzing the results mathematically, it is concluded that ZCC (Zero Cross Correlation) codes perform better than the other selected SAC codes and can serve a larger number of active users than the other codes do. At various receiver power levels, the analysis points out that RDC (Random Diagonal Code) also performs better than the other codes; for the power interval between -10 and -20 dBm, the performance of RDC is better than that of ZCC. Their low BER values suggest that these codes should be part of an efficient and cost-effective OCDM access network in the future.
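    Under the Gaussian distribution assumption used in such analyses, BER is obtained from SNR through the complementary error function. A sketch of one form commonly seen in the SAC-OCDMA literature (an assumed standard relation, not the exact models of this paper):

    ```python
    import math

    def ber_from_snr(snr: float) -> float:
        """Gaussian-approximation bit error rate often used in SAC-OCDMA
        analysis: BER = 0.5 * erfc(sqrt(SNR / 8))."""
        return 0.5 * math.erfc(math.sqrt(snr / 8.0))

    # BER falls steeply as SNR grows, which is why a code that raises SNR
    # (e.g. by suppressing PIIN) can support more simultaneous users.
    for snr_db in (10, 15, 20):
        snr_linear = 10 ** (snr_db / 10)
        print(f"SNR = {snr_db:2d} dB -> BER ~ {ber_from_snr(snr_linear):.3e}")
    ```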

  6. The development of the Nuclear Electric core performance and fault transient analysis code package in support of Sizewell B

    International Nuclear Information System (INIS)

    Hall, P.; Hutt, P.

    1994-01-01

    This paper describes Nuclear Electric's (NE) development of an integrated code package in support of all its reactors, including Sizewell B, designed for the provision of fuel management design, core performance studies, operational support and fault transient analysis. The package uses the NE general-purpose three-dimensional transient reactor physics code PANTHER, with cross-sections derived in the PWR case from the LWRWIMS LWR lattice neutronics code. The package also includes ENIGMA, a generic fuel performance code, and, for PWR application, VIPRE-01, a subchannel thermal hydraulics code; RELAP5, the system thermal hydraulics transient code; and SCORPIO, an on-line surveillance system. The paper describes the capabilities and validation of the elements of this package for PWR, how they are coupled within the package, and the way in which they are being applied for Sizewell B to on-line surveillance and fault transient analysis. (Author)

  7. Conceptual frameworks for health systems performance: a quest for effectiveness, quality, and improvement.

    NARCIS (Netherlands)

    Arah, O.A.; Klazinga, N.S.; Delnoij, D.M.J.; Asbroek, A.H.A. ten; Custers, T.

    2003-01-01

    ISSUES: Countries and international organizations have recently renewed their interest in how health systems perform. This has led to the development of performance indicators for monitoring, assessing, and managing health systems to achieve effectiveness, equity, efficiency, and quality. Although

  9. Development of a general coupling interface for the fuel performance code transuranus tested with the reactor dynamic code DYN3D

    International Nuclear Information System (INIS)

    Holt, L.; Rohde, U.; Seidl, M.; Schubert, A.; Van Uffelen, P.

    2013-01-01

    Several institutions plan to couple the fuel performance code TRANSURANUS, developed by the European Institute for Transuranium Elements, with their own codes. One of these codes is the reactor dynamic code DYN3D, maintained by the Helmholtz-Zentrum Dresden-Rossendorf. DYN3D was developed originally for VVER type reactors and was later extended to western type reactors. Usually, the fuel rod behavior is modeled in thermal hydraulics and neutronics codes in a simplified manner. The main idea of this coupling is to describe the fuel rod behavior in the frame of core safety analysis in a more detailed way, e.g. including the influence of the high burn-up structure, geometry changes and fission gas release. It allows taking advantage of the improved computational power and software achieved over the last two decades. The coupling interface was developed in a general way from the beginning; hence it can easily be used by other codes for a coupling with TRANSURANUS. The user can choose between a one-way and a two-way online coupling option. For a one-way online coupling, DYN3D provides only the time-dependent rod power and thermal hydraulics conditions to TRANSURANUS, and the fuel performance code doesn't transfer any variable back to DYN3D. In a two-way online coupling, TRANSURANUS in addition transfers parameters like fuel temperature and cladding temperature back to DYN3D. This list of variables can easily be extended with geometric and further variables of interest. First results of the code system DYN3D-TRANSURANUS will be presented for a control rod ejection transient in a modern western type reactor. Pre-analyses already show that detailed fuel rod behavior modeling will influence the thermal hydraulics and hence also the neutronics, due to the Doppler reactivity effect of the fuel temperature. The coupled code system therefore has the potential to improve the assessment of safety criteria. The developed code system DYN3D-TRANSURANUS can be used also

  10. A CONCEPTUAL TOOL FOR ASSESSING CLIENT PERFORMANCE IN THE CONSTRUCTION PROJECT COALITION

    Directory of Open Access Journals (Sweden)

    Gary D. Holt

    2002-01-01

    Full Text Available Due to the significant impact of client performance on overall project performance and the interdependence of participants' performance in the construction project coalition (i.e. clients, designers and constructors), there is a need to establish client performance measures. Based on data collected from in-depth interviews with nineteen UK architects and nine UK contractors, a generic tool for the on-going formal assessment of client performance is presented. It was found that this approach to performance assessment (i.e. from the viewpoint of other, non-client coalition participants) should lead to improved project relationships. Data analysis showed that in addition to 'harder' measures such as understanding of project requirements and finance, other, 'softer' measures of client performance (e.g. attitude) were worthy of consideration since they determine the quality of participant relationships. It is recommended that the tool be used to promote more effective client performance and thus enhance coalition relationships, enabling continuous improvement. The ultimate aim is to develop similar tools for the assessment of all coalition participants based on a culture of openness and trust.

  11. Overview of the geochemical code MINTEQ: applications to performance assessment for low-level wastes

    International Nuclear Information System (INIS)

    Graham, M.J.; Peterson, S.R.

    1985-09-01

    The MINTEQ geochemical computer code, developed at Pacific Northwest Laboratory, integrates many of the capabilities of its two immediate predecessors, WATEQ3 and MINEQL. MINTEQ can be used to perform the calculations necessary to simulate (model) the contact of low-level waste solutions with heterogeneous sediments or the interaction of ground water with solidified low-level wastes. The code is capable of performing calculations of ion speciation/solubility, adsorption, oxidation-reduction, gas phase equilibria, and precipitation/dissolution of solid phases. Under the Special Waste Form Lysimeters-Arid program, the composition of effluents (leachates) from column and batch experiments, using laboratory-scale waste forms, will be used to develop a geochemical model of the interaction of ground water with commercial solidified low-level wastes. The wastes being evaluated include power reactor waste streams that have been solidified in cement, vinyl ester-styrene, and bitumen. The thermodynamic database for the code is being upgraded before the geochemical modeling is performed. Thermodynamic data for cobalt, antimony, cerium, and cesium solid phases and aqueous species are being added to the database. The need to add these data was identified from the characterization of the waste streams. The geochemical model developed from the laboratory data will then be applied to predict the release from a field-lysimeter facility that contains full-scale waste samples. The contaminant concentrations migrating from the wastes predicted using MINTEQ will be compared to the long-term lysimeter data. This comparison will constitute a partial field validation of the geochemical model. 28 refs

  12. Construction and performance research on variable-length codes for multirate OCDMA multimedia networks

    Science.gov (United States)

    Li, Chuan-qi; Yang, Meng-jie; Luo, De-jun; Lu, Ye; Kong, Yi-pu; Zhang, Dong-chuang

    2014-09-01

    A new kind of variable-length codes with good correlation properties for multirate asynchronous optical code division multiple access (OCDMA) multimedia networks is proposed, called non-repetition interval (NRI) codes. The NRI codes can be constructed by structuring interval-sets with no repetition, and the code length depends on the number of users and the code weight. According to the structural characteristics of NRI codes, the formula for the bit error rate (BER) is derived. Compared with other variable-length codes, the NRI codes have lower BER. A multirate OCDMA multimedia simulation system is designed and built, in which longer codes are assigned to users who need low speeds, while shorter codes are assigned to users who need high speeds. Analysis of the eye diagrams shows that users with slower speeds have lower BER, consistent with the actual demands of multimedia data transport.

  13. Development Of A Parallel Performance Model For The THOR Neutral Particle Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Yessayan, Raffi; Azmy, Yousry; Schunert, Sebastian

    2017-02-01

    The THOR neutral particle transport code enables simulation of complex geometries for various problems, from reactor simulations to nuclear non-proliferation. It is undergoing a thorough V&V effort that requires computational efficiency. This has motivated various improvements, including angular parallelization, outer iteration acceleration, and development of peripheral tools. For guiding future improvements to the code's efficiency, better characterization of its parallel performance is useful. A parallel performance model (PPM) can be used to evaluate the benefits of modifications and to identify performance bottlenecks. Using INL's Falcon HPC, the PPM development incorporates an evaluation of network communication behavior over heterogeneous links and a functional characterization of the per-cell/angle/group runtime of each major code component. After evaluating several possible sources of variability, this work produced a communication model and a parallel-portion model. The former's accuracy is bounded by the variability of communication on Falcon, while the latter has an error on the order of 1%.
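    A parallel performance model of this kind typically sums a per-cell/angle/group compute term with a latency-bandwidth communication term. A toy sketch with invented parameter names and values (not THOR's actual model):

    ```python
    def predicted_runtime(cells, angles, groups, procs,
                          t_cell, n_msgs, msg_bytes, latency, bandwidth):
        """Toy transport-sweep runtime model: perfectly parallel compute
        plus a latency/bandwidth cost per message (all parameters invented)."""
        compute = t_cell * cells * angles * groups / procs
        comms = n_msgs * (latency + msg_bytes / bandwidth)
        return compute + comms

    # Doubling the processor count halves the compute term but leaves the
    # communication term untouched, exposing the scaling bottleneck.
    t64 = predicted_runtime(1e6, 48, 8, 64, 1e-7, 1000, 8192, 5e-6, 1e9)
    t128 = predicted_runtime(1e6, 48, 8, 128, 1e-7, 1000, 8192, 5e-6, 1e9)
    print(t64, t128)
    ```

    Fitting the free parameters to measured runs is what turns such a skeleton into a predictive model; the abstract's point is that the communication term's accuracy is limited by network variability.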

  14. Enhancing the performance of the light field microscope using wavefront coding.

    Science.gov (United States)

    Cohen, Noy; Yang, Samuel; Andalman, Aaron; Broxton, Michael; Grosenick, Logan; Deisseroth, Karl; Horowitz, Mark; Levoy, Marc

    2014-10-06

    Light field microscopy has been proposed as a new high-speed volumetric computational imaging method that enables reconstruction of 3-D volumes from captured projections of the 4-D light field. Recently, a detailed physical optics model of the light field microscope has been derived, which led to the development of a deconvolution algorithm that reconstructs 3-D volumes with high spatial resolution. However, the spatial resolution of the reconstructions has been shown to be non-uniform across depth, with some z planes showing high resolution and others, particularly at the center of the imaged volume, showing very low resolution. In this paper, we enhance the performance of the light field microscope using wavefront coding techniques. By including phase masks in the optical path of the microscope we are able to address this non-uniform resolution limitation. We have also found that superior control over the performance of the light field microscope can be achieved by using two phase masks rather than one, placed at the objective's back focal plane and at the microscope's native image plane. We present an extended optical model for our wavefront coded light field microscope and develop a performance metric based on Fisher information, which we use to choose adequate phase mask parameters. We validate our approach using both simulated data and experimental resolution measurements of a USAF 1951 resolution target; and demonstrate the utility for biological applications with in vivo volumetric calcium imaging of larval zebrafish brain.

  15. Performance Analysis of an Astrophysical Simulation Code on the Intel Xeon Phi Architecture

    OpenAIRE

    Noormofidi, Vahid; Atlas, Susan R.; Duan, Huaiyu

    2015-01-01

    We have developed the astrophysical simulation code XFLAT to study neutrino oscillations in supernovae. XFLAT is designed to utilize multiple levels of parallelism through MPI, OpenMP, and SIMD instructions (vectorization). It can run on both CPU and Xeon Phi co-processors based on the Intel Many Integrated Core Architecture (MIC). We analyze the performance of XFLAT on configurations with CPU only, Xeon Phi only and both CPU and Xeon Phi. We also investigate the impact of I/O and the multi-n...

  16. First vapor explosion calculations performed with MC3D thermal-hydraulic code

    Energy Technology Data Exchange (ETDEWEB)

    Brayer, C.; Berthoud, G. [CEA Centre d`Etudes de Grenoble, 38 (France). Direction des Reacteurs Nucleaires

    1998-01-01

    This paper presents the first calculations performed with the `explosion` module of the multiphase computer code MC3D, which is devoted to the fine fragmentation and explosion phase of a fuel-coolant interaction. A complete description of the physical laws included in this module is given. The fragmentation models, taking into account two fragmentation mechanisms, a thermal one and a hydrodynamic one, are also developed here. Results of some calculations to test the numerical behavior of MC3D and to test the explosion models in 1D or 2D are also presented. (author)

  17. Development of Pflotran Code for Waste Isolation Pilot Plant Performance Assessment

    Science.gov (United States)

    Zeitler, T.; Day, B. A.; Frederick, J.; Hammond, G. E.; Kim, S.; Sarathi, R.; Stein, E.

    2017-12-01

    The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. There is a current effort to enhance WIPP PA capabilities through the further development of the PFLOTRAN software, a state-of-the-art massively parallel subsurface flow and reactive transport code. Benchmark testing of the individual WIPP-specific process models implemented in PFLOTRAN (e.g., gas generation, chemistry, creep closure, actinide transport, and waste form) has been performed, including results comparisons for PFLOTRAN and existing WIPP PA codes. Additionally, enhancements to the subsurface hydrologic flow model have been made. Repository-scale testing has also been performed for the modified PFLOTRAN code and detailed results will be presented. Ultimately, improvements to the current computational environment will result in greater detail and flexibility in the repository model due to a move from a two-dimensional calculation grid to a three-dimensional representation. The result of the effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future for use in compliance recertification applications (CRAs) submitted to the EPA. Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of

  18. The analysis of the annular fuel performance in steady state condition by using AFPAC code

    International Nuclear Information System (INIS)

    He Xiaojun; Ji Songtao; Zhang Yingchao

    2012-01-01

    The fuel performance code AFPAC v1.0 is used to analyze the annular fuel's behavior under steady-state conditions, including neutronics, thermal hydraulics, rod deformation, fission gas release and rod internal pressure. The calculation results show that: 1) annular fuel has good steady irradiation performance at 150% of the power level of current LWRs, with burnup up to 50 GWd/t, and all parameters, such as temperature, rod internal pressure and rod deformation, meet the rod design criteria for current PWR fuel; 2) compared to solid fuel under the same irradiation conditions, annular fuel has lower temperature, smaller deformation, lower fission gas release and lower pressure at EOL. From the point of view of steady irradiation performance, the safety of reactors can be significantly improved by using annular fuel. (authors)

  19. On the performance of diagonal lattice space-time codes for the quasi-static MIMO channel

    KAUST Repository

    Abediseid, Walid

    2013-06-01

    There has been tremendous work done on designing space-time codes for the quasi-static multiple-input multiple-output (MIMO) channel. Code designs to date have focused on high performance, high rates, low-complexity encoding and decoding, or a combination of these criteria. In this paper, we analyze in detail the performance of diagonal lattice space-time codes under lattice decoding. We present both upper and lower bounds on the average error probability. We derive a new closed-form expression for the lower bound using the so-called sphere-packing bound. This bound represents the ultimate performance limit a diagonal lattice space-time code can achieve at any signal-to-noise ratio (SNR). The upper bound is derived using the union bound and demonstrates how the average error probability can be minimized by maximizing the minimum product distance of the code. © 2013 IEEE.
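    The union-bound criterion above can be illustrated with a toy calculation (a sketch under stated assumptions, not the paper's construction): the minimum product distance of a diagonal code is the smallest product of per-antenna magnitude differences over all codeword pairs. The two-antenna repetition code over QPSK below is purely illustrative.

```python
import itertools
import numpy as np

def min_product_distance(codewords):
    """Minimum product distance over all distinct codeword pairs.

    Each codeword is given as the vector of diagonal entries of a
    diagonal space-time codeword; the product distance of a pair is
    the product of the per-antenna magnitude differences.
    """
    best = float("inf")
    for x, y in itertools.combinations(codewords, 2):
        diffs = np.abs(np.asarray(x) - np.asarray(y))
        best = min(best, float(np.prod(diffs)))
    return best

# Toy two-antenna diagonal repetition code over QPSK (illustration only)
qpsk = [np.exp(1j * (np.pi / 4 + k * np.pi / 2)) for k in range(4)]
codewords = [(s, s) for s in qpsk]
print(min_product_distance(codewords))
```

    For this toy code the minimum is attained by adjacent QPSK symbols, giving a product distance of 2; a code maximizing this quantity minimizes the union upper bound.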

  20. Dual throat NaK pump performance evaluation (conceptual design of prototype)

    International Nuclear Information System (INIS)

    Johnson, J.L.

    1972-01-01

    An evaluation of the performance of the dual throat NaK pump for the 5 kWe Reactor TE System is discussed. Performance at nominal operating conditions and at startup conditions is predicted. The basis for selection of this pump design is presented. (U.S.)

  1. Treatment Integrity Enhancement via Performance Feedback Conceptualized as an Exercise in Social Influence

    Science.gov (United States)

    Erchul, William P.

    2013-01-01

    This commentary is in response to the article "Effects of Verbal and Written Performance Feedback on Treatment Adherence" (Kaufman, Codding, Markus, Tryon, & Kyse, this issue). The overall recommendation to those who study treatment integrity using performance feedback methods is to incorporate theories and research on social…

  2. On the performance of diagonal lattice space-time codes for the quasi-static MIMO channel

    KAUST Repository

    Abediseid, Walid; Alouini, Mohamed-Slim

    2013-01-01

    There has been tremendous work done on designing space-time codes for the quasi-static multiple-input multiple-output (MIMO) channel. All the coding design to date focuses on either high-performance, high rates, low complexity encoding and decoding

  3. Bearing performance degradation assessment based on time-frequency code features and SOM network

    International Nuclear Information System (INIS)

    Zhang, Yan; Tang, Baoping; Han, Yan; Deng, Lei

    2017-01-01

    Bearing performance degradation assessment and prognostics are extremely important in supporting maintenance decision and guaranteeing the system’s reliability. To achieve this goal, this paper proposes a novel feature extraction method for the degradation assessment and prognostics of bearings. Features of time-frequency codes (TFCs) are extracted from the time-frequency distribution using a hybrid procedure based on short-time Fourier transform (STFT) and non-negative matrix factorization (NMF) theory. An alternative way to design the health indicator is investigated by quantifying the similarity between feature vectors using a self-organizing map (SOM) network. On the basis of this idea, a new health indicator called time-frequency code quantification error (TFCQE) is proposed to assess the performance degradation of the bearing. This indicator is constructed based on the bearing real-time behavior and the SOM model that is previously trained with only the TFC vectors under the normal condition. Vibration signals collected from the bearing run-to-failure tests are used to validate the developed method. The comparison results demonstrate the superiority of the proposed TFCQE indicator over many other traditional features in terms of feature quality metrics, incipient degradation identification and achieving accurate prediction.
    Highlights:
    • Time-frequency codes are extracted to reflect the signals’ characteristics.
    • SOM network served as a tool to quantify the similarity between feature vectors.
    • A new health indicator is proposed to demonstrate the whole stage of degradation development.
    • The method is useful for extracting the degradation features and detecting the incipient degradation.
    • The superiority of the proposed method is verified using experimental data.
    (paper)
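    The TFCQE idea — scoring a feature vector by its distance to the best-matching unit of a SOM trained only on normal-condition data — can be sketched as follows. The codebook values and feature vectors here are made up for illustration; SOM training and the STFT/NMF feature extraction are omitted.

```python
import numpy as np

def quantization_error(feature, codebook):
    """Distance from a feature vector to its best-matching SOM unit.

    codebook: (n_units, n_features) array of SOM weight vectors
    trained on normal-condition data only; a growing error over time
    indicates departure from the normal condition.
    """
    dists = np.linalg.norm(codebook - feature, axis=1)
    return float(dists.min())

# Illustrative codebook learned from "healthy" features (values made up)
codebook = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
healthy = np.array([0.1, 0.05])
degraded = np.array([3.0, 2.5])
print(quantization_error(healthy, codebook))
print(quantization_error(degraded, codebook))
```

    Tracking this error over the run-to-failure history yields a monotonically interpretable degradation curve, which is the role the TFCQE indicator plays in the paper.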

  4. Proposal for a Conceptual Model for Evaluating Lean Product Development Performance: A Study of LPD Enablers in Manufacturing Companies

    Science.gov (United States)

    Osezua Aikhuele, Daniel; Mohd Turan, Faiz

    2016-02-01

    The instability in today's market and emerging demands for mass-customized products are driving companies to seek cost-effective and time-efficient improvements in their production systems, and this has led to real pressure to adopt new development architectures and operational parameters to remain competitive in the market. Among the development architectures adopted is the integration of lean thinking in the product development process. However, due to a lack of clear understanding of lean performance and its measurement, many companies are unable to implement and fully integrate the lean principle into their product development process. Without proper performance measurement, the performance level of the organizational value stream will be unknown, and the specific areas of improvement relating to the LPD program cannot be tracked, resulting in poor decision making in LPD implementation. This paper therefore presents a conceptual model for evaluating LPD performance by identifying and analysing the core existing LPD enablers (chief engineer, cross-functional teams, set-based engineering, poka-yoke (mistake-proofing), knowledge-based environment, value-focused planning and development, top management support, technology, supplier integration, workforce commitment and continuous improvement culture) for assessing LPD performance.

  5. Performance optimization of PM-16QAM transmission system enabled by real-time self-adaptive coding.

    Science.gov (United States)

    Qu, Zhen; Li, Yao; Mo, Weiyang; Yang, Mingwei; Zhu, Shengxiang; Kilper, Daniel C; Djordjevic, Ivan B

    2017-10-15

    We experimentally demonstrate self-adaptive coded 5×100 Gb/s WDM polarization-multiplexed 16 quadrature amplitude modulation transmission over a 100 km fiber link, enabled by a real-time control plane. The real-time optical signal-to-noise ratio (OSNR) is measured using an optical performance monitoring device. The OSNR measurement is processed and fed back using control plane logic and messaging to the transmitter side for code adaptation, where the binary data are adaptively encoded with three types of large-girth low-density parity-check (LDPC) codes with code rates of 0.8, 0.75, and 0.7. The total code-adaptation latency is measured to be 2273 ms. Compared with transmission without adaptation, average net capacity improvements of 102%, 36%, and 7.5% are obtained, respectively, by adaptive LDPC coding.
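    The control-plane code adaptation can be sketched as a threshold rule mapping the monitored OSNR to one of the three code rates. The OSNR thresholds below are hypothetical placeholders for illustration, not the values used in the experiment.

```python
def select_ldpc_rate(osnr_db, thresholds=((18.0, 0.8), (15.0, 0.75), (0.0, 0.7))):
    """Pick the highest LDPC code rate whose OSNR threshold is met.

    thresholds: (min_osnr_db, rate) pairs in descending OSNR order.
    The threshold values are illustrative placeholders; the experiment
    adapts among rates 0.8, 0.75 and 0.7 based on monitored OSNR.
    """
    for min_osnr, rate in thresholds:
        if osnr_db >= min_osnr:
            return rate
    return thresholds[-1][1]

print(select_ldpc_rate(20.0))  # high OSNR -> rate 0.8
print(select_ldpc_rate(16.0))  # -> 0.75
print(select_ldpc_rate(12.0))  # -> 0.7
```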

  6. Conceptual framework for performance assessment: competency, competence and performance in the context of assessments in healthcare--deciphering the terminology.

    Science.gov (United States)

    Khan, Kamran; Ramachandran, Sankaranarayanan

    2012-01-01

    The definitions of performance, competence and competency are not very clear in the literature. The assessment of performance and the selection of tools for this purpose depend upon a deep understanding of each of the above terms and the factors influencing performance. In this article, we distinguish between competence and competency and explain the relationship of competence and performance in the light of the Dreyfus model of skills acquisition. We briefly critique the application of the principles described by Miller to the modern assessment tools and distinguish between assessment of actual performance in workplace settings and the observed performance, demonstrated by the candidates in the workplace or simulated settings. We describe a modification of the Dreyfus model applicable to assessments in healthcare and propose a new model for the assessment of performance and performance rating scale (PRS) based on this model. We propose that the use of adapted versions of this PRS will result in benchmarking of performance and allowing the candidates to track their progression of skills in various areas of clinical practice.

  7. Conceptual framework for holistic dialysis management based on key performance indicators.

    Science.gov (United States)

    Liu, Hu-Chen; Itoh, Kenji

    2013-10-01

    This paper develops a theoretical framework of holistic hospital management based on performance indicators that can be applied to dialysis hospitals, clinics or departments in Japan. Selection of a key indicator set and its validity tests were performed primarily by a questionnaire survey to dialysis experts as well as their statements obtained through interviews. The expert questionnaire asked respondents to rate the degree of "usefulness" for each of 66 indicators on a three-point scale (19 responses collected). Applying the theoretical framework, we selected a minimum set of key performance indicators for dialysis management that can be used in the Japanese context. The indicator set comprised 27 indicators and items that will be collected through three surveys: patient satisfaction, employee satisfaction, and safety culture. The indicators were confirmed by expert judgment from viewpoints of face, content and construct validity as well as their usefulness. This paper established a theoretical framework of performance measurement for holistic dialysis management from primary healthcare stakeholders' perspectives. In this framework, performance indicators were largely divided into healthcare outcomes and performance shaping factors. Indicators of the former type may be applied for the detection of operational problems or weaknesses in a dialysis hospital, clinic or department, while latent causes of each problem can be more effectively addressed by the latter type of indicators in terms of process, structure and culture/climate within the organization. © 2013 The Authors. Therapeutic Apheresis and Dialysis © 2013 International Society for Apheresis.

  8. Simulating the performance of a distance-3 surface code in a linear ion trap

    Science.gov (United States)

    Trout, Colin J.; Li, Muyuan; Gutiérrez, Mauricio; Wu, Yukai; Wang, Sheng-Tao; Duan, Luming; Brown, Kenneth R.

    2018-04-01

    We explore the feasibility of implementing a small surface code with 9 data qubits and 8 ancilla qubits, commonly referred to as surface-17, using a linear chain of 171Yb+ ions. Two-qubit gates can be performed between any two ions in the chain with gate time increasing linearly with ion distance. Measurement of the ion state by fluorescence requires that the ancilla qubits be physically separated from the data qubits to avoid errors on the data due to scattered photons. We minimize the time required to measure one round of stabilizers by optimizing the mapping of the two-dimensional surface code to the linear chain of ions. We develop a physically motivated Pauli error model that allows for fast simulation and captures the key sources of noise in an ion trap quantum computer including gate imperfections and ion heating. Our simulations showed a consistent requirement of a two-qubit gate fidelity of ≥99.9% for the logical memory to have a better fidelity than physical two-qubit operations. Finally, we perform an analysis of the error subsets from the importance sampling method used to bound the logical error rates to gain insight into which error sources are particularly detrimental to error correction.

  9. Conservative performance analysis of a PWR nuclear fuel rod using the FRAPCON code

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Fabio Branco Vaz de; Sabundjian, Gaiane, E-mail: fabio@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    In this paper, some preliminary results of the sensitivity and conservative analysis of a hypothetical pressurized water reactor fuel rod are presented, using the FRAPCON code as a basic preparation tool for the future transient analysis, which will be carried out with the FRAPTRAN code. Emphasis is given to the evaluation of the cladding behavior, since it is one of the critical containment barriers for the fission products generated during fuel irradiation. Sensitivity analyses were performed by varying the values of some parameters, mainly related to thermal cycle conditions, and taking into account an intermediate value between the realistic and conservative conditions for the linear heat generation rate, as given in the literature. Time lengths were taken from a typical nuclear power plant operational cycle, adjusted to obtain a chosen burnup. Curves of fuel and cladding temperatures, and also of their mechanical and oxidation behavior, as a function of reactor operation time, are presented for each of the nodes considered over the nuclear fuel rod. Analyzing the curves, it was possible to observe the influence of the thermal cycle on the fuel rod performance, in this preliminary step toward the accident/transient analysis. (author)

  10. Performance Modeling and Optimization of a High Energy CollidingBeam Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-06-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems errors are higher at 29%, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.
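    Communication models of this kind are commonly built by fitting a latency/bandwidth model t(m) = α + m/β to micro-benchmark timings. The sketch below uses made-up timing data, not the paper's measurements, and ignores the contention terms the paper shows are essential on torus networks.

```python
import numpy as np

# Fit a latency/bandwidth model  t(m) = alpha + m / beta  to timings,
# illustrating how micro-benchmarks turn into a performance model.
sizes = np.array([1e3, 1e4, 1e5, 1e6])              # message sizes (bytes)
times = np.array([1.2e-5, 2.0e-5, 1.1e-4, 1.0e-3])  # made-up timings (s)

# Least-squares fit of [1, m] against t(m)
A = np.vstack([np.ones_like(sizes), sizes]).T
(alpha, inv_beta), *_ = np.linalg.lstsq(A, times, rcond=None)
print(alpha, 1.0 / inv_beta)  # latency (s), bandwidth (bytes/s)
```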

  11. Performance Modeling and Optimization of a High Energy Colliding Beam Simulation Code

    International Nuclear Information System (INIS)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-01-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems errors are higher at 29%, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.

  12. Comparative performance evaluation of transform coding in image pre-processing

    Science.gov (United States)

    Menon, Vignesh V.; NB, Harikrishnan; Narayanan, Gayathri; CK, Niveditha

    2017-07-01

    We are in the midst of a communication transformation which drives the development and dissemination of pioneering communication systems with ever-increasing fidelity and resolution. Research in image processing techniques has been driven by a growing demand for faster and easier encoding, storage and transmission of visual information. In this paper, the researchers throw light on techniques which can be used at the transmitter end to ease the transmission and reconstruction of images. The researchers investigate the performance of different image transform coding schemes used in pre-processing, compare their effectiveness, and examine the necessary and sufficient conditions, properties and implementation complexity. Motivated by prior advancements in image processing techniques, the researchers compare the performance of various contemporary image pre-processing frameworks: compressed sensing, singular value decomposition, and the integer wavelet transform. The paper exposes the potential of the integer wavelet transform as an efficient pre-processing scheme.
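    The SVD route (one of the three frameworks compared) can be sketched as a truncated low-rank reconstruction; the "image" here is random stand-in data, not the paper's test set.

```python
import numpy as np

def svd_compress(image, rank):
    """Rank-`rank` approximation of a grayscale image via truncated SVD."""
    u, s, vt = np.linalg.svd(image, full_matrices=False)
    return u[:, :rank] @ np.diag(s[:rank]) @ vt[:rank]

rng = np.random.default_rng(0)
img = rng.random((32, 32))  # random stand-in for a grayscale image
errors = {r: np.linalg.norm(img - svd_compress(img, r)) / np.linalg.norm(img)
          for r in (2, 8, 16)}
print(errors)  # relative error falls as the retained rank grows
```

    By the Eckart–Young theorem the truncated SVD is the optimal low-rank approximation in the Frobenius norm, which is why it serves as a natural baseline among pre-processing schemes.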

  13. MIMO-OFDM System's Performance Using LDPC Codes for a Mobile Robot

    Science.gov (United States)

    Daoud, Omar; Alani, Omar

    This work deals with the performance of a Sniffer Mobile Robot (SNFRbot)-based spatial multiplexed wireless Orthogonal Frequency Division Multiplexing (OFDM) transmission technology. The use of Multi-Input Multi-Output (MIMO)-OFDM technology increases the wireless transmission rate without increasing transmission power or bandwidth. A generic multilayer architecture of the SNFRbot is proposed with low power and low cost. Some experimental results are presented and show the efficiency of sniffing deadly gases, sensing high temperatures and sending live videos of the monitored situation. Moreover, simulation results show the performance achieved by tackling the Peak-to-Average Power Ratio (PAPR) problem of the used technology using Low Density Parity Check (LDPC) codes, and the effect of combating the PAPR on the bit error rate (BER) and the signal-to-noise ratio (SNR) over a Doppler spread channel.

  14. Graphical User Interface for the NASA FLOPS Aircraft Performance and Sizing Code

    Science.gov (United States)

    Lavelle, Thomas M.; Curlett, Brian P.

    1994-01-01

    XFLOPS is an X-Windows/Motif graphical user interface for the aircraft performance and sizing code FLOPS. This new interface simplifies entering data and analyzing results, thereby reducing analysis time and errors. Data entry is simpler because input windows are used for each of the FLOPS namelists. These windows contain fields to input the variable's values along with help information describing the variable's function. Analyzing results is simpler because output data are displayed rapidly. This is accomplished in two ways. First, because the output file has been indexed, users can view particular sections with the click of a mouse button. Second, because menu picks have been created, users can plot engine and aircraft performance data. In addition, XFLOPS has a built-in help system and complete on-line documentation for FLOPS.

  15. Climate for Innovation impacts on Adaptive Performance. Conceptualization, Measurement, and Validation

    Directory of Open Access Journals (Sweden)

    Stańczyk Sylwia

    2017-05-01

    Full Text Available The main objective of this paper was to examine the relationship between organizational climate for innovation and adaptive performance. The study was carried out in business organisations in Poland (N=387) representing a variety of industries. The Climate for Innovation measure and the Individual Adaptive Performance measure were adopted from previous studies. The results of the research indicate that certain dimensions of the organizational climate for innovation are interrelated with adaptive performance, especially supervisory encouragement. The study discusses some aspects concerning the adaptation of existing instruments and measurements; on the basis of the research presented, we indicate that, in general, the adaptation of the measurements was relatively effective. The questionnaire was assessed to be valid in terms of content for researching climate for innovation and adaptive performance in Poland.

  16. Performance Evaluation of SMART Passive Safety System for Small Break LOCA Using MARS Code

    International Nuclear Information System (INIS)

    Chun, Ji Han; Lee, Guy Hyung; Bae, Kyoo Hwan; Chung, Young Jong; Kim, Keung Koo

    2013-01-01

    SMART has significantly enhanced safety by reducing its core damage frequency to 1/10 that of a conventional nuclear power plant. KAERI is developing a passive safety injection system to replace the active safety injection pump in SMART. It consists of four trains, each of which includes a gravity-driven core makeup tank (CMT) and safety injection tank (SIT). This system is required to meet the passive safety performance requirements, i.e., the capability to maintain a safe shutdown condition for a minimum of 72 hours without an AC power supply or operator action in the case of design basis accidents (DBAs). The CMT isolation valve is opened by the low pressurizer pressure signal, and the SIT isolation valve is opened at 2 MPa. Additionally, two stages of automatic depressurization systems are used for rapid depressurization. A preliminary safety analysis of the SMART passive safety system in the event of a small-break loss-of-coolant accident (SBLOCA) was performed using the MARS code. In this study, the safety analysis results for a guillotine break of the safety injection line, identified as the limiting SBLOCA in SMART, are given. The analysis results of the most limiting SI line guillotine break showed that the collapsed liquid level inside the core support barrel was maintained sufficiently high above the top of the core throughout the transient. This means that the passive safety injection flow from the CMT and SIT causes no core uncovery during the 72 hours following the break with no AC power supply or operator action, which in turn results in a consistent decrease in the fuel cladding temperature. Therefore, the SMART passive safety system can meet the passive safety performance requirement of maintaining the plant at a safe shutdown condition for a minimum of 72 hours without AC power or operator action for the representative SBLOCA accident.

  17. Analysis of Competencies, Job Satisfaction and Organizational Commitment as Indicators of Job Performance: A Conceptual Framework

    Science.gov (United States)

    Khan, Asad; Masrek, Mohamad Noorman; Nadzar, Fuziah Mohamad

    2015-01-01

    Like other disciplines, organizational and technological innovations have influenced the standard philosophies of librarianship. These innovations have changed the basics of information retrieval and delivery in libraries. As a result, library authorities are demanding competency-based job performance. Nonetheless, there is a scarcity of research…

  18. Conceptual Paradigms and Empirical Investigations for Evaluating INTELSAT's Past Performance and Future Prospects.

    Science.gov (United States)

    Snow, Marcellus S.

    This paper summarizes the results of a recent study of the past performance and future prospects of the International Telecommunications Satellite (INTELSAT) Organization. First, an overview of INTELSAT's history is provided and major policy issues are detailed. Five alternative paradigms are then presented through which to evaluate INTELSAT's…

  19. Conceptual representation of verbs in bilinguals: semantic field effects and a second-language performance paradox.

    Science.gov (United States)

    Segalowitz, Norman; de Almeida, Roberto G

    2002-01-01

    It is well known that bilinguals perform better in their first language (L1) than in their second language (L2) in a wide range of linguistic tasks. In recent studies, however, the authors have found that bilingual participants can demonstrate faster response times to L1 stimuli than to L2 stimuli in one classification task and the reverse in a different classification task. In the current study, they investigated the reasons for this "L2-better-than-L1" effect. English-French bilinguals performed one word relatedness and two categorization tasks with verbs of motion (e.g., run) and psychological verbs (e.g., admire) in both languages. In the word relatedness task, participants judged how closely related pairs of verbs from both categories were. In a speeded semantic categorization task, participants classified the verbs according to their semantic category (psychological or motion). In an arbitrary classification task, participants had to learn how verbs had been assigned to two arbitrary categories. Participants performed better in L1 in the semantic classification task but paradoxically better in L2 in the arbitrary classification task. To account for these effects, the authors used the ratings from the word relatedness task to plot three-dimensional "semantic fields" for the verbs. Cross-language field differences were found to be significantly related to the paradoxical performance and to fluency levels. The results have implications for understanding of how bilinguals represent verbs in the mental lexicon. Copyright 2002 Elsevier Science (USA).

  20. Foreign Language Listening Anxiety and Listening Performance: Conceptualizations and Causal Relationships

    Science.gov (United States)

    Zhang, Xian

    2013-01-01

    This study used structural equation modeling to explore the possible causal relations between foreign language (English) listening anxiety and English listening performance. Three hundred participants learning English as a foreign language (FL) completed the foreign language listening anxiety scale (FLLAS) and IELTS test twice with an interval of…

  1. Performance assessment of sealing systems. Conceptual and integrated modelling of plugs and seals

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Andre; Buhmann, Dieter; Kindlein, Jonathan; Lauke, Thomas

    2016-08-15

    The long-time isolation of radionuclides from the biosphere is the goal of the storage of radioactive waste in deep geological repositories. For repositories in rock salt, this goal is achieved on the one hand by the impermeable undisturbed part of the salt host rock formation and on the other hand by crushed salt, which is used to backfill the mine openings in the emplacement areas and galleries created during the construction of the repository. The crushed salt backfill is compacted over time and achieves a sufficiently high hydraulic resistance to avoid inflow of brines into the emplacement areas of the repository in the long-term. Plugs and seals must additionally provide their sealing function during the early post closure phase, until the compaction of the backfill is adequate and the permeability of the backfill is sufficiently low. To assess the future development of the waste repository, an adequate knowledge of the material behaviour is necessary and related mathematical models must be developed to be able to perform predictions on the long-term safety of the repository. An integrated performance assessment model was formulated that describes the long-term behaviour of a sealing built from salt concrete. The average permeability of the sealing changes with time after its emplacement from various processes of which two were regarded in a constitutive model: first, the healing of the EDZ in the host rock around the sealing, and second, the corrosion of the salt concrete material resulting from brine attack. Empirical parameter model functions were defined for both processes to reflect the actual behaviour. The mathematical model was implemented in the integrated performance assessment model LOPOS which is used by GRS as near-field model for repositories in salt. 
Deterministic and probabilistic calculations were performed with realistic parameters showing how the permeability of the sealing decreases during the first 2 000 years due to the healing of the EDZ.

  2. Performance assessment of sealing systems. Conceptual and integrated modelling of plugs and seals

    International Nuclear Information System (INIS)

    Ruebel, Andre; Buhmann, Dieter; Kindlein, Jonathan; Lauke, Thomas

    2016-08-01

    The long-time isolation of radionuclides from the biosphere is the goal of the storage of radioactive waste in deep geological repositories. For repositories in rock salt, this goal is achieved on the one hand by the impermeable undisturbed part of the salt host rock formation and on the other hand by crushed salt, which is used to backfill the mine openings in the emplacement areas and galleries created during the construction of the repository. The crushed salt backfill is compacted over time and achieves a sufficiently high hydraulic resistance to avoid inflow of brines into the emplacement areas of the repository in the long-term. Plugs and seals must additionally provide their sealing function during the early post closure phase, until the compaction of the backfill is adequate and the permeability of the backfill is sufficiently low. To assess the future development of the waste repository, an adequate knowledge of the material behaviour is necessary and related mathematical models must be developed to be able to perform predictions on the long-term safety of the repository. An integrated performance assessment model was formulated that describes the long-term behaviour of a sealing built from salt concrete. The average permeability of the sealing changes with time after its emplacement from various processes of which two were regarded in a constitutive model: first, the healing of the EDZ in the host rock around the sealing, and second, the corrosion of the salt concrete material resulting from brine attack. Empirical parameter model functions were defined for both processes to reflect the actual behaviour. The mathematical model was implemented in the integrated performance assessment model LOPOS which is used by GRS as near-field model for repositories in salt. 
Deterministic and probabilistic calculations were performed with realistic parameters showing how the permeability of the sealing decreases during the first 2 000 years due to the healing of the EDZ.

  3. Performance of Different OCDMA Codes with FWM and XPM Nonlinear Effects

    Science.gov (United States)

    Rana, Shivani; Gupta, Amit

    2017-08-01

    In this paper, a 1 Gb/s nonlinear optical code division multiple access system has been simulated and modeled. To reduce multiple-user interference, a multi-diagonal (MD) code, which possesses the property of zero cross-correlation, has been deployed. The MD code shows better results than the Walsh-Hadamard and multi-weight codes under the nonlinear effects of four-wave mixing (FWM) and cross-phase modulation (XPM). The simulation results reveal that the effect of FWM is reduced when MD codes are employed as compared to other codes.
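    The zero cross-correlation property of MD codes can be sketched with a simplified diagonal-block construction (illustrative only; not necessarily the exact mapping used in the paper): each user's chips occupy positions no other user touches, so pairwise inner products vanish.

```python
import numpy as np

def multi_diagonal_code(n_users, weight):
    """Toy zero-cross-correlation code matrix in the spirit of MD codes.

    Each user gets `weight` chip positions that no other user touches
    (one position per diagonal block), so any two codewords have zero
    cross-correlation. Simplified construction for illustration.
    """
    length = n_users * weight
    codes = np.zeros((n_users, length), dtype=int)
    for u in range(n_users):
        for b in range(weight):
            codes[u, b * n_users + u] = 1  # user u's slot in block b
    return codes

codes = multi_diagonal_code(n_users=4, weight=3)
cross = codes @ codes.T
print(cross)  # diagonal entries = weight, off-diagonal entries = 0
```

    Zero cross-correlation is what suppresses multiple-user interference at the correlator output, independent of how many users are active.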

  4. Effects of intellectual capital and knowledge management on Thai food industry organizational performance: A conceptual perspective

    Directory of Open Access Journals (Sweden)

    Pimsara Yaklai

    2017-01-01

    Full Text Available Due to rising urbanization and an expanding middle class, Thailand will rely more on processed foods. This is combined with a 22% export market to other Southeast Asian nations, as well as China and Japan. With an abundance of natural resources combined with significant investments in technology, food safety, R&D, and more efforts at adhering to international quality standards, Thailand has become the ‘Kitchen of the World’, becoming the largest sole net food exporter in Asia. The researchers therefore aspire to develop a structural equation model of the effects of intellectual capital and knowledge management on Thai food industry organizational performance, using both quantitative and qualitative research methods. Questionnaires using a 7-point Likert scale are to be analysed using SEM techniques, with focus given to financial performance, internal processes, and learning and growth, and their contributions to enhancing the industry’s global competitiveness and sustainability.

  5. Levels & Barriers to Supply Chain Integration: A conceptual model of Supply Chain Performance

    OpenAIRE

    RajaIrfan Sabir; Muhammad Irfan

    2014-01-01

In the modern business scenario, the supply chain has become the backbone of every business organization. All supply chain partners are joined together in a company's value delivery network such that no partner can perform well without the support of the others. The ultimate objective of this cohesive relationship is to deliver value to customers and attain the desired state of customer satisfaction & loyalty for the organization. For this purpose it is necessary to integrate the internal and external partners of Supply c...

  6. Iterative linear solvers in a 2D radiation-hydrodynamics code: Methods and performance

    International Nuclear Information System (INIS)

    Baldwin, C.; Brown, P.N.; Falgout, R.; Graziani, F.; Jones, J.

    1999-01-01

Computer codes containing both hydrodynamics and radiation play a central role in simulating both astrophysical and inertial confinement fusion (ICF) phenomena. A crucial aspect of these codes is that they require an implicit solution of the radiation diffusion equations. The authors present in this paper the results of a comparison of five different linear solvers on a range of complex radiation and radiation-hydrodynamics problems. The linear solvers used are diagonally scaled conjugate gradient, GMRES with incomplete LU preconditioning, conjugate gradient with incomplete Cholesky preconditioning, multigrid, and multigrid-preconditioned conjugate gradient. These problems involve shock propagation, opacities varying over 5--6 orders of magnitude, tabular equations of state, and dynamic ALE (Arbitrary Lagrangian Eulerian) meshes. They perform a problem size scalability study by comparing linear solver performance over a wide range of problem sizes from 1,000 to 100,000 zones. The fundamental question they address in this paper is: Is it more efficient to invert the matrix in many inexpensive steps (like diagonally scaled conjugate gradient) or in fewer expensive steps (like multigrid)? In addition, what is the answer to this question as a function of problem size, and is the answer problem dependent? They find that the diagonally scaled conjugate gradient method performs poorly with the growth of problem size, increasing in both iteration count and overall CPU time with the size of the problem and also increasing for larger time steps. For all problems considered, the multigrid algorithms scale almost perfectly (i.e., the iteration count is approximately independent of problem size and problem time step). For pure radiation flow problems (i.e., no hydrodynamics), they see speedups in CPU time of factors of ∼15--30 for the largest problems, when comparing the multigrid solvers relative to diagonally scaled conjugate gradient.
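The "many inexpensive steps" end of the trade-off, diagonally scaled (Jacobi-preconditioned) conjugate gradient, can be sketched as follows. This is a generic textbook implementation, not code from the paper; the helper name and the 1-D test matrix are assumptions for illustration:

```python
import numpy as np

def diag_scaled_cg(A, b, tol=1e-10, max_iter=1000):
    # Conjugate gradient with diagonal (Jacobi) preconditioning:
    # each iteration is cheap, but the iteration count grows with
    # problem size, which is the behaviour the paper contrasts
    # with the nearly size-independent iteration counts of multigrid.
    M_inv = 1.0 / np.diag(A)          # inverse of the diagonal preconditioner
    x = np.zeros_like(b)
    r = b - A @ x                     # initial residual
    z = M_inv * r                     # preconditioned residual
    p = z.copy()
    rz = r @ z
    for k in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, k + 1
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p     # update search direction
        rz = rz_new
    return x, max_iter

# Example: a small 1-D diffusion-like SPD system (tridiagonal Laplacian).
n = 50
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x, iters = diag_scaled_cg(A, b)
```

On a matrix like this, repeating the run at larger `n` shows the iteration count climbing with problem size, the scaling behaviour the study measures.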

  7. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes.

  8. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    International Nuclear Information System (INIS)

    1997-03-01

This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes.

  9. Measured performances on vectorization and multitasking with a Monte Carlo code for neutron transport problems

    International Nuclear Information System (INIS)

    Chauvet, Y.

    1985-01-01

This paper summarizes two improvements of a real production code achieved by using vectorization and multitasking techniques. After a short description of the Monte Carlo algorithms employed in neutron transport problems, the authors briefly describe the work done in order to obtain a vector code. Vectorization principles are presented and measured performances on the CRAY 1S, CYBER 205 and CRAY X-MP are compared in terms of vector lengths. The second part of this work is an adaptation to multitasking on the CRAY X-MP using exclusively the standard multitasking tools available with FORTRAN under the COS 1.13 system. Two examples are presented. The goal of the first one is to measure the overhead inherent to multitasking when tasks become too small and to define a granularity threshold, that is to say a minimum size for a task. With the second example they propose a method that is very X-MP oriented in order to get the best speedup factor on such a computer. In conclusion they show that Monte Carlo algorithms are very well suited to future vector and parallel computers

  10. Performance of the improved version of Monte Carlo Code A3MCNP for cask shielding design

    International Nuclear Information System (INIS)

    Hasegawa, T.; Ueki, K.; Sato, O.; Sjoden, G.E.; Miyake, Y.; Ohmura, M.; Haghighat, A.

    2004-01-01

A3MCNP (Automatic Adjoint Accelerated MCNP) is a revised version of the MCNP Monte Carlo code that automatically prepares variance reduction parameters for the CADIS (Consistent Adjoint Driven Importance Sampling) methodology. Using a deterministic ''importance'' (or adjoint) function, CADIS performs source and transport biasing within the weight-window technique. The current version of A3MCNP uses the 3-D Sn transport code TORT to determine a 3-D importance function distribution. Based on simulations of several real-life problems, it is demonstrated that A3MCNP provides precise calculation results with a remarkably short computation time by using proper and objective variance reduction parameters. However, since the first version of A3MCNP provided only a point source configuration option for large-scale shielding problems, such as spent-fuel transport casks, a large amount of memory may be necessary to store enough points to properly represent the source. Hence, we have developed an improved version of A3MCNP (referred to as A3MCNPV) which has a volumetric source configuration option. This paper describes the successful use of A3MCNPV for cask neutron and gamma-ray shielding problems

  11. Measured performances on vectorization and multitasking with a Monte Carlo code for neutron transport problems

    International Nuclear Information System (INIS)

    Chauvet, Y.

    1985-01-01

This paper summarizes two improvements of a real production code achieved by using vectorization and multitasking techniques. After a short description of the Monte Carlo algorithms employed in our neutron transport problems, we briefly describe the work we have done in order to obtain a vector code. Vectorization principles are presented and measured performances on the CRAY 1S, CYBER 205 and CRAY X-MP are compared in terms of vector lengths. The second part of this work is an adaptation to multitasking on the CRAY X-MP using exclusively the standard multitasking tools available with FORTRAN under the COS 1.13 system. Two examples are presented. The goal of the first one is to measure the overhead inherent to multitasking when tasks become too small and to define a granularity threshold, that is to say a minimum size for a task. With the second example we propose a method that is very X-MP oriented in order to get the best speedup factor on such a computer. In conclusion we prove that Monte Carlo algorithms are very well suited to future vector and parallel computers. (orig.)

  12. Development of free-piston Stirling engine performance and optimization codes based on Martini simulation technique

    Science.gov (United States)

    Martini, William R.

    1989-01-01

A FORTRAN computer code is described that could be used to design and optimize a free-displacer, free-piston Stirling engine similar to the RE-1000 engine made by Sunpower. The code contains options for specifying displacer and power piston motion or for allowing these motions to be calculated by a force balance. The engine load may be a dashpot, inertial compressor, hydraulic pump or linear alternator. Cycle analysis may be done by isothermal analysis or adiabatic analysis. Adiabatic analysis may be done using the Martini moving gas node analysis or the Rios second-order Runge-Kutta analysis. Flow loss and heat loss equations are included. Graphical displays of engine motions, pressures, and temperatures are included. Programming for optimizing up to 15 independent dimensions is included. Sample performance results are shown for both specified and unconstrained piston motions; these results are shown as generated by each of the two Martini analyses. Two sample optimization searches are shown using specified piston motion isothermal analysis. One is for three adjustable inputs and one is for four. Also, two optimization searches for calculated piston motion are presented for three and for four adjustable inputs. The effect of leakage is evaluated. Suggestions for further work are given.

  13. FEMAXI-III. An axisymmetric finite element computer code for the analysis of fuel rod performance

    International Nuclear Information System (INIS)

    Ichikawa, M.; Nakajima, T.; Okubo, T.; Iwano, Y.; Ito, K.; Kashima, K.; Saito, H.

    1980-01-01

    For the analysis of local deformation of fuel rods, which is closely related to PCI failure in LWR, FEMAXI-III has been developed as an improved version based on the essential models of FEMAXI-II, MIPAC, and FEAST codes. The major features of FEMAXI-III are as follows: Elasto-plasticity, creep, pellet cracking, relocation, densification, hot pressing, swelling, fission gas release, and their interrelated effects are considered. Contact conditions between pellet and cladding are exactly treated, where sliding or sticking is defined by iterations. Special emphasis is placed on creep and pellet cracking. In the former, an implicit algorithm is applied to improve numerical stability. In the latter, the pellet is assumed to be non-tension material. The recovery of pellet stiffness under compression is related to initial relocation. Quadratic isoparametric elements are used. The skyline method is applied to solve linear stiffness equation to reduce required core memories. The basic performance of the code has been proven to be satisfactory. (author)

  14. Development of an object-oriented simulation code for repository performance assessment

    International Nuclear Information System (INIS)

    Tsujimoto, Keiichi; Ahn, J.

    1999-01-01

    As understanding for mechanisms of radioactivity confinement by a deep geologic repository improves at the individual process level, it has become imperative to evaluate consequences of individual processes to the performance of the whole repository system. For this goal, the authors have developed a model for radionuclide transport in, and release from, the repository region by incorporating multiple-member decay chains and multiple waste canisters. A computer code has been developed with C++, an object-oriented language. By utilizing the feature that a geologic repository consists of thousands of objects of the same kind, such as the waste canister, the repository region is divided into multiple compartments and objects for simulation of radionuclide transport. Massive computational tasks are distributed over, and executed by, multiple networked workstations, with the help of parallel virtual machine (PVM) technology. Temporal change of the mass distribution of 28 radionuclides in the repository region for the time period of 100 million yr has been successfully obtained by the code

  15. A conceptual framework for achieving performance enhancing drug compliance in sport.

    Science.gov (United States)

    Donovan, Robert J; Egger, Garry; Kapernick, Vicki; Mendoza, John

    2002-01-01

There has been, and continues to be, widespread international concern about athletes' use of banned performance enhancing drugs (PEDs). This concern culminated in the formation of the World Anti-Doping Agency (WADA) in November 1999. To date, the main focus on controlling the use of PEDs has been on testing athletes and the development of tests to detect usage. Although athletes' beliefs and values are known to influence whether or not an athlete will use drugs, little is known about athletes' beliefs and attitudes, and the limited empirical literature shows little use of behavioural science frameworks to guide research methodology, results interpretation, and intervention implications. Mindful of this in preparing its anti-doping strategy for the 2000 Olympics, the Australian Sports Drug Agency (ASDA) in 1997 commissioned a study to assess the extent to which models of attitude-behaviour change in the public health/injury prevention literature had useful implications for compliance campaigns in the sport drug area. A preliminary compliance model was developed from three behavioural science frameworks: social cognition models; threat (or fear) appeals; and instrumental and normative approaches. A subsequent review of the performance enhancing drug literature confirmed that the overall framework was consistent with known empirical data, and therefore had at least face validity if not construct validity. The overall model showed six major inputs to an athlete's attitudes and intentions with respect to performance enhancing drug usage: personality factors, threat appraisal, benefit appraisal, reference group influences, personal morality and legitimacy. The model demonstrated that a comprehensive, fully integrated programme is necessary for maximal effect, and provides anti-doping agencies with a structured framework for strategic planning and implementing interventions. Programmes can be developed in each of the six major areas, with allocation of resources to each.

  16. Conceptualizing Success And Performance for Adult Learners: Merging the Contexts of Adult Education and Professional Training

    Directory of Open Access Journals (Sweden)

    Rieger Gil

    2017-12-01

Full Text Available This article examines the concepts of learning performance and learning success within the context of adult learners. The focus is on how these concepts have been defined in the literature, how they are anchored in different didactic theories, and how they can effectively be applied to learning contexts with adults. Due to the divergent approaches and definitions in the literature, this article compares, categorizes and merges the literature, providing an overview and recommendations for practice. The overview offers a critical examination of constructivism-based approaches compared to other didactic learning theories such as cognitivism or behaviorism. Adult education presents itself as a dynamic area that can develop progressively, in both the professional and educational environments. Nowadays, it is important to be able to collect and use information quickly. This makes it possible to gain an advantage and deal with problems or questions in more focused ways. One must deal with increasing demands and a growing number of competitors, not only in professional life. A synthesis of the literature can be presented by examining the terms of learning performance and learning success in different approaches, regarding implementations, definitions, historical developments as well as continuative and connected concepts, tendencies or points of view.

  17. A Study of Performance in Low-Power Tokamak Reactor with Integrated Predictive Modeling Code

    International Nuclear Information System (INIS)

    Pianroj, Y.; Onjun, T.; Suwanna, S.; Picha, R.; Poolyarat, N.

    2009-07-01

Full text: A fusion hybrid, or a tokamak reactor with a small fusion power output, is presented as another useful application of nuclear fusion. Such a tokamak can be used for fuel breeding, high-level waste transmutation, hydrogen production at high temperature, and testing of nuclear fusion technology components. In this work, an investigation of the plasma performance in a small-fusion-power-output design is carried out using the BALDUR predictive integrated modeling code. The simulations of the plasma performance in this design are carried out using the empirical-based Mixed Bohm/gyro-Bohm (B/gB) model, whereas the pedestal temperature model is based on a magnetic and flow shear stabilization pedestal width scaling (δ ∝ ρs²). The preliminary results using this core transport model show that the central ion and electron temperatures are rather pessimistic. To improve the performance, an optimization approach is carried out by varying some parameters, such as the plasma current and the auxiliary heating power, which results in some improvement of plasma performance

  18. Nuclear code case development of printed-circuit heat exchangers with thermal and mechanical performance testing

    Energy Technology Data Exchange (ETDEWEB)

    Aakre, Shaun R. [Univ. of Wisconsin, Madison, WI (United States). Dept. of Mechanical Engineering; Jentz, Ian W. [Univ. of Wisconsin, Madison, WI (United States). Dept. of Mechanical Engineering; Anderson, Mark H. [Univ. of Wisconsin, Madison, WI (United States). Dept. of Mechanical Engineering

    2018-03-27

The U.S. Department of Energy has agreed to fund a three-year integrated research project to close technical gaps involved with compact heat exchangers to be used in nuclear applications. This paper introduces the goals of the project, the research institutions, and industrial partners working in collaboration to develop a draft Boiler and Pressure Vessel Code Case for this technology. Heat exchanger testing, as well as non-destructive and destructive evaluation, will be performed by researchers across the country to understand the performance of compact heat exchangers. Testing will be performed using coolants and conditions proposed for Gen IV Reactor designs. Preliminary observations of the mechanical failure mechanisms of the heat exchangers using destructive and non-destructive methods are presented. Unit-cell finite element models assembled to help predict the mechanical behavior of these high-temperature components are discussed as well. Performance testing methodology is laid out in this paper along with preliminary modeling results, an introduction to x-ray and neutron inspection techniques, and results from a recent pressurization test of a printed-circuit heat exchanger. The operational and quality assurance knowledge gained from these models and validation tests will be useful to developers of supercritical CO2 systems, which commonly employ printed-circuit heat exchangers.

  19. Performance analysis of linear codes under maximum-likelihood decoding: a tutorial

    National Research Council Canada - National Science Library

    Sason, Igal; Shamai, Shlomo

    2006-01-01

    ..., upper and lower bounds on the error probability of linear codes under ML decoding are surveyed and applied to codes and ensembles of codes on graphs. For upper bounds, we discuss various bounds where focus is put on Gallager bounding techniques and their relation to a variety of other reported bounds. Within the class of lower bounds, we ad...
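A standard instance of the upper bounds this tutorial surveys is the union bound on the word error probability of a linear code under ML decoding with BPSK signalling over an AWGN channel, P_e ≤ Σ_d A_d Q(√(2 d R E_b/N_0)), where A_d is the number of codewords of Hamming weight d and R = k/n. The sketch below evaluates it for the (7,4) Hamming code's weight enumerator; the function name is an assumption, and this is the plain union bound rather than the tighter Gallager-style bounds the paper focuses on:

```python
import math

def union_bound_ml(weight_enum, n, k, ebno_db):
    # Union (upper) bound on word error probability of a linear block
    # code under ML decoding with BPSK over AWGN:
    #   P_e <= sum_{d>=1} A_d * Q(sqrt(2 * d * R * Eb/N0))
    # weight_enum maps Hamming weight d -> number of codewords A_d.
    R = k / n
    ebno = 10 ** (ebno_db / 10)                       # dB -> linear
    Q = lambda x: 0.5 * math.erfc(x / math.sqrt(2))   # Gaussian tail
    return sum(A_d * Q(math.sqrt(2 * d * R * ebno))
               for d, A_d in weight_enum.items() if d > 0)

# (7,4) Hamming code weight enumerator: 1 + 7x^3 + 7x^4 + x^7
hamming_74 = {0: 1, 3: 7, 4: 7, 7: 1}
pe = union_bound_ml(hamming_74, n=7, k=4, ebno_db=6.0)
```

At moderate-to-high SNR the bound is dominated by the minimum-distance term, which is why the tutorial's tighter bounds matter mainly in the low-SNR region where the plain union bound becomes loose.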

  20. Improvement of implicit finite element code performance in deep drawing simulations by dynamics contributions

    NARCIS (Netherlands)

    Meinders, Vincent T.; van den Boogaard, Antonius H.; Huetink, Han

    2003-01-01

    To intensify the use of implicit finite element codes for solving large scale problems, the computation time of these codes has to be decreased drastically. A method is developed which decreases the computational time of implicit codes by factors. The method is based on introducing inertia effects

  1. A Conceptual Model of Excellent Performance Mode of Port Enterprise Logistics Management

    Directory of Open Access Journals (Sweden)

    Jiang He

    2017-11-01

Full Text Available As one of the key hubs of international logistics, the port has become a main component and base of global logistics management, and port enterprises play an important role in the global supply chain. However, due to a lack of understanding of port supply chain management and of coordination between port enterprises, the integration of business processes is not perfect, information sharing between the various organizations is lacking, and port enterprises usually fail to fully play their positive role. On this basis, the paper takes port enterprises as the research object and introduces the excellent performance mode into port enterprises, in order to study how port enterprises can carry out effective quality management and form the coordination and integration of the upstream and downstream of enterprises, so as to realize competitive advantage in port logistics.

  2. Systemizers Are Better Code-Breakers: Self-Reported Systemizing Predicts Code-Breaking Performance in Expert Hackers and Naïve Participants

    Science.gov (United States)

    Harvey, India; Bolgan, Samuela; Mosca, Daniel; McLean, Colin; Rusconi, Elena

    2016-01-01

    Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders—attention to detail and systemizing—may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings. PMID:27242491

  3. Systemizers Are Better Code-Breakers: Self-Reported Systemizing Predicts Code-Breaking Performance in Expert Hackers and Naïve Participants.

    Science.gov (United States)

    Harvey, India; Bolgan, Samuela; Mosca, Daniel; McLean, Colin; Rusconi, Elena

    2016-01-01

    Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders-attention to detail and systemizing-may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings.

  4. A conceptual framework for future-proofing the energy performance of buildings

    International Nuclear Information System (INIS)

    Georgiadou, Maria Christina; Hacking, Theophilus; Guthrie, Peter

    2012-01-01

    This paper presents a review undertaken to understand the concept of ‘future-proofing’ the energy performance of buildings. The long lifecycles of the building stock, the impacts of climate change and the requirements for low carbon development underline the need for long-term thinking from the early design stages. ‘Future-proofing’ is an emerging research agenda with currently no widely accepted definition amongst scholars and building professionals. In this paper, it refers to design processes that accommodate explicitly full lifecycle perspectives and energy trends and drivers by at least 2050, when selecting energy efficient measures and low carbon technologies. A knowledge map is introduced, which explores the key axes (or attributes) for achieving a ‘future-proofed’ energy design; namely, coverage of sustainability issues, lifecycle thinking, and accommodating risks and uncertainties that affect the energy consumption. It is concluded that further research is needed so that established building energy assessment methods are refined to better incorporate future-proofing. The study follows an interdisciplinary approach and is targeted at design teams with aspirations to achieve resilient and flexible low-energy buildings over the long-term. - Highlights: ► We examine the concept of ‘future-proofing’ the energy performance of buildings. ► It reconciles sustainability issues, lifecycle thinking, risks and uncertainties. ► A knowledge map with axes and types of ‘future-proofed’ solutions is presented. ► The energy design process should adopt full lifecycle considerations. ► Design for flexibility, use of dynamic models and futures techniques are suggested.

  5. Numerical verification of equilibrium chemistry software within nuclear fuel performance codes

    International Nuclear Information System (INIS)

    Piro, M.H.; Lewis, B.J.; Thompson, W.T.; Simunovic, S.; Besmann, T.M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing transport source terms, material properties, and boundary conditions in heat and mass transport modules. Consequently, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method called the Gibbs Criteria is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes. (author)

  6. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran

    Directory of Open Access Journals (Sweden)

    Sara Moghaddam

    2013-08-01

Full Text Available Introduction: The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care more than routine care. Today, professional ethics codes have been defined worldwide on the basis of the human and ethical issues in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethics codes. The aim of this study is to assess knowledge and performance of nursing ethics codes from nurses' and patients' perspectives. Methods: A cross-sectional comparative study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics, independent t-test, ANOVA, and the Pearson correlation coefficient in SPSS 13. Results: Most of the nurses were female, married, and educated to BS degree; 86.4% of them were aware of the ethics codes. In addition, 91.9% of nurses and 41.8% of patients stated that nurses respect the ethics codes. Nurses' and patients' perspectives on the ethics codes differed significantly. A significant relationship was found between nurses' knowledge of ethics codes and job satisfaction and complaints about ethical performance. Conclusion: According to the results, attention to teaching ethics codes in the nursing curriculum for students and continuous education for staff is proposed; on the other hand, recognizing failures of the health system, optimizing nursing care, attempting to inform patients about nursing ethics codes, promoting patient rights, and achieving patient satisfaction can minimize the differences between the two perspectives.

  7. Performance assessment of new neutron cross section libraries using MCNP code and some critical benchmarks

    International Nuclear Information System (INIS)

    Bakkari, B El; Bardouni, T El.; Erradi, L.; Chakir, E.; Meroun, O.; Azahra, M.; Boukhal, H.; Khoukhi, T El.; Htet, A.

    2007-01-01

    Full text: New releases of nuclear data files have been made available in recent years. The reference MCNP5 code (1) for Monte Carlo calculations is usually distributed with only one standard nuclear data library for neutron interactions, based on ENDF/B-VI. The main goal of this work is to process new neutron cross-section libraries in ACE continuous format for the MCNP code, based on the most recent data files made available to the scientific community: ENDF/B-VII.b2, ENDF/B-VI (release 8), JEFF-3.0, JEFF-3.1, JENDL-3.3 and JEF-2.2. In our data treatment, we used the modular NJOY system (release 99.9) (2) in conjunction with its most recent updates. Assessment of the performance of the processed pointwise cross-section libraries was made by means of criticality predictions and analysis of other integral parameters for a set of reactor benchmarks. Almost all of the analyzed benchmarks were taken from the international handbook of evaluated criticality safety benchmark experiments from the OECD (3); some revised benchmarks were taken from references (4,5). These benchmarks use Pu-239 or U-235 as the main fissionable material in different forms and enrichments and cover various geometries. Monte Carlo calculations were performed in 3D with maximum detail of the benchmark descriptions, and the S(α,β) cross-section treatment was adopted in all thermal cases. The resulting one-standard-deviation confidence interval for the eigenvalue is typically +/-13 to +/-20 pcm. (author)

  8. How grammaticized concepts shape event conceptualization in language production: Insights from linguistic analysis, eye tracking data, and memory performance

    NARCIS (Netherlands)

    Stutterheim, C. von; Andermann, M.; Carroll, M.; Flecken, M.E.P.; Schmiedtová, B.

    2012-01-01

    The role of grammatical systems in profiling particular conceptual categories is used as a key to exploring questions concerning language specificity during the conceptualization phase in language production. This study focuses on the extent to which crosslinguistic differences in the concepts

  9. Validating the Japanese Translation of the Force and Motion Conceptual Evaluation and Comparing Performance Levels of American and Japanese Students

    Science.gov (United States)

    Ishimoto, Michi; Thornton, Ronald K.; Sokoloff, David R.

    2014-01-01

    This study assesses the Japanese translation of the Force and Motion Conceptual Evaluation (FMCE). Researchers are often interested in comparing the conceptual ideas of students with different cultural backgrounds. The FMCE has been useful in identifying the concepts of English-speaking students from different backgrounds. To identify effectively…

  10. Evaluation of Design & Analysis Code, CACTUS, for Predicting Crossflow Hydrokinetic Turbine Performance

    Energy Technology Data Exchange (ETDEWEB)

    Wosnik, Martin [Univ. of New Hampshire, Durham, NH (United States). Center for Ocean Renewable Energy; Bachant, Pete [Univ. of New Hampshire, Durham, NH (United States). Center for Ocean Renewable Energy; Neary, Vincent Sinclair [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Murphy, Andrew W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-01

    CACTUS, developed by Sandia National Laboratories, is an open-source code for the design and analysis of wind and hydrokinetic turbines. While it has undergone extensive validation for both vertical axis and horizontal axis wind turbines, and it has been demonstrated to accurately predict the performance of horizontal (axial-flow) hydrokinetic turbines, its ability to predict the performance of crossflow hydrokinetic turbines has yet to be tested. The present study addresses this problem by comparing the predicted performance curves derived from CACTUS simulations of the U.S. Department of Energy’s 1:6 scale reference model crossflow turbine to those derived by experimental measurements in a tow tank using the same model turbine at the University of New Hampshire. It shows that CACTUS cannot accurately predict the performance of this crossflow turbine, raising concerns on its application to crossflow hydrokinetic turbines generally. The lack of quality data on NACA 0021 foil aerodynamic (hydrodynamic) characteristics over the wide range of angles of attack (AoA) and Reynolds numbers is identified as the main cause for poor model prediction. A comparison of several different NACA 0021 foil data sources, derived using both physical and numerical modeling experiments, indicates significant discrepancies at the high AoA experienced by foils on crossflow turbines. Users of CACTUS for crossflow hydrokinetic turbines are, therefore, advised to limit its application to higher tip speed ratios (lower AoA), and to carefully verify the reliability and accuracy of their foil data. Accurate empirical data on the aerodynamic characteristics of the foil is the greatest limitation to predicting performance for crossflow turbines with semi-empirical models like CACTUS. Future improvements of CACTUS for crossflow turbine performance prediction will require the development of accurate foil aerodynamic characteristic data sets within the appropriate ranges of Reynolds numbers and AoA.

  11. Performance Comparison of Orthogonal and Quasi-orthogonal Codes in Quasi-Synchronous Cellular CDMA Communication

    Science.gov (United States)

    Jos, Sujit; Kumar, Preetam; Chakrabarti, Saswat

    Orthogonal and quasi-orthogonal codes are an integral part of any DS-CDMA-based cellular system. Orthogonal codes are ideal for use in perfectly synchronous scenarios such as downlink cellular communication. Quasi-orthogonal codes are preferred over orthogonal codes in uplink communication, where perfect synchronization cannot be achieved. In this paper, we compare orthogonal and quasi-orthogonal codes in the presence of timing synchronization error. This gives insight into the synchronization demands of DS-CDMA systems employing the two classes of sequences. The synchronization error considered is smaller than the chip duration. Monte Carlo simulations have been carried out to verify the analytical and numerical results.
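The sensitivity of orthogonal codes to timing error can be demonstrated in a few lines: two Walsh-Hadamard codes are exactly orthogonal when chip-aligned, but a single-chip offset generally destroys orthogonality. This is an illustrative sketch (Sylvester construction, aperiodic correlation), not the specific analysis of the paper.

```python
def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of two)."""
    H = [[1]]
    while len(H) < n:
        H = [r + r for r in H] + [r + [-x for x in r] for r in H]
    return H

def partial_xcorr(a, b, offset):
    """Normalized aperiodic cross-correlation when b arrives `offset` chips late."""
    n = len(a)
    return sum(a[i] * b[i - offset] for i in range(offset, n)) / n

H = hadamard(8)
c_a, c_b = H[2], H[4]             # Walsh codes assigned to two users
sync_corr = partial_xcorr(c_a, c_b, 0)   # 0.0: perfectly synchronized, orthogonal
async_corr = partial_xcorr(c_a, c_b, 1)  # nonzero: one-chip offset breaks orthogonality
```

The nonzero cross-correlation under offset is precisely the multiple access interference term that makes quasi-orthogonal codes attractive in asynchronous uplinks.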

  12. Enhancing Image Processing Performance for PCID in a Heterogeneous Network of Multi-core Processors

    Science.gov (United States)

    Linderman, R.; Spetka, S.; Fitzgerald, D.; Emeny, S.

    The Physically-Constrained Iterative Deconvolution (PCID) image deblurring code is being ported to heterogeneous networks of multi-core systems, including Intel Xeons and IBM Cell Broadband Engines. This paper reports results from experiments using the JAWS supercomputer at MHPCC (60 TFLOPS of dual-dual Xeon nodes linked with Infiniband) and the Cell Cluster at AFRL in Rome, NY. The Cell Cluster has 52 TFLOPS of Playstation 3 (PS3) nodes with IBM Cell Broadband Engine multi-cores and 15 dual-quad Xeon head nodes. The interconnect fabric includes Infiniband, 10 Gigabit Ethernet and 1 Gigabit Ethernet to each of the 336 PS3s. The results compare approaches to parallelizing FFT executions across the Xeons and the Cell's Synergistic Processing Elements (SPEs) for frame-level image processing. The experiments included Intel's Performance Primitives and Math Kernel Library, FFTW3.2, and Carnegie Mellon's SPIRAL. Optimization of FFTs in the PCID code led to a decrease in relative processing time for FFTs. Profiling PCID version 6.2 about one year ago showed that the 13 functions accounting for the highest percentage of processing time were all FFT functions; they accounted for over 88% of processing time in one run on Xeons. FFT optimizations led to improvement in the current PCID version 8.0: a recent profile showed that only two of the 19 functions with the highest processing time were FFT functions, and timing measurements showed that FFT processing for PCID version 8.0 has been reduced to less than 19% of overall processing time. We are working toward a goal of scaling to 200-400 cores per job (1-2 imagery frames/core). Running a pair of cores on each set of frames reduces latency by implementing parallel FFT processing. Our current results show scaling well out to 100 pairs of cores. These results support the next higher level of parallelism in PCID, where groups of several hundred frames each producing one resolved image are sent to cliques of several
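The FFT kernel whose optimization dominates the profiling story above can be illustrated at toy scale with the classic radix-2 Cooley-Tukey recursion; production PCID relies on vendor libraries (IPP/MKL, FFTW, SPIRAL), not anything this naive.

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]  # twiddle factor times odd term
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out
```

An impulse transforms to a flat spectrum and a constant transforms to a single DC bin, the two standard sanity checks for any FFT implementation.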

  13. Above the nominal limit performance evaluation of multiwavelength optical code-division multiple-access systems

    Science.gov (United States)

    Inaty, Elie; Raad, Robert; Fortier, Paul; Shalaby, Hossam M. H.

    2009-03-01

    We provide an analysis of the performance of a multiwavelength optical code-division multiple-access (MW-OCDMA) network when the system is working above the nominal transmission rate limit imposed by passive encoding-decoding operation. We address the problem of overlapping in such a system and how it can directly affect the bit error rate (BER). A unified mathematical framework is presented under the assumption of one-coincidence sequences with nonrepeating wavelengths. A closed-form expression for the multiple-access-interference-limited BER is provided as a function of different system parameters. Results show that the performance of the MW-OCDMA system can be critically affected when working above the nominal limit, which can happen when the network operates at a high transmission rate. In addition, the impact of the derived error probability on the performance of two newly proposed medium access control (MAC) protocols, the S-ALOHA and the R3T, is also investigated. It is shown that at low transmission rates the S-ALOHA is better than the R3T, while the R3T is better at very high transmission rates. In general, it is postulated that the R3T protocol suffers a higher delay, mainly because of the presence of additional modes.

  14. Stego Keys Performance on Feature Based Coding Method in Text Domain

    Directory of Open Access Journals (Sweden)

    Din Roshidi

    2017-01-01

    Full Text Available A critical factor in the embedding process of any text steganography method is the key used, known as the stego key. This factor influences the success of the embedding process by which a text steganography method hides a message from third parties or any adversary. One important aspect of the embedding process is the fitness performance of the stego key, for which three parameters have been identified: capacity ratio, embedded fitness ratio and saving space ratio. The better the capacity ratio, embedded fitness ratio and saving space ratio a stego key offers, the more of a message can be hidden. The main objective of this paper is therefore to analyze the stego keys of three feature-based coding methods, namely CALP, VERT and QUAD, in text steganography with respect to their capacity ratio, embedded fitness ratio and saving space ratio. It is found that the CALP method gives good performance compared to the VERT and QUAD methods.

  15. The added value of international benchmarks for fuel performance codes: an illustration on the basis of TRANSURANUS

    International Nuclear Information System (INIS)

    Van Uffelen, P.; Schubert, A.; Gyeori, C.; Van De Laar, J.

    2009-01-01

    Safety authorities and fuel designers, as well as nuclear research centres, rely heavily on fuel performance codes for predicting the behaviour and lifetime of fuel rods. The simulation tools are developed and validated on the basis of experimental results, some of which are in the public domain, such as the International Fuel Performance Experiments database of the OECD/NEA and IAEA. Publicly available data constitute an excellent basis for assessing the codes themselves, but also for comparing codes that are being developed by independent teams. The present report summarises the advantages gained by the TRANSURANUS code from taking part in previous benchmarks organised by the IAEA, and outlines the preliminary results along with the perspectives of our participation in the current coordinated research project FUMEX-III

  16. Performance analysis of 2D asynchronous hard-limiting optical code-division multiple access system through atmospheric scattering channel

    Science.gov (United States)

    Zhao, Yaqin; Zhong, Xin; Wu, Di; Zhang, Ye; Ren, Guanghui; Wu, Zhilu

    2013-09-01

    Optical code-division multiple access (OCDMA) systems usually allocate orthogonal or quasi-orthogonal codes to the active users. When transmitting through an atmospheric scattering channel, the coding pulses are broadened and the orthogonality of the codes is degraded. In the truly asynchronous case, namely when both the chips and the bits are asynchronous among the active users, the pulse broadening affects the system performance considerably. In this paper, we evaluate the performance of a 2D asynchronous hard-limiting wireless OCDMA system over an atmospheric scattering channel. The probability density function of the multiple access interference in the truly asynchronous case is given. The bit error rate decreases as the ratio of the chip period to the root mean square delay spread increases, and the channel limits the bit rate to different levels as the chip period varies.

  17. Performance of the OVERFLOW-MLP and LAURA-MLP CFD Codes on the NASA Ames 512 CPU Origin System

    Science.gov (United States)

    Taft, James R.

    2000-01-01

    The shared memory Multi-Level Parallelism (MLP) technique, developed last year at NASA Ames, has been very successful in dramatically improving the performance of important NASA CFD codes. This new and very simple parallel programming technique was first inserted into the OVERFLOW production CFD code in FY 1998. The OVERFLOW-MLP code's parallel performance scaled linearly to 256 CPUs on the NASA Ames 256 CPU Origin 2000 system (steger). Overall performance exceeded 20.1 GFLOP/s, or about 4.5x the performance of a dedicated 16 CPU C90 system. All of this was achieved without any major modification to the original vector-based code. The OVERFLOW-MLP code is now in production on the in-house Origin systems as well as being used offsite at commercial aerospace companies. Partially as a result of this work, NASA Ames has purchased a new 512 CPU Origin 2000 system to further test the limits of parallel performance for NASA codes of interest. This paper presents the performance obtained from the latest optimization efforts on this machine for the LAURA-MLP and OVERFLOW-MLP codes. The Langley Aerothermodynamics Upwind Relaxation Algorithm (LAURA) code is a key simulation tool in the development of the next generation shuttle, interplanetary reentry vehicles, and nearly all "X" plane development. This code sustains about 4-5 GFLOP/s on a dedicated 16 CPU C90. At this rate, expected workloads would require over 100 C90 CPU years of computing over the next few calendar years. It is not feasible to expect that this would be affordable or available to the user community. Dramatic performance gains on cheaper systems are needed. This code is expected to be perhaps the largest consumer of NASA Ames compute cycles per run in the coming year. The OVERFLOW CFD code is extensively used in the government and commercial aerospace communities to evaluate new aircraft designs. It is one of the largest consumers of NASA supercomputing cycles and large simulations of highly resolved full
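The scaling figures above can be put in context with Amdahl's law, which bounds speedup by the serial fraction of the work. The 99% parallel fraction below is an illustrative assumption, not a measured property of OVERFLOW-MLP.

```python
def amdahl_speedup(parallel_fraction, n_cpus):
    """Ideal speedup on n_cpus when a fraction of the work parallelizes perfectly."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cpus)

# Even a 99%-parallel code tops out well below linear scaling on 256 CPUs,
# which is why the reported linear scaling of OVERFLOW-MLP is notable.
speedup_99 = amdahl_speedup(0.99, 256)    # roughly 72x
speedup_ideal = amdahl_speedup(1.0, 256)  # 256x, i.e. linear scaling
```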

  18. Multi-Dimensional Simulation of LWR Fuel Behavior in the BISON Fuel Performance Code

    Science.gov (United States)

    Williamson, R. L.; Capps, N. A.; Liu, W.; Rashid, Y. R.; Wirth, B. D.

    2016-11-01

    Nuclear fuel operates in an extreme environment that induces complex multiphysics phenomena occurring over distances ranging from inter-atomic spacing to meters, and time scales ranging from microseconds to years. Simulating this behavior requires a wide variety of material models that are often complex and nonlinear. The recently developed BISON code represents a powerful fuel performance simulation tool based on its material and physical behavior capabilities, finite-element versatility of spatial representation, and use of parallel computing. The code can operate in full three-dimensional (3D) mode, as well as in reduced two-dimensional (2D) modes, e.g., axisymmetric radial-axial (R-Z) or plane radial-circumferential (R-θ), to suit the application and to allow treatment of global and local effects. A BISON case study was used to illustrate analysis of Pellet Clad Mechanical Interaction failures from manufacturing defects using combined 2D and 3D analyses. The analysis involved commercial fuel rods and demonstrated successful computation of metrics of interest to fuel failures, including cladding peak hoop stress and strain energy density. In comparison with a failure threshold derived from power ramp tests, results corroborate industry analyses of the root cause of the pellet-clad interaction failures and illustrate the importance of modeling 3D local effects around fuel pellet defects, which can produce complex effects including cold spots in the cladding, stress concentrations, and hot spots in the fuel that can lead to enhanced cladding degradation such as hydriding, oxidation, CRUD formation, and stress corrosion cracking.

  19. Contributions of Sensory Coding and Attentional Control to Individual Differences in Performance in Spatial Auditory Selective Attention Tasks.

    Science.gov (United States)

    Dai, Lengshi; Shinn-Cunningham, Barbara G

    2016-01-01

    Listeners with normal hearing thresholds (NHTs) differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in the cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials (ERPs) from the scalp (reflecting cortical responses to sound) and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. These results support the hypothesis that behavioral abilities amongst listeners with NHTs can arise due to both subcortical coding differences and differences in attentional control, depending on stimulus characteristics.

  20. Contributions of sensory coding and attentional control to individual differences in performance in spatial auditory selective attention tasks

    Directory of Open Access Journals (Sweden)

    Lengshi Dai

    2016-10-01

    Full Text Available Listeners with normal hearing thresholds differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials from the scalp (ERPs, reflecting cortical responses to sound), and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. These results support the hypothesis that behavioral abilities amongst listeners with normal hearing thresholds can arise due to both subcortical coding differences and differences in attentional control, depending on

  1. THREEDANT: A code to perform three-dimensional, neutral particle transport calculations

    International Nuclear Information System (INIS)

    Alcouffe, R.E.

    1994-01-01

    The THREEDANT code solves the three-dimensional neutral particle transport equation in its first-order, multigroup, discrete ordinates form. The code allows an unlimited number of groups (depending upon the cross-section set), angular quadrature up to S-100, and unlimited Pn order, again depending upon the cross-section set. The code has three options for spatial differencing: diamond with set-to-zero fixup, adaptive weighted diamond, and linear nodal. The geometry options are XYZ and RZΘ, with a special XYZ option based upon a volume-fraction method. This allows objects or bodies of any shape to be modelled as input, giving the code as much flexibility in geometric description as the Monte Carlo code MCNP. The transport equation is solved by source iteration accelerated by the DSA (diffusion synthetic acceleration) method; both inner and outer iterations are so accelerated. Some results are presented which demonstrate the effectiveness of these techniques. The code is available on several types of computing platforms
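The source iteration that DSA accelerates can be sketched in its simplest setting, a one-group infinite homogeneous medium, where the unaccelerated iteration converges geometrically with the scattering ratio c = σs/σt. The sketch below illustrates why acceleration is needed as c approaches 1; it is purely illustrative and is not THREEDANT's transport sweep.

```python
def source_iteration(sigma_t, sigma_s, q, tol=1e-8, max_it=10000):
    """Unaccelerated source iteration for an infinite homogeneous medium.
    Exact solution: q / (sigma_t - sigma_s); error decays by c = sigma_s/sigma_t per sweep."""
    phi, its = 0.0, 0
    while its < max_it:
        new = (sigma_s * phi + q) / sigma_t  # solve with lagged scattering source
        its += 1
        if abs(new - phi) < tol:
            phi = new
            break
        phi = new
    return phi, its

phi_fast, it_fast = source_iteration(sigma_t=1.0, sigma_s=0.5, q=1.0)   # c = 0.5
phi_slow, it_slow = source_iteration(sigma_t=1.0, sigma_s=0.99, q=1.0)  # c = 0.99
```

The c = 0.99 case needs orders of magnitude more sweeps for the same tolerance, which is precisely the regime where DSA pays off.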

  2. Performance of the coupled thermalhydraulics/neutron kinetics code R/P/C on workstation clusters and multiprocessor systems

    International Nuclear Information System (INIS)

    Hammer, C.; Paffrath, M.; Boeer, R.; Finnemann, H.; Jackson, C.J.

    1996-01-01

    The light water reactor core simulation code PANBOX has been coupled with the transient analysis code RELAP5 for the purpose of performing plant safety analyses with a three-dimensional (3-D) neutron kinetics model. The system has been parallelized to improve the computational efficiency. The paper describes the features of this system with emphasis on performance aspects. Performance results are given for different types of parallelization, i.e., using an automatic parallelizing compiler, using the portable PVM platform on a workstation cluster, using PVM on a shared-memory multiprocessor, and using machine-dependent interfaces. (author)

  3. Comparison of Analytical and Measured Performance Results on Network Coding in IEEE 802.11 Ad-Hoc Networks

    DEFF Research Database (Denmark)

    Zhao, Fang; Médard, Muriel; Hundebøll, Martin

    2012-01-01

    CATWOMAN that can run on standard WiFi hardware. We present an analytical model to evaluate the performance of COPE in simple networks, and our results show the excellent predictive quality of this model. By closely examining the performance in two simple topologies, we observe that the coding gain results...
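The core idea behind COPE-style network coding, a relay XOR-ing two packets so that one broadcast serves two receivers that each already hold one of the originals, can be shown in a few lines. This is a toy sketch of the principle only, not of the CATWOMAN implementation.

```python
def xor_bytes(a, b):
    """XOR two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

# Alice -> relay -> Bob and Bob -> relay -> Alice: instead of forwarding two
# packets, the relay broadcasts one coded packet; each side decodes with its own copy.
pkt_alice, pkt_bob = b"hello", b"world"
coded = xor_bytes(pkt_alice, pkt_bob)
recovered_by_alice = xor_bytes(coded, pkt_alice)  # Bob's packet
recovered_by_bob = xor_bytes(coded, pkt_bob)      # Alice's packet
```

Replacing two relay transmissions with one is the source of the coding gain the model and measurements quantify.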

  4. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved Monte Carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE

  5. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved Monte Carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE.

  6. Systemizers are better code-breakers:Self-reported systemizing predicts code-breaking performance in expert hackers and naïve participants

    Directory of Open Access Journals (Sweden)

    India eHarvey

    2016-05-01

    Full Text Available Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders - attention to detail and systemizing - may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001; Baron-Cohen et al., 2003). They were also tested with behavioural tasks involving code-breaking and a control task involving security x-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the x-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings.

  7. Performance Analysis of a New Coded TH-CDMA Scheme in Dispersive Infrared Channel with Additive Gaussian Noise

    Science.gov (United States)

    Hamdi, Mazda; Kenari, Masoumeh Nasiri

    2013-06-01

    We consider a time-hopping based multiple access scheme introduced in [1] for communication over dispersive infrared links, and evaluate its performance for correlator and matched filter receivers. In the investigated time-hopping code division multiple access (TH-CDMA) method, the transmitter benefits from a low-rate convolutional encoder. In this method, the bit interval is divided into Nc chips, and the output of the encoder, along with a PN sequence assigned to the user, determines the position of the chip in which the optical pulse is transmitted. We evaluate the multiple access performance of the system for the correlation receiver considering background noise, which is modeled as white Gaussian noise due to its large intensity. For the correlation receiver, the results show that for a fixed processing gain, at high transmit power, where the multiple access interference is the dominant effect, performance improves with the coding gain. But at low transmit power, where an increase of the coding gain leads to a decrease of the chip time and consequently to more corruption due to channel dispersion, there exists an optimum value of the coding gain. For the matched filter, however, performance always improves with the coding gain. The results show that the matched filter receiver outperforms the correlation receiver in the considered cases. For the same bandwidth and bit rate, the proposed system outperforms other multiple access techniques, such as conventional CDMA and time hopping.

  8. Evaluating the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz-Rodriguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solis Sanches, L. O.; Miranda, R. Castaneda; Cervantes Viramontes, J. M. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica. Av. Ramon Lopez Velarde 801. Col. Centro Zacatecas, Zac (Mexico); Vega-Carrillo, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica. Av. Ramon Lopez Velarde 801. Col. Centro Zacatecas, Zac., Mexico. and Unidad Academica de Estudios Nucleares. C. Cip (Mexico)

    2013-07-03

    In this work the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks is evaluated. The first code, based on traditional iterative procedures and called Neutron Spectrometry and Dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and the readings of 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum from a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron Spectrometry and Dosimetry with Artificial Neural Networks (NSDann), is designed using neural-network technology. The artificial intelligence approach of the neural network does not solve mathematical equations: using the knowledge stored in the synaptic weights of a properly trained neural network, the code is able to unfold the neutron spectrum and simultaneously calculate 15 dosimetric quantities, requiring as input only the count rates measured with a Bonner sphere system. The similarities of the NSDUAZ and NSDann codes are that they follow the same easy and intuitive user philosophy and were designed with a graphical interface in the LabVIEW programming environment. Both codes unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. The differences between these codes are that the NSDUAZ code was designed using classical iterative approaches and needs an initial guess spectrum to start the iterative procedure; in NSDUAZ, a programming routine calculates 7 IAEA instrument survey-meter readings using fluence-to-dose conversion coefficients. The NSDann code uses artificial neural networks to solve the ill-conditioned equation system of the neutron spectrometry problem through the synaptic weights of a properly trained neural network. Contrary to iterative procedures, in
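The iterative branch of this comparison can be illustrated with a minimal multiplicative unfolding update of the kind SPUNIT-style codes apply: the guess spectrum is rescaled bin by bin by the ratio of measured to recomputed Bonner-sphere counts. This is a generic sketch with a fabricated 3-sphere, 4-bin response matrix, not the actual NSDUAZ implementation or IAEA compendium data.

```python
# Illustrative multiplicative iterative unfolding: given measured count
# rates C[i] and a response matrix R[i][j], rescale each spectrum bin phi[j]
# by the ratio of measured to predicted counts until the counts are
# reproduced.  Response matrix and spectrum below are synthetic.
def unfold(R, C, phi, iterations=500):
    nI, nJ = len(R), len(R[0])
    for _ in range(iterations):
        pred = [sum(R[i][j] * phi[j] for j in range(nJ)) for i in range(nI)]
        for j in range(nJ):
            num = sum(R[i][j] * C[i] / pred[i] for i in range(nI))
            den = sum(R[i][j] for i in range(nI))
            phi[j] *= num / den
    return phi

# Fabricate "measured" counts from a known 4-bin spectrum, then unfold
# starting from a flat initial guess (the role of NSDUAZ's compendium
# spectrum is to provide a better-informed starting point).
R = [[0.9, 0.5, 0.2, 0.1],
     [0.3, 0.8, 0.6, 0.2],
     [0.1, 0.3, 0.7, 0.9]]
true_phi = [1.0, 2.0, 0.5, 1.5]
C = [sum(R[i][j] * true_phi[j] for j in range(4)) for i in range(3)]
phi = unfold(R, C, [1.0] * 4)
pred = [sum(R[i][j] * phi[j] for j in range(4)) for i in range(3)]
```

Because there are fewer spheres than energy bins, the system is ill-conditioned: many spectra reproduce the counts, which is why the choice of initial guess (NSDUAZ) or of training data (NSDann) matters.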

  9. Evaluating the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks

    Science.gov (United States)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solís Sánches, L. O.; Miranda, R. Castañeda; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2013-07-01

    In this work the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks is evaluated. The first code, based on traditional iterative procedures and called Neutron Spectrometry and Dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and the readings of 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum from a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron Spectrometry and Dosimetry with Artificial Neural Networks (NSDann), is designed using neural-network technology. The artificial intelligence approach of the neural network does not solve mathematical equations: using the knowledge stored in the synaptic weights of a properly trained neural network, the code is able to unfold the neutron spectrum and simultaneously calculate 15 dosimetric quantities, requiring as input only the count rates measured with a Bonner sphere system. The similarities of the NSDUAZ and NSDann codes are that they follow the same easy and intuitive user philosophy and were designed with a graphical interface in the LabVIEW programming environment. Both codes unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. The differences between these codes are that the NSDUAZ code was designed using classical iterative approaches and needs an initial guess spectrum to start the iterative procedure; in NSDUAZ, a programming routine calculates 7 IAEA instrument survey-meter readings using fluence-to-dose conversion coefficients. The NSDann code uses artificial neural networks to solve the ill-conditioned equation system of the neutron spectrometry problem through the synaptic weights of a properly trained neural network. Contrary to iterative procedures, in neural

  10. Evaluating the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks

    International Nuclear Information System (INIS)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solís Sánches, L. O.; Miranda, R. Castañeda; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2013-01-01

    In this work the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks is evaluated. The first code, based on traditional iterative procedures and called Neutron Spectrometry and Dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and the readings of 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum from a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron Spectrometry and Dosimetry with Artificial Neural Networks (NSDann), is designed using neural-network technology. The artificial intelligence approach of the neural network does not solve mathematical equations: using the knowledge stored in the synaptic weights of a properly trained neural network, the code is able to unfold the neutron spectrum and simultaneously calculate 15 dosimetric quantities, requiring as input only the count rates measured with a Bonner sphere system. The similarities of the NSDUAZ and NSDann codes are that they follow the same easy and intuitive user philosophy and were designed with a graphical interface in the LabVIEW programming environment. Both codes unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. The differences between these codes are that the NSDUAZ code was designed using classical iterative approaches and needs an initial guess spectrum to start the iterative procedure; in NSDUAZ, a programming routine calculates 7 IAEA instrument survey-meter readings using fluence-to-dose conversion coefficients. The NSDann code uses artificial neural networks to solve the ill-conditioned equation system of the neutron spectrometry problem through the synaptic weights of a properly trained neural network. Contrary to iterative procedures, in neural

  11. Conceptual design and performance study for the first implementation of AGATA at the in-flight RIB facility of GSI

    Science.gov (United States)

    Domingo-Pardo, C.; Bazzacco, D.; Doornenbal, P.; Farnea, E.; Gadea, A.; Gerl, J.; Wollersheim, H. J.; Agata Collaboration

    2012-12-01

    The main objective of the Advanced GAmma Tracking Array (AGATA) is the investigation of the structure of exotic nuclei at the new generation of RIB facilities. As part of the preparatory phase for FAIR-NUSTAR, AGATA is going to be installed at the FRS fragmentation facility of the GSI centre for an experimental campaign to be performed in 2012 and 2013. Owing to its γ-ray tracking capabilities and the envisaged enhancement in resolving power, a series of in-flight γ-ray spectroscopy experiments are being planned. The present work describes the conceptual design of this first implementation of AGATA at GSI-FRS and provides information about the expected performance figures. According to the characteristics of each particular experiment, it is foreseen that the target-array distance will be adjusted in order to achieve the optimum compromise between detection efficiency and energy resolution, or to cover a specific angular range of the emitted electromagnetic radiation. Thus, a comprehensive Monte Carlo study of the detection sensitivity in terms of photopeak efficiency, resolution and peak-to-total ratio, as a function of the target-array distance, is presented. Several configurations have been investigated, and MC calculations indicate that a remarkable enhancement in resolving power can be achieved when double-cluster AGATA detectors are developed and implemented. Several experimental effects are also investigated. These concern the impact of passive materials between the target and the array, the angular distribution of the detection efficiency, and the influence of target thickness effects and transition lifetimes on the attainable detection sensitivity. A short overview of half-life measurements via lineshape effects utilizing AGATA is also presented.
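The geometric core of the efficiency-versus-resolution trade-off can be sketched with elementary solid-angle arithmetic. The radius and distances below are hypothetical illustration values, not AGATA's real geometry, and this ignores the γ-ray interaction physics that the full Monte Carlo study accounts for.

```python
# Geometric sketch of the target-array distance trade-off: the solid-angle
# fraction subtended by a disc-like detector of radius r at distance d is
# 0.5 * (1 - d / sqrt(d^2 + r^2)).  Efficiency scales with this fraction,
# while the opening angle per crystal (which drives Doppler broadening for
# in-flight emission, hence energy resolution) shrinks with distance.
# r = 20 cm and the distances below are hypothetical values.
import math

def solid_angle_fraction(d_cm, r_cm):
    return 0.5 * (1.0 - d_cm / math.hypot(d_cm, r_cm))

def opening_angle_deg(d_cm, r_cm):
    return math.degrees(math.atan2(r_cm, d_cm))

table = [(d, solid_angle_fraction(d, 20.0), opening_angle_deg(d, 20.0))
         for d in (15.0, 23.5, 50.0, 100.0)]
```

Moving the array closer raises the covered solid angle (efficiency) but widens the opening angle (worse Doppler-limited resolution), which is exactly the compromise the abstract says is tuned per experiment.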

  12. Performance and fuel cycle cost analysis of one Janus 30 conceptual design for several fuel element design options

    Energy Technology Data Exchange (ETDEWEB)

    Nurdin, Martias [Research Centre for Nuclear Techniques, National Atomic Energy Agency (Indonesia); Matos, J E; Freese, K E [RERTR Program, Argonne National Laboratory (United States)

    1983-09-01

    The performance and fuel cycle costs for a 25 MW, JANUS 30 reactor conceptual design by INTERATOM, Federal Republic of Germany, for BATAN, Republic of Indonesia have been studied using 19.75% enriched uranium in four fuel element design options. All of these fuel element designs have either been proposed by INTERATOM for various reactors or are currently in use with 93% enriched uranium in reactors in the Federal Republic of Germany. Aluminide, oxide, and silicide fuels were studied for selected designs using the range of uranium densities that are either currently qualified or are being developed and demonstrated internationally. These uranium densities include 1.7-2.3 g/cm{sup 3} in aluminide fuel, 1.7-3.2 g/cm{sup 3} in oxide fuel, and 2.9-6.8 g/cm{sup 3} in silicide fuel. As of November 1982, both the aluminide and the oxide fuels with about 1.7 g U/cm{sup 3} are considered to be fully proven for licensing purposes. Irradiation screening and proof testing of fuels with uranium densities greater than 1.7 g/cm{sup 3} are currently in progress, and these tests need to be completed in order to obtain licensing authorization for routine reactor use. To assess the long-term fuel adaptation strategy as well as the present fuel acceptance, reactor performance and annual fuel cycle costs were computed for seventeen cases based on a representative end-of-cycle excess reactivity and duty factor. In addition, a study was made to provide data for evaluating the trade-off between the increased safety associated with thicker cladding and the economic penalty due to increased fuel consumption. (author)

  13. Improving performance of single-path code through a time-predictable memory hierarchy

    DEFF Research Database (Denmark)

    Cilku, Bekim; Puffitsch, Wolfgang; Prokesch, Daniel

    2017-01-01

    This paper proposes a time-predictable memory hierarchy with a prefetcher that exploits the predictability of execution traces in single-path code to speed up code execution. The new memory hierarchy reduces both the cache-miss penalty time and the cache-miss rate on the instruction cache. The benefit of the approach is demonstrated through...
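Why a prefetcher pays off so well for single-path code can be shown with a toy instruction-cache model: a straight-line fetch trace is perfectly predictable, so a simple next-line prefetcher removes almost all cold misses. The cache geometry and trace below are hypothetical and much simpler than the paper's time-predictable hierarchy.

```python
# Toy instruction-cache model: single-path code produces a sequential fetch
# trace, so prefetching the following cache line ahead of use converts
# nearly every cold miss into a hit.  Line size and trace are illustrative.
LINE = 16  # bytes per cache line

def count_misses(trace, prefetch_next_line):
    cached = set()
    misses = 0
    for addr in trace:
        line = addr // LINE
        if line not in cached:
            misses += 1
            cached.add(line)
        if prefetch_next_line:
            cached.add(line + 1)   # fetch the following line ahead of use
    return misses

trace = list(range(0, 1024, 4))    # sequential 4-byte instruction fetches
plain = count_misses(trace, prefetch_next_line=False)
prefetched = count_misses(trace, prefetch_next_line=True)
```

On this 64-line trace, the plain cache takes one cold miss per line while the prefetching cache misses only on the very first line; a real design must additionally bound the prefetch timing to keep worst-case execution time analysable.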

  14. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Control modules C4, C6

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U. S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume is part of the manual related to the control modules for the newest updated version of this computational package.

  15. Conceptual design of emergency communication system to cope with severe accidents in NPPs and its performance evaluation

    International Nuclear Information System (INIS)

    Son, Kwang Seop; Kim, Chang Hwoi; Kang, Hyun Gook

    2015-01-01

    Highlights: • The emergency communication system requires a throughput of 1 Mbps, a BER of 10⁻⁶ and a 1:12 network configuration. • The emergency communication system consists of a terrestrial communication system and a satellite communication system. • In the terrestrial communication system, at least two wireless repeaters are needed to secure LOS; the throughput and delay time are 16 Mbps and 16 ms, respectively. • In the satellite communication system, DSSS and FDMA are used and the fade margin ranges from 1.3 to 16 dB. - Abstract: The Fukushima accident, induced by the great earthquake and tsunami, revealed the vulnerability of the I&C system. In the severe environment, the normal I&C system did not work properly and produced false information about the internal situation in the NPP. Eventually the accident was not properly handled at the early stage. Therefore, an advanced emergency response system using a wireless channel is necessary to cope with severe accidents. In this paper, we introduce the ERS, consisting of the HMS and MCS, and the ECS linking the HMS with the MCS, and analyze the performance requirements of the ECS. An ECS satisfying the requirements is conceptually designed and its performance is evaluated through analysis and simulation. To secure a reliable and diverse configuration, the ECS is configured as a dual system consisting of terrestrial and satellite communication. The terrestrial communication system is designed based on IEEE 802.11. The analyzed performance results prove that the performance requirements can be achieved. However, if the scalability of data capacity is considered later, use of an advanced 802.11 standard such as 802.11n and multiple signal paths between the HMS and MCS are necessary. In the satellite communication system, FDMA is used in the status link and DSSS is used in the control link. The network supporting various data rates is
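The fade-margin figures quoted in the highlights come from link-budget arithmetic of the following shape. All numbers here are hypothetical illustration values, not the paper's actual satellite-link parameters.

```python
# Sketch of the link-budget / fade-margin arithmetic behind figures like the
# 1.3-16 dB range above: received power = EIRP + receive gain - free-space
# path loss; the fade margin is what remains above receiver sensitivity.
# All parameter values below are hypothetical.
import math

def fspl_db(dist_km, freq_ghz):
    """Free-space path loss with distance in km and frequency in GHz."""
    return 20 * math.log10(dist_km) + 20 * math.log10(freq_ghz) + 92.45

def fade_margin_db(eirp_dbm, rx_gain_dbi, dist_km, freq_ghz, sensitivity_dbm):
    rx_power = eirp_dbm + rx_gain_dbi - fspl_db(dist_km, freq_ghz)
    return rx_power - sensitivity_dbm

# Hypothetical GEO-like link at 14 GHz
margin = fade_margin_db(eirp_dbm=80.0, rx_gain_dbi=40.0,
                        dist_km=36000.0, freq_ghz=14.0,
                        sensitivity_dbm=-120.0)
```

A positive margin is what absorbs rain fades and pointing losses; the DSSS processing gain on the control link effectively adds to this budget.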

  16. Soft-Decision-Data Reshuffle to Mitigate Pulsed Radio Frequency Interference Impact on Low-Density-Parity-Check Code Performance

    Science.gov (United States)

    Ni, Jianjun David

    2011-01-01

    This presentation briefly discusses a research effort on mitigation techniques for pulsed radio frequency interference (RFI) on a Low-Density-Parity-Check (LDPC) code. This problem is of considerable interest in the context of providing reliable communications to a space vehicle which might suffer severe degradation due to pulsed RFI sources such as large radars. The LDPC code is one of the modern forward-error-correction (FEC) codes whose decoding performance approaches the Shannon limit. The LDPC code studied here is the AR4JA (2048, 1024) code recommended by the Consultative Committee for Space Data Systems (CCSDS), and it has been chosen for some spacecraft designs. Even though this code is designed as a powerful FEC code for the additive white Gaussian noise channel, simulation data and test results show that the performance of this LDPC decoder is severely degraded when exposed to the pulsed RFI specified in the spacecraft's transponder specifications. An analysis (through modeling and simulation) has been conducted to evaluate the impact of the pulsed RFI, and a few implementation techniques have been investigated to mitigate the pulsed RFI impact by reshuffling the soft-decision data available at the input of the LDPC decoder. The simulation results show that the LDPC decoding performance in terms of codeword error rate (CWER) under pulsed RFI can be improved by up to four orders of magnitude through a simple soft-decision-data reshuffle scheme. This study reveals that an error floor of LDPC decoding performance appears around CWER = 1E-4 when the proposed technique is applied to mitigate the pulsed RFI impact. The mechanism causing this error floor remains unknown; further investigation is necessary.
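Why manipulating soft-decision data at the decoder input helps can be shown in miniature. In this hedged toy, a rate-1/5 repetition code stands in for the LDPC code and the decoder simply sums log-likelihood ratios (LLRs); pulsed RFI produces strong, wrongly-signed LLRs, and erasing (zeroing) the LLRs received during known pulse instants lets the clean repeats decide the bit. This is a toy analogue of soft-decision-data treatment, not the reshuffle scheme evaluated in the presentation.

```python
# Miniature soft-decision demo: summing LLRs per bit (repetition code as a
# stand-in for LDPC).  Strong wrongly-signed LLRs from an RFI pulse swamp
# the sum; zeroing the LLRs at known pulse instants restores the decision.
REP = 5

def decode(llrs, rfi_hits, erase):
    bits = []
    for i in range(0, len(llrs), REP):
        chunk = llrs[i:i + REP]
        if erase:
            chunk = [0.0 if (i + k) in rfi_hits else v
                     for k, v in enumerate(chunk)]
        bits.append(0 if sum(chunk) >= 0 else 1)   # LLR > 0 => bit 0
    return bits

tx_bits = [0, 1, 0, 0, 1]
clean = [(+2.0 if b == 0 else -2.0) for b in tx_bits for _ in range(REP)]
# An RFI pulse corrupts symbols 5..8 (four of bit 1's five repeats),
# producing large LLRs of the wrong sign.
rfi_hits = {5, 6, 7, 8}
rx = [(-8.0 * (1 if v > 0 else -1) if i in rfi_hits else v)
      for i, v in enumerate(clean)]
naive = decode(rx, rfi_hits, erase=False)
erased = decode(rx, rfi_hits, erase=True)
```

The naive decoder trusts the corrupted LLRs and flips bit 1; the erasing decoder recovers all bits from the single clean repeat, mirroring the orders-of-magnitude CWER improvement reported above.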

  17. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  18. Evaluation of finite element codes for demonstrating the performance of radioactive material packages in hypothetical accident drop scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Tso, C.F. [Arup (United Kingdom); Hueggenberg, R. [Gesellschaft fuer Nuklear-Behaelter mbH (Germany)

    2004-07-01

    Drop testing and analysis are the two methods for demonstrating the performance of packages in hypothetical drop accident scenarios. The exact purpose of the tests and the analyses, and the relative prominence of the two in the license application, may depend on the Competent Authority and will vary between countries. The Finite Element Method (FEM) is a powerful analysis tool. A reliable finite element (FE) code, when used correctly and appropriately, will allow a package's behaviour to be simulated reliably. With improvements in computing power, and in the sophistication and reliability of FE codes, it is likely that FEM calculations will increasingly be used as evidence of drop test performance when seeking Competent Authority approval. What is lacking at the moment, however, is a standardised method of assessing an FE code in order to determine whether it is sufficiently reliable or pessimistic. To this end, the project Evaluation of Codes for Analysing the Drop Test Performance of Radioactive Material Transport Containers, funded by the European Commission Directorate-General XVII (now Directorate-General for Energy and Transport) and jointly performed by Arup and Gesellschaft fuer Nuklear-Behaelter mbH, was carried out in 1998. The work consisted of three components: (1) a survey of existing finite element software, with a view to finding codes that may be capable of analysing the drop test performance of radioactive material packages, and producing an inventory of them; (2) development of a set of benchmark problems to evaluate software used for analysing the drop test performance of packages; and (3) evaluation of the finite element codes by testing them against the benchmarks. This paper presents a summary of this work.

  19. Evaluation of finite element codes for demonstrating the performance of radioactive material packages in hypothetical accident drop scenarios

    International Nuclear Information System (INIS)

    Tso, C.F.; Hueggenberg, R.

    2004-01-01

    Drop testing and analysis are the two methods for demonstrating the performance of packages in hypothetical drop accident scenarios. The exact purpose of the tests and the analyses, and the relative prominence of the two in the license application, may depend on the Competent Authority and will vary between countries. The Finite Element Method (FEM) is a powerful analysis tool. A reliable finite element (FE) code, when used correctly and appropriately, will allow a package's behaviour to be simulated reliably. With improvements in computing power, and in the sophistication and reliability of FE codes, it is likely that FEM calculations will increasingly be used as evidence of drop test performance when seeking Competent Authority approval. What is lacking at the moment, however, is a standardised method of assessing an FE code in order to determine whether it is sufficiently reliable or pessimistic. To this end, the project Evaluation of Codes for Analysing the Drop Test Performance of Radioactive Material Transport Containers, funded by the European Commission Directorate-General XVII (now Directorate-General for Energy and Transport) and jointly performed by Arup and Gesellschaft fuer Nuklear-Behaelter mbH, was carried out in 1998. The work consisted of three components: (1) a survey of existing finite element software, with a view to finding codes that may be capable of analysing the drop test performance of radioactive material packages, and producing an inventory of them; (2) development of a set of benchmark problems to evaluate software used for analysing the drop test performance of packages; and (3) evaluation of the finite element codes by testing them against the benchmarks. This paper presents a summary of this work.

  20. A pulse coding and decoding strategy to perform Lamb wave inspections using simultaneously multiple actuators

    Science.gov (United States)

    De Marchi, Luca; Marzani, Alessandro; Moll, Jochen; Kudela, Paweł; Radzieński, Maciej; Ostachowicz, Wiesław

    2017-07-01

    The performance of Lamb wave based monitoring systems, both in terms of diagnosis time and data complexity, can be enhanced by increasing the number of transducers used to simultaneously actuate the guided waves in the inspected medium. However, in the case of multiple simultaneously operated actuators, the interference among the excited wave modes within the acquired signals has to be considered in the further processing. To this aim, a code division strategy based on the Warped Frequency Transform is presented in this work. At first, the proposed procedure encodes the actuation pulses using Gold sequences. Next, for each considered actuator the acquired signals are compensated for dispersion by cross-correlating the warped versions of the actuated and received signals. The compensated signals form the basis for a final wavenumber imaging step aimed at emphasizing defects and/or anomalies by removing the incident wavefield and edge reflections. The proposed strategy is tested numerically and validated through an experiment in which guided waves are actuated in a plate by four piezoelectric transducers operating simultaneously.
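The property that makes Gold sequences suitable for separating simultaneous actuators is their bounded cross-correlation, and a small family is easy to generate. The sketch below uses the textbook degree-5 preferred pair of linear-feedback shift registers (length-31 codes); the paper's actual code length and parameters may differ.

```python
# Gold-sequence generation sketch: XOR two length-31 m-sequences from a
# preferred pair of LFSRs (taps [5,2] and [5,4,3,2], the textbook degree-5
# example) at different relative shifts.  Any two distinct family members
# have a three-valued periodic cross-correlation, here {-1, -9, 7}.
def lfsr(taps, length=31):
    state = [1, 0, 0, 0, 0]          # any nonzero seed works
    out = []
    for _ in range(length):
        out.append(state[-1])        # output the last stage
        fb = 0
        for t in taps:               # feedback = XOR of the tapped stages
            fb ^= state[t - 1]
        state = [fb] + state[:-1]    # shift, inserting the feedback bit
    return out

m1 = lfsr([5, 2])
m2 = lfsr([5, 4, 3, 2])

def gold(shift):
    """One member of the Gold family: m1 XOR a cyclic shift of m2."""
    return [a ^ m2[(i + shift) % 31] for i, a in enumerate(m1)]

def xcorr(a, b):
    """Periodic cross-correlation of the +/-1 versions of two binary codes."""
    pa = [1 - 2 * x for x in a]
    pb = [1 - 2 * x for x in b]
    return [sum(pa[i] * pb[(i + tau) % 31] for i in range(31))
            for tau in range(31)]
```

Because the cross-correlation magnitude never exceeds 9 (out of 31), each actuator's contribution can be picked out of the superposed response by correlation, which is what the code division strategy above exploits before dispersion compensation.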

  1. High performance 3D neutron transport on peta scale and hybrid architectures within APOLLO3 code

    International Nuclear Information System (INIS)

    Jamelot, E.; Dubois, J.; Lautard, J-J.; Calvin, C.; Baudron, A-M.

    2011-01-01

    APOLLO3 code is a common project of CEA, AREVA and EDF for the development of a new generation system for core physics analysis. We present here the parallelization of two deterministic transport solvers of APOLLO3: MINOS, a simplified 3D transport solver on structured Cartesian and hexagonal grids, and MINARET, a transport solver based on triangular meshes in 2D and prismatic ones in 3D. We used two different techniques to accelerate MINOS: a domain decomposition method, combined with an accelerated algorithm using GPU. The domain decomposition is based on the Schwarz iterative algorithm, with Robin boundary conditions to exchange information. The Robin parameters influence the convergence and we detail how we optimized the choice of these parameters. MINARET parallelization is based on angular directions calculation using explicit message passing. Fine grain parallelization is also available for each angular direction using shared memory multithreaded acceleration. Many performance results are presented on massively parallel architectures using more than 10³ cores and on hybrid architectures using some tens of GPUs. This work contributes to the HPC development in reactor physics at the CEA Nuclear Energy Division. (author)
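The Schwarz iteration underlying the MINOS domain decomposition can be illustrated on a one-dimensional model problem. For brevity this sketch exchanges Dirichlet values between two overlapping subdomains; the APOLLO3 solver exchanges Robin data, whose tunable parameter is what the authors optimize for convergence.

```python
# Minimal 1D alternating Schwarz illustration: solve -u'' = 1 on [0,1] with
# u(0) = u(1) = 0 (exact solution u = x(1-x)/2) on two overlapping
# subdomains, exchanging boundary values each sweep.  Dirichlet transmission
# is used here for simplicity; the paper's solver uses Robin conditions.
N = 101                      # grid points on [0, 1]
h = 1.0 / (N - 1)
u = [0.0] * N                # global iterate, u[0] = u[N-1] = 0 fixed
lo2, hi1 = 40, 60            # subdomain 1 = [0, hi1], subdomain 2 = [lo2, 1]

def solve_subdomain(a, b):
    """Solve -u'' = 1 on interior points a+1..b-1 with u[a], u[b] as
    Dirichlet data, via the Thomas algorithm (diag 2, off-diag -1)."""
    n = b - a - 1
    c = [0.0] * n            # modified superdiagonal
    d = [0.0] * n            # modified right-hand side
    for i in range(n):
        rhs = h * h * 1.0    # f = 1
        if i == 0:
            rhs += u[a]
        if i == n - 1:
            rhs += u[b]
        if i == 0:
            c[i] = -0.5
            d[i] = rhs / 2.0
        else:
            denom = 2.0 + c[i - 1]
            c[i] = -1.0 / denom
            d[i] = (rhs + d[i - 1]) / denom
    x = [0.0] * n
    x[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = d[i] - c[i] * x[i + 1]
    for i in range(n):
        u[a + 1 + i] = x[i]

for _ in range(60):                  # alternating Schwarz sweeps
    solve_subdomain(0, hi1)          # reads u[hi1] from subdomain 2
    solve_subdomain(lo2, N - 1)      # reads u[lo2] from subdomain 1

err = max(abs(u[i] - 0.5 * (i * h) * (1 - i * h)) for i in range(N))
```

The contraction rate of this iteration depends on the overlap (here 20% of the domain); Robin transmission conditions add a free parameter that can shrink that rate further, which is the optimization the abstract refers to.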

  2. Coded aperture solution for improving the performance of traffic enforcement cameras

    Science.gov (United States)

    Masoudifar, Mina; Pourreza, Hamid Reza

    2016-10-01

    A coded aperture camera is proposed for automatic license plate recognition (ALPR) systems. It captures images using a noncircular aperture. The aperture pattern is designed for the rapid acquisition of high-resolution images while preserving high spatial frequencies of defocused regions. It is obtained by minimizing an objective function, which computes the expected value of perceptual deblurring error. The imaging conditions and camera sensor specifications are also considered in the proposed function. The designed aperture improves the depth of field (DoF) and subsequently ALPR performance. The captured images can be directly analyzed by the ALPR software up to a specific depth, which is 13 m in our case, though it is 11 m for the circular aperture. Moreover, since the deblurring results of images captured by our aperture yield fewer artifacts than those captured by the circular aperture, images can be first deblurred and then analyzed by the ALPR software. In this way, the DoF and recognition rate can be improved at the same time. Our case study shows that the proposed camera can improve the DoF up to 17 m while it is limited to 11 m in the conventional aperture.

  3. Performance of the MTR core with MOX fuel using the MCNP4C2 code

    International Nuclear Information System (INIS)

    Shaaban, Ismail; Albarhoum, Mohamad

    2016-01-01

    The MCNP4C2 code was used to simulate the MTR-22 MW research reactor and perform the neutronic analysis for a new fuel, namely a MOX (U₃O₈-PuO₂) fuel dispersed in an Al matrix, for One Neutronic Trap (ONT) and Three Neutronic Traps (TNTs) in its core. Its new characteristics were compared to its original characteristics based on the U₃O₈-Al fuel. Experimental data for the neutronic parameters, including criticality, relative to the MTR-22 MW reactor with the original U₃O₈-Al fuel at nominal power were used to validate the calculated values and were found acceptable. The achieved results seem to confirm that the use of MOX fuel in the MTR-22 MW will not degrade the safe operational conditions of the reactor. In addition, the use of MOX fuel in the MTR-22 MW core reduces the uranium fuel enrichment in ²³⁵U and the amount of ²³⁵U loaded in the core by about 34.84% and 15.21% for the ONT and TNTs cases, respectively. - Highlights: • Re-cycling of the ETRR-2 reactor with MOX fuel. • Increase of the number of neutronic traps from one to three. • Calculation of the criticality safety and neutronic parameters of the ETRR-2 reactor for the original U₃O₈-Al fuel and the MOX fuel.

  4. Near-fault earthquake ground motion prediction by a high-performance spectral element numerical code

    International Nuclear Information System (INIS)

    Paolucci, Roberto; Stupazzini, Marco

    2008-01-01

    Near-fault effects have been widely recognised to produce specific features of earthquake ground motion that cannot be reliably predicted by 1D seismic wave propagation modelling, used as a standard in engineering applications. These features may have a relevant impact on the structural response, especially in the nonlinear range, which is hard to predict and to cast in a design format, owing to the scarcity of significant earthquake records and of reliable numerical simulations. In this contribution a pilot study is presented for the evaluation of seismic ground motions in the near-fault region, based on a high-performance numerical code for 3D seismic wave propagation analyses, including the seismic fault, the wave propagation path and the near-surface geological or topographical irregularities. For this purpose, the software package GeoELSE, based on the spectral element method, is adopted. The set-up of the numerical benchmark of 3D ground motion simulation in the valley of Grenoble (French Alps) is chosen to study the effect of the complex interaction between basin geometry and radiation mechanism on the variability of earthquake ground motion

  5. Performance evaluations of advanced massively parallel platforms based on gyrokinetic toroidal five-dimensional Eulerian code GT5D

    International Nuclear Information System (INIS)

    Idomura, Yasuhiro; Jolliet, Sebastien

    2010-01-01

    A gyrokinetic toroidal five-dimensional Eulerian code, GT5D, is ported to six advanced massively parallel platforms and comprehensive benchmark tests are performed. A parallelisation technique based on physical properties of the gyrokinetic equation is presented. By extending the parallelisation technique with a hybrid parallel model, the scalability of the code is improved on platforms with multi-core processors. In the benchmark tests, good scalability is confirmed up to several thousand cores on every platform, and a maximum sustained performance of ∼18.6 Tflops is achieved using 16384 cores of BX900. (author)

  6. Impact of optical hard limiter on the performance of an optical overlapped-code division multiple access system

    Science.gov (United States)

    Inaty, Elie; Raad, Robert; Tablieh, Nicole

    2011-08-01

    In this paper, a closed-form expression of the multiple-access interference (MAI) limited bit error rate (BER) is provided for the multiwavelength optical code-division multiple-access system when the system operates above the nominal transmission rate limit imposed by the passive encoding-decoding operation. This system is known in the literature as the optical overlapped code-division multiple-access (OV-CDMA) system. A unified analytical framework is presented, emphasizing the impact of an optical hard limiter (OHL) on the BER performance of such a system. Results show that the performance of the OV-CDMA system may be greatly improved by using OHL preprocessing at the receiver side.
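
    The effect of an ideal optical hard limiter can be illustrated with a toy chip-level model: clipping the received intensity to unity before correlation prevents strong multiple-access interference from inflating the decision statistic. The signature code and interference values below are hypothetical, not from the paper.

```python
def hard_limit(intensity, threshold=1.0):
    """Ideal optical hard limiter: any chip intensity at or above the
    threshold is clipped to 1, anything below becomes 0."""
    return [1 if x >= threshold else 0 for x in intensity]

def correlate(received_chips, code):
    """On-off correlation receiver: sum received intensity over the code's 'on' chips."""
    return sum(r * c for r, c in zip(received_chips, code))

# Hypothetical signature code and per-chip interference from other users
code = [1, 0, 1, 1, 0, 0, 1, 0]
mai = [0, 3, 0, 2, 1, 0, 0, 4]
received = [d + m for d, m in zip(code, mai)]

without_ohl = correlate(received, code)              # MAI inflates the statistic
with_ohl = correlate(hard_limit(received), code)     # clipped back to the code weight
print(without_ohl, with_ohl)
```

    With the limiter, the correlation output cannot exceed the code weight, which is what improves the MAI-limited BER in the analysis above.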

  7. A model of R-D performance evaluation for Rate-Distortion-Complexity evaluation of H.264 video coding

    DEFF Research Database (Denmark)

    Wu, Mo; Forchhammer, Søren

    2007-01-01

    This paper considers a method for evaluation of the Rate-Distortion-Complexity (R-D-C) performance of video coding. A statistical model of the transformed coefficients is used to estimate the Rate-Distortion (R-D) performance. A model framework for rate, distortion and the slope of the R-D curve for inter and intra frames is presented. Assumptions are given for analyzing an R-D model for fast R-D-C evaluation. The theoretical expressions are combined with H.264 video coding and confirmed by experimental results. The complexity framework is applied to the integer motion estimation.
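
    The slope of the R-D curve that such a framework models can also be estimated numerically from sampled operating points. A generic sketch, using hypothetical (rate, distortion) pairs rather than values from the paper:

```python
def rd_slopes(points):
    """Finite-difference slope lambda = -dD/dR between consecutive
    (rate, distortion) operating points, sorted by increasing rate."""
    pts = sorted(points)
    return [-(d2 - d1) / (r2 - r1)
            for (r1, d1), (r2, d2) in zip(pts, pts[1:])]

# Hypothetical operating points: rate in bits/pixel, distortion as MSE
points = [(0.25, 80.0), (0.5, 40.0), (1.0, 18.0), (2.0, 7.0)]
print(rd_slopes(points))  # slopes decrease along a convex R-D curve
```

    Decreasing slopes confirm the convexity that fast R-D-C evaluation schemes typically assume.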

  8. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  9. A Navier-Stokes Chimera Code on the Connection Machine CM-5: Design and Performance

    Science.gov (United States)

    Jespersen, Dennis C.; Levit, Creon; Kwak, Dochan (Technical Monitor)

    1994-01-01

    We have implemented a three-dimensional compressible Navier-Stokes code on the Connection Machine CM-5. The code is set up for implicit time-stepping on single or multiple structured grids. For multiple grids and geometrically complex problems, we follow the 'chimera' approach, where flow data on one zone are interpolated onto another in the region of overlap. We describe our design philosophy and give some timing results for the current code. A parallel machine like the CM-5 is well suited to finite-difference methods on structured grids: the regular pattern of connections of a structured mesh maps well onto the architecture of the machine. So the first design choice, finite differences on a structured mesh, is natural. We use centered differences in space, with added artificial dissipation terms. When numerically solving the Navier-Stokes equations, there are liable to be some mesh cells near a solid body that are small in at least one direction. This mesh-cell geometry can impose a very severe CFL (Courant-Friedrichs-Lewy) condition on the time step for explicit time-stepping methods. Thus, though explicit time-stepping is well suited to the architecture of the machine, we have adopted implicit time-stepping and have further taken the approximate factorization approach. This creates the need to solve large banded linear systems, the first possible barrier to an efficient algorithm. To overcome it we have considered two options. The first is simply to solve the banded linear systems with data spread over the whole machine, using whatever fast method is available. This option is adequate for solving scalar tridiagonal systems, but for scalar pentadiagonal or block tridiagonal systems it is somewhat slower than desired. The second option is to 'transpose' the flow and geometry variables as part of the time-stepping process: start with x-lines of data in-processor, form the explicit terms in x, then transpose so that y-lines of data are in-processor.
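
    The scalar tridiagonal solves that arise from approximate factorization are classically handled by the Thomas algorithm; a serial textbook sketch follows (the CM-5 data-transpose layout is a separate concern, and this is not code from the paper).

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system by the Thomas algorithm.
    a: sub-diagonal (length n-1), b: diagonal (length n),
    c: super-diagonal (length n-1), d: right-hand side (length n)."""
    n = len(b)
    cp = [0.0] * n   # modified super-diagonal
    dp = [0.0] * n   # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                    # forward elimination
        m = b[i] - a[i - 1] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):           # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

    The forward-elimination recurrence is inherently sequential along the line, which is exactly why line-parallel machines either spread the solve across processors or transpose the data so each line sits in one processor.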

  10. Selection of a computer code for Hanford low-level waste engineered-system performance assessment. Revision 1

    International Nuclear Information System (INIS)

    McGrail, B.P.; Bacon, D.H.

    1998-02-01

    Planned performance assessments for the proposed disposal of low-activity waste (LAW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington, will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. The available computer codes with suitable capabilities at the time Revision 0 of this document was prepared were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LAW glass corrosion and the mobility of radionuclides. This analysis was repeated in this report, updated to include additional processes found to be important since Revision 0 was issued and additional codes that have been released. The highest-ranked computer code was found to be the STORM code, developed at PNNL for the US Department of Energy for evaluation of arid-land disposal sites.

  11. Baseline performance and learning rate of conceptual and perceptual skill-learning tasks: the effect of moderate to severe traumatic brain injury.

    Science.gov (United States)

    Vakil, Eli; Lev-Ran Galon, Carmit

    2014-01-01

    Existing literature presents a complex and inconsistent picture of the specific deficiencies involved in skill learning following traumatic brain injury (TBI). In an attempt to address this difficulty, individuals with moderate to severe TBI (n = 29) and a control group (n = 29) were tested with two different skill-learning tasks: conceptual (i.e., Tower of Hanoi Puzzle, TOHP) and perceptual (i.e., mirror reading, MR). Based on previous studies of the effect of divided attention on these tasks and findings regarding the effect of TBI on conceptual and perceptual priming tasks, it was predicted that the group with TBI would show impaired baseline performance compared to controls in the TOHP task though their learning rate would be maintained, while both baseline performance and learning rate on the MR task would be maintained. Consistent with our predictions, overall baseline performance of the group with TBI was impaired in the TOHP test, while the learning rate was not. The learning rate on the MR task was preserved but, contrary to our prediction, response time of the group with TBI was slower than that of controls. The pattern of results observed in the present study was interpreted to possibly reflect an impairment of both the frontal lobes as well as that of diffuse axonal injury, which is well documented as being affected by TBI. The former impairment affects baseline performance of the conceptual learning skill, while the latter affects the overall slower performance of the perceptual learning skill.

  12. Performance Analysis of a De-correlated Modified Code Tracking Loop for Synchronous DS-CDMA System under Multiuser Environment

    Science.gov (United States)

    Wu, Ya-Ting; Wong, Wai-Ki; Leung, Shu-Hung; Zhu, Yue-Sheng

    This paper presents the performance analysis of a De-correlated Modified Code Tracking Loop (D-MCTL) for synchronous direct-sequence code-division multiple-access (DS-CDMA) systems in a multiuser environment. Previous studies have shown that the imbalance of multiple access interference (MAI) in the time-lead and time-lag portions of the signal causes tracking bias or instability problems in traditional correlating tracking loops such as the delay lock loop (DLL) or the modified code tracking loop (MCTL). In this paper, we exploit the de-correlating technique to combat the MAI at the on-time code position of the MCTL. Unlike applying the same technique to the DLL, which requires an extensive search algorithm to compensate for the noise imbalance and may introduce a small tracking bias under low signal-to-noise ratio (SNR), the proposed D-MCTL has much lower computational complexity and exhibits zero tracking bias over the whole range of SNR, regardless of the number of interfering users. Furthermore, performance analysis and simulations based on Gold codes show that the proposed scheme has better mean-square tracking error, mean time to lose lock and near-far resistance than the other tracking schemes, including the traditional DLL (T-DLL), the traditional MCTL (T-MCTL) and the modified de-correlated DLL (MD-DLL).

  13. New Conceptual Design Tools

    DEFF Research Database (Denmark)

    Pugnale, Alberto; Holst, Malene Kirstine; Kirkegaard, Poul Henning

    2010-01-01

    This paper aims to discuss recent approaches to using computer tools, more and more frequently, as supports for the conceptual design phase of the architectural project. The present state of the art of software as a conceptual design tool can be summarized in two parallel tendencies. On the one hand, the main software houses are trying to introduce powerful and effective user-friendly applications into the world of building designers, increasingly able to fit their specific requirements; on the other hand, some groups of expert users with basic programming knowledge deal with the problem of software as a conceptual design tool by means of 'scripting', in other words by self-developing codes able to solve specific and well-defined design problems. Starting with a brief historical recall and a discussion of relevant research and practical experiences, this paper investigates...

  14. Probabilistic evaluation of fuel element performance by the combined use of a fast running simplistic and a detailed deterministic fuel performance code

    International Nuclear Information System (INIS)

    Misfeldt, I.

    1980-01-01

    A comprehensive evaluation of fuel element performance requires a probabilistic fuel code supported by a well benchmarked deterministic code. This paper presents an analysis of an SGHWR ramp experiment in which the probabilistic fuel code FRP is utilized in combination with the deterministic fuel models FFRS and SLEUTH/SEER. The statistical methods employed in FRP are Monte Carlo simulation and a low-order Taylor approximation. The fast-running simplistic fuel code FFRS is used for the deterministic simulations, whereas simulations with SLEUTH/SEER are used to verify the predictions of FFRS. The ramp test was performed with an SGHWR fuel element in which 9 of the 36 fuel pins failed. There seemed to be good agreement between the deterministic simulations and the experiment, but the statistical evaluation shows that the uncertainty in the important performance parameters is too large to support this apparently good result. The analysis therefore indicates a discrepancy between the experiment and the deterministic code predictions. Possible explanations for this disagreement are discussed. (author)

  15. Performance assessment of the Greater Confinement Disposal facility on the Nevada Test Site: Comparing the performance of two conceptual site models

    International Nuclear Information System (INIS)

    Baer, T.A.; Price, L.L.; Gallegos, D.P.

    1993-01-01

    A small amount of transuranic (TRU) waste has been disposed of at the Greater Confinement Disposal (GCD) site located on the Nevada Test Site's (NTS) Radioactive Waste Management Site (RWMS). The waste has been buried in several deep (37 m) boreholes dug into the floor of an alluvial basin. For the waste to remain in its current configuration, the DOE must demonstrate compliance of the site with the TRU disposal requirements, 40 CFR 191. Sandia's approach to process modelling in performance assessment is to use demonstrably conservative models of the site. Choosing the most conservative model, however, can be uncertain. As an example, diffusion of contaminants upward from the buried waste in the vadose-zone water is the primary mechanism of release. This process can be modelled as straight upward planar diffusion or as spherical diffusion in all directions. The former has high fluxes but a small release area; the latter has lower fluxes spread over a greater area. We have developed analytic solutions to a simple test problem for both models and compared the total integrated discharges. The spherical diffusion conceptual model results in at least five times greater release to the accessible environment than the planar model at all diffusivities. Modifying the planar model to allow for a larger release, however, compensated for the smaller original planar discharge and resulted in a new planar model that was more conservative than the spherical model except at low diffusivities.
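
    The planar conceptual model described above amounts to one-dimensional upward diffusion from the waste. A minimal explicit finite-difference sketch of that process is below; the grid, diffusivity and boundary concentrations are hypothetical illustrations, not the report's analytic solutions.

```python
def planar_diffusion(c0, diffusivity, dz, nz, dt, steps):
    """Explicit FTCS solve of dC/dt = D * d2C/dz2 on a 1D column.
    Node 0 is held at the waste concentration c0; the far end is held at zero."""
    c = [0.0] * nz
    c[0] = c0
    r = diffusivity * dt / dz ** 2
    assert r <= 0.5, "explicit scheme stability requires D*dt/dz^2 <= 1/2"
    for _ in range(steps):
        new = c[:]
        for i in range(1, nz - 1):
            new[i] = c[i] + r * (c[i + 1] - 2 * c[i] + c[i - 1])
        c = new
    return c

# Hypothetical parameters: 1 m column, 10 cm cells, D in m^2/yr, dt in yr
profile = planar_diffusion(c0=1.0, diffusivity=1e-2, dz=0.1, nz=11, dt=0.1, steps=1000)
```

    After many time steps the profile relaxes toward the steady linear gradient between the waste and the clean far boundary; the release rate follows from the gradient at node 0.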

  16. Achievable Performance of Zero-Delay Variable-Rate Coding in Rate-Constrained Networked Control Systems with Channel Delay

    DEFF Research Database (Denmark)

    Barforooshan, Mohsen; Østergaard, Jan; Stavrou, Fotios

    2017-01-01

    This paper presents an upper bound on the minimum data rate required to achieve a prescribed closed-loop performance level in networked control systems (NCSs). The considered feedback loop includes a linear time-invariant (LTI) plant with a single measurement output and a single control input. Moreover, in this NCS, a causal but otherwise unconstrained feedback system carries out zero-delay variable-rate coding and control. Between the encoder and decoder, data are exchanged over a rate-limited noiseless digital channel with a known constant time delay. Here we propose a linear source-coding scheme...

  17. Performance analysis of quantum access network using code division multiple access model

    International Nuclear Information System (INIS)

    Hu Linxi; Yang Can; He Guangqiang

    2017-01-01

    A quantum access network has been implemented by frequency-division multiple access and time-division multiple access, while code-division multiple access has been limited by the difficulty of realizing the orthogonality of the codes. Recently, chaotic phase shifters were proposed to guarantee orthogonality through different chaotic signals and to spread the spectral content of the quantum states. In this letter, we propose to implement a code-division multiple-access quantum network using chaotic phase shifters and synchronization. Owing to the orthogonality of the different chaotic phase shifters, every pair of users can faithfully transmit quantum information through a common channel with little crosstalk between users. Meanwhile, the broadband spectra of the chaotic signals efficiently help the quantum states defend against channel loss and noise. (paper)

  18. Building energy performance analysis by an in-house developed dynamic simulation code: An investigation for different case studies

    International Nuclear Information System (INIS)

    Buonomano, Annamaria; Palombo, Adolfo

    2014-01-01

    Highlights: • A new dynamic simulation code for building energy performance analysis is presented. • The thermal behavior of each building element is modeled by a thermal RC network. • The physical models implemented in the code are illustrated. • The code was validated by the BESTEST standard procedure. • We investigate residential buildings, offices and stores in different climates. - Abstract: A novel dynamic simulation model for building envelope energy performance analysis is presented in this paper. This tool supports the investigation of many new building technologies that increase system energy efficiency, and it can be used for scientific research purposes. In addition to the yearly heating and cooling loads and energy demand, the output comprises the dynamic temperature profiles of indoor air and surfaces and the dynamic profiles of the thermal fluxes through the building elements. The presented simulation model is validated through the BESTEST standard procedure. Several new case studies are developed to assess, through the presented code, the energy performance of three different building envelopes under several different weather conditions. In particular, dwelling and commercial buildings are analysed. Lightweight and heavyweight envelopes, as well as different glazed surface areas, have been used for every case study. From the achieved results, interesting design and operating guidelines can be obtained. Such data have also been compared vs. those calculated by TRNSYS and EnergyPlus. The detected deviations of the obtained results vs. those of the standard tools are almost always below 10%.
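
    The thermal RC-network idea in the highlights can be illustrated with a single-capacitance wall node relaxing toward the resistance-weighted mean of the indoor and outdoor temperatures. All parameter values below are hypothetical illustrations, not taken from the code described in the paper.

```python
import math

def simulate_wall(t_out, t_in, r_out, r_in, capacitance, dt):
    """One-capacitance RC wall model (explicit Euler).
    The wall node exchanges heat with indoor air through r_in and with
    outdoor air through r_out; returns the node temperature trajectory."""
    t_node = t_in
    history = []
    for t_o in t_out:
        q_in = (t_in - t_node) / r_in     # W/m2 from indoor air into the node
        q_out = (t_o - t_node) / r_out    # W/m2 from outdoor air into the node
        t_node += dt * (q_in + q_out) / capacitance
        history.append(t_node)
    return history

# Hypothetical sinusoidal outdoor day (hourly steps), 20 C indoor setpoint
t_out = [10 + 8 * math.sin(2 * math.pi * h / 24) for h in range(48)]
trajectory = simulate_wall(t_out, t_in=20.0, r_out=0.06, r_in=0.12,
                           capacitance=150_000.0, dt=3600.0)
```

    Real building codes chain several such RC nodes per element and add solar and internal gains; this sketch only shows the core time-stepping of the network.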

  19. Performance Assessment and analysis of national building codes with fire safety in all wards of a hospital

    Directory of Open Access Journals (Sweden)

    M. Mahdinia

    2009-04-01

    Full Text Available Background and objective: Fire safety is an important problem in hospitals. Limited mobility, lack of awareness and the special situation of residents are the reasons for this. In many countries, such as Iran, fire protection regulations have been compiled within the framework of national building codes. Current building codes do not create sufficient safety for patients in hospitals in different situations, and many advanced countries are working to establish performance-based building codes. This study was carried out with the goal of determining the fire risk level in the wards of a hospital and surveying the efficiency of the national building codes. Methods: Fire risk assessment was done using the "engineering fire risk assessment method", with a checklist for data gathering. Risk was calculated in all compartments; in the next stage, to survey the effect of the building codes, under the supposition that all compartments conform to the building code requirements, the risk level was calculated in both situations. Results: The results of the present study reveal that the risk level in all wards is greater than one, and since only a risk below one is acceptable, the minimum safety conditions were not met in most wards. The results show that the national building codes, under the different conditions, are not sufficiently effective in creating suitable safety. Conclusion: In order to achieve a fire-safe design with sufficient efficiency, a suitable choice is the use of risk-assessment-based design methods.

  20. SIEX: a correlated code for the prediction of liquid metal fast breeder reactor (LMFBR) fuel thermal performance

    International Nuclear Information System (INIS)

    Dutt, D.S.; Baker, R.B.

    1975-06-01

    The SIEX computer program is a steady-state heat transfer code developed to provide thermal performance calculations for a mixed-oxide fuel element in a fast neutron environment. Fuel restructuring, fuel-cladding heat conduction and fission gas release are modeled to provide an assessment of temperatures. Modeling emphasis has been placed on correlations to measurable quantities from EBR-II irradiation tests and on the inclusion of these correlations in a physically based computational scheme. SIEX is completely modular in construction, allowing the user options for material properties and correlated models. Required code input is limited to geometric and environmental parameters, with a 'consistent' set of material properties and correlated models provided by the code. 24 references. (U.S.)

  1. Experimental demonstration of the transmission performance for LDPC-coded multiband OFDM ultra-wideband over fiber system

    Science.gov (United States)

    He, Jing; Wen, Xuejie; Chen, Ming; Chen, Lin; Su, Jinshu

    2015-01-01

    To improve the transmission performance of multiband orthogonal frequency-division multiplexing (MB-OFDM) ultra-wideband (UWB) over optical fiber, a pre-coding scheme based on low-density parity-check (LDPC) codes is adopted and experimentally demonstrated in an intensity-modulation and direct-detection MB-OFDM UWB over fiber system. Meanwhile, a symbol synchronization and pilot-aided channel estimation scheme is implemented at the receiver of the MB-OFDM UWB over fiber system. The experimental results show that the LDPC pre-coding scheme works effectively in the MB-OFDM UWB over fiber system. After 70 km standard single-mode fiber (SSMF) transmission, at a bit error rate of 1 × 10-3, the receiver sensitivities are improved by about 4 dB when the LDPC code rate is 75%.

  2. Performance Analysis for Bit Error Rate of DS-CDMA Sensor Network Systems with Source Coding

    Directory of Open Access Journals (Sweden)

    Haider M. AlSabbagh

    2012-03-01

    Full Text Available The minimum energy (ME) coding scheme combined with a DS-CDMA wireless sensor network is analyzed in order to reduce the energy consumed and the multiple access interference (MAI) in relation to the number of users (receivers). Minimum energy coding exploits redundant bits to save power, utilizing an RF link and On-Off Keying modulation. The relations are presented and discussed for several expected error levels in the employed channel, via the bit error rate and the SNR, for different numbers of users (receivers).
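
    Minimum-energy coding can be sketched as assigning the source words to the lowest-weight codewords of a longer block, so that On-Off Keying transmits fewer 'on' chips per word. A toy construction follows; the block lengths are hypothetical, not from the paper.

```python
from itertools import combinations

def me_codebook(k, n):
    """Minimum-energy codebook: assign the 2^k source words to the 2^k
    lowest-weight length-n binary codewords (n must be large enough)."""
    words = []
    weight = 0
    while len(words) < 2 ** k:
        for ones in combinations(range(n), weight):
            cw = [0] * n
            for i in ones:
                cw[i] = 1
            words.append(tuple(cw))
            if len(words) == 2 ** k:
                break
        weight += 1
    return {format(s, f"0{k}b"): words[s] for s in range(2 ** k)}

book = me_codebook(k=3, n=7)
avg_ones = sum(sum(cw) for cw in book.values()) / len(book)
print(avg_ones)  # average 'on' chips per 3-bit source word
```

    Here the average drops to 0.875 'on' chips per 3-bit word, versus 1.5 for uncoded equiprobable 3-bit words; that reduction is the power saving ME coding trades against the longer block.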

  3. Performance Analysis of Iterative Decoding Algorithms for PEG LDPC Codes in Nakagami Fading Channels

    Directory of Open Access Journals (Sweden)

    O. Al Rasheed

    2013-11-01

    Full Text Available In this paper we give a comparative analysis of decoding algorithms for Low-Density Parity-Check (LDPC) codes in a channel with a Nakagami distribution of the fading envelope. We consider the Progressive Edge-Growth (PEG) method and an improved PEG method for parity-check matrix construction, which can be used to avoid short cycles, small trapping sets and a high error floor. A comparative analysis of several classes of LDPC codes under various propagation conditions, decoded using different decoding algorithms, is also presented.

  4. Verification of the 2.00 WAPPA-B [Waste Package Performance Assessment-B version] code

    International Nuclear Information System (INIS)

    Tylock, B.; Jansen, G.; Raines, G.E.

    1987-07-01

    The old version of the Waste Package Performance Assessment (WAPPA) code has been modified into a new code version, 2.00 WAPPA-B. The input files and the results for two benchmarks at repository conditions are fully documented in the appendixes of the EA reference report. The 2.00 WAPPA-B version of the code is suitable for computation of barrier failure due to uniform corrosion; however, an improved sub-version, 2.01 WAPPA-B, is recommended for general use due to minor errors found in 2.00 WAPPA-B during its verification procedures. The input files and input echoes have been modified to include behavior of both radionuclides and elements, but the 2.00 WAPPA-B version of the WAPPA code is not recommended for computation of radionuclide releases. The 2.00 WAPPA-B version computes only mass balances and the initial presence of radionuclides that can be released. Future code development in the 3.00 WAPPA-C version will include radionuclide release computations. 19 refs., 10 figs., 1 tab

  5. Modeling of EAST ICRF antenna performance using the full-wave code TORIC

    Energy Technology Data Exchange (ETDEWEB)

    Edlund, E. M., E-mail: eedlund@pppl.gov [Princeton Plasma Physics Laboratory, Princeton, NJ (United States); Bonoli, P. T.; Porkolab, M.; Wukitch, S. J. [MIT Plasma Science and Fusion Center, Cambridge, MA (United States)

    2015-12-10

    Access to advanced operating regimes in the EAST tokamak will require a combination of electron-cyclotron resonance heating (ECRH), neutral beam injection (NBI) and ion cyclotron range of frequencies (ICRF) heating, with the addition of lower-hybrid current drive (LHCD) for current profile control. Prior experiments at the EAST tokamak facility have shown a relatively weak response of the plasma temperature to the application of ICRF heating, with typical coupled power of about 2 MW from a 12 MW source. The launched spectrum, at n{sub φ} = 34 for 0-π-0-π phasing and 27 MHz, is largely inaccessible at line-averaged densities of approximately 2 × 10{sup 19} m{sup −3}. However, with variable antenna phasing and frequency, this system has considerable latitude to explore different heating schemes. To develop an ICRF actuator control model, we have used the full-wave code TORIC to explore the physics of ICRF wave propagation in EAST. The results presented from this study use a spectral analysis based on a superposition of n{sub φ} spanning −50 to +50. The low-density regime typical of EAST plasmas results in a perpendicular wavelength comparable to the minor radius, which leads to global cavity resonance effects and eigenmode formation when the single-pass absorption is low. This behavior indicates that improved performance can be attained by lowering the peak of the k{sub ||} spectrum by using π/3 phasing of the 4-strap antenna. Based on prior studies conducted at Alcator C-Mod, this phasing is also expected to have the advantage of nearly divergence-free box currents, which should result in reduced levels of impurity production. Significant enhancements of the loading resistance may be achieved by using low-k{sub ||} phasing and a combination of magnetic field and frequency to vary the location of the resonance and mode-conversion regions. TORIC calculations indicate that significant power may be channeled to the electrons and the deuterium majority. We expect that

  6. Roadmap to an Engineering-Scale Nuclear Fuel Performance and Safety Code

    International Nuclear Information System (INIS)

    Turner, John A.; Clarno, Kevin T.; Hansen, Glen A.

    2009-01-01

    -development activities. Realizing the full benefits of this approach will likely take some time. However, it is important that the developmental activities for modeling and simulation be tightly coupled with the experimental activities to maximize feedback effects and accelerate both the experimental and analytical elements of the program toward a common objective. The close integration of modeling and simulation and experimental activities is key to developing a useful fuel performance simulation capability, providing a validated design and analysis tool, and understanding the uncertainties within the models and design process. The efforts of this project are integrally connected to the Transmutation Fuels Campaign (TFC), which maintains as a primary objective to formulate, fabricate, and qualify a transuranic-based fuel with added minor actinides for use in future fast reactors. Additional details of the TFC scope can be found in the Transmutation Fuels Campaign Execution Plan. This project is an integral component of the TFC modeling and simulation effort, and this multiyear plan borrowed liberally from the Transmutation Fuels Campaign Modeling and Simulation Roadmap. This document provides the multiyear staged development plan to develop a continuum-level Integrated Performance and Safety Code (IPSC) to predict the behavior of the fuel and cladding during normal reactor operations and anticipated transients up to the point of clad breach.

  7. Current status of the transient integral fuel element performance code URANUS

    International Nuclear Information System (INIS)

    Preusser, T.; Lassmann, K.

    1983-01-01

    To investigate the behavior of fuel pins during normal and off-normal operation, the integral fuel rod code URANUS has been extended to include a transient version. The paper describes the current status of the program system, including a presentation of newly developed models for hypothetical-accident investigation. The main objective of current development work is to improve the modelling of fuel and clad material behavior during fast transients. URANUS allows detailed analysis of experiments until the onset of strong material transport phenomena. Transient fission gas analysis is carried out through coupling with a special version of the LANGZEIT-KURZZEIT code (KfK). Fuel restructuring and grain-growth kinetics models have been improved recently to better characterize pre-experimental steady-state operation; transient models are under development. Extensive verification of the new version has been carried out by comparison with analytical solutions, experimental evidence and code-to-code evaluation studies. URANUS, with all these improvements, has been successfully applied to difficult fast breeder fuel rod analyses including TOP, LOF, TUCOP, local coolant blockage and specific carbide fuel experiments. The objective of further studies is the description of transient PCMI. It is expected that the results of these developments will contribute significantly to the understanding of fuel element structural behavior during severe transients. (orig.)

  8. Optimized Reactive Power Flow of DFIG Power Converters for Better Reliability Performance Considering Grid Codes

    DEFF Research Database (Denmark)

    Zhou, Dao; Blaabjerg, Frede; Lau, Mogens

    2015-01-01

    In order to fulfill modern grid codes, over-excited reactive power injection will further reduce the lifetime of the rotor-side converter. In this paper, the additional stress on the power semiconductors due to the reactive power injection is first evaluated in terms of modulation index...

  9. Coded throughput performance simulations for the time-varying satellite channel. M.S. Thesis

    Science.gov (United States)

    Han, LI

    1995-01-01

    The design of a reliable satellite communication link involving data transfer from a small, low-orbit satellite to a ground station through a geostationary satellite was examined. In such a scenario, the received signal power to noise density ratio increases as the transmitting low-orbit satellite comes into view and then decreases as it departs, resulting in a short-duration, time-varying communication link. The optimal values of the small-satellite antenna beamwidth, signaling rate, modulation scheme and the theoretical link throughput (in bits per day) have been determined. The goal of this thesis is to choose a practical coding scheme which maximizes the daily link throughput while satisfying a prescribed probability-of-error requirement. We examine the throughput of both fixed-rate and variable-rate concatenated forward error correction (FEC) coding schemes for the additive white Gaussian noise (AWGN) channel, and then examine the effect of radio-frequency interference (RFI) on the best coding scheme among them. Interleaving is used to mitigate degradation due to RFI. It was found that the variable-rate concatenated coding scheme could achieve 74 percent of the theoretical throughput, equivalent to 1.11 Gbits/day based on the cutoff rate R(sub 0). For comparison, 87 percent is achievable for the AWGN-only case.
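
    A variable-rate scheme of the kind compared above can be caricatured as picking, at each instant of the pass, the highest-rate code whose SNR threshold is currently met, and integrating the information bits. The pass profile and code thresholds below are hypothetical illustrations, not the thesis data.

```python
import math

def pass_throughput(snr_profile_db, code_set, symbol_rate, dt):
    """Accumulate information bits over a pass: at each time step choose the
    highest-rate code whose required Es/N0 threshold the link currently meets."""
    bits = 0.0
    for snr_db in snr_profile_db:
        usable = [rate for rate, threshold_db in code_set if snr_db >= threshold_db]
        if usable:
            bits += max(usable) * symbol_rate * dt
    return bits

# Hypothetical pass: SNR rises and falls as the satellite crosses the sky (1 s steps)
profile = [0.5 + 6.0 * math.sin(math.pi * t / 600.0) for t in range(600)]
# Hypothetical code family: (code rate, required Es/N0 in dB)
codes = [(0.25, 1.0), (0.5, 2.5), (0.75, 4.5)]
total = pass_throughput(profile, codes, symbol_rate=1.0e6, dt=1.0)
print(f"{total / 1e6:.1f} Mbit over the pass")
```

    A fixed-rate scheme corresponds to a one-element code set: it wastes margin near mid-pass and drops out early at the edges, which is why the variable-rate scheme recovers a larger fraction of the theoretical throughput.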

  10. Performance analysis of super-orthogonal space-frequency trellis coded OFDM system

    CSIR Research Space (South Africa)

    Sokoya, O

    2009-08-01

    Full Text Available This paper considers a super-orthogonal space-frequency trellis coded (SOSFTC) scheme that is used with OFDM. SOSFTC-OFDM utilizes diversity in the frequency and space domains by assuming that coding is done along adjacent subcarriers in an OFDM environment. This paper evaluates the exact pairwise error probability (PEP) of the SOSFTC...

  11. The VULKIN code used for evaluation of the cladding tube's performance

    International Nuclear Information System (INIS)

    Marbach, G.

    1979-01-01

The VULKIN code gives the evolution of the stress and strain distribution through the thickness of the clad under the hypothesis of revolution symmetry. The program takes into account thermal expansion and the radial thermal gradient, fission gas pressure, and steel swelling due to neutron flux.

  12. Comparison of the ENIGMA code with experimental data on thermal performance, stable fission gas and iodine release at high burnup

    Energy Technology Data Exchange (ETDEWEB)

    Killeen, J C [Nuclear Electric plc, Barnwood (United Kingdom)

    1997-08-01

The predictions of the ENIGMA code have been compared with data from high burn-up fuel experiments from the Halden and RISO reactors. The experiments modelled were IFA-504 and IFA-558 from Halden and test II-5 from the RISO power burn-up test series. The code modelled the fuel thermal performance well and provided a good measure of iodine release from pre-interlinked fuel. After interlinkage the iodine predictions remain a good fit for one experiment, but there is significant overprediction for a second experiment (IFA-558). Stable fission gas release is also well modelled, and the predictions are within the expected uncertainty band throughout the burn-up range. This report presents code predictions for stable fission gas release to 40 GWd/tU, iodine release measurements to 50 GWd/tU and thermal performance (fuel centre temperature) to 55 GWd/tU. Fuel ratings of up to 38 kW/m were modelled at the high burn-up levels. The code is shown to predict all these parameters accurately or conservatively. (author). 1 ref., 6 figs.

  13. Assessing the Predictive Capability of the LIFEIV Nuclear Fuel Performance Code using Sequential Calibration

    International Nuclear Information System (INIS)

    Stull, Christopher J.; Williams, Brian J.; Unal, Cetin

    2012-01-01

    This report considers the problem of calibrating a numerical model to data from an experimental campaign (or series of experimental tests). The issue is that when an experimental campaign is proposed, only the input parameters associated with each experiment are known (i.e. outputs are not known because the experiments have yet to be conducted). Faced with such a situation, it would be beneficial from the standpoint of resource management to carefully consider the sequence in which the experiments are conducted. In this way, the resources available for experimental tests may be allocated in a way that best 'informs' the calibration of the numerical model. To address this concern, the authors propose decomposing the input design space of the experimental campaign into its principal components. Subsequently, the utility (to be explained) of each experimental test to the principal components of the input design space is used to formulate the sequence in which the experimental tests will be used for model calibration purposes. The results reported herein build on those presented and discussed in (1,2) wherein Verification and Validation and Uncertainty Quantification (VU) capabilities were applied to the nuclear fuel performance code LIFEIV. In addition to the raw results from the sequential calibration studies derived from the above, a description of the data within the context of the Predictive Maturity Index (PMI) will also be provided. The PMI (3,4) is a metric initiated and developed at Los Alamos National Laboratory to quantitatively describe the ability of a numerical model to make predictions in the absence of experimental data, where it is noted that 'predictions in the absence of experimental data' is not synonymous with extrapolation. This simply reflects the fact that resources do not exist such that each and every execution of the numerical model can be compared against experimental data. If such resources existed, the justification for numerical models

  14. Using clinical data to predict high-cost performance coding issues associated with pressure ulcers: a multilevel cohort model.

    Science.gov (United States)

    Padula, William V; Gibbons, Robert D; Pronovost, Peter J; Hedeker, Donald; Mishra, Manish K; Makic, Mary Beth F; Bridges, John Fp; Wald, Heidi L; Valuck, Robert J; Ginensky, Adam J; Ursitti, Anthony; Venable, Laura Ruth; Epstein, Ziv; Meltzer, David O

    2017-04-01

    Hospital-acquired pressure ulcers (HAPUs) have a mortality rate of 11.6%, are costly to treat, and result in Medicare reimbursement penalties. Medicare codes HAPUs according to Agency for Healthcare Research and Quality Patient-Safety Indicator 3 (PSI-03), but they are sometimes inappropriately coded. The objective is to use electronic health records to predict pressure ulcers and to identify coding issues leading to penalties. We evaluated all hospitalized patient electronic medical records at an academic medical center data repository between 2011 and 2014. These data contained patient encounter level demographic variables, diagnoses, prescription drugs, and provider orders. HAPUs were defined by PSI-03: stages III, IV, or unstageable pressure ulcers not present on admission as a secondary diagnosis, excluding cases of paralysis. Random forests reduced data dimensionality. Multilevel logistic regression of patient encounters evaluated associations between covariates and HAPU incidence. The approach produced a sample population of 21 153 patients with 1549 PSI-03 cases. The greatest odds ratio (OR) of HAPU incidence was among patients diagnosed with spinal cord injury (ICD-9 907.2: OR = 14.3; P  coded for paralysis, leading to a PSI-03 flag. Other high ORs included bed confinement (ICD-9 V49.84: OR = 3.1, P  coded without paralysis, leading to PSI-03 flags. The resulting statistical model can be tested to predict HAPUs during hospitalization. Inappropriate coding of conditions leads to poor hospital performance measures and Medicare reimbursement penalties. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
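The PSI-03 case definition quoted in this record (stage III, IV, or unstageable pressure ulcer, not present on admission, as a secondary diagnosis, excluding paralysis) can be expressed as a small predicate. The diagnosis-code sets below are simplified stand-ins for illustration, not the official AHRQ specification.

```python
# Assumed, simplified code sets -- placeholders, not the AHRQ PSI-03 value sets.
STAGE34_OR_UNSTAGEABLE = {"707.23", "707.24", "707.25"}
PARALYSIS_EXCLUSIONS = {"344.1", "907.2"}

def psi03_flag(secondary_dx, present_on_admission, all_dx):
    """Return True when an encounter would be flagged under this toy PSI-03 rule.

    secondary_dx: list of secondary diagnosis codes for the encounter
    present_on_admission: dict mapping a diagnosis code to its POA indicator
    all_dx: every diagnosis code on the encounter (used for exclusions)
    """
    if any(d in PARALYSIS_EXCLUSIONS for d in all_dx):
        return False  # paralysis cases are excluded from PSI-03
    return any(d in STAGE34_OR_UNSTAGEABLE
               and not present_on_admission.get(d, False)
               for d in secondary_dx)
```

Writing the rule out this way makes the coding issue in the abstract concrete: adding or omitting a single exclusion code (e.g. paralysis) flips the flag, and with it the reimbursement penalty.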

  15. Conceptual study of advanced PWR core design

    International Nuclear Information System (INIS)

    Kim, Young Jin; Chang, Moon Hee; Kim, Keung Ku; Joo, Hyung Kuk; Kim, Young Il; Noh, Jae Man; Hwang, Dae Hyun; Kim, Taek Kyum; Yoo, Yon Jong.

    1997-09-01

The purpose of this project is to develop and verify core design concepts with enhanced safety and economy, together with the associated methodologies for core analysis. From a study of the state of the art of foreign advanced reactor cores, we developed core concepts such as a soluble-boron-free, highly convertible, enhanced-safety core loaded with semi-tight-lattice hexagonal fuel assemblies. To analyze this hexagonal core, we developed and verified neutronic and thermal-hydraulic analysis methodologies. The HELIOS code was adopted as the assembly code and the HEXFEM code was developed for hexagonal core analysis. Based on experimental data in hexagonal lattices and the COBRA-IV-I code, we developed a thermal-hydraulic analysis code for hexagonal lattices. Using the core analysis code systems developed in this project, we designed a 600 MWe core and studied the feasibility of the core concepts. Two additional scopes were covered in this project: a study of operational strategies for a soluble-boron-free core and a conceptual design of a large-scale passive core. By using the axial BP zoning concept and a suitable design of control rods, this project showed that it is possible to design a soluble-boron-free core in a 600 MWe PWR. The results of the large-scale core design showed that passive concepts and daily load-follow operation could be practiced. (author). 15 refs., 52 tabs., 101 figs.

  16. Conceptual study of advanced PWR core design

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Chang, Moon Hee; Kim, Keung Ku; Joo, Hyung Kuk; Kim, Young Il; Noh, Jae Man; Hwang, Dae Hyun; Kim, Taek Kyum; Yoo, Yon Jong

    1997-09-01

The purpose of this project is to develop and verify core design concepts with enhanced safety and economy, together with the associated methodologies for core analysis. From a study of the state of the art of foreign advanced reactor cores, we developed core concepts such as a soluble-boron-free, highly convertible, enhanced-safety core loaded with semi-tight-lattice hexagonal fuel assemblies. To analyze this hexagonal core, we developed and verified neutronic and thermal-hydraulic analysis methodologies. The HELIOS code was adopted as the assembly code and the HEXFEM code was developed for hexagonal core analysis. Based on experimental data in hexagonal lattices and the COBRA-IV-I code, we developed a thermal-hydraulic analysis code for hexagonal lattices. Using the core analysis code systems developed in this project, we designed a 600 MWe core and studied the feasibility of the core concepts. Two additional scopes were covered in this project: a study of operational strategies for a soluble-boron-free core and a conceptual design of a large-scale passive core. By using the axial BP zoning concept and a suitable design of control rods, this project showed that it is possible to design a soluble-boron-free core in a 600 MWe PWR. The results of the large-scale core design showed that passive concepts and daily load-follow operation could be practiced. (author). 15 refs., 52 tabs., 101 figs.

  17. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    International Nuclear Information System (INIS)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined
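The batch bookkeeping described in this record (successive feed batches, all exposed to the same point-model flux, with the oldest discharged each cycle) can be sketched with a single lumped nuclide. The cross section, flux, and cycle length below are assumed round numbers, not PREMOR data.

```python
import math

SIGMA_A = 1.0e-22      # assumed one-group absorption cross section (cm^2)
PHI = 3.0e13           # assumed point-model neutron flux (n/cm^2/s)
CYCLE_DAYS = 300.0     # assumed cycle length

def burn(n, days):
    """Deplete a lumped relative nuclide density under constant flux."""
    return n * math.exp(-SIGMA_A * PHI * days * 86400.0)

def run_cycles(n_cycles=3, resident=3):
    """Track `resident` batches; oldest first, freshest last."""
    batches = [1.0] * resident
    for _ in range(n_cycles):
        batches = [burn(n, CYCLE_DAYS) for n in batches]  # same flux for all
        batches = batches[1:] + [1.0]  # discharge the oldest, load fresh feed
    return batches
```

At equilibrium the list holds one batch per residence cycle, each depleted in proportion to its total exposure, which is the essence of the survey model.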

  18. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined.

  19. An Improved Thermal Blooming Model for the Laser Performance Code Anchor

    Science.gov (United States)

    2016-06-01

[Fragmentary abstract; only scattered text survived extraction.] ...over which a laser beam can maintain transverse coherence throughout its propagation distance. Typical values of r0 are on the order of a few... Cited: ...G. Gebhardt, “Twenty-five years of thermal blooming: An overview,” in Proceedings of SPIE 1221, Propagation of High-Energy Laser Beams Through the... Subject terms: thermal blooming, atmospheric propagation, laser, scaling code, Strehl ratio, ANCHOR, COAMPS, NAVSLaM, LEEDR. 77 pages.

  20. Performance-based building codes: a call for injury prevention indicators that bridge health and building sectors.

    Science.gov (United States)

    Edwards, N

    2008-10-01

    The international introduction of performance-based building codes calls for a re-examination of indicators used to monitor their implementation. Indicators used in the building sector have a business orientation, target the life cycle of buildings, and guide asset management. In contrast, indicators used in the health sector focus on injury prevention, have a behavioural orientation, lack specificity with respect to features of the built environment, and do not take into account patterns of building use or building longevity. Suggestions for metrics that bridge the building and health sectors are discussed. The need for integrated surveillance systems in health and building sectors is outlined. It is time to reconsider commonly used epidemiological indicators in the field of injury prevention and determine their utility to address the accountability requirements of performance-based codes.

  1. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    International Nuclear Information System (INIS)

    Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-01-01

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.

  2. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    Energy Technology Data Exchange (ETDEWEB)

    Oelerich, Jan Oliver, E-mail: jan.oliver.oelerich@physik.uni-marburg.de; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-06-15

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.

  3. The VULKIN code used for evaluation of the cladding tube's performance

    International Nuclear Information System (INIS)

    Marbach, G.

    1979-01-01

Full text: 1 - Introduction. The French approach to fast-reactor subassembly design is to analyse each component of the subassembly and each basic phenomenon in order to estimate the overall behaviour. The VULKIN code describes the mechanical behaviour of the clad alone. A cladding damage parameter is calculated from the observed deformations; when it exceeds a fixed value, the rupture probability is considered non-negligible. This criterion is not the only limit on the irradiation project: other limits are tied to other problems, such as avoidance of fuel melting and bundle interaction behaviour. 2 - VULKIN code - Presentation. The VULKIN code gives the evolution of the stress and strain distribution through the thickness of the clad under the hypothesis of revolution symmetry. The program takes into account thermal expansion and the radial thermal gradient, fission gas pressure, and steel swelling due to neutron flux. Fuel-clad mechanical interaction is not described by this model; experimental results show that its influence is negligible for the most usual subassemblies, but, when necessary, a dedicated calculation is performed with a specific code such as TUREN, described in another paper. The model also does not consider the stresses and strains resulting from interaction between bundle and wrapper; another model describes the bundle behaviour and determines the diametral deformation limit from the subassembly's geometrical characteristics. The clad is treated as an elasto-plastic element. Instantaneous plastic flow, thermal creep, and irradiation creep are evaluated at each time step. The inputs of the code are the geometry, the irradiation parameters (temperature, dose), the fission gas pressure evolution, the swelling law, and experimental correlations for thermal and irradiation creep. The mechanical resolution is classical: the clad is divided into concentric rings, and at each time step the equations resulting from the equilibrium of forces and the compatibility of displacements are solved.
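The ring discretization in the presentation above can be sketched very roughly as follows. The geometry, material constants, and the thin-wall pressure-stress and thermoelastic formulas are illustrative assumptions for a generic clad, not VULKIN's actual models (which also include creep, plasticity, and swelling).

```python
import math

R_IN, R_OUT = 2.7e-3, 3.25e-3   # assumed clad inner/outer radii (m)
E, ALPHA = 170e9, 16e-6         # assumed Young's modulus (Pa) and expansion coeff (1/K)
P_GAS = 5.0e6                   # assumed fission gas pressure (Pa)

def ring_state(n_rings=10, t_in=820.0, t_out=650.0):
    """Return (radius, temperature, hoop stress) for each concentric ring.

    Linear radial temperature profile; hoop stress = thin-wall pressure term
    plus a simple thermoelastic term relative to the mean wall temperature.
    """
    t_wall = R_OUT - R_IN
    sigma_p = P_GAS * R_IN / t_wall          # thin-wall hoop stress from gas pressure
    t_mean = 0.5 * (t_in + t_out)
    states = []
    for i in range(n_rings):
        r = R_IN + (i + 0.5) * t_wall / n_rings
        temp = t_in + (t_out - t_in) * (r - R_IN) / t_wall
        sigma_th = E * ALPHA * (t_mean - temp) / (1.0 - 0.3)  # thermoelastic hoop term
        states.append((r, temp, sigma_p + sigma_th))
    return states
```

Even this crude sketch reproduces the qualitative picture VULKIN resolves in detail: the hotter inner rings are driven into relative compression while the cooler outer rings carry extra tension.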

  4. Self-Shielding Treatment to Perform Cell Calculation for Seed Furl In Th/U Pwr Using Dragon Code

    Directory of Open Access Journals (Sweden)

    Ahmed Amin El Said Abd El Hameed

    2015-08-01

Full Text Available Time and precision of the results are the most important factors for any code used for nuclear calculations. Despite the high accuracy of the Monte Carlo codes MCNP and Serpent, their relatively long computational time in many cases makes it difficult to use either of them as the main calculation code; usually, Monte Carlo codes are used only to benchmark results. The deterministic codes usually used in nuclear reactor calculations have limited precision, due to the approximations in the methods used to solve the multi-group transport equation. Self-shielding treatment, an algorithm that produces an average cross-section defined over the complete energy domain of the neutrons in a nuclear reactor, is responsible for the largest error in any deterministic code. There are mainly two resonance self-shielding models commonly applied: models based on equivalence and dilution, and models based on the subgroup approach. The fundamental problem with any self-shielding method is that it treats each isotope as if no other resonance isotopes were present in the reactor. The most practical way to mitigate this problem is to use many energy groups (50-200) chosen such that all major resonances can be treated without self-shielding. In this paper, we perform cell calculations for a fresh seed fuel pin, as used in thorium/uranium reactors, by solving the 172-group transport equation with the deterministic DRAGON code, for the two types of self-shielding models (equivalence-and-dilution models and subgroup models) using the WIMS-D5 and DRAGON data libraries. The results are then tested by comparison with the stochastic MCNP5 code. We also tested the sensitivity of the results to specific changes in the self-shielding method implemented, for example the effect of applying the Livolant-Jeanpierre normalization scheme and a Riemann-integration improvement on the equivalence-and-dilution method, and the effect of using Ribbon
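The equivalence-and-dilution idea mentioned in this record can be illustrated with a toy one-resonance calculation: the effective cross section is the flux-weighted average of the resonance shape, with a narrow-resonance flux that dips where the cross section peaks. The resonance parameters and the single-level Breit-Wigner shape below are assumed for illustration, not taken from any data library.

```python
import math

# Assumed resonance roughly in the spirit of the U-238 6.7 eV resonance.
E0, GAMMA, SIG_PEAK = 6.67, 0.027, 2.0e4   # eV, eV, barns (illustrative)

def sigma(e):
    """Single-level Breit-Wigner resonance shape (barns)."""
    x = 2.0 * (e - E0) / GAMMA
    return SIG_PEAK / (1.0 + x * x)

def sigma_eff(sig0, n=20000):
    """Flux-weighted effective cross section at background (dilution) sig0.

    Narrow-resonance flux: phi(E) ~ sig0 / ((sigma(E) + sig0) * E).
    """
    lo, hi = E0 - 50.0 * GAMMA, E0 + 50.0 * GAMMA
    de = (hi - lo) / n
    num = den = 0.0
    for i in range(n):
        e = lo + (i + 0.5) * de
        s = sigma(e)
        w = sig0 / ((s + sig0) * e)
        num += s * w * de
        den += w * de
    return num / den
```

At infinite dilution (huge `sig0`) the flux is essentially 1/E and the unshielded average is recovered; at low dilution the flux depression suppresses the resonance contribution, which is exactly the self-shielding effect the equivalence-and-dilution method parametrizes.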

  5. Summary of Conceptual Models and Data Needs to Support the INL Remote-Handled Low-Level Waste Disposal Facility Performance Assessment and Composite Analysis

    International Nuclear Information System (INIS)

    Sondrup, A. Jeff; Schafter, Annette L.; Rood, Arthur S.

    2010-01-01

An overview of the technical approach and the data required to support development of the performance assessment and composite analysis is presented for the remote-handled low-level waste disposal facility on-site alternative being considered at Idaho National Laboratory. Previous analyses and available data that meet the requirements are identified and discussed; outstanding data and analysis needs are also identified and summarized. The on-site disposal facility is being evaluated in anticipation of the closure of the Radioactive Waste Management Complex at the INL. Assessments of facility performance and of composite performance are required to meet the Department of Energy's low-level waste requirements (DOE Order 435.1, 2001), which stipulate that operation and closure of the disposal facility be managed in a manner that is protective of worker and public health and safety, and of the environment. The corresponding procedures established to ensure these protections are contained in DOE Manual 435.1-1, Radioactive Waste Management Manual (DOE M 435.1-1, 2001). Requirements include assessment of (1) all-pathways, (2) air pathway, (3) radon, and (4) groundwater pathway doses. Doses are computed from radionuclide concentrations in the environment. The performance assessment and composite analysis are being prepared to assess compliance with performance objectives, to establish limits on concentrations and inventories of radionuclides at the facility, and to support specification of design, construction, operation, and closure requirements. The technical objectives of the PA and CA are accomplished primarily through the development of an established inventory and through the use of predictive environmental transport models implementing an overarching conceptual framework. This document reviews the conceptual model, its inherent assumptions, and the data required to implement the conceptual model in a numerical framework. Available site-specific data and data sources...

  6. Performance, Accuracy and Efficiency Evaluation of a Three-Dimensional Whole-Core Neutron Transport Code AGENT

    International Nuclear Information System (INIS)

    Jevremovic, Tatjana; Hursin, Mathieu; Satvat, Nader; Hopkins, John; Xiao, Shanjie; Gert, Godfree

    2006-01-01

    as three-dimensional maps of the energy-dependent mesh-wise scalar flux, reaction rate and power peaking factor. The AGENT code is in a process of an extensive and rigorous testing for various reactor types through the evaluation of its performance (ability to model any reactor geometry type), accuracy (in comparison with Monte Carlo results and other deterministic solutions or experimental data) and efficiency (computational speed that is directly determined by the mathematical and numerical solution to the iterative approach of the flux convergence). This paper outlines main aspects of the theories unified into the AGENT code formalism and demonstrates the code performance, accuracy and efficiency using few representative examples. The AGENT code is a main part of the so called virtual reactor system developed for numerical simulations of research reactors. Few illustrative examples of the web interface are briefly outlined. (authors)

  7. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    Energy Technology Data Exchange (ETDEWEB)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.; Dewers, Thomas A.; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Wang, Yifeng; Schultz, Peter Andrew

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analyses to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  8. Design and performance of coded aperture optical elements for the CESR-TA x-ray beam size monitor

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, J.P.; Chatterjee, A.; Conolly, C.; Edwards, E.; Ehrlichman, M.P. [Cornell University, Ithaca, NY 14853 (United States); Flanagan, J.W. [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Department of Accelerator Science, Graduate University for Advanced Studies (SOKENDAI), Tsukuba (Japan); Fontes, E. [Cornell University, Ithaca, NY 14853 (United States); Heltsley, B.K., E-mail: bkh2@cornell.edu [Cornell University, Ithaca, NY 14853 (United States); Lyndaker, A.; Peterson, D.P.; Rider, N.T.; Rubin, D.L.; Seeley, R.; Shanks, J. [Cornell University, Ithaca, NY 14853 (United States)

    2014-12-11

We describe the design and performance of optical elements for an x-ray beam size monitor (xBSM), a device measuring e+ and e− beam sizes in the CESR-TA storage ring. The device can measure vertical beam sizes of 10–100 μm on a turn-by-turn, bunch-by-bunch basis at e± beam energies of ∼2–5 GeV. X-rays produced by a hard-bend magnet pass through a single- or multiple-slit (coded aperture) optical element onto a detector. The coded aperture slit pattern and the thickness of masking material forming that pattern can both be tuned for optimal resolving power. We describe several such optical elements and show how well predictions of simple models track measured performances. - Highlights: • We characterize the optical element performance of an e± x-ray beam size monitor. • We standardize beam size resolving power measurements to reference conditions. • Standardized resolving power measurements compare favorably to model predictions. • Key model features include simulation of photon-counting statistics and image fitting. • Results validate a coded aperture design optimized for the x-ray spectrum encountered.
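The single-slit limit of the measurement described in this record can be sketched numerically: the detector image is (ignoring diffraction and magnification) the beam profile convolved with the slit opening, so the beam size follows from the image width by subtracting the slit contribution in quadrature. All dimensions below are assumed for illustration, not CESR-TA values.

```python
import math

SLIT_UM = 50.0         # assumed slit width (micrometres)
TRUE_SIGMA_UM = 30.0   # assumed true vertical beam size

def image_profile(n=4001, span=400.0):
    """Convolve a Gaussian beam with a top-hat slit on a fine 1:1 grid."""
    dx = span / (n - 1)
    xs = [-span / 2.0 + i * dx for i in range(n)]
    img = []
    for x in xs:
        # midpoint-rule integral of the Gaussian over the slit centred on x
        lo = x - SLIT_UM / 2.0
        steps = 20
        h = SLIT_UM / steps
        val = sum(math.exp(-0.5 * ((lo + (j + 0.5) * h) / TRUE_SIGMA_UM) ** 2) * h
                  for j in range(steps))
        img.append(val)
    return xs, img

def recovered_sigma():
    """Estimate beam size from the image RMS by quadrature subtraction."""
    xs, img = image_profile()
    tot = sum(img)
    mean = sum(x * v for x, v in zip(xs, img)) / tot
    var = sum((x - mean) ** 2 * v for x, v in zip(xs, img)) / tot
    return math.sqrt(var - SLIT_UM ** 2 / 12.0)  # top-hat variance is w^2/12
```

The coded (multi-slit) apertures in the paper generalize this: a richer mask pattern preserves resolving power at higher photon flux, but the size information is still extracted by fitting a modelled image to the measured one.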

  9. Multimodal Code-pairing and Switching of Visual-verbal Texts in Selected Nigerian Stand-up Comedy Performances

    Directory of Open Access Journals (Sweden)

    Mufutau Temitayo Lamidi

    2017-10-01

Full Text Available This study examines multimodal pairing and switching of codes as features of visual-verbal texts and how they are used as strategies for evoking humour in Nigerian stand-up comedy performances, an area that has not attracted much scholarly attention. Data were obtained through purposive random sampling and analysed through content analysis. Six DVDs (Vols. 3, 7, 8 & 28 of Nite of a Thousand Laughs; Vols. 27 & 28 of AY LIVE Happiness Edition) and 6 video clips (downloaded from the Internet), totalling 8 hours and 20 minutes of play, were selected for the study. Incongruity, Layered Meaning and Visual Semiotics serve as the theoretical framework. The study identifies different multimodal strategies used to elicit laughter, such as code-pairing and integration across oral codes, gestures, costume and symbols; intertextuality; incongruous translations/deliberate misinterpretations; and mimicry, quotes and paralanguage. It suggests that these features are also useful in other speech-making events, and concludes that the integration of codes and modes of communication serves as an effective strategy for evoking humour and laughter in stand-up comedy.

  10. ETF system code: composition and applications

    International Nuclear Information System (INIS)

    Reid, R.L.; Wu, K.F.

    1980-01-01

    A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system
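The modular structure described in this record, where each component module can run alone or as part of the full system as long as its input/output interface is unchanged, can be sketched as functions over a shared parameter dictionary. The module names, scaling relations, and coefficients below are invented placeholders, not the ETF code's actual models.

```python
def tf_coil_module(p):
    """Placeholder TF-coil sizing from major radius and field (invented scaling)."""
    mu0 = 4e-7 * 3.141592653589793
    stored_energy = 0.5 * p["b_field"] ** 2 / mu0 * p["major_radius"] ** 3
    return {"tf_stored_energy_J": stored_energy}

def cost_module(p):
    """Placeholder cost model driven by the TF-coil module's output."""
    return {"cost_musd": 100.0 + 1.0e-9 * p["tf_stored_energy_J"]}

def run_system(params, modules):
    """Execute modules in sequence, merging each module's outputs into the state."""
    state = dict(params)
    for mod in modules:
        state.update(mod(state))
    return state
```

A module can be updated or run in isolation for a component study (e.g. a TF-coil strain sensitivity scan) without touching the rest, provided the keys it consumes and produces stay fixed, which is the design point the record makes.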

  11. Performance and Complexity Co-evaluation of the Advanced Video Coding Standard for Cost-Effective Multimedia Communications

    Directory of Open Access Journals (Sweden)

    Saponara Sergio

    2004-01-01

Full Text Available The advanced video codec (AVC) standard, recently defined by a joint video team (JVT) of ITU-T and ISO/IEC, is introduced in this paper together with a co-evaluation of its performance and complexity. While the basic framework is similar to the motion-compensated hybrid scheme of previous video coding standards, additional tools improve the compression efficiency at the expense of an increased implementation cost. As a first step to bridge the gap between the algorithmic design of a complex multimedia system and its cost-effective realization, a high-level co-evaluation approach is proposed and applied to a real-life AVC design. An exhaustive analysis of the codec's compression efficiency versus complexity (memory and computational costs) project space is carried out at the early algorithmic design phase. If all new coding features are used, the improved AVC compression efficiency (up to 50% compared to current video coding technology) comes with a complexity increase of a factor of 2 for the decoder and of more than one order of magnitude for the encoder. This represents a challenge for resource-constrained multimedia systems such as wireless devices or high-volume consumer electronics. The analysis also highlights important properties of the AVC framework that allow for complexity reduction at the high system level: when the new coding features are combined, the implementation complexity accumulates, while the global compression efficiency saturates. Thus, a proper use of the AVC tools maintains the same performance as the most complex configuration while considerably reducing complexity. The reported results provide inputs to assist the profile definition in the standard, highlight the AVC bottlenecks, and select optimal trade-offs between algorithmic performance and complexity.

  12. Transient calculation performance of the MASTER code for control rod ejection problem

    International Nuclear Information System (INIS)

    Cho, B. O.; Joo, H. G.; Yoo, Y. J.; Park, S. Y.; Zee, S. Q.

    1999-01-01

The accuracy and effectiveness of the solution methods of the MASTER code for reactor transient problems were analyzed with a set of NEACRP PWR control rod ejection benchmark problems. A series of sensitivity studies was performed to investigate the effects of the neutronic solution methods and of the neutronic and thermal-hydraulic model parameters on the solution. The MASTER results were then compared with the reference PANTHER results, indicating that the MASTER solution is sufficiently accurate and that the computing time is fast enough for nuclear design applications

  13. Transient calculation performance of the MASTER code for control rod ejection problem

    Energy Technology Data Exchange (ETDEWEB)

    Cho, B. O.; Joo, H. G.; Yoo, Y. J.; Park, S. Y.; Zee, S. Q. [KAERI, Taejon (Korea, Republic of)

    1999-10-01

    The accuracy and the effectiveness of the solution methods of the MASTER code for reactor transient problems were analyzed with a set of NEACRP PWR control rod ejection benchmark problems. A series of sensitivity studies of the effects of the neutronic solution methods and of the neutronic and thermal-hydraulic model parameters on the solution was thus performed. The MASTER results were then compared with the reference PANTHER results. This indicates that the MASTER solution is sufficiently accurate and the computing time is fast enough for nuclear design applications.

  14. Design and Performance Evaluation of Underwater Data Dissemination Strategies using Interference Avoidance and Network Coding

    DEFF Research Database (Denmark)

    Palacios, Raul; Heide, Janus; Fitzek, Frank

    2012-01-01

    constraints and achieve efficient data transmission under water. Network Coding can exploit the broadcast channel to send different information to several receivers simultaneously. With Interference Avoidance the long propagation delay can be used to communicate in full-duplex mode. Alone and combined...... these concepts could increase channel utilisation as well as improve energy efficiency of the network nodes. The main goal is to investigate the potential benefits of new strategies for data dissemination over a string topology scenario. Comprehensive simulations prove the feasibility of Interference Avoidance...

  15. Performance of a neutron transport code with full phase space decomposition on the Cray Research T3D

    International Nuclear Information System (INIS)

    Dorr, M.R.; Salo, E.M.

    1995-01-01

    We present performance results obtained on a 128-node Cray Research T3D computer by a neutron transport code implementing a standard multigroup, discrete ordinates algorithm on a three-dimensional Cartesian grid. After summarizing the implementation strategy used to obtain a full decomposition of phase space (i.e., simultaneous parallelization of the neutron energy, directional and spatial variables), we investigate the scalability of the fundamental source iteration step with respect to each phase space variable. We also describe enhancements that have enabled performance rates approaching 10 gigaflops on the full 128-node machine.

  16. Confidence building on the total system performance assessment code, MASCOT-K for permanent disposal of HLW in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Y. S.; Kim, S. G.; Kang, C. H

    2002-12-01

    To perform a Total System Performance Assessment (TSPA) of a potential HLW repository, it is necessary to develop a TSPA code. Since 1997, KAERI has developed the one-dimensional PSA code MASCOT-K and verified special modules dedicated to the dissolution of spent nuclear fuel. In the second R and D phase, MASCOT-K is once again verified as part of the confidence building for TSPA. The AMBER code, based on a totally different mathematical approach (compartment theory), is used together with MASCOT-K to assess the annual individual doses for given K- and Q-scenarios. Results indicate that both AMBER and MASCOT-K simulate the annual individual doses to a potential biosphere, and that MASCOT-K is more flexible in describing natural barriers, such as a fracture, for sensitivity studies. In the third R and D phase, MASCOT-K will be actively used to check whether the proposed KAERI reference disposal concept is sound.

  17. Confidence building on the total system performance assessment code, MASCOT-K for permanent disposal of HLW in Korea

    International Nuclear Information System (INIS)

    Hwang, Y. S.; Kim, S. G.; Kang, C. H.

    2002-12-01

    To perform a Total System Performance Assessment (TSPA) of a potential HLW repository, it is necessary to develop a TSPA code. Since 1997, KAERI has developed the one-dimensional PSA code MASCOT-K and verified special modules dedicated to the dissolution of spent nuclear fuel. In the second R and D phase, MASCOT-K is once again verified as part of the confidence building for TSPA. The AMBER code, based on a totally different mathematical approach (compartment theory), is used together with MASCOT-K to assess the annual individual doses for given K- and Q-scenarios. Results indicate that both AMBER and MASCOT-K simulate the annual individual doses to a potential biosphere, and that MASCOT-K is more flexible in describing natural barriers, such as a fracture, for sensitivity studies. In the third R and D phase, MASCOT-K will be actively used to check whether the proposed KAERI reference disposal concept is sound.

  18. Zero-forcing pre-coding for MIMO WiMAX transceivers: Performance analysis and implementation issues

    Science.gov (United States)

    Cattoni, A. F.; Le Moullec, Y.; Sacchi, C.

    Next generation wireless communication networks are expected to achieve ever increasing data rates. Multi-User Multiple-Input-Multiple-Output (MU-MIMO) is a key technique to obtain the expected performance, because it combines the high capacity achievable with a MIMO channel with the benefits of space division multiple access. In MU-MIMO systems, the base stations transmit signals to two or more users over the same channel; as a result, every user can experience inter-user interference. This paper provides a capacity analysis of an online, interference-based pre-coding algorithm able to mitigate the multi-user interference of MU-MIMO systems in the context of a realistic WiMAX application scenario. Simulation results show that pre-coding can significantly increase the channel capacity. Furthermore, the paper presents several feasibility considerations for implementation of the analyzed technique in a possible FPGA-based software defined radio.
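
    The zero-forcing idea behind such pre-coding can be sketched numerically. The following NumPy snippet is my own illustration, not code from the paper: the 4x4 channel, the power normalisation, and all variable names are assumptions. It builds the zero-forcing precoder as the right pseudo-inverse of the channel matrix, so that the effective channel becomes diagonal and each user receives only its own symbol:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical 4x4 MU-MIMO downlink: 4 base-station antennas, 4 single-antenna users.
    n_users, n_tx = 4, 4
    H = (rng.standard_normal((n_users, n_tx))
         + 1j * rng.standard_normal((n_users, n_tx))) / np.sqrt(2)

    # Zero-forcing precoder: the right pseudo-inverse of H. Then H @ W is a scaled
    # identity, so each user sees only its own symbol -- no inter-user interference.
    W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
    W /= np.linalg.norm(W)          # normalise total transmit power

    s = rng.standard_normal(n_users) + 1j * rng.standard_normal(n_users)  # user symbols
    y = H @ (W @ s)                 # noise-free received signals

    effective = H @ W               # effective channel seen by the users (diagonal)
    ```

    The well-known drawback is that the inverse amplifies transmit power when the channel is ill-conditioned, which is part of the capacity trade-off an analysis like the one above has to capture.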

  19. Development of three-dimensional neoclassical transport simulation code with high performance Fortran on a vector-parallel computer

    International Nuclear Information System (INIS)

    Satake, Shinsuke; Okamoto, Masao; Nakajima, Noriyoshi; Takamaru, Hisanori

    2005-11-01

    A neoclassical transport simulation code (FORTEC-3D) applicable to three-dimensional configurations has been developed using High Performance Fortran (HPF). The adoption of computing techniques for parallelization, and of a hybrid simulation model for the δf Monte Carlo transport simulation method including non-local transport effects in three-dimensional configurations, makes it possible to simulate the dynamics of global, non-local transport phenomena with a self-consistent radial electric field within a reasonable computation time. In this paper, the development of the transport code using HPF is reported. Optimization techniques used to achieve both high vectorization and parallelization efficiency, the adoption of a parallel random number generator, and benchmark results are shown. (author)

  20. PURDU-WINCOF: A computer code for establishing the performance of a fan-compressor unit with water ingestion

    Science.gov (United States)

    Leonardo, M.; Tsuchiya, T.; Murthy, S. N. B.

    1982-01-01

    A model for predicting the performance of a multi-spool axial-flow compressor with a fan during operation with water ingestion was developed incorporating several two-phase fluid flow effects as follows: (1) ingestion of water, (2) droplet interaction with blades and resulting changes in blade characteristics, (3) redistribution of water and water vapor due to centrifugal action, (4) heat and mass transfer processes, and (5) droplet size adjustment due to mass transfer and mechanical stability considerations. A computer program, called the PURDU-WINCOF code, was generated based on the model utilizing a one-dimensional formulation. An illustrative case serves to show the manner in which the code can be utilized and the nature of the results obtained.

  1. Fuel Rod Performance Evaluation of CE 16 x 16 LTA Operated at Steady State Using Transuranus and Pad Codes

    Energy Technology Data Exchange (ETDEWEB)

    Krasnorutskyy, V.; Slyeptsov, O. [Nuclear Fuel Cycle Science and Technology Establishment (NFCSTE), National Science Center, Kharkhov Institute of Physics and Technology (NSC KIPT), Kharkhov (Ukraine)

    2013-03-15

    The report performed under IAEA research contract No. 15370 describes the results of fuel performance evaluation of PWR fuel rods operated at steady state up to a discharge burnup of ~60 GWD/MTU using the TRANSURANUS code developed by ITU and the PAD code developed by Westinghouse. The experimental results from the US-PWR 16x16 LTA Extended Burnup Demonstration Program presented in the IFPE database of the OECD/NEA have been utilized for assessing the codes themselves during simulation of such properties as rod burnup, cladding corrosion, fuel densification and swelling, cladding irradiation growth and strain, FGR and RIP. The results obtained by PAD showed that the code properly simulates rod burnup, cladding irradiation growth and cladding oxidation with Standard Zr-4 material. The calculated burnup values along the fuel stack vary within ±5% of the rod average burnup. The predicted values of the rod axial growth are (0.88-0.94)% and within the measured ones obtained in the burnup range of (50-60) GWD/MTU. With allowance made for the probability of crud deposition and hot channel hydraulic diameter variation, the axial distribution of the oxide layer is predicted well. For the nominal rod dimensions and operating conditions, the calculated peak oxide thickness is slightly overestimated based on the BE corrosion model parameters. The WEC fuel swelling and densification model, together with the US NRC one incorporated in the code, were used to assess the change in fuel pellet density (Δρ) and fuel volume (ΔV_F/V) vs. burnup, as well as the rod void volume change, ΔV_V/V, and the cladding outer diameter (OD) variation along the fuel stack. (author)

  2. LISA: A performance assessment code for geological repositories of radioactive waste

    International Nuclear Information System (INIS)

    Bertozzi, G.; Saltelli, A.

    1985-01-01

    LISA, developed at JRC-Ispra, is a statistical code which calculates the radiation exposures and risks associated with radionuclide releases from geological repositories of nuclear waste. The assessment methodology is described briefly. It requires that a number of probabilistic components be quantified and introduced in the analysis; the results are thus expressed in terms of risk. The subjective judgment of experts may be necessary to quantify the probabilities of occurrence of rare geological events. Because of large uncertainties in input data, statistical treatment of the Monte Carlo type is utilized for the analysis; thus, the output from LISA is obtained in the form of distributions. A few results of an application to a probabilistic scenario for a repository mined in a clay bed are illustrated.

  3. Development of a code and models for high burnup fuel performance analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kinoshita, M; Kitajima, S [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1997-08-01

    First, high burnup LWR fuel behavior is discussed and the models necessary for the analysis are reviewed. These aspects of behavior are the changes of power history due to the higher enrichment, the temperature feedback due to fission gas release and the resultant degradation of gap conductance, axial fission gas transport in the fuel free volume, fuel conductivity degradation due to fission product solution, and modification of the fuel micro-structure. The models developed for these phenomena, the modifications in the code, and the benchmark results, mainly based on the Risoe fission gas project, are presented. Finally, the rim effect, which is observed only around the fuel periphery, is discussed, focusing on fuel conductivity degradation and swelling due to porosity development. (author). 18 refs, 13 figs, 3 tabs.

  4. High performance reconciliation for continuous-variable quantum key distribution with LDPC code

    Science.gov (United States)

    Lin, Dakai; Huang, Duan; Huang, Peng; Peng, Jinye; Zeng, Guihua

    2015-03-01

    Reconciliation is a significant procedure in a continuous-variable quantum key distribution (CV-QKD) system. It is employed to extract a secure secret key from the string obtained through the quantum channel between two users. However, the efficiency and the speed of previous reconciliation algorithms are low. These problems limit the secure communication distance and the secure key rate of CV-QKD systems. In this paper, we propose a high-speed reconciliation algorithm that employs a well-structured decoding scheme based on low-density parity-check (LDPC) codes. The complexity of the proposed algorithm is reduced considerably. By using a graphics processing unit (GPU) device, our method may reach a reconciliation speed of 25 Mb/s for a CV-QKD system, which is currently the highest level and paves the way to high-speed CV-QKD.
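
    The abstract does not specify the decoding scheme beyond its being LDPC-based. As a generic illustration only, the core decoding step can be sketched with a hard-decision bit-flipping decoder over a tiny parity-check matrix; the (7,4) Hamming matrix below stands in for a real sparse LDPC matrix, and all names and the matrix choice are my assumptions:

    ```python
    import numpy as np

    # Tiny stand-in parity-check matrix: the (7,4) Hamming code.
    # Column j is the binary representation of j+1, so a single-bit error
    # produces a syndrome equal to that column.
    H = np.array([[(j + 1) >> b & 1 for j in range(7)] for b in range(3)])

    def bit_flip_decode(H, r, max_iter=10):
        """Gallager-style hard-decision bit-flipping decoding (illustrative only)."""
        r = r.copy()
        for _ in range(max_iter):
            syndrome = H @ r % 2
            if not syndrome.any():      # all parity checks satisfied
                return r
            # For each bit, count the unsatisfied checks it participates in,
            # then flip the single most suspicious bit.
            counts = syndrome @ H
            r[np.argmax(counts)] ^= 1
        return r

    c = np.array([1, 1, 1, 0, 0, 0, 0])  # a valid codeword (H @ c % 2 == 0)
    r = c.copy()
    r[6] ^= 1                            # one channel error
    decoded = bit_flip_decode(H, r)      # recovers c
    ```

    Production reconcilers use soft-decision belief propagation on much larger matrices, which is what makes GPU acceleration worthwhile; the flow above only shows the syndrome-driven structure.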

  5. Repository seal materials performance for a SALT Repository Project 5-year code/model development plan: Draft

    International Nuclear Information System (INIS)

    1987-06-01

    This document describes an integrated laboratory testing and model development effort for the seal system for a high-level nuclear waste repository in salt. The testing and modeling efforts are designed to determine seal material response in the repository environment, to provide models of seal system components for performance assessment, and to assist in the development of seal system designs. A code/model development and performance analysis program will be performed to predict the short- and long-term response of seal materials and seal components. The results from these analyses will be used to support the material testing activities on this contract and to support performance assessment activities that are conducted in other parts of the Salt Repository Project (SRP). 48 refs., 15 figs., 4 tabs

  6. An accurate evaluation of the performance of asynchronous DS-CDMA systems with zero-correlation-zone coding in Rayleigh fading

    Science.gov (United States)

    Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.

    2010-04-01

    An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance for generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper, and the sequel, new theoretical work is contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system model was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper, and the sequel, support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding with minimal computational complexity.
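
    The record gives no formulas, but the baseline it builds on, the BER of a coherently detected binary signal in flat Rayleigh fading, has a textbook closed form, P_b = 0.5 * (1 - sqrt(g / (1 + g))) with g the average SNR per bit. The sketch below (my own, with assumed function names) checks a Monte-Carlo simulation against that closed form:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def rayleigh_bpsk_ber_mc(snr_db, n=200_000):
        """Monte-Carlo BER of coherent BPSK over flat Rayleigh fading."""
        snr = 10.0 ** (snr_db / 10.0)        # average SNR per bit (linear)
        bits = rng.integers(0, 2, n)
        x = 1.0 - 2.0 * bits                 # BPSK mapping: 0 -> +1, 1 -> -1
        h = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
        noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2 * snr)
        y = h * x + noise
        # Coherent (matched-filter) detection with perfect channel knowledge.
        decided = (np.real(np.conj(h) * y) < 0).astype(int)
        return float(np.mean(decided != bits))

    def rayleigh_bpsk_ber_theory(snr_db):
        """Closed-form average BER: 0.5 * (1 - sqrt(g / (1 + g)))."""
        g = 10.0 ** (snr_db / 10.0)
        return 0.5 * (1.0 - np.sqrt(g / (1.0 + g)))
    ```

    The paper's contribution sits on top of this kind of baseline: accounting for the multiple-access interference of asynchronous spreading codes while keeping the evaluation cheap.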

  7. EXPLORING PROJECT-RELATED FACTORS THAT INFLUENCE LEADERSHIP STYLES AND THEIR EFFECT ON PROJECT PERFORMANCE: A CONCEPTUAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Suzaan Pretorius

    2017-12-01

    Full Text Available It is widely accepted that project leaders should adapt their behaviour to meet the unique leadership demands of a variety of situations. Recently, vertical, shared, and horizontal styles of leadership have gained prominence, especially in the project management literature. Several factors are believed to play a role in determining an appropriate balance between these leadership styles. This theoretical study explores the influence of project types, the stage in the project life cycle, organisational project management maturity, and the level of trust and collaboration between project team members on the appropriate balance of leadership styles in projects. This paper presents a conceptual framework of these factors, while empirical results will be reported on in the sequel to this paper.

  8. Should diagnosis codes from emergency department data be used for case selection for emergency department key performance indicators?

    Science.gov (United States)

    Howell, Stuart C; Wills, Rachael A; Johnston, Trisha C

    2014-02-01

    The aim of the present study was to assess the suitability of emergency department (ED) discharge diagnosis for identifying patient cohorts included in the definitions of key performance indicators (KPIs) that are used to evaluate ED performance. Hospital inpatient episodes of care with a principal diagnosis that corresponded to an ED-defined KPI were extracted from the Queensland Hospital Admitted Patient Data Collection (QHAPDC) for the year 2010-2011. The data were then linked to the corresponding ED patient record and the diagnoses applied in the two settings were compared. The asthma and injury cohorts produced favourable results with respect to matching the QHAPDC principal diagnosis with the ED discharge diagnosis. The results were generally modest when the QHAPDC principal diagnosis was upper respiratory tract infection, poisoning and toxic effects or a mental health diagnosis, and were quite poor for influenza. There is substantial variation in the capture of patient cohorts using discharge diagnosis as recorded on Queensland Hospital Emergency Department data. WHAT IS KNOWN ABOUT THE TOPIC? There are several existing KPIs that are defined according to the diagnosis recorded on ED data collections. However, there have been concerns over the quality of ED diagnosis in Queensland and other jurisdictions, and the value of these data in identifying patient cohorts for the purpose of assessing ED performance remains uncertain. WHAT DOES THIS PAPER ADD? This paper identifies diagnosis codes that are suitable for use in capturing the patient cohorts that are used to evaluate ED performance, as well as those codes that may be of limited value. WHAT ARE THE IMPLICATIONS FOR PRACTITIONERS? The limitations of diagnosis codes within ED data should be understood by those seeking to use these data items for healthcare planning and management or for research into healthcare quality and outcomes.

  9. Coding task performance in early adolescence: A large-scale controlled study into boy-girl differences

    Directory of Open Access Journals (Sweden)

    Sanne eDekker

    2013-08-01

    Full Text Available This study examined differences between boys and girls in the efficiency of information processing in early adolescence. 306 healthy adolescents (50.3% boys) in grades 7 and 9 (aged 13 and 15, respectively) performed a coding task based on over-learned symbols. An age effect was revealed, as subjects in grade 9 performed better than subjects in grade 7. Main effects for sex were found in favor of girls: the 25% best-performing students comprised twice as many girls as boys, and the opposite pattern was found for the worst-performing 25%. In addition, a main effect was found for educational track in favor of the highest track. No interaction effects were found. School grades did not explain additional variance in LDST performance. This indicates that cognitive performance is relatively independent of school performance. Student characteristics like age, sex and education level were more important for the efficiency of information processing than school performance. The findings imply that after age 13, efficiency of information processing is still developing and that girls outperform boys in this respect. The findings provide new information on the mechanisms underlying boy-girl differences in scholastic performance.

  10. A proposed conceptual model of green practices impacting on the tourism business and their performances – A case of Phu Quoc island Vietnam

    Directory of Open Access Journals (Sweden)

    Vu Minh Hieu

    2017-06-01

    Full Text Available The purpose of this article is to propose a conceptual green practices model for the tourism industry of Phu Quoc island (a destination in Vietnam). The model is developed with the purpose of providing direction for researchers to empirically examine relationships among demographic variables, innovation characteristics, performance expectancy, social influence, facilitating conditions and effort expectancy, funding availability and environment, and business performance. This study uses secondary research data collected from different sources such as books, journals, research papers and other online and print media (publications on the subject). The main method used in this study is content review and analysis. The author suggests that an empirical study should be done to confirm whether the hypothesized relationships among the variables exist or need to be changed to suit the current conditions of the destination in order to increase business performance. This model is expected to contribute to the theory of tourism and to be applied to Phu Quoc island.

  11. Specification of PWR UO2 pellet design parameters with the fuel performance code FRAPCON-1

    International Nuclear Information System (INIS)

    Silva, A.T.; Marra Neto, A.

    1988-08-01

    UO2 pellet design parameters are analysed to verify their influence on the fuel's basic properties and on its performance under irradiation in pressurized water reactors. Three groups of parameters are discussed: 1) content of fissionable and impurity materials; 2) stoichiometry; 3) density, pore morphology, and microstructure. A methodology for specifying these parameters is applied with the fuel performance program FRAPCON-1. (author)

  12. Inter-comparison of Computer Codes for TRISO-based Fuel Micro-Modeling and Performance Assessment

    International Nuclear Information System (INIS)

    Boer, Brian; Keun Jo, Chang; Wu, Wen; Ougouag, Abderrafi M.; McEachren, Donald; Venneri, Francesco

    2010-01-01

    The Next Generation Nuclear Plant (NGNP), the Deep Burn Pebble Bed Reactor (DB-PBR) and the Deep Burn Prismatic Block Reactor (DB-PMR) are all based on fuels that use TRISO particles as their fundamental constituent. The TRISO particle properties include very high durability in radiation environments, hence the designs' reliance on the TRISO particle to form the principal barrier to radioactive material release. This durability forms the basis for the selection of this fuel type for applications such as Deep Burn (DB), which require exposures up to four times those expected for light water reactors. It follows that the study and prediction of the durability of TRISO particles must be carried out as part of the safety and overall performance characterization of all the designs mentioned above. Such evaluations have been carried out independently by the performers of the DB project using independently developed codes. These codes, PASTA, PISA and COPA, incorporate models for stress analysis of the various layers of the TRISO particle (and of the intervening matrix material for some of them), models for fission product release, migration and accumulation within the SiC layer of the TRISO particle, just next to the layer, models for free oxygen and CO formation and migration to the same location, models for temperature field modeling within the various layers of the TRISO particle, and models for the prediction of failure rates. All these models may be either internal to the code or external. This large number of models, the possibility of different constitutive data and model formulations, and the possibility of a variety of solution techniques make it highly unlikely that the codes would give identical results in the modeling of identical situations. The purpose of this paper is to present the results of an inter-comparison between the codes and to identify areas of agreement and areas that need reconciliation. The inter-comparison has been carried out by the cooperating

  13. CQUESTRA, a risk and performance assessment code for geological sequestration of carbon dioxide

    International Nuclear Information System (INIS)

    LeNeveu, D.M.

    2008-01-01

    A computationally efficient semi-analytical code, CQUESTRA, has been developed for probabilistic risk assessment and rapid screening of potential sites for geological sequestration of carbon dioxide. The rate of dissolution and leakage from a trapped underground pool of carbon dioxide is determined. The trapped carbon dioxide could be mixed with hydrocarbons and other components to form a buoyant phase. The program considers potential mechanisms for escape from the geological formations, such as the movement of the buoyant phase through failed seals in wellbores, the annulus around wellbores and open fractures in the caprock. Plume animations of dissolved carbon dioxide in formation water around the wellbores are provided. Solubility, density and viscosity of the buoyant phase are determined by equations of state. Advection, dispersion, diffusion, buoyancy, aquifer flow rates and local formation fluid pressure are taken into account in the modeling of the carbon dioxide movement. Results from a hypothetical example simulation based on data from the Williston basin near Weyburn, Saskatchewan, indicate that this site is potentially a viable candidate for carbon dioxide sequestration. Sensitivity analysis of CQUESTRA indicates that criteria such as siting below aquifers with large flow rates and siting in reservoirs having fluid pressure below the pressure of the formations above can promote complete dissolution of the carbon dioxide during movement toward the surface, thereby preventing release to the biosphere. Formation of very small carbon dioxide bubbles within the fluid in the wellbores can also lead to complete dissolution.

  14. Mathematical model and computer code for coated particles performance at normal operating conditions

    International Nuclear Information System (INIS)

    Golubev, I.; Kadarmetov, I.; Makarov, V.

    2002-01-01

    Computer modeling of the thermo-mechanical behavior of coated particles during operation, both at normal and off-normal conditions, plays a very significant role, particularly at the stage of new reactor development. In Russia, extensive experience has been accumulated in the fabrication and reactor testing of CP and fuel elements with UO2 kernels. However, this experience cannot be used in full for the development of the new reactor installation GT-MHR. This is due to the very deep burn-up of the fuel based on plutonium oxide (up to 70% fima). Therefore, mathematical modeling of CP thermal-mechanical behavior and failure prediction becomes particularly important. The authors clearly understand that the serviceability of fuel with high burn-ups is defined not only by thermo-mechanics, but also by structural changes in coating materials, the thermodynamics of chemical processes, the 'amoeba effect', CO formation, etc. In the report, the first steps in the development of an integrated code for numerical modeling of coated particle behavior are presented, together with some calculation results concerning the influence of various design parameters on fuel coated particle endurance for GT-MHR normal operating conditions. A failure model is developed to predict the failure fraction of TRISO-coated particles. In this model it is assumed that the failure of a CP depends not only on the probability of SiC-layer fracture but also on damage to the PyC layers. The coated particle is considered as a unified structure. (author)

  15. A comparative study of fission gas behaviour in UO2 and MOX fuels using the meteor fuel performance code

    International Nuclear Information System (INIS)

    Struzik, C.; Garcia, Ph.; Noirot, L.

    2002-01-01

    The paper reviews some of the fission-gas-related differences observed between MOX MIMAS AUC fuels and homogeneous UO 2 fuels. Under steady-state conditions, the apparently higher fractional release in MOX fuels is interpreted with the METEOR fuel performance code as a consequence of their lower thermal conductivity and the higher linear heat rates to which MOX fuel rods are subjected. Although more fundamental diffusion data are still needed, the apparently greater swelling of MOX fuel rods at higher linear heat rates can be ascribed to enhanced diffusion properties. (authors)

  16. SIEX3: A correlated computer code for prediction of fast reactor mixed oxide fuel and blanket pin performance

    International Nuclear Information System (INIS)

    Baker, R.B.; Wilson, D.R.

    1986-04-01

    The SIEX3 computer program was developed to calculate the fuel and cladding performance of oxide fuel and oxide blanket pins irradiated in the fast neutron environment of a liquid metal cooled reactor. The code is uniquely designed to be accurate yet quick running and use a minimum of computer core storage. This was accomplished through the correlation of physically based models to very large data bases of irradiation test results. Data from over 200 fuel pins and over 800 transverse fuel microscopy samples were used in the calibrations

  17. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  18. The performance test of anti-scattering x-ray grid with inclined shielding material by MCNP code simulation

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Jun Woo; Kim, Hee Reyoung [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of)

    2016-06-15

    Scattered photons reduce the contrast of the radiographic image, which degrades image quality. In order to acquire a better-quality image, an anti-scattering X-ray grid should be fitted in the radiography system. An X-ray anti-scattering grid of the inclined type, based on a hybrid of the parallel and focused types, was tested with the MCNP code. MCNPX 2.7.0 was used for the simulation-based test. The test geometry was based on IEC 60627, the international standard for diagnostic X-ray imaging equipment - characteristics of general-purpose and mammographic anti-scatter grids. The performance of grids with four inclined shielding material types was compared with that of the parallel type. The grid of the completely tapered type showed the best performance, with little performance difference according to the degree of inclination.

  19. Development of methodology to construct a generic conceptual model of river-valley evolution for performance assessment of HLW geological disposal

    International Nuclear Information System (INIS)

    Kawamura, Makoto; Tanikawa, Shin-ichi; Yasue, Ken-ichi; Niizato, Tadafumi

    2011-01-01

    In order to assess the long-term safety of a geological disposal system for high-level radioactive waste (HLW), it is important to consider the impact of uplift and erosion, which cannot be precluded on a timescale in the order of several hundred thousand years for many locations in Japan. Geomorphic evolution, caused by uplift and erosion and coupled to climatic and sea-level changes, will impact the geological disposal system due to resulting spatial and temporal changes in the disposal environment. Degradation of HLW barrier performance will be particularly significant when the remnant repository structures near, and are eventually exposed at, the ground surface. In previous studies, fluvial erosion was identified as the key concern in most settings in Japan. Interpretation of the impact of the phenomena at relevant locations in Japan has led to development of a generic conceptual model which contains the features typical of the middle reaches of rivers. Here, therefore, we present a methodology for development of a generic conceptual model based on best current understanding of fluvial erosion in Japan, which identifies the simplifications and uncertainties involved and assesses their consequences in the context of repository performance. (author)

  20. LEARNING ALGORITHM EFFECT ON MULTILAYER FEED FORWARD ARTIFICIAL NEURAL NETWORK PERFORMANCE IN IMAGE CODING

    Directory of Open Access Journals (Sweden)

    OMER MAHMOUD

    2007-08-01

    One of the essential factors that affect the performance of Artificial Neural Networks is the learning algorithm. The performance of a Multilayer Feed-Forward Artificial Neural Network in image compression using different learning algorithms is examined in this paper. Based on Gradient Descent, Conjugate Gradient and Quasi-Newton techniques, three different error back-propagation algorithms have been developed for use in training two types of neural networks: a single-hidden-layer network and a three-hidden-layer network. The essence of this study is to investigate the most efficient and effective training methods for use in image compression and its subsequent applications. The obtained results show that the Quasi-Newton based algorithm has better performance as compared to the other two algorithms.
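    As a sketch of the setup this abstract describes, the following trains a tiny bottleneck autoencoder on random "image blocks" by error back-propagation. The paper compares Gradient Descent, Conjugate Gradient and Quasi-Newton training; this minimal example shows only the gradient-descent baseline, and all sizes, data and hyperparameters are illustrative assumptions, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 16))        # 200 illustrative "image blocks"
    n_in, n_hid = 16, 4                       # 4:1 compression at the bottleneck

    W1 = rng.standard_normal((n_in, n_hid)) * 0.1   # encoder weights
    W2 = rng.standard_normal((n_hid, n_in)) * 0.1   # decoder weights

    def forward(X, W1, W2):
        H = np.tanh(X @ W1)                   # compressed hidden representation
        return H, H @ W2                      # reconstruction

    def mse(R, X):
        return float(np.mean((R - X) ** 2))

    loss0 = mse(forward(X, W1, W2)[1], X)

    lr = 0.2
    for _ in range(1000):                     # error back-propagation (plain GD)
        H, R = forward(X, W1, W2)
        dR = 2.0 * (R - X) / X.size           # dLoss/dReconstruction
        gW2 = H.T @ dR
        dH = (dR @ W2.T) * (1.0 - H ** 2)     # back-prop through tanh
        gW1 = X.T @ dH
        W1 -= lr * gW1
        W2 -= lr * gW2

    loss_final = mse(forward(X, W1, W2)[1], X)
    ```

    Compression here is the hidden layer itself: storing 4 hidden values per 16-pixel block instead of the pixels. Swapping the update loop for a Conjugate Gradient or Quasi-Newton step is what the paper's comparison is about.
    
    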

  1. Compliance Performance: Effects of a Provider Incentive Program and Coding Compliance Plan

    National Research Council Canada - National Science Library

    Tudela, Joseph A

    2004-01-01

    The purpose of this project is to study provider- and coder-related performance, i.e., provider compliance rates, coder productivity and accuracy rates, and the average dollar difference between coder and auditor, at Brooke Army Medical Center...

  2. Analysis of UO{sub 2}-BeO fuel under transient using fuel performance code

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Daniel S.; Abe, Alfredo Y.; Muniz, Rafael O.R.; Giovedi, Claudia, E-mail: dsgomes@ipen.br, E-mail: alfredo@ctmsp.mar.mil.br, E-mail: rafael.orm@gmail.com, E-mail: claudia.giovedi@ctmsp.mar.mil.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Universidade de São Paulo (USP), São Paulo, SP (Brazil). Departamento de Engenharia Naval e Oceânica

    2017-11-01

    Recent research has pointed to the need to replace the classic fuel concept used in light water reactors. Uranium dioxide has a weak point in its low thermal conductivity, which produces high temperatures in the fuel. The ceramic composite fuel formed of uranium dioxide (UO{sub 2}) with the addition of beryllium oxide (BeO) presents high thermal conductivity compared with UO{sub 2}. The oxidation of zirconium generates hydrogen gas that can create a detonation condition. One of the preferred options is the ferritic alloys formed of iron-chromium-aluminum (FeCrAl), which should avoid hydrogen release due to oxidation. In general, the FeCrAl alloys contain 10 - 20Cr, 3 - 5Al, and 0 - 0.12Y in weight percent. The FeCrAl alloys should exhibit slow oxidation kinetics owing to their chemical composition: resistance to oxidation in the presence of steam improves as a function of the chromium and aluminum content. The thermal and mechanical properties of the UO{sub 2}-BeO-10%vol composite fuel were therefore coupled with those of FeCrAl alloys and added to the fuel codes. In this work, we examine the fuel rod behavior of UO{sub 2}-10%vol-BeO/FeCrAl, including a simulated reactivity transient. The fuel behavior showed reduced temperatures with UO{sub 2}-BeO/Zr; UO{sub 2}-BeO/FeCrAl was also compared with the UO{sub 2}/Zr system. The reactivity-initiated accident case analyzed reproduced the fuel rod called VA-1 using UO{sub 2}/Zr alloys and compared it with UO{sub 2}-BeO/FeCrAl. (author)

  3. Perceptual Processing Affects Conceptual Processing

    Science.gov (United States)

    van Dantzig, Saskia; Pecher, Diane; Zeelenberg, Rene; Barsalou, Lawrence W.

    2008-01-01

    According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task…

  4. Performance Analysis of Korean Liquid metal type TBM based on Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, C. H.; Han, B. S.; Park, H. J.; Park, D. K. [Seoul National Univ., Seoul (Korea, Republic of)

    2007-01-15

    The objective of this project is to analyze the nuclear performance of the Korean HCML (Helium Cooled Molten Lithium) TBM (Test Blanket Module) which will be installed in ITER (International Thermonuclear Experimental Reactor). This project is intended to analyze the neutronic design and nuclear performance of the Korean HCML ITER TBM through transport calculations with MCCARD. In detail, we conduct numerical experiments for analyzing the neutronic design of the Korean HCML TBM and the DEMO fusion blanket, and for improving their nuclear performance. The results of the numerical experiments performed in this project will be utilized further for a design optimization of the Korean HCML TBM. In this project, Monte Carlo transport calculations for evaluating TBR (Tritium Breeding Ratio) and EMF (Energy Multiplication Factor) were conducted to analyze the nuclear performance of the Korean HCML TBM. The activation characteristics and shielding performance of the Korean HCML TBM were analyzed using ORIGEN and MCCARD. We proposed neutronic methodologies for analyzing the nuclear characteristics of the fusion blanket, which were applied to the blanket analysis of a DEMO fusion reactor. In the results, the TBR of the Korean HCML ITER TBM is 0.1352 and the EMF is 1.362. Taking into account the limitation on the Li amount in an ITER TBM, it is expected that the tritium self-sufficiency condition can be satisfied through a change of the Li quantity and enrichment. In the results of the activation and shielding analysis, the activity drops to 1.5% of the initial value and the decay heat drops to 0.02% of the initial amount after 10 years from plasma shutdown.

  5. Performance Study of Monte Carlo Codes on Xeon Phi Coprocessors — Testing MCNP 6.1 and Profiling ARCHER Geometry Module on the FS7ONNi Problem

    Science.gov (United States)

    Liu, Tianyu; Wolfe, Noah; Lin, Hui; Zieb, Kris; Ji, Wei; Caracappa, Peter; Carothers, Christopher; Xu, X. George

    2017-09-01

    This paper contains two parts revolving around Monte Carlo transport simulation on Intel Many Integrated Core coprocessors (MIC, also known as Xeon Phi). (1) MCNP 6.1 was recompiled into multithreading (OpenMP) and multiprocessing (MPI) forms respectively without modification to the source code. The new codes were tested on a 60-core 5110P MIC. The test case was FS7ONNi, a radiation shielding problem used in MCNP's verification and validation suite. It was observed that both codes became slower on the MIC than on a 6-core X5650 CPU, by a factor of 4 for the MPI code and, abnormally, 20 for the OpenMP code, and both exhibited limited capability of strong scaling. (2) We have recently added a Constructive Solid Geometry (CSG) module to our ARCHER code to provide better support for geometry modelling in radiation shielding simulation. The functions of this module are frequently called in the particle random walk process. To identify the performance bottleneck we developed a CSG proxy application and profiled the code using the geometry data from FS7ONNi. The profiling data showed that the code was primarily memory latency bound on the MIC. This study suggests that despite low initial porting effort, Monte Carlo codes do not naturally lend themselves to the MIC platform, just as with GPUs, and that the memory latency problem needs to be addressed in order to achieve a decent performance gain.

  6. Performance investigation of the pulse and Campbelling modes of a fission chamber using a Poisson pulse train simulation code

    Energy Technology Data Exchange (ETDEWEB)

    Elter, Zs. [CEA, DEN, DER, Instrumentation, Sensors and Dosimetry Laboratory, Cadarache, F-13108 Saint-Paul-lez-Durance (France); Chalmers University of Technology, Department of Applied Physics, Division of Nuclear Engineering, SE-412 96 Göteborg (Sweden); Jammes, C., E-mail: christian.jammes@cea.fr [CEA, DEN, DER, Instrumentation, Sensors and Dosimetry Laboratory, Cadarache, F-13108 Saint-Paul-lez-Durance (France); Pázsit, I. [Chalmers University of Technology, Department of Applied Physics, Division of Nuclear Engineering, SE-412 96 Göteborg (Sweden); Pál, L. [Centre for Energy Research, Hungarian Academy of Sciences, H-1525 Budapest 114, POB 49 (Hungary); Filliatre, P. [CEA, DEN, DER, Instrumentation, Sensors and Dosimetry Laboratory, Cadarache, F-13108 Saint-Paul-lez-Durance (France)

    2015-02-21

    The detectors of the neutron flux monitoring system of the foreseen French GEN-IV sodium-cooled fast reactor (SFR) will be high temperature fission chambers placed in the reactor vessel in the vicinity of the core. The operation of a fission chamber over a wide range of neutron flux will be feasible provided that the applicability ranges of its pulse and Campbelling operational modes overlap. This paper addresses the question of the linearity of these two modes and also presents our recent efforts to develop a specific code for the simulation of fission chamber pulse trains. The developed simulation code is described and its overall verification is shown. An extensive quantitative investigation was performed to explore the applicability limits of these two standard modes. It was found that for short pulses the overlap between the pulse and Campbelling modes can be guaranteed if the standard deviation of the background noise is not higher than 5% of the pulse amplitude. It was also shown that the Campbelling mode is sensitive to parasitic noise, while the performance of the pulse mode is affected by the stochastic amplitude distributions.
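    The two modes can be illustrated with a toy Poisson pulse-train model. In pulse mode, events are counted directly; in Campbelling mode, the rate is recovered from the signal variance via Campbell's theorem, var(s) = r ∫ f(t)² dt. The rate, pulse shape and sampling settings below are invented for illustration and are not the paper's simulation parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    rate = 5.0e5                     # true event rate in counts/s (illustrative)
    T = 0.01                         # observation window, s
    dt = 1.0e-8                      # sampling step, s
    n = int(T / dt)
    pulse_len = 10                   # unit-amplitude rectangular pulse, 100 ns

    # Homogeneous Poisson arrivals, expressed as sample indices
    n_events = rng.poisson(rate * T)
    arrivals = rng.integers(0, n, n_events)

    signal = np.zeros(n + pulse_len)
    for t0 in arrivals:
        signal[t0:t0 + pulse_len] += 1.0   # overlapping pulses pile up linearly
    signal = signal[:n]

    # Pulse mode: count the events directly
    rate_pulse = n_events / T

    # Campbelling mode: var(s) = rate * integral of f(t)^2 dt
    int_f2 = pulse_len * dt                # integral of f^2 for a unit rectangle
    rate_campbell = float(np.var(signal)) / int_f2
    ```

    Both estimates land within a few percent of the true rate here; in a real chamber the Campbelling mode takes over where pile-up makes individual pulses uncountable.
    
    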

  7. High performance fiber reinforced concrete : Progress in knowledge and design codes

    NARCIS (Netherlands)

    Walraven, J.C.

    2009-01-01

    High performance fiber reinforced concrete is quickly developing into a modern structural material with high potential. As testified, for instance, by the recent symposium on HPFRC in Kassel, Germany (April 2008), the number of structural applications is increasing. At this moment studies are carried out

  8. ITER conceptual design

    International Nuclear Information System (INIS)

    Tomabechi, K.; Gilleland, J.R.; Sokolov, Yu.A.; Toschi, R.

    1991-01-01

    The Conceptual Design Activities of the International Thermonuclear Experimental Reactor (ITER) were carried out jointly by the European Community, Japan, the Soviet Union and the United States of America, under the auspices of the International Atomic Energy Agency. The European Community provided the site for joint work sessions at the Max-Planck-Institut fuer Plasmaphysik in Garching, Germany. The Conceptual Design Activities began in the spring of 1988 and ended in December 1990. The objectives of the activities were to develop the design of ITER, to perform a safety and environmental analysis, to define the site requirements as well as the future research and development needs, to estimate the cost and manpower, and to prepare a schedule for detailed engineering design, construction and operation. On the basis of the investigation and analysis performed, a concept of ITER was developed which incorporated maximum flexibility of the performance of the device and allowed a variety of operating scenarios to be adopted. The heart of the machine is a tokamak having a plasma major radius of 6 m, a plasma minor radius of 2.15 m, a nominal plasma current of 22 MA and a nominal fusion power of 1 GW. The conceptual design can meet the technical objectives of the ITER programme. Because of the success of the Conceptual Design Activities, the Parties are now considering the implementation of the next phase, called the Engineering Design Activities. (author). Refs, figs and tabs

  9. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    a computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other, is a mix of a computer programming syntax and human language. In this sense queer code can be understood as both an object and subject of study that intervenes in the world’s ‘becoming' and how material bodies are produced via human and nonhuman practices. Through mixing the natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  10. PHENIX Conceptual Design Report. An experiment to be performed at the Brookhaven National Laboratory Relativistic Heavy Ion Collider

    Energy Technology Data Exchange (ETDEWEB)

    Nagamiya, Shoji; Aronson, Samuel H.; Young, Glenn R.; Paffrath, Leo

    1993-01-29

    The PHENIX Conceptual Design Report (CDR) describes the detector design of the PHENIX experiment for Day-1 operation at the Relativistic Heavy Ion Collider (RHIC). The CDR presents the physics capabilities, technical details, cost estimate, construction schedule, funding profile, management structure, and possible upgrade paths of the PHENIX experiment. The primary goals of the PHENIX experiment are to detect the quark-gluon plasma (QGP) and to measure its properties. Many of the potential signatures for the QGP are measured as a function of a well-defined common variable to see if any or all of these signatures show a simultaneous anomaly due to the formation of the QGP. In addition, basic quantum chromodynamics phenomena, collision dynamics, and thermodynamic features of the initial states of the collision are studied. To achieve these goals, the PHENIX experiment measures lepton pairs (dielectrons and dimuons) to study various properties of vector mesons, such as the mass, the width, and the degree of yield suppression due to the formation of the QGP. The effect of thermal radiation on the continuum is studied in different regions of rapidity and mass. The e{mu} coincidence is measured to study charm production, and aids in understanding the shape of the continuum dilepton spectrum. Photons are measured to study direct emission of single photons and to study {pi}{sup 0} and {eta} production. Charged hadrons are identified to study the spectrum shape, production of antinuclei, the {phi} meson (via K{sup +}K{sup {minus}} decay), jets, and two-boson correlations. The measurements are made down to small cross sections to allow the study of high p{sub T} spectra, and J/{psi} and {Upsilon} production. The PHENIX collaboration consists of over 300 scientists, engineers, and graduate students from 43 institutions in 10 countries. This large international collaboration is supported by US resources and significant foreign resources.

  11. Of norms and words: in order to think school in a performative code

    Directory of Open Access Journals (Sweden)

    Yuing Alfaro Tuillang

    2013-01-01

    This article explores the hypothesis that the performative character of norms offers many important points of departure for thinking about school. First, Canguilhem's contributions on how normative development works, and how it relates to the historical institutional processes in which school plays a key role, are reviewed. In addition, Foucault's and Esposito's inquiries, which complement these contributions in the institutional field by showing the productive and performative character of norms, are drawn upon. Finally, some contributions by Rancière on the explanatory order and the prominence of the teacher's word as basis and support of school institutions are analyzed. Instead of conclusions, the convergence of these ideas makes it possible to put forward suggestions that generate perspectives of analysis for thinking pedagogy and schooling from a standpoint in which the constitutive value of norms sheds light on the crossing of relations between knowledge and power.

  12. Validation of the solar heating and cooling high speed performance (HISPER) computer code

    Science.gov (United States)

    Wallace, D. B.

    1980-01-01

    Developed to give quick and accurate predictions, HISPER, a simplification of the TRNSYS program, achieves its computational speed by not simulating detailed system operations or performing detailed load computations. In order to validate the HISPER computer code for air systems, the simulation was compared to the actual performance of an operational test site. Solar insolation, ambient temperature, water usage rate, and water main temperatures from the data tapes for an office building in Huntsville, Alabama were used as input. The HISPER program was found to predict the heating loads and the solar fraction of the loads with errors of less than ten percent. Good correlation was found on both a seasonal basis and a monthly basis. Several parameters (such as the infiltration rate and the outside ambient temperature above which heating is not required) were found to require careful selection for accurate simulation.

  13. Development of Tokamak Reactor System Code and Performance for Early Realization of DEMO

    International Nuclear Information System (INIS)

    Hong, B. G.; Lee, D. W.; Kim, Y.

    2006-01-01

    To develop the concepts of DEMO and identify the design parameters, the dependence on performance objectives, design features, and physical and technical constraints has to be considered. System analyses are necessary to find device variables which optimize figures of merit such as major radius, ignition margin, divertor heat load, neutron wall load, etc. The demonstration fusion power plant, DEMO, is regarded as the last step before the development of a commercial fusion reactor in the Korea National Basic Plan for the Development of Fusion Energy. The DEMO should demonstrate net electric power generation, tritium self-sufficiency, and the safety aspects of a power plant. The performance of a DEMO for early realization has been investigated with a limited extension from the plasma physics and technology of the 2nd phase of ITER operation (EPP phase)

  14. Prediction of BWR performance under the influence of Isolation Condenser-using RAMONA-4 code

    International Nuclear Information System (INIS)

    Khan, H.J.; Cheng, H.S.; Rohatgi, U.S.

    1992-01-01

    The purpose of the Boiling Water Reactor (BWR) Isolation Condenser (IC) is to passively control the reactor pressure by removing heat from the system. This type of control is expected to reduce the frequency of opening and closing of the Safety Relief Valves (SRVs). A comparative analysis is done for a BWR operating with and without the influence of an IC under Main Steam Isolation Valve (MSIV) closure. A regular BWR, with forced flow and high thermal power, has been considered for analysis. In addition, the effect of ICs on the BWR performance is studied for natural convection flow at lower power and modified riser geometry. The IC is coupled to the steam dome for the steam inlet flow and to the Reactor Pressure Vessel (RPV) near the feed water entrance for the condensate return flow. Transient calculations are performed using prescribed pressure set points for the SRVs and given time settings for MSIV closure. The effect of the IC on the forced flow is to reduce the rate of pressure rise and thereby decrease the cycling frequency of the SRVs. This is the primary objective of any operating IC in a BWR (e.g. Oyster Creek). The responses of the reactor thermal and fission power, steam flow rate, collapsed liquid level, and core average void fraction are found to agree with the trend of pressure. The variations in the case of an active IC can be closely related to the creation of a time lag and changes in the cycling frequency of the SRVs. An analysis for natural convection flow in a BWR indicates that the effect of an IC on its transient performance is similar to that for the forced convection system. In this case, the MSIV closure has resulted in a lower peak pressure due to the reduced power. However, the effect of the reduced cycling frequency of the SRVs due to the IC, and the time lag between the events, are comparable to those for forced convection

  15. Performance of an improved logarithmic phase mask with optimized parameters in a wavefront-coding system.

    Science.gov (United States)

    Zhao, Hui; Li, Yingcai

    2010-01-10

    In two papers [Proc. SPIE 4471, 272-280 (2001) and Appl. Opt. 43, 2709-2721 (2004)], a logarithmic phase mask was proposed and proved to be effective in extending the depth of field; however, according to our research, this mask is not perfect, because the corresponding defocused modulation transfer function has large oscillations in the low-frequency region, even when the mask is optimized. So, in a previously published paper [Opt. Lett. 33, 1171-1173 (2008)], we proposed an improved logarithmic phase mask by making a small modification. The new mask can not only eliminate the drawbacks to a certain extent but is also even less sensitive to focus errors according to Fisher information criteria. However, the performance comparison was carried out without the modified mask being optimized, which was not reasonable. In this manuscript, we optimize the modified logarithmic phase mask before analyzing its performance, and more convincing results have been obtained based on the analysis of several frequently used metrics.
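    The defocus robustness at stake here can be sketched numerically. Since the improved logarithmic mask profile is not reproduced in this record, the example below uses a generic cubic phase mask as a stand-in, and the pupil sampling, mask strength and defocus values are all illustrative assumptions. It compares how much the 1-D MTF changes under defocus with and without the phase mask.

    ```python
    import numpy as np

    N = 256
    x = np.linspace(-1.0, 1.0, N)            # normalized 1-D pupil coordinate

    def mtf(mask_phase, defocus):
        # generalized pupil = phase mask + quadratic defocus aberration
        pupil = np.exp(1j * (mask_phase + defocus * x ** 2))
        otf = np.correlate(pupil, pupil, mode="full")  # pupil autocorrelation
        m = np.abs(otf)
        return m / m.max()                   # normalized MTF

    alpha = 50.0                             # mask strength (illustrative)
    cubic = alpha * x ** 3                   # cubic phase mask, stand-in profile
    flat = np.zeros_like(x)                  # no mask: conventional system

    psi = 10.0                               # defocus parameter (illustrative)
    shift_plain = np.linalg.norm(mtf(flat, 0.0) - mtf(flat, psi))
    shift_coded = np.linalg.norm(mtf(cubic, 0.0) - mtf(cubic, psi))
    ```

    The coded MTF moves far less under defocus than the conventional one, which is the property both the original and the improved logarithmic masks are designed to optimize.
    
    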

  16. Effect of beat noise on the performance of two-dimensional time-spreading/wavelength-hopping optical code-division multiple-access systems

    Science.gov (United States)

    Bazan, T.; Harle, D.; Andonovic, I.; Meenakshi, M.

    2005-03-01

    The effect of beat noise on optical code-division multiple-access (OCDMA) systems using a range of two-dimensional (2-D) time-spreading/wavelength-hopping (TW) code families is presented. A derivation of a general formula for the error probability of the system is given. The properties of the 2-D codes, namely their structure, length, and cross-correlation characteristics, are found to have a great influence on system performance. Improved performance can be obtained by use of real-time dynamic thresholding.
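    The cross-correlation property that drives this result can be made concrete with a small numerical sketch. The two codes below are invented single-pulse-per-wavelength hop patterns, not an actual TW code construction from the paper; the sketch just shows how the periodic cross-correlation peak of a 2-D time/wavelength matrix is computed, since a low peak is what limits multiple-access interference.

    ```python
    import numpy as np

    n_wl, n_chip = 4, 8                      # wavelengths x time chips (illustrative)

    def tw_code(hops):
        """Build a 2-D code matrix with one chip per wavelength row."""
        m = np.zeros((n_wl, n_chip), dtype=int)
        for wl, t in enumerate(hops):
            m[wl, t] = 1
        return m

    # Two hypothetical hop patterns, not a real TW construction
    code_a = tw_code([0, 2, 5, 7])
    code_b = tw_code([1, 4, 6, 3])

    def max_cross_correlation(a, b):
        # periodic (cyclic) cross-correlation over all time shifts
        return max(int(np.sum(a * np.roll(b, s, axis=1))) for s in range(n_chip))

    peak_cross = max_cross_correlation(code_a, code_b)   # interference bound
    peak_auto = max_cross_correlation(code_a, code_a)    # code weight, at zero shift
    ```

    For these patterns the cross-correlation never exceeds 2 chips while the autocorrelation peak equals the code weight of 4, which is the kind of margin a well-designed TW family guarantees across all user pairs.
    
    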

  17. A Test and Extension of Lane and Terry's (2000) Conceptual Model of Mood-Performance Relationships Using a Large Internet Sample.

    Science.gov (United States)

    Lane, Andrew M; Terry, Peter C; Devonport, Tracey J; Friesen, Andrew P; Totterdell, Peter A

    2017-01-01

    The present study tested and extended Lane and Terry's (2000) conceptual model of mood-performance relationships using a large dataset from an online experiment. Methodological and theoretical advances included testing a more balanced model of pleasant and unpleasant emotions, and evaluating relationships among emotion regulation traits, states and beliefs, psychological skills use, perceptions of performance, mental preparation, and effort exerted during competition. Participants (N = 73,588) completed measures of trait emotion regulation, emotion regulation beliefs, regulation efficacy, and use of psychological skills, and rated their anger, anxiety, dejection, excitement, energy, and happiness before completing a competitive concentration task. Post-competition, participants completed measures of effort exerted, beliefs about the quality of mental preparation, and subjective performance. Results showed that dejection was associated with worse performance, with the no-dejection group performing 3.2% better. Dejection was associated with higher anxiety and anger scores and lower energy, excitement, and happiness scores. The proposed moderating effect of dejection was supported for the anxiety-performance relationship but not the anger-performance relationship. In the no-dejection group, participants who reported moderate or high anxiety outperformed those reporting low anxiety by about 1.6%. Overall, the results showed partial support for Lane and Terry's model. In terms of extending the model, dejection was associated with greater use of suppression, less frequent use of re-appraisal and psychological skills, lower emotion regulation beliefs, and lower emotion regulation efficacy. Further, dejection was associated with greater effort during performance, beliefs that pre-competition emotions did not assist goal achievement, and low subjective performance. Future research is required to investigate the role of intense emotions in emotion regulation and performance.

  18. A Conceptual Framework for Assessment of Governance Performance of Lake Basins: Towards Transformation to Adaptive and Integrative Governance

    Directory of Open Access Journals (Sweden)

    Peter Emmanuel Cookey

    2016-03-01

    Governance is essential to lake basin management, but it is the most challenging aspect and needs increased attention. Lake basin governance performance assessment is designed to measure the progress and impacts of policies and institutions and the roles of various actors in ensuring sustainability. It measures the performance of the technical/operational, social/network, and institutional arrangements that make up the socio-ecological system. Governance performance assessment becomes very necessary given institutions' over-emphasis on resource utilization and exploitation. The purpose of this paper is to present a governance performance assessment framework specifically for lake basins. The Adaptive Integrated Lake Basin Management (AILBM) framework is a diagnostic and prescriptive performance assessment tool intended to produce an adaptive and integrative system with equity, inclusiveness, transparency, accountability and flexibility for problem-solving and resilience. A case study on water governance performance assessment of the Songkhla Lake Basin (SLB) in Thailand is provided for illustration and application; it indicated a poor performance rating on governance in the Basin, revealing gaps, defects, strengths and weaknesses in the current system, necessary to recom