WorldWideScience

Sample records for performance code conceptual

  1. Nuclear-thermal-coupled optimization code for the fusion breeding blanket conceptual design

    International Nuclear Information System (INIS)

    Li, Jia; Jiang, Kecheng; Zhang, Xiaokang; Nie, Xingchen; Zhu, Qinjun; Liu, Songlin

    2016-01-01

    Highlights: • A nuclear-thermal-coupled predesign code has been developed for optimizing the radial build arrangement of the fusion breeding blanket. • The coupling module speeds up the design process by coupling the neutronics calculation code with the thermal-hydraulic analysis code. • The radial build optimization algorithm seeks an optimal arrangement of the breeding blanket for one or multiple specified objectives subject to design criteria such as the material temperature limit and the available TBR. - Abstract: The fusion breeding blanket, one of the key in-vessel components, performs the functions of breeding tritium, removing the nuclear heat and the heat flux from the plasma chamber, and acting as part of the shielding system. The radial build design, which determines the arrangement of functional zones and material properties in the radial direction, is the basis of the detailed design of the fusion breeding blanket. To facilitate the radial build design, this study develops a pre-design code to optimize the radial build of the blanket while considering nuclear and thermal-hydraulic performance simultaneously. The two main features of this code are: (1) coupling of the neutronics analysis with the thermal-hydraulic analysis to speed up the analysis process; (2) a preliminary optimization algorithm using one or multiple specified objectives subject to design criteria, in the form of constraints imposed on design variables and performance parameters within feasible engineering ranges. This pre-design code has been applied to the conceptual design of the water-cooled ceramic breeding blanket in the China Fusion Engineering Testing Reactor (CFETR) project.
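
    The constraint-driven radial-build search described in this record can be illustrated with a generic sketch. The surrogate functions `tbr_model` and `temp_model` and all numbers below are hypothetical placeholders standing in for the code's neutronics and thermal-hydraulic modules, not the actual CFETR pre-design code:

```python
def radial_build_search(thicknesses, tbr_model, temp_model, temp_limit):
    """Pick the breeding-zone thickness that maximizes TBR subject to a
    material temperature limit; the models are caller-supplied surrogates."""
    best = None
    for t in thicknesses:
        if temp_model(t) > temp_limit:
            continue  # violates the temperature design criterion
        if best is None or tbr_model(t) > tbr_model(best):
            best = t
    return best

# Toy surrogates (hypothetical): both TBR and temperature grow with thickness,
# so the optimum sits on the temperature constraint boundary.
tbr = lambda t: 1.0 + 0.4 * t
temp = lambda t: 300.0 + 800.0 * t
choice = radial_build_search([0.1, 0.2, 0.3, 0.4, 0.5, 0.6], tbr, temp,
                             temp_limit=700.0)
```

The same pattern extends to multiple objectives by replacing the scalar comparison with a weighted sum or a Pareto filter.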

  2. Nuclear-thermal-coupled optimization code for the fusion breeding blanket conceptual design

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jia, E-mail: lijia@ustc.edu.cn [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230027, Anhui (China); Jiang, Kecheng; Zhang, Xiaokang [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031, Anhui (China); Nie, Xingchen [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230027, Anhui (China); Zhu, Qinjun; Liu, Songlin [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031, Anhui (China)

    2016-12-15

    Highlights: • A nuclear-thermal-coupled predesign code has been developed for optimizing the radial build arrangement of the fusion breeding blanket. • The coupling module speeds up the design process by coupling the neutronics calculation code with the thermal-hydraulic analysis code. • The radial build optimization algorithm seeks an optimal arrangement of the breeding blanket for one or multiple specified objectives subject to design criteria such as the material temperature limit and the available TBR. - Abstract: The fusion breeding blanket, one of the key in-vessel components, performs the functions of breeding tritium, removing the nuclear heat and the heat flux from the plasma chamber, and acting as part of the shielding system. The radial build design, which determines the arrangement of functional zones and material properties in the radial direction, is the basis of the detailed design of the fusion breeding blanket. To facilitate the radial build design, this study develops a pre-design code to optimize the radial build of the blanket while considering nuclear and thermal-hydraulic performance simultaneously. The two main features of this code are: (1) coupling of the neutronics analysis with the thermal-hydraulic analysis to speed up the analysis process; (2) a preliminary optimization algorithm using one or multiple specified objectives subject to design criteria, in the form of constraints imposed on design variables and performance parameters within feasible engineering ranges. This pre-design code has been applied to the conceptual design of the water-cooled ceramic breeding blanket in the China Fusion Engineering Testing Reactor (CFETR) project.

  3. Conceptual adsorption models and open issues pertaining to performance assessment

    International Nuclear Information System (INIS)

    Serne, R.J.

    1992-01-01

    Recently several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept to quantify radionuclide migration. Several distinct issues are raised by various critics concerning the modeling of nuclide retardation. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing which attributes that affect adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass-action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors, and the status of incorporating sophisticated adsorption models into transport codes. 86 refs., 1 fig., 1 tab

  4. Conceptual adsorption models and open issues pertaining to performance assessment

    International Nuclear Information System (INIS)

    Serne, R.J.

    1991-10-01

    Recently several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept to quantify radionuclide migration. Several distinct issues are raised by various critics. In this paper I provide some perspective on issues surrounding the modeling of nuclide retardation. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing which attributes that affect adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass-action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors, and the status of incorporating sophisticated adsorption models into transport codes.
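
    The constant Rd model in this record leads to the familiar retardation factor used in transport codes, Rf = 1 + (rho_b / theta) · Rd, where rho_b is the bulk density and theta the moisture content. A minimal sketch of the relation; the numerical values are illustrative only:

```python
def retardation_factor(bulk_density, moisture_content, rd):
    """Constant-Rd model: Rf = 1 + (rho_b / theta) * Rd.
    bulk_density in g/cm^3, moisture_content (volumetric, dimensionless),
    rd in mL/g, so the ratio is dimensionless."""
    return 1.0 + (bulk_density / moisture_content) * rd

def retarded_velocity(pore_water_velocity, rf):
    """A sorbing nuclide travels at the pore-water velocity divided by Rf."""
    return pore_water_velocity / rf

# Illustrative values: a sandy medium and a moderately sorbing nuclide.
rf = retardation_factor(bulk_density=1.6, moisture_content=0.3, rd=5.0)
```

The critique reviewed in the record is precisely that a single constant Rd rarely holds over a whole flow path; the other four conceptual models replace the constant with concentration- or chemistry-dependent expressions.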

  5. Methodology for thermal hydraulic conceptual design and performance analysis of KALIMER core

    International Nuclear Information System (INIS)

    Young-Gyun Kim; Won-Seok Kim; Young-Jin Kim; Chang-Kue Park

    2000-01-01

    This paper summarizes the methodology for thermal hydraulic conceptual design and performance analysis used for the KALIMER core, in particular the preliminary methodology for flow grouping and peak pin temperature calculation. The major technical results of the conceptual design for the KALIMER 98.03 core are presented and compared with those of the KALIMER 97.07 design core. The KALIMER 98.03 design core proves to be better optimized than the 97.07 design core: the number of flow groups is reduced from 16 to 11, and the equalized peak cladding midwall temperature from 654 deg. C to 628 deg. C. This was achieved through a nuclear and thermal hydraulic design optimization study, i.e. core power flattening and an increase of the radial blanket power fraction. Coolant flow distribution to the assemblies and core coolant/component temperatures must be determined in core thermal hydraulic analysis. Sodium flow is distributed to the core assemblies with the overall goal of equalizing the peak cladding midwall temperatures for the peak-temperature pin of each bundle, and thus pin cladding damage accumulation and pin reliability. The flow grouping and the peak pin temperature calculation for the preliminary conceptual design are performed with the modules ORFCE-F60 and ORFCE-T60, respectively. The basic subchannel analysis will be performed with the SLTHEN code, and the detailed subchannel analysis with the MATRA-LMR code, which is under development for the K-Core system. This methodology proved practical for KALIMER core thermal hydraulic design in related benchmark calculation studies, and it is used for the KALIMER core thermal hydraulic conceptual design. (author)
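
    The flow-grouping idea in this record (distribute sodium so peak temperatures equalize, using a small number of discrete flow groups) can be sketched generically. The energy balance m = Q / (cp · dT) is standard, but the function names, heuristic, and numbers below are hypothetical stand-ins, not the ORFCE-F60 algorithm:

```python
def required_flow(power_w, cp=1250.0, delta_t=160.0):
    """Sodium energy balance: mass flow (kg/s) needed to remove power_w (W)
    with a target coolant temperature rise delta_t (K); cp in J/(kg K)."""
    return power_w / (cp * delta_t)

def assign_flow_groups(assembly_powers, group_rates):
    """Give each assembly the smallest available group flow rate that still
    meets its required flow (a simplistic grouping heuristic)."""
    rates = sorted(group_rates)
    assignment = []
    for q in assembly_powers:
        need = required_flow(q)
        rate = next((r for r in rates if r >= need), rates[-1])
        assignment.append(rate)
    return assignment

# Three assemblies of decreasing power, three candidate flow groups (kg/s).
groups = assign_flow_groups([2.0e6, 1.2e6, 0.5e6], [3.0, 6.0, 12.0])
```

A real flow-grouping module iterates on cladding temperature rather than coolant outlet temperature, but the structure, per-assembly demand mapped onto a few discrete orifice groups, is the same.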

  6. Integrative and distinctive coding of visual and conceptual object features in the ventral visual stream.

    Science.gov (United States)

    Martin, Chris B; Douglas, Danielle; Newsome, Rachel N; Man, Louisa Ly; Barense, Morgan D

    2018-02-02

    A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. © 2018, Martin et al.

  7. Integrative and distinctive coding of visual and conceptual object features in the ventral visual stream

    Science.gov (United States)

    Douglas, Danielle; Newsome, Rachel N; Man, Louisa LY

    2018-01-01

    A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. PMID:29393853

  8. Conceptual Design and Performance Analysis for a Large Civil Compound Helicopter

    Science.gov (United States)

    Russell, Carl; Johnson, Wayne

    2012-01-01

    A conceptual design study of a large civil compound helicopter is presented. The objective is to determine how a compound helicopter performs when compared to both a conventional helicopter and a tiltrotor using a design mission that is shorter than optimal for a tiltrotor and longer than optimal for a helicopter. The designs are generated and analyzed using conceptual design software and are further evaluated with a comprehensive rotorcraft analysis code. Multiple metrics are used to determine the suitability of each design for the given mission. Plots of various trade studies and parameter sweeps as well as comprehensive analysis results are presented. The results suggest that the compound helicopter examined for this study would not be competitive with a tiltrotor or conventional helicopter, but multiple possibilities are identified for improving the performance of the compound helicopter in future research.

  9. Conceptual frameworks of individual work performance: A systematic review

    NARCIS (Netherlands)

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Schaufeli, W.B.; Vet, H.C.W. de; Beek, A.J. van der

    2011-01-01

    Objective: Individual work performance is differently conceptualized and operationalized in different disciplines. The aim of the current review was twofold: (1) identifying conceptual frameworks of individual work performance and (2) integrating these to reach a heuristic conceptual framework.

  10. Conceptual frameworks of individual work performance a systematic review

    NARCIS (Netherlands)

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Schaufeli, W.B.; de Vet, H.C.W.; van der Beek, A.J.

    2011-01-01

    Objective: Individual work performance is differently conceptualized and operationalized in different disciplines. The aim of the current review was twofold: (1) identifying conceptual frameworks of individual work performance and (2) integrating these to reach a heuristic conceptual framework.

  11. Middle school teachers' familiarity with, interest in, performance on, and conceptual and pedagogical knowledge of light

    Science.gov (United States)

    Mbewe, Simeon

    The purpose of this study was threefold: to examine middle school teachers' familiarity with, interest in, conceptual knowledge of, and performance on light; to examine their ability to identify misconceptions about light and their suggested pedagogical ideas to address the identified misconceptions; and to establish the relationship between the middle school teachers' interest, familiarity, conceptual understanding, performance, misconception identification, and pedagogical ideas for light. Sixty-six (66) middle school science teachers enrolled in three math and science teacher professional development projects at Southern Illinois University Carbondale participated in this study. This study used a mixed-methods approach to collect and analyze data. The participants responded in writing to four different instruments: a Familiarity and Interest Questionnaire, a Conceptual Knowledge Test, a Two-tier Performance Test, and a Misconceptions Identification Questionnaire. Data were analyzed quantitatively by conducting non-parametric (Wilcoxon, Mann-Whitney U, and Kruskal-Wallis) and parametric (paired-samples, independent-samples, and one-way ANOVA) tests. Qualitative data were analyzed using thematic analysis and open coding to identify emerging themes and categories. The results showed that the teachers reported high levels of familiarity with and interest in learning more about light concepts. However, they had low conceptual knowledge of and performance on light concepts. As such, middle school teachers' perceived knowledge of light concepts was not consistent with their actual knowledge of light. To some extent, the teachers identified students' misconceptions expressed in some scenarios on light and also suggested pedagogical ideas for addressing such misconceptions in middle school science classrooms. However, most teachers did not provide details on their pedagogical ideas for light. Correlations among the four constructs (familiarity, interest, conceptual understanding, and performance

  12. Fuel performance analysis code 'FAIR'

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1994-01-01

    For modelling the behaviour of nuclear reactor fuel rods in water cooled reactors under severe power maneuvering and high burnups, a mechanistic fuel performance analysis code, FAIR, has been developed. The code incorporates a finite element based thermomechanical module, a physically based fission gas release module, and relevant models for fuel-related phenomena such as pellet cracking, densification and swelling, radial flux redistribution across the pellet due to the build-up of plutonium near the pellet surface, and pellet-clad mechanical interaction/stress corrosion cracking (PCMI/SCC) failure of the sheath. The code follows the established principles of fuel rod analysis programmes, such as coupling of the thermal and mechanical solutions with the fission gas release calculations, analysing different axial segments of the fuel rod simultaneously, and providing means for performing local analysis such as clad ridging analysis. The modular nature of the code offers flexibility in making modifications easily, e.g. for modelling MOX fuels and thorium based fuels. To analyse fuel rods subjected to very long power histories within a reasonable amount of time, the code has been parallelised and commissioned on the ANUPAM parallel processing system developed at Bhabha Atomic Research Centre (BARC). (author). 37 refs
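
    The coupling of thermal and mechanical solutions that this record describes is typically a fixed-point iteration: solve the mechanics, update the pellet-clad gap, re-solve the temperature field, and repeat until converged. A schematic sketch with hypothetical one-variable stand-ins for the real solvers (FAIR's actual modules are finite element fields, not scalars):

```python
def coupled_solve(thermal_step, mech_step, t0, tol=1e-6, max_iter=100):
    """Alternate mechanical and thermal solutions until the fuel
    temperature stops changing (simple fixed-point coupling)."""
    temp = t0
    for _ in range(max_iter):
        gap = mech_step(temp)          # mechanics: gap size from temperature
        new_temp = thermal_step(gap)   # thermal: temperature from gap conductance
        if abs(new_temp - temp) < tol:
            return new_temp
        temp = new_temp
    raise RuntimeError("coupling did not converge")

# Hypothetical scalar stand-ins; the composition is a contraction, so the
# iteration converges to the unique fixed point.
result = coupled_solve(lambda gap: 500.0 + 0.2 * gap,
                       lambda temp: 0.5 * temp, t0=600.0)
```

Convergence of such loops is what makes it safe to march the coupled solution segment by segment along the rod within each time step.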

  13. Performance analysis of WS-EWC coded optical CDMA networks with/without LDPC codes

    Science.gov (United States)

    Huang, Chun-Ming; Huang, Jen-Fa; Yang, Chao-Chin

    2010-10-01

    An extended Welch-Costas (EWC) code family for wavelength-division-multiplexing/spectral-amplitude-coding (WDM/SAC; WS) optical code-division multiple-access (OCDMA) networks is proposed. This system offers superior performance compared to previous modified quadratic congruence (MQC) coded OCDMA networks. However, since the performance of such a network is unsatisfactory at higher data bit rates, a class of quasi-cyclic low-density parity-check (QC-LDPC) codes is adopted as an improvement. Simulation results show that the performance of the high-speed WS-EWC coded OCDMA network can be greatly improved by using the LDPC codes.

  14. Performance Assessment Strategies: A computational framework for conceptual design of large roofs

    Directory of Open Access Journals (Sweden)

    Michela Turrin

    2014-01-01

    Full Text Available Using engineering performance evaluations to explore design alternatives during the conceptual phase of architectural design helps to understand the relationships between form and performance, and is crucial for developing well-performing final designs. Computer-aided conceptual design has the potential to aid the design team in discovering and highlighting these relationships, especially by means of procedural and parametric geometry to support the generation of geometric designs, and building performance simulation tools to support performance assessments. However, current tools and methods for computer-aided conceptual design in architecture do not explicitly reveal, nor allow for backtracking, the relationships between the performance and the geometry of the design. They currently support post-engineering, rather than the early design decisions and the design exploration process. Focusing on large roofs, this research aims at developing a computational design approach to support designers in performance-driven explorations. The approach is meant to facilitate multidisciplinary integration and the learning process of the designer, not to constrain the process in precompiled procedures or hard engineering formulations, nor to automate it by delegating design creativity to computational procedures. PAS (Performance Assessment Strategies) as a method is the main output of the research. It consists of a framework including guidelines and an extensible library of procedures for parametric modelling. It is structured in three parts. Pre-PAS provides guidelines for design-strategy definition, leading toward the parameterization process. Model-PAS provides guidelines, procedures and scripts for building the parametric models. Explore-PAS supports solution assessment based on numeric evaluations and performance simulations, until a suitable design solution is identified. PAS has been developed based on action research. 
Several case studies

  15. Blood and Books: Performing Code Switching

    Directory of Open Access Journals (Sweden)

    Jeff Friedman

    2008-05-01

    Full Text Available Code switching is a linguistic term that identifies ways individuals use communication modes and registers to negotiate difference in social relations. This essay suggests that arts-based inquiry, in the form of choreography and performance, provides a suitable and efficacious location within which both verbal and nonverbal channels of code switching can be investigated. Blood and Books, a case study of dance choreography within the context of post-colonial Maori performance in Aotearoa/New Zealand, is described and analyzed for its performance of code switching. The essay is framed by a discussion of how arts-based research within tertiary higher education requires careful negotiation in the form of code switching, as performed by the author's reflexive use of vernacular and formal registers in the essay. URN: urn:nbn:de:0114-fqs0802462

  16. Driver Performance Model: 1. Conceptual Framework

    National Research Council Canada - National Science Library

    Heimerl, Joseph

    2001-01-01

    ...'. At the present time, no such comprehensive model exists. This report discusses a conceptual framework designed to encompass the relationships, conditions, and constraints related to direct, indirect, and remote modes of driving and thus provides a guide or 'road map' for the construction and creation of a comprehensive driver performance model.

  17. Conceptual-driven classification for coding advise in health insurance reimbursement.

    Science.gov (United States)

    Li, Sheng-Tun; Chen, Chih-Chuan; Huang, Fernando

    2011-01-01

    With the non-stop increases in medical treatment fees, the economic survival of a hospital in Taiwan relies on the reimbursements received from the Bureau of National Health Insurance, which in turn depend on the accuracy and completeness of the content of the discharge summaries as well as the correctness of their International Classification of Diseases (ICD) codes. The purpose of this research is to reinforce the entire disease classification framework by supporting disease classification specialists in the coding process. This study developed an ICD code advisory system (ICD-AS) that performed knowledge discovery from discharge summaries and suggested ICD codes. Natural language processing and information retrieval techniques based on Zipf's law were applied to process the content of discharge summaries, and fuzzy formal concept analysis was used to analyze and represent the relationships between the medical terms identified by MeSH. In addition, a certainty factor used as reference during the coding process was calculated to account for uncertainty and strengthen the credibility of the outcome. Two sets of 360 and 2579 textual discharge summaries of patients suffering from cerebrovascular disease were processed to build up ICD-AS and to evaluate the prediction performance. A number of experiments were conducted to investigate the impact of system parameters on accuracy and to compare the proposed model to traditional classification techniques, including linear-kernel support vector machines. The comparison results showed that the proposed system achieves better overall performance in terms of several measures. In addition, some useful implication rules were obtained, which improve comprehension of the field of cerebrovascular disease and give insights into the relationships between relevant medical terms. 
Our system contributes valuable guidance to disease classification specialists in the process of coding discharge summaries, which consequently brings benefits in
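
    This record applies Zipf's law to discharge-summary text; a common Zipf-motivated heuristic (Luhn's cut-offs) keeps mid-frequency terms as features, dropping the very common head (mostly function words) and the rare tail. A generic sketch, not the ICD-AS implementation; the cut-off parameters and the token sample are arbitrary:

```python
from collections import Counter

def midfrequency_terms(tokens, head_frac=0.05, tail_min=2):
    """Keep terms between the Zipf head (the most frequent fraction of
    distinct terms) and the rare tail (counts below tail_min)."""
    counts = Counter(tokens)
    ranked = [term for term, _ in counts.most_common()]
    head = set(ranked[:max(1, int(len(ranked) * head_frac))])
    return [t for t in ranked if t not in head and counts[t] >= tail_min]

# Toy token stream with a Zipf-like frequency profile.
tokens = ["the"] * 10 + ["of"] * 8 + ["stroke"] * 3 + ["infarct"] * 2 + ["xyz"]
features = midfrequency_terms(tokens)
```

The surviving mid-band terms are the ones most likely to discriminate between diagnosis codes, which is why frequency cut-offs precede the concept-analysis stage.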

  18. Development of intelligent code system to support conceptual design of nuclear reactor core

    International Nuclear Information System (INIS)

    Kugo, Teruhiko; Nakagawa, Masayuki; Tsuchihashi, Keichiro

    1997-01-01

    An intelligent reactor design system, IRDS, has been developed to support the conceptual design of new types of reactor cores in the fields of neutronics, thermal-hydraulics and fuel behavior. The features of IRDS are summarized as follows: 1) a variety of computer codes covering various design tasks relevant to 'static' and 'burnup' problems are implemented; 2) all the information needed by the implemented codes is unified in a data base; 3) several data and knowledge bases are consulted so that non-expert users can proceed with the design process efficiently; 4) an advanced man-machine interface for communicating with the system through an interactive and graphical user interface is provided; and 5) a function to automatically search a design window, defined as the feasible parameter range satisfying design requirements and criteria, is employed to support the optimization or satisficing process. The applicability and productivity of the system are demonstrated by a design study of fuel pins for new-type FBR cores. (author)

  19. The Coding Process and Its Challenges

    Directory of Open Access Journals (Sweden)

    Judith A. Holton, Ph.D.

    2010-02-01

    Full Text Available Coding is the core process in classic grounded theory methodology. It is through coding that the conceptual abstraction of data and its reintegration as theory takes place. There are two types of coding in a classic grounded theory study: substantive coding, which includes both open and selective coding procedures, and theoretical coding. In substantive coding, the researcher works with the data directly, fracturing and analysing it, initially through open coding for the emergence of a core category and related concepts and then subsequently through theoretical sampling and selective coding of data to theoretically saturate the core and related concepts. Theoretical saturation is achieved through constant comparison of incidents (indicators in the data to elicit the properties and dimensions of each category (code. This constant comparing of incidents continues until the process yields the interchangeability of indicators, meaning that no new properties or dimensions are emerging from continued coding and comparison. At this point, the concepts have achieved theoretical saturation and the theorist shifts attention to exploring the emergent fit of potential theoretical codes that enable the conceptual integration of the core and related concepts to produce hypotheses that account for relationships between the concepts thereby explaining the latent pattern of social behaviour that forms the basis of the emergent theory. The coding of data in grounded theory occurs in conjunction with analysis through a process of conceptual memoing, capturing the theorist’s ideation of the emerging theory. Memoing occurs initially at the substantive coding level and proceeds to higher levels of conceptual abstraction as coding proceeds to theoretical saturation and the theorist begins to explore conceptual reintegration through theoretical coding.

  20. Performance Analysis of CRC Codes for Systematic and Nonsystematic Polar Codes with List Decoding

    Directory of Open Access Journals (Sweden)

    Takumi Murata

    2018-01-01

    Full Text Available Successive cancellation list (SCL decoding of polar codes is an effective approach that can significantly outperform the original successive cancellation (SC decoding, provided that proper cyclic redundancy-check (CRC codes are employed at the stage of candidate selection. Previous studies on CRC-assisted polar codes mostly focus on improvement of the decoding algorithms as well as their implementation, and little attention has been paid to the CRC code structure itself. For the CRC-concatenated polar codes with CRC code as their outer code, the use of longer CRC code leads to reduction of information rate, whereas the use of shorter CRC code may reduce the error detection probability, thus degrading the frame error rate (FER performance. Therefore, CRC codes of proper length should be employed in order to optimize the FER performance for a given signal-to-noise ratio (SNR per information bit. In this paper, we investigate the effect of CRC codes on the FER performance of polar codes with list decoding in terms of the CRC code length as well as its generator polynomials. Both the original nonsystematic and systematic polar codes are considered, and we also demonstrate that different behaviors of CRC codes should be observed depending on whether the inner polar code is systematic or not.
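
    The candidate-selection step this record relies on is a CRC check over GF(2): the encoder appends the remainder of the message polynomial so the codeword divides the generator, and the SCL decoder keeps the most likely surviving path whose CRC checks. A minimal bitwise sketch; the 3-bit polynomial is for illustration only, not a CRC recommended for polar codes:

```python
def crc_remainder(bits, poly):
    """Remainder of bits * x^r divided by poly over GF(2).
    poly lists coefficients including the leading 1,
    e.g. [1, 0, 1, 1] represents x^3 + x + 1 (r = 3)."""
    r = len(poly) - 1
    work = list(bits) + [0] * r
    for i in range(len(bits)):
        if work[i]:
            for j, p in enumerate(poly):
                work[i + j] ^= p
    return work[-r:]

def crc_append(bits, poly):
    """Append the CRC so the resulting codeword leaves zero remainder."""
    return list(bits) + crc_remainder(bits, poly)

def crc_check(codeword, poly):
    """True iff the codeword's CRC checks; an SCL decoder applies this
    to each surviving path before picking the final output."""
    return all(b == 0 for b in crc_remainder(codeword, poly))
```

The rate/error-detection trade-off the record studies is visible here: a longer `poly` costs more appended bits but makes an undetected wrong path less likely.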

  1. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    Fuel performance codes approximate this complex behavior using an axisymmetric, axially stacked, one-dimensional radial representation to save computational cost. However, the need for improved modeling of PCMI and, particularly, the importance of multidimensional capability for accurate fuel performance simulation have been identified as safety margins decrease. The finite element (FE) method, a reliable and proven solution technique in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in FE-based fuel performance simulation predominantly involves 2-D axisymmetric models and 3-D volumetric models. FRAPCON and FRAPTRAN include 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ) fully thermo-mechanically coupled steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and are typically used to investigate localized behavior. Since 2008, the Idaho National Laboratory (INL) has been developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on this investigation and on a comparison of the models in these codes, the requirements for and direction of development of a new FE-based fuel performance code can be discussed. A new FE-based fuel performance code should include the typical pellet and cladding models that all codes share. In particular, specialized pellet and cladding models, such as gaseous swelling and high burnup structure (HBS) models, should be developed to improve the accuracy of the code and to cover accident conditions. To reduce computational cost, approximated gap and optimized contact models should also be developed

  2. Performance measures for transform data coding.

    Science.gov (United States)

    Pearl, J.; Andrews, H. C.; Pratt, W. K.

    1972-01-01

    This paper develops performance criteria for evaluating transform data coding schemes under computational constraints. Computational constraints that conform with the proposed basis-restricted model give rise to suboptimal coding efficiency, characterized by a rate-distortion relation R(D) similar in form to the theoretical rate-distortion function. Numerical examples of this performance measure are presented for the Fourier, Walsh, Haar, and Karhunen-Loeve transforms.
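
    The coding-efficiency comparison in this record rests on how well each transform compacts signal energy into few coefficients, while Parseval's relation guarantees an orthonormal transform preserves total energy. A small sketch using the Walsh-Hadamard transform, one of the four transforms compared (the sample block is illustrative):

```python
import math

def walsh_hadamard(x):
    """Fast Walsh-Hadamard transform with orthonormal (1/sqrt(N)) scaling;
    the input length must be a power of two."""
    a = list(x)
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            for j in range(i, i + h):
                # butterfly: sum and difference of paired samples
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    scale = 1.0 / math.sqrt(len(a))
    return [v * scale for v in a]

# A smooth (highly correlated) block: energy piles up in the first
# coefficient, which is what makes transform coding efficient.
coeffs = walsh_hadamard([1.0, 1.0, 1.0, 1.0])
```

Quantizing the many near-zero coefficients coarsely (or discarding them) is what yields the rate-distortion advantage the paper quantifies.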

  3. Alternative conceptual models and codes for unsaturated flow in fractured tuff: Preliminary assessments for GWTT-95

    International Nuclear Information System (INIS)

    Ho, C.K.; Altman, S.J.; Arnold, B.W.

    1995-09-01

    Groundwater travel time (GWTT) calculations will play an important role in addressing site-suitability criteria for the potential high-level nuclear waste repository at Yucca Mountain, Nevada. In support of these calculations, preliminary assessments of the candidate codes and models are presented in this report. A series of benchmark studies has been designed to address important aspects of modeling flow through fractured media representative of flow at Yucca Mountain. Three codes (DUAL, FEHMN, and TOUGH2) are compared in these benchmark studies. DUAL is a single-phase, isothermal, two-dimensional flow simulator based on the dual mixed finite element method. FEHMN is a nonisothermal, multiphase, multidimensional simulator based primarily on the finite element method. TOUGH2 is a nonisothermal, multiphase, multidimensional simulator based on the integral finite difference method. Alternative conceptual models of fracture flow, consisting of the equivalent continuum model (ECM) and the dual permeability (DK) model, are used in the different codes

  4. Developments of fuel performance analysis codes in KEPCO NF

    International Nuclear Information System (INIS)

    Han, H. T.; Choi, J. M.; Jung, C. D.; Yoo, J. S.

    2012-01-01

    KEPCO NF has developed a fuel performance analysis and design code named ROPER, together with the utility codes XGCOL and XDNB, to perform fuel rod design evaluation for Korean nuclear power plants. The ROPER code is intended to cover the full range of fuel performance evaluation; the XGCOL code is for clad flattening evaluation, and the XDNB code is for extensive DNB propagation evaluation. In addition, KEPCO NF is now developing a 3-dimensional fuel performance analysis code for the next generation, named OPER3D, which uses 3-dimensional FEM, in a joint project with CANDU Energy, to analyze PCMI behavior and fuel performance under load-following operation. Of these, the ROPER code is now undergoing licensing review by the Korean regulatory body, and the other two are nearly at the final stage of development, after which licensing activities will be performed. These activities are intended to secure competitive, original, vendor-free ownership of fuel performance codes at KEPCO NF

  5. Performance analysis of conceptual waste package designs in salt repositories

    International Nuclear Information System (INIS)

    Jansen, G. Jr.; Raines, G.E.; Kircher, J.F.

    1984-01-01

    A performance analysis of commercial high-level waste and spent fuel conceptual package designs in reference repositories in three salt formations was conducted with the WAPPA waste package code. Expected conditions for temperature, stress, brine composition, radiation level, and brine flow rate were used as boundary conditions to compute the expected corrosion of a thick-walled overpack of 1025 wrought steel. In all salt formations, corrosion by low-Mg salt-dissolution brines typical of intrusion scenarios was too slow to cause the package to fail for thousands of years after burial. In high-Mg brines judged typical of thermally migrating brines in bedded salt formations, corrosion rates that would otherwise have caused the packages to fail within a few hundred years were limited by brine availability: all of the brine reaching the package was consumed by reaction with the iron in the overpack, preventing further corrosion. Uniform brine distribution over the package surface was an important factor in predicting long package lifetimes for the high-Mg brines. 14 references, 15 figures

  6. To master or perform? Exploring relations between achievement goals and conceptual change learning.

    Science.gov (United States)

    Ranellucci, John; Muis, Krista R; Duffy, Melissa; Wang, Xihui; Sampasivam, Lavanya; Franco, Gina M

    2013-09-01

    Research is needed to explore conceptual change in relation to achievement goal orientations and depth of processing. To address this need, we examined relations between achievement goals, use of deep versus shallow processing strategies, and conceptual change learning using a think-aloud protocol. Seventy-three undergraduate students were assessed on their prior knowledge and misconceptions about Newtonian mechanics, and then reported their achievement goals and participated in think-aloud protocols while reading Newtonian physics texts. A mastery-approach goal orientation positively predicted deep processing strategies, shallow processing strategies, and conceptual change. In contrast, a performance-approach goal orientation did not predict either of the processing strategies, but negatively predicted conceptual change. A performance-avoidance goal orientation negatively predicted deep processing strategies and conceptual change. Moreover, deep and shallow processing strategies positively predicted conceptual change as well as recall. Finally, both deep and shallow processing strategies mediated relations between mastery-approach goals and conceptual change. Results provide some support for Dole and Sinatra's (1998) Cognitive Reconstruction of Knowledge Model of conceptual change but also challenge specific facets with regard to the role of depth of processing in conceptual change. © 2012 The British Psychological Society.

  7. On the Performance of the Cache Coding Protocol

    Directory of Open Access Journals (Sweden)

    Behnaz Maboudi

    2018-03-01

    Network coding approaches typically consider an unrestricted recoding of coded packets in the relay nodes to increase performance. However, this can expose the system to pollution attacks that cannot be detected during transmission, until the receivers attempt to recover the data. To prevent these attacks while preserving the benefits of coding in mesh networks, the cache coding protocol was proposed. This protocol only allows recoding at a relay when the relay has received enough coded packets to decode an entire generation of packets. At that point, the relay node recodes and signs the recoded packets with its own private key, allowing the system to detect and minimize the effect of pollution attacks and making the relays accountable for changes to the data. This paper analyzes the delay performance of cache coding to understand the security-performance trade-off of this scheme. We introduce an analytical model for the case of two relays in an erasure channel, relying on an absorbing Markov chain, and an approximate model to estimate the performance in terms of the number of transmissions before successful decoding at the receiver. We confirm our analysis using simulation results. We show that cache coding can overcome the security issues of unrestricted recoding with only a moderate decrease in system performance.
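    As a hedged illustration of the absorbing-Markov-chain technique mentioned above (simplified here to a single receiver on one erasure channel, not the paper's two-relay model), the expected number of transmissions to deliver one generation follows from the fundamental-matrix relation t = (I - Q)^-1 · 1:

```python
def expected_transmissions(generation_size, erasure_prob):
    """Expected transmissions to deliver a full generation over a single
    erasure channel, from the absorbing Markov chain with state
    i = packets received (state g absorbing).

    Because (I - Q) is upper bidiagonal for this chain, the system
    (I - Q) t = 1 is solved by back substitution instead of a full
    matrix inversion."""
    g, e = generation_size, erasure_prob
    t = [0.0] * (g + 1)            # t[g] = 0: already absorbed
    for i in range(g - 1, -1, -1):
        # t[i] = 1 + e * t[i] + (1 - e) * t[i+1]
        t[i] = (1.0 + (1.0 - e) * t[i + 1]) / (1.0 - e)
    return t[0]
```

    For this simple chain the closed form is g / (1 - e); the Markov-chain machinery becomes necessary once relays, caching, and signing delays are added, as in the paper's model.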

  8. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    Science.gov (United States)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi-diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). The performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis, by reference to bit error rate (BER), signal-to-noise ratio (SNR), and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and performs better than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both up- and down-link transmission.
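    For readers who want to relate the quoted BER floor to an SNR target, a minimal sketch of the Gaussian-approximation relation commonly used in SAC-OCDMA analyses is shown below. The formula is an assumption of this sketch; the paper's exact noise model (MAI, PIIN, shot and thermal noise terms) may differ:

```python
import math

def ber_from_snr(snr_linear):
    """Gaussian-approximation BER commonly used in SAC-OCDMA analyses:
    BER = (1/2) * erfc(sqrt(SNR / 8))."""
    return 0.5 * math.erfc(math.sqrt(snr_linear / 8.0))

def snr_db(snr_linear):
    """Linear SNR expressed in decibels."""
    return 10.0 * math.log10(snr_linear)

# Under this approximation, a linear SNR of ~144 (about 21.6 dB) is
# required to reach the BER = 1e-9 threshold cited in the abstract.
```

    The relation is monotonic, so any code change that raises SNR (e.g. by suppressing PIIN) directly lowers BER or, equivalently, raises the supportable subscriber count at a fixed BER.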

  9. Input data required for specific performance assessment codes

    International Nuclear Information System (INIS)

    Seitz, R.R.; Garcia, R.S.; Starmer, R.J.; Dicke, C.A.; Leonard, P.R.; Maheras, S.J.; Rood, A.S.; Smith, R.W.

    1992-02-01

    The Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory generated this report on input data requirements for computer codes to assist States and compacts in their performance assessments. This report gives generators, developers, operators, and users guidelines on what input data are required by 22 common performance assessment codes. Each code is summarized, and a matrix table is provided to allow comparison of the various inputs required by the codes. This report does not determine or recommend which codes are preferable

  10. Neutronic performance of Indian LLCB TBM set conceptual design in ITER

    Energy Technology Data Exchange (ETDEWEB)

    Swami, H.L., E-mail: hswami@ipr.res.in; Shaw, A.K.; Mistry, A.N.; Danani, C.

    2016-12-15

    Highlights: • Neutronic analyses of the conceptual design of the LLCB test blanket module in ITER have been performed. • The estimated total tritium production rate in the LLCB TBM is 1.66E+17 tritons/s. • The total heat deposited in the LLCB TBM is 0.46 MW, and the highest power density, at the TBM first wall, is 5.2 W/cc. • The estimation shows a maximum DPA of 2.72 at the TBM FW. - Abstract: The tritium breeding blanket testing program in ITER is an important milestone towards the development of fusion reactors. The ITER organization is providing an opportunity for the partner countries to test their breeding blanket concepts. A mock-up of the Indian Lead Lithium Ceramic Breeder (LLCB) tritium breeding blanket, known as the LLCB Test Blanket Module (TBM), will be tested in ITER equatorial port no. 2. The LLCB blanket consists of lead lithium (PbLi) as a neutron multiplier & tritium breeder, a ceramic breeder (Li{sub 2}TiO{sub 3}) as a tritium breeder, and India-specific Reduced Activation Ferritic Martensitic Steel (IN-RAFMS) as the structural material. A water-cooled stainless steel block, called the shield block, is attached to the TBM to provide neutron shielding for the ITER TBM port. A comprehensive neutronic performance evaluation is required for the design of the LLCB TBM set (TBM + shield block) and the associated ancillary systems in ITER. The neutronic performance evaluation of the conceptual design of the TBM set in ITER has been carried out and is reported here. For this evaluation, neutronic models of the LLCB TBM set along with the TBM frame have been constructed and inserted in the equatorial port of the ITER reference neutronic model C-lite. Neutronic responses such as tritium production rate, nuclear heating, neutron flux & spectra, gas production & DPA in the LLCB TBM set are calculated considering 500 MW fusion power & a fluence level of 0.3 MWa/m{sup 2}. The radiation transport code MCNP6 and the FENDL 2.1 nuclear cross-section data library are used to perform the neutronic analysis
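    As a quick sanity check on the quoted figures (the rate is taken from the abstract; the conversion itself is standard unit arithmetic), the tritium production rate translates into a daily mass yield as follows:

```python
# Convert the quoted tritium production rate to a daily mass yield.
TRITON_RATE = 1.66e17                          # tritons/s, from the abstract
SECONDS_PER_DAY = 86400.0
TRITIUM_ATOM_MASS_KG = 3.016 * 1.66054e-27     # ~3.016 u per triton

atoms_per_day = TRITON_RATE * SECONDS_PER_DAY
grams_per_day = atoms_per_day * TRITIUM_ATOM_MASS_KG * 1000.0
# roughly 0.07 g of tritium per day at 500 MW fusion power
```

    A yield on the order of a tenth of a gram per day is consistent with the TBM being a test article rather than a self-sufficient breeding blanket.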

  11. High performance APCS conceptual design and evaluation scoping study

    International Nuclear Information System (INIS)

    Soelberg, N.; Liekhus, K.; Chambers, A.; Anderson, G.

    1998-02-01

    This Air Pollution Control System (APCS) Conceptual Design and Evaluation study was conducted to evaluate a high-performance air pollution control (APC) system for minimizing air emissions from mixed waste thermal treatment systems. Seven variations of high-performance APCS designs were conceptualized using several design objectives. One of the system designs was selected for detailed process simulation using ASPEN PLUS to determine material and energy balances and evaluate performance. Installed system capital costs were also estimated. Sensitivity studies were conducted to evaluate the incremental cost and benefit of added carbon adsorber beds for mercury control, selective catalytic reduction for NO{sub x} control, and offgas retention tanks for holding the offgas until sample analysis verifies that it meets emission limits. Results show that the high-performance dry-wet APCS can easily meet all expected emission limits except possibly for mercury. The capability to achieve high levels of mercury control (potentially necessary for thermally treating some DOE mixed waste streams) could not be validated using current performance data for mercury control technologies. The engineering approach and ASPEN PLUS modeling tool developed and used in this study identified APC equipment and system performance, size, cost, and other issues that are not yet resolved. These issues need to be addressed in feasibility studies and conceptual designs for new facilities, or when determining how to modify existing facilities to meet expected emission limits. The ASPEN PLUS process simulation, with current and refined input assumptions and calculations, can be used to provide system performance information for decision making, identifying the best options, estimating costs, reducing the potential for emission violations, providing information needed for waste flow analysis, incorporating new APCS technologies into existing designs, and performing facility design and permitting activities

  12. Performance in Public Organizations: Clarifying the Conceptual Space

    DEFF Research Database (Denmark)

    Andersen, Lotte Bøgh; Boesen, Andreas; Holm Pedersen, Lene

    2016-01-01

    From whose perspective is performance being assessed? Are the criteria formal or informal? Are the criteria subjective? Which process focus and product focus do they have, if any? What is the unit of analysis? Based on these distinctions, the performance criteria of existing studies used in an empirical review of management and performance are classified. The results illustrate how a systematization of the conceptual space of performance in public organizations can help researchers select what to study and what to leave out with greater accuracy, while also bringing greater clarity to public debates about performance.

  13. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    The finite element (FE) method, a reliable and proven approach in structural mechanics, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in FE-based fuel performance simulation predominantly involves 2-D axisymmetric models and 3-D volumetric models. FRAPCON and FRAPTRAN include 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ), fully thermo-mechanically coupled, steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and are typically used to investigate localized behavior. Since 2008, the Idaho National Laboratory (INL) has been developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on an investigation and comparison of these codes, the requirements for and direction of development of a new FE-based fuel performance code can be discussed. A new FE-based fuel performance code should include the typical pellet and cladding models that all codes share. In particular, specialized pellet and cladding models, such as gaseous swelling and high-burnup structure (HBS) models, should be developed to improve code accuracy and to cover accident conditions. To reduce computational cost, approximated gap and optimized contact models should also be developed. Nuclear fuel operates in an extreme environment that induces complex multiphysics phenomena, occurring over distances ranging from inter-atomic spacing to meters, and time scales ranging from microseconds to years. This multiphysics behavior is often tightly coupled, a well-known example being thermomechanical behavior. Adding to this complexity, important aspects of fuel behavior are inherently
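    The 1-D radial representation that these codes build upon can be sketched as a chain of temperature drops from coolant to pellet centerline. The material and geometry values below are illustrative assumptions for a PWR-like rod, not parameters from any of the codes discussed:

```python
import math

def radial_temperatures(q_lin, t_coolant, r_fuel, r_clad_in, r_clad_out,
                        k_fuel, k_clad, h_gap, h_coolant):
    """Steady-state 1-D radial temperature chain for a fuel rod:
    coolant -> clad outer -> clad inner -> pellet surface -> centerline.
    q_lin is the linear heat rate in W/m; temperatures in K."""
    t_clad_out = t_coolant + q_lin / (2 * math.pi * r_clad_out * h_coolant)
    t_clad_in = t_clad_out + (q_lin * math.log(r_clad_out / r_clad_in)
                              / (2 * math.pi * k_clad))
    t_fuel_surf = t_clad_in + q_lin / (2 * math.pi * r_fuel * h_gap)
    t_centerline = t_fuel_surf + q_lin / (4 * math.pi * k_fuel)
    return t_clad_out, t_clad_in, t_fuel_surf, t_centerline

# Illustrative values (all assumed): 20 kW/m linear power, 580 K coolant
temps = radial_temperatures(
    q_lin=20e3, t_coolant=580.0,
    r_fuel=4.1e-3, r_clad_in=4.18e-3, r_clad_out=4.75e-3,
    k_fuel=3.0, k_clad=15.0, h_gap=5e3, h_coolant=3e4)
```

    The gap conductance term h_gap is the weak link of this 1-D chain: once the gap closes and PCMI begins, the 1-D model loses accuracy, which is precisely the motivation for the 2-D/3-D FE treatment discussed above.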

  14. An Assessment of Some Design Constraints on Heat Production of a 3D Conceptual EGS Model Using an Open-Source Geothermal Reservoir Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Yidong Xia; Mitch Plummer; Robert Podgorney; Ahmad Ghassemi

    2016-02-01

    The performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan can be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that (1) the horizontal fracture spacing has a profound effect on the long-term performance of heat production, (2) the downward deviation angle of the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and (3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite element based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercial, this new open-source code demonstrates a code development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.
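    The dependence of thermal output on mass flow rate noted in point (3) follows from a simple energy balance on the circulating water. A minimal back-of-envelope sketch with illustrative (assumed) operating values, not figures from the study:

```python
def thermal_power_mw(m_dot, t_production, t_injection, cp=4186.0):
    """Thermal power extracted by the circulating water, in MW_th:
    P = m_dot * cp * (T_prod - T_inj), with cp in J/(kg K)."""
    return m_dot * cp * (t_production - t_injection) / 1e6

# Assumed example: 40 kg/s produced at 180 C, reinjected at 70 C
p_th = thermal_power_mw(40.0, 180.0, 70.0)   # ~18.4 MW_th
```

    The trade-off the sensitivity analysis explores is visible here: raising m_dot increases instantaneous power but accelerates thermal drawdown of the fracture surfaces, reducing T_prod and hence lifespan.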

  15. ETF system code: composition and applications

    International Nuclear Information System (INIS)

    Reid, R.L.; Wu, K.F.

    1980-01-01

    A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system

  16. Conceptual Approach to Forming the Basic Code of Neo-Industrial Development of a Region

    Directory of Open Access Journals (Sweden)

    Elena Leonidovna Andreeva

    2017-09-01

    In this article, the authors propose the conceptual foundations of the “code approach” to regional neo-industrial development. The purpose of the research is to reveal the essence of the transition to a new type of industrial and economic relations through the prism of the “genetic codes” of the region. We consider these codes as a system of the “racial memory” of a territory, which determines the specificity and features of how neo-industrialization is realized. We substantiate the hypothesis that the “genetic codes” of the region influence the effectiveness of neo-industrialization. We have defined the participants, or carriers of the codes, in the transformation of regional inheritance for the stimulation of the neo-industrial development of the region's economy. The subject matter of the research is the distinctive features of the functioning of the region's determinative codes. Their content determines the socio-economic specificity of the region and the features of the innovative, informational, value-based and competence-based development of the territory. The determinative codes generate the dynamic codes of the region, understood as their derivatives; these have a high probability of occurrence, a higher speed of development and distribution, and internal forces that make the self-development of the region possible. The scientific contribution is the substantiation of the basic code of regional neo-industrial development. It represents the evolutionary accumulation of the rapid changes of the region's innovative, informational, value-based and competence-based codes, stimulating the generation and implementation of new ideas by economic entities adapted to the historical and cultural conditions. The article presents the code model of neo-industrial development of the region, described by formulas. We applied system analysis methods, historical and civilization approaches, evolutionary and

  17. Word meaning in the ventral visual path: a perceptual to conceptual gradient of semantic coding.

    Science.gov (United States)

    Borghesani, Valentina; Pedregosa, Fabian; Buiatti, Marco; Amadon, Alexis; Eger, Evelyn; Piazza, Manuela

    2016-12-01

    The meaning of words referring to concrete items is thought of as a multidimensional representation that includes both perceptual (e.g., average size, prototypical color) and conceptual (e.g., taxonomic class) dimensions. Are these different dimensions coded in different brain regions? In healthy human subjects, we tested the presence of a mapping between the implied real object size (a perceptual dimension) and the taxonomic categories at different levels of specificity (conceptual dimensions) of a series of words, and the patterns of brain activity recorded with functional magnetic resonance imaging in six areas along the ventral occipito-temporal cortical path. Combining multivariate pattern classification and representational similarity analysis, we found that the real object size implied by a word appears to be primarily encoded in early visual regions, while the taxonomic category and sub-categorical cluster in more anterior temporal regions. This anteroposterior gradient of information content indicates that different areas along the ventral stream encode complementary dimensions of the semantic space. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. A Constructive Conceptual Approach to Strategic Performance Measurement

    DEFF Research Database (Denmark)

    Mitchell, Falconer; Nielsen, Lars Bråd; Nørreklit, Hanne

    This paper focuses on identifying the key characteristics of a good strategic performance. It does this from a conceptual base founded in the paradigm of pragmatic constructivism. This involves analysing real world activities such as strategy setting and implementation in terms of the facts on wh...

  19. Reactor physics methods, models, and applications used to support the conceptual design of the Advanced Neutron Source

    International Nuclear Information System (INIS)

    Gehin, J.C.; Worley, B.A.; Renier, J.P.; Wemple, C.A.; Jahshan, S.N.; Ryskammp, J.M.

    1995-08-01

    This report summarizes the neutronics analysis performed during 1991 and 1992 in support of characterization of the conceptual design of the Advanced Neutron Source (ANS). The methods used in the analysis, parametric studies, and key results supporting the design and safety evaluations of the conceptual design are presented. The analysis approach used during the conceptual design phase followed the same approach used in early ANS evaluations: (1) a strong reliance on Monte Carlo theory for beginning-of-cycle reactor performance calculations and (2) a reliance on few-group diffusion theory for reactor fuel cycle analysis and for evaluation of reactor performance at specific time steps over the fuel cycle. The Monte Carlo analysis was carried out using the MCNP continuous-energy code, and the few-group diffusion theory calculations were performed using the VENTURE and PDQ code systems. The MCNP code was used primarily for its capability to model the reflector components in realistic geometries as well as the inherent circumvention of cross-section processing requirements and use of energy-collapsed cross sections. The MCNP code was used for evaluations of reflector component reactivity effects and of heat loads in these components. The code was also used as a benchmark comparison against the diffusion-theory estimates of key reactor parameters such as region fluxes, control rod worths, reactivity coefficients, and material worths. The VENTURE and PDQ codes were used to provide independent evaluations of burnup effects, power distributions, and small perturbation worths. The performance and safety calculations performed over the subject time period are summarized, and key results are provided. The key results include flux and power distributions over the fuel cycle, silicon production rates, fuel burnup rates, component reactivities, control rod worths, component heat loads, shutdown reactivity margins, reactivity coefficients, and isotope production rates

  20. Conceptual OOP design of Pilot Code for Two-Fluid, Three-field Model with C++ 6.0

    Energy Technology Data Exchange (ETDEWEB)

    Chung, B. D.; Lee, Y. J

    2006-09-15

    To establish a concept for object-oriented programming (OOP) design of a reactor safety analysis code, a preliminary OOP design for the PILOT code, which is based on a one-dimensional two-fluid, three-field model, has been attempted with C++ language features. Microsoft C++ was used since it is available as groupware in KAERI, and it can be merged with Compaq Visual Fortran 6.6 on the Visual Studio platform. In this development platform, C++ was used as the main language and Fortran as a mixed language in connection with the C++ main driver program; the mixed-language environment is a specific feature provided in Visual Studio. Existing Fortran source was utilized for the input routine that reads the steam table from a generated file and for the steam property calculation routine, with the calling convention and argument passing from the C++ driver corrected. Mathematical routines, such as matrix inversion and the tridiagonal matrix solver, were retained as PILOT Fortran routines. The simple volumes and junctions used in the PILOT code can be treated as objects, since they are the basic construction elements of the code system, while the other routines for the overall solution scheme were realized as procedural C functions. The conceptual design, which consists of hydraulic loop, component, volume, and junction classes, is described in the appendix in order to give the essential OOP structure of a system safety analysis code. The attempt shows that many parts of a system analysis code can be expressed as objects, although the overall structure should be maintained as procedural functions. The encapsulation of data and functions within an object can provide many benefits in the programming of system codes.
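    The class decomposition described (hydraulic loop, component, volume, junction) might be sketched as follows. The names and attributes are illustrative, and the sketch is in Python rather than the C++ of the PILOT work, purely to show the object structure:

```python
class Volume:
    """Control volume: holds the fluid state for the two-fluid model."""
    def __init__(self, name, pressure, void_fraction=0.0):
        self.name = name
        self.pressure = pressure          # Pa
        self.void_fraction = void_fraction

class Junction:
    """Flow connection between two volumes."""
    def __init__(self, upstream, downstream, flow_rate=0.0):
        self.upstream = upstream
        self.downstream = downstream
        self.flow_rate = flow_rate        # kg/s

class HydraulicLoop:
    """Aggregates volumes and junctions; the overall solution scheme
    would iterate over these objects each time step."""
    def __init__(self):
        self.volumes, self.junctions = [], []

    def add_volume(self, volume):
        self.volumes.append(volume)
        return volume

    def connect(self, upstream, downstream):
        junction = Junction(upstream, downstream)
        self.junctions.append(junction)
        return junction

# Build a two-volume loop segment
loop = HydraulicLoop()
inlet = loop.add_volume(Volume("inlet", 15.5e6))
core = loop.add_volume(Volume("core", 15.4e6))
loop.connect(inlet, core)
```

    This mirrors the paper's conclusion: volumes and junctions map naturally onto objects, while the numerical solution scheme that sweeps over them can remain procedural.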

  2. Safety performance of preliminary KALIMER conceptual design

    Energy Technology Data Exchange (ETDEWEB)

    Hahn Dohee; Kim Kyoungdoo; Kwon Youngmin; Chang Wonpyo; Suk Soodong [Korea Atomic Energy Research Inst., Taejon (Korea)]

    1999-07-01

    The Korea Atomic Energy Research Institute (KAERI) is developing KALIMER (Korea Advanced Liquid Metal Reactor), which is a sodium-cooled, 150 MWe pool-type reactor. The safety design of KALIMER emphasizes accident prevention by using passive processes, which can be accomplished by safety design objectives including the utilization of inherent safety features. In order to assess the effectiveness of the inherent safety features in achieving the safety design objectives, a preliminary evaluation of ATWS performance for the KALIMER design has been performed with the SSC-K code, which is a modified version of the SSC-L code. KAERI's modifications of the code include the development of reactivity feedback models for the core and a pool model for the KALIMER reactor vessel. This paper describes the models for control rod driveline expansion, the gas expansion module, and the thermal-hydraulic model for the reactor pool, along with the results of preliminary analyses of unprotected loss of flow and loss of heat sink. (author)

  4. Cost and performance analysis of conceptual designs of physical protection systems

    International Nuclear Information System (INIS)

    Hicks, M.J.; Snell, M.S.; Sandoval, J.S.; Potter, C.S.

    1998-01-01

    CPA -- Cost and Performance Analysis -- is a methodology that joins Activity Based Cost (ABC) estimation with performance-based analysis of physical protection systems. CPA offers system managers an approach that supports both tactical decision making and strategic planning. Current exploratory applications of the CPA methodology are addressing the analysis of alternative conceptual designs. To support these activities, the original architecture for CPA is being expanded to incorporate results from a suite of performance and consequence analysis tools such as JTS (Joint Tactical Simulation), ERAD (Explosive Release Atmospheric Dispersion) and blast effect models. The process flow for applying CPA to the development and analysis of conceptual designs is illustrated graphically.
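The ABC half of the methodology can be sketched in a few lines: each activity's cost is its cost-driver quantity times a driver rate, and the system cost is the sum over activities. The activity names, quantities, and rates below are hypothetical placeholders, not figures from the record.

```python
# Toy Activity-Based Cost rollup: cost(activity) = driver quantity x driver rate.
# All activity names, quantities, and rates here are hypothetical.
activities = {
    # activity: (driver_quantity_per_year, cost_per_driver_unit_usd)
    "sensor maintenance": (120, 85.0),     # service visits
    "guard patrols":      (2920, 140.0),   # 8 patrols/day x 365 days
    "alarm assessment":   (450, 35.0),     # alarm events
}

def abc_total(acts):
    """Sum driver_quantity * rate over all activities."""
    return sum(q * r for q, r in acts.values())

for name, (q, r) in activities.items():
    print(f"{name:20s} ${q * r:>12,.2f}")
print(f"{'total':20s} ${abc_total(activities):>12,.2f}")
```

The same rollup structure extends naturally to the performance side by attaching an effectiveness measure to each activity alongside its cost.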

  5. Performance Tuning of x86 OpenMP Codes with MAQAO

    Science.gov (United States)

    Barthou, Denis; Charif Rubial, Andres; Jalby, William; Koliai, Souad; Valensi, Cédric

    Failing to find the best optimization sequence for a given application can lead to compiler-generated code with poor performance or inappropriate code. It is necessary to analyze performance from the generated assembly code in order to improve on the compilation process. This paper presents a tool for the performance analysis of multithreaded codes (OpenMP programs are supported at the moment). MAQAO relies on static performance evaluation to identify compiler optimizations and assess the performance of loops. It exploits static binary rewriting for reading and instrumenting object files or executables. Static binary instrumentation allows the insertion of probes at the instruction level. Memory accesses can be captured to help tune the code, but such traces need to be compressed. MAQAO can analyze the results and provide hints for tuning the code. We show on some examples how this can help users improve their OpenMP applications.

  6. Conceptual Designs for the Performance Improvement of APR1400 SIT and Preliminary Performance Evaluation

    International Nuclear Information System (INIS)

    Chu, In-Cheol; Kwon, Tae-Soon; Song, Chul-Hwa

    2008-01-01

    Some evolutionary type PWRs such as APR1400 and APWR adopt an advanced safety injection tank (SIT). The SIT of APR1400 has a fluidic device (FD) which passively controls the ECC water injection flow rate into the reactor coolant system during the refill and reflood phases of a LB-LOCA (i.e., a high injection flow rate during the refill phase and a low injection flow rate during the reflood phase). The benefit of the FD is the elimination of the low pressure safety injection pump function from the safety injection system. The flow controlling performance of the APR1400 FD was evaluated using a prototypical full-scale test facility, called VAPER (Valve Performance Evaluation Rig). Even though the performance of the APR1400 FD satisfied the major design and licensing requirements, further improvement of the performance is expected, such as extending the total injection period and delaying the nitrogen gas discharge. Several conceptual designs have been drawn up in order to improve the performance of the APR1400 SIT. The performance of some designs was evaluated using a small-scale SIT test rig. The present paper introduces some of the conceptual designs and presents the experimental results of the performance evaluation.

  7. The UK core performance code package

    International Nuclear Information System (INIS)

    Hutt, P.K.; Gaines, N.; McEllin, M.; White, R.J.; Halsall, M.J.

    1991-01-01

    Over the last few years work has been co-ordinated by Nuclear Electric, originally part of the Central Electricity Generating Board, with contributions from the United Kingdom Atomic Energy Authority and British Nuclear Fuels Limited, to produce a generic, easy-to-use and integrated package of core performance codes able to perform a comprehensive range of calculations for fuel cycle design, safety analysis and on-line operational support for Light Water Reactor and Advanced Gas Cooled Reactor plants. The package consists of modern rationalized generic codes for lattice physics (WIMS), whole reactor calculations (PANTHER), thermal hydraulics (VIPRE) and fuel performance (ENIGMA). These codes, written in FORTRAN77, are highly portable, and new developments have followed modern quality assurance standards. The codes can all be run ''stand-alone'', but they are also being integrated within a new UNIX-based interactive system called the Reactor Physics Workbench (RPW). The RPW provides an interactive user interface and a sophisticated data management system. It offers quality assurance features to the user and has facilities for defining complex calculational sequences. The paper reviews the current capabilities of these components and their integration within the package, and outlines future developments underway. Finally, the paper describes the development of an on-line version of this package which is now being commissioned on UK AGR stations. (author)

  8. BER performance comparison of optical CDMA systems with/without turbo codes

    Science.gov (United States)

    Kulkarni, Muralidhar; Chauhan, Vijender S.; Dutta, Yashpal; Sinha, Ravindra K.

    2002-08-01

    In this paper, we have analyzed and simulated the BER performance of a turbo coded optical code-division multiple-access (TC-OCDMA) system. A performance comparison has been made between uncoded OCDMA and TC-OCDMA systems employing various OCDMA address codes (optical orthogonal codes (OOCs), Generalized Multiwavelength Prime codes (GMWPCs), and Generalized Multiwavelength Reed Solomon codes (GMWRSCs)). The BER performance of TC-OCDMA systems has been analyzed and simulated by varying the code weight of the address code employed by the system. From the simulation results, it is observed that, for a fixed number of active users, TC-OCDMA systems can employ lower-weight address codes and still match the BER performance of uncoded systems employing higher-weight address codes.

  9. Performance of code 'FAIR' in IAEA CRP on FUMEX

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Kakodkar, A.

    1996-01-01

    A modern fuel performance analysis code, FAIR, has been developed for analysing high burnup fuel pins of water/heavy water cooled reactors. The code employs the finite element method for modelling the thermo-mechanical behaviour of fuel pins, and mechanistic models for the various physical and chemical phenomena affecting the behaviour of nuclear reactor fuel pins. High burnup effects such as pellet thermal conductivity degradation, enhanced fission gas release and radial flux redistribution are incorporated in the code FAIR. The code FAIR is capable of performing statistical analysis of fuel pins using the Monte Carlo technique. The code is implemented on the BARC parallel processing system ANUPAM. The code has recently participated in an International Atomic Energy Agency (IAEA) coordinated research program (CRP) on fuel modelling at extended burnups (FUMEX). Nineteen agencies from different countries participated in this exercise. In this CRP, spread over a period of three years, a number of high burnup fuel pins irradiated at the Halden reactor were analysed. The first phase of the CRP was a blind code comparison exercise, where the computed results were compared with experimental results. The second phase consisted of modifications to the code based on the experimental results of the first phase, and statistical analysis of fuel pins. The performance of the code FAIR in this CRP has been very good. The present report highlights the main features of code FAIR and its performance in the IAEA CRP on FUMEX. 14 refs., 5 tabs., ills
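The Monte Carlo statistical analysis mentioned in the abstract can be illustrated with a minimal sketch: sample an uncertain input (here, pellet thermal conductivity) and propagate it through a simple steady-state conduction model to obtain a distribution of centerline temperatures. The model and all numbers are illustrative assumptions, not FAIR's actual models or data.

```python
import random
import statistics

# Propagate an uncertain pellet conductivity through a simple steady-state
# conduction model for a uniformly heated solid cylinder:
#   T(0) = T_surf + q''' R^2 / (4 k)
# All property values below are illustrative assumptions.
random.seed(0)

Q_VOL = 3.0e8    # volumetric heat rate, W/m^3
R_PEL = 4.1e-3   # pellet radius, m
T_SURF = 700.0   # pellet surface temperature, K

def centerline_temp(k):
    """Centerline temperature for conductivity k (W/m-K)."""
    return T_SURF + Q_VOL * R_PEL ** 2 / (4.0 * k)

# conductivity sampled around 3 W/m-K with a 10% relative standard deviation
samples = [centerline_temp(random.gauss(3.0, 0.3)) for _ in range(10_000)]
mean_t = statistics.mean(samples)
sd_t = statistics.stdev(samples)
print(f"centerline T: mean {mean_t:.0f} K, spread (1 sigma) {sd_t:.0f} K")
```

A production statistical analysis would sample many correlated inputs and run the full thermo-mechanical model per trial; the sampling-and-summarizing loop is the same.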

  10. On the performance of diagonal lattice space-time codes

    KAUST Repository

    Abediseid, Walid

    2013-11-01

    There has been tremendous work done on designing space-time codes for the quasi-static multiple-input multiple-output (MIMO) channel. All coding designs to date focus on high performance, high rates, low-complexity encoding and decoding, or a combination of these criteria [1]-[9]. In this paper, we analyze in detail the performance limits of diagonal lattice space-time codes under lattice decoding. We present both lower and upper bounds on the average decoding error probability. We first derive a new closed-form expression for the lower bound using the so-called sphere lower bound. This bound presents the ultimate performance limit a diagonal lattice space-time code can achieve at any signal-to-noise ratio (SNR). The upper bound is then derived using the union bound, which demonstrates how the average error probability can be minimized by maximizing the minimum product distance of the code. Combining the lower and upper bounds on the average error probability yields a simple upper bound on the minimum product distance that any (complex) lattice code can achieve. In the high-SNR regime, we discuss the outage performance of such codes and provide the achievable diversity-multiplexing tradeoff under lattice decoding. © 2013 IEEE.
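The design metric named in the abstract, the minimum product distance, is straightforward to compute for a small codebook: over all codeword pairs, take the product of the absolute coordinate differences where the codewords differ, and keep the minimum. The four-point integer codebook below is a made-up example, not a code from the paper.

```python
import itertools
import math

# Minimum product distance of a toy 2-antenna diagonal codebook:
#   d_p,min = min over pairs (x, y) of  prod_i |x_i - y_i|
# taken over the coordinates where x and y differ. Codebook is illustrative.
codebook = [(0, 0), (1, 2), (2, 1), (3, 3)]

def min_product_distance(codes):
    best = math.inf
    for x, y in itertools.combinations(codes, 2):
        diffs = [abs(a - b) for a, b in zip(x, y) if a != b]
        if diffs:  # skip identical codewords
            best = min(best, math.prod(diffs))
    return best

print(min_product_distance(codebook))
```

Per the union-bound argument in the abstract, a designer would search for a codebook (or rotation) that makes this quantity as large as possible at a fixed rate.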

  11. Deconstructing Masculinity: A Qualitative Study of College Men's Masculine Conceptualizations and Gender Performance

    Science.gov (United States)

    Harris, Frank, III

    2008-01-01

    Informed by the constructionist epistemological perspective, the purpose of this study was to examine socially constructed conceptualizations of masculinity and gender performance among 12 culturally diverse undergraduate men. The participants espoused seemingly productive conceptualizations of masculinity, yet their gendered behaviors were…

  12. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.
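The verification idea, comparing a code's temperature calculation against an independent solution, can be mimicked in miniature: integrate the steady one-dimensional radial conduction equation for a uniformly heated cylinder numerically and check it against the closed form T(r) = T_s + q'''(R^2 - r^2)/(4k). The property values are illustrative assumptions, not FRAPCON inputs.

```python
# Cross-check a numerical temperature calculation against the closed form
#   T(r) = T_surf + q (R^2 - r^2) / (4 k)
# for a uniformly heated solid cylinder. Property values are illustrative.
Q = 3.0e8       # volumetric heat rate, W/m^3
R = 4.1e-3      # pellet radius, m
K = 3.0         # thermal conductivity, W/m-K
T_SURF = 700.0  # surface temperature, K
N = 10_000      # radial integration steps

def analytic_center():
    return T_SURF + Q * R * R / (4.0 * K)

def numeric_center():
    # energy balance: heat flux at radius r is Q*r/2, so dT/dr = -Q*r/(2*K);
    # integrate from the surface inward with the trapezoid rule
    dr = R / N
    t = T_SURF
    for i in range(N):
        r_out = R - i * dr
        r_in = r_out - dr
        slope_avg = -Q * (r_out + r_in) / (4.0 * K)  # mean dT/dr on the step
        t -= slope_avg * dr                          # T rises moving inward
    return t

print(analytic_center(), numeric_center())  # the two should closely agree
```

Agreement between an independent numerical integration and the analytic form is exactly the kind of evidence a thermal-model verification report collects, scaled down to one equation.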

  13. Performance and Fabrication Status of TREAT LEU Conversion Conceptual Design Concepts

    Energy Technology Data Exchange (ETDEWEB)

    IJ van Rooyen; SR Morrell; AE Wright; E. P Luther; K Jamison; AL Crawford; HT III Hartman

    2014-10-01

    Resumption of transient testing at the TREAT facility was approved in February 2014 to meet U.S. Department of Energy (DOE) objectives. The National Nuclear Security Administration’s Global Threat Reduction Initiative Convert Program is evaluating conversion of TREAT from its existing highly enriched uranium (HEU) core to a new core containing low enriched uranium (LEU). This paper briefly describes the initial pre-conceptual design screening decisions, with more detailed discussions of the current feasibility, qualification and fabrication approaches. Feasible fabrication will be shown for an LEU fuel element assembly that can meet TREAT design, performance, and safety requirements. The statement of feasibility recognizes that further development, analysis, and testing must be completed to refine the conceptual design. Engineering challenges such as cladding oxidation, high temperature material properties, and fuel block fabrication, along with neutronics performance, will be highlighted. Preliminary engineering and supply chain evaluation provided confidence that the conceptual designs can be achieved.

  14. Performance of FSO-OFDM based on BCH code

    Directory of Open Access Journals (Sweden)

    Jiao Xiao-lu

    2016-01-01

    Full Text Available In contrast with the traditional OOK (on-off keying) system, an FSO-OFDM system can resist atmospheric scattering and improve the spectrum utilization rate effectively. Due to the instability of the atmospheric channel, the system is affected by various factors, resulting in a high BER. The BCH code has a good error correcting ability, particularly for short- and medium-length codes, where its performance is close to the theoretical value. It can not only detect burst errors but also correct random errors. Therefore, the BCH code is applied to the system to reduce the system BER. Finally, a semi-physical simulation was conducted with MATLAB. The simulation results show that at a BER of 10^-2, the performance of OFDM is 4 dB better than that of OOK. In different weather conditions (rain, advection fog, dust days), at a BER of 10^-5, the performance of BCH (255,191) channel coding is 4~5 dB better than that of the uncoded system. All in all, OFDM technology and the BCH code can reduce the system BER.
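The error-correction idea the abstract relies on can be shown with the simplest binary BCH code, the single-error-correcting Hamming(7,4) code (BCH codes are not in the Python standard library): parity bits computed at the encoder let the decoder locate and flip one corrupted bit. This illustrates the principle only; it is not the (255,191) code used in the paper.

```python
# Single-error-correcting Hamming(7,4) code, the simplest binary BCH code.
# Parity bits sit at positions 1, 2, 4; a nonzero syndrome equals the
# 1-based position of the corrupted bit.

def encode(d):
    """4 data bits -> 7-bit codeword."""
    c = [0] * 8                      # index 0 unused
    c[3], c[5], c[6], c[7] = d
    c[1] = c[3] ^ c[5] ^ c[7]
    c[2] = c[3] ^ c[6] ^ c[7]
    c[4] = c[5] ^ c[6] ^ c[7]
    return c[1:]

def decode(r):
    """7 received bits -> 4 data bits, correcting up to one flipped bit."""
    c = [0] + list(r)
    s = ((c[4] ^ c[5] ^ c[6] ^ c[7]) * 4
         + (c[2] ^ c[3] ^ c[6] ^ c[7]) * 2
         + (c[1] ^ c[3] ^ c[5] ^ c[7]))
    if s:                            # nonzero syndrome locates the error
        c[s] ^= 1
    return [c[3], c[5], c[6], c[7]]

data = [1, 0, 1, 1]
cw = encode(data)
cw[2] ^= 1                           # channel flips one bit
assert decode(cw) == data            # the decoder recovers the data
```

The (255,191) code in the paper works on the same syndrome principle but over a larger field, correcting up to eight bit errors per block.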

  15. Performance of JPEG Image Transmission Using Proposed Asymmetric Turbo Code

    Directory of Open Access Journals (Sweden)

    Siddiqi Mohammad Umar

    2007-01-01

    Full Text Available This paper gives the results of a simulation study on the performance of JPEG image transmission over AWGN and Rayleigh fading channels, using typical and proposed asymmetric turbo codes for error control coding. The baseline JPEG algorithm is used to compress a QCIF "Suzie" image. The recursive systematic convolutional (RSC) encoder with generator polynomials (13/11) in decimal, and a 3G interleaver, are used for the typical WCDMA and CDMA2000 turbo codes. The proposed asymmetric turbo code uses generator polynomials (13/11; 13/9) in decimal and a code-matched interleaver. The effect of the interleaver in the proposed asymmetric turbo code is studied using the weight distribution and simulation. The simulation results and the performance bound for the proposed asymmetric turbo code, for the given frame length and code rate, with a Log-MAP decoder over the AWGN channel, are compared with those of the typical system. From the simulation results, it is observed that image transmission using the proposed asymmetric turbo code performs better than transmission with the typical system.
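A rate-1/2 RSC encoder with the (13/11)-in-decimal generators named above (13 = 1101b as the feedback polynomial, 11 = 1011b as the feedforward polynomial over three state bits) can be sketched as follows. A turbo encoder would combine two such constituent encoders through an interleaver, which is omitted here; the tap layout is an assumed reading of the generator notation, not code from the paper.

```python
# Rate-1/2 recursive systematic convolutional (RSC) encoder, generators
# (13/11) in decimal: feedback 13 = 1|101b, feedforward 11 = 1|011b over
# 3 state bits. One constituent encoder only; a turbo encoder adds a second
# encoder fed through an interleaver. Tap layout is an assumed reading.

def rsc_encode(bits, fb_taps=(1, 0, 1), fw_taps=(0, 1, 1)):
    """Return (systematic, parity) bit streams for the input bit list."""
    s = [0, 0, 0]                    # shift-register state
    parity = []
    for d in bits:
        # recursion: input xored with the feedback taps applied to the state
        a = d ^ (fb_taps[0] & s[0]) ^ (fb_taps[1] & s[1]) ^ (fb_taps[2] & s[2])
        # parity: recursion bit xored with feedforward taps on the state
        p = a ^ (fw_taps[0] & s[0]) ^ (fw_taps[1] & s[1]) ^ (fw_taps[2] & s[2])
        parity.append(p)
        s = [a, s[0], s[1]]          # shift the register
    return list(bits), parity

sys_bits, par_bits = rsc_encode([1, 0, 1, 1, 0])
print(sys_bits, par_bits)            # systematic stream echoes the input
```

The asymmetric construction in the paper pairs this encoder with a second one using a different feedforward polynomial (9 in decimal), which is what makes the two constituent codes unequal.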

  16. Predictive Bias and Sensitivity in NRC Fuel Performance Codes

    Energy Technology Data Exchange (ETDEWEB)

    Geelhood, Kenneth J.; Luscher, Walter G.; Senor, David J.; Cunningham, Mitchel E.; Lanning, Donald D.; Adkins, Harold E.

    2009-10-01

    The latest versions of the fuel performance codes FRAPCON-3 and FRAPTRAN were examined to determine whether the codes are intrinsically conservative. Each individual model and type of code prediction was examined and compared to the data that were used to develop the model. In addition, a brief literature search was performed to determine whether more recent data have become available since the original model development, for use in model comparison.

  17. Performance evaluation based on data from code reviews

    OpenAIRE

    Andrej, Sekáč

    2016-01-01

    Context. Modern code review tools such as Gerrit have made available great amounts of code review data from different open source projects as well as other commercial projects. Code reviews are used to keep the quality of produced source code under control, but the stored data could also be used for evaluation of the software development process. Objectives. This thesis uses machine learning methods for an approximation of a review expert's performance evaluation function. Due to limitations in ...

  18. A comparison of thermal algorithms of fuel rod performance code systems

    International Nuclear Information System (INIS)

    Park, C. J.; Park, J. H.; Kang, K. H.; Ryu, H. J.; Moon, J. S.; Jeong, I. H.; Lee, C. Y.; Song, K. C.

    2003-11-01

    The goal of fuel rod performance analysis is to identify the robustness of a fuel rod with its cladding material. Computer simulation of fuel rod performance has become an important part of designing and evaluating new nuclear fuels and claddings. To construct a computing code system for fuel rod performance, several algorithms of the existing fuel rod performance code systems are compared and summarized as preliminary work. Among several code systems, FRAPCON and FEMAXI for LWRs, ELESTRES for CANDU reactors, and LIFE for fast reactors are reviewed. The thermal algorithms of these codes, including their methodologies and subroutines, are investigated. This work will be utilized to construct a computing code system for dry process fuel rod performance.

  19. A comparison of thermal algorithms of fuel rod performance code systems

    Energy Technology Data Exchange (ETDEWEB)

    Park, C. J.; Park, J. H.; Kang, K. H.; Ryu, H. J.; Moon, J. S.; Jeong, I. H.; Lee, C. Y.; Song, K. C

    2003-11-01

    The goal of fuel rod performance analysis is to identify the robustness of a fuel rod with its cladding material. Computer simulation of fuel rod performance has become an important part of designing and evaluating new nuclear fuels and claddings. To construct a computing code system for fuel rod performance, several algorithms of the existing fuel rod performance code systems are compared and summarized as preliminary work. Among several code systems, FRAPCON and FEMAXI for LWRs, ELESTRES for CANDU reactors, and LIFE for fast reactors are reviewed. The thermal algorithms of these codes, including their methodologies and subroutines, are investigated. This work will be utilized to construct a computing code system for dry process fuel rod performance.

  20. Performance testing of thermal analysis codes for nuclear fuel casks

    International Nuclear Information System (INIS)

    Sanchez, L.C.

    1987-01-01

    In 1982 Sandia National Laboratories held the First Industry/Government Joint Thermal and Structural Codes Information Exchange and presented the initial stages of an investigation of thermal analysis computer codes for use in the design of nuclear fuel shipping casks. The objectives of the investigation were to (1) document publicly available computer codes, (2) assess code capabilities as determined from their user's manuals, and (3) assess code performance on cask-like model problems. Computer codes are required to handle the thermal phenomena of conduction, convection and radiation. Several of the available thermal computer codes were tested on a set of model problems to assess performance on cask-like problems. The solutions obtained with the computer codes for steady-state thermal analysis were in good agreement, while the solutions for transient thermal analysis differed slightly among the computer codes due to modeling differences.

  1. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  2. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  3. Interim performance specifications for conceptual waste-package designs for geologic isolation in salt repositories

    International Nuclear Information System (INIS)

    1983-06-01

    The interim performance specifications and data requirements presented apply to conceptual waste package designs for all waste forms which will be isolated in salt geologic repositories. The waste package performance specifications and data requirements respond to the waste package performance criteria. Subject areas treated include: containment and controlled release, operational period safety, criticality control, identification, and waste package performance testing requirements. This document was generated for use in the development of conceptual waste package designs in salt. It will be revised as additional data, analyses, and regulatory requirements become available.

  4. Performance Evaluation of Spectral Amplitude Codes for OCDMA PON

    DEFF Research Database (Denmark)

    Binti Othman, Maisara; Jensen, Jesper Bevensee; Zhang, Xu

    2011-01-01

    the MAI effects in OCDMA. The performance has been characterized through received optical power (ROP) sensitivity and dispersion tolerance assessments. The numerical results show that the ZCC code has a slightly better performance compared to the other two codes for the ROP and similar behavior against...

  5. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  6. Preserving Envelope Efficiency in Performance Based Code Compliance

    Energy Technology Data Exchange (ETDEWEB)

    Thornton, Brian A. [Thornton Energy Consulting (United States); Sullivan, Greg P. [Efficiency Solutions (United States); Rosenberg, Michael I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Baechler, Michael C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-06-20

    The City of Seattle 2012 Energy Code (Seattle 2014), one of the most progressive in the country, is under revision for its 2015 edition. Additionally, city personnel participate in the development of the next generation of the Washington State Energy Code and the International Energy Code. Seattle has pledged carbon neutrality by 2050 including buildings, transportation and other sectors. The United States Department of Energy (DOE), through Pacific Northwest National Laboratory (PNNL), provided technical assistance to Seattle in order to understand the implications of one potential direction for its code development: limiting trade-offs in which long-lived building envelope components that are less stringent than the prescriptive code envelope requirements are offset by better-than-code but shorter-lived lighting and heating, ventilation, and air-conditioning (HVAC) components through the total building performance modeled energy compliance path. Weaker building envelopes can permanently limit building energy performance even as lighting and HVAC components are upgraded over time, because retrofitting the envelope is less likely and more expensive. Weaker building envelopes may also increase the required size, cost and complexity of HVAC systems and may adversely affect occupant comfort. This report presents the results of this technical assistance. The use of modeled energy code compliance to trade off envelope components against shorter-lived building components is not unique to Seattle, and the lessons and possible solutions described in this report have implications for other jurisdictions and energy codes.

  7. Fuel pin design algorithm for conceptual design studies

    International Nuclear Information System (INIS)

    Uselman, J.P.

    1979-01-01

    Two models are available which are currently verified over part of the required range and which are adaptable as algorithms for the complete range. Fuel thermal performance is described by the HEDL SIEX model. Cladding damage and total deformation are determined by the GE GRO-II structural analysis code. A preliminary fuel pin performance model for the analysis of (U,Pu)O2 pins in the COROPT core conceptual design system has been constructed by combining the key elements of SIEX and GRO-II. This memo describes the resulting pin performance model and its interfacing with the COROPT system. Some exemplary results are presented.

  8. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    Energy Technology Data Exchange (ETDEWEB)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K. [Cray Inc., St. Paul, MN 55101 (United States); Porter, D. [Minnesota Supercomputing Institute for Advanced Computational Research, Minneapolis, MN USA (United States); O’Neill, B. J.; Nolting, C.; Donnert, J. M. F.; Jones, T. W. [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States); Edmon, P., E-mail: pjm@cray.com, E-mail: nradclif@cray.com, E-mail: kkandalla@cray.com, E-mail: oneill@astro.umn.edu, E-mail: nolt0040@umn.edu, E-mail: donnert@ira.inaf.it, E-mail: twj@umn.edu, E-mail: dhp@umn.edu, E-mail: pedmon@cfa.harvard.edu [Institute for Theory and Computation, Center for Astrophysics, Harvard University, Cambridge, MA 02138 (United States)

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  9. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    International Nuclear Information System (INIS)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.; Porter, D.; O’Neill, B. J.; Nolting, C.; Donnert, J. M. F.; Jones, T. W.; Edmon, P.

    2017-01-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  10. On the Performance of the Cache Coding Protocol

    DEFF Research Database (Denmark)

    Maboudi, Behnaz; Sehat, Hadi; Pahlevani, Peyman

    2018-01-01

    Network coding approaches typically consider an unrestricted recoding of coded packets in the relay nodes to increase performance. However, this can expose the system to pollution attacks that cannot be detected during transmission, until the receivers attempt to recover the data. To prevent thes...

  11. A conceptual model to improve performance in virtual teams

    Directory of Open Access Journals (Sweden)

    Shopee Dube

    2016-09-01

    Full Text Available Background: The vast improvement in communication technologies and sophisticated project management tools, methods and techniques has allowed geographically and culturally diverse groups to operate and function in a virtual environment. To succeed in this virtual environment where time and space are becoming increasingly irrelevant, organisations must define new ways of implementing initiatives. This virtual environment phenomenon has brought about the formation of virtual project teams that allow organisations to harness the skills and know-how of the best resources, irrespective of their location. Objectives: There are no clear guidelines on the performance criteria for managing virtual project teams. The aim of this article was therefore to investigate performance criteria and develop a conceptual model which can be applied to enhance the success of virtual project teams. Method: A qualitative research methodology was used in this article. The purpose of the content analysis was to explore the literature to understand the concept of performance in virtual project teams and to summarise the findings of the literature reviewed. Results: The research identified a set of performance criteria for virtual project teams as follows: leadership, trust, communication, team cooperation, reliability, motivation, comfort and social interaction. These were used to conceptualise the model. Conclusion: The conceptual model can be used in a holistic way to determine the overall performance of the virtual project team, but each factor can be analysed individually to determine its impact on the overall performance. Knowledge of the performance criteria for virtual project teams could aid project managers in enhancing the success of these teams and in taking a different approach to better manage and coordinate them.

  12. The fuel performance code FUTURE

    International Nuclear Information System (INIS)

    Ronchi, C.; Van de Laar, J.

    1988-01-01

    The paper describes the LWR version of the fuel performance code FUTURE, which was recently developed to calculate the fuel response (swelling, cladding deformation, release) to reactor transient conditions, starting from a broad-based description of the processes of major concern. The main physical models assumed are presented together with the scheme of the computer program

  13. Development of LWR fuel performance code FEMAXI-6

    International Nuclear Information System (INIS)

    Suzuki, Motoe

    2006-01-01

    LWR fuel performance code FEMAXI-6 (Finite Element Method in AXIs-symmetric system) is a representative fuel analysis code in Japan. Its development history, background, design ideas, model features, and future plans are stated, together with the characteristic performance of LWR fuel and of the analysis code, what a model is, the development history of FEMAXI, the use of the FEMAXI code, the fuel models, and the special features of the FEMAXI model. As examples of analysis, PCMI (pellet-clad mechanical interaction), fission gas release, gap bonding, and fission gas bubble swelling are reported. The thermal analysis and dynamic analysis system of FEMAXI-6, the function block at one time step of FEMAXI-6, an analytical example of PCMI in the output increase test by FEMAXI-III, an analysis of fission gas release in the Halden reactor by FEMAXI-V, a comparison of the fuel center temperature in the Halden reactor, and an analysis of the change of fuel rod diameter in high burn-up BWR fuel are shown. (S.Y.)

  14. Conceptual design for accelerator-driven sodium-cooled sub-critical transmutation reactors using scale laws

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kwang Gu; Chang, Soon Heung [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    The feasibility study on a conceptual design methodology for accelerator-driven sodium-cooled sub-critical transmutation reactors has been conducted to optimize the design parameters from the scale laws and to validate the reactor performance with the integrated code system. A 1000 MWth sodium-cooled sub-critical transmutation reactor has been scaled and verified through the methodology in this paper, with reference to the Advanced Liquid Metal Reactor (ALMR). A Pb-Bi target material and a partitioned fuel are the liquid phases, and they are cooled by the circulation of secondary Pb-Bi coolant and by primary sodium coolant, respectively. Overall key design parameters are generated from the scale laws, and they are improved and validated by the integrated code system. The Integrated Code System (ICS) consists of the LAHET, HMCNP, ORIGEN2, and COMMIX codes and some files. Through ICS, the target region, the core region, and thermal-hydraulic related regions are analyzed once-through. Results of the conceptual design are attached in this paper. 5 refs., 4 figs., 1 tab. (Author)

  15. Performance Analysis of Optical Code Division Multiplex System

    Science.gov (United States)

    Kaur, Sandeep; Bhatia, Kamaljit Singh

    2013-12-01

    This paper presents the Pseudo-Orthogonal Code generator for the Optical Code Division Multiple Access (OCDMA) system, which helps to reduce the need for bandwidth expansion and improves spectral efficiency. In this paper we investigate the performance of a multi-user OCDMA system to achieve data rates of more than 1 Tbit/s.

  16. Preliminary conceptual design and analysis on KALIMER reactor structures

    International Nuclear Information System (INIS)

    Kim, Jong Bum

    1996-10-01

    The objectives of this study are to perform preliminary conceptual design and structural analyses for KALIMER (Korea Advanced Liquid Metal Reactor) reactor structures to assess the design feasibility and to identify detailed analysis requirements. Since KALIMER thermal-hydraulic system analysis results and neutronic analysis results are not available at present, only limited preliminary structural analyses have been performed with assumptions on the thermal loads. The responses of the reactor vessel and reactor internal structures were based on the temperature difference between core inlet and outlet and on engineering judgments. Thermal stresses from the assumed temperatures were calculated using the ANSYS code through parametric finite element heat transfer and elastic stress analyses. While, based on the results of the preliminary conceptual design and structural analyses, the ASME Code limits for the reactor structures were satisfied for the pressure boundary, the need for inelastic analyses was indicated for evaluation of the design adequacy of the support barrel and the thermal liner. To reduce thermal striping effects in the bottom area of the UIS due to up-flowing sodium from the reactor core, installation of an Inconel-718 liner on the bottom area was proposed, and to mitigate thermal shock loads, an additional stainless steel liner was also suggested. The design feasibility of these measures was validated through simplified preliminary analyses. In the conceptual design phase, these results will be implemented in the design of the reactor structures and the reactor internal structures in conjunction with the thermal-hydraulic, neutronic, and seismic analyses results. 4 tabs., 24 figs., 4 refs. (Author)

  17. Code structure for U-Mo fuel performance analysis in high performance research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Gwan Yoon; Cho, Tae Won; Lee, Chul Min; Sohn, Dong Seong [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of); Lee, Kyu Hong; Park, Jong Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    A performance analysis model applicable to research reactor fuel is being developed with available models describing fuel performance phenomena observed in in-pile tests. We established the calculation algorithm and scheme to best predict fuel performance using a radio-thermo-mechanically coupled system that considers fuel swelling, interaction layer growth, pore formation in the fuel meat, creep fuel deformation, mass relocation, etc. In this paper, we present the general structure of the performance analysis code for typical research reactor fuel and advanced features such as a model to predict fuel failure induced by a combination of breakaway swelling and pore growth in the fuel meat. A thermo-mechanical code dedicated to the modeling of U-Mo dispersion fuel plates is under development in Korea to satisfy the demand for advanced performance analysis and safety assessment of the plates. The major physical phenomena during irradiation are considered in the code, such as interaction layer formation by fuel-matrix interdiffusion, fission-induced swelling of fuel particles, mass relocation by fission-induced stress, and pore formation at the interface between the reaction product and the Al matrix.

  18. Structure of fuel performance audit code for SFR metal fuel

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yong Sik; Kim, Hyo Chan [KAERI, Daejeon (Korea, Republic of); Jeong, Hye Dong; Shin, An Dong; Suh, Nam Duk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2012-10-15

    A Sodium-Cooled Fast Reactor (SFR) is a promising option to solve the spent fuel problem, but there are still many technical issues to be resolved before an SFR can be commercialized. One of these issues is the development of an advanced fuel that can address the safety and the economic issues at the same time. Since the nuclear fuel is the first barrier against radioactive isotope release, the fuel's integrity must be secured. In the Korea Institute of Nuclear Safety (KINS), a new project has been started to develop regulatory technology for the SFR system, including the fuel area. To evaluate fuel integrity and safety during irradiation, a fuel performance code must be used for audit calculations. To develop the new code system, the code structure design and its requirements need to be studied. Various performance models and code systems are reviewed and their characteristics are analyzed in this paper. Based on this study, the fundamental performance models are deduced and basic code requirements and structure are established.

  19. Performance Analysis of New Binary User Codes for DS-CDMA Communication

    Science.gov (United States)

    Usha, Kamle; Jaya Sankar, Kottareddygari

    2016-03-01

    This paper analyzes new binary spreading codes through their correlation properties and also presents their performance over the additive white Gaussian noise (AWGN) channel. The proposed codes are constructed using gray and inverse gray codes. In this paper, an n-bit gray codeword appended with its n-bit inverse gray codeword to construct 2n-length binary user codes is discussed. Like Walsh codes, these binary user codes are available in sizes of powers of two; additionally, code sets of length 6 and their even multiples are also available. The simple construction technique and the generation of code sets of different sizes are the salient features of the proposed codes. Walsh codes and Gold codes are considered for comparison in this paper, as these are popularly used for synchronous and asynchronous multi-user communications, respectively. In the current work the auto- and cross-correlation properties of the proposed codes are compared with those of Walsh codes and Gold codes. Performance of the proposed binary user codes for both synchronous and asynchronous direct-sequence CDMA communication over the AWGN channel is also discussed in this paper. The proposed binary user codes are found to be suitable for both synchronous and asynchronous DS-CDMA communication.
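
    The construction described above can be sketched as follows. This is one plausible reading, not necessarily the authors' exact generator, and the helper names are ours: each user code is the n-bit binary-reflected gray codeword of an index, followed by the n-bit inverse-gray (gray-to-binary) codeword of the same index, giving 2^n codes of length 2n.

```python
def gray(i: int) -> int:
    """Binary-reflected Gray code of integer i."""
    return i ^ (i >> 1)

def inverse_gray(g: int) -> int:
    """Decode a Gray codeword back to its binary index (the 'inverse' map)."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def user_code(i: int, n: int) -> str:
    """2n-chip binary user code: the n-bit Gray codeword of index i
    followed by the n-bit inverse-Gray codeword of i."""
    return f"{gray(i):0{n}b}{inverse_gray(i):0{n}b}"

# A full set of 2^n codes of length 2n; n = 3 gives the length-6
# code sets mentioned in the abstract.
codes = [user_code(i, 3) for i in range(8)]
print(codes)
```

    With n = 3 this yields eight distinct length-6 codes, matching the length-6 sets the abstract highlights as unavailable with plain Walsh codes.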

  20. A fuel performance code TRUST VIc and its validation

    Energy Technology Data Exchange (ETDEWEB)

    Ishida, M; Kogai, T [Nippon Nuclear Fuel Development Co. Ltd., Oarai, Ibaraki (Japan)

    1997-08-01

    This paper describes a fuel performance code TRUST V1c developed to analyze thermal and mechanical behavior of LWR fuel rod. Submodels in the code include FP gas models depicting gaseous swelling, gas release from pellet and axial gas mixing. The code has FEM-based structure to handle interaction between thermal and mechanical submodels brought by the gas models. The code is validated against irradiation data of fuel centerline temperature, FGR, pellet porosity and cladding deformation. (author). 9 refs, 8 figs.

  1. A fuel performance code TRUST VIc and its validation

    International Nuclear Information System (INIS)

    Ishida, M.; Kogai, T.

    1997-01-01

    This paper describes a fuel performance code TRUST V1c developed to analyze thermal and mechanical behavior of LWR fuel rod. Submodels in the code include FP gas models depicting gaseous swelling, gas release from pellet and axial gas mixing. The code has FEM-based structure to handle interaction between thermal and mechanical submodels brought by the gas models. The code is validated against irradiation data of fuel centerline temperature, FGR, pellet porosity and cladding deformation. (author). 9 refs, 8 figs

  2. The NMC code: conduct, performance and ethics.

    Science.gov (United States)

    Goldsmith, Jan

    The Code: Standards of Conduct, Performance and Ethics for Nurses and Midwives is a set of key principles that should underpin the practice of all nurses and midwives, and remind them of their professional responsibilities. It is not just a tool used in fitness-to-practise cases; it should be used to guide daily practice for all nurses and midwives. Alongside other standards, guidance and advice from the NMC, the code should be used to support professional development.

  3. GEOQUIMICO : an interactive tool for comparing sorption conceptual models (surface complexation modeling versus K[D])

    International Nuclear Information System (INIS)

    Hammond, Glenn E.; Cygan, Randall Timothy

    2007-01-01

    Within reactive geochemical transport, several conceptual models exist for simulating sorption processes in the subsurface. Historically, the K_D approach has been the method of choice due to its ease of implementation within a reactive transport model and straightforward comparison with experimental data. However, for modeling complex sorption phenomena (e.g. sorption of radionuclides onto mineral surfaces), this approach does not systematically account for variations in location, time, or chemical conditions, and more sophisticated methods such as a surface complexation model (SCM) must be utilized. It is critical to determine which conceptual model to use, that is, when the material variation becomes important to regulatory decisions. The geochemical transport tool GEOQUIMICO has been developed to assist in this decision-making process. GEOQUIMICO provides a user-friendly framework for comparing the accuracy and performance of sorption conceptual models. The model currently supports the K_D and SCM conceptual models. The code is written in the object-oriented Java programming language to facilitate model development and improve code portability. The basic theory underlying geochemical transport and the sorption conceptual models noted above is presented in this report. Explanations are provided of how these physicochemical processes are implemented in GEOQUIMICO, and a brief verification study comparing GEOQUIMICO results to data found in the literature is given
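
    The trade-off between the two conceptual models can be illustrated with a toy isotherm comparison. This is not GEOQUIMICO's actual model set, and all parameter values here are hypothetical: a constant-K_D model stays linear in aqueous concentration, while a site-limited, Langmuir-type isotherm, used here as a crude stand-in for the saturation behavior an SCM captures, departs from it once sorption sites fill.

```python
def sorbed_kd(c, kd=10.0):
    """Linear K_D model: sorbed mass proportional to aqueous concentration c.
    (kd is a hypothetical distribution coefficient.)"""
    return kd * c

def sorbed_langmuir(c, k=10.0, site_density=1.0):
    """Site-limited (Langmuir-type) isotherm: a crude stand-in for the
    saturation behavior a surface complexation model captures."""
    return site_density * k * c / (1.0 + k * c)

# K_D is calibrated to the low-concentration limit (kd = k * site_density),
# so the two models agree at trace levels but diverge as sites fill.
for c in (1e-3, 1e-1, 1.0):
    print(c, sorbed_kd(c), sorbed_langmuir(c))
```

    The divergence at high concentration is exactly the regime where the abstract argues a constant K_D stops being defensible.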

  4. On the performance of diagonal lattice space-time codes

    KAUST Repository

    Abediseid, Walid; Alouini, Mohamed-Slim

    2013-01-01

    There has been tremendous work done on designing space-time codes for the quasi-static multiple-input multiple output (MIMO) channel. All the coding design up-to-date focuses on either high-performance, high rates, low complexity encoding

  5. The METEOR/TRANSURANUS fuel performance code

    International Nuclear Information System (INIS)

    Struzik, C.; Guerin, Y.

    1996-01-01

    The first calculations for the FUMEX exercise were performed using version 1.1 of the METEOR/TRANSURANUS code. Since then, important improvements have been implemented in several models. In its present state, the code describes fuel rod behaviour in standard PWR conditions. Its validity extends to UO2 and MOX fuels clad in Zircaloy-4. Power transient calculations for UO2 and Gd-doped fuel calculations are possible, but further developments are in progress, and the applications will be fully qualified in version 2.0. A considerable effort is made to replace semi-empirical models with models that have a sounder physical basis. (authors). 14 refs

  6. Iterative optimization of performance libraries by hierarchical division of codes

    International Nuclear Information System (INIS)

    Donadio, S.

    2007-09-01

    The increasing complexity of hardware features incorporated in modern processors makes high-performance code generation very challenging. Library generators such as ATLAS, FFTW and SPIRAL overcome this issue by empirically searching the space of possible program versions for the one that performs best. This thesis explores a fully automatic solution to adapt a compute-intensive application to the target architecture. By mimicking complex sequences of transformations useful for optimizing real codes, we show that generative programming is a practical tool to implement a new hierarchical compilation approach for the generation of high-performance code relying on the use of state-of-the-art compilers. As opposed to ATLAS, this approach is not application-dependent but can be applied to fairly generic loop structures. Our approach relies on the decomposition of the original loop nest into simpler kernels. These kernels are much simpler to optimize, and furthermore, using such codes makes the performance trade-off problem much simpler to express and to solve. Finally, we propose a new approach for the generation of performance libraries based on this decomposition method. We show that our method generates high-performance libraries, in particular for BLAS. (author)
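
    The decomposition idea can be sketched on a familiar loop nest. This is an illustrative Python example, not the thesis's code generator: the original triple loop of a matrix multiply is split into a small fixed-size kernel, the unit that would be optimized and searched over in isolation, plus a driver that invokes it over tiles.

```python
def matmul_naive(A, B):
    """Reference: the original, undecomposed loop nest."""
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(n):
                s += A[i][k] * B[k][j]
            C[i][j] = s
    return C

def kernel(A, B, C, i0, j0, k0, t):
    """Small fixed-size kernel: the unit a library generator would
    specialize and tune in isolation."""
    for i in range(i0, i0 + t):
        for j in range(j0, j0 + t):
            s = C[i][j]
            for k in range(k0, k0 + t):
                s += A[i][k] * B[k][j]
            C[i][j] = s

def matmul_tiled(A, B, t=2):
    """Driver: the original loop nest decomposed into kernel invocations.
    (Assumes n is a multiple of the tile size t.)"""
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    for i0 in range(0, n, t):
        for j0 in range(0, n, t):
            for k0 in range(0, n, t):
                kernel(A, B, C, i0, j0, k0, t)
    return C
```

    In Python the decomposition changes nothing about performance; the point is structural: the kernel is the small, regular code a generator can tune, while the driver preserves the semantics of the original nest.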

  7. Automatic performance estimation of conceptual temperature control system design for rapid development of real system

    International Nuclear Information System (INIS)

    Jang, Yu Jin

    2013-01-01

    This paper presents an automatic performance estimation scheme of conceptual temperature control system with multi-heater configuration prior to constructing the physical system for achieving rapid validation of the conceptual design. An appropriate low-order discrete-time model, which will be used in the controller design, is constructed after determining several basic factors including the geometric shape of controlled object and heaters, material properties, heater arrangement, etc. The proposed temperature controller, which adopts the multivariable GPC (generalized predictive control) scheme with scale factors, is then constructed automatically based on the above model. The performance of the conceptual temperature control system is evaluated by using a FEM (finite element method) simulation combined with the controller.

  8. Automatic performance estimation of conceptual temperature control system design for rapid development of real system

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Yu Jin [Dongguk University, GyeongJu (Korea, Republic of)

    2013-07-15

    This paper presents an automatic performance estimation scheme of conceptual temperature control system with multi-heater configuration prior to constructing the physical system for achieving rapid validation of the conceptual design. An appropriate low-order discrete-time model, which will be used in the controller design, is constructed after determining several basic factors including the geometric shape of controlled object and heaters, material properties, heater arrangement, etc. The proposed temperature controller, which adopts the multivariable GPC (generalized predictive control) scheme with scale factors, is then constructed automatically based on the above model. The performance of the conceptual temperature control system is evaluated by using a FEM (finite element method) simulation combined with the controller.
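
    A heavily simplified sketch of the idea follows. This is not the article's multivariable GPC, which optimizes over a prediction horizon with scale factors; it is a one-step-ahead predictive controller for a hypothetical first-order discrete-time thermal model, with actuator saturation, to show how a low-order model drives the control computation.

```python
def simulate_predictive_control(setpoint, steps, a=0.9, b=0.5,
                                t_amb=20.0, u_max=50.0):
    """One-step-ahead predictive control of a hypothetical first-order
    thermal model T[k+1] = a*T[k] + b*u[k] + (1 - a)*t_amb.
    All parameter values are illustrative, not from the article."""
    T = t_amb
    for _ in range(steps):
        # Choose the heater input so the one-step prediction lands on
        # the setpoint, then clip to the actuator limits.
        u = (setpoint - a * T - (1.0 - a) * t_amb) / b
        u = max(0.0, min(u_max, u))
        T = a * T + b * u + (1.0 - a) * t_amb
    return T

print(simulate_predictive_control(100.0, 20))
```

    A real GPC would minimize a cost over many future steps and handle multiple heaters jointly; the saturation-then-converge behavior here is only the simplest instance of model-based temperature control.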

  9. Performance enhancement of optical code-division multiple-access systems using transposed modified Walsh code

    Science.gov (United States)

    Sikder, Somali; Ghosh, Shila

    2018-02-01

    This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.

  10. Optimizing fusion PIC code performance at scale on Cori Phase 2

    Energy Technology Data Exchange (ETDEWEB)

    Koskela, T. S.; Deslippe, J.

    2017-07-23

    In this paper we present the results of optimizing the performance of the gyrokinetic full-f fusion PIC code XGC1 on the Cori Phase Two Knights Landing system. The code has undergone substantial development to enable the use of vector instructions in its most expensive kernels within the NERSC Exascale Science Applications Program. We study the single-node performance of the code on an absolute scale using the roofline methodology to guide optimization efforts. We have obtained 2x speedups in single node performance due to enabling vectorization and performing memory layout optimizations. On multiple nodes, the code is shown to scale well up to 4000 nodes, near half the size of the machine. We discuss some communication bottlenecks that were identified and resolved during the work.

  11. Performance Analysis for Cooperative Communication System with QC-LDPC Codes Constructed with Integer Sequences

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2015-01-01

    Full Text Available This paper presents four different integer sequences for constructing quasi-cyclic low-density parity-check (QC-LDPC) codes within a mathematical framework. The paper introduces the coding principle and the encoding procedure. The QC-LDPC codes constructed from the four integer sequences are compared with LDPC codes obtained using the PEG algorithm, array codes, and MacKay codes, respectively. Then, the integer-sequence QC-LDPC codes are used in coded cooperative communication. Simulation results show that the integer-sequence QC-LDPC codes are effective, and their overall performance is better than that of the other types of LDPC codes in coded cooperative communication. The QC-LDPC codes constructed from the Dayan integer sequence show the best performance.
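
    The quasi-cyclic structure these codes share can be sketched as follows. This is an illustrative construction with a toy base matrix; in the paper the shift exponents come from the four integer sequences, which are not reproduced here. Each entry of a base matrix is expanded into a z-by-z circulant permutation matrix, or an all-zero block for a -1 entry.

```python
def cpm(shift, z):
    """z-by-z circulant permutation matrix: the identity with its
    columns cyclically shifted by `shift`."""
    return [[1 if (c - r) % z == shift % z else 0 for c in range(z)]
            for r in range(z)]

def qc_ldpc_H(base, z):
    """Expand a base matrix of shift exponents (-1 = all-zero block)
    into a binary QC-LDPC parity-check matrix."""
    H = []
    for brow in base:
        blocks = [cpm(s, z) if s >= 0 else [[0] * z for _ in range(z)]
                  for s in brow]
        for r in range(z):
            H.append([b[r][c] for b in blocks for c in range(z)])
    return H

# Toy base matrix of shift exponents (hypothetical, not from the paper).
base = [[0, 1, 2, 3],
        [1, 2, 3, 0]]
H = qc_ldpc_H(base, 5)
```

    The quasi-cyclic structure is what makes encoding and hardware implementation cheap: the whole parity-check matrix is specified by the small base matrix of shifts.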

  12. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corp., Tokyo (Japan)

    2012-07-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of this description, gives detailed explanations of operation method of FEMAXI-7 code and its related codes, methods of Input/Output, methods of source code modification, features of subroutine modules, and internal variables in a specific manner in order to facilitate users to perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  13. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2012-07-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of this description, gives detailed explanations of operation method of FEMAXI-7 code and its related codes, methods of Input/Output, methods of source code modification, features of subroutine modules, and internal variables in a specific manner in order to facilitate users to perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  14. FLASH: A finite element computer code for variably saturated flow

    International Nuclear Information System (INIS)

    Baca, R.G.; Magnuson, S.O.

    1992-05-01

    A numerical model was developed for use in performance assessment studies at the INEL. The numerical model, referred to as the FLASH computer code, is designed to simulate two-dimensional fluid flow in fractured-porous media. The code is specifically designed to model variably saturated flow in an arid site vadose zone and saturated flow in an unconfined aquifer. In addition, the code also has the capability to simulate heat conduction in the vadose zone. This report presents the following: a description of the conceptual framework and mathematical theory; derivations of the finite element techniques and algorithms; computational examples that illustrate the capability of the code; and input instructions for the general use of the code. The FLASH computer code is aimed at providing environmental scientists at the INEL with a predictive tool for the subsurface water pathway. This numerical model is expected to be widely used in performance assessments for: (1) the Remedial Investigation/Feasibility Study process and (2) compliance studies required by US Department of Energy Order 5820.2A
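
    The finite element machinery underlying such a code can be illustrated on the simplest possible analog. This is a sketch only; FLASH itself treats two-dimensional variably saturated flow, whereas the example below solves one-dimensional steady heat conduction with linear elements, assembling a tridiagonal system and solving it with the Thomas algorithm.

```python
def fem_1d_conduction(n_elem, length, k, t_left, t_right):
    """Linear finite elements for -d/dx(k dT/dx) = 0 on [0, length]
    with Dirichlet temperatures at both ends; returns nodal temperatures."""
    n = n_elem + 1
    h = length / n_elem
    # Assemble the tridiagonal stiffness matrix: each element contributes
    # (k/h) * [[1, -1], [-1, 1]] to its two nodes.
    a = [0.0] * n; b = [0.0] * n; c = [0.0] * n; rhs = [0.0] * n
    for e in range(n_elem):
        b[e] += k / h; b[e + 1] += k / h
        c[e] += -k / h; a[e + 1] += -k / h
    # Impose the Dirichlet boundary conditions.
    b[0] = 1.0; c[0] = 0.0; rhs[0] = t_left
    b[-1] = 1.0; a[-1] = 0.0; rhs[-1] = t_right
    # Thomas algorithm: forward elimination, then back substitution.
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        rhs[i] -= m * rhs[i - 1]
    T = [0.0] * n
    T[-1] = rhs[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        T[i] = (rhs[i] - c[i] * T[i + 1]) / b[i]
    return T

# Linear profile between the boundary temperatures, as the exact
# solution of steady conduction with constant conductivity requires.
print(fem_1d_conduction(4, 1.0, 2.0, 0.0, 100.0))
```

    Variably saturated flow replaces the constant conductivity with a nonlinear, saturation-dependent one, which is what makes codes like FLASH substantially harder than this sketch.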

  15. Organizational Culture and Safety Performance in the Manufacturing Companies in Malaysia: A Conceptual Analysis

    OpenAIRE

    Ong Choon Hee; Lim Lee Ping

    2014-01-01

    The purpose of this paper is to provide a conceptual analysis of organizational culture and safety performance in the manufacturing companies in Malaysia. Our conceptual analysis suggests that manufacturing companies that adopt group culture or hierarchical culture are more likely to demonstrate safety compliance and safety participation. Manufacturing companies that adopt rational culture or developmental culture are less likely to demonstrate safety compliance and safety participation. Give...

  16. The JAERI code system for evaluation of BWR ECCS performance

    International Nuclear Information System (INIS)

    Kohsaka, Atsuo; Akimoto, Masayuki; Asahi, Yoshiro; Abe, Kiyoharu; Muramatsu, Ken; Araya, Fumimasa; Sato, Kazuo

    1982-12-01

    Development of separate computer code systems for BWR and PWR ECCS performance evaluation has been conducted since 1973, taking into account the differences in the reactor cooling system, core structure and ECCS. The first version of the BWR code system, whose development started earlier than that of the PWR, has been completed. The BWR code system is designed to provide computational tools to analyze all phases of LOCAs and to evaluate the performance of the ECCS, including an ''Evaluation Model (EM)'' feature in compliance with the requirements of the current Japanese Evaluation Guideline of ECCS. The BWR code system could be used for licensing purposes, i.e. for ECCS performance evaluation or audit calculations to cross-examine the methods and results of applicants or vendors. The BWR code system presented in this report comprises several computer codes, each of which analyzes a particular phase of a LOCA or a system blowdown depending on the range of LOCAs, i.e. large and small breaks in a variety of locations in the reactor system. The system includes ALARM-B1, HYDY-B1 and THYDE-B1 for analysis of the system blowdown for various break sizes, THYDE-B-REFLOOD for analysis of the reflood phase, and SCORCH-B2 for the calculation of the fuel assembly hot plane temperature. When multiple codes are used to analyze a broad range of LOCAs as stated above, it is very important to evaluate the adequacy of, and consistency between, the codes used to cover the entire break spectrum. The system consistency together with the system performance is discussed for a large commercial BWR. (author)

  17. A comparative evaluation of NDR and PSAR using the CASMO-3/MASTER code system

    International Nuclear Information System (INIS)

    Sim, Jeoung Hun; Kim, Han Gon

    2009-01-01

    In order to validate nuclear design data such as the nuclear design report (NDR) and data in the preliminary (or final) safety analysis report (PSAR/FSAR), and to use the data for the conceptual design of new plants, the CASMO-3/MASTER code system was selected as the utility code. The nuclear design of OPR1000 and APR1400 is performed with the DIT/ROCS code system. In contrast with this design code system, the accuracy of the CASMO-3/MASTER code system has not been verified. Relatively little design data has been calculated by the CASMO-3/MASTER code system for OPR1000 and APR1400, and a bias system has not been developed yet. As such, validation of the performance of the CASMO-3/MASTER code system is necessary. In order to validate the performance of the CASMO-3/MASTER code system and to develop a calculation methodology, a comparative evaluation with the NDR of Ulchin unit 4, cycle 1 (U4C1) and the PSAR of Shinkori units 3 and 4 is carried out. The results of this evaluation are presented in this paper

  18. Performance analysis of LDPC codes on OOK terahertz wireless channels

    International Nuclear Information System (INIS)

    Liu Chun; Wang Chang; Cao Jun-Cheng

    2016-01-01

    Atmospheric absorption, scattering, and scintillation are the major causes of deterioration in the transmission quality of terahertz (THz) wireless communications. An error control coding scheme based on low-density parity-check (LDPC) codes with a soft-decision decoding algorithm is proposed to improve the bit-error-rate (BER) performance of an on-off keying (OOK) modulated THz signal through the atmospheric channel. The THz wave propagation characteristics and the channel model in the atmosphere are set up. Numerical simulations validate the strong performance of LDPC codes against atmospheric fading and demonstrate their huge potential for future ultra-high-speed (beyond Gbps) THz communications. (paper)

  19. Performance enhancement of successive interference cancellation scheme based on spectral amplitude coding for optical code-division multiple-access systems using Hadamard codes

    Science.gov (United States)

    Eltaif, Tawfig; Shalaby, Hossam M. H.; Shaari, Sahbudin; Hamarsheh, Mohammad M. N.

    2009-04-01

A successive interference cancellation scheme is applied to optical code-division multiple-access (OCDMA) systems with spectral amplitude coding (SAC). A detailed analysis of this system, with Hadamard codes used as signature sequences, is presented. The system can easily remove the effect of the strongest signal at each stage of the cancellation process. In addition, a simulation of the proposed system is performed in order to validate the theoretical results. The system shows a small bit error rate for a large number of active users compared to the SAC OCDMA system. Our results reveal that the proposed system is efficient in eliminating the effect of multiple-user interference and in enhancing the overall performance.
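For context, the Hadamard signature sequences mentioned above can be generated with the Sylvester construction; in their unipolar (0/1) form, every code of length N (excluding the all-ones first row) has weight N/2, and any two distinct codes overlap in exactly N/4 chips — the fixed in-phase cross-correlation that SAC detection exploits. A minimal sketch (not the authors' code):

```python
def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = [[1]]
    while len(H) < n:
        H = ([row + row for row in H] +
             [row + [-v for v in row] for row in H])
    return H

def unipolar(row):
    """Map a +/-1 Hadamard row to the 0/1 spectral code used in SAC."""
    return [(v + 1) // 2 for v in row]
```

For N = 8, dropping the all-ones first row leaves 7 signature codes of weight 4 with pairwise in-phase cross-correlation 2 (= N/4).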

  20. Structured Review of Code Clone Literature

    NARCIS (Netherlands)

    Hordijk, W.T.B.; Ponisio, Laura; Wieringa, Roelf J.

    2008-01-01

    This report presents the results of a structured review of code clone literature. The aim of the review is to assemble a conceptual model of clone-related concepts which helps us to reason about clones. This conceptual model unifies clone concepts from a wide range of literature, so that findings

  1. Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design

    Science.gov (United States)

    Iqbal, Liaquat Ullah

An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher-fidelity Computer Aided Design, Engineering and Manufacturing (CAD, CAE and CAM) tools such as CATIA, FLUENT, ANSYS and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low- to medium-fidelity codes, such as the aerodynamic panel code CMARC and sizing and constraint analysis codes, thus providing multi-fidelity capabilities to the aircraft designer. The higher-fidelity design information from the CAD and CAE tools for the geometry, aerodynamics, structural and environmental performance is provided for the application of structured design methods such as Quality Function Deployment (QFD) and Pugh's Method. The higher-fidelity tools bring the quantitative aspects of a design, such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift-over-drag ratio, and structural weight, as well as the qualitative aspects, such as external geometry definition, internal layout, and coloring scheme, early into the design process. The performance and safety risks involved with new technologies can be reduced by modeling and assessing their impact on the performance of the aircraft more accurately. The methodology also enables the design and evaluation of novel concepts such as the blended wing body (BWB) and hybrid wing body (HWB) concepts. Higher-fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of the claims for performance gains in aerodynamics and ascertain the risks of structural failure due to the different pressure distribution in the fuselage as compared with the tube-and-wing design. The higher-fidelity aerodynamics and structural models can also lead to better cost estimates that help reduce the financial risks.

  2. Conceptual study of advanced PWR core design

    International Nuclear Information System (INIS)

    Kim, Young Jin; Chang, Moon Hee; Kim, Keung Ku; Joo, Hyung Kuk; Kim, Young Il; Noh, Jae Man; Hwang, Dae Hyun; Kim, Taek Kyum; Yoo, Yon Jong.

    1997-09-01

The purpose of this project is to develop and verify core design concepts with enhanced safety and economy, and the associated methodologies for core analyses. From a study of the state-of-the-art of foreign advanced reactor cores, we developed core concepts such as a soluble-boron-free, highly convertible and enhanced-safety core loaded with semi-tight-lattice hexagonal fuel assemblies. To analyze this hexagonal core, we developed and verified neutronic and T/H analysis methodologies. The HELIOS code was adopted as the assembly code, and the HEXFEM code was developed for hexagonal core analysis. Based on experimental data in hexagonal lattices and the COBRA-IV-I code, we developed a thermal-hydraulic analysis code for hexagonal lattices. Using the core analysis code systems developed in this project, we designed a 600 MWe core and studied the feasibility of the core concepts. Two additional scopes were covered in this project: a study on the operational strategies of a soluble-boron-free core and the conceptual design of a large-scale passive core. By using the axial BP zoning concept and a suitable design of control rods, this project showed that it is possible to design a soluble-boron-free core in a 600 MWe PWR. The results of the large-scale core design showed that passive concepts and daily load-follow operation could be realized. (author). 15 refs., 52 tabs., 101 figs

  3. Conceptual study of advanced PWR core design

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Chang, Moon Hee; Kim, Keung Ku; Joo, Hyung Kuk; Kim, Young Il; Noh, Jae Man; Hwang, Dae Hyun; Kim, Taek Kyum; Yoo, Yon Jong

    1997-09-01

The purpose of this project is to develop and verify core design concepts with enhanced safety and economy, and the associated methodologies for core analyses. From a study of the state-of-the-art of foreign advanced reactor cores, we developed core concepts such as a soluble-boron-free, highly convertible and enhanced-safety core loaded with semi-tight-lattice hexagonal fuel assemblies. To analyze this hexagonal core, we developed and verified neutronic and T/H analysis methodologies. The HELIOS code was adopted as the assembly code, and the HEXFEM code was developed for hexagonal core analysis. Based on experimental data in hexagonal lattices and the COBRA-IV-I code, we developed a thermal-hydraulic analysis code for hexagonal lattices. Using the core analysis code systems developed in this project, we designed a 600 MWe core and studied the feasibility of the core concepts. Two additional scopes were covered in this project: a study on the operational strategies of a soluble-boron-free core and the conceptual design of a large-scale passive core. By using the axial BP zoning concept and a suitable design of control rods, this project showed that it is possible to design a soluble-boron-free core in a 600 MWe PWR. The results of the large-scale core design showed that passive concepts and daily load-follow operation could be realized. (author). 15 refs., 52 tabs., 101 figs.

  4. Survey of computer codes applicable to waste facility performance evaluations

    International Nuclear Information System (INIS)

    Alsharif, M.; Pung, D.L.; Rivera, A.L.; Dole, L.R.

    1988-01-01

This study reviews existing information useful for developing an integrated model to predict the performance of a radioactive waste facility. A summary description of 162 computer codes is given. The identified computer programs address the performance of waste packages, waste transport and equilibrium geochemistry, hydrological processes in unsaturated and saturated zones, and general waste facility performance assessment. Some programs also deal with thermal analysis, structural analysis, and special purposes. A number of these computer programs are being used by the US Department of Energy, the US Nuclear Regulatory Commission, and their contractors to analyze various aspects of waste package performance. Fifty-five of these codes were identified as potentially useful in the analysis of low-level radioactive waste facilities located above the water table. The code summaries include authors, identification data, model types, and pertinent references. 14 refs., 5 tabs

  5. The new Italian code of medical ethics.

    Science.gov (United States)

    Fineschi, V; Turillazzi, E; Cateni, C

    1997-01-01

In June 1995, the Italian code of medical ethics was revised in order that its principles should reflect the ever-changing relationship between the medical profession and society and between physicians and patients. The updated code is also a response to new ethical problems created by scientific progress; the discussion of such problems often reveals a need for better understanding on the part of the medical profession itself. Medical deontology is defined as the discipline for the study of norms of conduct for the health care professions, including moral and legal norms as well as those pertaining more strictly to professional performance. The aim of deontology is, therefore, the in-depth investigation and revision of the code of medical ethics. It is in the light of this conceptual definition that one should interpret a review of the different codes which have attempted, throughout the various periods of Italy's recent history, to adapt ethical norms to particular social and health care climates. PMID:9279746

  6. Development of best estimate auditing code for CANDU thermal-hydraulic safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Won Jae; Hwang, Moon Kyu; Lim, Hong Sik [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-04-01

The main purpose of this study is to develop a thermal-hydraulic auditing code for the CANDU reactor by modifying the models of the existing PWR auditing tool, i.e. RELAP5/MOD3. The study was performed by reconsidering the previous code assessment works and identifying the phenomena for essential accident scenarios. Improvement areas for the auditing tool were identified based on the code comparison and PIRT results. Nine models have been improved significantly for the analysis of LOCA and non-LOCA events. Conceptual-problem and separate-effect assessments have been performed to verify the model improvements. The linking calculation with CONTAIN 2.0 has also been enabled to establish a unified auditing code system. Analyses of a CANDU plant real transient and a hypothetical LOCA have been performed using the improved version. It has been concluded that the developed version can be utilized for the auditing analysis of LOCA and non-LOCA events for the CANDU reactor. 25 refs., 84 figs., 36 tabs. (Author)

  7. High performance computer code for molecular dynamics simulations

    International Nuclear Information System (INIS)

    Levay, I.; Toekesi, K.

    2007-01-01

Complete text of publication follows. Molecular dynamics (MD) simulation is a widely used technique for modeling complicated physical phenomena. Since 2005 we have been developing an MD simulation code for PC computers. The computer code is written in the C++ object-oriented programming language. The aim of our work is twofold: a) to develop a fast computer code for the study of the random walk of guest atoms in a Be crystal, and b) three-dimensional (3D) visualization of the particles' motion. In this case we mimic the motion of the guest atoms in the crystal (diffusion-type motion) and the motion of atoms in the crystal lattice (crystal deformation). Nowadays it is common to use graphics devices for intensive computational problems. There are several ways to use this extreme processing performance, but never before has it been so easy to program these devices. The CUDA (Compute Unified Device Architecture) introduced by the nVidia Corporation in 2007 is very useful for every processor-hungry application. A unified-architecture GPU includes 96-128 or more stream processors, so the raw calculation performance is 576(!) GFLOPS, ten times faster than the fastest dual-core CPU [Fig. 1]. Our improved MD simulation software uses this new technology, which speeds up the software so that it runs 10 times faster in the critical calculation code segment. Although the GPU is a very powerful tool, it has a strongly parallel structure. This means that we have to create an algorithm that works on several processors without deadlock. Our code currently uses 256 threads and shared and constant on-chip memory instead of global memory, which is about 100 times slower. It is possible to implement the total algorithm on the GPU, so we do not need to download and upload the data in every iteration. For maximal throughput, every thread runs the same instructions
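The time-stepping kernel at the heart of an MD code of this kind is typically the velocity-Verlet integrator. A minimal single-particle sketch in Python (illustrative only — the record's code is C++/CUDA, and the harmonic force here is a stand-in for real interatomic potentials) shows the scheme and its good energy conservation:

```python
def velocity_verlet(x, v, force, mass, dt, steps):
    """Velocity-Verlet time stepping, the usual core loop of an MD code."""
    a = force(x) / mass
    traj = [(x, v)]
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt   # position update
        a_new = force(x) / mass              # force at the new position
        v = v + 0.5 * (a + a_new) * dt       # velocity update
        a = a_new
        traj.append((x, v))
    return traj

# Toy stand-in for a guest atom near a lattice site: harmonic force F = -k*x
k, m = 1.0, 1.0
traj = velocity_verlet(1.0, 0.0, lambda x: -k * x, m, dt=0.01, steps=1000)
```

Being symplectic, the integrator keeps the total energy `0.5*m*v**2 + 0.5*k*x**2` close to its initial value of 0.5 over the whole run, which is why this scheme is the standard choice in MD codes.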

  8. CONCEPTUAL APPROACH OF LEADERSHIP AND EMPLOYEES’ INDIVIDUAL PERFORMANCE RELATIONSHIP

    Directory of Open Access Journals (Sweden)

    Ispas Andreia

    2012-07-01

Full Text Available There is a need for more leadership research in the hotel industry. One reason is to better understand the effects and influences that the manager-employee relationship can have. This can in turn improve employees' performance at work, leaving clients more satisfied. Due to the emergent stage of this field of research, the leadership and employee performance relationship has a lot of potential for being investigated in the context of the hotel industry. Many types of leadership styles are recommended for hotel managers' daily activities; the most frequently suggested is transformational leadership. The hotel industry is part of the hospitality industry and provides different services such as accommodation, food and beverage, and entertainment. This makes the hotel industry very specific, and hotel products and services unique for each client. That is why high standards of organizational performance, a good manager-employee relationship and a high employee performance as perceived by clients are the keys to a long and sustainable business. Leadership is important in getting people to do the right things in their jobs. Moreover, it helps employees work better, be motivated, strive for excellence and perform beyond expectations. The first objective of this paper is to analyze the main approaches to leadership styles applied in the context of hotel industry studies. The second objective is to explore, from a theoretical point of view, the direct and indirect relationship between leadership style and employee performance based on two mediating factors: job satisfaction and organizational commitment. These two mediators are the most used in organizational research and have a high degree of validity. The methodology applied is based on a documentary study review and critical analysis of the concepts used in the literature. The result of this conceptual paper is the construction of a conceptual model for scientific research

  9. FEMAXI-III, a computer code for fuel rod performance analysis

    International Nuclear Information System (INIS)

    Ito, K.; Iwano, Y.; Ichikawa, M.; Okubo, T.

    1983-01-01

This paper presents the method of fuel rod thermal-mechanical performance analysis used in the FEMAXI-III code. The code incorporates models describing thermal-mechanical processes such as pellet-cladding thermal expansion, pellet irradiation swelling, densification, relocation and fission gas release as they affect the pellet-cladding gap thermal conductance. The code performs the thermal behavior analysis of a full-length fuel rod within the framework of one-dimensional multi-zone modeling. The mechanical effects, including ridge deformation, are rigorously analyzed by applying the axisymmetric finite element method. The finite element geometrical model is confined to a half-pellet-height region under the assumption that pellet-pellet interaction is symmetrical. Eight-node quadratic isoparametric ring elements are adopted to obtain accurate finite element solutions. Newton-Raphson iteration with an implicit algorithm is applied to analyze non-linear material behaviors accurately and stably. The pellet-cladding interaction mechanism is treated exactly using nodal continuity conditions. The code is applicable to the thermal-mechanical analysis of water reactor fuel rods experiencing variable power histories. (orig.)

  10. SCANAIR: A transient fuel performance code

    International Nuclear Information System (INIS)

    Moal, Alain; Georgenthum, Vincent; Marchand, Olivier

    2014-01-01

Highlights: • Since the early 1990s, the SCANAIR code has been developed at IRSN. • The software focuses on studying fast transients such as RIA in light water reactors. • The fuel rod modelling is based on a 1.5D approach. • The thermal and thermal-hydraulics, mechanical and gas behaviour resolutions are coupled. • The code is used for safety assessment and integral test analysis. - Abstract: Since the early 1990s, the French “Institut de Radioprotection et de Sûreté Nucléaire” (IRSN) has developed the SCANAIR computer code with a view to analysing pressurised water reactor (PWR) safety. This software specifically focuses on studying fast transients such as reactivity-initiated accidents (RIA) caused by the possible ejection of control rods. The code aims at improving the global understanding of the physical mechanisms governing the thermal-mechanical behaviour of a single rod. It is currently used to analyse integral tests performed in the CABRI and NSRR experimental reactors. The resulting validated code is used to carry out studies required to evaluate margins in relation to criteria for different types of fuel rods used in nuclear power plants. Because the phenomena occurring during fast power transients are complex, the simulation in SCANAIR is based on a close coupling between several modules aimed at modelling thermal, thermal-hydraulics, mechanical and gas behaviour. During the first stage of fast power transients, clad deformation is mainly governed by the pellet–clad mechanical interaction (PCMI). At a later stage, heat transfer from pellet to clad brings the cladding material to such high temperatures that the boiling crisis might occur. The significant over-pressurisation of the rod, together with maintaining the cladding material at elevated temperatures for a fairly long period, can lead to ballooning and possible clad failure. A brief introduction describes the context and the historical background and recalls the main phenomena involved under

  11. SCANAIR: A transient fuel performance code

    Energy Technology Data Exchange (ETDEWEB)

    Moal, Alain, E-mail: alain.moal@irsn.fr; Georgenthum, Vincent; Marchand, Olivier

    2014-12-15

Highlights: • Since the early 1990s, the SCANAIR code has been developed at IRSN. • The software focuses on studying fast transients such as RIA in light water reactors. • The fuel rod modelling is based on a 1.5D approach. • The thermal and thermal-hydraulics, mechanical and gas behaviour resolutions are coupled. • The code is used for safety assessment and integral test analysis. - Abstract: Since the early 1990s, the French “Institut de Radioprotection et de Sûreté Nucléaire” (IRSN) has developed the SCANAIR computer code with a view to analysing pressurised water reactor (PWR) safety. This software specifically focuses on studying fast transients such as reactivity-initiated accidents (RIA) caused by the possible ejection of control rods. The code aims at improving the global understanding of the physical mechanisms governing the thermal-mechanical behaviour of a single rod. It is currently used to analyse integral tests performed in the CABRI and NSRR experimental reactors. The resulting validated code is used to carry out studies required to evaluate margins in relation to criteria for different types of fuel rods used in nuclear power plants. Because the phenomena occurring during fast power transients are complex, the simulation in SCANAIR is based on a close coupling between several modules aimed at modelling thermal, thermal-hydraulics, mechanical and gas behaviour. During the first stage of fast power transients, clad deformation is mainly governed by the pellet–clad mechanical interaction (PCMI). At a later stage, heat transfer from pellet to clad brings the cladding material to such high temperatures that the boiling crisis might occur. The significant over-pressurisation of the rod, together with maintaining the cladding material at elevated temperatures for a fairly long period, can lead to ballooning and possible clad failure. A brief introduction describes the context and the historical background and recalls the main phenomena involved under

  12. High performance mixed optical CDMA system using ZCC code and multiband OFDM

    Directory of Open Access Journals (Sweden)

    Nawawi N. M.

    2017-01-01

Full Text Available In this paper, we propose a high-performance network design based on a mixed optical Code Division Multiple Access (CDMA) system using a Zero Cross Correlation (ZCC) code and multiband Orthogonal Frequency Division Multiplexing (OFDM), called catenated OFDM. In addition, we investigate the relevant parameters such as effective power, number of users, number of bands, code length and code weight. We then theoretically analyze the system performance comprehensively, considering up to five OFDM bands. The feasibility of the proposed system architecture is verified via numerical analysis. The results demonstrate that the developed modulation solution can significantly increase the total number of users, by up to 80% for five catenated bands compared to a traditional optical CDMA system, with a code length of 80, transmitted at 622 Mbps. It is also demonstrated that the BER performance depends strongly on the code weight, especially with fewer users: as the code weight increases, the BER performance improves.

  13. High performance mixed optical CDMA system using ZCC code and multiband OFDM

    Science.gov (United States)

    Nawawi, N. M.; Anuar, M. S.; Junita, M. N.; Rashidi, C. B. M.

    2017-11-01

In this paper, we propose a high-performance network design based on a mixed optical Code Division Multiple Access (CDMA) system using a Zero Cross Correlation (ZCC) code and multiband Orthogonal Frequency Division Multiplexing (OFDM), called catenated OFDM. In addition, we investigate the relevant parameters such as effective power, number of users, number of bands, code length and code weight. We then theoretically analyze the system performance comprehensively, considering up to five OFDM bands. The feasibility of the proposed system architecture is verified via numerical analysis. The results demonstrate that the developed modulation solution can significantly increase the total number of users, by up to 80% for five catenated bands compared to a traditional optical CDMA system, with a code length of 80, transmitted at 622 Mbps. It is also demonstrated that the BER performance depends strongly on the code weight, especially with fewer users: as the code weight increases, the BER performance improves.

  14. Conceptual Design of Object Oriented Program (OOP) for Pilot Code of Two-Fluid, Three-field Model with C++ 6.0

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Lee, Young Jin

    2006-01-01

Engineering software for design purposes in the nuclear industry has been developed since the early 1970s and was well established by the 1980s. The most popular and common language for such software development was the FORTRAN series, until more sophisticated GUIs and software coupling were needed. Advanced computer languages such as C and C++ were developed to ease GUI programming and the reuse of well-developed routines by adopting object-oriented programming. A recent trend in programming is toward the object-oriented style, since the results are often more intuitive and easier to maintain than procedural programs. The main motivation of this work is to capture object-oriented concepts for conventional safety analysis programs, which consist of many functions and procedure-oriented structures. In this work, object-oriented programming in the C++ 6.0 language has been tried for the PILOT code, written in FORTRAN, and a conceptual OOP design of the system safety analysis code has been done
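As a minimal illustration of the object-oriented restructuring this record describes (hypothetical names and placeholder physics — not the PILOT code itself), state that a procedure-oriented FORTRAN program would keep in global arrays can instead be encapsulated together with the routines that act on it:

```python
class ThermalModule:
    """Hypothetical sketch of the OOP restructuring idea: the state that a
    procedure-oriented code would keep in global arrays is held by the
    object, next to the routines that operate on it."""

    def __init__(self, n_nodes, t_init=300.0):
        self.temps = [t_init] * n_nodes   # nodal temperatures (K)

    def advance(self, dt, heat_rate):
        # placeholder physics: uniform heat-up, not a real conduction solver
        self.temps = [t + heat_rate * dt for t in self.temps]
        return self.temps

core = ThermalModule(n_nodes=4)
core.advance(dt=1.0, heat_rate=2.5)   # every node rises by 2.5 K
```

The design gain is the one the abstract names: the object bundles state with behaviour, so modules can be reused and coupled (e.g. a GUI or another solver holding a `ThermalModule` instance) without the shared global data of a procedural layout.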

  15. New Conceptual Design Tools

    DEFF Research Database (Denmark)

    Pugnale, Alberto; Holst, Malene Kirstine; Kirkegaard, Poul Henning

    2010-01-01

This paper aims to discuss recent approaches in using computer tools more and more frequently as supports for the conceptual design phase of the architectural project. The present state-of-the-art of software as a conceptual design tool can be summarized in two parallel tendencies. On the one hand, the main software houses are trying to introduce powerful and effective user-friendly applications into the world of building designers, which are more and more able to fit their specific requirements; on the other hand, some groups of expert users with a basic programming knowledge seem to deal with the problem of software as a conceptual design tool by means of 'scripting', in other words by self-developing codes able to solve specific and well-defined design problems. Starting with a brief historical recall and the discussion of relevant research and practical experiences, this paper investigates

  16. Conceptualizing performance of nursing care as a prerequisite for better measurement: a systematic and interpretive review

    Science.gov (United States)

    2013-01-01

    Background Despite the critical role of nursing care in determining high-performing healthcare delivery, performance science in this area is still at an early stage of development and nursing’s contribution most often remains invisible to policy-makers and managers. The objectives of this study were: 1) to develop a theoretically based framework to conceptualize nursing care performance; 2) to analyze how the different components of the framework have been operationalized in the literature; and 3) to develop a pool of indicators sensitive to various aspects of nursing care that can be used as a basis for designing a performance measurement system. Methods We carried out a systematic review of published literature across three databases (MEDLINE, EMBASE and CINAHL), focusing on literature between 1990 and 2008. Screening of 2,103 papers resulted in final selection of 101 papers. A detailed template was used to extract the data. For the analysis, we used the method of interpretive synthesis, focusing first on 31 papers with theoretical or conceptual frameworks; the remaining 70 articles were used to strengthen and consolidate the findings. Results Current conceptualizations of nursing care performance mostly reflect a system perspective that builds on system theory, Donabedian’s earlier works on healthcare organization, and Parsons’ theory of social action. Drawing on these foundational works and the evidence collated, the Nursing Care Performance Framework (NCPF) we developed conceptualizes nursing care performance as resulting from three nursing subsystems that operate together to achieve three key functions: (1) acquiring, deploying and maintaining nursing resources, (2) transforming nursing resources into nursing services, and (3) producing changes in patients’ conditions. Based on the literature review, these three functions are operationalized through 14 dimensions that cover 51 variables. The NCPF not only specifies core aspects of nursing performance

  17. DIANA Code: Design and implementation of an analytic core calculus code by two group, two zone diffusion

    International Nuclear Information System (INIS)

    Mochi, Ignacio

    2005-01-01

The principal parameters of nuclear reactors are determined in the conceptual design stage. For that purpose, it is necessary to have flexible calculation tools that represent the principal dependencies of such parameters. This capability is of critical importance in the design of innovative nuclear reactors. In order to have a proper tool to assist the conceptual design of innovative nuclear reactors, we developed and implemented a neutronic core calculation code: DIANA (Diffusion Integral Analytic Neutron Analysis). To calculate the required parameters, this code generates its own cross sections using an analytic two-group, two-zone diffusion scheme based only on a minimal set of data (i.e. 2200 m/s and fission-averaged microscopic cross sections, Westcott factors and effective resonance integrals). Both to calculate cross sections and core parameters, DIANA takes into account heterogeneity effects, which are included when it evaluates each zone; among them is the disadvantage factor of each energy group. DIANA was implemented entirely through object-oriented programming in the C++ language. This eases understanding of the source code and allows a quick expansion of its capabilities if needed. The final product is a versatile and easy-to-use code that allows core calculations with a minimal amount of data. It also contains the tools needed to perform variational calculations, such as the parameterisation of effective multiplication factors for different core radii. The diffusion scheme's simplicity allows easy following of the phenomena involved, making DIANA a suitable tool for designing reactors whose physics lies beyond the parameters of present reactors. All these reasons make DIANA a good candidate for future innovative reactor analysis
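For a bare homogeneous core, the analytic two-group diffusion scheme underlying a code of this kind reduces to a closed-form effective multiplication factor. A sketch with illustrative cross-section values (the numbers are made up for the example, not DIANA's data):

```python
def k_eff_two_group(D1, Sa1, S12, nSf1, D2, Sa2, nSf2, B2):
    """Effective multiplication factor of a bare homogeneous core in
    two-group diffusion theory (no up-scatter).

    Group 1 removal: L1 = D1*B2 + Sa1 + S12
    Group 2 removal: L2 = D2*B2 + Sa2, fed by the slowing-down source S12*phi1
    k_eff = nSf1/L1 + nSf2*S12/(L1*L2)
    """
    L1 = D1 * B2 + Sa1 + S12
    L2 = D2 * B2 + Sa2
    return nSf1 / L1 + nSf2 * S12 / (L1 * L2)

# Illustrative (made-up) macroscopic cross sections in cm^-1, D in cm
k_inf = k_eff_two_group(1.4, 0.01, 0.02, 0.005, 0.4, 0.1, 0.15, B2=0.0)
```

Setting `B2 = 0` gives the infinite-medium factor; increasing the buckling (i.e. shrinking the core) lowers `k_eff`, which is exactly the kind of multiplication-factor-versus-core-radius parameterisation the abstract mentions.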

  18. Performance Analysis of Faulty Gallager-B Decoding of QC-LDPC Codes with Applications

    Directory of Open Access Journals (Sweden)

    O. Al Rasheed

    2014-06-01

Full Text Available In this paper we evaluate the performance of the Gallager-B algorithm, used for decoding low-density parity-check (LDPC) codes, under unreliable message computation. Our analysis is restricted to LDPC codes constructed from circulant matrices (QC-LDPC codes). Using Monte Carlo simulation we investigate the effects of different code parameters on coding system performance, under a binary symmetric communication channel and an independent transient-faults model. One possible application of the presented analysis to designing memory architectures with unreliable components is considered.
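A minimal hard-decision bit-flipping decoder, a simpler relative of the Gallager-B algorithm analysed in the record, illustrates the decoding principle; the small (7,4) Hamming parity-check matrix below is only a stand-in for a QC-LDPC matrix:

```python
def bit_flip_decode(H, r, max_iter=20):
    """Hard-decision bit-flipping decoding (a simpler relative of
    Gallager-B): repeatedly flip the bit involved in the largest number
    of unsatisfied parity checks."""
    m, n = len(H), len(r)
    x = list(r)
    for _ in range(max_iter):
        syndrome = [sum(H[i][j] * x[j] for j in range(n)) % 2
                    for i in range(m)]
        if not any(syndrome):
            return x                                  # valid codeword
        fails = [sum(H[i][j] * syndrome[i] for i in range(m))
                 for j in range(n)]
        x[fails.index(max(fails))] ^= 1               # flip worst bit
    return x

# Small (7,4) Hamming parity-check matrix as a stand-in for QC-LDPC
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
received = [0, 0, 1, 0, 0, 0, 0]   # all-zero codeword with bit 2 flipped
```

Decoding `received` recovers the all-zero codeword. The faulty message computation studied in the record would correspond to randomly perturbing the `syndrome` or `fails` values inside the loop.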

  19. Development and validation of a fuel performance analysis code

    International Nuclear Information System (INIS)

    Majalee, Aaditya V.; Chaturvedi, S.

    2015-01-01

CAD has been developing a computer code, 'FRAVIZ', for the calculation of the steady-state thermomechanical behaviour of nuclear reactor fuel rods. It contains four major modules: a thermal module, a fission gas release module, a material properties module and a mechanical module. The four modules are coupled to each other, with feedback from each module fed back to the others to obtain a self-consistent evolution in time. The code has been checked against two FUMEX benchmarks. Modelling fuel performance in the Advanced Heavy Water Reactor would require additional inputs related to the fuel and some modifications to the code. (author)

  20. Development of 1D Liner Compression Code for IDL

    Science.gov (United States)

    Shimazu, Akihisa; Slough, John; Pancotti, Anthony

    2015-11-01

A 1D liner compression code is developed to model liner implosion dynamics in the Inductively Driven Liner Experiment (IDL), where an FRC plasmoid is compressed via inductively driven metal liners. The driver circuit, magnetic field, joule heating, and liner dynamics calculations are performed in sequence at each time step to couple these effects in the code. To obtain more realistic magnetic field results for a given drive coil geometry, 2D and 3D effects are incorporated into the 1D field calculation through a correction-factor table-lookup approach. The commercial low-frequency electromagnetic field solver ANSYS Maxwell 3D is used to solve the magnetic field profile for the static liner condition at various liner radii in order to derive correction factors for the 1D field calculation in the code. The liner dynamics results from the code are verified to be in good agreement with the results from the commercial explicit dynamics solver ANSYS Explicit Dynamics and with a previous liner experiment. The developed code is used to optimize the capacitor bank and driver coil design for better energy transfer and coupling. FRC gain calculations are also performed using the liner compression data from the code for the conceptual design of a reactor-sized system for fusion energy gains.

  1. Performance of Product Codes and Related Structures with Iterated Decoding

    DEFF Research Database (Denmark)

    Justesen, Jørn

    2011-01-01

    Several modifications of product codes have been suggested as standards for optical networks. We show that the performance exhibits a threshold that can be estimated from a result about random graphs. For moderate input bit error probabilities, the output error rates for codes of finite length can...

  2. PAPIRUS - a computer code for FBR fuel performance analysis

    International Nuclear Information System (INIS)

    Kobayashi, Y.; Tsuboi, Y.; Sogame, M.

    1991-01-01

    The FBR fuel performance analysis code PAPIRUS has been developed to design fuels for demonstration and future commercial reactors. A pellet structural model was developed to describe the generation, depletion and transport of vacancies and atomic elements in a unified fashion. Comparison of PAPIRUS results with power-to-melt test data from HEDL demonstrated the validity of the code for initial reactor startup. (author)

  3. Inter Organizational Relationships Performance in Third Party Logistics: conceptual framework and case study

    NARCIS (Netherlands)

    Aziz, Romana; Aziz, R.; van Hillegersberg, Jos; Kumar, Kuldeep; Kersten, W.; Blecker, T.; Luthje, C.

    2010-01-01

    Supplier relationship management is an important challenge for shippers in logistics outsourcing. This paper attempts to understand the factors which affect inter organizational relationships performance in third party logistics and proposes a conceptual framework specifically for inter

  4. University Library Strategy Development: A Conceptual Model of Researcher Performance to Inform Service Delivery

    Science.gov (United States)

    Maddox, Alexia; Zhao, Linlin

    2017-01-01

    This case study presents a conceptual model of researcher performance developed by Deakin University Library, Australia. The model aims to organize research performance data into meaningful researcher profiles, referred to as researcher typologies, which support the demonstration of research impact and value. Three dimensions shaping researcher…

  5. The Effect of Contextual and Conceptual Rewording on Mathematical Problem-Solving Performance

    Science.gov (United States)

    Haghverdi, Majid; Wiest, Lynda R.

    2016-01-01

    This study shows how separate and combined contextual and conceptual problem rewording can positively influence student performance in solving mathematical word problems. Participants included 80 seventh-grade Iranian students randomly assigned in groups of 20 to three experimental groups involving three types of rewording and a control group. All…

  6. Verification of the CONPAS (CONtainment Performance Analysis System) code package

    International Nuclear Information System (INIS)

    Kim, See Darl; Ahn, Kwang Il; Song, Yong Man; Choi, Young; Park, Soo Yong; Kim, Dong Ha; Jin, Young Ho.

    1997-09-01

    CONPAS is a computer code package that automatically integrates the numerical, graphical, and results-oriented aspects of Level 2 probabilistic safety assessment (PSA) for nuclear power plants under a PC Windows environment. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor, (2) Computer, (3) Text Editor, and (4) Mechanistic Code Plotter. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty, importance, and sensitivity analyses and data interpretation; reporting aspects including tabulation and graphics; and a user-friendly interface. The computational performance of CONPAS has been verified through a Level 2 PSA for a reference plant. The results of the CONPAS code were compared with those of an existing Level 2 PSA code (NUCAP+), and the comparison shows that CONPAS is appropriate for Level 2 PSA. (author). 9 refs., 8 tabs., 14 figs

  7. A conceptual framework for evaluation of public health and primary care system performance in iran.

    Science.gov (United States)

    Jahanmehr, Nader; Rashidian, Arash; Khosravi, Ardeshir; Farzadfar, Farshad; Shariati, Mohammad; Majdzadeh, Reza; Akbari Sari, Ali; Mesdaghinia, Alireza

    2015-01-26

    The main objective of this study was to design a conceptual framework, according to the policies and priorities of the ministry of health, to evaluate provincial public health and primary care performance and to assess their share in the overall health impacts on the community. We used several tools and techniques, including system thinking, a literature review to identify relevant attributes of health system performance frameworks, and interviews with the key stakeholders. PubMed, Scopus, Web of Science, Google Scholar and two specialized databases of Persian-language literature (IranMedex and SID) were searched using main terms and keywords. Following decision-making and collective agreement among the different stakeholders, 51 core indicators were chosen from among 602 obtained indicators in a four-stage process for monitoring and evaluation of the Health Deputies. We proposed a conceptual framework by identifying the performance area of the Health Deputies among other determinants of health, as well as introducing a chain of results for performance consisting of input, process, output and outcome indicators. We also proposed five dimensions for measuring the performance of the Health Deputies: efficiency, effectiveness, equity, access and improvement of health status. The proposed conceptual framework clearly illustrates the Health Deputies' success in achieving the best health results and outcomes in the country. The commitment of the ministry of health and the Health Deputies at the universities of medical sciences is essential for full implementation of this framework and provision of the annual performance report.

  8. Heterogeneous fuels for minor actinides transmutation: Fuel performance codes predictions in the EFIT case study

    Energy Technology Data Exchange (ETDEWEB)

    Calabrese, R., E-mail: rolando.calabrese@enea.i [ENEA, Innovative Nuclear Reactors and Fuel Cycle Closure Division, via Martiri di Monte Sole 4, 40129 Bologna (Italy); Vettraino, F.; Artioli, C. [ENEA, Innovative Nuclear Reactors and Fuel Cycle Closure Division, via Martiri di Monte Sole 4, 40129 Bologna (Italy); Sobolev, V. [SCK.CEN, Belgian Nuclear Research Centre, Boeretang 200, B-2400 Mol (Belgium); Thetford, R. [Serco Technical and Assurance Services, 150 Harwell Business Centre, Didcot OX11 0QB (United Kingdom)

    2010-06-15

    Plutonium recycling in new-generation fast reactors, coupled with minor actinide (MA) transmutation in dedicated nuclear systems, could decrease the long-term radiotoxicity of nuclear waste by two orders of magnitude in comparison with the current once-through strategy. In a double-strata scenario, purpose-built accelerator-driven systems (ADS) could transmute minor actinides. The innovative nuclear fuel conceived for such systems demands significant R and D efforts in order to meet the safety and technical performance of current fuel systems. The Integrated Project EUROTRANS (EUROpean research programme for the TRANSmutation of high level nuclear waste in ADS), part of the EURATOM Framework Programme 6 (FP6), undertook some of this research. EUROTRANS developed from the FP5 research programmes on ADS (PDS-XADS) and on fuels dedicated to MA transmutation (FUTURE, CONFIRM). One of its main objectives is the conceptual design of a small sub-critical nuclear system loaded with uranium-free fuel to provide high MA transmutation efficiency. These principles guided the design of EFIT (European Facility for Industrial Transmutation) in the domain DESIGN of IP EUROTRANS. The domain AFTRA (Advanced Fuels for TRAnsmutation system) identified two composite fuel systems: a ceramic-ceramic (CERCER), in which fuel particles are dispersed in a magnesia matrix, and a ceramic-metallic (CERMET), with a molybdenum matrix in place of the MgO matrix to host the ceramic fissile phase. The EFIT fuel is composed of plutonium and MA oxides in solid solution, with isotopic vectors typical of LWR spent fuel at 45 MWd/kg{sub HM} discharge burnup and 30 years of interim storage before reprocessing. This paper focuses on the thermomechanical state of the hottest fuel pins of two EFIT cores of 400 MW{sub (th)} loaded with either CERCER or CERMET fuels. Three fuel performance codes were used for the calculations: FEMALE, TRAFIC and TRANSURANUS. The analysis was performed at the beginning of fuel life

  9. Neutron transport-burnup code MCORGS and its application in fusion fission hybrid blanket conceptual research

    Science.gov (United States)

    Shi, Xue-Ming; Peng, Xian-Jue

    2016-09-01

    Fusion science and technology has made progress in recent decades. However, commercialization of fusion reactors still faces challenges relating to higher fusion energy gain, irradiation-resistant materials, and tritium self-sufficiency. Fusion Fission Hybrid Reactors (FFHR) can be introduced to accelerate the early application of fusion energy. Traditionally, FFHRs have been classified as either breeders or transmuters. Both need partition of plutonium from spent fuel, which poses nuclear proliferation risks. A conceptual design of a Fusion Fission Hybrid Reactor for Energy (FFHR-E), which can make full use of natural uranium with lower nuclear proliferation risk, is presented. The fusion core parameters are similar to those of the International Thermonuclear Experimental Reactor. An alloy of natural uranium and zirconium is adopted in the fission blanket, which is cooled by light water. In order to model blanket burnup problems, a linkage code, MCORGS, which couples MCNP4B and ORIGEN-S, has been developed and validated against several typical benchmarks. The average blanket energy multiplication and tritium breeding ratio can be maintained at 10 and 1.15 respectively over tens of years of continuous irradiation. If simple reprocessing without separation of plutonium from uranium is adopted every few years, FFHR-E can achieve better neutronic performance. MCORGS has also been used to analyze the ultra-deep burnup model of Laser Inertial Confinement Fusion Fission Energy (LIFE) from LLNL, and a new blanket design that uses Pb instead of Be as the neutron multiplier is proposed. In addition, MCORGS has been used to simulate the fluid transmuter model of the In-Zinerator from Sandia. A brief comparison of LIFE, the In-Zinerator, and FFHR-E is given.
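    The transport-burnup coupling that a linkage code such as MCORGS performs (a transport code supplies the flux, a depletion code updates the nuclide inventory, and the two alternate over the irradiation history) can be illustrated with a deliberately simplified sketch. The two-nuclide chain, the cross-sections, the constant flux, and the `transport`/`deplete` helpers are all illustrative assumptions, not MCNP4B/ORIGEN-S models:

    ```python
    def transport(n, source=1e14):
        """Mock one-group 'transport' solve (MCNP4B's role in MCORGS): the flux
        is simply held constant here; a real code recomputes it from geometry
        and the current nuclide densities n."""
        return source                                   # neutrons / cm^2 / s

    def deplete(n, phi, dt, sigma_c=2.7e-24, sigma_f=750e-24):
        """One depletion step for a lumped U238 -> Pu239 chain (ORIGEN-S's role).
        Cross-sections in cm^2 are illustrative round numbers."""
        u, pu = n
        du = -sigma_c * phi * u * dt                    # U238 capture (breeding)
        dpu = (sigma_c * phi * u - sigma_f * phi * pu) * dt  # Pu239 builds up, fissions
        return (u + du, pu + dpu)

    n = (1.0, 0.0)                                      # normalized nuclide densities
    for _ in range(365):                                # one year in daily steps
        phi = transport(n)
        n = deplete(n, phi, dt=86400.0)
    ```

    Alternating the two solvers at a coarse time step is what lets the flux spectrum follow the changing blanket composition over decades of irradiation.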

  10. Stereotype threat? Effects of inquiring about test takers' gender on conceptual test performance in physics

    Science.gov (United States)

    Maries, Alexandru; Singh, Chandralekha

    2015-12-01

    It has been found that activation of a stereotype, for example by indicating one's gender before a test, typically alters performance in a way consistent with the stereotype, an effect called "stereotype threat." On a standardized conceptual physics assessment, we found that asking test takers to indicate their gender right before taking the test did not deteriorate performance compared to an equivalent group who did not provide gender information. Although a statistically significant gender gap was present on the standardized test whether or not students indicated their gender, no gender gap was observed on the multiple-choice final exam students took, which included both quantitative and conceptual questions on similar topics.

  11. Application of advanced validation concepts to oxide fuel performance codes: LIFE-4 fast-reactor and FRAPCON thermal-reactor fuel performance codes

    Energy Technology Data Exchange (ETDEWEB)

    Unal, C., E-mail: cu@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Williams, B.J. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Yacout, A. [Argonne National Laboratory, 9700 S. Cass Avenue, Lemont, IL 60439 (United States); Higdon, D.M. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States)

    2013-10-15

    Highlights: ► The application of advanced validation techniques (sensitivity, calibration and prediction) to nuclear performance codes FRAPCON and LIFE-4 is the focus of the paper. ► A sensitivity ranking methodology narrows down the number of selected modeling parameters from 61 to 24 for FRAPCON and from 69 to 35 for LIFE-4. ► Fuel creep, fuel thermal conductivity, fission gas transport/release, crack/boundary, and fuel gap conductivity models of LIFE-4 are identified for improvements. ► FRAPCON sensitivity results indicated the importance of the fuel thermal conduction and the fission gas release models. -- Abstract: Evolving nuclear energy programs expect to use enhanced modeling and simulation (M and S) capabilities, using multiscale, multiphysics modeling approaches, to reduce both cost and time from the design through the licensing phases. Interest in the development of the multiscale, multiphysics approach has increased in the last decade because of the need for predictive tools for complex interacting processes as a means of eliminating the limited use of empirically based model development. Complex interacting processes cannot be predicted by analyzing each individual component in isolation. In most cases, the mathematical models of complex processes and their boundary conditions are nonlinear. As a result, the solutions of these mathematical models often require high-performance computing capabilities and resources. The use of multiscale, multiphysics (MS/MP) models in conjunction with high-performance computational software and hardware introduces challenges in validating these predictive tools—traditional methodologies will have to be modified to address these challenges. The advanced MS/MP codes for nuclear fuels and reactors are being developed within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program of the US Department of Energy (DOE) – Nuclear Energy (NE). This paper does not directly address challenges in calibration

  12. The development of the fuel rod transient performance analysis code FTPAC

    International Nuclear Information System (INIS)

    Han Zhijie; Ji Songtao

    2014-01-01

    Fuel rod behavior, especially the integrity of the cladding, plays an important role in fuel safety research for reactor transient and hypothetical accident conditions. In order to study fuel rod performance under transient accidents, FTPAC (Fuel Transient Performance Analysis Code) has been developed to simulate light water reactor fuel rod transient behavior when power or coolant boundary conditions change rapidly. It is composed of temperature, mechanical deformation, cladding oxidation and gas pressure models. The code was assessed by comparing FTPAC analysis results with experimental data and FRAPTRAN code calculations. The comparison shows that FTPAC gives reasonable agreement in its temperature, deformation and gas pressure predictions, and that the application of a slip coefficient is more suitable for simulating sliding between pellet and cladding when the gap is closed. (authors)
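    The transient temperature part of a fuel-rod code like FTPAC can be illustrated with a minimal two-node (fuel/cladding) heat balance driven toward a coolant boundary condition. The `rod_transient` helper and every coefficient below are hypothetical round numbers for illustration, not FTPAC models:

    ```python
    def rod_transient(power, t_cool=560.0, dt=0.01, steps=2000,
                      c_fuel=300.0, c_clad=60.0, h_gap=50.0, h_cool=200.0):
        """Two-node fuel/cladding transient temperature sketch.
        power in W per node; heat capacities in J/K; conductances in W/K."""
        t_fuel, t_clad = t_cool, t_cool
        for _ in range(steps):
            q_gap = h_gap * (t_fuel - t_clad)      # pellet -> cladding via the gap
            q_out = h_cool * (t_clad - t_cool)     # cladding -> coolant
            t_fuel += (power - q_gap) / c_fuel * dt
            t_clad += (q_gap - q_out) / c_clad * dt
        return t_fuel, t_clad

    # Step change to full power from an isothermal initial state:
    tf, tc = rod_transient(power=20000.0)
    ```

    A real transient code adds mechanical deformation, oxidation, and gas-pressure feedback on the gap conductance `h_gap` at each step; the explicit time-marching skeleton is the same.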

  13. SURE: a system of computer codes for performing sensitivity/uncertainty analyses with the RELAP code

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1983-02-01

    A package of computer codes has been developed to perform a nonlinear uncertainty analysis on transient thermal-hydraulic systems which are modeled with the RELAP computer code. The uncertainty methodology was developed around the analyses of experiments in the PWR-BDHT Separate Effects Program at Oak Ridge National Laboratory. The use of FORTRAN programs running interactively on the PDP-10 computer has made the system very easy to use and has provided great flexibility in the choice of processing paths. Several experiments simulating a loss-of-coolant accident in a nuclear reactor have been successfully analyzed. It has been shown that the system can easily be automated to further simplify its use, and that conversion of the entire system to a base code other than RELAP is possible
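    The kind of driver a package like SURE automates (perturb uncertain inputs, rerun the system code, collect output statistics) can be sketched generically. Here `run_model` is a hypothetical stand-in response surface, not RELAP, and the two input uncertainties are invented for illustration:

    ```python
    import random

    def run_model(params):
        """Stand-in for a RELAP transient calculation: a peak temperature (K)
        as a simple nonlinear function of two uncertain inputs (hypothetical)."""
        power, htc = params
        return 600.0 + 2.0 * power / htc

    def propagate(n=2000, seed=1):
        """Monte Carlo uncertainty propagation around the model."""
        random.seed(seed)
        samples = []
        for _ in range(n):
            power = random.gauss(100.0, 5.0)   # % nominal power, assumed uncertainty
            htc = random.gauss(1.0, 0.05)      # normalized heat transfer coefficient
            samples.append(run_model((power, htc)))
        mean = sum(samples) / n
        var = sum((s - mean) ** 2 for s in samples) / (n - 1)
        return mean, var ** 0.5

    mean, std = propagate()
    ```

    Replacing `run_model` with a scripted invocation of the system code is the automation step such a package provides; the statistics layer is unchanged.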

  14. FLAME: A finite element computer code for contaminant transport in variably-saturated media

    International Nuclear Information System (INIS)

    Baca, R.G.; Magnuson, S.O.

    1992-06-01

    A numerical model was developed for use in performance assessment studies at the INEL. The numerical model, referred to as the FLAME computer code, is designed to simulate subsurface contaminant transport in variably-saturated media. The code can be applied to model two-dimensional contaminant transport in an arid site vadose zone or in an unconfined aquifer. In addition, the code has the capability to describe transport processes in a porous medium with discrete fractures. This report presents the following: a description of the conceptual framework and mathematical theory; derivations of the finite element techniques and algorithms; computational examples that illustrate the capability of the code; and input instructions for the general use of the code. The development of the FLAME computer code is aimed at providing environmental scientists at the INEL with a predictive tool for the subsurface water pathway. This numerical model is expected to be widely used in performance assessments for: (1) the Remedial Investigation/Feasibility Study process and (2) compliance studies required by US Department of Energy Order 5820.2A
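    FLAME solves the transport equation with finite elements; the underlying advection-dispersion balance, dc/dt = -v dc/dx + D d2c/dx2, can be illustrated with a much simpler explicit finite-difference sketch in 1D. The grid, velocity, and dispersion values below are arbitrary illustrative choices, not FLAME inputs:

    ```python
    import numpy as np

    def transport_1d(nx=100, dx=0.1, v=1e-3, d=1e-4, dt=10.0, steps=500):
        """Explicit upwind finite-difference sketch of 1D advection-dispersion
        with a fixed-concentration inlet (contaminant source) at x = 0."""
        c = np.zeros(nx)
        c[0] = 1.0                                    # source boundary condition
        for _ in range(steps):
            adv = -v * (c[1:-1] - c[:-2]) / dx        # upwind advection
            dif = d * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
            c[1:-1] += dt * (adv + dif)
            c[0] = 1.0                                # re-impose inlet condition
        return c

    c = transport_1d()   # plume front has advected roughly v * t into the domain
    ```

    The explicit scheme is stable here because both the Courant number v·dt/dx and the diffusion number D·dt/dx² are 0.1; a production code uses implicit finite elements to relax those limits.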

  15. FLAME: A finite element computer code for contaminant transport in variably-saturated media

    Energy Technology Data Exchange (ETDEWEB)

    Baca, R.G.; Magnuson, S.O.

    1992-06-01

    A numerical model was developed for use in performance assessment studies at the INEL. The numerical model, referred to as the FLAME computer code, is designed to simulate subsurface contaminant transport in variably-saturated media. The code can be applied to model two-dimensional contaminant transport in an arid site vadose zone or in an unconfined aquifer. In addition, the code has the capability to describe transport processes in a porous medium with discrete fractures. This report presents the following: a description of the conceptual framework and mathematical theory; derivations of the finite element techniques and algorithms; computational examples that illustrate the capability of the code; and input instructions for the general use of the code. The development of the FLAME computer code is aimed at providing environmental scientists at the INEL with a predictive tool for the subsurface water pathway. This numerical model is expected to be widely used in performance assessments for: (1) the Remedial Investigation/Feasibility Study process and (2) compliance studies required by US Department of Energy Order 5820.2A.

  16. Sensitivity Analysis of FEAST-Metal Fuel Performance Code: Initial Results

    International Nuclear Information System (INIS)

    Edelmann, Paul Guy; Williams, Brian J.; Unal, Cetin; Yacout, Abdellatif

    2012-01-01

    This memo documents the completion of the LANL milestone M3FT-12LA0202041, describing methodologies and initial results using FEAST-Metal. The FEAST-Metal code calculations for this work are being conducted at LANL in support of ongoing activities related to sensitivity analysis of fuel performance codes. The objective is to identify important macroscopic parameters of interest to modeling and simulation of metallic fuel performance. This report summarizes our preliminary results for the sensitivity analysis using six calibration datasets for metallic fuel developed at ANL for EBR-II experiments. A sensitivity ranking methodology was deployed to narrow down the selected parameters for the current study. There are approximately 84 calibration parameters in the FEAST-Metal code, of which 32 were ultimately used in Phase II of this study. Preliminary results of this sensitivity analysis led to the following ranking of FEAST models for future calibration and improvement: fuel conductivity, fission gas transport/release, fuel creep, and precipitation kinetics. More validation data are needed to support calibrated parameter distributions for future uncertainty quantification studies with FEAST-Metal. Results of this study also served to point out some code deficiencies and possible errors, and these are being investigated to determine root causes and to improve the existing code models.
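    A sensitivity-ranking step of this kind can be illustrated with a simple one-at-a-time scheme: perturb each parameter from its nominal value and rank by the size of the response change. The `fuel_response` function and its three parameters are hypothetical stand-ins for illustration, not FEAST-Metal models:

    ```python
    def rank_parameters(model, nominal, delta=0.05):
        """One-at-a-time sensitivity ranking: perturb each parameter by +delta
        (relative) and rank by absolute change in the model response."""
        base = model(nominal)
        effects = {}
        for name, value in nominal.items():
            perturbed = dict(nominal, **{name: value * (1 + delta)})
            effects[name] = abs(model(perturbed) - base)
        return sorted(effects, key=effects.get, reverse=True)

    def fuel_response(p):
        """Hypothetical scalar response standing in for a fuel-performance output."""
        return p["conductivity"] ** -1.5 * p["gas_release"] * (1 + 0.01 * p["creep"])

    nominal = {"conductivity": 20.0, "gas_release": 0.3, "creep": 1.0}
    ranking = rank_parameters(fuel_response, nominal)
    ```

    Production studies use global methods (e.g. variance-based indices) rather than one-at-a-time screening, but the output, an importance ordering over calibration parameters, serves the same narrowing-down purpose.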

  17. High-performance computational fluid dynamics: a custom-code approach

    International Nuclear Information System (INIS)

    Fannon, James; Náraigh, Lennon Ó; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain

    2016-01-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier–Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing. (paper)
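    The pressure-driven single-phase channel flow that the simplified solver targets has the classic plane-Poiseuille steady solution u(y) = (-dp/dx)/(2ν) · y(h - y), which a few lines of explicit relaxation reproduce. This is serial NumPy for illustration, not the Fortran 90/MPI code; the grid and parameter values are arbitrary:

    ```python
    import numpy as np

    def channel_flow(ny=51, h=1.0, dpdx=-1.0, nu=1.0, dt=1e-4, steps=20000):
        """Explicit relaxation of du/dt = -(1/rho) dp/dx + nu d2u/dy2 (rho = 1)
        to the steady laminar profile between no-slip walls at y = 0 and y = h."""
        y = np.linspace(0.0, h, ny)
        u = np.zeros(ny)
        dy = y[1] - y[0]
        for _ in range(steps):
            lap = (u[2:] - 2 * u[1:-1] + u[:-2]) / dy**2
            u[1:-1] += dt * (-dpdx + nu * lap)    # walls stay at u = 0
        return y, u

    y, u = channel_flow()
    # steady profile matches (-dpdx / (2 * nu)) * y * (h - y) at the grid nodes
    ```

    Decomposing the y-range across MPI ranks and exchanging the halo points needed by the Laplacian stencil is the essential step from this sketch to the parallel code.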

  18. High-performance computational fluid dynamics: a custom-code approach

    Science.gov (United States)

    Fannon, James; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain; Náraigh, Lennon Ó.

    2016-07-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier-Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing.

  19. A Conceptual Framework for Evaluation of Public Health and Primary Care System Performance in Iran

    Science.gov (United States)

    Jahanmehr, Nader; Rashidian, Arash; Khosravi, Ardeshir; Farzadfar, Farshad; Shariati, Mohammad; Majdzadeh, Reza; Sari, Ali Akbari; Mesdaghinia, Alireza

    2015-01-01

    Introduction: The main objective of this study was to design a conceptual framework, according to the policies and priorities of the ministry of health to evaluate provincial public health and primary care performance and to assess their share in the overall health impacts of the community. Methods: We used several tools and techniques, including system thinking, literature review to identify relevant attributes of health system performance framework and interview with the key stakeholders. The PubMed, Scopus, web of science, Google Scholar and two specialized databases of Persian language literature (IranMedex and SID) were searched using main terms and keywords. Following decision-making and collective agreement among the different stakeholders, 51 core indicators were chosen from among 602 obtained indicators in a four stage process, for monitoring and evaluation of Health Deputies. Results: We proposed a conceptual framework by identifying the performance area for Health Deputies between other determinants of health, as well as introducing a chain of results, for performance, consisting of Input, Process, Output and Outcome indicators. We also proposed 5 dimensions for measuring the performance of Health Deputies, consisting of efficiency, effectiveness, equity, access and improvement of health status. Conclusion: The proposed Conceptual Framework illustrates clearly the Health Deputies success in achieving best results and consequences of health in the country. Having the relative commitment of the ministry of health and Health Deputies at the University of Medical Sciences is essential for full implementation of this framework and providing the annual performance report. PMID:25946937

  20. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Science.gov (United States)

    DeTar, Carleton; Gottlieb, Steven; Li, Ruizi; Toussaint, Doug

    2018-03-01

    With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  1. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Directory of Open Access Journals (Sweden)

    DeTar Carleton

    2018-01-01

    Full Text Available With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  2. PERFORMANCE ANALYSIS OF OPTICAL CDMA SYSTEM USING VC CODE FAMILY UNDER VARIOUS OPTICAL PARAMETERS

    Directory of Open Access Journals (Sweden)

    HASSAN YOUSIF AHMED

    2012-06-01

    Full Text Available The intent of this paper is to study the performance of spectral-amplitude-coding optical code-division multiple-access (OCDMA) systems using the Vector Combinatorial (VC) code under various optical parameters. This code can be constructed algebraically, based on Euclidean vectors, for any positive integer. One of the important properties of this code is that its maximum cross-correlation is always one, which means that multi-user interference (MUI) and phase-induced intensity noise are reduced. Transmitter and receiver structures based on unchirped fiber Bragg gratings (FBGs) using the VC code, taking into account the effects of intensity, shot and thermal noise sources, are demonstrated. The impact of fiber distance on bit error rate (BER) is reported using a commercial optical systems simulator, virtual photonic instrument, VPI(TM). The VC code is compared mathematically with reported codes which use similar techniques. We analyzed and characterized the fiber link, received power, BER and channel spacing. The performance and optimization of the VC code in a SAC-OCDMA system are reported. By comparing the theoretical and simulation results taken from VPI(TM), we have demonstrated that, for a high number of users, the effective power source is adequate when the VC code is used, even at higher data rates. It is also found that as the channel spacing goes from very narrow to wider, the BER decreases, with the best performance occurring at a spacing bandwidth between 0.8 and 1 nm. We have shown that a SAC system utilizing the VC code significantly improves performance compared with the reported codes.

  3. Code division multiple-access techniques in optical fiber networks. II - Systems performance analysis

    Science.gov (United States)

    Salehi, Jawad A.; Brackett, Charles A.

    1989-08-01

    A technique based on optical orthogonal codes was presented by Salehi (1989) to establish a fiber-optic code-division multiple-access (FO-CDMA) communications system. The results are used to derive the bit error rate of the proposed FO-CDMA system as a function of data rate, code length, code weight, number of users, and receiver threshold. The performance characteristics for a variety of system parameters are discussed. A means of reducing the effective multiple-access interference signal by placing an optical hard-limiter at the front end of the desired optical correlator is presented. Performance calculations are shown for the FO-CDMA with an ideal optical hard-limiter, and it is shown that using an optical hard-limiter would, in general, improve system performance.
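    The benefit of the ideal optical hard-limiter can be seen in a toy chip-level example: when interferers pile several pulses onto a single chip of the desired user's sparse code, clipping every chip to at most one pulse before correlation prevents a false detection. The code pattern, threshold, and interference pattern below are illustrative, not from the paper:

    ```python
    def correlate(received, code, threshold, hard_limit=False):
        """Correlator decision for one user; an ideal optical hard-limiter
        clips each chip's intensity to at most one pulse before correlation."""
        if hard_limit:
            received = [min(chip, 1) for chip in received]
        return sum(r * c for r, c in zip(received, code)) >= threshold

    code = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0]   # sparse code, weight w = 3
    # Desired user sent no pulse; three interferers all happen to hit chip 0:
    received = [3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
    bit_plain = correlate(received, code, threshold=3)     # false alarm: 3 >= 3
    bit_limited = correlate(received, code, threshold=3, hard_limit=True)
    ```

    Without the limiter the three stacked pulses on one chip reach the threshold alone; with it, the same interference contributes at most one count per chip, so at least w distinct chips must be hit to cause an error.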

  4. A Conceptual Framework for the Electronic Performance Support Systems within IBM Lotus Notes 6 (LN6 Example

    Directory of Open Access Journals (Sweden)

    Servet BAYRAM

    2005-10-01

    Full Text Available The concept of Electronic Performance Support Systems (EPSS) encompasses multimedia and computer-based instruction components that improve human performance by providing process simplification, performance information and decision support. EPSS has become a hot topic for organizational development, human resources, performance technology, training, and educational development professionals. A conceptual framework of EPSS is constructed under five interrelated and interdependent domains for educational implications. The domains of the framework are online collaboration, cost-effectiveness, motivation, service management, and performance empowering. IBM Lotus Notes 6 (LN6) is used as an example application tool to illustrate the power of this framework. The framework describes a set of relevant events, based upon deductive analyses, for improving our understanding of EPSS and its implications for education and training. The article also points out that there are some similarities between the specific features of EPSS and LN6 within this conceptual framework, which can provide guidelines and benefits to researchers, educators, and designers as well.

  5. Improving performance of DS-CDMA systems using chaotic complex Bernoulli spreading codes

    Science.gov (United States)

    Farzan Sabahi, Mohammad; Dehghanfard, Ali

    2014-12-01

    The most important goal of a spread-spectrum communication system is to protect communication signals against interference and against exploitation of information by unintended listeners. In fact, low probability of detection and low probability of intercept are two important parameters for increasing the performance of the system. In Direct Sequence Code Division Multiple Access (DS-CDMA) systems, these properties are achieved by multiplying the data by spreading sequences. Chaotic sequences, with their particular properties, have numerous applications in constructing spreading codes. The use of a one-dimensional Bernoulli chaotic sequence as a spreading code has been proposed previously in the literature. The main feature of this sequence is its negative auto-correlation at lag 1, which, with proper design, increases the efficiency of communication systems based on these codes. On the other hand, employing complex chaotic sequences as spreading sequences has also been discussed in several papers. In this paper, the use of two-dimensional Bernoulli chaotic sequences as spreading codes is proposed. The performance of a multi-user synchronous and asynchronous DS-CDMA system is evaluated by applying these sequences under Additive White Gaussian Noise (AWGN) and fading channels. Simulation results indicate improved performance in comparison with conventional spreading codes such as Gold codes, as well as with similar complex chaotic spreading sequences. Similar to one-dimensional Bernoulli chaotic sequences, the proposed sequences also have negative auto-correlation. Besides, construction of complex sequences with lower average cross-correlation is possible with the proposed method.
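
    The spreading-and-despreading mechanism the record describes can be sketched for a single user. This is a hedged illustration: it uses a one-dimensional Bernoulli shift map (with gain 1.99, slightly below 2, to avoid the finite-precision collapse of the exact doubling map) rather than the paper's two-dimensional complex construction, and the seed, processing gain, and noise level are invented.

```python
import numpy as np

def bernoulli_chips(x0, n, b=1.99):
    # Bernoulli shift map x_{k+1} = b*x_k mod 1, mapped to bipolar chips.
    x = np.empty(n)
    x[0] = x0
    for k in range(1, n):
        x[k] = (b * x[k - 1]) % 1.0
    return np.where(x < 0.5, -1.0, 1.0)

G = 64                                     # processing gain (chips per bit)
code = bernoulli_chips(0.137, G)           # hypothetical seed
bits = np.array([1.0, -1.0, 1.0, 1.0, -1.0])

tx = np.repeat(bits, G) * np.tile(code, bits.size)      # spreading
rng = np.random.default_rng(0)
rx = tx + 0.1 * rng.standard_normal(tx.size)            # AWGN channel

decisions = np.sign(rx.reshape(-1, G) @ code)           # correlate and decide
print(decisions)
```

    The correlator recovers the bits because the signal term grows with the processing gain G while the noise projection onto the code grows only with its square root.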

  6. Fusion PIC code performance analysis on the Cori KNL system

    Energy Technology Data Exchange (ETDEWEB)

    Koskela, Tuomas S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)]; Deslippe, Jack [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)]; Friesen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)]; Raman, Karthic [INTEL Corp. (United States)]

    2017-05-25

    We study the attainable performance of particle-in-cell (PIC) codes on the Cori KNL system by analyzing a miniature particle-push application based on the fusion PIC code XGC1. We start from the most basic building blocks of a PIC code and build up the complexity to identify the kernels that cost the most in performance, and focus optimization efforts there. Particle-push kernels operate at high arithmetic intensity (AI) and are not likely to be memory-bandwidth or even cache-bandwidth bound on KNL. Therefore, we see only minor benefits from the high-bandwidth memory available on KNL, and achieving good vectorization is shown to be the most beneficial optimization path, with a theoretical yield of up to 8x speedup on KNL. In practice we are able to obtain up to a 4x gain from vectorization due to limitations set by the data layout and memory latency.
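
    The data-layout point made above can be illustrated with a structure-of-arrays (SoA) toy push. This is a simplified 1-D electrostatic leapfrog step, not XGC1's gyrokinetic kernel; the field, particle count, charge-to-mass ratio, and step size are made up.

```python
import numpy as np

# SoA layout: one contiguous array per particle attribute, the layout
# SIMD units want. A toy 1-D electrostatic push, not XGC1's kernel.
n, dt, qm = 100_000, 1e-3, -1.0
rng = np.random.default_rng(1)
x = rng.random(n)                          # positions on a periodic [0, 1) domain
v = rng.standard_normal(n)                 # velocities

def efield(pos):
    return np.sin(2.0 * np.pi * pos)       # analytic toy field, no field solve

# One push step: each line below is a long unit-stride loop over all
# particles, which is the shape a compiler needs for SIMD vectorization.
v += qm * efield(x) * dt
x = (x + v * dt) % 1.0                     # periodic wrap

print(x.shape)
```

    Because each attribute lives in its own contiguous array, every statement of the push streams through memory with unit stride; an array-of-structures layout would instead gather strided fields and defeat vectorization.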

  7. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of)]; Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of)]; Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)]

    2003-10-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for multi-compartment containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis of Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of three compartments, comprising the reactor containment and the RWST. In addition, the containment spray system and the containment recirculation system are simulated. Under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC calculation shows very good agreement: the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC could provide reasonable containment pressure and temperature responses when the inherent conservatism of the CONTEMPT-LT code is taken into account.

  8. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for multi-compartment containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis of Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of three compartments, comprising the reactor containment and the RWST. In addition, the containment spray system and the containment recirculation system are simulated. Under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC calculation shows very good agreement: the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC could provide reasonable containment pressure and temperature responses when the inherent conservatism of the CONTEMPT-LT code is taken into account.

  9. Cold neutron source conceptual designing for Tehran Research Reactor

    International Nuclear Information System (INIS)

    Khajvand, N.; Mirvakili, S.M.; Faghihi, F.

    2016-01-01

    Highlights: • Cold neutron source conceptual design for the Tehran Research Reactor is carried out. • The type and geometry of the moderator and the dimensions of the cold neutron source are analyzed. • Liquid hydrogen with a higher ortho-concentration can be a better option as moderator. - Abstract: A cold neutron source (CNS) conceptual design for the Tehran Research Reactor (TRR) was carried out using the MCNPX code. In this study, a horizontal beam tube of the core with the highest thermal flux was selected, and a parametric analysis was performed to choose the type and geometry of the moderator and the CNS dimensions required to maximize cold neutron production. In this design the moderator cell has a spherical annulus structure, and the cold neutron flux and its brightness are calculated together with the nuclear heat load of the CNS for a variety of materials, including liquid hydrogen, liquid deuterium, and solid methane. Based on our study, liquid hydrogen with more ortho- than para-concentration, and solid methane, are the best options.

  10. CONCEPTUAL FRAMEWORK FOR ESTIMATING THE PERFORMANCE OF INTERORGANIZATIONAL COLLABORATIVE INNOVATION

    Directory of Open Access Journals (Sweden)

    Dan SERGHIE

    2013-10-01

    Full Text Available The construction of a sequential performance model based on indicators broken down by factors and variables is relevant because it provides a complete picture of the effectiveness of collaborative structures built and operated on the basis of policies induced by the organizations involved. It also provides a longitudinal analysis of the effectiveness of collaboration for innovation. I will define the performance analysis model of collaborative innovation as a conceptual tool consisting of a set of elements and the relationships between them, allowing the quantification of innovation performance as a result of the interaction of several organizations. Applying such a model involves the analysis and estimation of the added value of each segment of the interorganizational innovation cycle as part of the overall performance obtained by combining existing or created knowledge. From this point of view, it is necessary to develop an ontology, a common ground on which this model can be built.

  11. CONCEPT computer code

    International Nuclear Information System (INIS)

    Delene, J.

    1984-01-01

    CONCEPT is a computer code that will provide conceptual capital investment cost estimates for nuclear and coal-fired power plants. The code can develop an estimate for construction at any point in time. Any unit size within the range of about 400 to 1300 MW electric may be selected. Any of 23 reference site locations across the United States and Canada may be selected. PWR, BWR, and coal-fired plants burning high-sulfur and low-sulfur coal can be estimated. Multiple-unit plants can be estimated. Costs due to escalation/inflation and interest during construction are calculated.
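
    The escalation and interest-during-construction adders mentioned in the record can be illustrated with a toy calculation. All rates, the schedule, and the uniform spending profile below are hypothetical, not CONCEPT's actual cost models.

```python
# Toy sketch of the two time-dependent cost adders named above: escalation
# to the construction period and interest during construction (IDC).
base_cost = 1200.0                 # overnight cost, M$ in reference-year dollars
esc_rate, int_rate, years = 0.05, 0.08, 6   # hypothetical rates and schedule

escalated = base_cost * (1.0 + esc_rate) ** years      # escalate to start date

spend = [escalated / years] * years                    # uniform annual outlays
idc = sum(s * ((1.0 + int_rate) ** (years - t - 0.5) - 1.0)  # mid-year rule
          for t, s in enumerate(spend))

total = escalated + idc
print(round(escalated, 1), round(idc, 1), round(total, 1))
```

    Escalation compounds the overnight cost to the construction date, and IDC charges carrying cost on each year's cash flow from its (mid-year) outlay until completion.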

  12. Sensitivity assessment of fuel performance codes for LOCA accident scenario

    Energy Technology Data Exchange (ETDEWEB)

    Abe, Alfredo; Gomes, Daniel; Silva, Antonio Teixeira e; Muniz, Rafael O.R. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]; Giovedi, Claudia; Martins, Marcelo, E-mail: ayabe@ipen.br, E-mail: claudia.giovedi@labrisco.usp.br [Universidade de Sao Paulo (LABRISCO/USP), Sao Paulo, SP (Brazil). Lab. de Análise, Avaliação e Gerenciamento de Risco]

    2017-07-01

    The FRAPCON code predicts fuel rod performance in LWRs (Light Water Reactors) by modeling fuel responses under normal operating conditions and anticipated operational occurrences; the FRAPTRAN code is applied to fuel behavior under fast transient and accident conditions. The codes are well known and applied for different purposes, and one of these uses is sensitivity analysis considering fuel design parameters associated with fabrication; moreover, the effect of physical model bias can be addressed. The objective of this work was to perform an assessment of fuel manufacturing parameter tolerances and physical model bias using the FRAPCON and FRAPTRAN codes for a Loss of Coolant Accident (LOCA) scenario. The preliminary analysis considered a direct approach taking into account the most relevant manufacturing tolerances (lower and upper bounds) related to design parameters, and physical model bias, without considering their statistical distributions. The simulations were carried out using data available in the open literature related to the series of LOCA experiments performed at the Halden reactor (specifically IFA-650.5). The manufacturing tolerances associated with the design parameters considered in this paper were: enrichment, cladding thickness, pellet diameter, pellet density, and filling gas pressure. The physical models considered were: fuel thermal expansion, fission gas release, fuel swelling, irradiation creep, cladding thermal expansion, cladding corrosion, and cladding hydrogen pickup. The results obtained from the sensitivity analysis addressed the impact of manufacturing tolerances and physical models on the fuel cladding burst time observed in the IFA-650.5 experiment. (author)
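
    The "direct approach" described above amounts to a one-at-a-time sweep over tolerance bounds. The sketch below mirrors that loop with a hypothetical linear burst-time surrogate in place of an actual FRAPCON/FRAPTRAN run; the nominal values and the ±2% bounds are invented for illustration.

```python
# One-at-a-time sweep over manufacturing-tolerance bounds (direct approach).
nominal = {
    "enrichment_pct": 4.95,
    "clad_thickness_mm": 0.57,
    "pellet_diameter_mm": 8.19,
    "pellet_density_frac": 0.95,
    "fill_pressure_MPa": 2.00,
}
bounds = {k: 0.02 * v for k, v in nominal.items()}     # hypothetical ±2%

def burst_time_s(p):
    # Hypothetical linear surrogate standing in for a fuel-code run.
    return 100.0 + 5.0 * p["clad_thickness_mm"] - 0.3 * p["fill_pressure_MPa"]

base = burst_time_s(nominal)
deltas = {}
for name in nominal:                                   # perturb one at a time
    for sign in (-1.0, +1.0):
        case = dict(nominal, **{name: nominal[name] + sign * bounds[name]})
        deltas[(name, sign)] = burst_time_s(case) - base

for (name, sign), d in deltas.items():
    print(f"{name:>20s} {sign:+.0f}: dt_burst = {d:+.5f} s")
```

    Each (parameter, bound) pair yields one perturbed case, so the full sweep here costs 2N surrogate evaluations; in the actual study each evaluation is a code run.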

  13. Sensitivity assessment of fuel performance codes for LOCA accident scenario

    International Nuclear Information System (INIS)

    Abe, Alfredo; Gomes, Daniel; Silva, Antonio Teixeira e; Muniz, Rafael O.R.; Giovedi, Claudia; Martins, Marcelo

    2017-01-01

    The FRAPCON code predicts fuel rod performance in LWRs (Light Water Reactors) by modeling fuel responses under normal operating conditions and anticipated operational occurrences; the FRAPTRAN code is applied to fuel behavior under fast transient and accident conditions. The codes are well known and applied for different purposes, and one of these uses is sensitivity analysis considering fuel design parameters associated with fabrication; moreover, the effect of physical model bias can be addressed. The objective of this work was to perform an assessment of fuel manufacturing parameter tolerances and physical model bias using the FRAPCON and FRAPTRAN codes for a Loss of Coolant Accident (LOCA) scenario. The preliminary analysis considered a direct approach taking into account the most relevant manufacturing tolerances (lower and upper bounds) related to design parameters, and physical model bias, without considering their statistical distributions. The simulations were carried out using data available in the open literature related to the series of LOCA experiments performed at the Halden reactor (specifically IFA-650.5). The manufacturing tolerances associated with the design parameters considered in this paper were: enrichment, cladding thickness, pellet diameter, pellet density, and filling gas pressure. The physical models considered were: fuel thermal expansion, fission gas release, fuel swelling, irradiation creep, cladding thermal expansion, cladding corrosion, and cladding hydrogen pickup. The results obtained from the sensitivity analysis addressed the impact of manufacturing tolerances and physical models on the fuel cladding burst time observed in the IFA-650.5 experiment. (author)

  14. Student performance on conceptual questions: Does instruction matter?

    Science.gov (United States)

    Heron, Paula R. L.

    2013-01-01

    As part of the tutorial component of introductory calculus-based physics at the University of Washington, students take weekly pretests that consist of conceptual questions. Pretests are so named because they precede each tutorial, but they are frequently administered after lecture instruction. Many variables associated with class composition and prior instruction (if any) could, in principle, affect student performance on these questions. Nonetheless, the results are often found to be "essentially the same" in all classes. With data available from a large number of classes, it is possible to characterize the typical variation quantitatively. In this paper three questions for which we have accumulated thousands of responses, from dozens of classes representing different conditions with respect to the textbook in use, the amount of prior instruction, etc., serve as examples. For each question, we examine the variation in student performance across all classes. We also compare subsets categorized according to the amount of relevant prior instruction each class had received. A preliminary analysis suggests that the variation in performance is essentially random. No statistically significant difference is observed between results obtained before relevant instruction begins and after it has been completed. The results provide evidence that exposure to concepts in lecture and textbook is not sufficient to ensure an improvement in performance on questions that require qualitative reasoning.
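
    The kind of comparison the abstract reports ("no statistically significant difference") can be illustrated with a two-proportion z-test on invented class counts; the numbers below are not the paper's data.

```python
import math

# Two-proportion z-test between pretest scores of two hypothetical classes.
def two_prop_z(correct_a, n_a, correct_b, n_b):
    p_a, p_b = correct_a / n_a, correct_b / n_b
    p = (correct_a + correct_b) / (n_a + n_b)          # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # pooled standard error
    return (p_a - p_b) / se

# Invented counts: one class before relevant instruction, one after.
z = two_prop_z(87, 210, 95, 240)
print(abs(z) < 1.96)    # inside the 5% critical value: not significant
```

    With proportions this close and class sizes of a few hundred, the z statistic falls well inside the 5% critical value, the same qualitative outcome the study reports across its classes.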

  15. Sources/treatment of uncertainties in the performance assessment of geologic radioactive waste repositories

    International Nuclear Information System (INIS)

    Cranwell, R.M.

    1987-01-01

    Uncertainties in the performance assessment of geologic radioactive waste repositories have several sources. The more important ones include: 1) uncertainty in the conditions of a disposal system over the temporal scales set forth in regulations, 2) uncertainty in the conceptualization of the geohydrologic system, 3) uncertainty in the theoretical description of a given conceptual model of the system, 4) uncertainty in the development of computer codes to implement the solution of a mathematical model, and 5) uncertainty in the parameters and data required in the models and codes used to assess the long-term performance of the disposal system. This paper discusses each of these uncertainties and outlines methods for addressing them.

  16. Modelling of LOCA Tests with the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L [Idaho National Laboratory]; Pastore, Giovanni [Idaho National Laboratory]; Novascone, Stephen Rhead [Idaho National Laboratory]; Spencer, Benjamin Whiting [Idaho National Laboratory]; Hales, Jason Dean [Idaho National Laboratory]

    2016-05-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.
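
    The ballooning-and-burst comparison described in this record can be caricatured with a thin-wall hoop-stress check against a temperature-dependent burst-stress curve. Everything numeric below (geometry, heating and pressure histories, burst-curve coefficients) is invented for illustration and is not BISON's model.

```python
import math

# Toy cladding burst check: thin-wall hoop stress vs. a hypothetical
# temperature-dependent burst-stress curve during a prescribed transient.
r, t = 4.75e-3, 0.57e-3            # cladding mid-radius and thickness, m

def hoop_stress(p_diff):
    return p_diff * r / t          # thin-wall approximation, Pa

def burst_stress(T_K, a=8.3e8, b=2.0e-3):
    return a * math.exp(-b * T_K)  # invented exponential-decay curve

burst_time = None
for step in range(600):            # 600 s transient, 1 s steps
    T = 600.0 + 1.5 * step         # made-up heat-up rate, K
    p = 5.0e6 + 2.0e3 * step       # made-up rod pressure history, Pa
    if hoop_stress(p) > burst_stress(T):
        burst_time = step          # first step where stress exceeds strength
        break

print(burst_time)
```

    The burst time falls where the rising hoop stress first crosses the falling burst strength; a real code evaluates both sides from coupled thermo-mechanical fields rather than prescribed histories.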

  17. GAP--a PIC-type fluid code

    International Nuclear Information System (INIS)

    Marder, B.M.

    1975-01-01

    GAP, a PIC-type fluid code for computing compressible flows, is described and demonstrated. While retaining some features of PIC, it is felt that the GAP approach is conceptually and operationally simpler. 9 figures

  18. A methodology for aeroelastic constraint analysis in a conceptual design environment

    Science.gov (United States)

    de Baets, Peter Wilfried Gaston

    The objective of this study is the infusion of aeroelastic constraint knowledge into the design space. The mapping of such aeroelastic information in the conceptual design space has long been a desire of the design community. The conceptual design phase of an aircraft is a multidisciplinary environment and has the most influence on the future design of the vehicle. However, sufficient results cannot be obtained in a timely enough manner to materially contribute to early design decisions. Furthermore, the natural division of the engineering team into specialty groups is not well supported by the monolithic aerodynamic-structures codes typically used in modern aeroelastic analysis. The research examines how the Bi-Level Integrated System Synthesis decomposition technique can be adapted to perform as the conceptual aeroelastic design tool. The study describes a comprehensive solution of the coupled aeroelastic problem cast in this decomposition format and implemented in an integrated framework. The method is supported by application details of a proof-of-concept high-speed vehicle. Physics-based codes such as a finite element method and an aerodynamic panel method are used to model the high-definition geometric characteristics of the vehicle. A synthesis and sizing code was added to referee the conflicts that arise between the two disciplines. This research's novelty lies in four points. First is the use of physics-based tools at the conceptual design phase to calculate the aeroelastic properties. Second is the projection of flutter and divergence velocity constraint lines in a power loading versus wing loading graph. Third is the aeroelastic assessment time reduction, which has moved from a matter of years to months. Lastly, this assessment allowed verification of the impact of changing velocity, altitude, and angle of attack on the aeroelastic properties. This then allowed identification of robust design space with respect to these three mission properties. The method

  19. User manual for the probabilistic fuel performance code FRP

    International Nuclear Information System (INIS)

    Friis Jensen, J.; Misfeldt, I.

    1980-10-01

    This report describes the use of the probabilistic fuel performance code FRP. Detailed description of both input to and output from the program are given. The use of the program is illustrated by an example. (author)

  20. Performance of Low-Density Parity-Check Coded Modulation

    Science.gov (United States)

    Hamkins, Jon

    2010-01-01

    This paper reports the simulated performance of each of the nine accumulate-repeat-4-jagged-accumulate (AR4JA) low-density parity-check (LDPC) codes [3] when used in conjunction with binary phase-shift keying (BPSK), quadrature PSK (QPSK), 8-PSK, 16-ary amplitude PSK (16-APSK), and 32-APSK. We also report the performance under various mappings of bits to modulation symbols, 16-APSK and 32-APSK ring scalings, log-likelihood ratio (LLR) approximations, and decoder variations. One of the simple and well-performing LLR approximations can be expressed in a general equation that applies to all of the modulation types.
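
    The LLR approximation point can be illustrated for one noisy PSK symbol: the max-log rule replaces each log-sum-of-exponentials with its largest term. The 8-PSK constellation and Gray labeling below are generic examples, not the specific mappings evaluated in the paper.

```python
import numpy as np

# Exact vs. max-log bit LLRs for one 8-PSK symbol over AWGN.
M = 8
pts = np.exp(2j * np.pi * np.arange(M) / M)          # unit-energy 8-PSK
labels = np.array([0, 1, 3, 2, 6, 7, 5, 4])          # a Gray labeling (example)
bits = (labels[:, None] >> np.arange(3)[::-1]) & 1   # MSB..LSB per symbol

def llrs(y, sigma2):
    metric = -np.abs(y - pts) ** 2 / sigma2          # Gaussian log-likelihoods
    exact, maxlog = [], []
    for b in range(3):
        m0 = metric[bits[:, b] == 0]
        m1 = metric[bits[:, b] == 1]
        exact.append(np.log(np.exp(m0).sum()) - np.log(np.exp(m1).sum()))
        maxlog.append(m0.max() - m1.max())           # max-log approximation
    return np.array(exact), np.array(maxlog)

y = pts[2] + 0.05 * (1 + 1j)      # noisy observation of symbol index 2
exact, approx = llrs(y, sigma2=0.1)
print(np.sign(exact), np.sign(approx))
```

    At moderate-to-high SNR one exponential dominates each sum, so the max-log values track the exact LLRs closely while avoiding transcendental evaluations in the decoder front end.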

  1. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]; Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States)]; Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)]

    2013-05-15

    The AGREE code was originally developed as a multiphysics simulation code to perform design and safety analyses of Pebble Bed Reactors (PBRs). Currently, an additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analysis of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of the multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as for transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data such as control rod withdrawal tests and loss-of-forced-convection tests.

  2. MINIMARS conceptual design: Final report

    International Nuclear Information System (INIS)

    Lee, J.D.

    1986-09-01

    This volume of the conceptual design report contains detailed information on the following: (1) plasma engineering, (2) tandem mirror optimization code, (3) configuration, (4) assembly and maintenance, (5) availability, (6) site and facilities, (7) magnet design, (8) end-cell shielding, (9) drift pumping system, (10) rf systems, (11) negative-ion neutral beam injection system, (12) sloshing-ion beamline, and (13) power balance and electrical systems

  3. Verification testing of the compression performance of the HEVC screen content coding extensions

    Science.gov (United States)

    Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng

    2017-09-01

    This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and using similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements show a very substantial benefit in coding efficiency for the SCC extensions, and provided consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60-90% relative to the JM and 40-80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.
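
    The BD bit-rate metric used in the evaluation can be sketched directly: fit log-rate as a cubic in PSNR for each codec and average the horizontal gap over the overlapping quality range. The rate/PSNR operating points below are invented, not the verification-test data.

```python
import numpy as np

# Bjøntegaard-delta bit-rate (BD-rate) between two rate-distortion curves.
def bd_rate(rates_ref, psnr_ref, rates_test, psnr_test):
    lr_ref, lr_test = np.log(rates_ref), np.log(rates_test)
    p_ref = np.polyfit(psnr_ref, lr_ref, 3)    # log-rate as cubic in PSNR
    p_test = np.polyfit(psnr_test, lr_test, 3)
    lo = max(min(psnr_ref), min(psnr_test))    # overlapping PSNR range
    hi = min(max(psnr_ref), max(psnr_test))
    int_ref, int_test = np.polyint(p_ref), np.polyint(p_test)
    avg_diff = ((np.polyval(int_test, hi) - np.polyval(int_test, lo))
                - (np.polyval(int_ref, hi) - np.polyval(int_ref, lo))) / (hi - lo)
    return (np.exp(avg_diff) - 1.0) * 100.0    # % rate change at equal PSNR

# Invented operating points: four (kbps, PSNR dB) pairs per codec.
anchor_pts = ([1000, 2000, 4000, 8000], [34.0, 36.5, 39.0, 41.5])
scc_pts = ([600, 1200, 2400, 4800], [34.2, 36.8, 39.2, 41.6])
saving = bd_rate(*anchor_pts, *scc_pts)
print(round(saving, 1))    # negative value = bit-rate saving vs. the anchor
```

    Averaging the gap between the fitted curves in the log-rate domain is what makes the result a percentage saving at equal PSNR rather than at equal rate.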

  4. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  5. Conceptual study of advanced PWR systems. A study of passive and inherent safety design concepts for advanced light water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Soon Heung; No, Hee Cheon; Baek, Won Pil; Jae, Shim Young; Lee, Goung Jin; Na, Man Gyun; Lee, Jae Young; Kim, Han Gon; Kang, Ki Sig; Moon, Sang Ki; Kim, Yun Il; Park, Jae Wook; Yang, Soo Hyung; Kim, Soo Hyung; Lee, Seong Wook; Kim, Hong Che; Park, Hyun Sik; Jeong, Ji Hwan; Lee, Sang Il; Jung, Hae Yong; Kim, Hyong Tae; Chae, Kyung Sun; Moon, Ki Hoon [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)]

    1995-08-01

    The five thermal-hydraulic concepts chosen for the advanced PWR have been studied as follows: (1) Critical Heat Flux: review of previous works, analysis of parametric trends, analysis of transient CHF characteristics, extension of the CHF data bank, survey and assessment of correlations, and design of an intermediate-pressure CHF test loop have been performed. (2) Passive Cooling Concepts for the Concrete Containment System: review of condensation phenomena with noncondensable gases, selection of a promising concept (i.e., use of external condensers), and design of a test loop according to scaling laws have been accomplished. (3) Computer programs based on the control-volume approach and the conceptual design of a test loop have been accomplished. (4) Fluidic Diode Concepts: review of previous applications of the concept, analysis of the major parameters affecting the performance, development of a computational code, and conceptual investigation of the verification test loop have been performed. (5) Wet Thermal Insulator: review of previous works, selection of promising methods (i.e., ceramic fiber in a steel case and a mirror-type insulator), and conceptual design of the experimental loop have been performed. (author). 9 refs.

  6. Design validation of the ITER EC upper launcher according to codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Spaeh, Peter, E-mail: peter.spaeh@kit.edu [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany)]; Aiello, Gaetano [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany)]; Gagliardi, Mario [Karlsruhe Institute of Technology, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); F4E, Fusion for Energy, Joint Undertaking, Barcelona (Spain)]; Grossetti, Giovanni; Meier, Andreas; Scherer, Theo; Schreck, Sabine; Strauss, Dirk; Vaccaro, Alessandro [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany)]; Weinhorst, Bastian [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany)]

    2015-10-15

    Highlights: • A set of applicable codes and standards has been chosen for the ITER EC upper launcher. • For a particular component, load combinations, failure modes and stress categorizations have been determined. • The design validation was performed in accordance with the “design by analysis” approach of the ASME Boiler and Pressure Vessel Code, Section III. - Abstract: The ITER electron cyclotron (EC) upper launcher passed the CDR (conceptual design review) in 2005 and the PDR (preliminary design review) in 2009 and is now in its final design phase. The final design will be elaborated by the European consortium ECHUL-CA with contributions from several research institutes in Germany, Italy, the Netherlands and Switzerland. Within this consortium KIT is responsible for the design of the structural components (the upper port plug, UPP) and also for the design integration of the launcher. As the selection of applicable codes and standards was under discussion for the past decade, the conceptual and the preliminary design of the launcher structure were not elaborated in straight accordance with a particular code but with a variety of well-acknowledged engineering practices. For the final design it is compulsory to validate the design with respect to a typical engineering code in order to be compliant with the ITER quality and nuclear requirements and to gain acceptance from the French regulator. This paper presents a typical design validation of the closure plate, which is the vacuum and tritium barrier and thus a safety-relevant component of the upper port plug (UPP), performed with the ASME Boiler and Pressure Vessel Code. Rationales for choosing this code are given, as well as a comparison between different design methods, such as the “design by rule” and the “design by analysis” approaches. The selection of proper load specifications and the identification of potential failure modes are also covered. In addition to that, stress categorizations, analyses

  7. Design validation of the ITER EC upper launcher according to codes and standards

    International Nuclear Information System (INIS)

    Spaeh, Peter; Aiello, Gaetano; Gagliardi, Mario; Grossetti, Giovanni; Meier, Andreas; Scherer, Theo; Schreck, Sabine; Strauss, Dirk; Vaccaro, Alessandro; Weinhorst, Bastian

    2015-01-01

    Highlights: • A set of applicable codes and standards has been chosen for the ITER EC upper launcher. • For a particular component, load combinations, failure modes and stress categorizations have been determined. • The design validation was performed in accordance with the “design by analysis” approach of the ASME boiler and pressure vessel code, Section III. - Abstract: The ITER electron cyclotron (EC) upper launcher passed the CDR (conceptual design review) in 2005 and the PDR (preliminary design review) in 2009 and is now in its final design phase. The final design will be elaborated by the European consortium ECHUL-CA with contributions from several research institutes in Germany, Italy, the Netherlands and Switzerland. Within this consortium KIT is responsible for the design of the structural components (the upper port plug, UPP) and also for the design integration of the launcher. As the selection of applicable codes and standards was under discussion for the past decade, the conceptual and preliminary designs of the launcher structure were not elaborated in strict accordance with a particular code but with a variety of well-acknowledged engineering practices. For the final design it is compulsory to validate the design against a typical engineering code in order to be compliant with the ITER quality and nuclear requirements and to gain acceptance from the French regulator. This paper presents the design validation of the closure plate, which is the vacuum and tritium barrier and thus a safety-relevant component of the upper port plug (UPP), performed with the ASME boiler and pressure vessel code. Rationales for choosing this code are given, as well as a comparison between different design methods, such as the “design by rule” and “design by analysis” approaches. The selection of proper load specifications and the identification of potential failure modes are also covered. In addition to that, stress categorizations, analyses

  8. Performance analysis of multiple interference suppression over asynchronous/synchronous optical code-division multiple-access system based on complementary/prime/shifted coding scheme

    Science.gov (United States)

    Nieh, Ta-Chun; Yang, Chao-Chin; Huang, Jen-Fa

    2011-08-01

    A complete complementary/prime/shifted prime (CPS) code family for the optical code-division multiple-access (OCDMA) system is proposed. Based on the properties of the complete complementary (CC) code, the multiple-access interference (MAI) can be suppressed and eliminated in a spectral amplitude coding (SAC) OCDMA system under asynchronous/synchronous transmission. By utilizing the shifted prime (SP) code in the SAC scheme, the hardware implementation of the encoder/decoder can be simplified with a reduced number of optical components, such as arrayed waveguide gratings (AWG) and fiber Bragg gratings (FBG). This system offers superior performance compared to previous bipolar-bipolar coding OCDMA systems.

  9. Performance analysis of wavelength/spatial coding system with fixed in-phase code matrices in OCDMA network

    Science.gov (United States)

    Tsai, Cheng-Mu; Liang, Tsair-Chun

    2011-12-01

    This paper proposes a wavelength/spatial (W/S) coding system with fixed in-phase code (FIPC) matrices in the optical code-division multiple-access (OCDMA) network. A scheme is presented to form the FIPC matrix, which is applied to construct the W/S OCDMA network. The encoder/decoder in the W/S OCDMA network is fully able to eliminate the multiple-access interference (MAI) at the balanced photo-detectors (PD), owing to the fixed in-phase cross-correlation. The phase-induced intensity noise (PIIN), which scales with the square of the received power, is markedly suppressed in the receiver by spreading the received power over each PD while the net signal power is kept the same. Simulation results show that the W/S OCDMA network based on the FIPC matrices can not only completely remove the MAI but also effectively suppress the PIIN, upgrading the network performance.
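
    As an illustration of how a fixed in-phase cross-correlation enables MAI cancellation at balanced photo-detectors, the sketch below uses a hypothetical three-user code set (weight w = 3, cross-correlation lambda = 1; these are not the FIPC matrices of the paper). Each decoder subtracts a complementary branch weighted by lambda/(w - lambda), so the interference from every other user cancels exactly:

```python
# Hypothetical illustration (not the paper's actual FIPC matrices): three
# length-9, weight-3 codes whose pairwise in-phase cross-correlation is
# fixed at lambda = 1 (every pair of codes shares exactly one chip).
CODES = [
    [1, 0, 0, 1, 0, 0, 1, 0, 0],
    [1, 0, 0, 0, 1, 0, 0, 0, 1],
    [1, 0, 0, 0, 0, 1, 0, 1, 0],
]
W, LAM = 3, 1  # code weight and fixed in-phase cross-correlation

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def balanced_detect(received, code):
    # Direct branch minus a weighted complementary branch: the weighting
    # LAM / (W - LAM) makes every interferer's contribution cancel.
    direct = dot(code, received)
    complementary = dot([1 - c for c in code], received)
    return direct - (LAM / (W - LAM)) * complementary

data = [1, 1, 0]  # each user's data bit gates its spectral code
received = [sum(d * c[i] for d, c in zip(data, CODES)) for i in range(9)]

recovered = [balanced_detect(received, c) / W for c in CODES]
print(recovered)  # -> [1.0, 1.0, 0.0]: the MAI cancels exactly
```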

  10. Performance Analysis of Wavelength-Multiplexed SAC OCDMA Codes in Beat Noise Mitigation in SAC OCDMA Systems

    Science.gov (United States)

    Alhassan, A. M.; Badruddin, N.; Saad, N. M.; Aljunid, S. A.

    2013-07-01

    In this paper we investigate the use of wavelength multiplexed spectral amplitude coding (WM SAC) codes for beat noise mitigation in coherent source SAC OCDMA systems. A WM SAC code is a low-weight SAC code whose whole code structure is repeated diagonally (once or more) in the wavelength domain to achieve the same cardinality as a higher-weight SAC code. Results show that for highly populated networks, the WM SAC codes provide better performance than SAC codes. However, for a small number of active users the situation is reversed. Apart from their promising improvement in performance, these codes are more flexible and impose less complexity on the system design than their SAC counterparts.

  11. A conceptual model of nurses' goal orientation, service behavior, and service performance.

    Science.gov (United States)

    Chien, Chun-Cheng; Chou, Hsin-Kai; Hung, Shuo-Tsung

    2008-01-01

    Based on the conceptual framework known as the "service triangle," the authors constructed a model of nurses' goal orientation, service behavior, and service performance to investigate the antecedents and consequences of the medical service behavior provided by nurses. This cross-sectional study collected data from 127 nurses in six hospitals using a mail-in questionnaire. Analysis of the model revealed that the customer-oriented behavior of nurses had a positive influence on organizational citizenship behavior, and both of these behaviors had a significant positive influence on service performance. The results also indicate that a higher learning goal orientation among nurses was associated with the performance of both observable customer-oriented behavior and organizational citizenship behavior.

  12. Analysis of parallel computing performance of the code MCNP

    International Nuclear Information System (INIS)

    Wang Lei; Wang Kan; Yu Ganglin

    2006-01-01

    Parallel computing can effectively reduce the running time of the code MCNP. With the MPI message-passing software, MCNP5 can perform parallel computing on a PC cluster running the Windows operating system. The parallel computing performance of MCNP is influenced by factors such as the type, the complexity level and the parameter configuration of the computing problem. This paper analyzes the parallel computing performance of MCNP with respect to these factors and gives measures to improve the MCNP parallel computing performance. (authors)
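
    The factor-dependence noted above can be reasoned about with Amdahl's law (a standard result, not something specific to MCNP): the serial fraction of a run bounds the achievable speedup no matter how many processors the cluster provides. A quick sketch, with an assumed 5% serial fraction:

```python
def amdahl_speedup(serial_fraction, n_procs):
    """Amdahl's law: the speedup attainable when a fraction of the
    work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

# With an assumed 5% serial fraction (e.g. input processing and tally
# collection), 16 processes deliver far less than a 16x speedup.
for n in (2, 4, 8, 16):
    print(n, round(amdahl_speedup(0.05, n), 2))  # 16 procs -> about 9.14
```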

  13. Does the new conceptual framework provide adequate concepts for reporting relevant information about performance?

    NARCIS (Netherlands)

    Brouwer, A.; Faramarzi, A.; Hoogendoorn, M.

    2014-01-01

    The basic question we raise in this paper is whether the 2013 Discussion Paper (DP 2013) on the Conceptual Framework provides adequate principles for reporting an entity’s performance and what improvements could be made in light of both user needs and evidence from academic literature. DP 2013

  14. Performance Comparison of Containment PT analysis between CAP and CONTEMPT Code

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Yeon Jun; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul [FNC Tech., Seoul (Korea, Republic of); Ha, Sang Jun; Choi, Hoon [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    CAP, linked with SPACE, computes the containment back-pressure during a LOCA. In the previous SAR (safety analysis report) for Shin-Kori Units 3 and 4, the CONTEMPT series of codes (hereafter simply 'CONTEMPT') was used to evaluate containment safety during the postulated loss-of-coolant accident (LOCA). In more detail, CONTEMPT-LT/028 was used to calculate the containment maximum PT, while CONTEMPT4/MOD5 was used to calculate the minimum PT. In the minimum PT analysis, CONTEMPT4/MOD5, which provides the containment back-pressure condition, was linked with RELAP5/MOD3.3, which calculates the amount of blowdown into the containment. In this analysis, CONTEMPT4/MOD5 was modified based on KREM. The CONTEMPT code was developed to predict the long-term behavior of water-cooled nuclear reactor containment systems subjected to LOCA conditions. It calculates the time variation of compartment pressures, temperatures, mass and energy inventories, heat structure temperature distributions, energy exchange with adjacent compartments, and the effect of leakage on containment response. Models are provided for the fan cooler and cooling spray as engineered safety systems. Any compartment may have both a liquid pool region and an air-vapor atmosphere region above the pool. Each region is assumed to have a uniform temperature, but the temperatures of the two regions may differ. As mentioned above, CONTEMPT has similar code features and is therefore expected to show analysis performance similar to that of CAP. In this study, the differences between CAP and two CONTEMPT code versions (CONTEMPT-LT/028 for maximum PT and CONTEMPT4/MOD5 for minimum PT) were identified in detail, and the code performances were compared for the same problem. A code-by-code comparison was carried out to identify the differences in LOCA analysis between the CONTEMPT series and the CAP code. With regard to important factors that affect the transient behavior of compartment thermodynamic

  15. Performance Comparison of Containment PT analysis between CAP and CONTEMPT Code

    International Nuclear Information System (INIS)

    Choo, Yeon Jun; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2013-01-01

    CAP, linked with SPACE, computes the containment back-pressure during a LOCA. In the previous SAR (safety analysis report) for Shin-Kori Units 3 and 4, the CONTEMPT series of codes (hereafter simply 'CONTEMPT') was used to evaluate containment safety during the postulated loss-of-coolant accident (LOCA). In more detail, CONTEMPT-LT/028 was used to calculate the containment maximum PT, while CONTEMPT4/MOD5 was used to calculate the minimum PT. In the minimum PT analysis, CONTEMPT4/MOD5, which provides the containment back-pressure condition, was linked with RELAP5/MOD3.3, which calculates the amount of blowdown into the containment. In this analysis, CONTEMPT4/MOD5 was modified based on KREM. The CONTEMPT code was developed to predict the long-term behavior of water-cooled nuclear reactor containment systems subjected to LOCA conditions. It calculates the time variation of compartment pressures, temperatures, mass and energy inventories, heat structure temperature distributions, energy exchange with adjacent compartments, and the effect of leakage on containment response. Models are provided for the fan cooler and cooling spray as engineered safety systems. Any compartment may have both a liquid pool region and an air-vapor atmosphere region above the pool. Each region is assumed to have a uniform temperature, but the temperatures of the two regions may differ. As mentioned above, CONTEMPT has similar code features and is therefore expected to show analysis performance similar to that of CAP. In this study, the differences between CAP and two CONTEMPT code versions (CONTEMPT-LT/028 for maximum PT and CONTEMPT4/MOD5 for minimum PT) were identified in detail, and the code performances were compared for the same problem. A code-by-code comparison was carried out to identify the differences in LOCA analysis between the CONTEMPT series and the CAP code. With regard to important factors that affect the transient behavior of the compartment thermodynamic state in

  16. Performance of the dot product function in radiative transfer code SORD

    Science.gov (United States)

    Korkin, Sergey; Lyapustin, Alexei; Sinyuk, Aliaksandr; Holben, Brent

    2016-10-01

    The successive orders of scattering radiative transfer (RT) codes frequently call the scalar (dot) product function. In this paper, we study the performance of several implementations of the dot product in the RT code SORD, using 50 scenarios for light scattering in the atmosphere-surface system. In the dot product function, we use the unrolled-loops technique with different unrolling factors. We also consider the intrinsic Fortran functions. We show results for two machines: the ifort compiler under Windows, and pgf90 under Linux. The intrinsic DOT_PRODUCT function showed the best performance with ifort. With pgf90, the dot product implemented with unrolling factor 4 was the fastest. The RT code SORD, together with the interface that runs all the mentioned tests, is publicly available from ftp://maiac.gsfc.nasa.gov/pub/skorkin/SORD_IP_16B (current release) or by email request from the corresponding (first) author.
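
    The unrolling technique benchmarked in the abstract can be sketched as follows (SORD itself is Fortran; this Python transliteration only illustrates the idea of keeping four independent partial sums per loop pass):

```python
def dot_unrolled4(a, b):
    # Dot product with the loop unrolled by a factor of 4: four independent
    # partial sums per pass, plus a remainder loop for the leftover elements.
    n = len(a)
    s0 = s1 = s2 = s3 = 0.0
    i = 0
    while i + 4 <= n:
        s0 += a[i] * b[i]
        s1 += a[i + 1] * b[i + 1]
        s2 += a[i + 2] * b[i + 2]
        s3 += a[i + 3] * b[i + 3]
        i += 4
    tail = sum(a[j] * b[j] for j in range(i, n))
    return s0 + s1 + s2 + s3 + tail

print(dot_unrolled4([1, 2, 3, 4, 5], [1, 1, 1, 1, 1]))  # -> 15.0
```

    In a compiled language the independent partial sums reduce loop-carried dependence and let the compiler schedule the multiplies in parallel, which is why the unrolled variant can beat a naive loop.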

  17. Performance and Complexity Evaluation of Iterative Receiver for Coded MIMO-OFDM Systems

    Directory of Open Access Journals (Sweden)

    Rida El Chall

    2016-01-01

    Multiple-input multiple-output (MIMO) technology in combination with channel coding is a promising solution for reliable high-data-rate transmission in future wireless communication systems. However, these technologies pose significant challenges for the design of an iterative receiver. In this paper, an efficient receiver combining soft-input soft-output (SISO) detection based on the low-complexity K-Best (LC-K-Best) decoder with various forward error correction codes, namely the LTE turbo decoder and an LDPC decoder, is investigated. We first investigate the convergence behavior of the iterative MIMO receivers to determine the required inner and outer iterations. The performance of the LC-K-Best-based receiver is then evaluated in various LTE channel environments and compared with other MIMO detection schemes. Moreover, the computational complexity of the iterative receiver with different channel coding techniques is evaluated and compared for different modulation orders and coding rates. Simulation results show that the LC-K-Best-based receiver achieves satisfactory performance-complexity trade-offs.

  18. An Examination of the Performance Based Building Code on the Design of a Commercial Building

    Directory of Open Access Journals (Sweden)

    John Greenwood

    2012-11-01

    The Building Code of Australia (BCA) is the principal code under which building approvals in Australia are assessed. The BCA adopted performance-based solutions for building approvals in 1996. Performance-based codes are based upon a set of explicit objectives, stated in terms of a hierarchy of requirements beginning with key general objectives. With this in mind, the research presented in this paper aims to analyse the impact of the introduction of the performance-based code within Western Australia, to gauge the effect and usefulness of alternative design solutions in commercial construction, using a case study project. The research revealed that there are several advantages to the use of alternative designs and that all parties, in general, are in favour of the performance-based building code of Australia. It is suggested that a change in the assessment process to streamline the alternative design path is needed for greater use of the performance-based alternative. With appropriate quality control measures, minor variations to the deemed-to-satisfy provisions could easily be managed by the current and future building surveying profession.

  19. Some concepts of model uncertainty for performance assessments of nuclear waste repositories

    International Nuclear Information System (INIS)

    Eisenberg, N.A.; Sagar, B.; Wittmeyer, G.W.

    1994-01-01

    Models of the performance of nuclear waste repositories will be central to making regulatory decisions regarding the safety of such facilities. The conceptual model of repository performance is represented by mathematical relationships, which are usually implemented as one or more computer codes. A geologic system may allow many conceptual models that are consistent with the observations. These conceptual models may or may not have the same mathematical representation. Experience in modeling the performance of a waste repository (which is, in part, a geologic system) shows that this non-uniqueness of conceptual models is a significant source of model uncertainty. At the same time, each conceptual model has its own set of parameters, and usually it is not possible to completely separate model uncertainty from parameter uncertainty for the repository system. Issues related to the origin of model uncertainty, its relation to parameter uncertainty, and its incorporation in safety assessments are discussed from a broad regulatory perspective. An extended example in which these issues are explored numerically is also provided.

  20. Modification in the FUDA computer code to predict fuel performance at high burnup

    Energy Technology Data Exchange (ETDEWEB)

    Das, M; Arunakumar, B V; Prasad, P N [Nuclear Power Corp., Mumbai (India)

    1997-08-01

    The computer code FUDA (FUel Design Analysis) participated in the blind exercises organized by the IAEA CRP (Co-ordinated Research Programme) on FUMEX (Fuel Modelling at Extended Burnup). While the code predictions compared well with the experiments at Halden under various parametric and operating conditions, the fission gas release and fission gas pressure were found to be slightly over-predicted, particularly at high burnups. In view of the results of the 6 FUMEX cases, the main models and submodels of the code were reviewed and necessary improvements were made. The new version of the code, FUDA MOD 2, is now able to predict fuel performance parameters for burnups up to 50000 MWd/TeU. The validation field of the code has been extended to prediction of thorium oxide fuel performance. An analysis of local deformations at pellet interfaces and near the end caps is carried out considering the hourglassing of the pellet by the finite element technique. (author). 15 refs, 1 fig.

  1. Modification in the FUDA computer code to predict fuel performance at high burnup

    International Nuclear Information System (INIS)

    Das, M.; Arunakumar, B.V.; Prasad, P.N.

    1997-01-01

    The computer code FUDA (FUel Design Analysis) participated in the blind exercises organized by the IAEA CRP (Co-ordinated Research Programme) on FUMEX (Fuel Modelling at Extended Burnup). While the code predictions compared well with the experiments at Halden under various parametric and operating conditions, the fission gas release and fission gas pressure were found to be slightly over-predicted, particularly at high burnups. In view of the results of the 6 FUMEX cases, the main models and submodels of the code were reviewed and necessary improvements were made. The new version of the code, FUDA MOD 2, is now able to predict fuel performance parameters for burnups up to 50000 MWd/TeU. The validation field of the code has been extended to prediction of thorium oxide fuel performance. An analysis of local deformations at pellet interfaces and near the end caps is carried out considering the hourglassing of the pellet by the finite element technique. (author). 15 refs, 1 fig

  2. New Technique for Improving Performance of LDPC Codes in the Presence of Trapping Sets

    Directory of Open Access Journals (Sweden)

    Mohamed Adnan Landolsi

    2008-06-01

    Trapping sets are considered the primary factor degrading the performance of low-density parity-check (LDPC) codes in the error-floor region. The effect of trapping sets on the performance of an LDPC code becomes worse as the code size decreases. One approach to tackle this problem is to minimize trapping sets during LDPC code design. However, while trapping sets can be reduced, their complete elimination is infeasible due to the presence of cycles in the underlying LDPC code bipartite graph. In this work, we introduce a new technique based on trapping-set neutralization to minimize the negative effect of trapping sets under belief propagation (BP) decoding. Simulation results for random, progressive edge growth (PEG) and MacKay LDPC codes demonstrate the effectiveness of the proposed technique. The hardware cost of the proposed technique is also shown to be minimal.
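
    For readers unfamiliar with the error-floor mechanism: BP decoding succeeds when every parity check is satisfied, and a trapping set is a small set of variable nodes on which decoding stalls with a few checks left unsatisfied. The toy syndrome computation below (a hypothetical H, not a code from the paper) shows the check relation involved:

```python
# Toy parity-check matrix H (hypothetical, not a code from the paper).
# A word c is a valid codeword iff H @ c = 0 (mod 2); BP decoding tries to
# drive the syndrome to zero, and a trapping set is a small set of variable
# nodes on which it stalls with some checks still unsatisfied.
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]

def syndrome(word):
    return [sum(h * x for h, x in zip(row, word)) % 2 for row in H]

codeword = [1, 1, 0, 0, 1, 1]
print(syndrome(codeword))  # -> [0, 0, 0]: every check satisfied

flipped = codeword[:]
flipped[1] ^= 1            # a single bit error
print(syndrome(flipped))   # -> [1, 1, 0]: the unsatisfied checks light up
```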

  3. An evaluation and analysis of three dynamic watershed acidification codes (MAGIC, ETD, and ILWAS)

    Energy Technology Data Exchange (ETDEWEB)

    Jenne, E.A.; Eary, L.E.; Vail, L.W.; Girvin, D.C.; Liebetrau, A.M.; Hibler, L.F.; Miley, T.B.; Monsour, M.J.

    1989-01-01

    The US Environmental Protection Agency is currently using the dynamic watershed acidification codes MAGIC, ILWAS, and ETD to assess the potential future impact of acidic deposition on surface water quality by simulating watershed acid neutralization processes. The reliability of forecasts made with these codes is of considerable concern. The present study evaluates the process formulations (i.e., conceptual and numerical representations of atmospheric, hydrologic, geochemical and biogeochemical processes), compares their approaches to calculating acid neutralizing capacity (ANC), and estimates the relative effects (sensitivity) of perturbations in the input data on selected output variables for each code. Input data were drawn from three Adirondack (upstate New York) watersheds: Panther Lake, Clear Pond, and Woods Lake. Code calibration was performed by the developers of the codes. Conclusions focus on summarizing the adequacy of the process formulations, differences in ANC simulation among the codes, and recommendations for further research to increase forecast reliability. 87 refs., 11 figs., 77 tabs.

  4. A Conceptual Framework of Corporate and Business Ethics across Organizations: Structures, Processes and Performance

    Science.gov (United States)

    Svensson, Goran; Wood, Greg

    2011-01-01

    Purpose: The objective of this paper is to introduce and describe a conceptual framework of corporate and business ethics across organizations in terms of ethical structures, ethical processes and ethical performance. Design/methodology/approach: A framework is outlined and positioned incorporating an ethical frame of reference in the field of…

  5. Source-term model for the SYVAC3-NSURE performance assessment code

    International Nuclear Information System (INIS)

    Rowat, J.H.; Rattan, D.S.; Dolinar, G.M.

    1996-11-01

    Radionuclide contaminants in wastes emplaced in disposal facilities will not remain in those facilities indefinitely. Engineered barriers will eventually degrade, allowing radioactivity to escape from the vault. The radionuclide release rate from a low-level radioactive waste (LLRW) disposal facility, the source term, is a key component in the performance assessment of the disposal system. This report describes the source-term model that has been implemented in Ver. 1.03 of the SYVAC3-NSURE (Systems Variability Analysis Code generation 3-Near Surface Repository) code. NSURE is a performance assessment code that evaluates the impact of near-surface disposal of LLRW through the groundwater pathway. The source-term model described here was developed for the Intrusion Resistant Underground Structure (IRUS) disposal facility, which is a vault that is to be located in the unsaturated overburden at AECL's Chalk River Laboratories. The processes included in the vault model are roof and waste package performance, and diffusion, advection and sorption of radionuclides in the vault backfill. The model presented here was developed for the IRUS vault; however, it is applicable to other near-surface disposal facilities. (author). 40 refs., 6 figs
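
    The advection-and-sorption part of such a vault model is often summarized by a linear retardation factor; a minimal sketch (illustrative parameter values, not IRUS site data) is:

```python
def retardation_factor(bulk_density, porosity, kd):
    """Linear-sorption retardation: R = 1 + (rho_b / theta) * Kd."""
    return 1.0 + (bulk_density / porosity) * kd

def travel_time(length, velocity, r_factor):
    """Time for a sorbing radionuclide front to advect a given distance."""
    return length * r_factor / velocity

# Illustrative numbers only (not IRUS site data): with Kd = 0.5 L/kg,
# bulk density 1.6 kg/L and porosity 0.4, the nuclide front moves three
# times slower than the groundwater itself.
R = retardation_factor(1.6, 0.4, 0.5)
print(R)                          # -> 3.0
print(travel_time(10.0, 2.0, R))  # 10 m at 2 m/yr -> 15.0 years
```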

  6. Validating the BISON fuel performance code to integral LWR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gamble, K.A., E-mail: Kyle.Gamble@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Pastore, G., E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gardner, R.J., E-mail: Russell.Gardner@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Liu, W., E-mail: Wenfeng.Liu@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States); Mai, A., E-mail: Anh.Mai@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States)

    2016-05-15

    Highlights: • The BISON multidimensional fuel performance code is being validated to integral LWR experiments. • Code and solution verification are necessary prerequisites to validation. • Fuel centerline temperature comparisons through all phases of fuel life are very reasonable. • Accuracy in predicting fission gas release is consistent with state-of-the-art modeling and the involved uncertainties. • Rod diameter comparisons are not satisfactory and further investigation is underway. - Abstract: BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Results demonstrate that (1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments, (2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties and (3) comparison
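
    The ±10% acceptance band quoted for the centerline-temperature comparisons reduces to a simple fractional-deviation test; a sketch (with made-up prediction/measurement pairs, not BISON validation data):

```python
def within_band(predicted, measured, tol=0.10):
    """True if the prediction deviates from the measurement by no more
    than a fractional tolerance (10% by default, as in the comparisons)."""
    return abs(predicted - measured) <= tol * abs(measured)

# Hypothetical centerline-temperature pairs in K (not BISON validation data).
pairs = [(1205.0, 1150.0), (980.0, 1010.0), (1430.0, 1390.0)]
print([within_band(p, m) for p, m in pairs])  # -> [True, True, True]
```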

  7. Performance analysis of LDPC codes on OOK terahertz wireless channels

    Science.gov (United States)

    Chun, Liu; Chang, Wang; Jun-Cheng, Cao

    2016-02-01

    Atmospheric absorption, scattering, and scintillation are the major causes of degraded transmission quality in terahertz (THz) wireless communications. An error control coding scheme based on low density parity check (LDPC) codes with a soft decision decoding algorithm is proposed to improve the bit-error-rate (BER) performance of an on-off keying (OOK) modulated THz signal through an atmospheric channel. The THz wave propagation characteristics and channel model in the atmosphere are set up. Numerical simulations validate the strong performance of LDPC codes against atmospheric fading and demonstrate the huge potential for future ultra-high-speed (beyond Gbps) THz communications. Project supported by the National Key Basic Research Program of China (Grant No. 2014CB339803), the National High Technology Research and Development Program of China (Grant No. 2011AA010205), the National Natural Science Foundation of China (Grant Nos. 61131006, 61321492, and 61204135), the Major National Development Project of Scientific Instrument and Equipment (Grant No. 2011YQ150021), the National Science and Technology Major Project (Grant No. 2011ZX02707), the International Collaboration and Innovation Program on High Mobility Materials Engineering of the Chinese Academy of Sciences, and the Shanghai Municipal Commission of Science and Technology (Grant No. 14530711300).

  8. Development of a 1D thermal-hydraulic analysis code for once-through steam generator in SMRs using straight tubes

    Energy Technology Data Exchange (ETDEWEB)

    Park, Youngjae; Kim, Iljin; Kim, Hyungdae [Kyung Hee University, Yongin (Korea, Republic of)

    2015-10-15

    Diverse integral/small modular reactors (SMRs) have been developed. A once-through steam generator (OTSG), which generates superheated steam without a steam separator and dryer, is used in SMRs to reduce the volume of the steam generator. It would be possible to design a new steam generator with best-estimate thermal-hydraulic codes such as RELAP and MARS. However, it is not convenient to use a general-purpose thermal-hydraulic analysis code to design a specific component of a nuclear power plant. A widely used simulation tool for thermal-hydraulic analysis of drum-type steam generators is ATHOS, which allows 3D analysis. On the other hand, a simple 1D thermal-hydraulic analysis code might be accurate enough for the conceptual design of an OTSG. In this study, a thermal-hydraulic analysis code for the conceptual design of an OTSG was developed using a 1D homogeneous equilibrium model (HEM). A set of benchmark calculations was conducted to verify and validate the prediction accuracy of the developed code by comparing its results with those of a best-estimate thermal-hydraulic analysis code, MARS. Finally, analysis of two different OTSG design concepts, with superheating and recirculation, was demonstrated using the developed code.
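
    A 1D marching scheme of the kind described can be sketched for the single-phase part of the secondary side (no boiling or HEM closure here, and every parameter value is invented for illustration, not taken from the developed code):

```python
# Minimal 1D marching energy balance for the single-phase part of the
# secondary side of a straight-tube OTSG. No boiling or HEM closure here,
# and every parameter value is invented for illustration.
N = 100                  # axial cells
L = 5.0                  # heated tube length [m]
T_primary = 310.0        # primary-side temperature, taken uniform [C]
T = 230.0                # secondary-side inlet temperature [C]
m_dot, cp = 0.5, 4800.0  # secondary flow [kg/s], specific heat [J/kg-K]
UP = 900.0               # overall HTC times heated perimeter [W/m-K]

dz = L / N
for _ in range(N):
    # Cell energy balance: m_dot * cp * dT = UP * (T_primary - T) * dz
    T += UP * (T_primary - T) * dz / (m_dot * cp)

print(round(T, 1))  # secondary outlet temperature, approaching T_primary
```

    A real OTSG model would switch closure relations cell by cell as the fluid passes from subcooled liquid through boiling to superheated steam, which is exactly what the HEM formulation in the abstract provides.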

  9. ATES/heat pump simulations performed with ATESSS code

    Science.gov (United States)

    Vail, L. W.

    1989-01-01

    Modifications to the Aquifer Thermal Energy Storage System Simulator (ATESSS) allow simulation of aquifer thermal energy storage (ATES)/heat pump systems. The heat pump algorithm requires a coefficient of performance (COP) relationship of the form COP = COP_base + alpha (T_ref - T_base). Initial applications of the modified ATESSS code to synthetic building load data for two sizes of buildings in two U.S. cities showed an insignificant performance advantage of a series ATES heat pump system over a conventional groundwater heat pump system. The addition of algorithms for a cooling tower and solar array improved performance slightly. Small values of alpha in the COP relationship are the principal reason for the limited improvement in system performance. Future studies at Pacific Northwest Laboratory (PNL) are planned to investigate methods to increase system performance using alternative system configurations and operations scenarios.
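
    The COP relationship required by the heat pump algorithm is a one-line linear model; the sketch below uses placeholder coefficients (not PNL's calibrated values) and shows why a small alpha limits the benefit of warmer source water:

```python
def heat_pump_cop(t_ref, cop_base=3.0, t_base=10.0, alpha=0.05):
    """COP relationship required by the heat pump algorithm:
    COP = COP_base + alpha * (T_ref - T_base). The coefficients here are
    placeholders, not calibrated values from the study."""
    return cop_base + alpha * (t_ref - t_base)

# A small alpha gives a flat COP curve, which is why warmer ATES source
# water bought little advantage over a conventional groundwater loop.
for t in (10.0, 20.0, 30.0):
    print(t, heat_pump_cop(t))
```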

  10. HOW TO IMPROVE EMPLOYEE MOTIVATION AND GROUP PERFORMANCE THROUGH LEADERSHIP – CONCEPTUAL MODEL

    Directory of Open Access Journals (Sweden)

    Tebeian Adriana

    2012-07-01

    Full Text Available We live in times of intense change, due to fast technological development, when information and knowledge are certain factors of power. Organizations today must not underestimate the importance of capable employees, the owners of these great factors, information and knowledge. The key to success, as many organizations have shown, is, of course, a mix of attributes, but one of much importance lies in the quality of employees and their performance. How to motivate and keep such employees is a real challenge in today's business world. Employee fluctuation is a big risk, because it brings many losses: loss of expertise, of know-how, of time; it breaks relationships and teams. One solution to this problem is creating a good and attractive environment, so that employees are motivated enough to bring out the best in themselves and raise their performance to a high level. The main objective of this study is to present a new approach to motivating employees and raising their performance. In this approach, the leader is the main pawn, the generator and sustainer of a healthy environment. By reviewing the literature and the most recent works in this domain, two styles of leadership come into focus: transformational and servant leadership. Both are relatively new concepts, with common but also different attributes, and both have the same final goal, performance, though through different approaches. Based on a vast literature research, the conceptual model was built, considering the main attributes of the two leadership styles, their impact on motivational factors, and their influence on job satisfaction and ultimately on team performance. The intermediary results of the research sustain the conceptual framework and main hypothesis of the study. 
One important contribution of this research is introducing elements of servant leadership together with transformational leadership in the empirical research and also creating an overall conceptual framework which can be applied

  11. The LAC Test: A New Look at Auditory Conceptualization and Literacy Development K-12.

    Science.gov (United States)

    Lindamood, Charles; And Others

    The Lindamood Auditory Conceptualization (LAC) Test was constructed with the recognition that the process of decoding involves an integration of the auditory, visual, and motor senses. Requiring the manipulation of colored blocks to indicate conceptualization of test patterns spoken by the examiner, subtest 1 entails coding of identity, number,…

  12. Performance and Feasibility Analysis of a Wind Turbine Power System for Use on Mars

    Science.gov (United States)

    Lichter, Matthew D.; Viterna, Larry

    1999-01-01

    A wind turbine power system for future missions to the Martian surface was studied for performance and feasibility. A C++ program was developed from existing FORTRAN code to analyze the power capabilities of wind turbines under different environments and design philosophies. Power output, efficiency, torque, thrust, and other performance criteria could be computed given design geometries, atmospheric conditions, and airfoil behavior. After reviewing performance of such a wind turbine, a conceptual system design was modeled to evaluate feasibility. More analysis code was developed to study and optimize the overall structural design. Findings of this preliminary study show that turbine power output on Mars could be as high as several hundred kilowatts. The optimized conceptual design examined here would have a power output of 104 kW, total mass of 1910 kg, and specific power of 54.6 W/kg.
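
    The power capability quoted above can be sanity-checked with the standard rotor power relation P = 0.5 * rho * A * v^3 * Cp. The density, rotor size, wind speed, and power coefficient below are rough assumptions for the thin Martian atmosphere, not the study's design values; only the reported 104 kW / 1910 kg specific power is taken from the abstract.

```python
# Back-of-envelope wind turbine power in the thin Martian atmosphere:
# P = 0.5 * rho * A * v^3 * Cp. All inputs are assumptions for illustration.
import math

rho = 0.015      # Martian surface air density [kg/m^3] (assumed)
Cp = 0.4         # power coefficient (assumed; below the Betz limit 0.593)
D = 60.0         # rotor diameter [m] (assumed)
v = 30.0         # wind speed [m/s] (assumed)

A = math.pi * (D / 2.0) ** 2                 # swept area [m^2]
P = 0.5 * rho * A * v ** 3 * Cp              # rotor power [W]

# Specific power from the abstract's reported design point:
specific_power = 104e3 / 1910.0              # [W/kg], roughly the quoted 54.6

print(round(P / 1e3, 1), round(specific_power, 1))
```

    Even with Mars's low density, the cube dependence on wind speed makes powers of a few hundred kilowatts plausible for a large rotor, consistent with the abstract's "several hundred kilowatts" statement.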

  13. Two-dimensional nucleonics calculations for a ''FIRST STEP'' conceptual ICF reactor

    International Nuclear Information System (INIS)

    Davidson, J.W.; Battat, M.E.; Saylor, W.W.; Pendergrass, J.H.; Dudziak, D.J.

    1985-01-01

    A detailed two-dimensional nucleonic analysis has been performed for the FIRST STEP conceptual ICF reactor blanket design. The reactor concept incorporated in this design is a modified wetted-wall cavity with target illumination geometry left as a design variable. The 2-m radius spherical cavity is surrounded by a blanket containing lithium and 238U as fertile species and also as energy multipliers. The blanket is configured as 0.6-m-thick cylindrical annuli containing modified LMFBR-type fuel elements with 0.5-m-thick fuel-bearing axial end plugs. Liquid lithium surrounds the inner blanket regions and serves as the coolant for both the blanket and the first wall. The two-dimensional analysis of the blanket performance was made using the 2-D discrete-ordinates code TRISM, and benchmarked with the 3-D Monte Carlo code MCNP. Integral responses including the tritium breeding ratio (TBR), plutonium breeding ratio (PUBR), and blanket energy multiplication were calculated for axial and radial blanket regions. Spatial distributions were calculated for steady-state rates of fission, neutron heating, prompt gamma-ray heating, and fuel breeding

  14. On the performance of diagonal lattice space-time codes for the quasi-static MIMO channel

    KAUST Repository

    Abediseid, Walid

    2013-06-01

    There has been tremendous work done on designing space-time codes for the quasi-static multiple-input multiple-output (MIMO) channel. All code designs to date focus on high performance, high rate, low-complexity encoding and decoding, or a combination of these criteria. In this paper, we analyze in detail the performance of diagonal lattice space-time codes under lattice decoding. We present both upper and lower bounds on the average error probability. We derive a new closed-form expression for the lower bound using the so-called sphere-packing bound. This bound represents the ultimate performance limit a diagonal lattice space-time code can achieve at any signal-to-noise ratio (SNR). The upper bound is derived using the union bound and demonstrates how the average error probability can be minimized by maximizing the minimum product distance of the code. © 2013 IEEE.
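
    The union-bound design criterion mentioned above reduces, at high SNR, to maximizing the minimum product distance over all pairs of codewords. The tiny codebook below is a made-up example to show the computation, not a code from the paper.

```python
# Sketch of the minimum-product-distance criterion for diagonal space-time
# codes: min over codeword pairs of prod_i |x_i - y_i|. Full diversity
# requires every coordinate of every pairwise difference to be nonzero.
from itertools import combinations

def min_product_distance(codebook):
    best = float("inf")
    for x, y in combinations(codebook, 2):
        prod = 1.0
        for xi, yi in zip(x, y):
            prod *= abs(xi - yi)
        best = min(best, prod)
    return best

# A made-up 4-codeword, 2-coordinate codebook for illustration:
codebook = [(0.0, 0.0), (1.0, 2.0), (2.0, 1.0), (3.0, 3.0)]
print(min_product_distance(codebook))   # 1.0, set by the pair (1,2)-(2,1)
```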

  15. Generating performance portable geoscientific simulation code with Firedrake (Invited)

    Science.gov (United States)

    Ham, D. A.; Bercea, G.; Cotter, C. J.; Kelly, P. H.; Loriant, N.; Luporini, F.; McRae, A. T.; Mitchell, L.; Rathgeber, F.

    2013-12-01

    This presentation will demonstrate how a change in simulation programming paradigm can be exploited to deliver sophisticated simulation capability which is far easier to programme than are conventional models, is capable of exploiting different emerging parallel hardware, and is tailored to the specific needs of geoscientific simulation. Geoscientific simulation represents a grand challenge computational task: many of the largest computers in the world are tasked with this field, and the requirements of resolution and complexity of scientists in this field are far from being sated. However, single thread performance has stalled, even sometimes decreased, over the last decade, and has been replaced by ever more parallel systems: both as conventional multicore CPUs and in the emerging world of accelerators. At the same time, the needs of scientists to couple ever-more complex dynamics and parametrisations into their models makes the model development task vastly more complex. The conventional approach of writing code in low level languages such as Fortran or C/C++ and then hand-coding parallelism for different platforms by adding library calls and directives forces the intermingling of the numerical code with its implementation. This results in an almost impossible set of skill requirements for developers, who must simultaneously be domain science experts, numericists, software engineers and parallelisation specialists. Even more critically, it requires code to be essentially rewritten for each emerging hardware platform. Since new platforms are emerging constantly, and since code owners do not usually control the procurement of the supercomputers on which they must run, this represents an unsustainable development load. The Firedrake system, conversely, offers the developer the opportunity to write PDE discretisations in the high-level mathematical language UFL from the FEniCS project (http://fenicsproject.org). Non-PDE model components, such as parametrisations

  16. SNR and BER Models and the Simulation for BER Performance of Selected Spectral Amplitude Codes for OCDMA

    Directory of Open Access Journals (Sweden)

    Abdul Latif Memon

    2014-01-01

    Full Text Available Many encoding schemes are used in OCDMA (Optical Code Division Multiple Access) networks, but SAC (Spectral Amplitude Codes) are widely used, being considered an effective arrangement for eliminating the dominant noise called MAI (Multiple Access Interference). Various codes are evaluated with respect to their performance against three noises, namely shot noise, thermal noise and PIIN (Phase Induced Intensity Noise). Mathematical models for SNR (Signal to Noise Ratio) and BER (Bit Error Rate) are discussed, where the SNRs are calculated and the BERs are computed under a Gaussian distribution assumption. After analyzing the results mathematically, it is concluded that ZCC (Zero Cross Correlation Code) performs better than the other selected SAC codes and can serve a larger number of active users than the other codes do. At various receiver power levels, the analysis points out that RDC (Random Diagonal Code) also performs better than the other codes; for the power interval between -10 and -20 dBm, the performance of RDC is better than that of ZCC. Their low BER values suggest that these codes should be part of an efficient and cost-effective OCDMA network in the future.
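
    The Gaussian-approximation step commonly used in such SAC-OCDMA analyses maps the computed SNR to a bit error rate as BER = 0.5 * erfc(sqrt(SNR / 8)). The sketch below shows that mapping for a few placeholder SNR values; the SNR numbers are not results from the paper.

```python
# Gaussian-approximation BER from SNR, as commonly used in SAC-OCDMA
# analyses: BER = 0.5 * erfc(sqrt(SNR / 8)). SNR values are placeholders.
import math

def ber_from_snr(snr):
    return 0.5 * math.erfc(math.sqrt(snr / 8.0))

for snr_db in (10.0, 20.0, 30.0):
    snr = 10.0 ** (snr_db / 10.0)     # convert dB to linear
    print(snr_db, ber_from_snr(snr))
```

    As expected, BER falls steeply with SNR, which is why code families that suppress PIIN and MAI (and hence raise SNR) can support more simultaneous users at a target BER.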

  17. Development of a general coupling interface for the fuel performance code TRANSURANUS – Tested with the reactor dynamics code DYN3D

    International Nuclear Information System (INIS)

    Holt, L.; Rohde, U.; Seidl, M.; Schubert, A.; Van Uffelen, P.; Macián-Juan, R.

    2015-01-01

    Highlights: • A general coupling interface was developed for couplings of the TRANSURANUS code. • With this new tool, simplified fuel behavior models in codes can be replaced. • Applicable e.g. for several reactor types and from normal operation up to DBA. • The general coupling interface was applied to the reactor dynamics code DYN3D. • The new coupled code system DYN3D–TRANSURANUS was successfully tested for RIA. - Abstract: A general interface is presented for coupling the TRANSURANUS fuel performance code with thermal hydraulics system, sub-channel thermal hydraulics, computational fluid dynamics (CFD) or reactor dynamics codes. As a first application, the reactor dynamics code DYN3D was coupled at assembly level in order to describe the fuel behavior in more detail. In the coupling, DYN3D provides process time, time-dependent rod power and thermal hydraulics conditions to TRANSURANUS, which in the case of the two-way coupling approach transfers parameters like fuel temperature and cladding temperature back to DYN3D. Results of the coupled code system are presented for a reactivity transient scenario initiated by control rod ejection. More precisely, the two-way coupling approach systematically calculates higher maximum values for the nodal fuel enthalpy. These differences can be explained by the greater detail in fuel behavior modeling. The numerical performance of DYN3D–TRANSURANUS proved to be fast and stable. The coupled code system can therefore improve the assessment of safety criteria, at a reasonable computational cost
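
    The data flow of such a two-way coupling can be sketched as a fixed-point iteration per time step: the dynamics side hands rod power and coolant conditions to the fuel side, which returns a fuel temperature used as feedback. The toy models below are deliberately fake stand-ins (not DYN3D or TRANSURANUS physics); only the exchange pattern is the point.

```python
# Toy sketch of a two-way coupling loop: per time step, a dynamics side
# (standing in for DYN3D) exchanges data with a fuel-behavior side
# (standing in for TRANSURANUS) until the step converges. Fake physics.

def fuel_side(power, t_coolant):
    # stand-in fuel model: fuel temperature rise proportional to rod power
    return t_coolant + 0.02 * power

def dynamics_side(t_fuel, power):
    # stand-in Doppler-like feedback: power drops as the fuel heats up
    return power * (1.0 - 1.0e-4 * (t_fuel - 600.0))

power, t_coolant = 20000.0, 560.0          # [W per rod], [K] (assumed)
history = []
for step in range(5):                      # time loop
    for _ in range(10):                    # fixed-point iteration per step
        t_fuel = fuel_side(power, t_coolant)
        power = dynamics_side(t_fuel, power)
    history.append((round(power, 1), round(t_fuel, 1)))

print(history[-1])
```

    In a one-way coupling the feedback call would be skipped, which is one way to see why the two-way scheme can yield systematically different (here lower-power, in the real case higher-enthalpy) results.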

  18. PORST: a computer code to analyze the performance of retrofitted steam turbines

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C.; Hwang, I.T.

    1980-09-01

    The computer code PORST was developed to analyze the performance of a retrofitted steam turbine that is converted from a single generating to a cogenerating unit for purposes of district heating. Two retrofit schemes are considered: one converts a condensing turbine to a backpressure unit; the other allows the crossover extraction of steam between turbine cylinders. The code can analyze the performance of a turbine operating at: (1) valve-wide-open condition before retrofit, (2) partial load before retrofit, (3) valve-wide-open after retrofit, and (4) partial load after retrofit.

  19. DETRA: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Suolanen, V.

    1996-01-01

    The computer code DETRA is a generic tool for environmental transfer analyses of radioactive or stable substances. The code has been applied for various purposes, mainly problems related to the biospheric transfer of radionuclides both in safety analyses of disposal of nuclear wastes and in consideration of foodchain exposure pathways in the analyses of off-site consequences of reactor accidents. For each specific application an individually tailored conceptual model can be developed. The biospheric transfer analyses performed by the code are typically carried out for terrestrial, aquatic and food chain applications. 21 refs, 35 figs, 15 tabs
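
    A generic biospheric transfer model of the kind DETRA implements can be sketched as first-order transfers between compartments plus radioactive decay. The two-compartment system and all rate constants below are invented for illustration; a real application would use an individually tailored compartment structure, as the abstract notes.

```python
# Generic compartment-model sketch of biospheric radionuclide transfer:
# first-order transfers plus decay, integrated with forward Euler.
# All rates and inventories are invented; not DETRA's actual models.
import math

half_life = 30.0                       # [y] (assumed, Cs-137-like)
lam = math.log(2.0) / half_life        # decay constant [1/y]
k_soil_to_plant = 0.01                 # transfer rates [1/y] (assumed)
k_plant_to_soil = 0.5

soil, plant = 1000.0, 0.0              # initial inventories [Bq]
dt, t_end = 0.01, 10.0                 # time step and horizon [y]
for _ in range(int(t_end / dt)):
    d_sp = k_soil_to_plant * soil      # soil -> plant flux
    d_ps = k_plant_to_soil * plant     # plant -> soil flux
    soil += dt * (d_ps - d_sp - lam * soil)
    plant += dt * (d_sp - d_ps - lam * plant)

print(round(soil, 1), round(plant, 2))
```

    Because the transfer terms cancel in the total, the combined inventory decays with the radionuclide half-life while the plant compartment settles to a quasi-steady fraction of the soil inventory.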

  20. On the Performance of a Multi-Edge Type LDPC Code for Coded Modulation

    NARCIS (Netherlands)

    Cronie, H.S.

    2005-01-01

    We present a method to combine error-correction coding and spectral-efficient modulation for transmission over the Additive White Gaussian Noise (AWGN) channel. The code employs signal shaping which can provide a so-called shaping gain. The code belongs to the family of sparse graph codes for which

  1. ITER conceptual design

    International Nuclear Information System (INIS)

    Tomabechi, K.; Gilleland, J.R.; Sokolov, Yu.A.; Toschi, R.

    1991-01-01

    The Conceptual Design Activities of the International Thermonuclear Experimental Reactor (ITER) were carried out jointly by the European Community, Japan, the Soviet Union and the United States of America, under the auspices of the International Atomic Energy Agency. The European Community provided the site for joint work sessions at the Max-Planck-Institut fuer Plasmaphysik in Garching, Germany. The Conceptual Design Activities began in the spring of 1988 and ended in December 1990. The objectives of the activities were to develop the design of ITER, to perform a safety and environmental analysis, to define the site requirements as well as the future research and development needs, to estimate the cost and manpower, and to prepare a schedule for detailed engineering design, construction and operation. On the basis of the investigation and analysis performed, a concept of ITER was developed which incorporated maximum flexibility of the performance of the device and allowed a variety of operating scenarios to be adopted. The heart of the machine is a tokamak having a plasma major radius of 6 m, a plasma minor radius of 2.15 m, a nominal plasma current of 22 MA and a nominal fusion power of 1 GW. The conceptual design can meet the technical objectives of the ITER programme. Because of the success of the Conceptual Design Activities, the Parties are now considering the implementation of the next phase, called the Engineering Design Activities. (author). Refs, figs and tabs

  2. Dexter - A one-dimensional code for calculating thermionic performance of long converters.

    Science.gov (United States)

    Sawyer, C. D.

    1971-01-01

    This paper describes a versatile code for computing the coupled thermionic electric-thermal performance of long thermionic converters in which the temperature and voltage variations cannot be neglected. The code is capable of accounting for a variety of external electrical connection schemes, coolant flow paths and converter failures by partial shorting. Example problem solutions are given.

  3. Reliability issues and solutions for coding social communication performance in classroom settings.

    Science.gov (United States)

    Olswang, Lesley B; Svensson, Liselotte; Coggins, Truman E; Beilinson, Jill S; Donaldson, Amy L

    2006-10-01

    To explore the utility of time-interval analysis for documenting the reliability of coding social communication performance of children in classroom settings. Of particular interest was finding a method for determining whether independent observers could reliably judge both occurrence and duration of ongoing behavioral dimensions for describing social communication performance. Four coders participated in this study. They observed and independently coded 6 social communication behavioral dimensions using handheld computers. The dimensions were mutually exclusive and accounted for all verbal and nonverbal productions during a specified time frame. The technology allowed for coding frequency and duration for each entered code. Data were collected from 20 different 2-min video segments of children in kindergarten through 3rd-grade classrooms. Data were analyzed for interobserver and intraobserver agreement using time-interval sorting and Cohen's kappa. Further, interval size and total observation length were manipulated to determine their influence on reliability. The data revealed interval sorting and kappa to be a suitable method for examining reliability of occurrence and duration of ongoing social communication behavioral dimensions. Nearly all comparisons yielded medium to large kappa values; interval size and length of observation minimally affected results. Implications: The analysis procedure described in this research solves a challenge in reliability: comparing coding by independent observers of both occurrence and duration of behaviors. Results indicate the utility of a new coding taxonomy and technology for application in online observations of social communication in a classroom setting.
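
    The time-interval approach described above can be sketched as follows: each coder's stream of (code, duration) events is resampled onto a fixed interval grid, and agreement on the resulting interval labels is scored with Cohen's kappa. The codes, durations, and midpoint-sampling rule below are invented for illustration, not the study's exact procedure.

```python
# Hedged sketch of time-interval sorting plus Cohen's kappa for two coders'
# event streams. Codes and durations are invented for illustration.
from collections import Counter

def to_intervals(events, interval=1.0, total=10.0):
    """Label each fixed interval with the code occupying its midpoint."""
    timeline, t = [], 0.0
    for code, dur in events:
        timeline.append((t, t + dur, code))
        t += dur
    labels = []
    for i in range(int(total / interval)):
        mid = (i + 0.5) * interval
        labels.append(next(c for a, b, c in timeline if a <= mid < b))
    return labels

def cohens_kappa(a, b):
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb.get(k, 0) for k in ca) / n**2   # chance agreement
    return (po - pe) / (1 - pe)

# Two coders disagree only on where the first segment ends:
coder1 = [("verbal", 4.0), ("nonverbal", 3.0), ("verbal", 3.0)]
coder2 = [("verbal", 3.5), ("nonverbal", 3.5), ("verbal", 3.0)]
k = cohens_kappa(to_intervals(coder1), to_intervals(coder2))
print(round(k, 2))
```

    Note how interval sorting turns a duration disagreement (0.5 s on a segment boundary) into at most one mismatched interval label, which is what makes occurrence and duration jointly scorable with kappa.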

  4. Setting live coding performance in wider historical contexts

    OpenAIRE

    Norman, Sally Jane

    2016-01-01

    This paper sets live coding in the wider context of performing arts, construed as the poetic modelling and projection of liveness. Concepts of liveness are multiple, evolving, and scale-dependent: entities considered live from different cultural perspectives range from individual organisms and social groupings to entire ecosystems, and consequently reflect diverse temporal and spatial orders. Concepts of liveness moreover evolve with our tools, which generate and reveal new senses and places ...

  5. PROVOCATIVE PERFORMANCE, PROVOCATIVE THEATER: AESTHETIC INTERACTION

    Directory of Open Access Journals (Sweden)

    ROŞCA ANGELINA

    2017-12-01

    Full Text Available The author directs her attention to a yet-understudied area of theatre science, namely the aesthetic interaction between the provocative performance and the provocative theatre. The article sheds light on the conceptual coordinates of the performance placed among such artistic trends as Actionism, Conceptualism, Fluxus and Happening. Elements of performing arts infiltrated the theatre long ago. And it is not just a co-occurrence within an artistic act, but an interference indeed. The provocative performance dispossesses the alternative arts of their distinct codes. First in the USA, then in Europe and eventually in the former Socialist Camp and the territory of the ex-USSR, the performance conquered the theatrical territory. This process has been accelerating since the dramatic theatre began to yield its positions to the post-dramatic one. Some of the overlapping points which unite the performance with the spectacle in contemporary art are traced in: the strategy of idiocy in provocative practices; the ready-made; the corporeality; the pictorial.

  6. The conceptual design of the ITER CODAC system

    International Nuclear Information System (INIS)

    Farthing, J.; Greenwald, M.; Jo Lister; Izuru Yonekawa

    2006-01-01

    The COntrol, Data Access and Communication (CODAC) system for ITER is presently under conceptual design, revising the previous design dating from 1998. The design concentrates on the major perceived challenges: the 35-year life of the project for maintenance and evolution; harmonizing strict access security with world-wide participation in the exploitation of ITER; the complexity of CODAC, which has to control a large number of disparate procured systems, 24 hours a day, 365 days a year; and the particular ''in-kind'' procurement of all Plant Systems. The design has so far concentrated on appropriate methods for combating these challenges. Concepts include: strict application and enforcement of standards for interfacing procured systems at a high ''black-box'' level; reliance on standard high-performance networks; reliance on the self-description of the procured systems; and maximizing the use of data-driven applications rather than device-specific coding. The interfacing and procurement specifications will be presented, especially the self-description of ''black-box'' systems, and the boundaries of CODAC will be defined. The breakdown of CODAC into a number of manageable systems and their interfaces will be outlined. The data volumes and data rates will be estimated, suggesting an appropriate conceptual design of the various parts of the CODAC network. There are no required CODAC features which could not be provided with today's tools. However, one element of this conceptual design is to identify areas where ideal solutions are not clearly available, for which appropriate R&D will be proposed. (author)

  7. Development of hydraulic analysis code for optimizing thermo-chemical is process reactors

    International Nuclear Information System (INIS)

    Terada, Atsuhiko; Hino, Ryutaro; Hirayama, Toshio; Nakajima, Norihiro; Sugiyama, Hitoshi

    2007-01-01

    The Japan Atomic Energy Agency has been conducting studies on the thermochemical IS process for water-splitting hydrogen production. Based on the test results and know-how obtained through the bench-scale test, a pilot test plant with a hydrogen production capacity of 30 Nm3/h is being designed conceptually as the next step of the IS process development. In the design of the IS pilot plant, it is important to make the chemical reactors compact with high performance, from the viewpoint of plant cost reduction. A new hydraulic analysis code has been developed for optimizing the mixing performance of multi-phase flow involving chemical reactions, especially in the Bunsen reactor, where complex flow patterns with gas-liquid chemical interaction involving flow instability are expected. Preliminary analytical results obtained with the above-mentioned code, especially flow patterns induced by swirling flow, agreed well with those measured in water experiments, which showed a vortex-breakdown pattern in a simplified Bunsen reactor. (author)

  8. Automatic code generation in practice

    DEFF Research Database (Denmark)

    Adam, Marian Sorin; Kuhrmann, Marco; Schultz, Ulrik Pagh

    2016-01-01

    Mobile robots often use a distributed architecture in which software components are deployed to heterogeneous hardware modules. Ensuring the consistency with the designed architecture is a complex task, notably if functional safety requirements have to be fulfilled. We propose to use a domain-specific language to specify those requirements and to allow for generating a safety-enforcing layer of code, which is deployed to the robot. The paper at hand reports experiences in practically applying code generation to mobile robots. For two cases, we discuss how we addressed challenges, e.g., regarding weaving code generation into proprietary development environments and testing of manually written code. We find that a DSL based on the same conceptual model can be used across different kinds of hardware modules, but a significant adaptation effort is required in practical scenarios involving different kinds

  9. Conceptual design of the virtual engineering system for high level radioactive waste geological disposal

    International Nuclear Information System (INIS)

    2000-02-01

    The role of the Virtual Engineering System for High Level Radioactive Waste Geological Disposal (hereafter VES) is to accumulate and unify, on a computer system, the results of research and development which JNC carried out for the completion of the second progress report. The purpose and functions of VES were studied with consideration of the long-term plan for geological disposal in Japan. In conventional studies, the analyses of geological environment assessment, safety performance assessment, and engineering technology had not been mutually integrated. The iterative analysis performed by VES makes it possible to analyze the natural and engineered barriers more quantitatively, for obtaining safety margins and rationalizing the design of a waste repository. We examined the system functions needed to achieve this purpose of VES. Next, conceptual designs of the codes, databases, and utilities that constitute VES were performed by examining their purposes and functions: the geological environment assessment system, safety performance assessment system, waste repository element database, economic assessment system, investigation support system, quality assurance system, and visualization system. The whole system configuration, the examination of suitable hardware and software configurations, the examination of system implementation, the confirmation of parallel calculation technology, the conceptual design of the platform, and the development of a demonstration program of the platform were also performed. Based upon the studies stated above, a VES development plan, including prototype development during the period of selection of the candidate site, was prepared, and the concept of VES was built on this basis. (author)

  10. Conceptual design Fusion Experimental Reactor (FER/ITER)

    International Nuclear Information System (INIS)

    Uehara, Kazuya; Nagashima, Takashi; Ikeda, Yoshitaka

    1991-11-01

    This report describes a conceptual design of the Lower Hybrid Wave (LH) system for FER and ITER. At JAERI, the conceptual design of the LH system for FER has been performed over the past 3 years in parallel with that for ITER, and there must be a common design part between ITER and FER. The physical requirements of the LH system are the saving of volt·sec in the current start-up phase and current drive in the boundary region. A frequency of 5 GHz is chosen mainly to avoid alpha-particle absorption and for the availability of electron tube development. Seventy-two klystrons (FER) and one hundred klystrons (ITER) are necessary to inject 30 MW (FER) and 45-50 MW (ITER) of rf power into the plasma, using 0.7-0.8 MW per klystron. The launching system is of the multi-junction type, and the rf spectrum must be as sharp as possible, with high directivity, to improve the current drive efficiency. One port (FER) and two ports (ITER) are used, and the injection direction is horizontal, a choice based on ray-tracing analysis and on better coupling of the LH wave. The transmission line is an oversized waveguide with low rf loss. (author)

  11. KUGEL: a thermal, hydraulic, fuel performance, and gaseous fission product release code for pebble bed reactor core analysis

    International Nuclear Information System (INIS)

    Shamasundar, B.I.; Fehrenbach, M.E.

    1981-05-01

    The KUGEL computer code is designed to perform thermal/hydraulic analysis and coated-fuel particle performance calculations for axisymmetric pebble bed reactor (PBR) cores. This computer code was developed as part of a Department of Energy (DOE)-funded study designed to verify the published core performance data on PBRs. The KUGEL code is designed to interface directly with the 2DB code, a two-dimensional neutron diffusion code, to obtain distributions of thermal power, fission rate, fuel burnup, and fast neutron fluence, which are needed for thermal/hydraulic and fuel performance calculations. The code is variably dimensioned so that problem size can be easily varied. An interpolation routine allows variable mesh size to be used between the 2DB output and the two-dimensional thermal/hydraulic calculations

  12. Postclosure performance assessment of the SCP [Site Characterization Plan] conceptual design for horizontal emplacement: Revision 1

    International Nuclear Information System (INIS)

    1987-08-01

    This report is a preliminary postclosure performance assessment of the repository design specified in the Site Characterization Plan Conceptual Design Report (SCP-CDR) for horizontal emplacement of high-level nuclear waste. At the time that these analyses were done, horizontal emplacement was the preferred orientation for the waste packages but vertical emplacement is now the reference design. This assessment consists of (1) a review of the regulatory requirements and strategy to demonstrate compliance with these requirements, (2) an analysis of the performance of the total repository system, (3) an analysis of the thermomechanical behavior of the repository, (4) an analysis of brine mobility in the repository, (5) an analysis of the waste package performance, (6) an analysis of the performance of seals, and (7) comments on the sensitivity of the various performance measures to uncertainties in the data and models. These are preliminary analyses and, in most cases, involve bounding calculations of the repository behavior. They have several purposes including (1) assessing how well this conceptual design ''measures up'' against requirements, (2) gaining experience in implementing the performance assessment strategy and tools and thereby learning where improvements are needed, (3) helping to identify needed data, and (4) helping to indicate required design modifications. 26 refs., 40 figs., 20 tabs

  13. Probabilistic evaluation of fuel element performance by the combined use of a fast running simplistic and a detailed deterministic fuel performance code

    International Nuclear Information System (INIS)

    Misfeldt, I.

    1980-01-01

    A comprehensive evaluation of fuel element performance requires a probabilistic fuel code supported by a well-benchmarked deterministic code. This paper presents an analysis of a SGHWR ramp experiment, where the probabilistic fuel code FRP is utilized in combination with the deterministic fuel models FFRS and SLEUTH/SEER. The statistical methods employed in FRP are Monte Carlo simulation or a low-order Taylor approximation. The fast-running simplistic fuel code FFRS is used for the deterministic simulations, whereas simulations with SLEUTH/SEER are used to verify the predictions of FFRS. The ramp test was performed with a SGHWR fuel element, where 9 of the 36 fuel pins failed. There seemed to be good agreement between the deterministic simulations and the experiment, but the statistical evaluation shows that the uncertainty on the important performance parameters is too large for this ''nice'' result. The analysis does therefore indicate a discrepancy between the experiment and the deterministic code predictions. Possible explanations for this disagreement are discussed. (author)
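
    The two statistical methods named above, Monte Carlo simulation and a low-order Taylor approximation, can be contrasted on a toy response function. The "fuel temperature" model and all input uncertainties below are invented for illustration and bear no relation to FRP's actual models.

```python
# Propagating input uncertainty through a toy fuel-performance response
# by (a) Monte Carlo sampling and (b) a first-order Taylor approximation.
# The response function and all statistics are invented for illustration.
import math
import random

def response(gap, power):
    """Toy linear 'fuel temperature' response; not a real fuel model."""
    return 500.0 + 2.0e-2 * power + 3.0e5 * gap

mu_gap, sd_gap = 100e-6, 10e-6       # pellet-clad gap [m] (assumed)
mu_pow, sd_pow = 20000.0, 1000.0     # rod power [W] (assumed)

# (a) Monte Carlo: sample inputs, evaluate, take sample statistics.
random.seed(0)
samples = [response(random.gauss(mu_gap, sd_gap), random.gauss(mu_pow, sd_pow))
           for _ in range(20000)]
mc_mean = sum(samples) / len(samples)
mc_sd = math.sqrt(sum((s - mc_mean) ** 2 for s in samples) / (len(samples) - 1))

# (b) First-order Taylor: linearize around the mean inputs.
dT_dgap, dT_dpow = 3.0e5, 2.0e-2     # partial derivatives of the toy model
taylor_mean = response(mu_gap, mu_pow)
taylor_sd = math.sqrt((dT_dgap * sd_gap) ** 2 + (dT_dpow * sd_pow) ** 2)

print(round(mc_mean, 1), round(mc_sd, 1), round(taylor_mean, 1), round(taylor_sd, 1))
```

    For this linear toy model the two methods agree closely; the low-order Taylor option is attractive precisely when the response is near-linear over the input uncertainty range, while Monte Carlo remains valid for strongly nonlinear fuel behavior.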

  14. DEXTER: A one-dimensional code for calculating thermionic performance of long converters

    Science.gov (United States)

    Sawyer, C. D.

    1971-01-01

    A versatile code is described for computing the coupled thermionic electric-thermal performance of long thermionic converters in which the temperature and voltage variations cannot be neglected. The code is capable of accounting for a variety of external electrical connection schemes, coolant flow paths and converter failures by partial shorting. Example problem solutions are included along with a user's manual.

  15. NeoCASS: An integrated tool for structural sizing, aeroelastic analysis and MDO at conceptual design level

    Science.gov (United States)

    Cavagna, Luca; Ricci, Sergio; Travaglini, Lorenzo

    2011-11-01

    This paper presents a design framework called NeoCASS (Next generation Conceptual Aero-Structural Sizing Suite), developed at the Department of Aerospace Engineering of Politecnico di Milano within the SimSAC (Simulating Aircraft Stability And Control Characteristics for Use in Conceptual Design) project, funded by the EU under the 6th Framework Programme. It enables the creation of efficient low-order, medium-fidelity models particularly suitable for structural sizing, aeroelastic analysis and optimization at the conceptual design level. The whole methodology is based on the integration of geometry construction, aerodynamic and structural analysis codes that combine depictive, computational, analytical, and semi-empirical methods, validated in an aircraft design environment. The work presented here aims at including the airframe and its effects from the very beginning of the conceptual design, an aspect usually not considered in this early phase: in most cases, very simplified formulas and datasheets are adopted, which implies a low level of detail and poor accuracy. Through NeoCASS, a preliminary distribution of stiffness and inertias can be determined from the initial layout. The adoption of empirical formulas is reduced to a minimum in favor of simple numerical methods. This makes it possible to consider the aeroelastic behavior and performance as well, improving the accuracy of the design tools during the iterative steps, lowering the development costs and reducing the time to market. The result achieved is a design tool based on computational methods for the aero-structural analysis and Multi-Disciplinary Optimization (MDO) of aircraft layouts at the conceptual design stage. A complete case study regarding the TransoniCRuiser aircraft is reported, including validation of the results obtained using industrial standard tools such as MSC/NASTRAN and a CFD (Computational Fluid Dynamics) code. As it will be shown, it is possible to improve the degree of…

  16. How could the replica method improve accuracy of performance assessment of channel coding?

    Energy Technology Data Exchange (ETDEWEB)

    Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama 226-8502 (Japan)], E-mail: kaba@dis.titech.ac.jp

    2009-12-01

    We explore the relation between the techniques of statistical mechanics and information theory for assessing the performance of channel coding. We base our study on a framework developed by Gallager in IEEE Trans. Inform. Theory IT-11, 3 (1965), where the minimum decoding error probability is upper-bounded by an average of a generalized Chernoff bound over a code ensemble. We show that the resulting bound in this framework can be assessed directly by the replica method, which has been developed in the statistical mechanics of disordered systems, whereas Gallager's original methodology requires a further replacement by another bound utilizing Jensen's inequality. Our approach associates a seemingly ad hoc restriction on an adjustable parameter for optimizing the bound with a phase transition between two replica symmetric solutions, and can improve the accuracy of performance assessments of general code ensembles, including low-density parity-check codes, although its mathematical justification is still open.
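
    For orientation, the Gallager (1965) framework referred to above bounds the ensemble-average block error probability by a Chernoff-type random-coding bound. A textbook-form sketch for a memoryless channel (not reproduced from the paper; for structured ensembles such as LDPC codes the ensemble average is taken differently, which is where the replica evaluation enters):

```latex
% Random-coding bound over the ensemble, block length N, rate R:
\overline{P}_e \;\le\; \exp\!\bigl[-N\,E_r(R)\bigr],
\qquad
E_r(R) \;=\; \max_{0 \le \rho \le 1} \bigl[\,E_0(\rho) - \rho R\,\bigr],

% Gallager function for input distribution p(x) and channel P(y|x):
E_0(\rho) \;=\; -\ln \sum_{y} \Bigl[\,\sum_{x} p(x)\, P(y \mid x)^{1/(1+\rho)} \Bigr]^{1+\rho}.
```

    The adjustable parameter here is $\rho$; the seemingly ad hoc restriction $0 \le \rho \le 1$ is what the paper reinterprets as a phase transition between two replica symmetric solutions.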

  17. Solar thermal–photovoltaic powered potato cold storage – Conceptual design and performance analyses

    International Nuclear Information System (INIS)

    Basu, Dipankar N.; Ganguly, A.

    2016-01-01

    Highlights: • Loss of food crops is a huge problem in India due to the shortage of cold storage. • Conceptual design of a power system using solar energy for a potato cold storage. • Integration of flat plate collectors and SPV modules with a suitable operating strategy. • The system provides a net energy surplus of about 36 MW h over a calendar year. • A rudimentary economic analysis found a payback period of less than four years. - Abstract: Wastage of food crops due to the dearth of proper cold storage facilities is a huge problem in underdeveloped and developing countries of the world. The conceptual design of a potato cold storage is presented here, along with a performance appraisal over a calendar year. The microclimate inside the cold storage is regulated using a water–lithium bromide absorption system. The proposed system utilizes both solar thermal and photovoltaic-generated electrical energy for its operation. A suitable operation strategy is devised, and the performance of the integrated system is analyzed from an energy and exergy point of view to identify the required numbers of thermal collectors and photovoltaic modules. The proposed system is found to provide a net surplus of about 36 MW h of energy over a calendar year, after meeting the in-house requirements. A rudimentary economic analysis is also performed to check the financial viability of the proposed system. Both the thermal and photovoltaic components are found to have payback periods of less than four years.
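
    A rudimentary payback analysis of the kind mentioned above reduces to dividing the capital outlay by the annual net savings. The sketch below is generic; the numeric figures are hypothetical and not taken from the paper:

```python
def simple_payback_years(capital_cost, annual_net_savings):
    """Undiscounted payback period: years needed for cumulative savings
    to recover the initial capital outlay."""
    if annual_net_savings <= 0:
        raise ValueError("the system never pays back")
    return capital_cost / annual_net_savings

# Hypothetical figures for illustration only: a component costing 1.8e6
# currency units that displaces 0.5e6 per year of purchased energy.
payback = simple_payback_years(1.8e6, 0.5e6)  # 3.6 years
```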

  18. The CMSSW benchmarking suite: Using HEP code to measure CPU performance

    International Nuclear Information System (INIS)

    Benelli, G

    2010-01-01

    The demanding computing needs of the CMS experiment require thoughtful planning and management of its computing infrastructure. A key factor in this process is the use of realistic benchmarks when assessing the computing power of the different architectures available. In recent years a discrepancy has been observed between the CPU performance estimates given by the reference benchmark for HEP computing (SPECint) and the actual performance of HEP code. Making use of the CPU performance tools from the CMSSW performance suite, comparative CPU performance studies have been carried out on several architectures. A benchmarking suite has been developed and integrated in the CMSSW framework to allow computing centers and interested third parties to benchmark architectures directly with CMSSW. The CMSSW benchmarking suite can be used out of the box to test and compare several machines in terms of CPU performance and to report the different benchmarking scores (e.g. by processing step) at the desired level of detail. In this talk we describe briefly the CMSSW software performance suite and, in detail, the CMSSW benchmarking suite client/server design, the performance data analysis, and the available CMSSW benchmark scores. The experience in the use of HEP code for benchmarking will be discussed and CMSSW benchmark results presented.

  19. Oxide fuel pin transient performance analysis and design with the TEMECH code

    International Nuclear Information System (INIS)

    Bard, F.E.; Dutt, S.P.; Hinman, C.A.; Hunter, C.W.; Pitner, A.L.

    1986-01-01

    The TEMECH code is a fast-running, thermal-mechanical-hydraulic, analytical program used to evaluate the transient performance of LMR oxide fuel pins. The code calculates pin deformation and failure probability due to fuel-cladding differential thermal expansion, expansion of fuel upon melting, and fission gas pressurization. The mechanistic fuel model in the code accounts for fuel cracking, crack closure, porosity decrease, and the temperature dependence of fuel creep through the course of the transient. Modeling emphasis has been placed on results obtained from Fuel Cladding Transient Test (FCTT) testing, Transient Fuel Deformation (TFD) tests and TREAT integral fuel pin experiments

  20. Conceptualizing the metaphors of drug abusers

    Directory of Open Access Journals (Sweden)

    Gyuró Monika

    2016-06-01

    The intention of this article is to demonstrate, within the framework of cognitive linguistics (Lakoff and Johnson, 1980a), how slang words associated with substance abuse are conceptualized via metaphors. This study analyses recreational drug slang terms found in the Drug Slang Dictionary in order to reveal categories of metaphors involved in drug users’ language. The results of the data analysis effectively reveal that, within a thematic approach, classes of metaphor are coded to enable connections between metaphorical concepts and drug addicts’ physiological experiences in order to present their personal meanings and cognitive processes. The study also involves drug addicts’ narratives to identify conceptual metaphors in their experiences. Notably, it is argued within this research that figurative language use is also connected to the cultural background of users to a great extent.

  1. Development and validation of the ENIGMA code for MOX fuel performance modelling

    International Nuclear Information System (INIS)

    Palmer, I.; Rossiter, G.; White, R.J.

    2000-01-01

    The ENIGMA fuel performance code has been under development in the UK since the mid-1980s with contributions made by both the fuel vendor (BNFL) and the utility (British Energy). In recent years it has become the principal code for UO2 fuel licensing for both PWR and AGR reactor systems in the UK and has also been used by BNFL in support of overseas UO2 and MOX fuel business. A significant new programme of work has recently been initiated by BNFL to further develop the code specifically for MOX fuel application. Model development is proceeding hand in hand with a major programme of MOX fuel testing and PIE studies, with the objective of producing a fuel modelling code suitable for mechanistic analysis, as well as for licensing applications. This paper gives an overview of the model developments being undertaken and of the experimental data being used to underpin and to validate the code. The paper provides a summary of the code development programme together with specific examples of new models produced. (author)

  2. Impact of intra-flow network coding on the relay channel performance: an analytical study

    OpenAIRE

    Apavatjrut , Anya; Goursaud , Claire; Jaffrès-Runser , Katia; Gorce , Jean-Marie

    2012-01-01

    One of the most powerful ways to achieve transmission reliability over wireless links is to employ efficient coding techniques. This paper investigates the performance of a transmission over a relay channel where information is protected by two layers of coding. In the first layer, transmission reliability is ensured by fountain coding at the source. The second layer incorporates network coding at the relay node. Thus, fountain coded packets are re-encoded at the relay…

  3. Perceptual Processing Affects Conceptual Processing

    Science.gov (United States)

    van Dantzig, Saskia; Pecher, Diane; Zeelenberg, Rene; Barsalou, Lawrence W.

    2008-01-01

    According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task…

  4. Typical performance of regular low-density parity-check codes over general symmetric channels

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, Toshiyuki [Department of Electronics and Information Engineering, Tokyo Metropolitan University, 1-1 Minami-Osawa, Hachioji-shi, Tokyo 192-0397 (Japan); Saad, David [Neural Computing Research Group, Aston University, Aston Triangle, Birmingham B4 7ET (United Kingdom)

    2003-10-31

    Typical performance of low-density parity-check (LDPC) codes over a general binary-input output-symmetric memoryless channel is investigated using methods of statistical mechanics. The relationship between the free energy in the statistical-mechanics approach and the mutual information used in the information-theory literature is established within a general framework; Gallager and MacKay-Neal codes are studied as specific examples of LDPC codes. It is shown that basic properties of these codes known for particular channels, including their potential to saturate Shannon's bound, hold for general symmetric channels. The binary-input additive-white-Gaussian-noise channel and the binary-input Laplace channel are considered as specific channel models.

  5. Typical performance of regular low-density parity-check codes over general symmetric channels

    International Nuclear Information System (INIS)

    Tanaka, Toshiyuki; Saad, David

    2003-01-01

    Typical performance of low-density parity-check (LDPC) codes over a general binary-input output-symmetric memoryless channel is investigated using methods of statistical mechanics. The relationship between the free energy in the statistical-mechanics approach and the mutual information used in the information-theory literature is established within a general framework; Gallager and MacKay-Neal codes are studied as specific examples of LDPC codes. It is shown that basic properties of these codes known for particular channels, including their potential to saturate Shannon's bound, hold for general symmetric channels. The binary-input additive-white-Gaussian-noise channel and the binary-input Laplace channel are considered as specific channel models.

  6. A "SuperCode" for performing systems analysis of tokamak experiments and reactors

    International Nuclear Information System (INIS)

    Haney, S.W.; Barr, W.L.; Crotinger, J.A.; Perkins, L.J.; Solomon, C.J.; Chaniotakis, E.A.; Freidberg, J.P.; Wei, J.; Galambos, J.D.; Mandrekas, J.

    1992-01-01

    A new code, named the "SUPERCODE," has been developed to fill the gap between currently available zero-dimensional systems codes and highly sophisticated, multidimensional plasma performance codes. The former are comprehensive in content and fast to execute, but rather simple in terms of the accuracy of the physics and engineering models. The latter contain state-of-the-art plasma physics modelling but are limited in engineering content and time consuming to run. The SUPERCODE upgrades the reliability and accuracy of systems codes by calculating the self-consistent 1½-dimensional MHD-transport plasma evolution in a realistic engineering environment. By a combination of variational techniques and careful formulation, there is only a modest increase in CPU time over 0-D runs, thereby making the SUPERCODE suitable for use as a systems studies tool. In addition, considerable effort has been expended to make the code user- and programming-friendly, as well as operationally flexible, with the hope of encouraging wide usage throughout the fusion community

  7. Performance analysis of a decoding algorithm for algebraic-geometry codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Jensen, Helge Elbrønd; Nielsen, Rasmus Refslund

    1999-01-01

    The fast decoding algorithm for one point algebraic-geometry codes of Sakata, Elbrond Jensen, and Hoholdt corrects all error patterns of weight less than half the Feng-Rao minimum distance. In this correspondence we analyze the performance of the algorithm for heavier error patterns. It turns out...

  8. Development of a general coupling interface for the fuel performance code transuranus tested with the reactor dynamic code DYN3D

    International Nuclear Information System (INIS)

    Holt, L.; Rohde, U.; Seidl, M.; Schubert, A.; Van Uffelen, P.

    2013-01-01

    Several institutions plan to couple the fuel performance code TRANSURANUS, developed by the European Institute for Transuranium Elements, with their own codes. One of these codes is the reactor dynamics code DYN3D maintained by the Helmholtz-Zentrum Dresden-Rossendorf. DYN3D was developed originally for VVER type reactors and was later extended to western type reactors. Usually, the fuel rod behavior is modeled in thermal-hydraulics and neutronics codes in a simplified manner. The main idea of this coupling is to describe the fuel rod behavior in the frame of core safety analysis in a more detailed way, e.g. including the influence of the high burn-up structure, geometry changes and fission gas release. It allows one to benefit from the improved computational power and software achieved over the last two decades. The coupling interface was developed in a general way from the beginning, hence it can also easily be used by other codes for a coupling with TRANSURANUS. The user can choose between a one-way and a two-way online coupling option. For a one-way online coupling, DYN3D provides only the time-dependent rod power and thermal-hydraulic conditions to TRANSURANUS, and the fuel performance code does not transfer any variable back to DYN3D. In a two-way online coupling, TRANSURANUS in addition transfers parameters like fuel temperature and cladding temperature back to DYN3D. This list of variables can easily be extended by geometric and further variables of interest. First results of the code system DYN3D-TRANSURANUS will be presented for a control rod ejection transient in a modern western type reactor. Pre-analyses already show that detailed fuel rod behavior modeling will influence the thermal hydraulics and hence also the neutronics, due to the Doppler reactivity effect of the fuel temperature. The coupled code system therefore has the potential to improve the assessment of safety criteria. The developed code system DYN3D-TRANSURANUS can also be used…
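
    The one-way versus two-way coupling distinction described above can be illustrated with a toy time-stepping loop. This is purely an illustrative sketch, not the DYN3D-TRANSURANUS interface; all functions, coefficients and temperatures are invented stand-ins, with a negative Doppler coefficient standing in for the fuel-temperature reactivity feedback:

```python
from dataclasses import dataclass

@dataclass
class RodState:
    fuel_temp: float   # K
    clad_temp: float   # K

def system_code_step(power, coolant_temp, rod):
    """Stand-in for the system/neutronics side: a negative Doppler
    coefficient reduces power as the fuel temperature rises."""
    doppler = -1e-5 * (rod.fuel_temp - 900.0)
    return power * (1.0 + doppler), coolant_temp

def fuel_code_step(power, coolant_temp):
    """Stand-in for the fuel performance side: crude linear proxies
    for pellet and cladding temperatures."""
    return RodState(fuel_temp=coolant_temp + 1.2 * power,
                    clad_temp=coolant_temp + 0.1 * power)

def run_coupled(two_way, steps=5):
    power, coolant = 300.0, 560.0
    rod = RodState(900.0, 600.0)   # system code's simplified internal rod model
    for _ in range(steps):
        # one-way: the system code never sees the detailed fuel state;
        # two-way: the detailed fuel temperature feeds the Doppler term
        feedback = rod if two_way else RodState(900.0, 600.0)
        power, coolant = system_code_step(power, coolant, feedback)
        rod = fuel_code_step(power, coolant)  # always receives power and T-H data
    return power, rod
```

    In this toy, the two-way run settles at a lower power than the one-way run because the detailed (hotter) fuel temperature feeds back through the Doppler term, which is the qualitative effect the abstract anticipates.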

  9. Development of Pflotran Code for Waste Isolation Pilot Plant Performance Assessment

    Science.gov (United States)

    Zeitler, T.; Day, B. A.; Frederick, J.; Hammond, G. E.; Kim, S.; Sarathi, R.; Stein, E.

    2017-12-01

    The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. There is a current effort to enhance WIPP PA capabilities through the further development of the PFLOTRAN software, a state-of-the-art massively parallel subsurface flow and reactive transport code. Benchmark testing of the individual WIPP-specific process models implemented in PFLOTRAN (e.g., gas generation, chemistry, creep closure, actinide transport, and waste form) has been performed, including results comparisons for PFLOTRAN and existing WIPP PA codes. Additionally, enhancements to the subsurface hydrologic flow model have been made. Repository-scale testing has also been performed for the modified PFLOTRAN code and detailed results will be presented. Ultimately, improvements to the current computational environment will result in greater detail and flexibility in the repository model due to a move from a two-dimensional calculation grid to a three-dimensional representation. The result of the effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future for use in compliance recertification applications (CRAs) submitted to the EPA. Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of…

  10. Preliminary Development of the MARS/FREK Spatial Kinetics Coupled System Code for Square Fueled Fast Reactor Applications

    International Nuclear Information System (INIS)

    Bae, Moo Hoon; Joo, Han Gyu

    2009-01-01

    Incorporation of a three-dimensional (3-D) reactor kinetics model into a system thermal-hydraulic (T/H) code enhances the capability to perform realistic analyses of the core neutronic behavior and the plant system dynamics, which are coupled to each other. For this advantage, several coupled system T/H and spatial kinetics codes, such as RELAP/PARCS, RELAP5/PANBOX, and MARS/MASTER, have been developed. These codes, however, have so far been limited to LWR applications. The objective of this work is to develop such a coupled code for fast reactor applications. In particular, applications to lead-bismuth eutectic (LBE) cooled fast reactors, which employ open square lattices, are of interest. A fast reactor kinetics code applicable to square fueled cores, called FREK, is coupled to the LBE version of the MARS code. The MARS/MASTER coupled code is used as the reference for the integration. The coupled code MARS/FREK is examined for a conceptual reactor called P-DEMO, which is being developed by NUTRECK. In order to check the validity of the coupled code, however, the OECD MSLB benchmark exercise III calculation is solved first

  11. The grout/glass performance assessment code system (GPACS) with verification and benchmarking

    International Nuclear Information System (INIS)

    Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.

    1994-12-01

    GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) glass performance assessment and many other applications, including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway.

  12. Performance studies of the parallel VIM code

    International Nuclear Information System (INIS)

    Shi, B.; Blomquist, R.N.

    1996-01-01

    In this paper, the authors evaluate the performance of the parallel version of the VIM Monte Carlo code on the IBM SPx at the High Performance Computing Research Facility at ANL. Three test problems with contrasting computational characteristics were used to assess effects on performance. A statistical method for estimating the inefficiencies due to load imbalance and communication is also introduced. VIM is a large-scale continuous-energy Monte Carlo radiation transport program and was parallelized using history partitioning, the master/worker approach, and the p4 message passing library. Dynamic load balancing is accomplished when the master processor assigns chunks of histories to workers that have completed a previously assigned task, accommodating variations in the lengths of histories, processor speeds, and worker loads. At the end of each batch (generation), the fission sites and tallies are sent from each worker to the master process, contributing to the parallel inefficiency. All communications are between master and workers, and are serial. The SPx is a scalable 128-node parallel supercomputer with high-performance Omega switches of 63 microsec latency and 35 MBytes/sec bandwidth. For uniform and reproducible performance, they used only the 120 identical regular processors (IBM RS/6000) and excluded the remaining eight planet nodes, which may be loaded by others' jobs
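
    The master/worker scheme with dynamic load balancing described above can be sketched in miniature. This is an illustrative toy using Python threads and queues rather than the p4-based VIM implementation; the "chunk of histories" is a stand-in that just accumulates random samples into a tally:

```python
import queue
import random
import threading

def simulate_chunk(seed, n_histories):
    """Stand-in for tracking a chunk of particle histories; returns a tally."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n_histories))

def worker(task_q, result_q):
    while True:
        task = task_q.get()
        if task is None:                  # poison pill: batch is finished
            return
        result_q.put(simulate_chunk(*task))

def run_batch(total_histories=10000, chunk=500, n_workers=4):
    task_q, result_q = queue.Queue(), queue.Queue()
    n_chunks = total_histories // chunk
    # The master enqueues chunks; an idle worker pulls the next one as soon
    # as it finishes its previous task (dynamic load balancing).
    for i in range(n_chunks):
        task_q.put((i, chunk))
    for _ in range(n_workers):
        task_q.put(None)
    threads = [threading.Thread(target=worker, args=(task_q, result_q))
               for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # The master gathers all tallies at the end of the batch: the serial
    # step that the abstract identifies as a source of parallel inefficiency.
    return sum(result_q.get() for _ in range(n_chunks))
```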

  13. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran.

    Science.gov (United States)

    Mohajjel-Aghdam, Alireza; Hassankhani, Hadi; Zamanzadeh, Vahid; Khameneh, Saied; Moghaddam, Sara

    2013-09-01

    The nursing profession requires knowledge of ethics to guide performance, and its nature necessitates ethical care beyond routine care. Today, professional ethic codes are defined worldwide on the basis of human and ethical issues in nurse-patient communication. To improve all dimensions of nursing, we need to respect ethic codes. The aim of this study is to assess knowledge and performance regarding nursing ethic codes from nurses' and patients' perspectives. A descriptive study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics, independent t-test, ANOVA, and the Pearson correlation coefficient in SPSS 13. Most of the nurses were female, married, and educated to BS degree; 86.4% of them were aware of the ethic codes. Furthermore, 91.9% of nurses and 41.8% of patients stated that nurses respect the ethic codes, and nurses' and patients' perspectives on the ethic codes differed significantly. A significant relationship was found between nurses' knowledge of ethic codes and both job satisfaction and complaints about ethical performance. According to the results, attention to teaching ethic codes in the nursing curriculum for students and continuous education for staff is proposed; on the other hand, recognizing failures of the health system, optimizing nursing care, informing patients about nursing ethic codes, promoting patient rights, and achieving patient satisfaction can minimize the differences between the two perspectives.

  14. The error performance analysis over cyclic redundancy check codes

    Science.gov (United States)

    Yoon, Hee B.

    1991-06-01

    Burst errors are generated in digital communication networks by various unpredictable conditions; they occur at high error rates, for short durations, and can impact services. To completely describe a burst error one has to know the bit pattern, which is impossible in practice on working systems. Therefore, under memoryless binary symmetric channel (MBSC) assumptions, performance evaluation and estimation schemes for digital signal 1 (DS1) transmission systems carrying live traffic are an interesting and important problem. This study presents some analytical methods leading to efficient burst-error detection algorithms using cyclic redundancy check (CRC) codes. The definition of burst error is introduced using three different models; among the three, the mathematical model is used in this study. A probability density function f(b) for burst errors of length b is proposed. The performance of CRC-n codes is evaluated and analyzed using f(b) in a computer simulation model of burst errors within a CRC block. The simulation results show that the mean block burst error tends to approach the pattern of burst errors generated by random bit errors.
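
    The CRC mechanism behind such burst-error detection can be illustrated directly: the message is divided by a generator polynomial over GF(2) and the remainder is appended as a checksum, and a CRC-n code detects every burst of length n or less. The sketch below is generic long division, not the paper's evaluation scheme; the CRC-6 generator x^6 + x + 1 shown is the one used in the DS1 extended superframe format, while the message bits are arbitrary:

```python
def crc_remainder(bits, poly_bits):
    """Divide the message (with n zero bits appended) by the generator
    polynomial over GF(2); the remainder is the n-bit checksum,
    where n = len(poly_bits) - 1."""
    n = len(poly_bits) - 1
    dividend = list(bits) + [0] * n
    for i in range(len(bits)):
        if dividend[i]:
            for j, p in enumerate(poly_bits):
                dividend[i + j] ^= p
    return dividend[-n:]

def crc_check(bits_with_crc, poly_bits):
    """A received frame passes iff message+checksum leaves a zero remainder."""
    dividend = list(bits_with_crc)
    for i in range(len(dividend) - len(poly_bits) + 1):
        if dividend[i]:
            for j, p in enumerate(poly_bits):
                dividend[i + j] ^= p
    return not any(dividend)

# CRC-6 generator g(x) = x^6 + x + 1 (DS1 extended superframe)
POLY = [1, 0, 0, 0, 0, 1, 1]
msg = [1, 0, 1, 1, 0, 0, 1, 0]
frame = msg + crc_remainder(msg, POLY)
```

    Any single corrupted bit, and any burst of up to 6 consecutive corrupted bits, leaves a nonzero remainder and is therefore detected by this CRC-6.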

  15. A conceptual toolbox for designing CSCW applications

    DEFF Research Database (Denmark)

    Bødker, Susanne; Christiansen, Ellen

    1995-01-01

    This paper presents a conceptual toolbox, developed to support the design of CSCW applications in a large Esprit project, EuroCODE. Here, several groups of designers work to investigate computer support for cooperative work in large use organizations, at the same time as they work to develop an open development platform for CSCW applications. The conceptual toolbox has been developed to support communication in and among these design groups, between designers and users, and in future use of the open development platform. Rejecting the idea that one may design from a framework describing CSCW, the toolbox aims to support design by doing and help bridge between work with users, technical design, and insights gained from theoretical and empirical CSCW research.

  16. Use of advanced simulations in fuel performance codes

    International Nuclear Information System (INIS)

    Van Uffelen, P.

    2015-01-01

    The simulation of the cylindrical fuel rod behaviour in a reactor or a storage pool for spent fuel requires a fuel performance code. Such a tool solves the equations for the heat transfer, the stresses and strains in fuel and cladding, the evolution of several isotopes, and the behaviour of various fission products in the fuel rod. The main equations, along with their limitations, are briefly described. The current approaches adopted for overcoming these limitations and the perspectives are also outlined. (author)
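
    As a minimal illustration of the heat-transfer part of such a code, the steady-state radial temperature profile in a solid cylindrical pellet with uniform volumetric heating and constant conductivity has the classic closed form below. This is a textbook sketch, not taken from any particular fuel performance code; the numerical values are illustrative only:

```python
import math

def pellet_temperature(r, pellet_radius, linear_power, k_fuel, surface_temp):
    """Steady-state radial temperature in a solid cylindrical pellet with
    uniform volumetric heating and constant conductivity k:
        T(r) = T_s + q' / (4 pi k) * (1 - (r / R)^2),
    where q' is the linear heat rate in W/m and T_s the pellet surface
    temperature."""
    return (surface_temp
            + linear_power / (4.0 * math.pi * k_fuel)
            * (1.0 - (r / pellet_radius) ** 2))

# Illustrative values: 20 kW/m linear power, k ~ 3 W/(m K), 700 K surface
t_center = pellet_temperature(0.0, 0.0041, 20.0e3, 3.0, 700.0)
```

    The centerline-to-surface temperature rise, q'/(4 pi k), is independent of the pellet radius, which is why the linear heat rate is the natural operating parameter in such codes.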

  17. Introduction into scientific work methods-a necessity when performance-based codes are introduced

    DEFF Research Database (Denmark)

    Dederichs, Anne; Sørensen, Lars Schiøtt

    The introduction of performance-based codes in Denmark in 2004 requires new competences from people working with different aspects of fire safety in the industry and the public sector. This abstract presents an attempt in reducing problems with handling and analysing the mathematical methods...... and CFD models when applying performance-based codes. This is done within the educational program "Master of Fire Safety Engineering" at the department of Civil Engineering at the Technical University of Denmark. It was found that the students had general problems with academic methods. Therefore, a new...

  18. Parallel Computing Characteristics of Two-Phase Thermal-Hydraulics code, CUPID

    International Nuclear Information System (INIS)

    Lee, Jae Ryong; Yoon, Han Young

    2013-01-01

    The parallelized CUPID code has proved able to reproduce multi-dimensional thermal-hydraulic analyses, as validated against various conceptual problems and experimental data. In this paper, the parallel performance characteristics of the CUPID code were investigated in terms of scalability. Both single- and two-phase simulations are taken into account. Since the scalability of a parallel simulation is known to be better for a fine mesh system, two types of mesh system are considered. In addition, the dependency on the preconditioner for the matrix solver was also compared. The CUPID code was parallelized with the domain decomposition method, and the MPI library was adopted to communicate the information at the interface cells. The scalability for single-phase flow is better than that for two-phase flow due to the smaller number of iterations needed to solve the pressure matrix, and as the number of mesh cells increases, the scalability improves. For a given mesh, the single-phase flow simulation with a diagonal preconditioner shows the best speedup. However, for the two-phase flow simulation, the ILU preconditioner is recommended since it reduces the overall simulation time
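
    The scalability observations above (finer meshes scale better because interface-cell communication is amortized over more computation per rank) can be framed with Amdahl's law. The sketch below is generic; the serial fractions are invented for illustration and are not measurements from CUPID:

```python
def amdahl_speedup(serial_fraction, n_procs):
    """Amdahl's law: speedup = 1 / (s + (1 - s) / N), where the serial
    fraction s models non-parallelizable work such as interface-cell
    communication and pressure-matrix iteration overhead."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

def parallel_efficiency(serial_fraction, n_procs):
    """Efficiency = speedup / N; it decays as N grows for any s > 0."""
    return amdahl_speedup(serial_fraction, n_procs) / n_procs

# A finer mesh behaves like a smaller effective serial fraction
# (more compute per communicated interface cell); values are invented:
coarse = amdahl_speedup(0.10, 32)   # ~7.8x on 32 ranks
fine = amdahl_speedup(0.02, 32)     # ~19.8x on 32 ranks
```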

  19. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low-density parity-check codes

  20. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files

  2. Conceptual design and performance simulations of super-compact electromagnetic calorimeter

    Directory of Open Access Journals (Sweden)

    Skoda Libor

    2013-11-01

    Full Text Available Measurements of particle production at forward rapidities in high energy p-p, p-A and A-A collisions provide access to physics processes at very low Bjorken x. These measurements will allow studies of the gluon saturation scale and improve our knowledge of parton distributions in nuclei. Specific requirements must be fulfilled for a calorimeter to successfully operate in the high-multiplicity forward region within often stringent space limits. Here we present a study of the conceptual design of a super-compact electromagnetic calorimeter being developed at the Czech Technical University in Prague. The design of the sampling calorimeter is based on a sandwich structure of thin tungsten and scintillator layers oriented in parallel to the beam. The optical readout of individual scintillator pads guarantees the required high radiation hardness of the detector. We present simulations of the expected performance of the optical pad readout together with the overall detector performance. The detector is intended to allow measurement of high energy photons (1

  3. Environmental context effects in conceptual explicit and implicit memory.

    Science.gov (United States)

    Parker, Andrew; Dagnall, Neil; Coyle, Anne-Marie

    2007-05-01

    Previous research has found environmental context effects for both conceptual explicit and conceptual implicit memory (Parker, Gellatly, & Waterman, 1999). The research presented here challenges these findings on methodological grounds. Experiment 1 assessed the effects of context change on category-exemplar generation (conceptual implicit memory test) and category-cued recall (conceptual explicit memory test). Experiment 2 assessed the effects of context change on word association (conceptual implicit memory test) and word associate cued recall (conceptual explicit memory test). In both experiments, study-test changes in environmental context were found to influence performance only on tests of explicit memory. It is concluded that when retrieval cues across explicit and implicit tests are matched, and the probability of explicit contamination is reduced, then only conceptual explicit test performance is reduced by study-test changes in environmental context.

  4. System Sensitivity Analysis Applied to the Conceptual Design of a Dual-Fuel Rocket SSTO

    Science.gov (United States)

    Olds, John R.

    1994-01-01

    This paper reports the results of initial efforts to apply the System Sensitivity Analysis (SSA) optimization method to the conceptual design of a single-stage-to-orbit (SSTO) launch vehicle. SSA is an efficient, calculus-based MDO technique for generating sensitivity derivatives in a highly multidisciplinary design environment. The method has been successfully applied to conceptual aircraft design and has been proven to have advantages over traditional direct optimization methods. The method is applied to the optimization of an advanced, piloted SSTO design similar to vehicles currently being analyzed by NASA as possible replacements for the Space Shuttle. Powered by a derivative of the Russian RD-701 rocket engine, the vehicle employs a combination of hydrocarbon, hydrogen, and oxygen propellants. Three primary disciplines are included in the design - propulsion, performance, and weights & sizing. A complete, converged vehicle analysis depends on the use of three standalone conceptual analysis computer codes. Efforts to minimize vehicle dry (empty) weight are reported in this paper. The problem consists of six system-level design variables and one system-level constraint. Using SSA in a 'manual' fashion to generate gradient information, six system-level iterations were performed from each of two different starting points. The results showed a good pattern of convergence for both starting points. A discussion of the advantages and disadvantages of the method, possible areas of improvement, and future work is included.
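
    The gradient-driven system-level iteration can be sketched on a toy problem. The two design variables, the quadratic "dry weight" model and the step size below are invented stand-ins for the coupled propulsion/performance/weights analyses, not SSA's actual calculus-based derivatives.

```python
# Toy "dry weight" model in two system-level design variables. The variable
# names, the quadratic form and the constants are invented for illustration.
def dry_weight(x):
    mixture_ratio, thrust_scale = x
    return (mixture_ratio - 3.0) ** 2 + 2.0 * (thrust_scale - 1.4) ** 2 + 50.0

def gradient(f, x, h=1e-6):
    """Forward-difference sensitivity derivatives, one extra call per variable."""
    f0 = f(x)
    return [(f([xj + h if j == i else xj for j, xj in enumerate(x)]) - f0) / h
            for i in range(len(x))]

def optimize(x, steps=200, lr=0.1):
    """Simple steepest-descent system-level iteration."""
    for _ in range(steps):
        x = [xi - lr * gi for xi, gi in zip(x, gradient(dry_weight, x))]
    return x

# Two different starting points converge to the same design, mirroring the
# convergence pattern reported from the two SSA starting points.
a = optimize([1.0, 2.5])
b = optimize([5.0, 0.5])
print([round(v, 3) for v in a], [round(v, 3) for v in b])
```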

  5. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  6. Drop performance test of conceptually designed control rod assembly for prototype generation IV sodium-cooled fast reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Kyu; Lee, Jae Han; Kim, Hoe Woong; Kim, Sung Kyun; Kim, Jong Bum [Sodium-cooled Fast Reactor NSSS Design Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2017-06-15

    The control rod assembly controls reactor power by adjusting its position during normal operation and shuts down chain reactions by its free drop under scram conditions. Therefore, the drop performance of the control rod assembly is important for the safety of a nuclear reactor. In this study, the drop performance of the conceptually designed control rod assembly for the prototype generation IV sodium-cooled fast reactor that is being developed at the Korea Atomic Energy Research Institute as a next-generation nuclear reactor was experimentally investigated. For the performance test, the test facility and test procedure were established first, and several free drop performance tests of the control rod assembly under different flow rate conditions were then carried out. Moreover, performance tests under several types and magnitudes of seismic loading conditions were also conducted to investigate the effects of seismic loading on the drop performance of the control rod assembly. The drop time of the conceptually designed control rod assembly for 0% of the tentatively designed flow rate was measured to be 1.527 seconds, and this agrees well with the analytically calculated drop time. It was also observed that the effect of seismic loading on the drop time was not significant.

  7. A New Prime Code for Synchronous Optical Code Division Multiple-Access Networks

    Directory of Open Access Journals (Sweden)

    Huda Saleh Abbas

    2018-01-01

    Full Text Available A new spreading code based on a prime code for synchronous optical code-division multiple-access networks that can be used in monitoring applications has been proposed. The new code is referred to as the “extended grouped new modified prime code.” This new code has the ability to support more terminal devices than other prime codes. In addition, it pads subsequences with “0s”, leading to lower power consumption. The proposed code has an improved cross-correlation resulting in enhanced BER performance. The code construction and parameters are provided. The operating performance, using incoherent on-off keying modulation and incoherent pulse position modulation systems, has been analyzed. The performance of the code was compared with other prime codes. The results demonstrate an improved performance, and a BER floor of 10(-9) was achieved.
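
    The correlation property that such constructions aim for can be checked numerically on the classic prime code, the basic construction that the proposed code extends (the extended grouped variant itself is not reproduced here). For a prime p, codeword i places a chip at position j*p + (i*j mod p):

```python
def prime_code(i, p):
    """0/1 chip sequence of length p*p for prime sequence S_i."""
    word = [0] * (p * p)
    for j in range(p):
        word[j * p + (i * j) % p] = 1
    return word

def in_phase_xcorr(a, b):
    """Cross-correlation at zero shift, as seen by a synchronous receiver."""
    return sum(u & v for u, v in zip(a, b))

p = 5
codes = [prime_code(i, p) for i in range(p)]

# Each codeword has weight p, and any two distinct codewords overlap in
# exactly one chip, i.e. the in-phase cross-correlation is 1.
assert all(sum(c) == p for c in codes)
pairs = [(i, k) for i in range(p) for k in range(p) if i < k]
print(all(in_phase_xcorr(codes[i], codes[k]) == 1 for i, k in pairs))
```

    Keeping this in-phase cross-correlation at or below one is what bounds the multiple-access interference in a synchronous OCDMA system.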

  8. Three-field modeling for MARS 1-D code

    International Nuclear Information System (INIS)

    Hwang, Moonkyu; Lim, Ho-Gon; Jeong, Jae-Jun; Chung, Bub-Dong

    2006-01-01

    In this study, the three-field modeling of the two-phase mixture is developed. The finite difference equations for the three-field equations thereafter are devised. The solution scheme has been implemented into the MARS 1-D code. The three-field formulations adopted are similar to those for MARS 3-D module, in a sense that the mass and momentum are treated separately for the entrained liquid and continuous liquid. As in the MARS-3D module, the entrained liquid and continuous liquid are combined into one for the energy equation, assuming thermal equilibrium between the two. All the non-linear terms are linearized to arrange the finite difference equation set into a linear matrix form with respect to the unknown arguments. The problems chosen for the assessment of the newly added entrained field consist of basic conceptual tests. Among the tests are gas-only test, liquid-only test, gas-only with supplied entrained liquid test, Edwards pipe problem, and GE level swell problem. The conceptual tests performed confirm the sound integrity of the three-field solver

  9. Conceptualizations on Innovation Competency in a Problem- and Project-Based Learning Curriculum

    DEFF Research Database (Denmark)

    Zhang, Fenzhi; Kolmos, Anette; de Graaff, Erik

    2013-01-01

    Understanding innovation competency is the first step in fostering innovative engineers as conceptualizations can both enhance and inhibit innovative behaviors. Though literature is replete with discussions on conceptualizing innovation competency, there is much disagreement regarding its concepts...... identified by analyzing the narratives of interviewees and coding the transcriptions into pre-prepared categories, based on the theoretical framework inspired by activity theory. The analysis of empirical data indicates a collaborative nature of innovation competency in the PBL curriculum; emphasizes...

  10. Development of a Conceptual Framework to Measure the Social Impact of Burns.

    Science.gov (United States)

    Marino, Molly; Soley-Bori, Marina; Jette, Alan M; Slavin, Mary D; Ryan, Colleen M; Schneider, Jeffrey C; Resnik, Linda; Acton, Amy; Amaya, Flor; Rossi, Melinda; Soria-Saucedo, Rene; Kazis, Lewis E

    Measuring community reintegration following burn injury is important to assess the efficacy of therapies designed to optimize recovery. This project aims to develop and validate a conceptual framework for understanding the social impact of burn injuries in adults. The framework is critical for developing the item banks used for a computerized adaptive test. We performed a comprehensive literature review and consulted with clinical experts and burn survivors about social life areas impacted by burn injury. Focus groups with burn survivors and clinicians were conducted to inform and validate the framework. Transcripts were coded using grounded theory methodology. The World Health Organization's International Classification of Functioning, Disability and Health, was chosen to ground the content model. The primary construct identified was social participation, which contains two concepts: societal role and personal relationships. The subdomains chosen for item development were work, recreation and leisure, relating with strangers, and romantic, sexual, family, and informal relationships. Qualitative results strongly suggest that the conceptual model fits the constructs for societal role and personal relationships with the respective subdomains. This conceptual framework has guided the implementation of a large-scale calibration study currently underway which will lead to a computerized adaptive test for monitoring the social impacts of burn injuries during recovery.

  11. A good performance watermarking LDPC code used in high-speed optical fiber communication system

    Science.gov (United States)

    Zhang, Wenbo; Li, Chao; Zhang, Xiaoguang; Xi, Lixia; Tang, Xianfeng; He, Wenxue

    2015-07-01

    A watermarking LDPC code, a strategy designed to improve the performance of the traditional LDPC code, is introduced. By inserting pre-defined watermarking bits into the original LDPC code, a more accurate estimate of the noise level in the fiber channel can be obtained. This estimate is then used to modify the probability distribution function (PDF) used in the initialization of the belief propagation (BP) decoding algorithm. The algorithm was tested in a 128 Gb/s PDM-DQPSK optical communication system, and the results showed that the watermarking LDPC code had better tolerance to polarization mode dispersion (PMD) and nonlinearity than the traditional LDPC code. Moreover, at the cost of about 2.4% additional redundancy for the watermarking bits, the decoding efficiency of the watermarking LDPC code is about twice that of the traditional one.
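
    The idea of estimating the channel from known inserted bits can be sketched for a binary symmetric channel. This is a simplified stand-in (hard decisions and a BSC instead of the fiber-channel PDF in the paper), and all lengths and probabilities below are arbitrary illustrations.

```python
import math
import random

random.seed(7)
n, n_wm = 2000, 48        # payload length and number of watermarking bits
p_true = 0.06             # channel flip probability, unknown to the receiver

payload = [random.randint(0, 1) for _ in range(n)]
watermark = [0, 1] * (n_wm // 2)     # pre-defined, known at the receiver
tx = payload + watermark

# Binary symmetric channel stand-in for the noisy fiber link.
rx = [b ^ (random.random() < p_true) for b in tx]

# Estimate the flip probability from the known watermarking bits...
errors = sum(r != w for r, w in zip(rx[n:], watermark))
p_hat = max(errors / n_wm, 1.0 / (2 * n_wm))   # clamp away from zero

# ...and use the estimate to initialize the channel LLRs fed to BP decoding.
llr = [(1 - 2 * r) * math.log((1 - p_hat) / p_hat) for r in rx[:n]]
print(f"estimated flip probability: {p_hat:.3f}")
```

    A BP decoder initialized with LLRs matched to the actual noise level converges faster and more reliably than one using a fixed assumed PDF, which is the gain the watermarking strategy targets.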

  12. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran

    Directory of Open Access Journals (Sweden)

    Sara Moghaddam

    2013-08-01

    Full Text Available Introduction: The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care even more than routine care. Today, professional ethic codes are defined worldwide based on human and ethical issues in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethic codes. The aim of this study is to assess knowledge and performance regarding nursing ethic codes from nurses' and patients' perspectives. Methods: A cross-sectional comparative study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics, the independent t-test, ANOVA and the Pearson correlation coefficient, in SPSS 13. Results: Most of the nurses were female, married and educated to the BS degree; 86.4% of them were aware of the ethic codes. In addition, 91.9% of nurses and 41.8% of patients reported that nurses respect the ethic codes. Nurses' and patients' perspectives on the ethic codes differed significantly. A significant relationship was found between nurses' knowledge of ethic codes and both job satisfaction and complaints about ethical performance. Conclusion: According to the results, attention to teaching ethic codes in the nursing curriculum for students and continuous education for staff is proposed. On the other hand, recognizing failures of the health system, optimizing nursing care, informing patients about nursing ethic codes, promoting patient rights and achieving patient satisfaction can minimize the differences between the two perspectives.

  13. SCANAIR a transient fuel performance code Part two: Assessment of modelling capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Georgenthum, Vincent, E-mail: vincent.georgenthum@irsn.fr; Moal, Alain; Marchand, Olivier

    2014-12-15

    Highlights: • The SCANAIR code is devoted to the study of irradiated fuel rod behaviour during RIA. • The paper deals with the status of the code validation for PWR rods. • During the PCMI stage there is a good agreement between calculations and experiments. • The boiling crisis occurrence is rather well predicted. • The code assessment during the boiling crisis has still to be improved. - Abstract: In the frame of their research programmes on fuel safety, the French Institut de Radioprotection et de Sûreté Nucléaire develops the SCANAIR code devoted to the study of irradiated fuel rod behaviour during reactivity initiated accident. A first paper was focused on detailed modellings and code description. This second paper deals with the status of the code validation for pressurised water reactor rods performed thanks to the available experimental results. About 60 integral tests carried out in CABRI and NSRR experimental reactors and 24 separated tests performed in the PATRICIA facility (devoted to the thermal-hydraulics study) have been recalculated and compared to experimental data. During the first stage of the transient, the pellet clad mechanical interaction phase, there is a good agreement between calculations and experiments: the clad residual elongation and hoop strain of non failed tests but also the failure occurrence and failure enthalpy of failed tests are correctly calculated. After this first stage, the increase of cladding temperature can lead to the Departure from Nucleate Boiling. During the film boiling regime, the clad temperature can reach a very high temperature (>700 °C). If the boiling crisis occurrence is rather well predicted, the calculation of the clad temperature and the clad hoop strain during this stage have still to be improved.

  14. Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results

    Science.gov (United States)

    Jones, Scott M.

    2015-01-01

    Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial

  15. Construction and performance analysis of variable-weight optical orthogonal codes for asynchronous OCDMA systems

    Science.gov (United States)

    Li, Chuan-qi; Yang, Meng-jie; Zhang, Xiu-rong; Chen, Mei-juan; He, Dong-dong; Fan, Qing-bin

    2014-07-01

    A construction scheme of variable-weight optical orthogonal codes (VW-OOCs) for asynchronous optical code division multiple access (OCDMA) system is proposed. According to the actual situation, the code family can be obtained by programming in Matlab with the given code weight and corresponding capacity. The formula of bit error rate (BER) is derived by taking account of the effects of shot noise, avalanche photodiode (APD) bulk, thermal noise and surface leakage currents. The OCDMA system with the VW-OOCs is designed and improved. The study shows that the VW-OOCs have excellent performance of BER. Despite of coming from the same code family or not, the codes with larger weight have lower BER compared with the other codes in the same conditions. By taking simulation, the conclusion is consistent with the analysis of BER in theory. And the ideal eye diagrams are obtained by the optical hard limiter.

  16. Evolution beyond neo-Darwinism: a new conceptual framework.

    Science.gov (United States)

    Noble, Denis

    2015-01-01

    Experimental results in epigenetics and related fields of biological research show that the Modern Synthesis (neo-Darwinist) theory of evolution requires either extension or replacement. This article examines the conceptual framework of neo-Darwinism, including the concepts of 'gene', 'selfish', 'code', 'program', 'blueprint', 'book of life', 'replicator' and 'vehicle'. This form of representation is a barrier to extending or replacing existing theory as it confuses conceptual and empirical matters. These need to be clearly distinguished. In the case of the central concept of 'gene', the definition has moved all the way from describing a necessary cause (defined in terms of the inheritable phenotype itself) to an empirically testable hypothesis (in terms of causation by DNA sequences). Neo-Darwinism also privileges 'genes' in causation, whereas in multi-way networks of interactions there can be no privileged cause. An alternative conceptual framework is proposed that avoids these problems, and which is more favourable to an integrated systems view of evolution. © 2015. Published by The Company of Biologists Ltd.

  17. Performance of Turbo Interference Cancellation Receivers in Space-Time Block Coded DS-CDMA Systems

    Directory of Open Access Journals (Sweden)

    Emmanuel Oluremi Bejide

    2008-07-01

    Full Text Available We investigate the performance of turbo interference cancellation receivers in the space-time block coded (STBC) direct-sequence code division multiple access (DS-CDMA) system. Depending on the concatenation scheme used, we divide these receivers into the partitioned approach (PA) and the iterative approach (IA) receivers. The performance of both the PA and IA receivers is evaluated in Rayleigh fading channels for the uplink scenario. Numerical results show that the MMSE front-end turbo space-time iterative approach (IA) receiver effectively combats the mixture of MAI and intersymbol interference (ISI). To further investigate the possible achievable data rates in the turbo interference cancellation receivers, we introduce the puncturing of the turbo code through the use of rate-compatible punctured turbo codes (RCPTCs). Simulation results suggest that combining interference cancellation, turbo decoding, STBC, and RCPTC can significantly improve the achievable data rates for a synchronous DS-CDMA system for the uplink in Rayleigh flat fading channels.

  18. Conceptual integrated approach for the magnet system of a tokamak reactor

    International Nuclear Information System (INIS)

    Duchateau, J.-L.; Hertout, P.; Saoutic, B.; Artaud, J.-F.; Zani, L.; Reux, C.

    2014-01-01

    Highlights: • We give a conceptual approach to a fusion reactor magnet system based on analytical formulas. • We give design criteria for the CS and TF cable-in-conduit conductors and for the structural description of the magnet system. • We apply this conceptual approach to ITER and cross-check it with the actual characteristics. • We apply this conceptual approach to a possible version of DEMO. - Abstract: In the framework of the reflection on DEMO, a conceptual integrated approach for the magnet system of a tokamak reactor is presented. This objective is reached using analytical formulas, which are presented in this paper, coupled to a Fortran code, ESCORT (Electromagnetic Superconducting System for the Computation of Research Tokamaks), to be integrated into SYCOMORE, a code for reactor modelling presently under development at CEA/IRFM in Cadarache using the tools of the EFDA Integrated Tokamak Modelling task force. The analytical formulas deal with all aspects of the magnet system, starting from the derivation of the general geometry of the TF system from the main plasma characteristics. The design criteria for the cable current density and the structural design of the toroidal field and central solenoid systems are presented, making it possible to derive the radial thicknesses of the magnets and to estimate the duration of the plasma plateau. As a matter of fact, a pulsed version of DEMO is presently being actively considered in the European programmes. Considerations regarding cryogenics and protection, which affect the general design, are given. An application of the conceptual approach is presented, allowing a comparison between ESCORT output data and actual ITER parameters and giving the main characteristics of a possible version of DEMO

  19. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is another variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. A much better performance can be provided by using the EDW code compared with existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Both theoretical analysis and simulation show that the EDW code performs much better than the Hadamard and MFH codes.

  20. Improving aircraft conceptual design - A PHIGS interactive graphics interface for ACSYNT

    Science.gov (United States)

    Wampler, S. G.; Myklebust, A.; Jayaram, S.; Gelhausen, P.

    1988-01-01

    A CAD interface has been created for the 'ACSYNT' aircraft conceptual design code that permits the execution and control of the design process via interactive graphics menus. This CAD interface was coded entirely with the new three-dimensional graphics standard, the Programmer's Hierarchical Interactive Graphics System. The CAD/ACSYNT system is designed for use by state-of-the-art high-speed imaging work stations. Attention is given to the approaches employed in modeling, data storage, and rendering.

  1. Performance of asynchronous fiber-optic code division multiple access system based on three-dimensional wavelength/time/space codes and its link analysis.

    Science.gov (United States)

    Singh, Jaswinder

    2010-03-10

    A novel family of three-dimensional (3-D) wavelength/time/space codes for asynchronous optical code-division-multiple-access (CDMA) systems with "zero" off-peak autocorrelation and "unity" cross correlation is reported. Antipodal signaling and differential detection is employed in the system. A maximum of [(W x T+1) x W] codes are generated for unity cross correlation, where W and T are the number of wavelengths and time chips used in the code and are prime. The conditions for violation of the cross-correlation constraint are discussed. The expressions for number of generated codes are determined for various code dimensions. It is found that the maximum number of codes are generated for S systems. The codes have a code-set-size to code-size ratio greater than W/S. For instance, with a code size of 2065 (59 x 7 x 5), a total of 12,213 users can be supported, and 130 simultaneous users at a bit-error rate (BER) of 10(-9). An arrayed-waveguide-grating-based reconfigurable encoder/decoder design for 2-D implementation for the 3-D codes is presented so that the need for multiple star couplers and fiber ribbons is eliminated. The hardware requirements of the coders used for various modulation/detection schemes are given. The effect of insertion loss in the coders is shown to be significantly reduced with loss compensation by using an amplifier after encoding. An optical CDMA system for four users is simulated and the results presented show the improvement in performance with the use of loss compensation.

  2. Safety analysis of MOX fuels by fuel performance code

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2002-12-01

    The performance of plutonium-rich mixed oxide fuels specified for the Reduced-Moderation Water Reactor (RMWR) has been analysed with a modified fuel performance code. The thermodynamic properties of these fuels up to a burnup of 120 GWd/t have not been measured and were estimated using existing uranium fuel models. Fission product release, the pressure rise inside the fuel rods and the mechanical loads on the cladding due to internal pressure were preliminarily assessed based on an assumed axial power distribution history; the results indicate the integrity of the fuel performance. Detailed evaluation of fuel-cladding interactions due to thermal expansion or swelling of the fuel pellets at high burnup will be required for the safety analysis of mixed oxide fuels, and the thermal conductivity and swelling of plutonium-rich mixed oxide fuels shall be taken into consideration. (T. Tanaka)

  3. Implementation of computer codes for performance assessment of the Republic repository of LLW/ILW Mochovce

    International Nuclear Information System (INIS)

    Hanusik, V.; Kopcani, I.; Gedeon, M.

    2000-01-01

    This paper describes selection and adaptation of computer codes required to assess the effects of radionuclide release from Mochovce Radioactive Waste Disposal Facility. The paper also demonstrates how these codes can be integrated into performance assessment methodology. The considered codes include DUST-MS for source term release, MODFLOW for ground-water flow and BS for transport through biosphere and dose assessment. (author)

  4. Biosphere conceptual model development in the frame of Baita Bihor repository safety project

    International Nuclear Information System (INIS)

    Paunescu, N.; Margineanu, R.; Ene, D.

    2002-01-01

    The topic of this paper is the development of the biosphere model within the preliminary performance assessment of the Romanian national L/ILW repository, Baita-Bihor. The work presents the current understanding of the radionuclide pathways through the area adjacent to the repository and their conceptualization, the collection of required data, the implementation of the model, and preliminary calculation results. The model takes into consideration a leaching scenario from the near field and the transport of radionuclides by river water. The critical group is a small community of inhabitants relying on local resources, which constitutes an agricultural community ('small farm system'). On the basis of the defined specifications (biosphere equations and data), the model was applied and dose rates were estimated with the ABRICOT code. (author)

  5. Extending the application range of a fuel performance code from normal operating to design basis accident conditions

    International Nuclear Information System (INIS)

    Van Uffelen, P.; Gyori, C.; Schubert, A.; Laar, J. van de; Hozer, Z.; Spykman, G.

    2008-01-01

    Two types of fuel performance codes are generally applied, corresponding to normal operating conditions and design basis accident conditions, respectively. In order to simplify code management and the interface between the codes, and to take advantage of hardware progress, it is favourable to develop a single code that can cope with both conditions. In the first part of the present paper, we discuss the needs for creating such a code. The second part of the paper describes an example of model developments carried out by various members of the TRANSURANUS user group for coping with a loss of coolant accident (LOCA). In the third part, the validation of the extended fuel performance code is presented for LOCA conditions, whereas the last section summarises the present status and indicates needs for further developments to enable the code to deal with reactivity initiated accident (RIA) events.

  6. Isotopic modelling using the ENIGMA-B fuel performance code

    International Nuclear Information System (INIS)

    Rossiter, G.D.; Cook, P.M.A.; Weston, R.

    2001-01-01

    A number of experimental programmes by BNFL and other MOX fabricators have now shown that the in-pile performance of MOX fuel is generally similar to that of conventional UO2 fuel. Models based on UO2 fuel experience form a good basis for a description of MOX fuel behaviour. However, an area where the performance of MOX fuel is sufficiently different from that of UO2 to warrant model changes is in the radial power and burnup profile. The differences in radial power and burnup profile arise from the presence of significant concentrations of plutonium in MOX fuel, at beginning of life, and their subsequent evolution with burnup. Amongst other effects, plutonium has a greater neutron absorption cross-section than uranium. This paper focuses on the development of a new model for the radial power and burnup profile within a UO2 or MOX fuel rod, in which the underlying fissile isotope concentration distributions are tracked during irradiation. The new model has been incorporated into the ENIGMA-B fuel performance code and has been extended to track the isotopic concentrations of the fission gases, xenon and krypton. The calculated distributions have been validated against results from rod puncture measurements and electron probe micro-analysis (EPMA) linescans, performed during the M501 post irradiation examination (PIE) programme. The predicted gas inventory of the fuel/clad gap is compared with the isotopic composition measured during rod puncture and the measured radial distributions of burnup (from neodymium measurements) and plutonium in the fuel are compared with the calculated distributions. It is shown that there is good agreement between the code predictions and the measurements. (author)
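A minimal sketch of the kind of isotopic tracking described above, reduced to a single toy balance for Pu-239 production and destruction under irradiation. The one-group cross sections, flux, and normalized concentrations below are invented illustrative values; this is not the ENIGMA-B model:

```python
# Toy fissile-isotope tracking: Pu-239 is produced by neutron capture in
# U-238 and destroyed by its own neutron absorption. U-238 depletion is
# neglected for simplicity. All parameter values are illustrative.

def pu239_history(n_u238, flux, sigma_c238, sigma_a239, dt, steps):
    """Explicit-Euler integration of
    dN239/dt = sigma_c238*flux*N238 - sigma_a239*flux*N239."""
    n239 = 0.0
    history = []
    for _ in range(steps):
        n239 += dt * flux * (sigma_c238 * n_u238 - sigma_a239 * n239)
        history.append(n239)
    return history

# Normalized N238 = 1.0; rates chosen so the build-up saturates in-range.
h = pu239_history(n_u238=1.0, flux=1.0, sigma_c238=1e-3, sigma_a239=5e-2,
                  dt=1.0, steps=400)
# N239 rises monotonically toward equilibrium sigma_c238/sigma_a239 = 0.02
```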

  7. Preliminary study to improve the performance of SCWR-M during loss-of-flow accident

    Energy Technology Data Exchange (ETDEWEB)

    Liu, X.J., E-mail: xiaojingliu@sjtu.edu.cn [School of Nuclear Science and Engineering, Shanghai Jiao Tong University, 800 Dong Chuan Road, Shanghai 200240 (China); Sun, C.; Wang, Z.D.; Chai, X.; Xiong, J.B.; Yang, Y.H. [School of Nuclear Science and Engineering, Shanghai Jiao Tong University, 800 Dong Chuan Road, Shanghai 200240 (China); Cheng, X. [Institute of Fusion and Nuclear Technology, Karlsruhe Institute of Technology (KIT), Kaiserstr. 12, 76131 Karlsruhe (Germany)

    2016-10-15

    Highlights: • Validation of the ATHLET-SC code for safety analysis of the SCWR. • Loss of flow accident analysis for the SCWR-M is performed. • The passive safety design parameters are optimized. • The optimized SCWR-M design shows better safety performance. - Abstract: The SCWR-M is a conceptual core design with a mixed neutron spectrum (fast and thermal), developed at Shanghai Jiao Tong University. Preliminary calculations of this new conceptual SCWR indicate that the SCWR-M system has better safety characteristics than other single-spectrum supercritical water-cooled reactors. The loss of flow accident (LOFA) is of particular importance among the abnormal events and accidents for the SCWR-M. As a preliminary study to improve the current SCWR-M safety design, this paper presents validation results for the ATHLET-SC code and optimization of the safety system design parameters of the ICS, ACC, and GDCS based on LOFA analysis. The better performance of the optimized design parameters is demonstrated by comparison with the previous design.

  8. Pre-conceptual-schema-based patterns for deriving key performance indicators from strategic objectives

    Directory of Open Access Journals (Sweden)

    Carlos Mario Zapata Jaramillo

    2017-05-01

    Full Text Available Performance measurement is crucial for achieving business success. Moreover, such success is also related to the fulfillment of the organizational strategic objectives. Hence, an adequate determination of relevant performance indicators—or key performance indicators (KPIs)—and their relationships to organizational objectives is needed. Even though several approaches for treating KPIs and objective-KPI relationships have been proposed, they exhibit drawbacks associated with the lack of reusability and traceability. We attempt to fill this gap by proposing a set of patterns based on pre-conceptual schemas for supporting the systematic derivation of KPIs and their relationships to organizational objectives. In this way, the proposed patterns guarantee a reusable and traceable derivation process of a set of candidate KPIs from organizational strategic objectives. Lastly, we provide a lab study in order to illustrate the usefulness of this proposal.

  9. On the performance of diagonal lattice space-time codes for the quasi-static MIMO channel

    KAUST Repository

    Abediseid, Walid; Alouini, Mohamed-Slim

    2013-01-01

    There has been tremendous work done on designing space-time codes for the quasi-static multiple-input multiple-output (MIMO) channel. All the code designs to date focus on either high performance, high rates, or low-complexity encoding and decoding

  10. Polarity Correspondence: A General Principle for Performance of Speeded Binary Classification Tasks

    Science.gov (United States)

    Proctor, Robert W.; Cho, Yang Seok

    2006-01-01

    Differences in performance with various stimulus-response mappings are among the most prevalent findings for binary choice reaction tasks. The authors show that perceptual or conceptual similarity is not necessary to obtain mapping effects; a type of structural similarity is sufficient. Specifically, stimulus and response alternatives are coded as…

  11. Performance of the OVERFLOW-MLP and LAURA-MLP CFD Codes on the NASA Ames 512 CPU Origin System

    Science.gov (United States)

    Taft, James R.

    2000-01-01

    The shared memory Multi-Level Parallelism (MLP) technique, developed last year at NASA Ames, has been very successful in dramatically improving the performance of important NASA CFD codes. This new and very simple parallel programming technique was first inserted into the OVERFLOW production CFD code in FY 1998. The OVERFLOW-MLP code's parallel performance scaled linearly to 256 CPUs on the NASA Ames 256 CPU Origin 2000 system (steger). Overall performance exceeded 20.1 GFLOP/s, or about 4.5x the performance of a dedicated 16 CPU C90 system. All of this was achieved without any major modification to the original vector based code. The OVERFLOW-MLP code is now in production on the in-house Origin systems as well as being used offsite at commercial aerospace companies. Partially as a result of this work, NASA Ames has purchased a new 512 CPU Origin 2000 system to further test the limits of parallel performance for NASA codes of interest. This paper presents the performance obtained from the latest optimization efforts on this machine for the LAURA-MLP and OVERFLOW-MLP codes. The Langley Aerothermodynamics Upwind Relaxation Algorithm (LAURA) code is a key simulation tool in the development of the next generation shuttle, interplanetary reentry vehicles, and nearly all "X" plane development. This code sustains about 4-5 GFLOP/s on a dedicated 16 CPU C90. At this rate, expected workloads would require over 100 C90 CPU years of computing over the next few calendar years. It is not feasible to expect that this would be affordable or available to the user community. Dramatic performance gains on cheaper systems are needed. This code is expected to be perhaps the largest consumer of NASA Ames compute cycles per run in the coming year. The OVERFLOW CFD code is extensively used in the government and commercial aerospace communities to evaluate new aircraft designs. It is one of the largest consumers of NASA supercomputing cycles and large simulations of highly resolved full

  12. Reliability in the performance-based concept of fib Model Code 2010

    NARCIS (Netherlands)

    Bigaj-van Vliet, A.; Vrouwenvelder, T.

    2013-01-01

    The design philosophy of the new fib Model Code for Concrete Structures 2010 represents the state of the art with regard to performance-based approach to the design and assessment of concrete structures. Given the random nature of quantities determining structural behaviour, the assessment of

  13. An accurate evaluation of the performance of asynchronous DS-CDMA systems with zero-correlation-zone coding in Rayleigh fading

    Science.gov (United States)

    Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.

    2010-04-01

    An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance of generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper and the sequel, new theoretical work is contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system model was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper and the sequel support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding with minimal computational complexity.
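The WH spreading-value observation can be illustrated concretely. Assuming the "spectral spreading value" of a binary sequence is its number of chip sign transitions (our interpretation, adopted here purely for illustration), Sylvester-constructed Walsh-Hadamard codes of length N do take all N values from 0 to N - 1, one per sequence:

```python
# Length-N Walsh-Hadamard codes, built by the Sylvester construction,
# have sign-transition counts that are a permutation of 0..N-1.

def sylvester_hadamard(n):
    """Sylvester-construction Hadamard matrix; n must be a power of two."""
    H = [[1]]
    while len(H) < n:
        # H_{2m} = [[H, H], [H, -H]]
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H

def transitions(row):
    """Number of sign changes between consecutive chips."""
    return sum(1 for a, b in zip(row, row[1:]) if a != b)

H = sylvester_hadamard(8)
print(sorted(transitions(r) for r in H))  # [0, 1, 2, 3, 4, 5, 6, 7]
```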

  14. Intact and impaired conceptual memory processes in amnesia.

    Science.gov (United States)

    Keane, M M; Gabrieli, J D; Monti, L A; Fleischman, D A; Cantor, J M; Noland, J S

    1997-01-01

    To examine the status of conceptual memory processes in amnesia, a conceptual memory task with implicit or explicit task instructions was given to amnesic and control groups. After studying a list of category exemplars, participants saw category labels and were asked to generate as many exemplars as possible (an implicit memory task) or to generate exemplars that had been in the prior study list (an explicit memory task). After incidental deep or shallow encoding of exemplars, amnesic patients showed normal implicit memory performance (priming), a normal levels-of-processing effect on priming, and impaired explicit memory performance. After intentional encoding of exemplars, amnesic patients showed impaired implicit and explicit memory performance. Results suggest that although amnesic patients can show impairments on implicit and explicit conceptual memory tasks, their deficit does not generalize to all conceptual memory tasks.

  15. Performance of an Error Control System with Turbo Codes in Powerline Communications

    Directory of Open Access Journals (Sweden)

    Balbuena-Campuzano Carlos Alberto

    2014-07-01

    Full Text Available This paper reports the performance of turbo codes as an error control technique in PLC (Powerline Communications) data transmissions. For this system, computer simulations are used to model data networks based on the channel model classified in the technical literature as indoor, using OFDM (Orthogonal Frequency Division Multiplexing) as the modulation technique. Taking into account the channel, modulation, and turbo codes, we propose a methodology to minimize the bit error rate (BER) as a function of the average received signal-to-noise ratio (SNR).
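A minimal sketch of the simulation methodology above (BER estimated by Monte Carlo as a function of average SNR), using uncoded BPSK over AWGN as a stand-in baseline. No turbo coding, OFDM, or PLC channel model is included; the function name and parameters are our own illustrative choices:

```python
# Monte Carlo BER estimate for uncoded BPSK over an AWGN channel,
# as a baseline illustration of BER-vs-SNR simulation.

import math
import random

def ber_bpsk_awgn(snr_db, n_bits=200_000, seed=1):
    """Estimate BER at a given Eb/N0 (dB) by transmitting random bits."""
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10)          # Eb/N0 as a linear ratio
    sigma = math.sqrt(1 / (2 * snr))   # noise std dev for unit-energy symbols
    errors = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        symbol = 1.0 if bit else -1.0
        received = symbol + rng.gauss(0, sigma)
        errors += (received > 0) != bool(bit)
    return errors / n_bits

# BER falls as average SNR rises:
low_snr_ber, high_snr_ber = ber_bpsk_awgn(0), ber_bpsk_awgn(6)
```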

  16. Steady State and Transient Fuel Rod Performance Analyses by Pad and Transuranus Codes

    International Nuclear Information System (INIS)

    Slyeptsov, O.; Slyeptsov, S.; Kulish, G.; Ostapov, A.; Chernov, I.

    2013-01-01

    The report, performed under IAEA research contract No. 15370/L2, describes the analysis results of WWER and PWR fuel rod performance at steady state operation and in transients by means of the PAD and TRANSURANUS codes. The TRANSURANUS v1m1j09 code, developed by the Institute for Transuranium Elements (ITU), was used under Licensing Agreement N31302. The PAD 4.0 code, developed by Westinghouse Electric Company, was utilized in the frame of the Ukraine Nuclear Fuel Qualification Project for safety substantiation of the use of Westinghouse fuel assemblies in the mixed core of the WWER-1000 reactor. The experimental data on Russian fuel rod behavior, obtained during steady-state operation in the WWER-440 core of the Kola-3 reactor and during power transients in the core of the MIR research reactor, were taken from the IFPE database of the OECD/NEA and utilized for assessing the codes during simulation of such quantities as fuel burnup, fuel centerline temperature (FCT), fuel swelling, cladding strain, fission gas release (FGR), and rod internal pressure (RIP) in the rod burnup range of (41 - 60) GWD/MTU. The experimental data on fuel behavior at steady-state operation during seven reactor cycles, presented by AREVA for a standard PWR fuel rod design, were used to examine the code's FGR model in the fuel burnup range of (37 - 81) GWD/MTU. (author)

  17. Software Abstractions and Methodologies for HPC Simulation Codes on Future Architectures

    Directory of Open Access Journals (Sweden)

    Anshu Dubey

    2014-07-01

    Full Text Available Simulations with multi-physics modeling have become crucial to many science and engineering fields, and multi-physics capable scientific software is as important to these fields as instruments and facilities are to experimental sciences. The current generation of mature multi-physics codes would have sustainably served their target communities with a modest amount of ongoing investment for enhancing capabilities. However, the revolution occurring in the hardware architecture has made it necessary to tackle the parallelism and performance management in these codes at multiple levels. The requirements of the various levels are often at cross-purposes with one another, and therefore hugely complicate the software design. All of these considerations make it essential to approach this challenge cooperatively as a community. We conducted a series of workshops under an NSF-SI2 conceptualization grant to get input from various stakeholders, and to identify broad approaches that might lead to a solution. In this position paper we detail the major concerns articulated by the application code developers, and emerging trends in the utilization of programming abstractions that we found through these workshops.

  18. CONCEPTUAL METAPHORS IN BASKETBALL DISCOURSE

    Directory of Open Access Journals (Sweden)

    Reda Toleikienė

    2013-04-01

    Full Text Available This article presents part of the research performed within the scope of the national project "Conceptual Metaphors in Public Discourse," financed by the Research Council of Lithuania. The aim of the present paper is to analyze conceptual metaphors in the discourse of the European Basketball Championship which took place in Kaunas, Lithuania in 2011, as well as to determine the source concepts. The analysis allows certain features of the images used in conceptualizing the entities related to basketball to be described. The metaphorical collocations drawn from the Lithuanian language corpora and web portals (www.delfi.lt and www.lrytas.lt) were selected and analyzed from 31 August 2011 to 18 September 2011. A conceptual metaphor is defined as an interaction of two conceptual fields (the source and target concepts). On the basis of the analyzed metaphorical expressions, the reconstructed conceptual metaphors showed that the most prolific metaphors are those of war, ontology, and scale. In basketball discourse, the war metaphor is characterized by the fact that the image 'sport is war' is supplemented by other source concepts (e.g., a person, a building, a thing, a material, a scale). The features of two or sometimes even three source concepts are ascribed to the target concept.

  19. Development of Tokamak reactor system code and conceptual studies of DEMO with He Cooled Molten Li blanket

    International Nuclear Information System (INIS)

    Hong, B.G.; Lee, Dong Won; Kim, Yong Hi

    2007-01-01

    To develop concepts for fusion power plants and identify their design parameters, we have been developing a tokamak reactor system code. The system code can take into account a wide range of plasma physics and technology effects simultaneously, and it can be used to find design parameters that optimize given figures of merit. The outcome of system studies using the code is to identify which areas of plasma physics and technology, and to what extent, should be developed for the realization of a given fusion power plant concept. As an application of the tokamak reactor system code, we investigate the performance of a DEMO for early realization with a limited extension from the plasma physics and technology used in the design of ITER. The main requirements for DEMO are selected as: 1) to demonstrate tritium self-sufficiency, 2) to generate net electricity, and 3) to operate in steady state. The plasma size is assumed to be the same as that of ITER, and the plasma parameters which characterize the performance, i.e. the normalized beta value βN, the confinement improvement factor for the H-mode H, and the ratio to the Greenwald density limit n/nG, are assumed to be improved beyond those of ITER: βN > 2.0, H > 1.0 and n/nG > 1.0. Tritium self-sufficiency is provided by the He Cooled Molten Lithium (HCML) blanket with a total thickness of 2.5 m including the shield. With n/nG > 1.2, net electric power greater than 500 MW is possible with βN > 4.0 and H > 1.2. In accessing the operation space for higher electric power, the main restrictions are the divertor heat load and the steady-state operation requirements. Developments in both plasma physics and technology are required to handle the high heat load and to increase the current drive efficiency. (orig.)

  20. Systemizers Are Better Code-Breakers: Self-Reported Systemizing Predicts Code-Breaking Performance in Expert Hackers and Naïve Participants

    Science.gov (United States)

    Harvey, India; Bolgan, Samuela; Mosca, Daniel; McLean, Colin; Rusconi, Elena

    2016-01-01

    Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders—attention to detail and systemizing—may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings. PMID:27242491

  3. A conceptual design of multidisciplinary-integrated C.F.D. simulation on parallel computers

    International Nuclear Information System (INIS)

    Onishi, Ryoichi; Ohta, Takashi; Kimura, Toshiya.

    1996-11-01

    The design of a parallel aeroelastic code for integrated aircraft simulations is presented. A method for integrating aerodynamics and structural dynamics software on parallel computers is devised, using the Euler/Navier-Stokes equations coupled with wing-box finite element structures. The synthesis of a modern aircraft requires the optimization of aerodynamics, structures, controls, operability, and other design disciplines, and R and D efforts to implement Multidisciplinary Design Optimization environments on high performance computers are under way, especially among the U.S. aerospace industries. This report describes a Multiple Program Multiple Data (MPMD) parallelization of aerodynamics and structural dynamics codes with a dynamically deforming grid. A three-dimensional computation of a flowfield with dynamic deformation caused by structural deformation is performed, and the calculated pressure data are used to compute the structural deformation, which is input again to the fluid dynamics code. This process is repeated, exchanging the computed pressures and deformations between the flowfield grids and the structural elements. It enables simulation of structural motion that takes into account the interaction of fluid and structure. The conceptual design for achieving these functions is reported. Future extensions to incorporate control systems, which would enable simulation of a realistic aircraft configuration as a major tool for Aircraft Integrated Simulation, are also investigated. (author)
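The pressure/deformation exchange loop described above can be sketched as a partitioned fixed-point iteration. In the toy below, one scalar degree of freedom stands in for the flowfield grid and the wing-box finite elements; the stand-in functions and all constants are illustrative assumptions, not the report's solvers:

```python
# Toy partitioned fluid-structure coupling: the "fluid solve" returns a
# pressure for the current deformation, the "structural solve" returns a
# new deformation for that pressure, and the pair is iterated to a
# converged coupled state.

def fluid_pressure(deformation):
    """Stand-in 'aerodynamics code': load decreases as the wing deforms."""
    return 100.0 - 20.0 * deformation

def structural_deformation(pressure, stiffness=50.0):
    """Stand-in 'structures code': linear elastic response d = p / k."""
    return pressure / stiffness

def coupled_solve(tol=1e-10, max_iter=100):
    d = 0.0
    for i in range(max_iter):
        p = fluid_pressure(d)               # fluid step on current geometry
        d_new = structural_deformation(p)   # structure step on new load
        if abs(d_new - d) < tol:
            return d_new, i + 1
        d = d_new
    raise RuntimeError("coupling iteration did not converge")

d, iters = coupled_solve()
# Converges to the fixed point d = (100 - 20*d)/50, i.e. d = 100/70
```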

  4. Reactivity Insertion Accident (RIA) Capability Status in the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Folsom, Charles Pearson [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pastore, Giovanni [Idaho National Lab. (INL), Idaho Falls, ID (United States); Veeraraghavan, Swetha [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-05-01

    One of the Challenge Problems being considered within CASL relates to modelling and simulation of Light Water Reactor (LWR) fuel under Reactivity Insertion Accident (RIA) conditions. BISON is the fuel performance code used within CASL for LWR fuel under both normal operating and accident conditions, and thus must be capable of addressing the RIA challenge problem. This report outlines required BISON capabilities for RIAs and describes the current status of the code. Information on recent accident capability enhancements, application of BISON to a RIA benchmark exercise, and plans for validation to RIA behavior are included.

  5. Solutions to HYDROCOIN [Hydrologic Code Intercomparison] Level 1 problems using STOKES and PARTICLE (Cases 1,2,4,7)

    International Nuclear Information System (INIS)

    Gureghian, A.B.; Andrews, A.; Steidl, S.B.; Brandstetter, A.

    1987-10-01

    HYDROCOIN (Hydrologic Code Intercomparison) Level 1 benchmark problems are solved using the finite element ground-water flow code STOKES and the pathline generating code PARTICLE developed for the Office of Crystalline Repository Development (OCRD). The objective of the Level 1 benchmark problems is to verify the numerical accuracy of ground-water flow codes by intercomparison of their results with analytical solutions and other numerical computer codes. Seven test cases were proposed for Level 1 to the Swedish Nuclear Power Inspectorate, the managing participant of HYDROCOIN. Cases 1, 2, 4, and 7 were selected by OCRD because of their appropriateness to the nature of crystalline repository hydrologic performance. The background relevance, conceptual model, and assumptions of each case are presented. The governing equations, boundary conditions, input parameters, and the solution schemes applied to each case are discussed. The results are shown in graphic and tabular form with concluding remarks. The results demonstrate the two-dimensional verification of STOKES and PARTICLE. 5 refs., 61 figs., 30 tabs
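Conceptually, a pathline-generating code such as PARTICLE advects markers through the computed velocity field by integrating dx/dt = v(x). The sketch below uses a uniform field and explicit Euler stepping as illustrative assumptions, not the HYDROCOIN case configurations:

```python
# Minimal 2-D particle tracking: integrate a marker position through a
# user-supplied velocity field with explicit Euler steps.

def trace_pathline(x0, y0, velocity, dt=0.1, steps=100):
    """Return the list of (x, y) positions along the pathline."""
    path = [(x0, y0)]
    x, y = x0, y0
    for _ in range(steps):
        vx, vy = velocity(x, y)
        x, y = x + dt * vx, y + dt * vy
        path.append((x, y))
    return path

# Uniform ground-water flow: the pathline is a straight line.
uniform = lambda x, y: (1.0, 0.5)
path = trace_pathline(0.0, 0.0, uniform)
print(f"({path[-1][0]:.1f}, {path[-1][1]:.1f})")  # (10.0, 5.0)
```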

  6. IS ENVIRONMENTAL ALIGNMENT AND BUSINESS PERFORMANCE: A CONCEPTUAL APPROACH

    Directory of Open Access Journals (Sweden)

    K. Garg

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: This paper proposes a conceptual model termed "IS environmental alignment" that focuses on the support provided by IS strategy to minimize the gap between perceived environmental uncertainty and realized/objective environmental conditions. The model uses the Chan et al [9] alignment measurement method to measure IS strategic alignment as it provides a quantitative measure. In due course the proposed model would be tested in industry to examine the effect of IS environmental alignment on business performance. The implication of the model lies in the effective use of deployed IS systems by organizations.

    AFRIKAANSE OPSOMMING (translated): A conceptual model is presented that deals with "IS environmental alignment": IS strategy support in minimizing the gap between perceived environmental uncertainty and realized/objective environmental conditions. The model uses the Chan et al [9] alignment measurement method to determine IS strategic alignment on a quantitative basis. In due course the conceptual model will be tested in practice to show how IS strategic alignment affects business performance. The effective deployment of IS systems in enterprises is advocated.

  7. Assessing alternative conceptual models of fracture flow

    International Nuclear Information System (INIS)

    Ho, C.K.

    1995-01-01

    The numerical code TOUGH2 was used to assess alternative conceptual models of fracture flow. The models that were considered included the equivalent continuum model (ECM) and the dual permeability (DK) model. A one-dimensional, layered, unsaturated domain was studied with a saturated bottom boundary and a constant infiltration at the top boundary. Two different infiltration rates were used in the studies. In addition, the connection areas between the fracture and matrix elements in the dual permeability model were varied. Results showed that the two conceptual models of fracture flow produced different saturation and velocity profiles, even under steady-state conditions. The magnitudes of the discrepancies were sensitive to two parameters that affected the flux between the fractures and matrix in the dual permeability model: (1) the fracture-matrix connection areas and (2) the capillary pressure gradients between the fracture and matrix elements.

  8. Conceptual Kinematic Design and Performance Evaluation of a Chameleon-Like Service Robot for Space Stations

    Directory of Open Access Journals (Sweden)

    Marco Ceccarelli

    2015-03-01

    Full Text Available In this paper a conceptual kinematic design of a chameleon-like robot with proper mobility capacity is presented for service applications in space stations, as a result of design considerations with biomimetic inspiration from chameleons. Requirements and characteristics are discussed with the aim of identifying design problems and operation features. A study of feasibility is described through performance evaluation, using simulations for a basic operation characterization.

  9. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    International Nuclear Information System (INIS)

    McGrail, B.P.; Mahoney, L.A.

    1995-10-01

    Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest ranked computer code was found to be the ARES-CT code, developed at PNL for the US Department of Energy for the evaluation of land disposal sites.

  10. Direct and Conceptual Replications of Burgmer & Englich (2012): Power May Have Little to No Effect on Motor Performance.

    Directory of Open Access Journals (Sweden)

    Margaret Cusack

    Full Text Available Burgmer and Englich (2012) have reported that manipulating feelings of power can substantially improve performance on two motor tasks: golf and darts. We conducted two high-powered direct replications of the effects of power on golf, two online conceptual replications using mirror-tracing as a performance measure, and an additional conceptual replication using a cognitive performance measure (word-search). Overall, we found little to no effect of power on motor skill (d = 0.09, 95% CI [-0.07, 0.22], n = 603). We varied task difficulty, re-analyzed data without participants showing weak responses on manipulation checks, and tried adjusting performance scores for age, gender, and initial task skill. None of these secondary analyses revealed a strong effect of power on performance. A meta-analysis integrating our data with Burgmer & Englich leaves open the possibility that manipulating power could provide a modest boost in motor skill (d = 0.19, 95% CI [0.001, 0.38], n = 685). Unfortunately, the pattern of performance changes we observed was unrelated to group differences in perceived and rated power, suggesting that what motor effects do occur with this protocol may not be directly related to the construct of power. [Burgmer, P., & Englich, B. (2012). Bullseye! How Power Improves Motor Performance. Social Psychological and Personality Science, 4(2), 224-232.]
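    The effect sizes reported above (e.g., d = 0.09, 95% CI [-0.07, 0.22]) are standardized mean differences with confidence intervals. As an illustration only, a Cohen's d with a normal-approximation 95% CI can be sketched as follows; the function name and input values are hypothetical and not taken from the study:

```python
import math

def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    """Cohen's d (pooled SD) with a normal-approximation 95% CI."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd
    # Approximate standard error of d (Hedges & Olkin form)
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Illustrative placeholder numbers, not the study's data
d, ci = cohens_d(10.5, 10.0, 2.0, 2.0, 300, 303)
```

A CI that spans zero, as in the replications above, indicates the data are consistent with no effect.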

  11. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low-Density Parity-Check Accumulate (LDPCA) codes in a DSC scheme with feedback. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  12. Understanding Notional Machines through Traditional Teaching with Conceptual Contraposition and Program Memory Tracing

    Directory of Open Access Journals (Sweden)

    Jeisson Hidalgo-Céspedes

    2016-08-01

    Full Text Available A correct understanding of how computers run code is mandatory in order to effectively learn to program. Lectures have historically been used in programming courses to teach how computers execute code, and students are assessed through traditional evaluation methods, such as exams. Constructivism learning theory objects to students’ passiveness during lessons, and to traditional quantitative methods for evaluating a complex cognitive process such as understanding. Constructivism proposes complementary techniques, such as conceptual contraposition and colloquies. We enriched lectures of a “Programming II” (CS2 course by combining conceptual contraposition with program memory tracing, then we evaluated students’ understanding of programming concepts through colloquies. Results revealed that these techniques applied to the lecture are insufficient to help students develop satisfactory mental models of the C++ notional machine, and colloquies behaved as the most comprehensive traditional evaluations conducted in the course.

  13. Performance Analysis of Spectral Amplitude Coding Based OCDMA System with Gain and Splitter Mismatch

    Science.gov (United States)

    Umrani, Fahim A.; Umrani, A. Waheed; Umrani, Naveed A.; Memon, Kehkashan A.; Kalwar, Imtiaz Hussain

    2013-09-01

    This paper presents a practical analysis of optical code-division multiple-access (O-CDMA) systems based on perfect difference codes. The work uses an SNR criterion to select the optimal value of avalanche photodiode (APD) gain and shows how mismatch in the splitters and in the APD gains used in the transmitters and receivers of the network can degrade the BER performance of the system. The investigations also reveal that higher APD gains are not suitable for such systems, even at higher powers. The system performance, with consideration of shot noise, thermal noise, and bulk and surface leakage currents, is also investigated.
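    The idea behind such an SNR criterion can be illustrated with a generic shot-plus-thermal-noise receiver model: the signal grows with the square of the APD gain, but the excess-noise factor makes shot noise grow faster, so the SNR peaks at a moderate gain. The sketch below is not the authors' model; all parameter values are placeholders chosen only to show the shape of the trade-off.

```python
import math

q = 1.602e-19    # electron charge (C)
k_B = 1.381e-23  # Boltzmann constant (J/K)

def apd_snr(gain, power=1e-6, resp=0.8, k_ion=0.7, bandwidth=1e9,
            temp=300.0, load=50.0):
    """Illustrative shot+thermal SNR of an APD receiver vs. gain.
    All parameter values are placeholders, not taken from the paper."""
    photocurrent = resp * power
    # Excess-noise factor F(M) = kM + (1 - k)(2 - 1/M)
    excess = k_ion * gain + (1 - k_ion) * (2 - 1 / gain)
    shot = 2 * q * bandwidth * gain**2 * excess * photocurrent
    thermal = 4 * k_B * temp * bandwidth / load
    return (gain * photocurrent) ** 2 / (shot + thermal)

# Sweep the gain and keep the SNR-optimal value
best_gain = max(range(1, 201), key=apd_snr)
```

With these placeholder numbers the optimum falls well inside the swept range, consistent with the paper's observation that very high APD gains are not beneficial.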

  14. A performance assessment review tool for the proposed radioactive waste repository at Yucca Mountain, Nevada, USA

    International Nuclear Information System (INIS)

    Mohanty, Sitakanta; Codell, Richard

    2000-01-01

    The U.S. Nuclear Regulatory Commission (NRC), with the assistance of the Center for Nuclear Waste Regulatory Analyses, has developed a Total-system Performance Assessment (TPA) Code to assist in evaluating the performance of the Yucca Mountain (YM) High-Level Waste Repository in Nevada, proposed by the U.S. Department of Energy (DOE). The proposed YM repository would be built in a thick sequence of partially saturated volcanic tuff above the water table. Among the unique challenges of this environment are that (1) the transport of radionuclides would take place partially through highly heterogeneous unsaturated rock; (2) the waste packages (WPs) would be generally exposed to oxidizing conditions; and (3) water either infiltrating from the surface or recirculating because of decay heat may drip onto the WPs. Tools such as the TPA code and embedded techniques for evaluating YM performance are aimed at (1) determining the parameters and key parts of the repository system that have the most influence on repository performance; (2) performing alternative conceptual model studies, especially with bounding models; (3) estimating the relative importance of the physical phenomena that lead to human exposure to radionuclides; and (4) improving NRC staff capabilities in performance assessment and associated license application reviews. This paper presents an overview of the NRC conceptual framework, the approach to conducting system-level sensitivity analyses for determining influential parameters, and alternative conceptual model studies to investigate the effect of model uncertainties. (author)

  15. Application of core structural design guidelines in conceptual fuel pin design

    International Nuclear Information System (INIS)

    Patel, M.R.; Stephen, J.D.

    1979-01-01

    The paper describes an application of the Draft RDT Standards F9-7, -8, and -9 to the conceptual design of Fast Breeder Reactor (FBR) fuel pins. The Standards are being developed to provide guidelines for structural analysis and design of the FBR core components, which have limited ductility at high fluences and are not addressed by the prevalent codes. The development is guided by a national working group sponsored by the Division of Reactor Research and Technology of the Department of Energy. The development program summarized in the paper includes establishment of design margins consistent with the test data and component performance requirements, and application of the design rules in various design activities. The application program ensures that the quantities required for proper application of the design rules are available from the analysis methods and test data, and that the use of the same design rules in different analysis tools used at different stages of a component design produces consistent results. This is illustrated in the paper by application of the design rules in the analysis methods developed for conceptual and more detailed designs of an FBR fuel pin.

  16. Evaluation of CFETR as a Fusion Nuclear Science Facility using multiple system codes

    Science.gov (United States)

    Chan, V. S.; Costley, A. E.; Wan, B. N.; Garofalo, A. M.; Leuer, J. A.

    2015-02-01

    This paper presents the results of a multi-system codes benchmarking study of the recently published China Fusion Engineering Test Reactor (CFETR) pre-conceptual design (Wan et al 2014 IEEE Trans. Plasma Sci. 42 495). Two system codes, General Atomics System Code (GASC) and Tokamak Energy System Code (TESC), using different methodologies to arrive at CFETR performance parameters under the same CFETR constraints, show that the correlation between the physics performance and the fusion performance is consistent, and the computed parameters are in good agreement. Optimization of the first wall surface for tritium breeding and the minimization of the machine size are highly compatible. Variations of the plasma currents and profiles lead to changes in the required normalized physics performance; however, they do not significantly affect the optimized size of the machine. GASC and TESC have also been used to explore a lower aspect ratio, larger volume plasma taking advantage of the engineering flexibility in the CFETR design. Assuming the ITER steady-state scenario physics, the larger plasma together with a moderately higher BT and Ip can result in a high gain Qfus ˜ 12, Pfus ˜ 1 GW machine approaching DEMO-like performance. It is concluded that the CFETR baseline mode can meet the minimum goal of the Fusion Nuclear Science Facility (FNSF) mission and that advanced physics will enable it to address comprehensively the outstanding critical technology gaps on the path to a demonstration reactor (DEMO). Before proceeding with CFETR construction, steady-state operation has to be demonstrated, further development is needed to solve the divertor heat load issue, and blankets have to be designed with tritium breeding ratio (TBR) >1 as a target.

  17. Performance of automated and manual coding systems for occupational data: a case study of historical records.

    Science.gov (United States)

    Patel, Mehul D; Rose, Kathryn M; Owens, Cindy R; Bang, Heejung; Kaufman, Jay S

    2012-03-01

    Occupational data are a common source of workplace exposure and socioeconomic information in epidemiologic research. We compared the performance of two occupation coding methods, automated software and a manual coder, using occupation and industry titles from U.S. historical records. We collected parental occupational data from 1920-40s birth certificates, Census records, and city directories on 3,135 deceased individuals in the Atherosclerosis Risk in Communities (ARIC) study. Unique occupation-industry narratives were assigned codes by a manual coder and by the Standardized Occupation and Industry Coding software program. We calculated agreement between coding methods of classification into major Census occupational groups. Automated coding software assigned codes to 71% of occupations and 76% of industries. Of this subset coded by software, 73% of occupation codes and 69% of industry codes matched between automated and manual coding. For major occupational groups, agreement improved to 89% (kappa = 0.86). Automated occupational coding is a cost-efficient alternative to manual coding. However, some manual coding is required to code incomplete information. We found substantial variability between coders in the assignment of occupations, although not as large for major groups.
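    The agreement statistic cited above (kappa = 0.86) is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. A minimal sketch of the computation from a coder-by-coder confusion matrix; the 2×2 matrix in the example is made up for illustration, not the study's data:

```python
def cohens_kappa(confusion):
    """Cohen's kappa for inter-coder agreement.
    `confusion` is a square matrix: rows = coder A's categories,
    columns = coder B's categories, cells = counts."""
    total = sum(sum(row) for row in confusion)
    # Observed agreement: proportion on the diagonal
    p_observed = sum(confusion[i][i] for i in range(len(confusion))) / total
    # Chance agreement from the marginal category frequencies
    row_totals = [sum(row) for row in confusion]
    col_totals = [sum(col) for col in zip(*confusion)]
    p_expected = sum(r * c for r, c in zip(row_totals, col_totals)) / total**2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical 2-category example: 70% raw agreement, kappa = 0.4
k = cohens_kappa([[20, 5], [10, 15]])
```

Kappa near 1 indicates agreement well beyond chance, which is why the 89% raw agreement on major groups corresponds to kappa = 0.86.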

  18. Performance optimization of PM-16QAM transmission system enabled by real-time self-adaptive coding.

    Science.gov (United States)

    Qu, Zhen; Li, Yao; Mo, Weiyang; Yang, Mingwei; Zhu, Shengxiang; Kilper, Daniel C; Djordjevic, Ivan B

    2017-10-15

    We experimentally demonstrate self-adaptive coded 5×100 Gb/s WDM polarization multiplexed 16 quadrature amplitude modulation transmission over a 100 km fiber link, which is enabled by a real-time control plane. The real-time optical signal-to-noise ratio (OSNR) is measured using an optical performance monitoring device. The OSNR measurement is processed and fed back using control plane logic and messaging to the transmitter side for code adaptation, where the binary data are adaptively encoded with three types of large-girth low-density parity-check (LDPC) codes with code rates of 0.8, 0.75, and 0.7. The total code-adaptation latency is measured to be 2273 ms. Compared with transmission without adaptation, average net capacity improvements of 102%, 36%, and 7.5% are obtained, respectively, by adaptive LDPC coding.
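    The feedback adaptation described above (measure OSNR, then choose among code rates 0.8, 0.75, and 0.7) amounts to a threshold lookup in the control plane: the highest rate whose OSNR requirement is met is selected. A minimal sketch of that logic; the OSNR threshold values below are invented placeholders, not the experiment's actual switching points:

```python
def select_code_rate(osnr_db,
                     thresholds=((18.0, 0.8), (15.0, 0.75), (0.0, 0.7))):
    """Pick the highest LDPC code rate whose OSNR threshold is met.
    `thresholds` maps a minimum OSNR (dB) to a code rate, ordered from
    highest rate to lowest; the dB values here are illustrative only."""
    for min_osnr, rate in thresholds:
        if osnr_db >= min_osnr:
            return rate
    # Fall back to the most robust (lowest-rate) code
    return thresholds[-1][1]
```

Higher measured OSNR thus yields a higher-rate (higher-throughput) code, while degraded links fall back to stronger coding, which is the source of the net capacity gains reported.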

  19. A conceptual gamma shield design using the DRP model computation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, E E [Reactor Department, Nuclear Research Center, Atomic Energy Authority, Cairo (Egypt); Rahman, F A [National Center of Nuclear Safety and Radiation Control, Atomic Energy Authority, Cairo (Egypt)

    1997-12-31

    The purpose of this investigation is to assess basic areas of concern in the development of reactor shielding conceptual design calculations. A spherical shield model composed of low-carbon steel and lead has been constructed to surround a Co-60 gamma point source. Two alternative configurations have been considered in the model computation. The numerical calculations have been performed using both the ANISN code and the DRP model computation, together with the DLC 75-Bugle 80 data library. A resume of results for deep penetration in different shield materials with different packing densities is presented and analysed. The results showed that gamma flux attenuation increases with increasing packing density of the shield material, which reflects the importance of considering packing density as a safety parameter in shielding design. 3 figs.

  20. Conceptual design study of Hyb-WT as fusion–fission hybrid reactor for waste transmutation

    International Nuclear Information System (INIS)

    Siddique, Muhammad Tariq; Kim, Myung Hyun

    2014-01-01

    Highlights: • Conceptual design study of a fusion-fission hybrid reactor for waste transmutation. • MCNPX and MONTEBURNS are compared for the transmutation performance of Hyb-WT. • Detailed neutronic performance of the final optimized Hyb-WT design is analyzed. • A new tube-in-duct core design is implemented and compared with a pin-type design. • The study shows many aspects of the hybrid reactor even though the scope was limited to neutronic analysis. - Abstract: This study proposes a conceptual design of a hybrid reactor for waste transmutation (Hyb-WT). The design of Hyb-WT is based on a low-power tokamak (less than 150 MWt) and an annular ring-shaped reactor core with metal fuel (TRU 60 w/o, Zr 40 w/o) and a fission product (FP) zone. The computational code systems MONTEBURNS and MCNPX2.6 are investigated for their suitability in evaluating the performance of Hyb-WT. The overall design performance of the proposed reactor is determined by considering pin-type and tube-in-duct core designs. The objective of such consideration is to explore the possibilities for enhanced transmutation with reduced wall loading from fusion neutrons and reduced transuranic (TRU) inventory. TRU and FP depletion is analyzed by calculating the waste transmutation ratio, mass burned per full power year (in units of kg/fpy), and support ratio. The radiotoxicity analysis of TRUs and FPs is performed by calculating the percentage of toxicity reduction in TRU and FP over a burn cycle.

  1. Conceptual cask design with burnup credit

    International Nuclear Information System (INIS)

    Lee, Seong Hee; Ahn, Joon Gi; Hwang, Hae Ryong

    2003-01-01

    Conceptual design has been performed for a spent fuel transport cask with burnup credit and a neutron-absorbing material to maximize transportation capacity. Both fresh and burned fuel are assumed to be stored in the cask and boral and borated stainless steel are selected for the neutron-absorbing materials. Three different sizes of cask with typical 14, 21 and 52 PWR fuel assemblies are modeled and analyzed with the SCALE 4.4 code system. In this analysis, the biases and uncertainties through validation calculations for both isotopic predictions and criticality calculation for the spent fuel have been taken into account. All of the reactor operating parameters, such as moderator density, soluble boron concentration, fuel temperature, specific power, and operating history, have been selected in a conservative way for the criticality analysis. Two different burnup credit loading curves are developed for boral and borated stainless steel absorbing materials. It is concluded that the spent fuel transport cask design with burnup credit is feasible and is expected to increase cask payloads. (author)

  2. Performance of super-orthogonal space-time trellis code in a multipath environment

    CSIR Research Space (South Africa)

    Sokoya, OA

    2007-09-01

    Full Text Available This paper investigates the performance of Super-Orthogonal Space-time Trellis Code (SOSTTC) designed primarily for non-frequency selective (i.e. flat) fading channel but now applied to a frequency selective fading channel. A new decoding trellis...

  3. KALIMER-600-clad Core Fuel Assembly Calculation using MATRA-LMR (V2.0) Code

    International Nuclear Information System (INIS)

    Kim, Young Gyun; Kim, Young Il

    2006-12-01

    Since the sodium boiling point is very high, maximum cladding and pin temperatures are used as design limit conditions in a sodium-cooled liquid metal reactor. It is necessary to predict accurately the temperature distribution in the core and in the subassemblies to increase the sodium coolant efficiency. Based on the MATRA code, which was developed for PWR analysis, MATRA-LMR has been developed for SFR. The major modifications are: the sodium properties table is implemented as a subprogram in the code; heat transfer coefficients are changed for SFR; and the pressure drop correlations are changed for more accurate calculations, using the Novendstern, Chiu-Rohsenow-Todreas, and Cheng-Todreas correlations. This report briefly describes the code structure and equations of MATRA-LMR (Version 2.0), explains input data preparation, and shows some calculation results for the KALIMER-600-clad core fuel assembly, for which the conceptual design of the core was performed in 2006.
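    The correlations named above (Novendstern, Chiu-Rohsenow-Todreas, Cheng-Todreas) supply a friction factor for wire-wrapped rod bundles; the frictional pressure drop itself then follows the standard Darcy-Weisbach form. A generic sketch of that final step, with illustrative sodium-like numbers rather than MATRA-LMR design data:

```python
def friction_pressure_drop(f, length, d_hyd, density, velocity):
    """Darcy-Weisbach frictional pressure drop (Pa).
    f        : friction factor from a bundle correlation (dimensionless)
    length   : channel length (m)
    d_hyd    : hydraulic diameter (m)
    density  : coolant density (kg/m^3)
    velocity : coolant velocity (m/s)"""
    return f * (length / d_hyd) * density * velocity**2 / 2.0

# Placeholder values loosely representative of a sodium subchannel
dp = friction_pressure_drop(f=0.02, length=1.0, d_hyd=0.01,
                            density=850.0, velocity=5.0)
```

The correlations differ only in how they compute `f` from Reynolds number and bundle geometry; the pressure-drop form stays the same.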

  4. Conceptual design of helium experimental loop

    International Nuclear Information System (INIS)

    Yu Xingfu; Feng Kaiming

    2007-01-01

    In a future demonstration fusion power station (DEMO), helium is envisaged as the coolant for plasma-facing components, such as the blanket and divertor. All these components have a very complex geometry, with many parallel cooling channels, involving a complex helium flow distribution. Test blanket modules (TBM) of this concept will undergo various tests in the experimental reactor ITER. For the qualification of TBM, it is indispensable to test mock-ups in a helium loop under realistic pressure and temperature profiles, in order to validate design codes, especially regarding mass flow and heat transfer processes in narrow cooling channels. Similar testing must be performed for the DEMO blanket, currently under development. A Helium Experimental Loop (HELOOP) is planned to be built for TBM tests. The design parameters are a temperature of 550 degree C, a pressure of 10 MPa, and a flow rate of 1 kg/s. In particular, HELOOP is able to: perform full-scale tests of TBM under realistic conditions; test other components of the He-cooling system in ITER; qualify the purification circuit; and obtain information for the design of the ITER cooling system. The main requirements and characteristics of the HELOOP facility and a preliminary conceptual design are described in the paper. (authors)

  5. A Conceptual Framework to Measure Systems’ Performance during Emergency Preparedness Exercises

    Directory of Open Access Journals (Sweden)

    Elena Savoia

    2014-09-01

    Full Text Available Large-scale public health emergencies require a sophisticated, coordinated response involving multiple entities to protect health and minimize suffering. However, the rarity of such emergencies presents a barrier to gathering observational data about the effectiveness of the public health response before such events occur. For this reason, public health practitioners increasingly have relied on simulated emergencies, known as “exercises”, as proxies to test their emergency capabilities. However, the formal evaluation of performance in these exercises has historically been inconsistent, and there is little research to describe how data acquired from simulated emergencies actually support conclusions about the quality of the public health emergency response system. Over the past six years, we have designed and evaluated more than seventy public health emergency exercises, collaborating with public health agencies, hospitals and others to test a wide variety of systems and their capabilities. Using the data and experience that we gathered, we have developed a conceptual framework that describes the essential elements necessary to consider when applying performance measurement science to public health emergency exercises. We suggest that this framework may assist practitioners and researchers who wish to better measure performance in exercises and to improve public health emergency preparedness.

  6. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM) codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
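    In a systematic LDGM code, the codeword is the message followed by parity bits, and each parity bit is the XOR of a small set of message bits given by one sparse column of the generator matrix. A minimal GF(2) encoding sketch; the tiny generator in the example is purely illustrative, not from the paper:

```python
def ldgm_encode(bits, gen_cols):
    """Systematic LDGM encoding over GF(2): append one parity bit per
    sparse generator column, each the XOR of the listed message-bit
    positions. Sparsity of `gen_cols` is what makes the code 'LDGM'."""
    parity = []
    for rows in gen_cols:
        p = 0
        for i in rows:
            p ^= bits[i]
        parity.append(p)
    return bits + parity

# 4 message bits, 3 sparse parity columns (each touching 2 message bits)
codeword = ldgm_encode([1, 0, 1, 1], [[0, 1], [2, 3], [0, 3]])
```

Because each parity check involves only a few bits, encoding is linear-time, which is the practical appeal of LDGM over generic linear codes.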

  7. Contributions of Sensory Coding and Attentional Control to Individual Differences in Performance in Spatial Auditory Selective Attention Tasks.

    Science.gov (United States)

    Dai, Lengshi; Shinn-Cunningham, Barbara G

    2016-01-01

    Listeners with normal hearing thresholds (NHTs) differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in the cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials (ERPs) from the scalp (reflecting cortical responses to sound) and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. 
These results support the hypothesis that behavioral abilities amongst listeners with NHTs can arise due to both subcortical coding differences and differences in attentional control, depending on stimulus characteristics.

  8. Contributions of sensory coding and attentional control to individual differences in performance in spatial auditory selective attention tasks

    Directory of Open Access Journals (Sweden)

    Lengshi Dai

    2016-10-01

    Full Text Available Listeners with normal hearing thresholds differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials from the scalp (ERPs, reflecting cortical responses to sound), and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. These results support the hypothesis that behavioral abilities amongst listeners with normal hearing thresholds can arise due to both subcortical coding differences and differences in attentional control, depending on stimulus characteristics.

  9. Performance Evaluation of a Novel Optimization Sequential Algorithm (SeQ) Code for FTTH Network

    Directory of Open Access Journals (Sweden)

    Fazlina C.A.S.

    2017-01-01

    Full Text Available The SeQ code has advantages such as a variable cross-correlation property at any given number of users and weights, and it effectively suppresses the impact of phase-induced intensity noise (PIIN) while providing a multiple access interference (MAI) cancellation property. The results revealed that, at a system performance of BER = 10^-9, the SeQ code is capable of achieving 1 Gbps up to 60 km.

  10. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Science.gov (United States)

    Hendricks, Eric S.

    2016-01-01

    The prediction of turbomachinery performance characteristics is an important part of the conceptual aircraft engine design process. During this phase, the designer must examine the effects of a large number of turbomachinery design parameters to determine their impact on overall engine performance and weight. The lack of detailed design information available in this phase necessitates the use of simpler meanline and streamline methods to determine the turbomachinery geometry characteristics and provide performance estimates prior to more detailed CFD (Computational Fluid Dynamics) analyses. While a number of analysis codes have been developed for this purpose, most are written in outdated software languages and may be difficult or impossible to apply to new, unconventional designs. The Object-Oriented Turbomachinery Analysis Code (OTAC) is currently being developed at NASA Glenn Research Center to provide a flexible meanline and streamline analysis capability in a modern object-oriented language. During the development and validation of OTAC, a limitation was identified in the code's ability to analyze and converge turbines as the flow approached choking. This paper describes a series of changes which can be made to typical OTAC turbine meanline models to enable the assessment of choked flow up to limit load conditions. Results produced with this revised model setup are provided in the form of turbine performance maps and are compared to published maps.

  11. Design and Performance Assessment of a Conceptual Cover Cap of Near Surface Repository for Short Lived Low and Intermediate Level Radioactive Waste

    Energy Technology Data Exchange (ETDEWEB)

    Asaba, Ruth; Kim, Changlak [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2014-05-15

    The safety of the overall waste disposal system is determined by the performance of its individual components: waste form, waste container, engineered barrier and the host environment. The design of the cover cap helps in preventing percolation of water, and in retarding radionuclide migration from the disposal facility to the biosphere. The cover of a disposal facility is usually a combination of materials such as sand, gravel, concrete, clay and soil conditioned for vegetation growth. The cover system will be designed using models such as EPA's Hydrological Evaluation of Landfill Performance (HELP) code. This paper describes a conceptual design for a cover cap for a landfill as the preferred disposal facility for low and short lived intermediate radioactive waste in Uganda. The majority of the waste is generated from disused sealed and unsealed radioactive sources arising from medical and industrial applications, and research. Radioactive waste management has raised a lot of concern in both developed and developing countries. Each country has a responsibility to come up with a solution to prevent contamination of the environment and humans from radioactive waste. It is important to have thicker soil layers in cap designs so as to support vegetation growth, since processes such as erosion and settlement are expected. HELP simulations in this study will help demonstrate that it is possible to design a cover cap which can contain radioactive waste packages for hundreds of years, provided the proper institutional and performance monitoring schemes are implemented.
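The percolation estimate that a HELP-style simulation produces can be illustrated with a much-simplified annual water balance. This is only a sketch with invented numbers; the actual HELP code models daily weather, layered soil hydraulics, lateral drainage and vegetation explicitly.

```python
# Simplified annual water balance for a landfill cover cap, in mm/yr.
# All parameter values are hypothetical illustrations, not HELP inputs.

def percolation(precip, runoff_coeff, evapotranspiration, storage_change):
    """Water not shed, evaporated or stored must percolate downward."""
    runoff = runoff_coeff * precip
    return precip - runoff - evapotranspiration - storage_change

# Example: 1200 mm/yr rainfall, 25% runoff, 700 mm/yr ET, no net storage change.
print(percolation(1200.0, 0.25, 700.0, 0.0))  # -> 200.0 (mm/yr reaching the barrier)
```

A thicker soil layer mainly acts through the evapotranspiration and storage terms, which is why the abstract stresses soil depth for vegetation growth.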

  12. Leveraging Quick Response Code Technology to Facilitate Simulation-Based Leaderboard Competition.

    Science.gov (United States)

    Chang, Todd P; Doughty, Cara B; Mitchell, Diana; Rutledge, Chrystal; Auerbach, Marc A; Frisell, Karin; Jani, Priti; Kessler, David O; Wolfe, Heather; MacKinnon, Ralph J; Dewan, Maya; Pirie, Jonathan; Lemke, Daniel; Khattab, Mona; Tofil, Nancy; Nagamuthu, Chenthila; Walsh, Catharine M

    2018-02-01

    Leaderboards provide feedback on relative performance and a competitive atmosphere for both self-guided improvement and social comparison. Because simulation can provide substantial quantitative participant feedback, leaderboards can be used, not only locally but also in a multidepartment, multicenter fashion. Quick Response (QR) codes can be integrated to allow participants to access and upload data. We present the development, implementation, and initial evaluation of an online leaderboard employing principles of gamification using points, badges, and leaderboards designed to enhance competition among healthcare providers. This article details the fundamentals behind the development and implementation of a user-friendly, online, multinational leaderboard that employs principles of gamification to enhance competition and integrates a QR code system to promote both self-reporting of performance data and data integrity. An open-ended survey was administered to capture perceptions of leaderboard implementation. Conceptual step-by-step instructions detailing how to apply the QR code system to any leaderboard using simulated or real performance metrics are outlined using an illustrative example of a leaderboard that employed simulated cardiopulmonary resuscitation performance scores to compare participants across 17 hospitals in 4 countries for 16 months. The following three major descriptive categories that captured perceptions of leaderboard implementation emerged from initial evaluation data from 10 sites: (1) competition, (2) longevity, and (3) perceived deficits. A well-designed leaderboard should be user-friendly and encompass best practices in gamification principles while collecting and storing data for research analyses. Easy storage and export of data allow for longitudinal record keeping that can be leveraged both to track compliance and to enable social competition.
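The points-and-ranking mechanics behind such a leaderboard can be sketched in a few lines. The field names and the mean-score ranking rule below are invented for illustration; the multicenter leaderboard described above defines its own scoring.

```python
# Minimal leaderboard sketch: aggregate self-reported simulation scores per site
# and rank sites. Scoring rule and field names are hypothetical illustrations.
from collections import defaultdict

def build_leaderboard(submissions):
    """submissions: iterable of (site, score) pairs, scores in [0, 100]."""
    totals = defaultdict(list)
    for site, score in submissions:
        totals[site].append(score)
    # Rank sites by mean score, best first; ties broken alphabetically.
    return sorted(totals, key=lambda s: (-sum(totals[s]) / len(totals[s]), s))

subs = [("HospA", 90), ("HospB", 80), ("HospA", 70), ("HospB", 95)]
print(build_leaderboard(subs))  # -> ['HospB', 'HospA']
```

In the system described, the QR code would simply encode the upload endpoint for each such submission, so data entry and data integrity checks happen at scan time.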

  13. Baseline performance and learning rate of conceptual and perceptual skill-learning tasks: the effect of moderate to severe traumatic brain injury.

    Science.gov (United States)

    Vakil, Eli; Lev-Ran Galon, Carmit

    2014-01-01

    Existing literature presents a complex and inconsistent picture of the specific deficiencies involved in skill learning following traumatic brain injury (TBI). In an attempt to address this difficulty, individuals with moderate to severe TBI (n = 29) and a control group (n = 29) were tested with two different skill-learning tasks: conceptual (i.e., Tower of Hanoi Puzzle, TOHP) and perceptual (i.e., mirror reading, MR). Based on previous studies of the effect of divided attention on these tasks and findings regarding the effect of TBI on conceptual and perceptual priming tasks, it was predicted that the group with TBI would show impaired baseline performance compared to controls in the TOHP task though their learning rate would be maintained, while both baseline performance and learning rate on the MR task would be maintained. Consistent with our predictions, overall baseline performance of the group with TBI was impaired in the TOHP test, while the learning rate was not. The learning rate on the MR task was preserved but, contrary to our prediction, response time of the group with TBI was slower than that of controls. The pattern of results observed in the present study was interpreted to possibly reflect an impairment of both the frontal lobes as well as that of diffuse axonal injury, which is well documented as being affected by TBI. The former impairment affects baseline performance of the conceptual learning skill, while the latter affects the overall slower performance of the perceptual learning skill.

  14. Conceptual structure of performance assessments conducted for the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    Helton, J.C.; Marietta, M.G.; Rechard, R.P.

    1993-04-01

    The Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico is being developed by the US Department of Energy as a disposal facility for transuranic waste. In support of this project, Sandia National Laboratories is conducting an ongoing performance assessment (PA) for the WIPP. The ordered triple representation for risk proposed by Kaplan and Garrick is used to provide a clear conceptual structure for this PA. This presentation describes how the preceding representation provides a basis in the WIPP PA for (1) the definition of scenarios and the calculation of scenario probabilities and consequences, (2) the separation of subjective and stochastic uncertainties, (3) the construction of the complementary cumulative distribution functions required in comparisons with the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (i.e., 40 CFR Part 191, Subpart B), and (4) the performance of uncertainty and sensitivity studies. Results obtained in a preliminary PA for the WIPP completed in December of 1991 are used for illustration
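The construction of a complementary cumulative distribution function from Kaplan–Garrick ordered triples can be sketched directly: each scenario contributes its probability to every consequence level it exceeds. The scenario names, probabilities and consequence values below are illustrative only, not WIPP PA results.

```python
# Sketch of the Kaplan-Garrick construction: from (scenario, probability,
# consequence) triples, build CCDF(c) = P(consequence > c).
# All numbers are invented for illustration.

def ccdf(triples, c):
    """Probability that the consequence exceeds the level c."""
    return sum(p for _, p, conseq in triples if conseq > c)

scenarios = [("S1", 0.70, 0.0),   # undisturbed performance, no release
             ("S2", 0.25, 0.1),   # single intrusion scenario
             ("S3", 0.05, 1.0)]   # multiple intrusion scenario
print(ccdf(scenarios, 0.05))  # -> 0.3
```

Comparison with a standard such as 40 CFR Part 191 then amounts to checking that the CCDF lies below the regulatory limit line at each consequence level.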

  15. To Master or Perform? Exploring Relations between Achievement Goals and Conceptual Change Learning

    Science.gov (United States)

    Ranellucci, John; Muis, Krista R.; Duffy, Melissa; Wang, Xihui; Sampasivam, Lavanya; Franco, Gina M.

    2013-01-01

    Background: Research is needed to explore conceptual change in relation to achievement goal orientations and depth of processing. Aims: To address this need, we examined relations between achievement goals, use of deep versus shallow processing strategies, and conceptual change learning using a think-aloud protocol. Sample and Method:…

  16. VINE-A NUMERICAL CODE FOR SIMULATING ASTROPHYSICAL SYSTEMS USING PARTICLES. II. IMPLEMENTATION AND PERFORMANCE CHARACTERISTICS

    International Nuclear Information System (INIS)

    Nelson, Andrew F.; Wetzstein, M.; Naab, T.

    2009-01-01

    We continue our presentation of VINE. In this paper, we begin with a description of relevant architectural properties of the serial and shared memory parallel computers on which VINE is intended to run, and describe their influences on the design of the code itself. We continue with a detailed description of a number of optimizations made to the layout of the particle data in memory and to our implementation of a binary tree used to access that data for use in gravitational force calculations and searches for smoothed particle hydrodynamics (SPH) neighbor particles. We describe the modifications to the code necessary to obtain forces efficiently from special purpose 'GRAPE' hardware, the interfaces required to allow transparent substitution of those forces in the code instead of those obtained from the tree, and the modifications necessary to use both tree and GRAPE together as a fused GRAPE/tree combination. We conclude with an extensive series of performance tests, which demonstrate that the code can be run efficiently and without modification in serial on small workstations or in parallel using the OpenMP compiler directives on large-scale, shared memory parallel machines. We analyze the effects of the code optimizations and estimate that they improve its overall performance by more than an order of magnitude over that obtained by many other tree codes. Scaled parallel performance of the gravity and SPH calculations, together the most costly components of most simulations, is nearly linear up to at least 120 processors on moderate sized test problems using the Origin 3000 architecture, and to the maximum machine sizes available to us on several other architectures. At similar accuracy, performance of VINE, used in GRAPE-tree mode, is approximately a factor 2 slower than that of VINE, used in host-only mode. Further optimizations of the GRAPE/host communications could improve the speed by as much as a factor of 3, but have not yet been implemented in VINE

  17. The added value of international benchmarks for fuel performance codes: an illustration on the basis of TRANSURANUS

    International Nuclear Information System (INIS)

    Van Uffelen, P.; Schubert, A.; Gyeori, C.; Van De Laar, J.

    2009-01-01

    Safety authorities and fuel designers, as well as nuclear research centres, rely heavily on fuel performance codes for predicting the behaviour and lifetime of fuel rods. The simulation tools are developed and validated on the basis of experimental results, some of which are in the public domain, such as the International Fuel Performance Experiments database of the OECD/NEA and IAEA. Publicly available data constitute an excellent basis for assessing the codes themselves, but also for comparing codes that are being developed by independent teams. The present report summarises the advantages gained by the TRANSURANUS code from taking part in previous benchmarks organised by the IAEA, and outlines the preliminary results along with the perspectives of our participation in the current coordinated research project FUMEX-III

  18. Reflections on the conceptualization and operationalization of a set-theoretic approach to employee motivation and performance research

    Directory of Open Access Journals (Sweden)

    James Christopher Ryan

    2017-01-01

    Full Text Available The current commentary offers a reflection on the conceptualizations of Lee and Raschke's (2016) proposal for a set-theoretic approach to employee motivation and organizational performance. The commentary is informed by the current author's operationalization of set-theoretic research on employee motivation, which occurred contemporaneously with the work of Lee and Raschke. Observations on the state of current research on employee motivation, the development of motivation theory, and future directions of set-theoretic approaches to employee motivation and performance are offered.
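The set-theoretic logic referred to in this record is typically operationalized with fuzzy-set QCA measures. As a sketch, Ragin's consistency measure quantifies how far cases with membership in a condition (e.g. "highly motivated") are a subset of cases with membership in an outcome ("high performing"). The membership scores below are invented for illustration.

```python
# fsQCA-style consistency of the subset relation "X implies Y":
# consistency = sum(min(x, y)) / sum(x). Scores are hypothetical.

def consistency(xs, ys):
    """Consistency of 'X is a subset of Y' for fuzzy memberships in [0, 1]."""
    return sum(min(x, y) for x, y in zip(xs, ys)) / sum(xs)

motivation = [0.9, 0.6, 0.8, 0.2]   # fuzzy membership in "highly motivated"
performance = [0.8, 0.7, 0.9, 0.1]  # fuzzy membership in "high performing"
print(round(consistency(motivation, performance), 3))  # -> 0.92
```

A consistency near 1.0 would support the set-theoretic claim that the motivation configuration is sufficient for high performance.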

  19. Coding the Complexity of Activity in Video Recordings

    DEFF Research Database (Denmark)

    Harter, Christopher Daniel; Otrel-Cass, Kathrin

    2017-01-01

    This paper presents a theoretical approach to coding and analyzing video data on human interaction and activity, using principles found in cultural historical activity theory. The systematic classification or coding of information contained in video data on activity can be arduous and time...... Bødker’s in 1996, three possible areas of expansion to Susanne Bødker’s method for analyzing video data were found. Firstly, a technological expansion due to contemporary developments in sophisticated analysis software, since the mid 1990’s. Secondly, a conceptual expansion, where the applicability...... of using Activity Theory outside of the context of human–computer interaction, is assessed. Lastly, a temporal expansion, by facilitating an organized method for tracking the development of activities over time, within the coding and analysis of video data. To expand on the above areas, a prototype coding...

  20. Development Of A Parallel Performance Model For The THOR Neutral Particle Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Yessayan, Raffi; Azmy, Yousry; Schunert, Sebastian

    2017-02-01

    The THOR neutral particle transport code enables simulation of complex geometries for various problems from reactor simulations to nuclear non-proliferation. It is undergoing a thorough V&V requiring computational efficiency. This has motivated various improvements including angular parallelization, outer iteration acceleration, and development of peripheral tools. For guiding future improvements to the code’s efficiency, better characterization of its parallel performance is useful. A parallel performance model (PPM) can be used to evaluate the benefits of modifications and to identify performance bottlenecks. Using INL’s Falcon HPC, the PPM development incorporates an evaluation of network communication behavior over heterogeneous links and a functional characterization of the per-cell/angle/group runtime of each major code component. After evaluating several possible sources of variability, this resulted in a communication model and a parallel portion model. The former’s accuracy is bounded by the variability of communication on Falcon while the latter has an error on the order of 1%.
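A parallel performance model of the kind described, a serial part, a parallel part, and a communication term, can be sketched with an Amdahl-style cost function. The coefficients below are made-up placeholders, not values fitted on Falcon or measured from THOR.

```python
# Toy parallel performance model: runtime as serial + parallel/p + comm(p).
# Coefficients are hypothetical, for illustration only.

def predicted_runtime(p, t_serial=2.0, t_parallel=960.0, t_comm_per_rank=0.05):
    """Predicted seconds for p ranks under a simple Amdahl-plus-communication model."""
    return t_serial + t_parallel / p + t_comm_per_rank * p

# Such a model exposes the bottleneck: beyond some rank count, communication
# cost outweighs the shrinking parallel portion.
best = min(range(1, 513), key=predicted_runtime)
print(best, round(predicted_runtime(best), 2))  # -> 139 15.86
```

In practice the communication term would come from measured link behavior (heterogeneous links, as noted above) and the parallel term from the per-cell/angle/group runtimes of each code component.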

  1. Evaluation of finite element codes for demonstrating the performance of radioactive material packages in hypothetical accident drop scenarios

    International Nuclear Information System (INIS)

    Tso, C.F.; Hueggenberg, R.

    2004-01-01

    Drop testing and analysis are the two methods for demonstrating the performance of packages in hypothetical drop accident scenarios. The exact purpose of the tests and the analyses, and the relative prominence of the two in the license application, may depend on the Competent Authority and will vary between countries. The Finite Element Method (FEM) is a powerful analysis tool. A reliable finite element (FE) code, when used correctly and appropriately, will allow a package's behaviour to be simulated reliably. With improvements in computing power, and in the sophistication and reliability of FE codes, it is likely that FEM calculations will increasingly be used as evidence of drop test performance when seeking Competent Authority approval. What is lacking at the moment, however, is a standardised method of assessing an FE code in order to determine whether it is sufficiently reliable or pessimistic. To this end, the project Evaluation of Codes for Analysing the Drop Test Performance of Radioactive Material Transport Containers, funded by the European Commission Directorate-General XVII (now Directorate-General for Energy and Transport) and jointly performed by Arup and Gesellschaft fuer Nuklear-Behaelter mbH, was carried out in 1998. The work consisted of three components: a survey of existing finite element software, with a view to finding codes that may be capable of analysing the drop test performance of radioactive material packages, and to producing an inventory of them; the development of a set of benchmark problems to evaluate software used for analysing the drop test performance of packages; and the evaluation of the finite element codes by testing them against the benchmarks. This paper presents a summary of this work

  2. Evaluation of finite element codes for demonstrating the performance of radioactive material packages in hypothetical accident drop scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Tso, C.F. [Arup (United Kingdom); Hueggenberg, R. [Gesellschaft fuer Nuklear-Behaelter mbH (Germany)

    2004-07-01

    Drop testing and analysis are the two methods for demonstrating the performance of packages in hypothetical drop accident scenarios. The exact purpose of the tests and the analyses, and the relative prominence of the two in the license application, may depend on the Competent Authority and will vary between countries. The Finite Element Method (FEM) is a powerful analysis tool. A reliable finite element (FE) code, when used correctly and appropriately, will allow a package's behaviour to be simulated reliably. With improvements in computing power, and in the sophistication and reliability of FE codes, it is likely that FEM calculations will increasingly be used as evidence of drop test performance when seeking Competent Authority approval. What is lacking at the moment, however, is a standardised method of assessing an FE code in order to determine whether it is sufficiently reliable or pessimistic. To this end, the project Evaluation of Codes for Analysing the Drop Test Performance of Radioactive Material Transport Containers, funded by the European Commission Directorate-General XVII (now Directorate-General for Energy and Transport) and jointly performed by Arup and Gesellschaft fuer Nuklear-Behaelter mbH, was carried out in 1998. The work consisted of three components: a survey of existing finite element software, with a view to finding codes that may be capable of analysing the drop test performance of radioactive material packages, and to producing an inventory of them; the development of a set of benchmark problems to evaluate software used for analysing the drop test performance of packages; and the evaluation of the finite element codes by testing them against the benchmarks. This paper presents a summary of this work.

  3. Fire-safety engineering and performance-based codes

    DEFF Research Database (Denmark)

    Sørensen, Lars Schiøtt

    Fire-safety Engineering is written as a textbook for Engineering students at universities and other institutions of higher education that teach in the area of fire. The book can also be used as a work of reference for consulting engineers, Building product manufacturers, contractors, building project administrators, etc. The book deals with the following topics: • Historical presentation on the subject of fire • Legislation and building project administration • European fire standardization • Passive and active fire protection • Performance-based Codes • Fire-safety Engineering • Fundamental thermodynamics • Heat exchange during the fire process • Skin burns • Burning rate, energy release rate and design fires • Proposal to Risk-based design fires • Proposal to a Fire scale • Material ignition and flame spread • Fire dynamics in buildings • Combustion products and toxic gases • Smoke inhalation...

  4. Improving 3D-Turbo Code's BER Performance with a BICM System over Rayleigh Fading Channel

    Directory of Open Access Journals (Sweden)

    R. Yao

    2016-12-01

    Full Text Available Classical Turbo code suffers from a high error floor due to its small Minimum Hamming Distance (MHD). The newly-proposed 3D-Turbo code can effectively increase the MHD and achieve a lower error floor by adding a rate-1 post encoder. In 3D-Turbo codes, part of the parity bits from the classical Turbo encoder are further encoded through the post encoder. In this paper, a novel Bit-Interleaved Coded Modulation (BICM) system is proposed by combining rotated mapping Quadrature Amplitude Modulation (QAM) and 3D-Turbo code to improve the Bit Error Rate (BER) performance of 3D-Turbo code over a Rayleigh fading channel. A key-bit protection scheme and a Two-Dimension (2D) iterative soft demodulating-decoding algorithm are developed for the proposed BICM system. Simulation results show that the proposed system can obtain about 0.8-1.0 dB gain at a BER of 10^{-6}, compared with the existing BICM system with Gray mapping QAM.

  5. Conceptual design of an angular multiplexed 50 kJ KrF amplifier for ICF

    International Nuclear Information System (INIS)

    Lowenthal, D.D.; Ewing, J.J.; Center, R.E.; Mumola, P.; Olson, T.

    1981-01-01

    The results of a conceptual design for an angular multiplexed 50 kJ KrF amplifier for ICF are presented. Optical designs, amplifier scaling with a KrF kinetics code and limitations imposed by pulsed power technology are described

  6. Conceptual study on high performance blanket in a spherical tokamak fusion-driven transmuter

    International Nuclear Information System (INIS)

    Chen Yixue; Wu Yican

    2000-01-01

    A preliminary conceptual design of a high performance dual-cooled blanket for a fusion-driven transmuter is presented based on neutronic calculations. The dual-cooled system has some attractive advantages when utilized in the transmutation of HLW (High Level Waste). The calculation results show that this kind of blanket could safely transmute about 6 tons of minor actinides (approximately the amount produced by 170 GW(e)·years of PWR operation) and 0.4 tons of fission products per year, and output 12 GW of thermal power. In addition, the variation of the power and critical factor of this blanket is relatively small during its 1-year operation period. This blanket is also tritium self-sustainable

  7. Implementation and Performance Evaluation of Distributed Cloud Storage Solutions using Random Linear Network Coding

    DEFF Research Database (Denmark)

    Fitzek, Frank; Toth, Tamas; Szabados, Áron

    2014-01-01

    This paper advocates the use of random linear network coding for storage in distributed clouds in order to reduce storage and traffic costs in dynamic settings, i.e. when adding and removing numerous storage devices/clouds on-the-fly and when the number of reachable clouds is limited. We introduce...... various network coding approaches that trade-off reliability, storage and traffic costs, and system complexity relying on probabilistic recoding for cloud regeneration. We compare these approaches with other approaches based on data replication and Reed-Solomon codes. A simulator has been developed...... to carry out a thorough performance evaluation of the various approaches when relying on different system settings, e.g., finite fields, and network/storage conditions, e.g., storage space used per cloud, limited network use, and limited recoding capabilities. In contrast to standard coding approaches, our...
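The core idea of random linear network coding for storage can be sketched over GF(2): each stored fragment is a random XOR combination of the k original packets, and any k fragments whose coefficient rows are linearly independent suffice to decode. This binary toy is only illustrative; practical systems, including the approaches compared above, typically use larger finite fields such as GF(2^8).

```python
# Random linear network coding sketch over GF(2). Illustrative only.
import random

def encode(packets, rng):
    """One coded fragment: random GF(2) coefficients and the XOR combination."""
    coeffs = [rng.randint(0, 1) for _ in packets]
    coded = 0
    for c, p in zip(coeffs, packets):
        if c:
            coded ^= p
    return coeffs, coded

def gf2_rank(coeff_rows):
    """Rank over GF(2); rank == k means the fragments are decodable."""
    basis = {}  # leading-bit position -> reduced row (as an int bitmask)
    rank = 0
    for row in coeff_rows:
        r = int("".join(map(str, row)), 2)
        while r:
            lead = r.bit_length() - 1
            if lead not in basis:
                basis[lead] = r
                rank += 1
                break
            r ^= basis[lead]  # eliminate the shared leading bit
    return rank

data = [0b1010, 0b0111, 0b1100]                 # k = 3 original packets
frags = [encode(data, random.Random(7)) for _ in range(6)]  # over-provisioned
# Any set of rows with full rank k recovers the data:
print(gf2_rank([[1, 1, 0], [0, 1, 1], [0, 0, 1]]))  # -> 3
```

Recoding for cloud regeneration, as advocated above, is then just re-applying `encode` to already-coded fragments without first decoding them.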

  8. Application of the BISON Fuel Performance Code of the FUMEX-III Coordinated Research Project

    International Nuclear Information System (INIS)

    Williamson, R.L.; Novascone, S.R.

    2013-01-01

    Since 1981, the International Atomic Energy Agency (IAEA) has sponsored a series of Coordinated Research Projects (CRP) in the area of nuclear fuel modeling. These projects have typically lasted 3-5 years and have had broad international participation. The objectives of the projects have been to assess the maturity and predictive capability of fuel performance codes, support interaction and information exchange between countries with code development and application needs, build a database of well-defined experiments suitable for code validation, transfer a mature fuel modeling code to developing countries, and provide guidelines for code quality assurance and code application to fuel licensing. The fourth and latest of these projects, known as FUMEX-III (FUel Modeling at EXtended Burnup-III), began in 2008 and ended in December of 2011. FUMEX-III was the first of this series of fuel modeling CRPs in which the INL participated. Participants met at the beginning of the project to discuss and select a set of experiments ('priority cases') for consideration during the project. These priority cases were of broad interest to the participants and included reasonably well-documented and reliable data. A meeting was held midway through the project for participants to present and discuss progress on modeling the priority cases. A final meeting was held at the close of the project to present and discuss final results and provide input for a final report. Also in 2008, the INL initiated development of a new multidimensional (2D and 3D) multiphysics nuclear fuel performance code called BISON, with code development progressing steadily during the three-year FUMEX-III project. Interactions with international fuel modeling researchers via FUMEX-III played a significant role in the BISON evolution, particularly influencing the selection of material and behavioral models which are now included in the code. The FUMEX-III cases are generally integral fuel rod experiments occurring

  9. Embedded Sensors and Controls to Improve Component Performance and Reliability Conceptual Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Kisner, R.; Melin, A.; Burress, T.; Fugate, D.; Holcomb, D.; Wilgen, J.; Miller, J.; Wilson, D.; Silva, P.; Whitlow, L.; Peretz, F.

    2012-09-15

    The objective of this project is to demonstrate improved reliability and increased performance made possible by deeply embedding instrumentation and controls (I&C) in nuclear power plant (NPP) components and systems. The project is employing a highly instrumented canned rotor, magnetic bearing, fluoride salt pump as its I&C technology demonstration platform. I&C is intimately part of the basic millisecond-by-millisecond functioning of the system; treating I&C as an integral part of the system design is innovative and will allow significant improvement in capabilities and performance. As systems become more complex and greater performance is required, traditional I&C design techniques become inadequate and more advanced I&C needs to be applied. New I&C techniques enable optimal and reliable performance and tolerance of noise and uncertainties in the system rather than merely monitoring quasistable performance. Traditionally, I&C has been incorporated in NPP components after the design is nearly complete; adequate performance was obtained through over-design. By incorporating I&C at the beginning of the design phase, the control system can provide superior performance and reliability and enable designs that are otherwise impossible. This report describes the progress and status of the project and provides a conceptual design overview for the platform to demonstrate the performance and reliability improvements enabled by advanced embedded I&C.

  10. Development and application of the BISON fuel performance code to the analysis of fission gas behaviour

    International Nuclear Information System (INIS)

    Pastore, G.; Hales, J.D.; Novascone, S.R.; Perez, D.M.; Spencer, B.W.; Williamson, R.L.

    2014-01-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that has been under development at Idaho National Laboratory (USA) since 2009. The capabilities of BISON comprise implicit solution of the fully coupled thermo-mechanics and diffusion equations, applicability to a variety of fuel forms, and simulation of both steady-state and transient conditions. The code includes multiphysics constitutive behavior for both fuel and cladding materials, and is designed for efficient use on highly parallel computers. This paper describes the main features of BISON, with emphasis on recent developments in modelling of fission gas behaviour in LWR-UO2 fuel. The code is applied to the simulation of fuel rod irradiation experiments from the OECD/NEA International Fuel Performance Experiments Database. The comparison of the results with the available experimental data of fuel temperature, fission gas release, and cladding diametrical strain during pellet-cladding mechanical interaction is presented, pointing out a promising potential of the BISON code with the new fission gas behaviour model. (authors)

  11. Annotating non-coding regions of the genome.

    Science.gov (United States)

    Alexander, Roger P; Fang, Gang; Rozowsky, Joel; Snyder, Michael; Gerstein, Mark B

    2010-08-01

    Most of the human genome consists of non-protein-coding DNA. Recently, progress has been made in annotating these non-coding regions through the interpretation of functional genomics experiments and comparative sequence analysis. One can conceptualize functional genomics analysis as involving a sequence of steps: turning the output of an experiment into a 'signal' at each base pair of the genome; smoothing this signal and segmenting it into small blocks of initial annotation; and then clustering these small blocks into larger derived annotations and networks. Finally, one can relate functional genomics annotations to conserved units and measures of conservation derived from comparative sequence analysis.
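The signal-smoothing-segmentation sequence described above can be sketched concretely: a per-base signal is smoothed with a moving average, then segmented into blocks where the smoothed value clears a threshold. Window size and threshold are illustrative choices, not values from any particular pipeline.

```python
# Sketch of the functional-genomics annotation steps: smooth a per-base
# signal, then segment it into annotated blocks. Parameters are illustrative.

def smooth(signal, window=3):
    """Centered moving average; edges use only the available neighbors."""
    half = window // 2
    return [sum(signal[max(0, i - half):i + half + 1]) /
            len(signal[max(0, i - half):i + half + 1])
            for i in range(len(signal))]

def segment(signal, threshold):
    """Return (start, end) half-open intervals where signal >= threshold."""
    blocks, start = [], None
    for i, v in enumerate(signal):
        if v >= threshold and start is None:
            start = i
        elif v < threshold and start is not None:
            blocks.append((start, i))
            start = None
    if start is not None:
        blocks.append((start, len(signal)))
    return blocks

raw = [0, 0, 5, 6, 7, 0, 0, 8, 9, 0]  # toy per-base signal
print(segment(smooth(raw), threshold=3.0))  # -> [(2, 5), (7, 10)]
```

The subsequent clustering step would merge nearby blocks into larger derived annotations, e.g. by joining intervals separated by less than some gap length.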

  12. System Performance of Concatenated STBC and Block Turbo Codes in Dispersive Fading Channels

    Directory of Open Access Journals (Sweden)

    Kam Tai Chan

    2005-05-01

    Full Text Available A new scheme of concatenating the block turbo code (BTC) with the space-time block code (STBC) for an OFDM system in dispersive fading channels is investigated in this paper. The good error correcting capability of BTC and the large diversity gain characteristics of STBC can be achieved simultaneously. The resulting receiver outperforms the iterative convolutional Turbo receiver with the maximum-a-posteriori-probability expectation maximization (MAP-EM) algorithm. Because of its ability to perform the encoding and decoding processes in parallel, the proposed system is easy to implement in real time.
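The STBC component can be illustrated with the classic two-antenna Alamouti block; whether the concatenated scheme in this record uses Alamouti specifically is an assumption made here for illustration. The orthogonality of the block's columns is what yields the diversity gain mentioned in the abstract.

```python
# Alamouti space-time block: two symbols over two antennas and two time slots.
# Rows are antennas, columns are time slots. Assumed as a representative STBC;
# the record does not name the specific code used.

def alamouti(s1, s2):
    """Return the 2x2 transmission matrix [[s1, -conj(s2)], [s2, conj(s1)]]."""
    return [[s1, -s2.conjugate()],
            [s2, s1.conjugate()]]

s1, s2 = 1 + 1j, 1 - 2j
block = alamouti(s1, s2)
# Orthogonality: the two columns have zero inner product, which lets the
# receiver separate the symbols with simple linear combining.
col_dot = (block[0][0] * block[0][1].conjugate() +
           block[1][0] * block[1][1].conjugate())
print(abs(col_dot))  # -> 0.0
```

In the concatenated scheme, the BTC would supply coded bits that are mapped to the symbols s1 and s2 before Alamouti-style transmission over the OFDM subcarriers.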

  13. Modification of fuel performance code to evaluate iron-based alloy behavior under LOCA scenario

    Energy Technology Data Exchange (ETDEWEB)

    Giovedi, Claudia; Martins, Marcelo Ramos, E-mail: claudia.giovedi@labrisco.usp.br, E-mail: mrmartin@usp.br [Laboratorio de Analise, Avaliacao e Gerenciamento de Risco (LabRisco/POLI/USP), São Paulo, SP (Brazil); Abe, Alfredo; Muniz, Rafael O.R.; Gomes, Daniel de Souza; Silva, Antonio Teixeira e, E-mail: ayabe@ipen.br, E-mail: dsgomes@ipen.br, E-mail: teixiera@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Accident tolerant fuels (ATF) have been studied since the Fukushima Daiichi accident, in research efforts to develop new materials which, under accident scenarios, could maintain fuel rod integrity for a longer period compared to the cladding and fuel system usually utilized in Pressurized Water Reactors (PWR). The efforts have been focused on new materials applied as cladding, and iron-based alloys appear as possible candidates. The aim of this paper is to implement modifications in a fuel performance code to evaluate the behavior of iron-based alloys under a Loss-of-Coolant Accident (LOCA) scenario. For this, the properties related to the thermal and mechanical behavior of iron-based alloys were first obtained from the literature, appropriately adapted, and introduced into the fuel performance code subroutines. The adopted approach was step-by-step modification, in which different versions of the code were created. The assessment of the implemented modifications was carried out by simulating an experiment available in the open literature (IFA-650.5) involving zirconium-based alloy fuel rods submitted to LOCA conditions. The results obtained for the iron-based alloy were compared to those obtained using the regular version of the fuel performance code for zircaloy-4. The results show that the most important properties to be changed are those in the subroutines related to the mechanical properties of the cladding, and that burst is observed at a longer time for fuel rods with the iron-based alloy, indicating the potential of this material to be used as cladding for ATF purposes. (author)

  14. Modification of fuel performance code to evaluate iron-based alloy behavior under LOCA scenario

    International Nuclear Information System (INIS)

    Giovedi, Claudia; Martins, Marcelo Ramos; Abe, Alfredo; Muniz, Rafael O.R.; Gomes, Daniel de Souza; Silva, Antonio Teixeira e

    2017-01-01

    Accident tolerant fuels (ATF) have been studied since the Fukushima Daiichi accident as part of research efforts to develop new materials which, under accident scenarios, could maintain fuel rod integrity for a longer period than the cladding and fuel system usually utilized in Pressurized Water Reactors (PWR). These efforts have focused on new cladding materials, among which iron-based alloys appear as possible candidates. The aim of this paper is to implement modifications in a fuel performance code to evaluate the behavior of iron-based alloys under a Loss-of-Coolant Accident (LOCA) scenario. Initially, the properties related to the thermal and mechanical behavior of iron-based alloys were obtained from the literature, appropriately adapted, and introduced into the fuel performance code subroutines. The adopted approach was a step-by-step modification in which different versions of the code were created. The implemented modifications were assessed by simulating an experiment available in the open literature (IFA-650.5) in which zirconium-based alloy fuel rods were submitted to LOCA conditions. The results obtained for the iron-based alloy were compared with those obtained using the regular version of the fuel performance code for Zircaloy-4. The results show that the most important properties to be changed are those in the subroutines related to the mechanical properties of the cladding, and that burst occurs at a later time for fuel rods with iron-based alloy cladding, indicating the potential of this material for use as cladding for ATF purposes. (author)
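The property-substitution approach described in the abstract can be sketched as a simple dispatch on cladding material. The correlations below are illustrative placeholders, not the actual models introduced into the code's subroutines:

```python
# Sketch of swapping cladding material-property subroutines in a fuel
# performance code. The linear "correlations" are placeholders for
# illustration only; a real code uses validated temperature-dependent models.

def zircaloy4_conductivity(temp_k):
    """Illustrative thermal conductivity [W/m-K] for Zircaloy-4 (placeholder)."""
    return 12.0 + 0.01 * (temp_k - 300.0)

def iron_alloy_conductivity(temp_k):
    """Illustrative thermal conductivity [W/m-K] for an iron-based alloy (placeholder)."""
    return 10.0 + 0.012 * (temp_k - 300.0)

CLADDING_MODELS = {
    "zircaloy4": zircaloy4_conductivity,
    "iron_alloy": iron_alloy_conductivity,
}

def cladding_conductivity(material, temp_k):
    """Dispatch to the property subroutine for the selected cladding."""
    return CLADDING_MODELS[material](temp_k)

# The thermal solution elsewhere in the code is unchanged; only the
# property call differs between code versions.
k_zr = cladding_conductivity("zircaloy4", 600.0)
k_fe = cladding_conductivity("iron_alloy", 600.0)
```

This mirrors the paper's step-by-step strategy: each code version replaces one group of property subroutines while the rest of the solution scheme stays fixed, so differences in predicted behavior can be attributed to the cladding model.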

  15. Performance Analysis of DPSK Signals with Selection Combining and Convolutional Coding in Fading Channel

    National Research Council Canada - National Science Library

    Ong, Choon

    1998-01-01

    The performance analysis of a differential phase shift keyed (DPSK) communications system, operating in a Rayleigh fading environment, employing convolutional coding and diversity processing is presented...

  16. Using Annotated Conceptual Models to Derive Information System Implementations

    Directory of Open Access Journals (Sweden)

    Anthony Berglas

    1994-05-01

    Full Text Available Producing production-quality information systems from conceptual descriptions is a time-consuming process that employs many of the world's programmers. Although most of this programming is fairly routine, the process has not been amenable to simple automation because conceptual models do not provide sufficient parameters to make all the implementation decisions that are required, and numerous special cases arise in practice. Most commercial CASE tools address these problems by essentially implementing a waterfall model in which development proceeds from analysis through design, layout and coding phases in a partially automated manner, but the analyst/programmer must heavily edit each intermediate stage. This paper demonstrates that by recognising the nature of information systems, it is possible to specify applications completely using a conceptual model that has been annotated with additional parameters that guide automated implementation. More importantly, it is argued that a manageable number of annotations are sufficient to implement realistic applications, and techniques are described that enabled the author's commercial CASE tool, the Intelligent Developer, to automate implementation without requiring complex theorem-proving technology.

  17. Conceptual Masking: How One Picture Captures Attention from Another Picture.

    Science.gov (United States)

    Loftus, Geoffrey R.; And Others

    1988-01-01

    Five experiments studied the operation of conceptual masking--the reduction of conceptual memory performance for an initial stimulus when it is followed by a masking picture. The subjects were 337 undergraduates at the University of Washington (Seattle). Conceptual masking is distinguished from perceptual masking. (TJH)

  18. Evaluation of CFETR as a Fusion Nuclear Science Facility using multiple system codes

    International Nuclear Information System (INIS)

    Chan, V.S.; Garofalo, A.M.; Leuer, J.A.; Costley, A.E.; Wan, B.N.

    2015-01-01

    This paper presents the results of a multi-system-code benchmarking study of the recently published China Fusion Engineering Test Reactor (CFETR) pre-conceptual design (Wan et al 2014 IEEE Trans. Plasma Sci. 42 495). Two system codes, the General Atomics System Code (GASC) and the Tokamak Energy System Code (TESC), using different methodologies to arrive at CFETR performance parameters under the same CFETR constraints, show that the correlation between the physics performance and the fusion performance is consistent, and the computed parameters are in good agreement. Optimization of the first wall surface for tritium breeding and the minimization of the machine size are highly compatible. Variations of the plasma currents and profiles lead to changes in the required normalized physics performance; however, they do not significantly affect the optimized size of the machine. GASC and TESC have also been used to explore a lower-aspect-ratio, larger-volume plasma, taking advantage of the engineering flexibility in the CFETR design. Assuming ITER steady-state scenario physics, the larger plasma together with a moderately higher B_T and I_p can result in a high-gain Q_fus ∼ 12, P_fus ∼ 1 GW machine approaching DEMO-like performance. It is concluded that the CFETR baseline mode can meet the minimum goal of the Fusion Nuclear Science Facility (FNSF) mission, and that advanced physics will enable it to address comprehensively the outstanding critical technology gaps on the path to a demonstration reactor (DEMO). Before proceeding with CFETR construction, steady-state operation has to be demonstrated, further development is needed to solve the divertor heat load issue, and blankets have to be designed with a tritium breeding ratio (TBR) >1 as a target. (paper)

  19. Structural analysis by load combination for conceptual design of HCCR TBM-set

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Won, E-mail: dwlee@kaeri.re.kr [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jin, Hyung Gon; Lee, Eo Hwak; Kim, Suk-Kwon; Yoon, Jae Sung; Park, Seong Dae [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Shin, Kyu In [Gentec Co., Daejeon (Korea, Republic of); Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2016-11-01

    Highlights: • The structural analyses of the load combinations are performed for the HCCR TBM-set. • Structural analyses are performed with conventional CFD and FEM codes (ANSYS-CFX and ANSYS). • The stresses and deformations for all the load conditions are confirmed to meet the design criteria. - Abstract: Using a conceptual design of the Korean Helium Cooled Ceramic Reflector (HCCR) Test Blanket Module (TBM), including the TBM-shield for testing in ITER, a structural analysis is performed according to the selected Load Combinations (LCs), which are described in the TBM Port Plug (TBM PP) System Load Specifications. Single-load analyses are performed for dead weight (DW), operating pressure (PresO), operating temperature (THO), electromagnetic loads (EM; MD-I, MD-II, and MD-IV), seismic loads (SL-1, SL-2, and SMHV), and in-TBM LOCA-IV, and their results are superposed for the selected LCs. Through an analysis with 14 single-load cases and 10 LC cases, it is confirmed that all load combination results meet the design criteria from the stress breakdown analysis according to RCC-MRx.

  20. Better Measurement for Performance Improvement in Low- and Middle-Income Countries: The Primary Health Care Performance Initiative (PHCPI) Experience of Conceptual Framework Development and Indicator Selection.

    Science.gov (United States)

    Veillard, Jeremy; Cowling, Krycia; Bitton, Asaf; Ratcliffe, Hannah; Kimball, Meredith; Barkley, Shannon; Mercereau, Laure; Wong, Ethan; Taylor, Chelsea; Hirschhorn, Lisa R; Wang, Hong

    2017-12-01

    Policy Points: Strengthening accountability through better measurement and reporting is vital to ensure progress in improving quality primary health care (PHC) systems and achieving universal health coverage (UHC). The Primary Health Care Performance Initiative (PHCPI) provides national decision makers and global stakeholders with opportunities to benchmark and accelerate performance improvement through better performance measurement. Results from the initial PHC performance assessments in low- and middle-income countries (LMICs) are helping guide PHC reforms and investments and improve the PHCPI's instruments and indicators. Findings from future assessment activities will further amplify cross-country comparisons and peer learning to improve PHC. New indicators and sources of data are needed to better understand PHC system performance in LMICs. The Primary Health Care Performance Initiative (PHCPI), a collaboration between the Bill and Melinda Gates Foundation, The World Bank, and the World Health Organization, in partnership with Ariadne Labs and Results for Development, was launched in 2015 with the aim of catalyzing improvements in primary health care (PHC) systems in 135 low- and middle-income countries (LMICs), in order to accelerate progress toward universal health coverage. Through more comprehensive and actionable measurement of quality PHC, the PHCPI stimulates peer learning among LMICs and informs decision makers to guide PHC investments and reforms. Instruments for performance assessment and improvement are in development; to date, a conceptual framework and 2 sets of performance indicators have been released. The PHCPI team developed the conceptual framework through literature reviews and consultations with an advisory committee of international experts. 
We generated 2 sets of performance indicators selected from a literature review of relevant indicators, cross-referenced against indicators available from international sources, and evaluated through

  1. 3D Analysis of Cooling Performance with Loss of Offsite Power Using GOTHIC Code

    International Nuclear Information System (INIS)

    Oh, Kye Min; Heo, Gyun Young; Na, In Sik; Choi, Yu Jung

    2010-01-01

    The GOTHIC code enables one-dimensional or multi-dimensional analyses for evaluating cooling performance under loss of offsite power. The conventional GOTHIC analysis treats heat transfer between the plant containment and the outside of the fan cooler tubes by modeling the fan cooler and the component cooling water inside the tubes separately in order to analyze the boiling probability. In this paper, we suggest an approach that reduces this multi-step procedure for analyzing cooling performance under loss of offsite power, or heat transfer in geometrically complex structures, to a single procedure. We also verify its applicability to capturing the heat transfer differences caused by containment atmosphere humidity changes by using multiple nodes for the component cooling water in the tubes and the air in the Reactor Containment Fan Cooler, whereas the conventional component model uses only one node

  2. Establishment of joint application system of safety analysis codes between Korea and Vietnam

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Kim, Kyung Doo; Park, Cheol; Bae, Sung Won; Baek, Won Pil; Song, Cheol hwa; Jeong, Jae Jun; Lee, Seung Wook; Hwang, Moon Kyu; Lee, Chang Sup

    2011-04-01

    The following KAERI-VAEI collaboration works have been performed during the two-year project ('09.4∼'11.4): 1) On-the-job training of Vietnamese code users (1st training for 4 VAEI staff, 3 months; 2nd training for 3 VAEI staff, 3 months); 2) Lectures on nuclear safety analysis (30 hrs basic course and 30 hrs advanced course); 3) Review of safety analysis methods (IAEA safety concepts and requirements); 4) Collaborative assessment of the safety analysis code MARS (13 conceptual problems, 2 separate effect test problems, 1 integral effect test problem); 5) Input deck preparation for a standard PWR (preparation of an APR1400 input deck and safety analysis of DBAs). VAEI staff have become familiar with Korean PWR safety assessment technology through the collaborative assessment work using a computer code developed in Korea. The lectures for Vietnamese researchers will contribute to the utilization and cultivation of Korean safety technology. The collaborative assessment work will be used for the establishment of a MARS-based safety analysis system that is independent of the US safety assessment system

  3. Establishment of joint application system of safety analysis codes between Korea and Vietnam

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Kim, Kyung Doo; Park, Cheol; Bae, Sung Won; Baek, Won Pil; Song, Cheol hwa; Jeong, Jae Jun; Lee, Seung Wook; Hwang, Moon Kyu; Lee, Chang Sup [KAERI, Daejeon (Korea, Republic of)

    2011-04-15

    The following KAERI-VAEI collaboration works have been performed during the two-year project ('09.4∼'11.4): 1) On-the-job training of Vietnamese code users (1st training for 4 VAEI staff, 3 months; 2nd training for 3 VAEI staff, 3 months); 2) Lectures on nuclear safety analysis (30 hrs basic course and 30 hrs advanced course); 3) Review of safety analysis methods (IAEA safety concepts and requirements); 4) Collaborative assessment of the safety analysis code MARS (13 conceptual problems, 2 separate effect test problems, 1 integral effect test problem); 5) Input deck preparation for a standard PWR (preparation of an APR1400 input deck and safety analysis of DBAs). VAEI staff have become familiar with Korean PWR safety assessment technology through the collaborative assessment work using a computer code developed in Korea. The lectures for Vietnamese researchers will contribute to the utilization and cultivation of Korean safety technology. The collaborative assessment work will be used for the establishment of a MARS-based safety analysis system that is independent of the US safety assessment system

  4. Conceptual analyses of neutronic and equilibrium refueling parameters to develop a cost-effective multi-purpose pool-type research reactor using WIMSD and CITVAP codes

    Energy Technology Data Exchange (ETDEWEB)

    Hedayat, Afshin, E-mail: ahedayat@aeoi.org.ir

    2016-12-01

    Highlights: • Introducing a high-benefit, low-cost multipurpose research reactor. • Technical documents and standard safety issues are introduced coherently. • Highly effective conceptual neutronic analyses and fuel management strategy. • High-scoring design criteria and safety margins achieved via 3-D core modeling. • Capacity and capability to produce all medical and industrial radioisotopes. - Abstract: In this paper, neutronic and equilibrium refueling parameters of a multi-purpose, cost-effective research reactor have been studied and analyzed, with the aim of coherently meeting the periodic and long-term requirements of irradiation applications. The WIMSD5B and CITVAP codes are used to calculate neutronic parameters and simulate the fuel management strategy. The nuclear data, codes, and calculation methods used have each been successfully benchmarked and verified. Fundamental concepts, design criteria, and safety issues are introduced and discussed coherently. Design criteria are selected to gain the most economic benefit per capital cost via the minimum required reactor power. Accurate, fast, and simplified models are employed for integrated decision making and analyses using deterministic codes. Core management, power effects, fuel consumption and burnup effects, and a complete simulation of the fuel management strategy are presented and analyzed. Results show that the proposed reactor core design is well suited to commercial multi-purpose irradiation applications. It also conservatively satisfies the Operating Limits and Conditions (OLCs) required by standard safety issues, with safety parameters calculated using best-estimate tools. Such a core configuration and integrated refueling scheme can effectively enhance the Quality Assurance (QA) of the general irradiation applications of current MTRs within their power limits and corresponding OLCs.

  5. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree also brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  6. Dosskin code for radiological evaluation of skin radioactive contaminations

    International Nuclear Information System (INIS)

    Cornejo D, N.

    1996-01-01

    The conceptual procedure and computational features of the DOSSKIN code are presented. This code calculates, in a highly interactive way, skin equivalent doses and the radiological risk related to radioactive skin contamination. The evaluation takes into account the contributions of contaminant daughter nuclides and the backscattering of beta particles in any skin cover. DOSSKIN also allows estimation of the maximum time needed to decontaminate the affected zone, using as input the limit value of skin equivalent dose considered by the user. A comparison of the results obtained with the DOSSKIN code against those reported by different authors is shown; the differences are less than 30%. (authors). 4 refs., 3 fig., 1 tab

  7. Performance Analysis of a New Coded TH-CDMA Scheme in Dispersive Infrared Channel with Additive Gaussian Noise

    Science.gov (United States)

    Hamdi, Mazda; Kenari, Masoumeh Nasiri

    2013-06-01

    We consider the time-hopping based multiple access scheme introduced in [1] for communication over dispersive infrared links and evaluate its performance for correlator and matched filter receivers. In the investigated time-hopping code division multiple access (TH-CDMA) method, the transmitter employs a low-rate convolutional encoder. In this method, the bit interval is divided into Nc chips, and the output of the encoder, along with a PN sequence assigned to the user, determines the position of the chip in which the optical pulse is transmitted. We evaluate the multiple access performance of the system for the correlation receiver, considering background noise, which is modeled as white Gaussian noise due to its large intensity. For the correlation receiver, the results show that for a fixed processing gain, at high transmit power, where multiple access interference has the dominant effect, performance improves with the coding gain. But at low transmit power, where an increase in coding gain leads to a decrease in chip time, and consequently to more corruption due to channel dispersion, there exists an optimum value for the coding gain. For the matched filter, however, performance always improves with the coding gain. The results show that the matched filter receiver outperforms the correlation receiver in the considered cases. They also show that, for the same bandwidth and bit rate, the proposed system outperforms other multiple access techniques, such as conventional CDMA and time hopping schemes.
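The chip-selection mechanism described above can be sketched as follows. The modulo-add combination of the encoder output and the PN value is an assumed mapping for illustration, not necessarily the exact rule of the referenced scheme:

```python
# Sketch of time-hopping chip selection for a TH-CDMA link: each bit
# interval is split into Nc chips, and the low-rate encoder output combined
# with the user's PN sequence picks the chip carrying the optical pulse.
# The (symbol + pn) mod Nc mapping is an illustrative assumption.
import random

def chip_positions(encoded_symbols, pn_sequence, num_chips):
    """Return the chip index used in each bit interval."""
    return [(s + p) % num_chips for s, p in zip(encoded_symbols, pn_sequence)]

random.seed(0)
Nc = 8
pn = [random.randrange(Nc) for _ in range(5)]   # user-specific PN sequence
symbols = [0, 3, 1, 7, 2]                        # low-rate encoder outputs
positions = chip_positions(symbols, pn, Nc)
assert all(0 <= pos < Nc for pos in positions)   # every pulse lands in-interval
```

Because each user's PN sequence hops the pulse differently, two users rarely collide in the same chip, which is the source of the multiple access capability discussed in the abstract.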

  8. Effects of donor proliferation in development aid for health on health program performance: A conceptual framework.

    Science.gov (United States)

    Pallas, Sarah Wood; Ruger, Jennifer Prah

    2017-02-01

    Development aid for health increased dramatically during the past two decades, raising concerns about inefficiency and lack of coherence among the growing number of global health donors. However, we lack a framework for how donor proliferation affects health program performance to inform theory-based evaluation of aid effectiveness policies. A review of academic and gray literature was conducted. Data were extracted from the literature sample on study design and evidence for hypothesized effects of donor proliferation on health program performance, which were iteratively grouped into categories and mapped into a new conceptual framework. In the framework, increases in the number of donors are hypothesized to increase inter-donor competition, transaction costs, donor poaching of recipient staff, recipient control over aid, and donor fragmentation, and to decrease donors' sense of accountability for overall development outcomes. There is mixed evidence on whether donor proliferation increases or decreases aid volume. These primary effects in turn affect donor innovation, information hoarding, and aid disbursement volatility, as well as recipient country health budget levels, human resource capacity, and corruption, and the determinants of health program performance. The net effect of donor proliferation on health will vary depending on the magnitude of the framework's competing effects in specific country settings. The conceptual framework provides a foundation for improving design of aid effectiveness practices to mitigate negative effects from donor proliferation while preserving its potential benefits. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Verification of the 2.00 WAPPA-B [Waste Package Performance Assessment-B version] code

    International Nuclear Information System (INIS)

    Tylock, B.; Jansen, G.; Raines, G.E.

    1987-07-01

    The old version of the Waste Package Performance Assessment (WAPPA) code has been modified into a new code version, 2.00 WAPPA-B. The input files and the results for two benchmarks at repository conditions are fully documented in the appendixes of the EA reference report. The 2.00 WAPPA-B version of the code is suitable for computation of barrier failure due to uniform corrosion; however, an improved sub-version, 2.01 WAPPA-B, is recommended for general use due to minor errors found in 2.00 WAPPA-B during its verification procedures. The input files and input echoes have been modified to include behavior of both radionuclides and elements, but the 2.00 WAPPA-B version of the WAPPA code is not recommended for computation of radionuclide releases. The 2.00 WAPPA-B version computes only mass balances and the initial presence of radionuclides that can be released. Future code development in the 3.00 WAPPA-C version will include radionuclide release computations. 19 refs., 10 figs., 1 tab

  10. Soft-Decision-Data Reshuffle to Mitigate Pulsed Radio Frequency Interference Impact on Low-Density-Parity-Check Code Performance

    Science.gov (United States)

    Ni, Jianjun David

    2011-01-01

    This presentation briefly discusses a research effort on techniques for mitigating the impact of pulsed radio frequency interference (RFI) on a Low-Density Parity-Check (LDPC) code. This problem is of considerable interest in the context of providing reliable communications to space vehicles, which might suffer severe degradation due to pulsed RFI sources such as large radars. The LDPC code is one of the modern forward-error-correction (FEC) codes whose decoding performance approaches the Shannon limit. The LDPC code studied here is the AR4JA (2048, 1024) code recommended by the Consultative Committee for Space Data Systems (CCSDS), which has been chosen for some spacecraft designs. Even though this code is designed as a powerful FEC code in the additive white Gaussian noise channel, simulation data and test results show that the performance of the LDPC decoder is severely degraded when exposed to the pulsed RFI specified in the spacecraft's transponder specifications. An analysis (through modeling and simulation) has been conducted to evaluate the impact of the pulsed RFI, and a few implementation techniques have been investigated to mitigate the pulsed RFI impact by reshuffling the soft-decision data available at the input of the LDPC decoder. The simulation results show that the LDPC decoding performance in terms of codeword error rate (CWER) under pulsed RFI can be improved by up to four orders of magnitude through a simple soft-decision-data reshuffle scheme. This study reveals that an error floor in LDPC decoding performance appears around CWER=1E-4 when the proposed technique is applied to mitigate the pulsed RFI impact. The mechanism causing this error floor remains unknown; further investigation is necessary.
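One plausible treatment of the soft-decision data under pulsed RFI is sketched below: soft values (log-likelihood ratios) received while a pulse is active are zeroed, so the decoder treats them as erasures rather than as confidently wrong bits. This erasure-style substitution is a stand-in for illustration; the actual reshuffle scheme of the study is not reproduced here:

```python
import numpy as np

# Sketch of erasure-style handling of decoder soft inputs under pulsed RFI:
# LLRs flagged as RFI-corrupted are zeroed (zero LLR = "no information"),
# preventing the pulse from injecting high-confidence wrong values into
# the LDPC decoder. The RFI mask is assumed known (e.g. from pulse timing).

def neutralize_rfi_llrs(llrs, rfi_mask):
    """Zero the LLRs flagged as RFI-corrupted, leaving the rest untouched."""
    out = llrs.copy()
    out[rfi_mask] = 0.0
    return out

llrs = np.array([4.2, -3.1, 0.8, -5.0, 2.2])        # received soft values
mask = np.array([False, True, True, False, False])  # pulse hits symbols 1-2
clean = neutralize_rfi_llrs(llrs, mask)
```

The intuition is that an LDPC decoder recovers well from low-confidence (near-zero) inputs via its parity structure, whereas large-magnitude LLRs of the wrong sign, as a strong RFI pulse can produce, actively mislead belief propagation.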

  11. Embedded Sensors and Controls to Improve Component Performance and Reliability: Conceptual Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Kisner, Roger A [ORNL; Melin, Alexander M [ORNL; Burress, Timothy A [ORNL; Fugate, David L [ORNL; Holcomb, David Eugene [ORNL; Wilgen, John B [ORNL; Miller, John M [ORNL; Wilson, Dane F [ORNL; Silva, Pamela C [ORNL; Whitlow, Lynsie J [ORNL; Peretz, Fred J [ORNL

    2012-10-01

    The overall project objective is to demonstrate the improved reliability and increased performance made possible by deeply embedding instrumentation and controls (I&C) in nuclear power plant components. The project is employing a highly instrumented canned-rotor, magnetic-bearing, fluoride salt pump as its I&C technology demonstration vehicle. The project's focus is not primarily on pump design, but rather on methods to deeply embed I&C within a pump system. However, because the I&C is intimately part of the basic millisecond-by-millisecond functioning of the pump, the I&C design cannot proceed in isolation from the other aspects of the pump. The pump will not function unless the characteristics of the I&C are embedded within the design, because the I&C enables performance of the basic function rather than merely monitoring quasi-stable performance. Traditionally, I&C has been incorporated in nuclear power plant (NPP) components after their design is nearly complete; adequate performance was obtained through over-design. This report describes the progress and status of the project and provides a conceptual design overview of the embedded I&C pump.

  12. Performance Comparison of Assorted Color Spaces for Multilevel Block Truncation Coding based Face Recognition

    OpenAIRE

    H.B. Kekre; Sudeep Thepade; Karan Dhamejani; Sanchit Khandelwal; Adnan Azmi

    2012-01-01

    The paper presents a performance analysis of Multilevel Block Truncation Coding based face recognition among widely used color spaces. In [1], Multilevel Block Truncation Coding was applied to the RGB color space up to four levels for face recognition. Better results were obtained when the proposed technique was implemented using Kekre's LUV (K'LUV) color space [25]. This was the motivation to test the proposed technique using assorted color spaces. For experimental analysis, two face databas...
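The building block of the technique, basic one-level Block Truncation Coding, can be sketched as below; the multilevel variant extends this over successive levels and applies it per color component (e.g. R, G, B or K'LUV). This is a minimal illustration, not the paper's exact feature pipeline:

```python
import numpy as np

# Sketch of one-level Block Truncation Coding (BTC): a block is reduced to
# a bitmap (pixels above/below the block mean) plus two reconstruction
# levels, the means of the two pixel groups. Those level values, computed
# per block and per color channel, form compact features for recognition.

def btc_block(block):
    """Encode a block as (bitmap, low_mean, high_mean)."""
    mean = block.mean()
    bitmap = block >= mean
    high = block[bitmap].mean() if bitmap.any() else mean
    low = block[~bitmap].mean() if (~bitmap).any() else mean
    return bitmap, low, high

block = np.array([[10.0, 200.0],
                  [30.0, 220.0]])
bitmap, low, high = btc_block(block)
```

Face matching then reduces to comparing these per-block (low, high) feature vectors between a probe image and the gallery, which is far cheaper than pixel-level comparison.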

  13. Performance analysis of simultaneous dense coding protocol under decoherence

    Science.gov (United States)

    Huang, Zhiming; Zhang, Cai; Situ, Haozhen

    2017-09-01

    The simultaneous dense coding (SDC) protocol is useful in designing quantum protocols. We analyze the performance of the SDC protocol under the influence of noisy quantum channels. Six kinds of paradigmatic Markovian noise along with one kind of non-Markovian noise are considered. The joint success probability of both receivers and the success probabilities of one receiver are calculated for three different locking operators. Some interesting properties have been found, such as invariance and symmetry. Among the three locking operators we consider, the SWAP gate is most resistant to noise and results in the same success probabilities for both receivers.

  14. WWER-440 fuel rod performance analysis with PIN-Micro and TRANSURANUS codes

    International Nuclear Information System (INIS)

    Vitkova, M.; Manolova, M.; Stefanova, S.; Simeonova, V.; Passage, G.; Lassmann, K.

    1994-01-01

    The PIN-micro and TRANSURANUS codes were used to analyse WWER-440 fuel rod behaviour under normal operating conditions. The two highest-loaded fuel rods of fuel assemblies irradiated in a WWER-440 with different power histories were selected. A set of the most probable average values of all geometrical and technological parameters was used, and a comparison between the PIN-micro and TRANSURANUS codes was performed using identical input data. The results for inner gas pressure, gap size, local linear heat rate, fuel central temperature and fission gas release as a function of time, calculated for the selected fuel rods, are presented. The following conclusions were drawn: 1) The PIN-micro code adequately predicts the thermal and mechanical behaviour of the two fuel rods; 2) The comparison of the results obtained by PIN-micro and TRANSURANUS shows reasonable agreement, and the discrepancies can be explained by the lack of a thorough WWER-oriented verification of TRANSURANUS; 3) The advanced TRANSURANUS code could be successfully applied to WWER fuel rod thermal and mechanical analysis after incorporation of all necessary WWER-specific material properties and models for the Zr+1%Nb cladding and for the fuel rod as a whole, and after validation against WWER experimental and operational data. 1 tab., 10 figs., 10 refs

  15. WWER-440 fuel rod performance analysis with PIN-Micro and TRANSURANUS codes

    Energy Technology Data Exchange (ETDEWEB)

    Vitkova, M; Manolova, M; Stefanova, S; Simeonova, V; Passage, G [Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. za Yadrena Izsledvaniya i Yadrena Energetika; Kharalampieva, Ts [Kombinat Atomna Energetika, Kozloduj (Bulgaria); Lassmann, K [European Atomic Energy Community, Karlsruhe (Germany). European Inst. for Transuranium Elements

    1994-12-31

    The PIN-micro and TRANSURANUS codes were used to analyse WWER-440 fuel rod behaviour under normal operating conditions. The two highest-loaded fuel rods of fuel assemblies irradiated in a WWER-440 with different power histories were selected. A set of the most probable average values of all geometrical and technological parameters was used, and a comparison between the PIN-micro and TRANSURANUS codes was performed using identical input data. The results for inner gas pressure, gap size, local linear heat rate, fuel central temperature and fission gas release as a function of time, calculated for the selected fuel rods, are presented. The following conclusions were drawn: (1) The PIN-micro code adequately predicts the thermal and mechanical behaviour of the two fuel rods; (2) The comparison of the results obtained by PIN-micro and TRANSURANUS shows reasonable agreement, and the discrepancies can be explained by the lack of a thorough WWER-oriented verification of TRANSURANUS; (3) The advanced TRANSURANUS code could be successfully applied to WWER fuel rod thermal and mechanical analysis after incorporation of all necessary WWER-specific material properties and models for the Zr+1%Nb cladding and for the fuel rod as a whole, and after validation against WWER experimental and operational data. 1 tab., 10 figs., 10 refs.

  16. Error-correction coding and decoding bounds, codes, decoders, analysis and applications

    CERN Document Server

    Tomlinson, Martin; Ambroze, Marcel A; Ahmed, Mohammed; Jibril, Mubarak

    2017-01-01

    This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies, from smartphones to secure communications and transactions. Written in a readily understandable style, the book presents the authors’ twenty-five years of research organized into five parts: Part I is concerned with the theoretical performance attainable by using error correcting codes to achieve communications efficiency in digital communications systems. Part II explores the construction of error-correcting codes and explains the different families of codes and how they are designed. Techniques are described for producing the very best codes. Part III addresses the analysis of low-density parity-check (LDPC) codes, primarily to calculate their stopping sets and low-weight codeword spectrum which determines the performance of these codes. Part IV deals with decoders desi...

  17. A Method to Reveal Fine-Grained and Diverse Conceptual Progressions during Learning

    Science.gov (United States)

    Lombard, François; Merminod, Marie; Widmer, Vincent; Schneider, Daniel K.

    2018-01-01

    Empirical data on learners' conceptual progression is required to design curricula and guide students. In this paper, we present the Reference Map Change Coding (RMCC) method for revealing students' progression at a fine-grained level. The method has been developed and tested through the analysis of successive versions of the productions of eight…

  18. A probabilistic analysis of PWR and BWR fuel rod performance using the code CASINO-SLEUTH

    International Nuclear Information System (INIS)

    Bull, A.J.

    1987-01-01

    This paper presents a brief description of the Monte Carlo and response surface techniques used in the code, and a probabilistic analysis of fuel rod performance in PWR and BWR applications. The analysis shows that fission gas release predictions are very sensitive to changes in certain of the code's inputs, identifies the most dominant input parameters and compares their effects in the two cases. (orig./HP)

  19. Development of a computer code for low-and intermediate-level radioactive waste disposal safety assessment

    International Nuclear Information System (INIS)

    Park, J. W.; Kim, C. L.; Lee, E. Y.; Lee, Y. M.; Kang, C. H.; Zhou, W.; Kozak, M. W.

    2002-01-01

    A safety assessment code, called SAGE (Safety Assessment Groundwater Evaluation), has been developed to describe post-closure radionuclide releases and potential radiological doses for low- and intermediate-level radioactive waste (LILW) disposal in an engineered vault facility in Korea. The conceptual model implemented in the code focuses on the release of radionuclides from a gradually degrading engineered barrier system to an underlying unsaturated zone, and thence to a saturated groundwater zone. The radionuclide transport equations are solved by spatially discretizing the disposal system into a series of compartments. Mass transfer between compartments occurs by diffusion/dispersion and advection. In all compartments, radionuclides decay either as a single-member chain or as multi-member chains. The biosphere is represented as a set of steady-state, radionuclide-specific pathway dose conversion factors that are multiplied by the appropriate release rate from the far field for each pathway. The code can treat input parameters either deterministically or probabilistically. Parameter input is achieved through a user-friendly Graphical User Interface. An application is presented and compared against safety assessment results from other computer codes, to benchmark the reliability of the code's system-level conceptual modeling.
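The single- and multi-member chain decay handled in each compartment follows the classic Bateman equations. A minimal sketch of the analytic two-member solution (an illustration of the underlying physics, not SAGE code; the function name and decay constants are invented):

```python
import math

def bateman_two_member(n1_0, lam1, lam2, t):
    """Analytic Bateman solution for a two-member decay chain
    parent (decay constant lam1) -> daughter (lam2) -> stable,
    assuming no initial daughter inventory. Returns (N1(t), N2(t))."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Illustrative decay constants (per year) over a 10-year interval.
n1, n2 = bateman_two_member(1.0e6, 0.05, 0.01, 10.0)
```

A compartment code integrates the same decay terms numerically alongside the transfer fluxes; the analytic form is mainly useful for verifying that integration.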

  20. Bad-good constraints on a polarity correspondence account for the spatial-numerical association of response codes (SNARC) and markedness association of response codes (MARC) effects.

    Science.gov (United States)

    Leth-Steensen, Craig; Citta, Richie

    2016-01-01

    Performance in numerical classification tasks involving either parity or magnitude judgements is quicker when small numbers are mapped onto a left-sided response and large numbers onto a right-sided response than for the opposite mapping (i.e., the spatial-numerical association of response codes or SNARC effect). Recent research by Gevers et al. [Gevers, W., Santens, S., Dhooge, E., Chen, Q., Van den Bossche, L., Fias, W., & Verguts, T. (2010). Verbal-spatial and visuospatial coding of number-space interactions. Journal of Experimental Psychology: General, 139, 180-190] suggests that this effect also arises for vocal "left" and "right" responding, indicating that verbal-spatial coding has a role to play in determining it. Another presumably verbal-based, spatial-numerical mapping phenomenon is the linguistic markedness association of response codes (MARC) effect whereby responding in parity tasks is quicker when odd numbers are mapped onto left-sided responses and even numbers onto right-sided responses. A recent account of both the SNARC and MARC effects is based on the polarity correspondence principle [Proctor, R. W., & Cho, Y. S. (2006). Polarity correspondence: A general principle for performance of speeded binary classification tasks. Psychological Bulletin, 132, 416-442]. This account assumes that stimulus and response alternatives are coded along any number of dimensions in terms of - and + polarities with quicker responding when the polarity codes for the stimulus and the response correspond. In the present study, even-odd parity judgements were made using either "left" and "right" or "bad" and "good" vocal responses. Results indicated that a SNARC effect was indeed present for the former type of vocal responding, providing further evidence for the sufficiency of the verbal-spatial coding account for this effect. However, the decided lack of an analogous SNARC-like effect in the results for the latter type of vocal responding provides an important

  1. Influence of Code Size Variation on the Performance of 2D Hybrid ZCC/MD in OCDMA System

    Directory of Open Access Journals (Sweden)

    Matem Rima.

    2018-01-01

    Full Text Available Several two-dimensional OCDMA codes have been developed to overcome problems in optical networks: enhancing cardinality, suppressing Multiple Access Interference (MAI) and mitigating Phase Induced Intensity Noise (PIIN). This paper proposes a new 2D hybrid ZCC/MD code that combines 1D ZCC spectral encoding, with code length M, and 1D MD spatial spreading, with code length N. According to the numerical results, the spatial spreading code length N offers good cardinality and is thus the main factor enhancing system performance, compared to the spectral code length M.

  2. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error-correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts showing the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart comparing the performance of several frame synchronizer algorithms to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  3. Heterogeneous performances of conceptual dis/continuity: a dialectic reading of Brown and Kloser's article

    Science.gov (United States)

    Hwang, Sungwon; Kim, Mijung

    2009-12-01

    We review Brown and Kloser's article, "Conceptual continuity and the science of baseball: using informal science literacy to promote students' science learning" from a Vygotskian cultural-historical and dialectic perspective. Brown and Kloser interpret interview data with student baseball players and claim that students' conceptual understanding articulated in vernacular genres involves continuities (similarities) with the canonical scientific explanations. In this commentary, we suggest that the authors' approach presupposes a dichotomy of the formal and the informal, which turns the authors' attention to continuity into a separation of cognition from language. We propose a Vygotskian approach that points out the problem of theorizing cognition (conceptual understanding) by depending on specific forms of representation (e.g., scientific terms). As an alternative, we envision a Vygotskian cultural-historical approach to language, which considers different, irreducible modes of communication as an integrated whole and therefore allows theorizing cognition without dichotomizing it from the concrete ways in which human beings communicate. We provide an exemplary analysis of a lecture in a university physics classroom and exemplify dialectic theories that explain the development of conceptual understanding. We argue that this Vygotskian dialectic approach shows that people communicate scientific concepts through hybridization, which does not reproduce a genre self-identically; the continuity of conceptual understanding involves dis/continuity.

  4. The development of the Nuclear Electric core performance and fault transient analysis code package in support of Sizewell B

    International Nuclear Information System (INIS)

    Hall, P.; Hutt, P.

    1994-01-01

    This paper describes Nuclear Electric's (NE) development of an integrated code package in support of all its reactors, including Sizewell B, designed for fuel management design, core performance studies, operational support and fault transient analysis. The package uses the NE general-purpose three-dimensional transient reactor physics code PANTHER, with cross-sections derived in the PWR case from the LWRWIMS LWR lattice neutronics code. The package also includes ENIGMA, a generic fuel performance code, and, for PWR application, VIPRE-01, a subchannel thermal-hydraulics code; RELAP5, the system thermal-hydraulics transient code; and SCORPIO, an on-line surveillance system. The paper describes the capabilities and validation of the elements of this package for PWR, how they are coupled within the package, and the way in which they are being applied for Sizewell B to on-line surveillance and fault transient analysis. (Author)

  5. Assessment of stainless steel 348 fuel rod performance against literature available data using TRANSURANUS code

    Directory of Open Access Journals (Sweden)

    Giovedi Claudia

    2016-01-01

    Full Text Available Early pressurized water reactors were originally designed to operate with stainless steel as the cladding material, but during their lifetime this material was replaced by zirconium-based alloys. However, after the Fukushima Daiichi accident, the problems with zirconium-based alloys due to hydrogen production and explosion under severe accident conditions highlighted the importance of assessing different materials. In this sense, initiatives such as the ATF (Accident Tolerant Fuel) program are considering different materials as fuel cladding, and one candidate is iron-based alloy. In order to assess the performance of fuel rods manufactured using an iron-based alloy as cladding material, it was necessary to select a specific stainless steel (type 348) and properly modify conventional fuel performance codes developed in recent decades. Accordingly, 348 stainless steel mechanical and physical properties were introduced into the TRANSURANUS code. The aim of this paper is to present the results of verifying the modified TRANSURANUS code version against data collected from the open literature for reactors which operated using stainless steel as cladding. Since some data were not available, some assumptions had to be made, and important differences relative to conventional fuel rods were taken into account. The obtained results for the cladding behaviour are in agreement with the available information. This constitutes evidence of the modified TRANSURANUS code's capability to perform investigations of fuel rods manufactured using 348 stainless steel as cladding material.

  6. A Linear Algebra Framework for Static High Performance Fortran Code Distribution

    Directory of Open Access Journals (Sweden)

    Corinne Ancourt

    1997-01-01

    Full Text Available High Performance Fortran (HPF) was developed to support data parallel programming for single-instruction multiple-data (SIMD) and multiple-instruction multiple-data (MIMD) machines with distributed memory. The programmer is provided a familiar uniform logical address space and specifies the data distribution by directives. The compiler then exploits these directives to allocate arrays in the local memories, to assign computations to elementary processors, and to migrate data between processors when required. We show here that linear algebra is a powerful framework to encode HPF directives and to synthesize distributed code with space-efficient array allocation, tight loop bounds, and vectorized communications for INDEPENDENT loops. The generated code includes traditional optimizations such as guard elimination, message vectorization and aggregation, and overlap analysis. The systematic use of an affine framework makes it possible to prove the compilation scheme correct.

  7. PERFORMANCE EVALUATION OF TURBO CODED OFDM SYSTEMS AND APPLICATION OF TURBO DECODING FOR IMPULSIVE CHANNEL

    Directory of Open Access Journals (Sweden)

    Savitha H. M.

    2010-09-01

    Full Text Available A comparison of the performance of hard- and soft-decision turbo coded Orthogonal Frequency Division Multiplexing systems with Quadrature Phase Shift Keying (QPSK) and 16-Quadrature Amplitude Modulation (16-QAM) is considered in the first section of this paper. The results show that the soft-decision method greatly outperforms the hard-decision method. The complexity of the demapper is reduced with the use of a simplified algorithm for 16-QAM demapping. In the later part of the paper, we consider the transmission of data over an additive white class A noise (AWAN) channel, using turbo coded QPSK and 16-QAM systems. We propose a novel turbo decoding scheme for the AWAN channel. We also compare the performance of turbo coded systems with QPSK and 16-QAM on the AWAN channel with two different channel values: one computed as per additive white Gaussian noise (AWGN) channel conditions and the other as per AWAN channel conditions. The results show that the use of the appropriate channel value in turbo decoding helps to combat impulsive noise more effectively. The proposed model for the AWAN channel exhibits comparable Bit Error Rate (BER) performance to the AWGN channel.
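The hard/soft distinction above comes down to what the demapper hands the turbo decoder: a sign only, or a log-likelihood ratio (LLR) that preserves reliability. A minimal sketch for Gray-mapped QPSK over AWGN (the textbook per-rail formula, not the paper's simplified 16-QAM algorithm; the received samples and noise variance are invented):

```python
import math

def qpsk_llrs(rx_i, rx_q, noise_var):
    """Per-bit LLRs for Gray-mapped QPSK over AWGN. Each quadrature rail
    carries one bit with amplitude 1/sqrt(2) (unit-energy symbols), so
    LLR = 2*a*r/sigma^2 = sqrt(2)*r/noise_var."""
    scale = math.sqrt(2.0) / noise_var
    return scale * rx_i, scale * rx_q

def hard_bit(llr):
    """Hard decision keeps only the sign, discarding reliability."""
    return 0 if llr >= 0 else 1

# A confidently received I-rail bit and a marginal Q-rail bit.
llr_i, llr_q = qpsk_llrs(0.7, -0.2, 0.5)
```

The soft values rank how trustworthy each bit is, which is exactly the information an iterative turbo decoder exploits and a hard decision throws away.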

  8. Application and analysis of performance of dqpsk advanced modulation format in spectral amplitude coding ocdma

    International Nuclear Information System (INIS)

    Memon, A.

    2015-01-01

    SAC (Spectral Amplitude Coding) is a technique of OCDMA (Optical Code Division Multiple Access) to encode and decode data bits by utilizing spectral components of the broadband source. Usually OOK (ON-Off-Keying) modulation format is used in this encoding scheme. To make SAC OCDMA network spectrally efficient, advanced modulation format of DQPSK (Differential Quaternary Phase Shift Keying) is applied, simulated and analyzed, m-sequence code is encoded in the simulated setup. Performance regarding various lengths of m-sequence code is also analyzed and displayed in the pictorial form. The results of the simulation are evaluated with the help of electrical constellation diagram, eye diagram and bit error rate graph. All the graphs indicate better transmission quality in case of advanced modulation format of DQPSK used in SAC OCDMA network as compared with OOK. (author)
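An m-sequence such as the one encoded above is generated by a maximal-length linear feedback shift register (LFSR). A small generic sketch (standard textbook construction, unrelated to the paper's simulation setup):

```python
def m_sequence(taps, degree, seed=1):
    """One period (2**degree - 1 chips) of a maximal-length sequence from
    a Fibonacci LFSR. `taps` are 1-indexed feedback stages and must
    correspond to a primitive polynomial for the sequence to be maximal."""
    state = seed
    out = []
    for _ in range(2 ** degree - 1):
        out.append(state & 1)          # output the low-order stage
        fb = 0
        for t in taps:
            fb ^= (state >> (t - 1)) & 1
        state = (state >> 1) | (fb << (degree - 1))
    return out

# Degree 3, primitive polynomial x^3 + x + 1 (taps 3 and 1): period 7.
chips = m_sequence([3, 1], 3)
```

Maximal-length sequences are balanced (2^(n-1) ones versus 2^(n-1) - 1 zeros per period), which underlies their sharp autocorrelation and their use as spreading codes.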

  9. Study of nuclear computer code maintenance and management system

    International Nuclear Information System (INIS)

    Ryu, Chang Mo; Kim, Yeon Seung; Eom, Heung Seop; Lee, Jong Bok; Kim, Ho Joon; Choi, Young Gil; Kim, Ko Ryeo

    1989-01-01

    Software maintenance has been one of the most important problems since the late 1970s. We wish to develop a nuclear computer code system to maintain and manage KAERI's nuclear software. As a part of this system, we have developed three code management programs for use on CYBER and PC systems. They are used in the systematic management of computer codes at KAERI. The first program is implemented on the CYBER system to rapidly provide information on nuclear codes to users. The second and third programs were implemented on the PC system for the code manager and for the management of data in the Korean language, respectively. In the requirement analysis, we defined each code, magnetic tape, manual and abstract information data. In the conceptual design, we designed retrieval, update, and output functions. In the implementation design, we described the technical considerations for database programs, utilities, and directions for the use of the databases. As a result of this research, we compiled the status of the nuclear computer codes held by KAERI as of September 1988. Thus, by using these three database programs, we can provide nuclear computer code information to users more rapidly. (Author)

  10. L1 and L2 Picture Naming in Mandarin-English Bilinguals: A Test of Bilingual Dual Coding Theory

    Science.gov (United States)

    Jared, Debra; Poh, Rebecca Pei Yun; Paivio, Allan

    2013-01-01

    This study examined the nature of bilinguals' conceptual representations and the links from these representations to words in L1 and L2. Specifically, we tested an assumption of the Bilingual Dual Coding Theory that conceptual representations include image representations, and that learning two languages in separate contexts can result in…

  11. Combinatorial neural codes from a mathematical coding theory perspective.

    Science.gov (United States)

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
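The error-correction question the authors examine is the standard minimum-Hamming-distance decoding problem. A toy sketch (with an invented two-word codebook, not an actual receptive field code):

```python
def hamming(a, b):
    """Number of positions where two equal-length binary words differ."""
    return sum(x != y for x, y in zip(a, b))

def nearest_codeword(word, codebook):
    """Decode a possibly corrupted word to the closest codeword."""
    return min(codebook, key=lambda c: hamming(word, c))

# Toy codebook with distance 5 between codewords: corrects up to 2 flips.
codebook = [(0, 0, 0, 0, 0), (1, 1, 1, 1, 1)]
decoded = nearest_codeword((1, 0, 1, 1, 0), codebook)  # two flips from all-ones
```

A code corrects errors well only when its codewords are far apart; the paper's observation is that the redundancy in receptive field codes is spent on reflecting stimulus geometry rather than on maximizing inter-codeword distance.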

  12. COMPASS: A source term code for investigating capillary barrier performance

    International Nuclear Information System (INIS)

    Zhou, Wei; Apted, J.J.

    1996-01-01

    A computer code, COMPASS, based on a compartment-model approach, has been developed to calculate the near-field source term of a high-level-waste repository under unsaturated conditions. COMPASS is applied to evaluate the expected performance of Richard's (capillary) barriers as backfills to divert infiltrating groundwater at Yucca Mountain. A comparison of the release rates of four typical nuclides with and without the Richard's barrier shows that the barrier significantly decreases the peak release rates from the Engineered-Barrier-System (EBS) into the host rock.
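A compartment source-term model of this kind reduces to linear first-order mass transfer between boxes. The sketch below (a generic explicit-Euler toy with invented rates, not COMPASS itself) shows the bookkeeping, including the release from the last compartment into a host-rock sink:

```python
def simulate_compartments(m0, k, dt, steps):
    """Explicit-Euler integration of a linear compartment chain:
    compartment i transfers mass to i+1 at first-order rate k[i];
    the last rate releases mass out of the modeled system."""
    m = list(m0)
    released = 0.0
    for _ in range(steps):
        # Fluxes are computed from the current inventories, then applied.
        flux = [k[i] * m[i] * dt for i in range(len(m))]
        for i in range(len(m)):
            m[i] -= flux[i]
            if i + 1 < len(m):
                m[i + 1] += flux[i]
            else:
                released += flux[i]
    return m, released

# Two compartments, unit initial inventory in the first (arbitrary rates).
masses, released = simulate_compartments([1.0, 0.0], [0.02, 0.01], 0.1, 1000)
```

Because each flux is simply moved between accounts, total mass (inventories plus cumulative release) is conserved to rounding error, a useful sanity check for any compartment solver.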

  13. Performance analysis of 2D asynchronous hard-limiting optical code-division multiple access system through atmospheric scattering channel

    Science.gov (United States)

    Zhao, Yaqin; Zhong, Xin; Wu, Di; Zhang, Ye; Ren, Guanghui; Wu, Zhilu

    2013-09-01

    Optical code-division multiple access (OCDMA) systems usually allocate orthogonal or quasi-orthogonal codes to the active users. When transmitting through atmospheric scattering channel, the coding pulses are broadened and the orthogonality of the codes is worsened. In truly asynchronous case, namely both the chips and the bits are asynchronous among each active user, the pulse broadening affects the system performance a lot. In this paper, we evaluate the performance of a 2D asynchronous hard-limiting wireless OCDMA system through atmospheric scattering channel. The probability density function of multiple access interference in truly asynchronous case is given. The bit error rate decreases as the ratio of the chip period to the root mean square delay spread increases and the channel limits the bit rate to different levels when the chip period varies.

  14. Enhancing the performance of the light field microscope using wavefront coding.

    Science.gov (United States)

    Cohen, Noy; Yang, Samuel; Andalman, Aaron; Broxton, Michael; Grosenick, Logan; Deisseroth, Karl; Horowitz, Mark; Levoy, Marc

    2014-10-06

    Light field microscopy has been proposed as a new high-speed volumetric computational imaging method that enables reconstruction of 3-D volumes from captured projections of the 4-D light field. Recently, a detailed physical optics model of the light field microscope has been derived, which led to the development of a deconvolution algorithm that reconstructs 3-D volumes with high spatial resolution. However, the spatial resolution of the reconstructions has been shown to be non-uniform across depth, with some z planes showing high resolution and others, particularly at the center of the imaged volume, showing very low resolution. In this paper, we enhance the performance of the light field microscope using wavefront coding techniques. By including phase masks in the optical path of the microscope we are able to address this non-uniform resolution limitation. We have also found that superior control over the performance of the light field microscope can be achieved by using two phase masks rather than one, placed at the objective's back focal plane and at the microscope's native image plane. We present an extended optical model for our wavefront coded light field microscope and develop a performance metric based on Fisher information, which we use to choose adequate phase masks parameters. We validate our approach using both simulated data and experimental resolution measurements of a USAF 1951 resolution target; and demonstrate the utility for biological applications with in vivo volumetric calcium imaging of larval zebrafish brain.

  15. Improving performance and reducing costs of cooling towers

    International Nuclear Information System (INIS)

    Bartz, J.A.

    1992-01-01

    Cooling towers represent a significant capital investment at a steam electric power station. In addition, deficiencies in thermal performance can result in major operating penalties of fuel cost, replacement energy, and capacity addition. This paper summarizes two recent EPRI research projects aimed at reducing thermal performance deficiencies and decreasing installed costs of evaporative cooling towers. First, EPRI Research Project 2113, Cooling Tower Performance Prediction and Improvement, is summarized. This project has resulted in published data sets on the measured thermal performance characteristics of a variety of cooling tower packings, computer codes to predict tower performance, and computer code validation through large-scale tower performance measurements. Principal results are contained in an EPRIGEMS software module, Cooling Tower Advisor. This PC-based software contains a tutorial plus codes to predict tower thermal performance, arranged in a user-friendly format. The second EPRI effort, Research Project 2819-10/11, Fabric Structures for Power Plant Applications, has resulted in designs and costs of large structures with shells constructed of recently-developed fabrics. Primary power plant applications for such structures are the shells of natural draft cooling towers and coal-pile covers. Fabric structures offer low initial cost, acceptable life, and seismic superiority, among other advantages. Detailed conceptual designs and installed cost data are reviewed. 8 refs., 9 figs., 3 tabs

  16. Sub-Transport Layer Coding

    DEFF Research Database (Denmark)

    Hansen, Jonas; Krigslund, Jeppe; Roetter, Daniel Enrique Lucani

    2014-01-01

    Packet losses in wireless networks dramatically curb the performance of TCP. This paper introduces a simple coding shim that aids IP-layer traffic in lossy environments while being transparent to transport layer protocols. The proposed coding approach enables erasure correction while being oblivious to the congestion control algorithms of the utilised transport layer protocol. Although our coding shim is indifferent towards the transport layer protocol, we focus on the performance of TCP when run on top of our proposed coding mechanism due to its widespread use. The coding shim provides gains...
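The simplest erasure-correcting shim one could place below TCP adds a single XOR repair packet per generation of data packets. The actual scheme in the paper is more sophisticated (network-coding based), so the sketch below illustrates only the principle, with invented packet contents:

```python
def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(packets):
    """XOR repair packet over a generation of equal-length packets."""
    parity = bytes(len(packets[0]))
    for p in packets:
        parity = xor_bytes(parity, p)
    return parity

def recover(received, parity):
    """Rebuild a single erased packet (marked None) from the survivors
    plus the repair packet."""
    rebuilt = parity
    for p in received:
        if p is not None:
            rebuilt = xor_bytes(rebuilt, p)
    return rebuilt

generation = [b"pkt1", b"pkt2", b"pkt3"]
repair = make_parity(generation)
recovered = recover([generation[0], None, generation[2]], repair)
```

The key property for TCP is that a loss repaired below the transport layer never surfaces as a duplicate-ACK or timeout, so the congestion controller is not spuriously throttled.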

  17. Intelligent Patching of Conceptual Geometry for CFD Analysis

    Science.gov (United States)

    Li, Wu

    2010-01-01

    The iPatch computer code for intelligently patching surface grids was developed to convert conceptual geometry to computational fluid dynamics (CFD) geometry (see figure). It automatically uses bicubic B-splines to extrapolate (if necessary) each surface in a conceptual geometry so that all the independently defined geometric components (such as wing and fuselage) can be intersected to form a watertight CFD geometry. The software also computes the intersection curves of surface patches at any resolution (up to 10^-4 accuracy) specified by the user, and it writes the B-spline surface patches, and the corresponding boundary points, for the watertight CFD geometry in the format that can be directly used by the grid generation tool VGRID. iPatch requires that input geometry be in PLOT3D format, where each component surface is defined by a rectangular grid {(x(i,j), y(i,j), z(i,j)): 1 less than or equal to i less than or equal to m, 1 less than or equal to j less than or equal to n} that represents a smooth B-spline surface. All surfaces in the PLOT3D file conceptually represent a watertight geometry of components of an aircraft on the half-space y greater than or equal to 0. Overlapping surfaces are not allowed, but can be fixed by a utility code "fixp3d". The fixp3d utility code first finds the two grid lines on the two surface grids that are closest to each other in Hausdorff distance (a metric for measuring the discrepancy between two sets); it then uses one of the grid lines as the transition line, extending grid lines on one grid to the other grid to form a merged grid. Any two connecting surfaces shall have a "visually" common boundary curve, or can be described by an intersection relationship defined in a geometry specification file. The intersection of two surfaces can be at a conceptual level.
However, the intersection is directional (along either i or j index direction), and each intersecting grid line (or its spine extrapolation) on the first surface should intersect
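The Hausdorff distance that fixp3d reportedly uses to pick the closest pair of grid lines follows directly from its definition, computed brute-force over the discrete grid points. The sample lines below are invented:

```python
def hausdorff(A, B):
    """Discrete (symmetric) Hausdorff distance between two point sets,
    given as lists of (x, y, z) tuples. Brute force is fine at grid scale."""
    def d(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    def directed(X, Y):
        # Worst-case distance from a point of X to its nearest point of Y.
        return max(min(d(p, q) for q in Y) for p in X)
    return max(directed(A, B), directed(B, A))

# Two nearly parallel grid lines; the second extends one cell further.
lineA = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
lineB = [(0.0, 0.1, 0.0), (1.0, 0.1, 0.0), (2.0, 0.1, 0.0)]
h = hausdorff(lineA, lineB)
```

Unlike a nearest-point distance, the Hausdorff metric penalizes any portion of either line that has no close counterpart on the other, which is why it suits the "closest pair of grid lines" search.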

  18. Analyses with the FSTATE code: fuel performance in destructive in-pile experiments

    International Nuclear Information System (INIS)

    Bauer, T.H.; Meek, C.C.

    1982-01-01

    Thermal-mechanical analysis of a fuel pin is an essential part of the evaluation of fuel behavior during hypothetical accident transients. The FSTATE code has been developed to provide this computational capability in situations lacking azimuthal symmetry about the fuel-pin axis by performing two-dimensional thermal, mechanical, and fission gas release and redistribution computations for a wide range of possible transient conditions. In this paper, recent code developments are described and the code is applied to in-pile experiments undertaken to study fast-reactor fuel under accident conditions. Three accident simulations, including fast and slow ramp-rate overpower as well as a loss-of-cooling accident sequence, are used as representative examples, and the FSTATE computations are interpreted relative to the experimental observations.

  19. Performance of the coupled thermalhydraulics/neutron kinetics code R/P/C on workstation clusters and multiprocessor systems

    International Nuclear Information System (INIS)

    Hammer, C.; Paffrath, M.; Boeer, R.; Finnemann, H.; Jackson, C.J.

    1996-01-01

    The light water reactor core simulation code PANBOX has been coupled with the transient analysis code RELAP5 for the purpose of performing plant safety analyses with a three-dimensional (3-D) neutron kinetics model. The system has been parallelized to improve the computational efficiency. The paper describes the features of this system with emphasis on performance aspects. Performance results are given for different types of parallelization, i.e., for using an automatic parallelizing compiler, using the portable PVM platform on a workstation cluster, using PVM on a shared-memory multiprocessor, and using machine-dependent interfaces. (author)

  20. NWTS conceptual reference repository description (CRRD). Volume III. Criteria, bases, special studies, and codes

    International Nuclear Information System (INIS)

    1981-05-01

    This volume documents the criteria, design bases, and special studies and provides the backup for the design presented in Volume II. The criteria presented here were developed by ONWI as a draft version for consideration in this conceptual report. Since these criteria were developed subsequent to preparation of the reports used as a basis for the CRRD, not all of the criteria could be fully considered in preparation of the CRRD. However, they were used as guidelines wherever possible. The criteria for terminal storage of waste are still in development. The chapter on the design bases identifies the important design considerations and provides the justification for their selection. The design bases were developed not so much to give exact values for parameters as to identify the parameters that are significant to the design. They also serve as a common basis for coordinating analysis and design studies until the next design phase is completed. Some of the design bases presented here were taken directly from the Stearns-Roger NWTS-R1 Conceptual Design Report. The special studies document technical aspects of the design that are of particular importance or that furnish additional information pertaining to the design.

  1. UNA MIRADA CONCEPTUAL A LA GENERACIÓN AUTOMÁTICA DE CÓDIGO UMA ABORDAGEM CONCEITUAL À GERAÇÃO AUTOMÁTICA DE CÓDIGO A CONCEPTUAL APPROACH TO AUTOMATIC GENERATION OF CODE

    Directory of Open Access Journals (Sweden)

    Carlos Mario Zapata

    2010-07-01

    …difficult for the client to understand, which prevents validation in the early stages of development. Automated code generation is fostered by several software development methods. This generation is often supplied by well-known CASE (Computer-Aided Software Engineering) tools. However, full automation is still far off, and some CASE tools are complemented by non-standard modeling projects. In this paper, we conceptualize projects related to automated code generation, starting from discourse representations in either controlled or natural language, or in conceptual schemas. We present a graphical summary of the crucial concepts related to this issue by means of a state-of-the-art review. We conclude that automated code generation usually begins from solution-based representations of the problem instead of domain-based representations. We also find that these starting points are misunderstood by the client, and this situation leads to poor validation in the early stages of the software development lifecycle.

  2. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Science.gov (United States)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power in the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the

  3. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
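
    The workflow the DLL implements (take inputs, write an input file, run the external code, read outputs back) can be sketched in a few lines. The Python below only illustrates that pattern; the real DLLExternalCode is a compiled DLL driven by GoldSim and an instructions file, and the file names and value-doubling "external code" here are invented for the example.

```python
import subprocess
import sys
import tempfile
from pathlib import Path

def run_external_code(inputs, workdir):
    """Illustrate the DLLExternalCode workflow: write an input file, run an
    external program, and read its outputs back. The 'external code' here is
    just a Python one-liner that doubles each value; in the real DLL an
    instructions file specifies how to format inputs and parse outputs."""
    workdir = Path(workdir)
    infile = workdir / "model.in"
    outfile = workdir / "model.out"
    # 1. Create the input file from the list of input values.
    infile.write_text("\n".join(str(v) for v in inputs))
    # 2. Run the external application (stand-in for the real simulation code).
    script = (
        "import sys; vals=[float(x) for x in open(sys.argv[1])]; "
        "open(sys.argv[2],'w').write('\\n'.join(str(2*v) for v in vals))"
    )
    subprocess.run([sys.executable, "-c", script, str(infile), str(outfile)],
                   check=True)
    # 3. Read the outputs back for return to the caller (GoldSim, in concept).
    return [float(line) for line in outfile.read_text().splitlines()]

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        print(run_external_code([1.0, 2.5], tmp))  # [2.0, 5.0]
```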

  4. Application and Analysis of Performance of DQPSK Advanced Modulation Format in Spectral Amplitude Coding OCDMA

    Directory of Open Access Journals (Sweden)

    Abdul Latif Memon

    2014-04-01

    Full Text Available SAC (Spectral Amplitude Coding) is a technique of OCDMA (Optical Code Division Multiple Access) to encode and decode data bits by utilizing spectral components of the broadband source. Usually the OOK (On-Off Keying) modulation format is used in this encoding scheme. To make a SAC OCDMA network spectrally efficient, the advanced modulation format DQPSK (Differential Quaternary Phase Shift Keying) is applied, simulated and analyzed. An m-sequence code is encoded in the simulated setup. Performance for various lengths of the m-sequence code is also analyzed and displayed in pictorial form. The results of the simulation are evaluated with the help of the electrical constellation diagram, eye diagram and bit error rate graph. All the graphs indicate better transmission quality for the advanced modulation format DQPSK used in the SAC OCDMA network as compared with OOK.
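
    The m-sequence codes used in the simulated setup are maximal-length sequences produced by a linear-feedback shift register. The sketch below generates one period using the primitive polynomial x^4 + x + 1; the polynomial and register length are arbitrary choices for illustration, not the code lengths analyzed in the paper.

```python
def m_sequence(taps=(4, 1), seed=(1, 0, 0, 0)):
    """Generate one period of a maximal-length (m-)sequence with a Fibonacci
    LFSR. taps=(4, 1) corresponds to the primitive polynomial x^4 + x + 1,
    giving a period of 2**4 - 1 = 15 chips."""
    r = list(seed)
    n = len(r)
    out = []
    for _ in range(2 ** n - 1):
        out.append(r[-1])                      # output the last stage
        fb = r[taps[0] - 1] ^ r[taps[1] - 1]   # XOR of the tapped stages
        r = [fb] + r[:-1]                      # shift right, insert feedback
    return out

seq = m_sequence()
print(seq)       # one full period of 15 chips
print(sum(seq))  # balance property: 8 ones vs 7 zeros
```

    A length-2^n - 1 m-sequence always contains 2^(n-1) ones, the balance property that underlies its attractive correlation behavior for spectral amplitude coding.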

  5. Development of a coupled code system based on system transient code, RETRAN, and 3-D neutronics code, MASTER

    International Nuclear Information System (INIS)

    Kim, K. D.; Jung, J. J.; Lee, S. W.; Cho, B. O.; Ji, S. K.; Kim, Y. H.; Seong, C. K.

    2002-01-01

    A coupled code system, RETRAN/MASTER, has been developed for best-estimate simulations of the interactions between reactor core neutron kinetics and plant thermal-hydraulics by incorporating the 3-D reactor core kinetics analysis code MASTER into the system transient code RETRAN. The soundness of the consolidated code system is confirmed by simulating the OECD/NEA MSLB benchmark problem, which was developed to verify the performance of coupled kinetics and system transient codes.

  6. CONCEPTUAL FRAMEWORK FOR IMPROVING BUSINESS PERFORMANCE WITH LEAN MANUFACTURING AND SUCCESSFUL HUMAN FACTORS INTERVENTIONS-A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Rajiv Sharm

    2012-09-01

    Full Text Available Nowadays organizations compete between themselves in various categories such as faster delivery, price tags, state-of-the-art technology and higher quality dimensions. A conceptual framework with lean manufacturing and human factors interventions for improving business performance in terms of improved quality, reduced cost and faster delivery is presented, and examples from the literature are given to illustrate the desired situation in which ergonomics is considered as an integrated part of performance strategy. A case from an industry engaged in manufacturing shafts using lean manufacturing practices with successful ergonomic or human factors interventions is also investigated.

  7. The conceptualization model problem—surprise

    Science.gov (United States)

    Bredehoeft, John

    2005-03-01

    The foundation of model analysis is the conceptual model. Surprise is defined as new data that renders the prevailing conceptual model invalid; as defined here it represents a paradigm shift. Limited empirical data indicate that surprises occur in 20-30% of model analyses. These data suggest that groundwater analysts have difficulty selecting the appropriate conceptual model. There is no ready remedy to the conceptual model problem other than (1) to collect as much data as is feasible, using all applicable methods—a complementary data collection methodology can lead to new information that changes the prevailing conceptual model, and (2) for the analyst to remain open to the fact that the conceptual model can change dramatically as more information is collected. In the final analysis, the hydrogeologist makes a subjective decision on the appropriate conceptual model. The conceptualization problem does not render models unusable. The problem introduces an uncertainty that often is not widely recognized. Conceptual model uncertainty is exacerbated in making long-term predictions of system performance.

  8. Conceptual framework in creating and selecting the performance measurement system for marketing strategy control

    Directory of Open Access Journals (Sweden)

    Krstić Bojan

    2011-01-01

    Full Text Available Performance measurement in modern conditions is one of the most important business requirements, since enterprises face the need to exhibit returns for stockholders and investors, but also the contribution of management to those returns, as well as the contribution of certain business units, functional departments and activities within them. Hence, it is particularly important to assess marketing successfulness as a business function according to return on investment in marketing activities, but also according to a set of indicators from the following performance groups - marketing effectiveness and marketing efficiency. The core issue is which measures to select and use in the marketing performance measurement system of a certain enterprise so that it can assess how effective and efficient its marketing is. In other words, an adequate performance measurement system ought to contain performance measures that will be used to monitor effects and the marketing strategy implementation process (controlling while implementing), and performance measures that can be applied to overall effect monitoring after the strategy implementation period. However, creating the marketing performance measurement system is a complex task for marketing managers. Therefore, the aim of this paper is to explore key principles and develop a conceptual framework for creating and selecting a performance measurement system for marketing strategy control which is based on the characteristics and key success factors of the marketing strategy, that is, the activities and actions for operationalizing and effectively implementing it.

  9. Development of subchannel analysis code MATRA-LMR for KALIMER subassembly thermal-hydraulics

    International Nuclear Information System (INIS)

    Won-Seok Kim; Young-Gyun Kim

    2000-01-01

    In sodium-cooled liquid metal reactors, design limits are imposed on the maximum temperatures of claddings and fuel pins. Thus an accurate prediction of the core coolant/fuel temperature distribution is essential to LMR core thermal-hydraulic design. The detailed subchannel thermal-hydraulic analysis code MATRA-LMR (Multichannel Analyzer for Steady States and Transients in Rod Arrays for Liquid Metal Reactors) is being developed for KALIMER core design and analysis, based on COBRA-IV-i and MATRA. The major modifications and improvements implemented in MATRA-LMR are as follows: a) nonuniform axial noding capability, b) a sodium properties calculation subprogram, c) sodium coolant heat transfer correlations, and d) recent pressure drop correlations, such as Novendstern, Chiu-Rohsenow-Todreas and Cheng-Todreas. To assess the development status of this code, benchmark calculations were performed with the ORNL 19-pin tests and EBR-II seven-assembly SLTHEN calculation results. The calculation results of MATRA-LMR for the ORNL 19-pin assembly tests and EBR-II 91-pin experiments were compared to the measurements, and to SABRE4 and SLTHEN code calculation results, respectively. In this comparison, differences are found among the three codes because of the pressure drop and thermal mixing modeling. Finally, the major technical results of the conceptual design for the KALIMER 98.03 core have been compared with the calculations of the MATRA-LMR, SABRE4 and SLTHEN codes. (author)
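
    For a single isolated channel, the coolant temperature prediction at the heart of such a subchannel code reduces to an axial energy balance. The Python sketch below shows only that single-channel balance with invented sodium numbers; MATRA-LMR's actual models add inter-channel cross-flow, turbulent mixing, and the pressure-drop correlations listed above.

```python
def axial_coolant_temps(t_in, m_dot, cp, q_lin, dz):
    """March an inlet-to-outlet energy balance up one coolant channel:
    T[j+1] = T[j] + q'_j * dz / (m_dot * cp),
    where q'_j is the linear heat rate (W/m) in axial node j. This is the
    simplest single-channel balance; a subchannel code like MATRA-LMR adds
    inter-channel cross-flow and mixing terms on top of it."""
    temps = [t_in]
    for q in q_lin:
        temps.append(temps[-1] + q * dz / (m_dot * cp))
    return temps

# Illustrative numbers only: 0.1 kg/s of sodium (cp ~ 1250 J/kg-K),
# uniform 20 kW/m over ten 0.1 m nodes -> 2 kW per node, 16 K rise each.
temps = axial_coolant_temps(t_in=633.0, m_dot=0.1, cp=1250.0,
                            q_lin=[20e3] * 10, dz=0.1)
print(round(temps[-1], 1))  # 793.0 K outlet
```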

  10. Early Experiences Writing Performance Portable OpenMP 4 Codes

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, Wayne [ORNL; Hernandez, Oscar R [ORNL

    2016-01-01

    In this paper, we evaluate the recently available directives in OpenMP 4 to parallelize a computational kernel using both the traditional shared memory approach and the newer accelerator targeting capabilities. In addition, we explore various transformations that attempt to increase application performance portability, and examine the expressiveness and performance implications of using these approaches. For example, we want to understand if the target map directives in OpenMP 4 improve data locality when mapped to a shared memory system, as opposed to the first-touch policy approach in traditional OpenMP. To that end, we use recent Cray and Intel compilers to measure the performance variations of a simple application kernel when executed on the OLCF's Titan supercomputer with NVIDIA GPUs and the Beacon system with Intel Xeon Phi accelerators attached. To better understand these trade-offs, we compare our results from traditional OpenMP shared memory implementations to the newer accelerator programming model when it is used to target both the CPU and an attached heterogeneous device. We believe the results and lessons learned as presented in this paper will be useful to the larger user community by providing guidelines that can assist programmers in the development of performance portable code.

  11. COSMO: a conceptual framework for service modelling and refinement

    NARCIS (Netherlands)

    Quartel, Dick; Steen, Maarten W.A.; Pokraev, S.; van Sinderen, Marten J.

    This paper presents a conceptual framework for service modelling and refinement, called the COSMO (COnceptual Service MOdelling) framework. This framework provides concepts to model and reason about services, and to support operations, such as composition and discovery, which are performed on them.

  12. Tactile communication, cooperation, and performance: an ethological study of the NBA.

    Science.gov (United States)

    Kraus, Michael W; Huang, Cassey; Keltner, Dacher

    2010-10-01

    Tactile communication, or physical touch, promotes cooperation between people, communicates distinct emotions, soothes in times of stress, and is used to make inferences of warmth and trust. Based on this conceptual analysis, we predicted that in group competition, physical touch would predict increases in both individual and group performance. In an ethological study, we coded the touch behavior of players from the National Basketball Association (NBA) during the 2008-2009 regular season. Consistent with hypotheses, early season touch predicted greater performance for individuals as well as teams later in the season. Additional analyses confirmed that touch predicted improved performance even after accounting for player status, preseason expectations, and early season performance. Moreover, coded cooperative behaviors between teammates explained the association between touch and team performance. Discussion focused on the contributions touch makes to cooperative groups and the potential implications for other group settings. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  13. User's manual for the vertical axis wind turbine performance computer code DARTER

    Energy Technology Data Exchange (ETDEWEB)

    Klimas, P. C.; French, R. E.

    1980-05-01

    The computer code DARTER (DARrieus Turbine, Elemental Reynolds number) is an aerodynamic performance/loads prediction scheme based upon the conservation of momentum principle. It is the latest evolution in a sequence which began with a model developed by Templin of NRC, Canada and progressed through the Sandia National Laboratories-developed SIMOSS (SImple MOmentum, Single Streamtube) and DART (DARrieus Turbine) to DARTER.

  14. Automatic generation of data merging program codes.

    OpenAIRE

    Hyensook, Kim; Oussena, Samia; Zhang, Ying; Clark, Tony

    2010-01-01

    Data merging is an essential part of ETL (Extract-Transform-Load) processes to build a data warehouse system. To avoid reinventing merging techniques, we propose a Data Merging Meta-model (DMM) and its transformation into executable program codes in the manner of model-driven engineering. DMM allows defining relationships of different model entities and their merging types at the conceptual level. Our formalized transformation, described using ATL (ATLAS Transformation Language), enables automatic g...

  15. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross correlation (CC) and practical code length to support high numbers of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are getting more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also suppress the effect of phase induced intensity noise (PIIN). In this paper, we have proposed new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC based on the Jordan block matrix with simple algebraic methods. Four sets of DEU code families based on the code weight W and number of users N for the combinations (even, even), (even, odd), (odd, odd) and (odd, even) are constructed. This combination gives the DEU code more flexibility in the selection of code weight and number of users. These features make this code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms reported codes. In addition, simulation results taken from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU has better performance and could support long spans with high data rates.

  16. Effect of beat noise on the performance of two-dimensional time-spreading/wavelength-hopping optical code-division multiple-access systems

    Science.gov (United States)

    Bazan, T.; Harle, D.; Andonovic, I.; Meenakshi, M.

    2005-03-01

    The effect of beat noise on optical code-division multiple-access (OCDMA) systems using a range of two-dimensional (2-D) time-spreading/wavelength-hopping (TW) code families is presented. A derivation of a general formula for the error probability of the system is given. The properties of the 2-D codes--namely, the structure, length, and cross-correlation characteristics--are found to have a great influence on system performance. Improved performance can be obtained by use of real-time dynamic thresholding.

  17. Development of an object-oriented simulation code for repository performance assessment

    International Nuclear Information System (INIS)

    Tsujimoto, Keiichi; Ahn, J.

    1999-01-01

    As understanding for mechanisms of radioactivity confinement by a deep geologic repository improves at the individual process level, it has become imperative to evaluate consequences of individual processes to the performance of the whole repository system. For this goal, the authors have developed a model for radionuclide transport in, and release from, the repository region by incorporating multiple-member decay chains and multiple waste canisters. A computer code has been developed with C++, an object-oriented language. By utilizing the feature that a geologic repository consists of thousands of objects of the same kind, such as the waste canister, the repository region is divided into multiple compartments and objects for simulation of radionuclide transport. Massive computational tasks are distributed over, and executed by, multiple networked workstations, with the help of parallel virtual machine (PVM) technology. Temporal change of the mass distribution of 28 radionuclides in the repository region for the time period of 100 million yr has been successfully obtained by the code
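
    The multiple-member decay chains mentioned in the abstract are governed by the Bateman equations. The sketch below evaluates the analytic two-member solution, the building block a transport code of this kind evaluates over many nuclides and compartments; the decay constants are illustrative values, not data from the study.

```python
import math

def bateman_two_member(n0, lam1, lam2, t):
    """Analytic Bateman solution for a two-member decay chain A -> B -> ...
    with initial inventory n0 of A and none of B:
        N_A(t) = n0 * exp(-lam1 * t)
        N_B(t) = n0 * lam1 / (lam2 - lam1) * (exp(-lam1 * t) - exp(-lam2 * t))
    """
    n_a = n0 * math.exp(-lam1 * t)
    n_b = n0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n_a, n_b

# Illustrative rates only (per year): the daughter decays 10x faster.
n_a, n_b = bateman_two_member(n0=1.0, lam1=1e-4, lam2=1e-3, t=1000.0)
print(n_a, n_b)
```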

  18. Structural Analysis in a Conceptual Design Framework

    Science.gov (United States)

    Padula, Sharon L.; Robinson, Jay H.; Eldred, Lloyd B.

    2012-01-01

    Supersonic aircraft designers must shape the outer mold line of the aircraft to improve multiple objectives, such as mission performance, cruise efficiency, and sonic-boom signatures. Conceptual designers have demonstrated an ability to assess these objectives for a large number of candidate designs. Other critical objectives and constraints, such as weight, fuel volume, aeroelastic effects, and structural soundness, are more difficult to address during the conceptual design process. The present research adds both static structural analysis and sizing to an existing conceptual design framework. The ultimate goal is to include structural analysis in the multidisciplinary optimization of a supersonic aircraft. Progress towards that goal is discussed and demonstrated.

  19. Computer-aided software understanding systems to enhance confidence of scientific codes

    International Nuclear Information System (INIS)

    Sheng, G.; Oeren, T.I.

    1991-01-01

    A unique characteristic of nuclear waste disposal is the very long time span over which the combined engineered and natural containment system must remain effective: hundreds of thousands of years. Since there is no precedent in human history for such an endeavour, simulation with the use of computers is the only means we have of forecasting possible future outcomes quantitatively. The need for reliable models and software to make such forecasts so far into the future is obvious. One of the critical elements necessary to ensure reliability is the degree of reviewability of the computer program. Among others, there are two very important reasons for this. Firstly, if there is to be any chance at all of validating the conceptual models as implemented by the computer code, peer reviewers must be able to see and understand what the program is doing. It is all but impossible to achieve this understanding by just looking at the code due to possible unfamiliarity with the language and often due as well to the length and complexity of the code. Secondly, a thorough understanding of the code is also necessary to carry out code maintenance activities which include, among others, error detection, error correction and code modification for purposes of enhancing its performance or functionality, or adapting it to a changed environment. The emerging concepts of computer-aided software understanding and reverse engineering can answer precisely these needs. This paper will discuss the role they can play in enhancing the confidence one has in computer codes and several examples will be provided. Finally a brief discussion of combining state-of-the-art forward engineering systems with reverse engineering systems will show how powerfully they can contribute to the overall quality assurance of a computer program. (13 refs., 7 figs.)

  20. Test Code Quality and Its Relation to Issue Handling Performance

    NARCIS (Netherlands)

    Athanasiou, D.; Nugroho, A.; Visser, J.; Zaidman, A.

    2014-01-01

    Automated testing is a basic principle of agile development. Its benefits include early defect detection, defect cause localization and removal of fear to apply changes to the code. Therefore, maintaining high quality test code is essential. This study introduces a model that assesses test code

  1. High efficiency video coding coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at similar compressed bit rates as for HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in a clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it suits as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  2. Performance of Different OCDMA Codes with FWM and XPM Nonlinear Effects

    Science.gov (United States)

    Rana, Shivani; Gupta, Amit

    2017-08-01

    In this paper, a 1 Gb/s nonlinear optical code division multiple access system has been simulated and modeled. To reduce multiple-user interference, a multi-diagonal (MD) code, which possesses the property of zero cross-correlation, has been deployed. The MD code shows better results than Walsh-Hadamard and multi-weight codes under the nonlinear effects of four-wave mixing (FWM) and cross-phase modulation (XPM). The simulation results reveal that the effect of FWM is reduced when MD codes are employed as compared to other codes.
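
    The zero cross-correlation property of MD-type codes comes from giving each user a disjoint set of spectral chips. The sketch below builds a simplified block-diagonal code matrix and checks the in-phase cross-correlation; it is a stand-in illustration of the property, not the exact MD construction from the paper.

```python
def diagonal_code_matrix(num_users, weight):
    """Assign each user `weight` spectral chips that no other user shares,
    i.e. a block-diagonal 0/1 code matrix. Disjoint chip sets give zero
    in-phase cross-correlation, which is what suppresses multiple-user
    interference in MD-style SAC-OCDMA (simplified stand-in construction)."""
    length = num_users * weight
    codes = []
    for u in range(num_users):
        row = [0] * length
        for w in range(weight):
            row[u * weight + w] = 1
        codes.append(row)
    return codes

def cross_correlation(a, b):
    """In-phase cross-correlation: number of chip positions both codes share."""
    return sum(x * y for x, y in zip(a, b))

codes = diagonal_code_matrix(num_users=4, weight=3)
print([cross_correlation(codes[0], c) for c in codes])  # [3, 0, 0, 0]
```

    The first entry is the autocorrelation (equal to the code weight); every other pairwise correlation is zero, which is the defining MD property the abstract exploits.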

  3. Conceptual Design Optimization of an Augmented Stability Aircraft Incorporating Dynamic Response Performance Constraints

    Science.gov (United States)

    Welstead, Jason

    2014-01-01

    This research focused on incorporating stability and control into a multidisciplinary design optimization of a Boeing 737-class advanced concept called the D8.2b. A new method of evaluating aircraft handling performance using quantitative evaluation of the system's response to disturbances, including perturbations, continuous turbulence, and discrete gusts, is presented. A multidisciplinary design optimization was performed using the D8.2b transport aircraft concept. The configuration was optimized for minimum fuel burn using a design range of 3,000 nautical miles. Optimization cases were run using fixed tail volume coefficients, static trim constraints, and static trim and dynamic response constraints. A Cessna 182T model was used to test the various dynamic analysis components, ensuring the analysis was behaving as expected. Results of the optimizations show that including stability and control in the design process drastically alters the optimal design, indicating that stability and control should be included in conceptual design to avoid system-level penalties later in the design process.

  4. Sensitivity of hydrological performance assessment analysis to variations in material properties, conceptual models, and ventilation models

    Energy Technology Data Exchange (ETDEWEB)

    Sobolik, S.R.; Ho, C.K.; Dunn, E. [Sandia National Labs., Albuquerque, NM (United States); Robey, T.H. [Spectra Research Inst., Albuquerque, NM (United States); Cruz, W.T. [Univ. del Turabo, Gurabo (Puerto Rico)

    1996-07-01

    The Yucca Mountain Site Characterization Project is studying Yucca Mountain in southwestern Nevada as a potential site for a high-level nuclear waste repository. Site characterization includes surface-based and underground testing. Analyses have been performed to support the design of an Exploratory Studies Facility (ESF) and the design of the tests performed as part of the characterization process, in order to ascertain that they have minimal impact on the natural ability of the site to isolate waste. The information in this report pertains to sensitivity studies evaluating previous hydrological performance assessment analyses to variation in the material properties, conceptual models, and ventilation models, and the implications of this sensitivity on previous recommendations supporting ESF design. This document contains information that has been used in preparing recommendations for Appendix I of the Exploratory Studies Facility Design Requirements document.

  5. Sensitivity of hydrological performance assessment analysis to variations in material properties, conceptual models, and ventilation models

    International Nuclear Information System (INIS)

    Sobolik, S.R.; Ho, C.K.; Dunn, E.; Robey, T.H.; Cruz, W.T.

    1996-07-01

    The Yucca Mountain Site Characterization Project is studying Yucca Mountain in southwestern Nevada as a potential site for a high-level nuclear waste repository. Site characterization includes surface-based and underground testing. Analyses have been performed to support the design of an Exploratory Studies Facility (ESF) and the design of the tests performed as part of the characterization process, in order to ascertain that they have minimal impact on the natural ability of the site to isolate waste. The information in this report pertains to sensitivity studies evaluating previous hydrological performance assessment analyses to variation in the material properties, conceptual models, and ventilation models, and the implications of this sensitivity on previous recommendations supporting ESF design. This document contains information that has been used in preparing recommendations for Appendix I of the Exploratory Studies Facility Design Requirements document

  6. Process-based interpretation of conceptual hydrological model performance using a multinational catchment set

    Science.gov (United States)

    Poncelet, Carine; Merz, Ralf; Merz, Bruno; Parajka, Juraj; Oudin, Ludovic; Andréassian, Vazken; Perrin, Charles

    2017-08-01

    Most previous assessments of hydrologic model performance are fragmented, based on small numbers of catchments, different methods or time periods, and do not link the results to landscape or climate characteristics. This study uses large-sample hydrology to identify major catchment controls on daily runoff simulations. It is based on a conceptual lumped hydrological model (GR6J), a collection of 29 catchment characteristics, a multinational set of 1103 catchments located in Austria, France, and Germany and four runoff model efficiency criteria. Two analyses are conducted to assess how features and criteria are linked: (i) a one-dimensional analysis based on the Kruskal-Wallis test and (ii) a multidimensional analysis based on regression trees and investigating the interplay between features. The catchment features most affecting model performance are the flashiness of precipitation and streamflow (computed as the ratio of absolute day-to-day fluctuations to the total amount in a year), the seasonality of evaporation, the catchment area, and the catchment aridity. Nonflashy, nonseasonal, large, and nonarid catchments show the best performance for all the tested criteria. We argue that this higher performance is due to fewer nonlinear responses (higher correlation between precipitation and streamflow) and lower input and output variability for such catchments. Finally, we show that, compared to national sets, multinational sets increase results transferability because they explore a wider range of hydroclimatic conditions.
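
    The flashiness feature described in the abstract (the ratio of absolute day-to-day fluctuations to the total amount in a year, in the spirit of the Richards-Baker flashiness index) is straightforward to compute from a daily series:

```python
def flashiness_index(daily_flow):
    """Flashiness as defined in the abstract: the sum of absolute
    day-to-day changes divided by the total flow over the record."""
    changes = sum(abs(b - a) for a, b in zip(daily_flow, daily_flow[1:]))
    return changes / sum(daily_flow)

print(flashiness_index([1.0, 2.0, 1.0, 2.0]))  # 0.5: flashy, alternating flow
print(flashiness_index([1.5, 1.5, 1.5, 1.5]))  # 0.0: perfectly steady flow
```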

  7. Approaching a Conceptual Framework for Research on Sustainability Performance in Corporate Value Chains

    DEFF Research Database (Denmark)

    Kjærgaard, Thomas

    variations of stakeholder engagement and adopt a value chain narrative in their sustainability reporting. Multi-stakeholder reporting standards like the Global Reporting Initiative (GRI) and the UN Global Compact (UNGC) are adopted by corporations across industries, but only target sustainability issues...... in supply- and value chains to a limited extent. Though, this article proposes that the ongoing work towards new standards for integrated sustainability reporting represents a unique opportunity for increasing the presence of supply- and value chain perspectives in reporting in a way that facilitates a more...... optimal use of sustainability reports as a primary data source in research. Hence, this article proposes a conceptual framework for research on sustainability performance in corporate value chains, which potentially increases the future contributions to both the literature and practice. Different...

  8. System performances of optical space code-division multiple-access-based fiber-optic two-dimensional parallel data link.

    Science.gov (United States)

    Nakamura, M; Kitayama, K

    1998-05-10

    Optical space code-division multiple access is a scheme to multiplex and link data between two-dimensional processors such as smart pixels and spatial light modulators or arrays of optical sources like vertical-cavity surface-emitting lasers. We examine the multiplexing characteristics of optical space code-division multiple access by using optical orthogonal signature patterns. The probability density function of interference noise in interfering optical orthogonal signature patterns is calculated. The bit-error rate is derived from the result and plotted as a function of receiver threshold, code length, code weight, and number of users. Furthermore, we propose a prethresholding method to suppress the interference noise, and we experimentally verify that the method works effectively in improving system performance.

  9. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved Monte Carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE.

  10. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved Monte Carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE.

  11. Development of new two-dimensional spectral/spatial code based on dynamic cyclic shift code for OCDMA system

    Science.gov (United States)

    Jellali, Nabiha; Najjar, Monia; Ferchichi, Moez; Rezig, Houria

    2017-07-01

    In this paper, a new two-dimensional spectral/spatial code family, named two-dimensional dynamic cyclic shift codes (2D-DCS), is introduced. The 2D-DCS codes are derived from the dynamic cyclic shift code for the spectral and spatial coding. The proposed system can fully eliminate the multiple access interference (MAI) by using the MAI cancellation property. The effects of shot noise, phase-induced intensity noise, and thermal noise are used to analyze the code performance. In comparison with existing two-dimensional (2D) codes, such as 2D perfect difference (2D-PD), 2D Extended Enhanced Double Weight (2D-Extended-EDW), and 2D hybrid (2D-FCC/MDW) codes, the numerical results show that our proposed codes have the best performance. By keeping the same code length and increasing the spatial code, the performance of our 2D-DCS system is enhanced: it provides higher data rates while using lower transmitted power and a smaller spectral width.
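
As a rough, hypothetical illustration of why cyclic-shift code families lend themselves to MAI cancellation (this is not the paper's 2D-DCS construction; the base word and code length are invented), consider the correlations between cyclic shifts of one spectral code word:

```python
def cyclic_shift(word, k):
    """Return the code word cyclically shifted right by k chips."""
    k %= len(word)
    return word[-k:] + word[:-k]

def correlation(a, b):
    """In-phase correlation between two binary (0/1) code words."""
    return sum(x * y for x, y in zip(a, b))

base = [1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0]   # invented weight-3 word, length 11
codes = [cyclic_shift(base, k) for k in range(len(base))]

# Each user's autocorrelation equals the code weight, while the
# cross-correlation between distinct shifts stays low; a receiver can
# exploit that gap to subtract multiple access interference (MAI).
auto = correlation(codes[0], codes[0])
cross = max(correlation(codes[0], codes[k]) for k in range(1, len(codes)))
print(auto, cross)   # 3 1
```

In a real 2D spectral/spatial system the same idea is applied along both the wavelength and the fiber (spatial) dimensions.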

  12. Newman Unit 1 advanced solar repowering advanced conceptual design. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-04-01

    The Newman Unit 1 solar repowering design is a water/steam central receiver concept supplying superheated steam. The objective of the work reported is to develop a refined baseline conceptual design that has the potential for construction and operation by 1986, makes use of existing solar thermal technology, and provides the best economics for this application. Trade studies performed in the design effort are described, both for the conceptual design of the overall system and for the subsystem conceptual designs. System-level functional requirements, design, operation, performance, cost, safety, environmental, institutional, and regulatory considerations are described. Subsystems described include the collector, receiver, fossil energy, electrical power generating, and master control subsystems, as well as the site and site facilities. The conceptual design, cost, and performance of each subsystem are discussed at length. A detailed economic analysis of the repowered unit is made to realistically assess the economics of the first repowered unit, using present cost data for a limited production level for solar hardware. Finally, a development plan is given, covering design, procurement, construction, checkout, startup, performance validation, and commercial operation. (LEW)

  13. Overview of the geochemical code MINTEQ: applications to performance assessment for low-level wastes

    International Nuclear Information System (INIS)

    Graham, M.J.; Peterson, S.R.

    1985-09-01

    The MINTEQ geochemical computer code, developed at Pacific Northwest Laboratory, integrates many of the capabilities of its two immediate predecessors, WATEQ3 and MINEQL. MINTEQ can be used to perform the calculations necessary to simulate (model) the contact of low-level waste solutions with heterogeneous sediments or the interaction of ground water with solidified low-level wastes. The code is capable of performing calculations of ion speciation/solubility, adsorption, oxidation-reduction, gas phase equilibria, and precipitation/dissolution of solid phases. Under the Special Waste Form Lysimeters-Arid program, the composition of effluents (leachates) from column and batch experiments, using laboratory-scale waste forms, will be used to develop a geochemical model of the interaction of ground water with commercial solidified low-level wastes. The wastes being evaluated include power reactor waste streams that have been solidified in cement, vinyl ester-styrene, and bitumen. The thermodynamic database for the code is being upgraded before the geochemical modeling is performed. Thermodynamic data for cobalt, antimony, cerium, and cesium solid phases and aqueous species are being added to the database. The need to add these data was identified from the characterization of the waste streams. The geochemical model developed from the laboratory data will then be applied to predict the release from a field-lysimeter facility that contains full-scale waste samples. The contaminant concentrations migrating from the wastes predicted using MINTEQ will be compared to the long-term lysimeter data. This comparison will constitute a partial field validation of the geochemical model. 28 refs
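
As a sketch of the precipitation/dissolution bookkeeping such speciation codes perform, the saturation index SI = log10(IAP/Ksp) compares the ion activity product of a dissolved solid to its solubility product; the activities below are invented for illustration, not MINTEQ output:

```python
import math

def saturation_index(iap, ksp):
    """log10 of the ion activity product over the solubility product.
    SI > 0: oversaturated (solid tends to precipitate);
    SI < 0: undersaturated (solid tends to dissolve)."""
    return math.log10(iap / ksp)

# Hypothetical sparingly soluble salt AB: IAP = {A}{B}
iap = 1.0e-9          # invented product of ion activities
ksp = 1.0e-10         # invented solubility product of the solid phase
si = saturation_index(iap, ksp)
print(round(si, 6))   # 1.0: oversaturated, so the solid would precipitate
```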

  14. SLUDGE TREATMENT PROJECT KOP CONCEPTUAL DESIGN CONTROL DECISION REPORT

    International Nuclear Information System (INIS)

    Carro, C.A.

    2010-01-01

    This control decision addresses the conceptual design of the Knock-Out Pot (KOP) Disposition Project's KOP Processing System (KPS). The KPS functions to (1) retrieve KOP material from canisters, (2) remove particles less than 600 μm in size and low-density materials from the KOP material, (3) load the KOP material into Multi-Canister Overpack (MCO) baskets, and (4) stage the MCO baskets for subsequent loading into MCOs. Hazard and accident analyses of the KPS conceptual design have been performed to incorporate safety into the design process. The hazard analysis is documented in PRC-STP-00098, Knock-Out Pot Disposition Project Conceptual Design Hazard Analysis. The accident analysis is documented in PRC-STP-CN-N-00167, Knock-Out Pot Disposition Sub-Project Canister Over Lift Accident Analysis. Based on the results of these analyses, and analyses performed in support of MCO transportation and MCO processing and storage activities at the Cold Vacuum Drying Facility (CVDF) and Canister Storage Building (CSB), control decision meetings were held to determine the controls required to protect onsite and offsite receptors and facility workers. At the conceptual design stage, these controls are primarily defined by their safety functions. Safety-significant structures, systems, and components (SSCs) that could provide the identified safety functions have been selected for the conceptual design. It is anticipated that some safety SSCs identified herein will be reclassified based on hazard and accident analyses performed in support of preliminary and detailed design.

  15. The picture superiority effect in conceptual implicit memory: a conceptual distinctiveness hypothesis.

    Science.gov (United States)

    Hamilton, Maryellen; Geraci, Lisa

    2006-01-01

    According to leading theories, the picture superiority effect is driven by conceptual processing, yet this effect has been difficult to obtain using conceptual implicit memory tests. We hypothesized that the picture superiority effect results from conceptual processing of a picture's distinctive features rather than a picture's semantic features. To test this hypothesis, we used 2 conceptual implicit general knowledge tests; one cued conceptually distinctive features (e.g., "What animal has large eyes?") and the other cued semantic features (e.g., "What animal is the figurehead of Tootsie Roll?"). Results showed a picture superiority effect only on the conceptual test using distinctive cues, supporting our hypothesis that this effect is mediated by conceptual processing of a picture's distinctive features.

  16. Conceptual and visual features contribute to visual memory for natural images.

    Directory of Open Access Journals (Sweden)

    Gesche M Huebner

    Full Text Available We examined the role of conceptual and visual similarity in a memory task for natural images. The important novelty of our approach was that visual similarity was determined using an algorithm [1] instead of being judged subjectively. This similarity index takes colours and spatial frequencies into account. For each target, four distractors were selected that were (1) conceptually and visually similar, (2) only conceptually similar, (3) only visually similar, or (4) neither conceptually nor visually similar to the target image. Participants viewed 219 images with the instruction to memorize them. Memory for a subset of these images was tested subsequently. In Experiment 1, participants performed a two-alternative forced-choice recognition task and in Experiment 2, a yes/no recognition task. In Experiment 3, testing occurred after a delay of one week. We analyzed the distribution of errors depending on distractor type. Performance was lowest when the distractor image was conceptually and visually similar to the target image, indicating that both factors matter in such a memory task. After delayed testing, these differences disappeared. Overall performance was high, indicating a large-capacity, detailed visual long-term memory.

  17. Selection of a computer code for Hanford low-level waste engineered-system performance assessment. Revision 1

    International Nuclear Information System (INIS)

    McGrail, B.P.; Bacon, D.H.

    1998-02-01

    Planned performance assessments for the proposed disposal of low-activity waste (LAW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. The available computer codes with suitable capabilities at the time Revision 0 of this document was prepared were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LAW glass corrosion and the mobility of radionuclides. This analysis was repeated in this report but updated to include additional processes that have been found to be important since Revision 0 was issued and to include additional codes that have been released. The highest ranked computer code was found to be the STORM code developed at PNNL for the US Department of Energy for evaluation of arid land disposal sites
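
The feature-set ranking described can be pictured as a weighted capability checklist; the capability names, weights, and the comparison code in this sketch are invented placeholders, not the report's actual criteria:

```python
# Invented capability checklist: weight reflects how critical each
# physical/chemical process is to modeling release from the facility.
NEEDED = {
    "glass corrosion kinetics": 3,
    "reactive transport": 3,
    "unsaturated flow": 2,
    "radionuclide decay": 1,
}

def rank(candidates):
    """Return candidate code names sorted by descending capability score."""
    scores = {name: sum(NEEDED[c] for c in caps if c in NEEDED)
              for name, caps in candidates.items()}
    return sorted(scores, key=scores.get, reverse=True)

candidates = {
    "STORM":  ["glass corrosion kinetics", "reactive transport",
               "unsaturated flow"],
    "CODE-B": ["reactive transport", "radionuclide decay"],  # hypothetical
}
print(rank(candidates))   # ['STORM', 'CODE-B']
```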

  18. Development of the SCHAMBETA code for scoping analysis of HCDA

    Energy Technology Data Exchange (ETDEWEB)

    Suk, Soo Dong; Hahn, D. H

    2000-06-01

    A computer code, SCHAMBETA (Scoping Code for HCDA Analysis using Modified Bethe-Tait Method), is developed to investigate the core disassembly process following a meltdown accident in the framework of a modified Bethe-Tait method, as part of the scoping analysis work to demonstrate the inherent safety of conceptual designs of the Korea Advanced Liquid Metal Reactor (KALIMER), a 150 MWe pool-type sodium-cooled prototype fast reactor that uses U-Pu-Zr metallic fuel. The methodologies adopted in the code are particularly useful for performing various parametric studies for a better understanding of the core disassembly process of liquid metal fast reactors, as well as for estimating upper-limit values of the energy release resulting from a power excursion. In the SCHAMBETA code, the core kinetics and hydraulic behavior of the KALIMER are followed over the period of the super-prompt-critical power excursion induced by the ramp reactivity insertion, starting at the time the sodium-voided core reaches the melting temperature of the metallic fuel. For this purpose, equations of state for the pressure-energy density relationship are derived for the saturated vapor as well as the solid-liquid phases of metallic uranium fuel, and implemented into the formulations of the disassembly reactivity. Mathematical formulations are then developed, in the framework of the modified Bethe-Tait method, in a form suited to utilizing the improved equations of state as well as to considering Doppler effects, for scoping analysis of super-prompt-critical power excursions driven by a specified rate of reactivity insertion.
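
The super-prompt-critical excursion logic can be caricatured with textbook point kinetics: a reactivity ramp drives the power up until an energy-dependent (Doppler-like) feedback term turns it over. All constants below are illustrative, not KALIMER parameters or the SCHAMBETA equations of state:

```python
import math

def excursion(ramp=50.0, alpha=1.0e-3, gen_time=1.0e-6,
              dt=1.0e-6, steps=2000):
    """Integrate dP/dt = (rho / Lambda) * P with rho(t) = ramp*t - alpha*E(t),
    where E is the accumulated energy (arbitrary units).
    Returns the power history."""
    power, energy, t = 1.0, 0.0, 0.0
    history = []
    for _ in range(steps):
        rho = ramp * t - alpha * energy        # inserted ramp minus feedback
        power *= math.exp(dt * rho / gen_time)  # exact update for constant rho
        energy += dt * power
        t += dt
        history.append(power)
    return history

hist = excursion()
# Power rises by orders of magnitude, then turns over once the
# feedback term overtakes the inserted ramp reactivity.
print(max(hist) > 100.0 * hist[0])   # True
```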

  19. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and coupling with the regulatory auditing code have been accomplished for the establishment of a self-reliant, technology-based regulatory auditing system. The unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual was published that describes the new models and correlations. The code coupling methods were verified through the exercise of plant application. The education-training seminar and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications

  20. Performance Modeling and Optimization of a High Energy CollidingBeam Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-06-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems, errors of 29% are higher, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.
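
A minimal member of the model family described is a latency/bandwidth estimate for AAPC with a node-adapter contention factor; every constant here is invented rather than fitted, so the numbers only illustrate how a percentage prediction error would be computed:

```python
def aapc_time(p, msg_bytes, latency=2e-6, bandwidth=1e9, cores_per_node=4):
    """Predicted AAPC time for p processes: each process exchanges one
    personalized message with every other process, and processes on the
    same node contend for a single network adapter."""
    contention = cores_per_node            # all cores share one adapter
    per_msg = latency + msg_bytes / bandwidth
    return (p - 1) * per_msg * contention

def prediction_error(predicted, measured):
    """Relative model error in percent."""
    return abs(predicted - measured) / measured * 100.0

t = aapc_time(p=64, msg_bytes=8192)
# Compare the made-up model against an equally made-up measurement.
print(round(prediction_error(t, measured=2.2e-3), 1))   # 16.7
```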

  1. Performance Modeling and Optimization of a High Energy Colliding Beam Simulation Code

    International Nuclear Information System (INIS)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-01-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems, errors of 29% are higher, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms

  2. A model of R-D performance evaluation for Rate-Distortion-Complexity evaluation of H.264 video coding

    DEFF Research Database (Denmark)

    Wu, Mo; Forchhammer, Søren

    2007-01-01

    This paper considers a method for evaluation of the Rate-Distortion-Complexity (R-D-C) performance of video coding. A statistical model of the transformed coefficients is used to estimate the Rate-Distortion (R-D) performance. A model framework for rate, distortion, and slope of the R-D curve for inter- and intra-frames is presented. Assumptions are given for analyzing an R-D model for fast R-D-C evaluation. The theoretical expressions are combined with H.264 video coding and confirmed by experimental results. The complexity framework is applied to the integer motion estimation.
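
For context, the classical high-rate model D(R) = σ²·2^(−2R) is the simplest statistical R-D model of this kind; the paper's H.264 model is more detailed, so this is only a generic sketch of how distortion and the R-D slope follow from coefficient statistics:

```python
import math

def distortion(rate_bits, sigma2=1.0):
    """High-rate approximation: D(R) = sigma^2 * 2^(-2R)."""
    return sigma2 * 2.0 ** (-2.0 * rate_bits)

def rd_slope(rate_bits, sigma2=1.0):
    """dD/dR, the (negative) slope of the R-D curve used to trade
    bits for distortion across coefficients or subbands."""
    return -2.0 * math.log(2.0) * distortion(rate_bits, sigma2)

# Each extra bit per sample divides the distortion by four (~6 dB).
print(round(distortion(1.0), 4))   # 0.25
```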

  3. Effects of Divided Attention at Retrieval on Conceptual Implicit Memory

    OpenAIRE

    Prull, Matthew W.; Lawless, Courtney; Marshall, Helen M.; Sherman, Annabella T. K.

    2016-01-01

    This study investigated whether conceptual implicit memory is sensitive to process-specific interference at the time of retrieval. Participants performed the implicit memory test of category exemplar generation (Experiments 1 and 3), or the matched explicit memory test of category-cued recall (Experiment 2), both of which are conceptually-driven memory tasks, under one of two divided attention (DA) conditions in which participants simultaneously performed a distracting task. The distracting...

  4. An Analysis of the Global Code of Ethics for Tourism in the Context of Corporate Social Responsibility

    OpenAIRE

    Buzar Stipe

    2015-01-01

    The author analyzes the Global Code of Ethics for Tourism in the context of corporate social responsibility and the need for discussing this topic in ethical codes within the business and tourism sector. The text first offers an overview of the fundamental ethical concepts in business ethics and corporate social responsibility and briefly conceptualizes the relationship between these two fields. At the end, the author analyzes the content of the Global Code of Ethics for Tourism with emphasis...

  5. Performance Measures of Diagnostic Codes for Detecting Opioid Overdose in the Emergency Department.

    Science.gov (United States)

    Rowe, Christopher; Vittinghoff, Eric; Santos, Glenn-Milo; Behar, Emily; Turner, Caitlin; Coffin, Phillip O

    2017-04-01

    Opioid overdose mortality has tripled in the United States since 2000 and opioids are responsible for more than half of all drug overdose deaths, which reached an all-time high in 2014. Opioid overdoses resulting in death, however, represent only a small fraction of all opioid overdose events and efforts to improve surveillance of this public health problem should include tracking nonfatal overdose events. International Classification of Disease (ICD) diagnosis codes, increasingly used for the surveillance of nonfatal drug overdose events, have not been rigorously assessed for validity in capturing overdose events. The present study aimed to validate the use of ICD, 9th revision, Clinical Modification (ICD-9-CM) codes in identifying opioid overdose events in the emergency department (ED) by examining multiple performance measures, including sensitivity and specificity. Data on ED visits from January 1, 2012, to December 31, 2014, including clinical determination of whether the visit constituted an opioid overdose event, were abstracted from electronic medical records for patients prescribed long-term opioids for pain from any of six safety net primary care clinics in San Francisco, California. Combinations of ICD-9-CM codes were validated in the detection of overdose events as determined by medical chart review. Both sensitivity and specificity of different combinations of ICD-9-CM codes were calculated. Unadjusted logistic regression models with robust standard errors and accounting for clustering by patient were used to explore whether overdose ED visits with certain characteristics were more or less likely to be assigned an opioid poisoning ICD-9-CM code by the documenting physician. Forty-four (1.4%) of 3,203 ED visits among 804 patients were determined to be opioid overdose events. Opioid-poisoning ICD-9-CM codes (E850.0-E850.2, 965.00-965.09) identified overdose ED visits with a sensitivity of 25.0% (95% confidence interval [CI] = 13.6% to 37.8%) and
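
The performance measures themselves are straightforward to compute from a confusion matrix. Using the reported figures (44 chart-confirmed overdose visits, sensitivity 25.0%, hence 11 coded true positives; that split is back-derived for illustration, not stated in the abstract):

```python
def sensitivity(tp, fn):
    """True-positive rate: coded overdoses among chart-confirmed overdoses."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: uncoded visits among non-overdose visits."""
    return tn / (tn + fp)

tp, fn = 11, 33   # 11 of the 44 chart-confirmed overdoses carried a code
print(f"sensitivity = {sensitivity(tp, fn):.1%}")   # sensitivity = 25.0%
```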

  6. Analysis of the TREAT LEU Conceptual Design

    Energy Technology Data Exchange (ETDEWEB)

    Connaway, H. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Kontogeorgakos, D. C. [Argonne National Lab. (ANL), Argonne, IL (United States); Papadias, D. D. [Argonne National Lab. (ANL), Argonne, IL (United States); Brunett, A. J. [Argonne National Lab. (ANL), Argonne, IL (United States); Mo, K. [Argonne National Lab. (ANL), Argonne, IL (United States); Strons, P. S. [Argonne National Lab. (ANL), Argonne, IL (United States); Fei, T. [Argonne National Lab. (ANL), Argonne, IL (United States); Wright, A. E. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-03-01

    Analyses were performed to evaluate the performance of the low enriched uranium (LEU) conceptual design fuel for the conversion of the Transient Reactor Test Facility (TREAT) from its current highly enriched uranium (HEU) fuel. TREAT is an experimental nuclear reactor designed to produce high neutron flux transients for the testing of reactor fuels and other materials. TREAT is currently in non-operational standby, but is being restarted under the U.S. Department of Energy’s Resumption of Transient Testing Program. The conversion of TREAT is being pursued in keeping with the mission of the Department of Energy National Nuclear Security Administration’s Material Management and Minimization (M3) Reactor Conversion Program. The focus of this study was to demonstrate that the converted LEU core is capable of maintaining the performance of the existing HEU core, while continuing to operate safely. Neutronic and thermal hydraulic simulations have been performed to evaluate the performance of the LEU conceptual-design core under both steady-state and transient conditions, for both normal operation and reactivity insertion accident scenarios. In addition, ancillary safety analyses which were performed for previous LEU design concepts have been reviewed and updated as-needed, in order to evaluate if the converted LEU core will function safely with all existing facility systems. Simulations were also performed to evaluate the detailed behavior of the UO2-graphite fuel, to support future fuel manufacturing decisions regarding particle size specifications. The results of these analyses will be used in conjunction with work being performed at Idaho National Laboratory and Los Alamos National Laboratory, in order to develop the Conceptual Design Report project deliverable.

  7. MARS CODE MANUAL VOLUME IV - Developmental Assessment Report

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu; Lee, Won Jae; Lee, Young Jin; Lee, Seung Wook; Kim, Kyung Doo; Bae, Sung Won

    2010-02-01

    The Korea Atomic Energy Research Institute (KAERI) conceived and started the development of the MARS code with the main objective of producing a state-of-the-art, realistic thermal-hydraulic systems analysis code with multi-dimensional analysis capability. MARS achieves this objective by very tightly integrating the one-dimensional RELAP5/MOD3 with the multi-dimensional COBRA-TF code. The method of integration of the two codes is based on dynamic link library techniques, and the system pressure equation matrices of both codes are implicitly integrated and solved simultaneously. In addition, the equation of state (EOS) for light water was unified by replacing the EOS of COBRA-TF with that of RELAP5. This assessment manual provides a complete list of code assessment results of the MARS code for various conceptual problems, separate effect tests, and integral effect tests. From these validation procedures, the soundness and accuracy of the MARS code have been confirmed. The overall structure of the input is modeled on the structure of RELAP5, and as such the layout of the manual is very similar to that of RELAP5. This similarity to the RELAP5 input is intentional, as the input scheme allows minimum modification between the inputs of RELAP5 and MARS3.1. The MARS3.1 development team would like to express its appreciation to the RELAP5 Development Team and the USNRC for making this manual possible.

  8. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Burr Alister

    2009-01-01

    Full Text Available This paper discusses the application of adaptive modulation and adaptive-rate turbo coding to orthogonal frequency-division multiplexing (OFDM) to increase throughput on the time- and frequency-selective channel. The adaptive turbo code scheme is based on a subband adaptive method and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed, together with several turbo code rates. The performances of both systems with high and low BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.
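
Subband-adaptive modulation of the kind described amounts to picking the densest constellation whose SNR threshold a subband meets; the thresholds and SNR values below are invented placeholders, not the paper's switching levels:

```python
# (min SNR in dB, scheme, bits per symbol) - made-up switching thresholds
THRESHOLDS = [
    (22.0, "64QAM", 6),
    (16.0, "16QAM", 4),
    (12.0, "8AMPM", 3),
    (8.0,  "QPSK",  2),
    (4.0,  "BPSK",  1),
]

def pick_scheme(snr_db):
    """Densest constellation whose threshold the subband SNR meets."""
    for threshold, scheme, bits in THRESHOLDS:
        if snr_db >= threshold:
            return scheme, bits
    return None, 0   # subband too poor: leave it unused

subband_snrs = [25.3, 14.8, 6.1, 2.0]   # invented per-subband estimates
plan = [pick_scheme(s) for s in subband_snrs]
print(plan)   # [('64QAM', 6), ('8AMPM', 3), ('BPSK', 1), (None, 0)]
```

A rate-adaptive turbo code would then be layered on top, per subband in the separate-code system or across all subbands in the single-code system.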

  9. Validation of the transportation computer codes HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND

    International Nuclear Information System (INIS)

    Maheras, S.J.; Pippen, H.K.

    1995-05-01

    The computer codes HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND were used to estimate radiation doses from the transportation of radioactive material in the Department of Energy Programmatic Spent Nuclear Fuel Management and Idaho National Engineering Laboratory Environmental Restoration and Waste Management Programs Environmental Impact Statement. HIGHWAY and INTERLINE were used to estimate transportation routes for truck and rail shipments, respectively. RADTRAN 4 was used to estimate collective doses from incident-free transportation and the risk (probability × consequence) from transportation accidents. RISKIND was used to estimate incident-free radiation doses for maximally exposed individuals and the consequences from reasonably foreseeable transportation accidents. The purpose of this analysis is to validate the estimates made by these computer codes; critiques of the conceptual models used in RADTRAN 4 are also discussed. Validation is defined as "the test and evaluation of the completed software to ensure compliance with software requirements." In this analysis, validation means that the differences between the estimates generated by these codes and independent observations are small (i.e., within the acceptance criterion established for the validation analysis). In some cases, the independent observations used in the validation were measurements; in other cases, they were generated using hand calculations. The results of the validation analyses performed for HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND show that the differences between the estimates generated using the computer codes and independent observations were small. Based on the acceptance criterion established for the validation analyses, the codes yielded acceptable results; in all cases the estimates met the requirements for successful validation
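
The validation definition reduces to a relative-difference check against an acceptance criterion; the 5% band below is an assumed placeholder, not the criterion actually used in the EIS analysis:

```python
def within_criterion(code_estimate, observation, rel_tol=0.05):
    """True if the relative difference between a code estimate and an
    independent observation is inside the acceptance criterion."""
    return abs(code_estimate - observation) <= rel_tol * abs(observation)

# A 2% difference passes the assumed 5% band; a 20% difference does not.
print(within_criterion(1.02, 1.00), within_criterion(1.20, 1.00))   # True False
```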

  10. ETR/ITER systems code

    Energy Technology Data Exchange (ETDEWEB)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L. (ed.)

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.
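
The optimizer/driver idea, iterating prescribed variables until prescribed constraint equations are satisfied, can be sketched with a one-variable Newton iteration; the constraint function here is invented, not a TETRA module equation:

```python
def solve_constraint(f, x0, eps=1e-10, h=1e-6, max_iter=100):
    """Iterate a design variable x until the constraint f(x) = 0 holds,
    using Newton's method with a forward-difference derivative."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < eps:
            return x
        dfdx = (f(x + h) - fx) / h
        x -= fx / dfdx
    return x

# Hypothetical constraint equation: x**2 - 2 = 0
root = solve_constraint(lambda x: x * x - 2.0, x0=1.0)
print(round(root, 6))   # converges to sqrt(2), about 1.414214
```

A systems code like TETRA does the same thing over many coupled variables and constraints at once, with an objective function steering the feasible solutions.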

  11. ETR/ITER systems code

    International Nuclear Information System (INIS)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs

  12. A conceptual framework for evaluating impairments in myasthenia gravis.

    Science.gov (United States)

    Barnett, Carolina; Bril, Vera; Kapral, Moira; Kulkarni, Abhaya; Davis, Aileen M

    2014-01-01

    Myasthenia gravis is characterized by weakness and fatigability of different muscle groups, including ocular, bulbar and the limbs. Therefore, a measure of disease severity at the impairment level in myasthenia needs to reflect all the relevant impairments, as well as their variations with activity and fatigue. We conducted a qualitative study of patients with myasthenia, to explore their experiences and related impairments, aimed at developing a conceptual framework of disease severity at the impairment level in myasthenia gravis. Twenty patients representing the spectrum of disease participated in semi-structured interviews. Interviews were recorded and the transcripts were analyzed by content analysis using an inductive approach with line-by-line open coding. Themes were generated from these codes. Two main themes were identified: the severity of the impairments and fatigability (i.e., triggering or worsening of an impairment with activity). The impairments were further classified within body regions (ocular, bulbar and axial/limbs). Fatigability was described as a phenomenon affecting the whole body but also affecting specific impairments, and was associated with fluctuation of the symptoms. Patients were concerned that clinical examination at a single point in time might not reflect their true clinical state due to fatigability and fluctuations in severity. This conceptual framework reflects the relevance of both severity and fatigability in understanding impairment-based disease severity in myasthenia. This framework could inform the development of impairment measures in myasthenia gravis.

  13. A conceptual framework for evaluating impairments in myasthenia gravis.

    Directory of Open Access Journals (Sweden)

    Carolina Barnett

    Full Text Available BACKGROUND: Myasthenia gravis is characterized by weakness and fatigability of different muscle groups, including ocular, bulbar and the limbs. Therefore, a measure of disease severity at the impairment level in myasthenia needs to reflect all the relevant impairments, as well as their variations with activity and fatigue. We conducted a qualitative study of patients with myasthenia, to explore their experiences and related impairments, aimed at developing a conceptual framework of disease severity at the impairment level in myasthenia gravis. METHODS: Twenty patients representing the spectrum of disease participated in semi-structured interviews. Interviews were recorded and the transcripts were analyzed by content analysis using an inductive approach with line-by-line open coding. Themes were generated from these codes. RESULTS: Two main themes were identified: the severity of the impairments and fatigability (i.e., triggering or worsening of an impairment with activity). The impairments were further classified within body regions (ocular, bulbar and axial/limbs). Fatigability was described as a phenomenon affecting the whole body but also affecting specific impairments, and was associated with fluctuation of the symptoms. Patients were concerned that clinical examination at a single point in time might not reflect their true clinical state due to fatigability and fluctuations in severity. CONCLUSIONS: This conceptual framework reflects the relevance of both severity and fatigability in understanding impairment-based disease severity in myasthenia. This framework could inform the development of impairment measures in myasthenia gravis.

  14. Development of CAP code for nuclear power plant containment: Lumped model

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Soon Joon, E-mail: sjhong90@fnctech.com [FNC Tech. Co. Ltd., Heungdeok 1 ro 13, Giheung-gu, Yongin-si, Gyeonggi-do 446-908 (Korea, Republic of); Choo, Yeon Joon; Hwang, Su Hyun; Lee, Byung Chul [FNC Tech. Co. Ltd., Heungdeok 1 ro 13, Giheung-gu, Yongin-si, Gyeonggi-do 446-908 (Korea, Republic of); Ha, Sang Jun [Central Research Institute, Korea Hydro & Nuclear Power Company, Ltd., 70, 1312-gil, Yuseong-daero, Yuseong-gu, Daejeon 305-343 (Korea, Republic of)

    2015-09-15

    Highlights: • State-of-the-art containment analysis code, CAP, has been developed. • CAP uses 3-field equations, a water-level-oriented upwind scheme, and a local head model. • CAP has a function of linked calculation with a reactor coolant system code. • CAP code assessments showed appropriate prediction capabilities. - Abstract: The CAP (nuclear Containment Analysis Package) code has been developed by the Korean nuclear community for the analysis of nuclear containment thermal-hydraulic behaviors, including pressure and temperature trends and hydrogen concentration. The lumped model of the CAP code uses 2-phase, 3-field equations for fluid behaviors, and has appropriate constitutive equations, a 1-dimensional heat conductor model, component models, trip and control models, and special process models. CAP can run in a standalone mode or in a linked mode with a reactor coolant system analysis code. The linked mode enables a more realistic calculation of the containment response and is expected to be applicable to more complicated advanced plant design calculations. CAP code assessments were carried out by gradual approaches: conceptual problems, fundamental phenomena, component and principal phenomena, experimental validation, and finally comparison with other code calculations on the basis of important phenomena identification. The assessments showed appropriate prediction capabilities of CAP.
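
    A lumped containment model of the kind described reduces the containment to one or a few control volumes with balance equations. The single-node sketch below, with an ideal-gas atmosphere and invented parameters, only illustrates the approach; it is not the CAP formulation.

```python
# Single-node (lumped) containment sketch: ideal-gas atmosphere heated by
# a source and cooled by conduction to a wall. Parameters are illustrative.
R_AIR = 287.0    # J/(kg K), specific gas constant of air
CV_AIR = 718.0   # J/(kg K), specific heat at constant volume

def pressure(mass, temp, volume):
    """Ideal-gas pressure of the containment atmosphere (Pa)."""
    return mass * R_AIR * temp / volume

def step(mass, temp, q_source, h_wall, area, t_wall, dt):
    """One explicit Euler step of m*cv*dT/dt = Q_source - h*A*(T - T_wall)."""
    dT = (q_source - h_wall * area * (temp - t_wall)) / (mass * CV_AIR) * dt
    return temp + dT

# usage: 1000 s of a 5 MW heat source into a 50,000 m^3 free volume
m, T, V = 60000.0, 300.0, 5.0e4
for _ in range(1000):
    T = step(m, T, q_source=5.0e6, h_wall=50.0, area=1.0e4, t_wall=300.0, dt=1.0)
p = pressure(m, T, V)
```

    Real lumped codes such as the one described carry multiple fields (liquid, vapor, droplets), many nodes, and full constitutive models; this sketch shows only the control-volume balance idea.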

  15. Development of CAP code for nuclear power plant containment: Lumped model

    International Nuclear Information System (INIS)

    Hong, Soon Joon; Choo, Yeon Joon; Hwang, Su Hyun; Lee, Byung Chul; Ha, Sang Jun

    2015-01-01

    Highlights: • State-of-the-art containment analysis code, CAP, has been developed. • CAP uses 3-field equations, a water-level-oriented upwind scheme, and a local head model. • CAP has a function of linked calculation with a reactor coolant system code. • CAP code assessments showed appropriate prediction capabilities. - Abstract: The CAP (nuclear Containment Analysis Package) code has been developed by the Korean nuclear community for the analysis of nuclear containment thermal-hydraulic behaviors, including pressure and temperature trends and hydrogen concentration. The lumped model of the CAP code uses 2-phase, 3-field equations for fluid behaviors, and has appropriate constitutive equations, a 1-dimensional heat conductor model, component models, trip and control models, and special process models. CAP can run in a standalone mode or in a linked mode with a reactor coolant system analysis code. The linked mode enables a more realistic calculation of the containment response and is expected to be applicable to more complicated advanced plant design calculations. CAP code assessments were carried out by gradual approaches: conceptual problems, fundamental phenomena, component and principal phenomena, experimental validation, and finally comparison with other code calculations on the basis of important phenomena identification. The assessments showed appropriate prediction capabilities of CAP.

  16. Impact of the Revised Malaysian Code on Corporate Governance on Audit Committee Attributes and Firm Performance

    OpenAIRE

    KALLAMU, Basiru Salisu

    2016-01-01

    Abstract. Using a sample of 37 finance companies listed under the finance segment of Bursa Malaysia, we examined the impact of the revision to Malaysian code on corporate governance on audit committee attributes and firm performance. Our result suggests that audit committee attributes significantly improved after the Code was revised. In addition, the coefficient for audit committee and risk committee interlock has a significant negative relationship with Tobin’s Q in the period before the re...

  17. Performance and complexity of tunable sparse network coding with gradual growing tuning functions over wireless networks

    OpenAIRE

    Garrido Ortiz, Pablo; Sørensen, Chres W.; Lucani Roetter, Daniel Enrique; Agüero Calvo, Ramón

    2016-01-01

    Random Linear Network Coding (RLNC) has been shown to be a technique with several benefits, in particular when applied over wireless mesh networks, since it provides robustness against packet losses. On the other hand, Tunable Sparse Network Coding (TSNC) is a promising concept, which leverages a trade-off between computational complexity and goodput. An optimal density tuning function has not been found yet, due to the lack of a closed-form expression that links density, performance and comp...

  18. Effects of Divided Attention at Retrieval on Conceptual Implicit Memory.

    Science.gov (United States)

    Prull, Matthew W; Lawless, Courtney; Marshall, Helen M; Sherman, Annabella T K

    2016-01-01

    This study investigated whether conceptual implicit memory is sensitive to process-specific interference at the time of retrieval. Participants performed the implicit memory test of category exemplar generation (CEG; Experiments 1 and 3), or the matched explicit memory test of category-cued recall (Experiment 2), both of which are conceptually driven memory tasks, under one of two divided attention (DA) conditions in which participants simultaneously performed a distracting task. The distracting task was either syllable judgments (dissimilar processes), or semantic judgments (similar processes) on unrelated words. Compared to full attention (FA) in which no distracting task was performed, DA had no effect on CEG priming overall, but reduced category-cued recall similarly regardless of distractor task. Analyses of distractor task performance also revealed differences between implicit and explicit memory retrieval. The evidence suggests that, whereas explicit memory retrieval requires attentional resources and is disrupted by semantic and phonological distracting tasks, conceptual implicit memory is automatic and unaffected even when distractor and memory tasks involve similar processes.

  19. Compilation of documented computer codes applicable to environmental assessment of radioactivity releases

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.; Shaeffer, D.L.; Garten, C.T. Jr.; Shor, R.W.; Ensminger, J.T.

    1977-04-01

    The objective of this paper is to present a compilation of computer codes for the assessment of accidental or routine releases of radioactivity to the environment from nuclear power facilities. The capabilities of 83 computer codes in the areas of environmental transport and radiation dosimetry are summarized in tabular form. This preliminary analysis clearly indicates that the initial efforts in assessment methodology development have concentrated on atmospheric dispersion, external dosimetry, and internal dosimetry via inhalation. The incorporation of terrestrial and aquatic food chain pathways has been a more recent development and reflects the current requirements of environmental legislation and the needs of regulatory agencies. The characteristics of the conceptual models employed by these codes are reviewed. The appendixes include abstracts of the codes and indexes by author, key words, publication description, and title

  20. Software Design Document for the AMP Nuclear Fuel Performance Code

    International Nuclear Information System (INIS)

    Philip, Bobby; Clarno, Kevin T.; Cochran, Bill

    2010-01-01

    The purpose of this document is to describe the design of the AMP nuclear fuel performance code. It provides an overview of the decomposition into separable components, an overview of what those components will do, and the strategic basis for the design. The primary components of a computational physics code include a user interface, physics packages, material properties, mathematics solvers, and computational infrastructure. Some capability from established off-the-shelf (OTS) packages will be leveraged in the development of AMP, but the primary physics components will be entirely new. The material properties required by these physics operators include many highly non-linear properties, which will be replicated from FRAPCON and LIFE where applicable, as well as some computationally-intensive operations, such as gap conductance, which depends upon the plenum pressure. Because there is extensive capability in off-the-shelf leadership class computational solvers, AMP will leverage the Trilinos, PETSc, and SUNDIALS packages. The computational infrastructure includes a build system, mesh database, and other building blocks of a computational physics package. The user interface will be developed through a collaborative effort with the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Capability Transfer program element as much as possible and will be discussed in detail in a future document.

  1. Assessment of MARMOT. A Mesoscale Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Tonks, M. R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Zhang, Y. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chakraborty, P. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bai, X. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Fromm, B. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yu, J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Teague, M. C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andersson, D. A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-04-01

    MARMOT is the mesoscale fuel performance code under development as part of the US DOE Nuclear Energy Advanced Modeling and Simulation Program. In this report, we provide a high level summary of MARMOT, its capabilities, and its current state of validation. The purpose of MARMOT is to predict the coevolution of microstructure and material properties of nuclear fuel and cladding. It accomplished this using the phase field method coupled to solid mechanics and heat conduction. MARMOT is based on the Multiphysics Object-Oriented Simulation Environment (MOOSE), and much of its basic capability in the areas of the phase field method, mechanics, and heat conduction come directly from MOOSE modules. However, additional capability specific to fuel and cladding is available in MARMOT. While some validation of MARMOT has been completed in the areas of fission gas behavior and grain growth, much more validation needs to be conducted. However, new mesoscale data needs to be obtained in order to complete this validation.

  2. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, Peter Andrew

    2011-12-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  3. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC)

    International Nuclear Information System (INIS)

    Schultz, Peter Andrew

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M and S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V and V) is required throughout the system to establish evidence-based metrics for the level of confidence in M and S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This report outlines the nature of the V and V challenge at the subcontinuum scale, an approach to incorporate V and V concepts into subcontinuum scale modeling and simulation (M and S), and a plan to incrementally incorporate effective V and V into subcontinuum scale M and S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  4. Performance Based Plastic Design of Concentrically Braced Frame attuned with Indian Standard code and its Seismic Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Sejal Purvang Dalal

    2015-12-01

    Full Text Available In the Performance Based Plastic Design method, the failure mechanism is predetermined, which has made the method popular throughout the world. But due to the lack of proper guidelines and a simple stepwise methodology, it is not yet popular in India. In this paper, a stepwise design procedure for Performance Based Plastic Design of a Concentrically Braced Frame attuned with the Indian Standard code is presented. A comparative seismic performance evaluation of a six-storey concentrically braced frame designed using the displacement-based Performance Based Plastic Design (PBPD) method and the currently used force-based Limit State Design (LSD) method has also been carried out by nonlinear static pushover analysis and time history analysis under three different ground motions. Results show that the Performance Based Plastic Design method is superior to the current design in terms of displacement and acceleration response. Total collapse of the frame is also prevented in the PBPD frame.

  5. Pipe stress analysis on HCCR-TBS ancillary systems in conceptual design

    International Nuclear Information System (INIS)

    Ahn, Mu-Young; Cho, Seungyon; Lee, Eo Hwak; Park, Yi-Hyun; Lee, Youngmin

    2016-01-01

    Highlights: • Pipe stress analysis is performed on the Korean HCCR-TBS for load combinations including seismic events. • The resultant stress meets the requirements of the design code & standard except at one position, where modification is needed. • The results give useful information for design evolution in the next design phase. - Abstract: The Korean Helium Cooled Ceramic Reflector (HCCR) Test Blanket System (TBS) will be tested in ITER to demonstrate the feasibility of the breeding blanket concept. The HCCR-TBS comprises the Test Blanket Module (TBM) with its associated shield, and ancillary systems located in various positions of the ITER building. Currently, conceptual design of the HCCR-TBS is in progress. This paper presents pipe stress analysis results for the HCCR-TBS ancillary systems. The pipe stress analysis was performed in accordance with ASME B31.3 for major pipes of the Helium Cooling System (HCS) and the Coolant Purification System (CPS), which operate at high pressure and temperature. Pipe stresses for various load cases and load combinations were calculated. Operational pressure and temperature during plasma operation were applied as the pressure load and thermal load, respectively. In addition, seismic events were combined to investigate code compliance for the sustained and occasional load cases. It was confirmed that the resultant stress meets the requirements of ASME B31.3 except at one position, which needs modification. These results give useful information for the next design phase, for example, nozzle loads for component selection, support design parameters, etc.
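
    The sustained-load compliance check described can be sketched as a thin-wall screening calculation. The formulas below are textbook thin-wall approximations with invented numbers, not a substitute for the full ASME B31.3 equations or the pipe-stress analysis in the paper.

```python
# Thin-wall screening sketch of a sustained-stress check:
# longitudinal pressure stress plus bending stress vs. an allowable.
# Units: MPa for stress/pressure, mm for dimensions, N*mm for moment.
import math

def longitudinal_pressure_stress(p, d_out, t):
    """Longitudinal pressure stress ~ P*D/(4t) for a thin-wall pipe."""
    return p * d_out / (4.0 * t)

def bending_stress(moment, d_out, t):
    """Bending stress M/Z with thin-wall section modulus Z ~ pi*r^2*t."""
    r = (d_out - t) / 2.0
    z = math.pi * r ** 2 * t
    return moment / z

def sustained_ok(p, d_out, t, moment, s_allow):
    """Pass/fail: combined longitudinal stress against the allowable."""
    s_l = longitudinal_pressure_stress(p, d_out, t) + bending_stress(moment, d_out, t)
    return s_l <= s_allow
```

    Code-compliance work additionally handles intensification factors, occasional load combinations, and displacement stress ranges, which a screening sketch like this omits.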

  6. Pipe stress analysis on HCCR-TBS ancillary systems in conceptual design

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Mu-Young, E-mail: myahn74@nfri.re.kr [National Fusion Research Institute, Daejeon (Korea, Republic of); Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of); Lee, Eo Hwak [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Park, Yi-Hyun; Lee, Youngmin [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2016-11-01

    Highlights: • Pipe stress analysis is performed on the Korean HCCR-TBS for load combinations including seismic events. • The resultant stress meets the requirements of the design code & standard except at one position, where modification is needed. • The results give useful information for design evolution in the next design phase. - Abstract: The Korean Helium Cooled Ceramic Reflector (HCCR) Test Blanket System (TBS) will be tested in ITER to demonstrate the feasibility of the breeding blanket concept. The HCCR-TBS comprises the Test Blanket Module (TBM) with its associated shield, and ancillary systems located in various positions of the ITER building. Currently, conceptual design of the HCCR-TBS is in progress. This paper presents pipe stress analysis results for the HCCR-TBS ancillary systems. The pipe stress analysis was performed in accordance with ASME B31.3 for major pipes of the Helium Cooling System (HCS) and the Coolant Purification System (CPS), which operate at high pressure and temperature. Pipe stresses for various load cases and load combinations were calculated. Operational pressure and temperature during plasma operation were applied as the pressure load and thermal load, respectively. In addition, seismic events were combined to investigate code compliance for the sustained and occasional load cases. It was confirmed that the resultant stress meets the requirements of ASME B31.3 except at one position, which needs modification. These results give useful information for the next design phase, for example, nozzle loads for component selection, support design parameters, etc.

  7. Analysis of Subjective Conceptualizations Towards Collective Conceptual Modelling

    DEFF Research Database (Denmark)

    Glückstad, Fumiko Kano; Herlau, Tue; Schmidt, Mikkel Nørgaard

    2013-01-01

    This work is conducted as a preliminary study for a project in which individuals' conceptualizations of domain knowledge will be thoroughly analyzed across 150 subjects from 6 countries. The project aims at investigating how humans' conceptualizations differ according to different types of mother la...

  8. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Lei Ye

    2009-01-01

    Full Text Available This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.
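
    The per-subband adaptation described (each subband independently picks the highest-throughput modulation and code-rate pair its channel quality supports) can be sketched as a threshold lookup. The SNR switching levels below are hypothetical, not the thresholds used in the paper.

```python
# Per-subband rate adaptation sketch: choose the highest-throughput
# (bits/symbol, code rate) pair whose SNR threshold the subband meets.
# (bits per symbol, code rate, min SNR in dB) -- illustrative thresholds.
MODES = [
    (1, 1 / 3, 2.0),   # BPSK,  rate 1/3
    (2, 1 / 2, 6.0),   # QPSK,  rate 1/2
    (4, 1 / 2, 12.0),  # 16QAM, rate 1/2
    (6, 1 / 2, 18.0),  # 64QAM, rate 1/2
]

def pick_mode(snr_db):
    """Highest-throughput mode whose threshold is met (None if none)."""
    best = None
    for bits, rate, thresh in MODES:
        if snr_db >= thresh:
            best = (bits, rate)
    return best

def subband_throughput(subband_snrs):
    """Sum of information bits/symbol across independently adapted subbands."""
    total = 0.0
    for snr in subband_snrs:
        mode = pick_mode(snr)
        if mode:
            total += mode[0] * mode[1]
    return total

# usage on a frequency-selective channel: a deep fade, a mid, a strong subband
rate = subband_throughput([1.0, 7.5, 20.0])  # 0 + 1.0 + 3.0 bits/symbol
```

    The "single turbo code" variant in the paper instead encodes one codeword across all subbands, so only the modulation varies per subband.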

  9. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that the GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaption, to adjust the error correction strength depending on the optical channel conditions.
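
    The component ("local") codes that GLDPC constructions build on can be illustrated with the smallest classical example: plain syndrome decoding of a Hamming(7,4) code. This is a sketch of single-error correction only, not the MAP (Ashikhmin-Lytsin) local decoding used in the paper.

```python
# Hamming(7,4) syndrome decoding sketch: the syndrome of a single-bit
# error equals the binary representation of the error position (1..7).
import numpy as np

# Parity-check matrix: column j holds the 3-bit binary form of j+1.
H = np.array([[int(b) for b in f"{c:03b}"] for c in range(1, 8)]).T  # 3x7

def correct(word):
    """Correct at most one flipped bit in a 7-bit word."""
    word = np.array(word) % 2
    syndrome = H.dot(word) % 2
    pos = int("".join(map(str, syndrome)), 2)  # 0 means no detected error
    if pos:
        word[pos - 1] ^= 1
    return word
```

    A generalized LDPC code replaces each single-parity check of an LDPC graph with a stronger local code like this one (or BCH, Reed-Muller), which is where the reported coding gains come from.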

  10. An Investigation of Seventh Grade Students’ Performances on Conceptual, Procedural and Graphical Problems Regarding Circles

    Directory of Open Access Journals (Sweden)

    Lütfi İncikabı

    2015-04-01

    Full Text Available The purpose of this study is to determine seventh grade students' preferences among procedural, conceptual and graphical questions on the subject of circles, to define their success levels in their preferences, and to compare students' success levels in one question type with their performances in the other question types. The methodology adopted during this research was a case study. Based on a criterion-based purposive sampling strategy, 98 middle school students were selected as the participants. Data were collected through an achievement test consisting of nine questions (three per question type). The results obtained from the study indicated that students mostly preferred graphical question types. Moreover, the majority of students did not achieve high success levels in their preferred question types. In addition, the students performed better in graphical question types; however, their failure in procedural question types was notable. Keywords: Multiple representations, middle school students, mathematics education, circles

  11. Data exchange between zero dimensional code and physics platform in the CFETR integrated system code

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Guoliang [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Shi, Nan [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China); Zhou, Yifu; Mao, Shifeng [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Jian, Xiang [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, School of Electrical and Electronics Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Chen, Jiale [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China); Liu, Li; Chan, Vincent [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Ye, Minyou, E-mail: yemy@ustc.edu.cn [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China)

    2016-11-01

    Highlights: • The workflow of the zero dimensional code and the multi-dimension physics platform of the CFETR integrated system code is introduced. • The iteration process among the codes in the physics platform is described. • The data transfer between the zero dimensional code and the physics platform, including data iteration and validation, and justification of performance parameters, is presented. - Abstract: The China Fusion Engineering Test Reactor (CFETR) integrated system code contains three parts: a zero dimensional code, a physics platform and an engineering platform. We use the zero dimensional code to identify a set of preliminary physics and engineering parameters for CFETR, which is used as input to initiate multi-dimension studies using the physics and engineering platform for design, verification and validation. Effective data exchange between the zero dimensional code and the physics platform is critical for the optimization of the CFETR design. For example, in evaluating the impact of impurity radiation on core performance, an open field line code is used to calculate the impurity transport from the first-wall boundary to the pedestal. The impurity particles in the pedestal are used as boundary conditions in a transport code for calculating impurity transport in the core plasma and the impact of core radiation on core performance. Comparison of the results from the multi-dimensional study to those from the zero dimensional code is used to further refine the controlled radiation model. The data transfer between the zero dimensional code and the physics platform, including data iteration and validation, and justification of performance parameters, will be presented in this paper.
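
    The exchange described (a 0-D estimate seeds a higher-fidelity model, whose result is fed back until the shared parameter stops changing) can be sketched as a fixed-point loop between two stand-in functions. Both "codes" and all numbers below are invented for illustration, not CFETR physics.

```python
# Fixed-point data-exchange sketch between a 0-D code and a stand-in
# higher-fidelity model. The model is contractive, so iteration converges.
def zero_d_estimate():
    """0-D code: crude first guess of a core parameter (e.g. radiated power)."""
    return 10.0

def physics_platform(p_rad):
    """Stand-in multi-dimensional model: refined value given the current guess."""
    return 8.0 + 0.5 * p_rad

def iterate_exchange(tol=1e-6, max_iter=100):
    """Iterate until successive exchanged values agree within tol."""
    p = zero_d_estimate()
    for _ in range(max_iter):
        p_new = physics_platform(p)
        if abs(p_new - p) < tol:
            return p_new
        p = p_new
    raise RuntimeError("data exchange did not converge")
```

    In practice each "function call" is a full code run with file-based data transfer and validation checks, but the convergence logic has this shape.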

  12. Conceptual IT model

    Science.gov (United States)

    Arnaoudova, Kristina; Stanchev, Peter

    2015-11-01

    Business processes are a key asset of every organization, and the design of business process models is a foremost concern among an organization's functions. Business processes and their proper management depend heavily on the performance of software applications and technology solutions. This paper attempts to define a new conceptual model of an IT service provider, which can be examined as an IT-focused Enterprise model, part of the Enterprise Architecture (EA) school.

  13. Performance Analysis of a De-correlated Modified Code Tracking Loop for Synchronous DS-CDMA System under Multiuser Environment

    Science.gov (United States)

    Wu, Ya-Ting; Wong, Wai-Ki; Leung, Shu-Hung; Zhu, Yue-Sheng

    This paper presents the performance analysis of a De-correlated Modified Code Tracking Loop (D-MCTL) for synchronous direct-sequence code-division multiple-access (DS-CDMA) systems in a multiuser environment. Previous studies have shown that the imbalance of multiple access interference (MAI) in the time-lead and time-lag portions of the signal causes a tracking bias or instability problem in traditional correlating tracking loops such as the delay lock loop (DLL) or the modified code tracking loop (MCTL). In this paper, we exploit the de-correlating technique to combat the MAI at the on-time code position of the MCTL. Unlike applying the same technique to the DLL, which requires an extensive search algorithm to compensate for the noise imbalance and may introduce a small tracking bias at low signal-to-noise ratio (SNR), the proposed D-MCTL has much lower computational complexity and exhibits zero tracking bias over the whole range of SNR, regardless of the number of interfering users. Furthermore, performance analysis and simulations based on Gold codes show that the proposed scheme has better mean square tracking error, mean time to lose lock and near-far resistance than other tracking schemes, including the traditional DLL (T-DLL), the traditional MCTL (T-MCTL) and the modified de-correlated DLL (MD-DLL).
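
    For context, the baseline correlating tracker that the D-MCTL improves on can be sketched as a plain early-late discriminator on a PN sequence. The chip-spaced correlator and 7-chip m-sequence below are illustrative; no de-correlation stage or multiuser interference is modeled.

```python
# Early-late code-tracking discriminator sketch on a repeating PN code.
def correlate(code, rx, shift):
    """Circular correlation of the local code, shifted by `shift` chips."""
    n = len(code)
    return sum(code[(i + shift) % n] * rx[i] for i in range(n))

def early_late_error(code, rx, delay_hat, d=1):
    """Early-minus-late discriminator: zero when delay_hat is on the
    correlation peak; positive when the estimate is too small and
    should be increased."""
    early = correlate(code, rx, delay_hat + d)
    late = correlate(code, rx, delay_hat - d)
    return early - late
```

    In a multiuser setting the early and late correlations pick up unequal MAI, which is the tracking-bias mechanism the paper's de-correlating approach removes.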

  14. A Conceptual Space Logic

    DEFF Research Database (Denmark)

    Nilsson, Jørgen Fischer

    1999-01-01

    Conceptual spaces have been proposed as topological or geometric means for establishing conceptual structures and models. This paper, after briefly reviewing conceptual spaces, focuses on the relationship between conceptual spaces and logical concept languages with operations for combining concepts...... to form concepts. Specifically, an algebraic concept logic is introduced, for which conceptual spaces are installed as semantic domain, as replacement for, or enrichment of, the traditional....

  15. Design and performance of coded aperture optical elements for the CESR-TA x-ray beam size monitor

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, J.P.; Chatterjee, A.; Conolly, C.; Edwards, E.; Ehrlichman, M.P. [Cornell University, Ithaca, NY 14853 (United States); Flanagan, J.W. [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Department of Accelerator Science, Graduate University for Advanced Studies (SOKENDAI), Tsukuba (Japan); Fontes, E. [Cornell University, Ithaca, NY 14853 (United States); Heltsley, B.K., E-mail: bkh2@cornell.edu [Cornell University, Ithaca, NY 14853 (United States); Lyndaker, A.; Peterson, D.P.; Rider, N.T.; Rubin, D.L.; Seeley, R.; Shanks, J. [Cornell University, Ithaca, NY 14853 (United States)

    2014-12-11

    We describe the design and performance of optical elements for an x-ray beam size monitor (xBSM), a device measuring e+ and e− beam sizes in the CESR-TA storage ring. The device can measure vertical beam sizes of 10–100 μm on a turn-by-turn, bunch-by-bunch basis at e± beam energies of ~2–5 GeV. X-rays produced by a hard-bend magnet pass through a single- or multiple-slit (coded aperture) optical element onto a detector. The coded aperture slit pattern and thickness of masking material forming that pattern can both be tuned for optimal resolving power. We describe several such optical elements and show how well predictions of simple models track measured performances. - Highlights: • We characterize optical element performance of an e± x-ray beam size monitor. • We standardize beam size resolving power measurements to reference conditions. • Standardized resolving power measurements compare favorably to model predictions. • Key model features include simulation of photon-counting statistics and image fitting. • Results validate a coded aperture design optimized for the x-ray spectrum encountered.

  16. Characterizing the Fundamental Intellectual Steps Required in the Solution of Conceptual Problems

    Science.gov (United States)

    Stewart, John

    2010-02-01

    At some level, the performance of a science class must depend on what is taught: the information content of the materials and assignments of the course. The introductory calculus-based electricity and magnetism class at the University of Arkansas is examined using a catalog of the basic reasoning steps involved in the solution of problems assigned in the class. This catalog was developed by sampling popular physics textbooks for conceptual problems. The solution to each conceptual problem was decomposed into its fundamental reasoning steps. These fundamental steps are then used to quantify the distribution of conceptual content within the course. Using this characterization technique, an exceptionally detailed picture of the information flow and structure of the class can be produced. The intellectual structure of published conceptual inventories is compared with the information presented in the class, and the dependence of conceptual performance on the details of coverage is extracted.

  17. Using a lumped conceptual hydrological model for five different catchments in Sweden

    OpenAIRE

    Ekenberg, Madeleine

    2016-01-01

    Hydrological models offer powerful tools for understanding and prediction. In this thesis we have reviewed the advantages and disadvantages of physically based distributed hydrological models and conceptual lumped hydrological models. Based on that review, we went into depth and developed a MATLAB code to test whether a simple conceptual lumped hydrological model, namely GR2M, would perform satisfactorily for five different catchments in different parts of Sweden. The model had rather unsatisfactory re...
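    The GR2M model named above is compact enough to sketch directly. The following Python transcription follows the commonly published two-parameter monthly formulation (production store capacity X1 in mm, exchange coefficient X2, with the routing store capacity fixed at 60 mm as in the literature); it is a hedged sketch of the model family, not the thesis's MATLAB code.

```python
import math

def gr2m_step(P, E, S, R, X1, X2):
    """One monthly step of a GR2M-style lumped model (hedged sketch).
    P, E: monthly precipitation and potential evapotranspiration [mm].
    S, R: production and routing store levels [mm].
    Returns (Q, S_new, R_new), with Q the monthly runoff [mm]."""
    # Production store: rainfall filling ...
    phi = math.tanh(P / X1)
    S1 = (S + X1 * phi) / (1.0 + phi * S / X1)
    P1 = P + S - S1                      # excess rainfall
    # ... then evaporation drawdown
    psi = math.tanh(E / X1)
    S2 = S1 * (1.0 - psi) / (1.0 + psi * (1.0 - S1 / X1))
    # Percolation out of the production store
    S_new = S2 / (1.0 + (S2 / X1) ** 3) ** (1.0 / 3.0)
    P3 = P1 + (S2 - S_new)               # total water reaching routing
    # Routing store with groundwater exchange (X2), fixed 60 mm constant
    R1 = R + P3
    R2 = X2 * R1
    Q = R2 * R2 / (R2 + 60.0)
    return Q, S_new, R2 - Q
```

A monthly simulation simply chains `gr2m_step` over the forcing series, carrying the two store levels forward; calibration then amounts to searching X1 and X2 against observed discharge.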

  18. Performance and Complexity Co-evaluation of the Advanced Video Coding Standard for Cost-Effective Multimedia Communications

    Directory of Open Access Journals (Sweden)

    Saponara Sergio

    2004-01-01

    Full Text Available The advanced video codec (AVC) standard, recently defined by a joint video team (JVT) of ITU-T and ISO/IEC, is introduced in this paper together with its performance and complexity co-evaluation. While the basic framework is similar to the motion-compensated hybrid scheme of previous video coding standards, additional tools improve the compression efficiency at the expense of an increased implementation cost. As a first step to bridge the gap between the algorithmic design of a complex multimedia system and its cost-effective realization, a high-level co-evaluation approach is proposed and applied to a real-life AVC design. An exhaustive analysis of the codec compression efficiency versus complexity (memory and computational costs) design space is carried out at the early algorithmic design phase. If all new coding features are used, the improved AVC compression efficiency (up to 50% compared to current video coding technology) comes with a complexity increase by a factor of 2 for the decoder and by more than one order of magnitude for the encoder. This represents a challenge for resource-constrained multimedia systems such as wireless devices or high-volume consumer electronics. The analysis also highlights important properties of the AVC framework allowing for complexity reduction at the high system level: when combining the new coding features, the implementation complexity accumulates, while the global compression efficiency saturates. Thus, a proper use of the AVC tools maintains the same performance as the most complex configuration while considerably reducing complexity. The reported results provide inputs to assist the profile definition in the standard, highlight the AVC bottlenecks, and select optimal trade-offs between algorithmic performance and complexity.

  19. Performance Evaluation of HARQ Technique with UMTS Turbo Code

    Directory of Open Access Journals (Sweden)

    S. S. Brkić

    2011-11-01

    Full Text Available The hybrid automatic repeat request (HARQ) technique represents an error control principle which combines an error correcting code and an automatic repeat request (ARQ) procedure within the same transmission system. In this paper, using a Monte Carlo simulation process, the characteristics of the HARQ technique are determined for the case of the Universal Mobile Telecommunication System (UMTS) turbo code.
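    The Monte Carlo evaluation principle behind such studies can be sketched in a few lines. The toy model below is my illustration, not the paper's UMTS turbo-coded setup: it sends fixed-size packets over a binary symmetric channel, assumes ideal error detection, and retransmits until success, estimating the average transmission count and the delivery rate.

```python
import random

def simulate_arq(n_packets=2000, packet_bits=100, p_bit=0.005,
                 max_tx=8, seed=1):
    """Toy type-I ARQ over a BSC with ideal error detection (sketch only).
    Returns (avg_transmissions_per_packet, delivery_rate)."""
    rng = random.Random(seed)
    total_tx, delivered = 0, 0
    for _ in range(n_packets):
        for attempt in range(1, max_tx + 1):
            total_tx += 1
            # The packet is accepted iff no bit flipped on this attempt.
            clean = all(rng.random() >= p_bit for _ in range(packet_bits))
            if clean:
                delivered += 1
                break
    return total_tx / n_packets, delivered / n_packets
```

A hybrid scheme would add forward error correction (and, for type-II, combining of retransmissions) before the error-detection check; the Monte Carlo loop structure stays the same.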

  20. Impact of optical hard limiter on the performance of an optical overlapped-code division multiple access system

    Science.gov (United States)

    Inaty, Elie; Raad, Robert; Tablieh, Nicole

    2011-08-01

    In this paper, a closed-form expression of the multiple access interference (MAI) limited bit error rate (BER) is provided for the multiwavelength optical code-division multiple-access system when the system is working above the nominal transmission rate limit imposed by the passive encoding-decoding operation. This system is known in the literature as the optical overlapped code division multiple access (OV-CDMA) system. A unified analytical framework is presented emphasizing the impact of an optical hard limiter (OHL) on the BER performance of such a system. Results show that the performance of the OV-CDMA system may be highly improved when using OHL preprocessing at the receiver side.

  1. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes.

  2. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes

  3. An overview of the geochemical code MINTEQ: Applications to performance assessment for low-level wastes

    International Nuclear Information System (INIS)

    Peterson, S.R.; Opitz, B.E.; Graham, M.J.; Eary, L.E.

    1987-03-01

    The MINTEQ geochemical computer code, developed at the Pacific Northwest Laboratory (PNL), integrates many of the capabilities of its two immediate predecessors, MINEQL and WATEQ3. The MINTEQ code will be used in the Special Waste Form Lysimeters-Arid program to perform the calculations necessary to simulate (model) the contact of low-level waste solutions with heterogeneous sediments or the interaction of ground water with solidified low-level wastes. The code can calculate ion speciation/solubility, adsorption, oxidation-reduction, gas phase equilibria, and precipitation/dissolution of solid phases. Under the Special Waste Form Lysimeters-Arid program, the composition of effluents (leachates) from column and batch experiments, using laboratory-scale waste forms, will be used to develop a geochemical model of the interaction of ground water with commercial, solidified low-level wastes. The wastes being evaluated include power-reactor waste streams that have been solidified in cement, vinyl ester-styrene, and bitumen. The thermodynamic database for the code was upgraded preparatory to performing the geochemical modeling. Thermodynamic data for solid phases and aqueous species containing Sb, Ce, Cs, or Co were added to the MINTEQ database. The need to add these data was identified from the characterization of the waste streams. The geochemical model developed from the laboratory data will then be applied to predict the release from a field-lysimeter facility that contains full-scale waste samples. The contaminant concentrations migrating from the waste forms predicted using MINTEQ will be compared to the long-term lysimeter data. This comparison will constitute a partial field validation of the geochemical model.
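    To make the idea of an ion speciation calculation concrete, here is a hedged, minimal example unrelated to MINTEQ's internals: solving the charge balance of a closed aqueous carbonate system by bisection, using textbook equilibrium constants (pK1 = 6.35, pK2 = 10.33, pKw = 14 at 25 °C).

```python
import math

K1, K2, KW = 10**-6.35, 10**-10.33, 1e-14  # textbook constants, 25 C

def carbonate_speciation(CT):
    """Solve [H+] for CT mol/L total dissolved CO2 (closed system, no
    other ions) from the charge balance [H+] = [OH-] + [HCO3-] + 2[CO3--].
    Illustrative sketch only, not MINTEQ's algorithm. Returns pH."""
    def charge_imbalance(h):
        denom = h * h + K1 * h + K1 * K2
        hco3 = CT * K1 * h / denom        # bicarbonate fraction of CT
        co3 = CT * K1 * K2 / denom        # carbonate fraction of CT
        return h - KW / h - hco3 - 2.0 * co3
    lo, hi = 1e-12, 1.0
    for _ in range(200):                  # bisection on [H+]
        mid = math.sqrt(lo * hi)          # geometric mean suits log scale
        if charge_imbalance(mid) > 0:
            hi = mid
        else:
            lo = mid
    return -math.log10(math.sqrt(lo * hi))
```

A full speciation code solves many such mass-action and mass-balance equations simultaneously (typically by Newton iteration over all components) rather than a single scalar root.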

  4. HYDRASTAR - a code for stochastic simulation of groundwater flow

    International Nuclear Information System (INIS)

    Norman, S.

    1992-05-01

    The computer code HYDRASTAR was developed as a tool for groundwater flow and transport simulations in the SKB 91 safety analysis project. Its conceptual ideas can be traced back to a report by Shlomo Neuman in 1988, see the reference section. The main idea of the code is the treatment of the rock as a stochastic continuum which separates it from the deterministic methods previously employed by SKB and also from the discrete fracture models. The current report is a comprehensive description of HYDRASTAR including such topics as regularization or upscaling of a hydraulic conductivity field, unconditional and conditional simulation of stochastic processes, numerical solvers for the hydrology and streamline equations and finally some proposals for future developments
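    The stochastic-continuum idea, treating hydraulic conductivity as a spatially correlated random field, can be illustrated with an unconditional simulation in one dimension. The sketch below shows the general technique (Cholesky factorization of an exponential covariance, then a lognormal transform); it is not HYDRASTAR's algorithm, and all parameter values are invented for illustration.

```python
import math
import random

def cholesky(C):
    """Lower-triangular Cholesky factor of a symmetric positive-definite
    matrix (hand-rolled for the sketch; fine for small n)."""
    n = len(C)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(C[i][i] - s)
            else:
                L[i][j] = (C[i][j] - s) / L[j][j]
    return L

def simulate_logK_field(n=25, dx=10.0, corr_len=50.0,
                        mean_logK=-7.0, sigma=1.0, seed=3):
    """Unconditional 1-D simulation of log10-conductivity with
    exponential covariance C(h) = sigma^2 * exp(-|h|/corr_len)."""
    rng = random.Random(seed)
    C = [[sigma**2 * math.exp(-abs(i - j) * dx / corr_len)
          for j in range(n)] for i in range(n)]
    L = cholesky(C)
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    logK = [mean_logK + sum(L[i][k] * z[k] for k in range(i + 1))
            for i in range(n)]
    return [10.0**v for v in logK]        # conductivity values, e.g. m/s
```

Conditional simulation, as in the abstract, additionally forces the realizations to honor measured conductivities at known locations; production codes also use far more scalable generators than dense Cholesky.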

  5. Governance codes: facts or fictions? A study of governance codes in Colombia

    Directory of Open Access Journals (Sweden)

    Julián Benavides Franco

    2010-10-01

    Full Text Available This article studies the effects on accounting performance and financing decisions of Colombian firms after issuing a corporate governance code. We assemble a database of Colombian issuers and test the hypotheses of improved performance and higher leverage after issuing a code. The results show that the firms' return on assets after the code introduction improves in excess of 1%; the effect is amplified by the code quality. Additionally, the firms' leverage increased, in excess of 5%, when the code quality was factored into the analysis. These results suggest that controlling parties' commitment to self-restraint, by reducing their private benefits and/or the expropriation of non-controlling parties, through the code introduction is indeed an effective measure, and that the financial markets agree, increasing the supply of funds to the firms.

  6. Comparative performance evaluation of transform coding in image pre-processing

    Science.gov (United States)

    Menon, Vignesh V.; NB, Harikrishnan; Narayanan, Gayathri; CK, Niveditha

    2017-07-01

    We are in the midst of a communication transformation which drives the development, as well as the dissemination, of innovative communication systems with ever-increasing fidelity and resolution. Considerable research has been devoted to image processing techniques, driven by a growing demand for faster and easier encoding, storage and transmission of visual information. In this paper, the researchers intend to throw light on techniques which can be used at the transmitter end in order to ease the transmission and reconstruction of images. The researchers investigate the performance of different image transform coding schemes used in pre-processing, their comparison and effectiveness, the necessary and sufficient conditions, and their properties and complexity in implementation. Motivated by prior advancements in image processing techniques, the researchers compare the performance of several contemporary image pre-processing frameworks: compressed sensing, singular value decomposition, and the integer wavelet transform. The paper exposes the potential of the integer wavelet transform to be an efficient pre-processing scheme.
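    The integer wavelet transform favored in the abstract is typically realized with the lifting scheme. Below is a minimal sketch of the reversible LeGall 5/3 lifting steps (one decomposition level, even-length signals, mirror boundary handling); it illustrates the technique rather than the paper's exact pipeline.

```python
def lift_53_forward(x):
    """One level of the LeGall 5/3 integer wavelet via lifting.
    Returns (approx, detail); exactly invertible in integer arithmetic."""
    n = len(x)
    assert n % 2 == 0 and n >= 2
    half = n // 2
    d = []
    for i in range(half):   # predict step -> detail coefficients
        right = x[2*i + 2] if 2*i + 2 < n else x[n - 2]   # mirror edge
        d.append(x[2*i + 1] - (x[2*i] + right) // 2)
    s = []
    for i in range(half):   # update step -> approximation coefficients
        prev = d[i - 1] if i > 0 else d[0]                # mirror edge
        s.append(x[2*i] + (prev + d[i] + 2) // 4)
    return s, d

def lift_53_inverse(s, d):
    """Exact integer inverse of lift_53_forward."""
    half = len(s)
    e = []
    for i in range(half):   # undo the update step
        prev = d[i - 1] if i > 0 else d[0]
        e.append(s[i] - (prev + d[i] + 2) // 4)
    x = []
    for i in range(half):   # undo the predict step, re-interleave
        right = e[i + 1] if i + 1 < half else e[half - 1]
        x.extend([e[i], d[i] + (e[i] + right) // 2])
    return x
```

Because each lifting step is undone exactly, the round trip is lossless, which is what makes this transform attractive for reversible pre-processing and compression.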

  7. Conceptual Understanding of Multiplicative Properties through Endogenous Digital Game Play

    Science.gov (United States)

    Denham, Andre

    2012-01-01

    This study aimed to determine the effect of an endogenously designed instructional game on conceptual understanding of the associative and distributive properties of multiplication. Additionally, this study sought to investigate whether performance on measures of conceptual understanding taken prior to and after game play could serve as predictors of…

  8. Divertor modelling for conceptual studies of tokamak fusion reactor FDS-III

    International Nuclear Information System (INIS)

    Chen Yiping; Liu Songlin

    2010-01-01

    Divertor modelling for the conceptual studies of tokamak fusion reactor FDS-III was carried out by using the edge plasma code package B2.5-Eirene (SOLPS5.0). The modelling was performed by taking the real MHD equilibrium and divertor geometry of the reactor into account. The profiles of plasma temperature, density and heat fluxes in the computational region and at the target plates have been obtained. The modelling results show that, with the fusion power P_fu = 2.6 GW and the edge density N_edge = 6.0×10^19 m^-3, the peak values of the electron and ion heat fluxes at the outer target plate of the divertor are 93.92 MW/m^2 and 58.50 MW/m^2, respectively. According to the modelling results, it is suggested that methods for reducing the heat fluxes at the target plates should be used in order to reach an acceptable power flux level at the target plates for the divertor design of the reactor.

  9. A fast and compact Fuel Rod Performance Simulator code for predictive, interpretive and educational purpose

    International Nuclear Information System (INIS)

    Lorenzen, J.

    1990-01-01

    A new Fuel Rod Performance Simulator code, FRPS, has been developed, tested and benchmarked, and is now available in different versions. The user may choose between the batch version INTERPIN, producing results in the form of listings or predefined plots, and the interactive simulator code SIMSIM, which steps through a power history under the control of the user. Both versions are presently running on minicomputers and PCs using EGA graphics. A third version is the implementation in a Studsvik Compact Simulator, with FRPS being one of its various modules receiving the dynamic inputs from the simulator.

  10. Experimental demonstration of the transmission performance for LDPC-coded multiband OFDM ultra-wideband over fiber system

    Science.gov (United States)

    He, Jing; Wen, Xuejie; Chen, Ming; Chen, Lin; Su, Jinshu

    2015-01-01

    To improve the transmission performance of multiband orthogonal frequency division multiplexing (MB-OFDM) ultra-wideband (UWB) over optical fiber, a pre-coding scheme based on low-density parity-check (LDPC) codes is adopted and experimentally demonstrated in the intensity-modulation and direct-detection MB-OFDM UWB over fiber system. Meanwhile, a symbol synchronization and pilot-aided channel estimation scheme is implemented at the receiver of the MB-OFDM UWB over fiber system. The experimental results show that the LDPC pre-coding scheme can work effectively in the MB-OFDM UWB over fiber system. After 70 km standard single-mode fiber (SSMF) transmission, at a bit error rate of 1 × 10-3, the receiver sensitivity is improved by about 4 dB when the LDPC code rate is 75%.

  11. Summary of Conceptual Models and Data Needs to Support the INL Remote-Handled Low-Level Waste Disposal Facility Performance Assessment and Composite Analysis

    International Nuclear Information System (INIS)

    Sondrup, A. Jeff; Schafter, Annette L.; Rood, Arthur S.

    2010-01-01

    An overview of the technical approach and data required to support development of the performance assessment and composite analysis is presented for the remote-handled low-level waste disposal facility on-site alternative being considered at Idaho National Laboratory. Previous analyses and available data that meet requirements are identified and discussed. Outstanding data and analysis needs are also identified and summarized. The on-site disposal facility is being evaluated in anticipation of the closure of the Radioactive Waste Management Complex at the INL. An assessment of facility performance and of the composite performance is required to meet the Department of Energy's Low-Level Waste requirements (DOE Order 435.1, 2001), which stipulate that operation and closure of the disposal facility will be managed in a manner that is protective of worker and public health and safety, and the environment. The corresponding established procedures to ensure these protections are contained in DOE Manual 435.1-1, Radioactive Waste Management Manual (DOE M 435.1-1 2001). Requirements include assessment of (1) all-exposure-pathway, (2) air pathway, (3) radon, and (4) groundwater pathway doses. Doses are computed from radionuclide concentrations in the environment. The performance assessment and composite analysis are being prepared to assess compliance with performance objectives, to establish limits on concentrations and inventories of radionuclides at the facility, and to support specification of design, construction, operation and closure requirements. Technical objectives of the PA and CA are primarily accomplished through the development of an established inventory and through the use of predictive environmental transport models implementing an overarching conceptual framework. This document reviews the conceptual model, inherent assumptions, and data required to implement the conceptual model in a numerical framework.
Available site-specific data and data sources

  12. A Probabilistic Performance Assessment Study of Potential Low-Level Radioactive Waste Disposal Sites in Taiwan

    Science.gov (United States)

    Knowlton, R. G.; Arnold, B. W.; Mattie, P. D.; Kuo, M.; Tien, N.

    2006-12-01

    For several years now, Taiwan has been engaged in a process to select a low-level radioactive waste (LLW) disposal site. Taiwan is generating LLW from operational and decommissioning wastes associated with nuclear power reactors, as well as research, industrial, and medical radioactive wastes. The preliminary selection process has narrowed the search to four potential candidate sites. These sites are to be evaluated in a performance assessment analysis to determine the likelihood of meeting the regulatory criteria for disposal. Sandia National Laboratories and Taiwan's Institute of Nuclear Energy Research have been working together to develop the necessary performance assessment methodology and associated computer models to perform these analyses. The methodology utilizes both deterministic (e.g., single run) and probabilistic (e.g., multiple statistical realizations) analyses to achieve the goals. The probabilistic approach provides a means of quantitatively evaluating uncertainty in the model predictions and a more robust basis for performing sensitivity analyses to better understand what is driving the dose predictions from the models. Two types of disposal configurations are under consideration: a shallow land burial concept and a cavern disposal concept. The shallow land burial option includes a protective cover to limit infiltration potential to the waste. Both conceptual designs call for the disposal of 55 gallon waste drums within concrete lined trenches or tunnels, and backfilled with grout. Waste emplaced in the drums may be solidified. Both types of sites are underlain or placed within saturated fractured bedrock material. These factors have influenced the conceptual model development of each site, as well as the selection of the models to employ for the performance assessment analyses. Several existing codes were integrated in order to facilitate a comprehensive performance assessment methodology to evaluate the potential disposal sites. 
First, a need
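    The deterministic-plus-probabilistic pattern described in this record can be sketched generically. The toy model below is illustrative only: the parameter names, dose function, and distributions are invented for the sketch. It propagates sampled parameters through a simple dose function via Monte Carlo and uses rank (Spearman) correlation to see which input drives the spread in predictions.

```python
import random

def toy_dose(infiltration, kd, inventory):
    """Invented screening-level dose surrogate for the sketch: dose grows
    with infiltration and inventory, shrinks with sorption (kd)."""
    return inventory * infiltration / (1.0 + kd)

def rank(values):
    order = sorted(range(len(values)), key=values.__getitem__)
    r = [0] * len(values)
    for rk, idx in enumerate(order):
        r[idx] = rk
    return r

def spearman(a, b):
    """Spearman rank correlation (no-ties formula)."""
    ra, rb = rank(a), rank(b)
    n = len(a)
    d2 = sum((x - y) ** 2 for x, y in zip(ra, rb))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

def probabilistic_assessment(n=500, seed=7):
    """Monte Carlo propagation plus rank-correlation sensitivity."""
    rng = random.Random(seed)
    infil = [rng.uniform(0.01, 0.2) for _ in range(n)]      # m/yr
    kd = [rng.lognormvariate(2.0, 0.5) for _ in range(n)]   # mL/g
    inv = [rng.uniform(0.9, 1.1) for _ in range(n)]         # relative
    doses = [toy_dose(i, k, q) for i, k, q in zip(infil, kd, inv)]
    return {"mean_dose": sum(doses) / n,
            "s_infiltration": spearman(infil, doses),
            "s_kd": spearman(kd, doses)}
```

In a real assessment the dose function is replaced by the chained transport codes, and stratified (e.g., Latin hypercube) sampling usually replaces simple random sampling; the sensitivity bookkeeping is the same.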

  13. Performance Evaluation of Wavelet-Coded OFDM on a 4.9 Gbps W-Band Radio-over-Fiber Link

    DEFF Research Database (Denmark)

    Cavalcante, Lucas Costa Pereira; Rommel, Simon; Dinis, Rui

    2017-01-01

    Future generation mobile communications running on mm-wave frequencies will require great robustness against frequency selective channels. In this work we evaluate the transmission performance of 4.9 Gbps Wavelet-Coded OFDM signals on a 10 km fiber plus 58 m wireless Radio-over-Fiber link using...... a mm-wave radio frequency carrier. The results show that a 2×128 Wavelet-Coded OFDM system achieves a bit-error rate of 1e-4 with nearly 2.5 dB less signal-to-noise ratio than a convolutional coded OFDM system with equivalent spectral efficiency for 8 GHz-wide signals with 512 sub-carriers on a carrier...

  14. An Analysis of the Global Code of Ethics for Tourism in the Context of Corporate Social Responsibility

    Directory of Open Access Journals (Sweden)

    Buzar Stipe

    2015-12-01

    Full Text Available The author analyzes the Global Code of Ethics for Tourism in the context of corporate social responsibility and the need to discuss this topic in the ethical codes of the business and tourism sector. The text first offers an overview of the fundamental ethical concepts in business ethics and corporate social responsibility, and briefly conceptualizes the relationship between these two fields. Finally, the author analyzes the content of the Global Code of Ethics for Tourism with emphasis on the elements pertaining to corporate social responsibility, after which he offers a critical opinion about the contribution of the aforementioned code.

  15. Current Status of the LIFE Fast Reactors Fuel Performance Codes

    International Nuclear Information System (INIS)

    Yacout, A.M.; Billone, M.C.

    2013-01-01

    The LIFE-4 (Rev. 1) code was calibrated and validated using data from (U,Pu)O2 mixed-oxide fuel pins and UO2 blanket rods which were irradiation-tested under steady-state and transient conditions. It integrates a broad material and fuel-pin irradiation database into a consistent framework for use and extrapolation of the database to reactor design applications. The code is available and runs on different computer platforms (UNIX and PC), and detailed documentation of the code's models, routines, and calibration and validation data sets is available. The LIFE-METAL code is based on LIFE-4 with modifications to include key phenomena applicable to metallic fuel and metallic fuel properties; it was calibrated with a large database from irradiations in EBR-II, though further effort is needed on calibration and detailed documentation. Recent activities with the codes are related to reactor design studies and support of licensing efforts for the 4S and KAERI SFR designs. Future activities are related to re-assessment of the codes' calibration and validation and the inclusion of models for advanced fuels (transmutation fuels).

  16. Conceptualizing Programme Evaluation

    Science.gov (United States)

    Hassan, Salochana

    2013-01-01

    The main thrust of this paper deals with the conceptualization of theory-driven evaluation pertaining to a tutor training programme. Conceptualization of evaluation, in this case, is an integration between a conceptualization model as well as a theoretical framework in the form of activity theory. Existing examples of frameworks of programme…

  17. The Conceptual Framework of Thematic Mapping in Case Conceptualization.

    Science.gov (United States)

    Ridley, Charles R; Jeffrey, Christina E

    2017-04-01

    This article, the 3rd in a series of 5, introduces the conceptual framework for thematic mapping, a novel approach to case conceptualization. The framework is transtheoretical in that it is not constrained by the tenets or concepts of any one therapeutic orientation and transdiagnostic in that it conceptualizes clients outside the constraints of diagnostic criteria. Thematic mapping comprises 4 components: a definition, foundational principles, defining features, and core concepts. These components of the framework, deemed building blocks, are explained in this article. Like the foundation of any structure, the heuristic value of the method requires that the building blocks have integrity, coherence, and sound anchoring. We assert that the conceptual framework provides a solid foundation, making thematic mapping a potential asset in mental health treatment. © 2017 Wiley Periodicals, Inc.

  18. INSTRUMENTAL CONCEPTUALIZATION SUBJECT AREA SOCIOLOGY: SOME POSSIBLE SOLUTIONS

    Directory of Open Access Journals (Sweden)

    E. V. Maslennikov

    2016-01-01

    Full Text Available The article outlines some possible ways of deploying instrumental conceptualization in the subject domains of sociology on the basis of set-theoretic constructs. It substantiates the urgency of applying the structural constructs of mathematics as structure-forming formalisms for the domain of sociology understood as a body of theoretical knowledge. Sociological measurement is formulated, in the broad sense of the concept, as measurement based on the use of instrumental conceptualization in the methodology of sociological research. Instrumental conceptualization here refers to the construction of complex conceptual schemes as structurally interconnected relationships between individual elements, which are the units of consideration of the related conceptual integrity, derived from interpretation of the properties of the "set" construct. Drawing on the properties of sets and the structures of relation systems in the sense of N. Bourbaki, the paper proposes to understand the structural measurement of social phenomena as the interpretation of the investigated properties of social phenomena in terms of a construct underlying a theoretical model, one that reflects the diversity of these qualities with the help of conceptual schemes determining each quality as a structure of relation systems. In conclusion, the article lists some preliminary results, presented in a number of publications, of applying the methodology of instrumental conceptualization in disciplines related to sociology. These works can play a suggestive role in understanding the problem fields and objectives of work on the conceptualization of theoretical sociology using the mathematical theory of structures.

  19. FY2017 Updates to the SAS4A/SASSYS-1 Safety Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-09-30

    The SAS4A/SASSYS-1 safety analysis software is used to perform deterministic analysis of anticipated events as well as design-basis and beyond-design-basis accidents for advanced fast reactors. It plays a central role in the analysis of U.S. DOE conceptual designs, proposed test and demonstration reactors, and in domestic and international collaborations. This report summarizes the code development activities that have taken place during FY2017. Extensions to the void and cladding reactivity feedback models have been implemented, and Control System capabilities have been improved through a new virtual data acquisition system for plant state variables and an additional Block Signal for a variable lag compensator to represent reactivity feedback for novel shutdown devices. Current code development and maintenance needs are also summarized in three key areas: software quality assurance, modeling improvements, and maintenance of related tools. With ongoing support, SAS4A/SASSYS-1 can continue to fulfill its growing role in fast reactor safety analysis and help solidify DOE’s leadership role in fast reactor safety both domestically and in international collaborations.
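    As a generic illustration of the kind of Block Signal mentioned above, and explicitly not the actual SAS4A/SASSYS-1 implementation, a first-order lag compensator can be discretized with a simple explicit update in which the output approaches the input exponentially with time constant tau:

```python
def lag_step(y, u, dt, tau):
    """One explicit-Euler step of a first-order lag, dy/dt = (u - y)/tau.
    Generic control-block illustration (not SAS4A/SASSYS-1 code)."""
    return y + (dt / tau) * (u - y)

def simulate_step_response(tau=1.0, dt=0.05, t_end=5.0):
    """Unit-step response of the lag; returns the sampled output."""
    y, out = 0.0, []
    for _ in range(int(t_end / dt)):
        y = lag_step(y, 1.0, dt, tau)
        out.append(y)
    return out
```

Such a block can shape an input signal (e.g., a reactivity demand) so that downstream models see a smoothed, delayed response; stability of the explicit update requires dt < 2*tau.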

  20. Rate-adaptive BCH coding for Slepian-Wolf coding of highly correlated sources

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Salmistraro, Matteo; Larsen, Knud J.

    2012-01-01

    This paper considers using BCH codes for distributed source coding using feedback. The focus is on coding using short block lengths for a binary source, X, having a high correlation between each symbol to be coded and a side information, Y, such that the marginal probability of each symbol, Xi in X, given Y is highly skewed. In the analysis, noiseless feedback and noiseless communication are assumed. A rate-adaptive BCH code is presented and applied to distributed source coding. Simulation results for a fixed error probability show that rate-adaptive BCH achieves better performance than LDPCA (Low-Density Parity-Check Accumulate) codes for high correlation between source symbols and the side information.
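The syndrome approach behind such distributed source coding can be illustrated with a toy substitute: a sketch using a Hamming(7,4) code in place of the paper's rate-adaptive BCH codes, assuming X and the side information Y differ in at most one bit per block.

```python
# Toy Slepian-Wolf coding with a Hamming(7,4) code (illustrative sketch,
# not the rate-adaptive BCH scheme of the paper). The encoder sends only the
# 3-bit syndrome of X; the decoder recovers X from the syndrome plus the
# correlated side information Y, assuming X and Y differ in at most one bit.

# Parity-check matrix of Hamming(7,4): column j is the binary expansion of j+1.
H = [[(j + 1) >> i & 1 for j in range(7)] for i in range(3)]

def syndrome(bits):
    """3-bit syndrome s = H @ bits (mod 2)."""
    return tuple(sum(h * b for h, b in zip(row, bits)) % 2 for row in H)

def decode(s_x, y):
    """Recover X from its syndrome and the side information Y."""
    s = tuple(a ^ b for a, b in zip(s_x, syndrome(y)))  # syndrome of X xor Y
    if s == (0, 0, 0):
        return list(y)                    # X == Y
    pos = s[0] + 2 * s[1] + 4 * s[2] - 1  # column index matching the syndrome
    x = list(y)
    x[pos] ^= 1                           # flip the single differing bit
    return x

x = [1, 0, 1, 1, 0, 0, 1]
y = [1, 0, 1, 0, 0, 0, 1]                 # differs from x in one position
assert decode(syndrome(x), y) == x        # 7 bits conveyed via a 3-bit syndrome
```

With stronger codes and longer blocks the same idea scales toward the Slepian-Wolf bound; the rate adaptation and feedback studied in the paper are not modelled here.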

  1. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.
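The modular structure described above can be sketched as follows; the module names, scalings, and parameters here are invented for illustration and bear no relation to the actual FEDC models.

```python
# Minimal sketch of the modular systems-code idea (hypothetical modules and
# parameter names, not the actual FEDC models): each component is a function
# with a fixed input/output contract, so one model can be replaced without
# disturbing the rest of the code.

def tf_coil_module(params):
    # toy scaling: peak field falls off with coil radius (illustrative only)
    return {"b_peak": params["b0"] * params["r0"] / params["r_coil"]}

def cost_module(params, coil):
    # toy cost model proportional to a stored-energy-like quantity
    return {"cost": params["c0"] * coil["b_peak"] ** 2 * params["r_coil"] ** 3}

def systems_code(params):
    coil = tf_coil_module(params)     # each module may be updated independently
    return cost_module(params, coil)  # as long as its interface stays fixed

result = systems_code({"b0": 5.0, "r0": 3.0, "r_coil": 1.5, "c0": 2.0})
assert abs(result["cost"] - 675.0) < 1e-9
```

Swapping in a better coil model only requires preserving the `b_peak` output; downstream modules such as the cost model are untouched, which is the benefit the abstract describes.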

  2. Bearing performance degradation assessment based on time-frequency code features and SOM network

    International Nuclear Information System (INIS)

    Zhang, Yan; Tang, Baoping; Han, Yan; Deng, Lei

    2017-01-01

    Bearing performance degradation assessment and prognostics are extremely important in supporting maintenance decisions and guaranteeing the system’s reliability. To achieve this goal, this paper proposes a novel feature extraction method for the degradation assessment and prognostics of bearings. Features of time-frequency codes (TFCs) are extracted from the time-frequency distribution using a hybrid procedure based on short-time Fourier transform (STFT) and non-negative matrix factorization (NMF) theory. An alternative way to design the health indicator is investigated by quantifying the similarity between feature vectors using a self-organizing map (SOM) network. On the basis of this idea, a new health indicator called time-frequency code quantification error (TFCQE) is proposed to assess the performance degradation of the bearing. This indicator is constructed based on the bearing real-time behavior and the SOM model that is previously trained with only the TFC vectors under the normal condition. Vibration signals collected from the bearing run-to-failure tests are used to validate the developed method. The comparison results demonstrate the superiority of the proposed TFCQE indicator over many other traditional features in terms of feature quality metrics, incipient degradation identification and achieving accurate prediction.
    Highlights:
    • Time-frequency codes are extracted to reflect the signals’ characteristics.
    • SOM network served as a tool to quantify the similarity between feature vectors.
    • A new health indicator is proposed to demonstrate the whole stage of degradation development.
    • The method is useful for extracting the degradation features and detecting the incipient degradation.
    • The superiority of the proposed method is verified using experimental data. (paper)
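The core of such an SOM-based indicator is a quantization error against a codebook trained on healthy data; the sketch below illustrates the idea with a hand-made codebook rather than a trained SOM.

```python
# Sketch of the quantification-error idea behind the TFCQE indicator
# (illustrative; a real SOM is trained with neighbourhood updates, whereas
# here the codebook is just a set of reference vectors from normal operation).
import math

def quantization_error(codebook, feature):
    """Distance from a feature vector to its best-matching codebook unit.
    Small values mean behaviour close to the normal condition; growth over
    time signals performance degradation."""
    return min(math.dist(unit, feature) for unit in codebook)

normal_codebook = [[0.0, 1.0], [0.1, 0.9], [0.2, 1.1]]  # from healthy data
healthy = quantization_error(normal_codebook, [0.1, 1.0])
degraded = quantization_error(normal_codebook, [2.0, 3.0])
assert healthy < degraded   # the indicator rises as the bearing degrades
```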

  3. Presenting a conceptual pattern of HSE performance of oil trucks.

    Science.gov (United States)

    Ghaleh, Sahar; Omidvari, Manouchehr; Nassiri, Parvin; Momeni, Mansour; Lavasani, Seyed Mohammadreza Miri

    2018-01-25

    Accidents are among the main problems in the oil product supply chain. The most important effective factors in these events are the kind of trucks used and their health, safety, and environment (HSE) condition. The aim of this study was to present a conceptual pattern of the HSE performance of oil trucks in oil industries. In this study, 20 truck models (with fixed tanks), in use over different periods of time, were investigated. In this regard, the criteria and sub-criteria were first determined in two parts, carrier and tank, and weighted by the fuzzy analytical hierarchy process (FAHP). The results showed that the most important sub-criteria regarding the HSE factors of the carrier were resistance and strength of the front and rear shields, the brake system, and the ventilation system. The most important sub-criteria regarding the HSE factors of the tank were tank shell thickness and a good tank design shape with respect to portable material. It should be noted that the weights of the criteria relative to one another, and of the sub-criteria relative to one another, are not equal; this is important for decision-making. The main reason for the use of trucks with the lowest score in developing countries is the lack of attention by managers to safety issues and to international standards and agreements such as the ADR.
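As an illustration of the weighting step, here is a minimal sketch of the crisp geometric-mean AHP method; the paper uses the fuzzy variant (FAHP), which reduces to this when each fuzzy judgement collapses to a single value. The comparison matrix entries are hypothetical.

```python
# Sketch of criteria weighting by the geometric-mean (crisp) AHP method.
# Hypothetical pairwise judgements; not the study's actual FAHP data.
import math

def ahp_weights(pairwise):
    """Row geometric means of a pairwise-comparison matrix, normalised to 1."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical judgements: brake system vs. ventilation vs. shield strength.
matrix = [
    [1.0, 3.0, 0.5],
    [1 / 3, 1.0, 0.2],
    [2.0, 5.0, 1.0],
]
w = ahp_weights(matrix)
assert abs(sum(w) - 1.0) < 1e-9
assert w[2] > w[0] > w[1]   # shield strength judged most important here
```

The resulting unequal weights are exactly what the abstract cautions decision-makers about: criteria cannot be treated as equally important.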

  4. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Control modules C4, C6

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U. S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume is part of the manual related to the control modules for the newest updated version of this computational package.

  5. Coded communications with nonideal interleaving

    Science.gov (United States)

    Laufer, Shaul

    1991-02-01

    Burst error channels - a type of block interference channels - feature increasing capacity but decreasing cutoff rate as the memory rate increases. Despite the large capacity, there is degradation in the performance of practical coding schemes when the memory length is excessive. A short-coding error parameter (SCEP) was introduced, which expresses a bound on the average decoding-error probability for codes shorter than the block interference length. The performance of a coded slow frequency-hopping communication channel is analyzed for worst-case partial band jamming and nonideal interleaving, by deriving expressions for the capacity and cutoff rate. The capacity and cutoff rate, respectively, are shown to approach and depart from those of a memoryless channel corresponding to the transmission of a single code letter per hop. For multiaccess communications over a slot-synchronized collision channel without feedback, the channel was considered as a block interference channel with memory length equal to the number of letters transmitted in each slot. The effects of an asymmetrical background noise and a reduced collision error rate were studied, as aspects of real communications. The performance of specific convolutional and Reed-Solomon codes was examined for slow frequency-hopping systems with nonideal interleaving. An upper bound is presented for the performance of a Viterbi decoder for a convolutional code with nonideal interleaving, and a soft decision diversity combining technique is introduced.

  6. Distributed space-time coding

    CERN Document Server

    Jing, Yindi

    2014-01-01

    Distributed Space-Time Coding (DSTC) is a cooperative relaying scheme that enables high reliability in wireless networks. This brief presents the basic concept of DSTC, its achievable performance, generalizations, code design, and differential use. Recent results on training design and channel estimation for DSTC and the performance of training-based DSTC are also discussed.

  7. Performance evaluations of advanced massively parallel platforms based on gyrokinetic toroidal five-dimensional Eulerian code GT5D

    International Nuclear Information System (INIS)

    Idomura, Yasuhiro; Jolliet, Sebastien

    2010-01-01

    A gyrokinetic toroidal five-dimensional Eulerian code GT5D is ported on six advanced massively parallel platforms and comprehensive benchmark tests are performed. A parallelisation technique based on physical properties of the gyrokinetic equation is presented. By extending the parallelisation technique with a hybrid parallel model, the scalability of the code is improved on platforms with multi-core processors. In the benchmark tests, good scalability is confirmed up to several thousand cores on every platform, and a maximum sustained performance of ∼18.6 Tflops is achieved using 16384 cores of BX900. (author)

  8. THREEDANT: A code to perform three-dimensional, neutral particle transport calculations

    International Nuclear Information System (INIS)

    Alcouffe, R.E.

    1994-01-01

    The THREEDANT code solves the three-dimensional neutral particle transport equation in its first-order, multigroup, discrete ordinates form. The code allows an unlimited number of groups (depending upon the cross section set), angular quadrature up to S-100, and unlimited Pn order, again depending upon the cross section set. The code has three options for spatial differencing: diamond with set-to-zero fixup, adaptive weighted diamond, and linear nodal. The geometry options are XYZ and RZΘ, with a special XYZ option based upon a volume fraction method. This allows objects or bodies of any shape to be modelled as input, which gives the code as much geometric description flexibility as the Monte Carlo code MCNP. The transport equation is solved by source iteration accelerated by the DSA method. Both inner and outer iterations are so accelerated. Some results are presented which demonstrate the effectiveness of these techniques. The code is available on several types of computing platforms.
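The source-iteration scheme mentioned above can be illustrated on a one-group, infinite-medium toy problem, where the iteration reduces to a scalar fixed point; the slow convergence as the scattering ratio approaches one is precisely what DSA-type acceleration addresses.

```python
# Minimal sketch of source iteration for a one-group, infinite-medium toy
# problem (illustrative only; THREEDANT iterates over full discrete-ordinates
# transport sweeps and accelerates the iteration with DSA).
def source_iteration(c, q, tol=1e-10, max_iters=10_000):
    """Iterate phi <- c*phi + q; converges to q/(1-c) for scattering ratio
    c < 1. Convergence slows as c -> 1, motivating DSA-type acceleration."""
    phi, iters = 0.0, 0
    for iters in range(1, max_iters + 1):
        new = c * phi + q
        if abs(new - phi) < tol:
            phi = new
            break
        phi = new
    return phi, iters

phi, n_low = source_iteration(c=0.5, q=1.0)
assert abs(phi - 2.0) < 1e-8           # analytic limit q/(1-c)
_, n_high = source_iteration(c=0.99, q=1.0)
assert n_high > n_low                  # high scattering ratio converges slowly
```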

  9. Performance-based building codes: a call for injury prevention indicators that bridge health and building sectors.

    Science.gov (United States)

    Edwards, N

    2008-10-01

    The international introduction of performance-based building codes calls for a re-examination of indicators used to monitor their implementation. Indicators used in the building sector have a business orientation, target the life cycle of buildings, and guide asset management. In contrast, indicators used in the health sector focus on injury prevention, have a behavioural orientation, lack specificity with respect to features of the built environment, and do not take into account patterns of building use or building longevity. Suggestions for metrics that bridge the building and health sectors are discussed. The need for integrated surveillance systems in health and building sectors is outlined. It is time to reconsider commonly used epidemiological indicators in the field of injury prevention and determine their utility to address the accountability requirements of performance-based codes.

  10. Performance of Multilevel Coding Schemes with Different Decoding Methods and Mapping Strategies in Mobile Fading Channels

    Institute of Scientific and Technical Information of China (English)

    YUAN Dongfeng; WANG Chengxiang; YAO Qi; CAO Zhigang

    2001-01-01

    Based on the "capacity rule", the performance of multilevel coding (MLC) schemes with different set partitioning strategies and decoding methods in AWGN and Rayleigh fading channels is investigated, in which BCH codes are chosen as component codes and 8ASK modulation is used. Numerical results indicate that the MLC scheme with the UP strategy can obtain optimal performance in AWGN channels and BP is the best mapping strategy for Rayleigh fading channels. The BP strategy is of good robustness in both kinds of channels to realize an optimum MLC system. Multistage decoding (MSD) is a sub-optimal decoding method of MLC for both channels. For the Ungerboeck partitioning (UP) and mixed partitioning (MP) strategies, MSD is strongly recommended for the MLC system, while for the BP strategy, PDL is suggested as a simple decoding method compared with MSD.
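The set-partitioning idea underlying MLC can be sketched for 8ASK with a natural binary labelling (an assumption made here for illustration): each partitioning level doubles the minimum distance within the subsets, which is what makes per-level multistage decoding natural.

```python
# Sketch of Ungerboeck-style set partitioning for 8ASK: at each level the
# constellation splits in two by one label bit, and the minimum distance
# inside each subset doubles. (Natural binary labelling assumed.)
def min_intra_distance(points):
    pts = sorted(points)
    return min(b - a for a, b in zip(pts, pts[1:])) if len(pts) > 1 else float("inf")

constellation = [-7, -5, -3, -1, 1, 3, 5, 7]           # 8ASK, spacing 2
labels = {p: i for i, p in enumerate(constellation)}   # natural labelling

def subsets_at_level(level):
    """Group points by their first `level` label bits (one split per level)."""
    groups = {}
    for p, lab in labels.items():
        groups.setdefault(lab & ((1 << level) - 1), []).append(p)
    return list(groups.values())

dists = [min(min_intra_distance(s) for s in subsets_at_level(lv))
         for lv in range(3)]
assert dists == [2, 4, 8]   # intra-subset distance doubles at each level
```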

  11. Accelerator-driven subcritical facility: Conceptual design development

    Science.gov (United States)

    Gohar, Yousry; Bolshinsky, Igor; Naberezhnev, Dmitry; Duo, Jose; Belch, Henry; Bailey, James

    2006-06-01

    A conceptual design development of an accelerator-driven subcritical facility has been carried out in the preparation of a joint activity with Kharkov Institute of Physics and Technology of Ukraine. The main functions of the facility are the medical isotope production and the support of the Ukraine nuclear industry. An electron accelerator is considered to drive the subcritical assembly. The neutron source intensity and spectrum have been studied. The energy deposition, spatial neutron generation, neutron utilization fraction, and target dimensions have been quantified to define the main target performance parameters, and to select the target material and beam parameters. Different target conceptual designs have been developed based on the engineering requirements including heat transfer, thermal hydraulics, structure, and material issues. The subcritical assembly is designed to obtain the highest possible neutron flux level with a Keff of 0.98. Different fuel materials, uranium enrichments, and reflector materials are considered in the design process. The possibility of using low enrichment uranium without penalizing the facility performance is carefully evaluated. The mechanical design of the facility has been developed to maximize its utility and minimize the time for replacing the target and the fuel assemblies. Safety, reliability, and environmental considerations are included in the facility conceptual design. The facility is configured to accommodate future design improvements, upgrades, and new missions. In addition, it has large design margins to accommodate different operating conditions and parameters. In this paper, the conceptual design and the design analyses of the facility will be presented.
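A back-of-envelope sketch of the Keff = 0.98 design point: in a simple point-kinetics view, a source-driven subcritical assembly amplifies the external neutron source by roughly M = 1/(1 - Keff), so the flux is multiplied about fifty-fold while the assembly stays safely subcritical.

```python
# Illustrative point-kinetics estimate of source multiplication in a
# subcritical assembly; a design-level analysis would use full transport
# calculations, not this one-line formula.
def source_multiplication(k_eff):
    if not 0.0 <= k_eff < 1.0:
        raise ValueError("assembly must be subcritical: 0 <= k_eff < 1")
    return 1.0 / (1.0 - k_eff)

assert abs(source_multiplication(0.98) - 50.0) < 1e-9
assert source_multiplication(0.95) < source_multiplication(0.98)
```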

  12. Development of methodology to construct a generic conceptual model of river-valley evolution for performance assessment of HLW geological disposal

    International Nuclear Information System (INIS)

    Kawamura, Makoto; Tanikawa, Shin-ichi; Yasue, Ken-ichi; Niizato, Tadafumi

    2011-01-01

    In order to assess the long-term safety of a geological disposal system for high-level radioactive waste (HLW), it is important to consider the impact of uplift and erosion, which cannot be precluded on a timescale in the order of several hundred thousand years for many locations in Japan. Geomorphic evolution, caused by uplift and erosion and coupled to climatic and sea-level changes, will impact the geological disposal system due to resulting spatial and temporal changes in the disposal environment. Degradation of HLW barrier performance will be particularly significant when the remnant repository structures near, and are eventually exposed at, the ground surface. In previous studies, fluvial erosion was identified as the key concern in most settings in Japan. Interpretation of the impact of the phenomena at relevant locations in Japan has led to development of a generic conceptual model which contains the features typical of the middle reaches of rivers. Here, therefore, we present a methodology for development of a generic conceptual model based on best current understanding of fluvial erosion in Japan, which identifies the simplifications and uncertainties involved and assesses their consequences in the context of repository performance. (author)

  13. Effects of Divided Attention at Retrieval on Conceptual Implicit Memory

    Directory of Open Access Journals (Sweden)

    Matthew W. Prull

    2016-01-01

    Full Text Available This study investigated whether conceptual implicit memory is sensitive to process-specific interference at the time of retrieval. Participants performed the implicit memory test of category exemplar generation (Experiments 1 and 3), or the matched explicit memory test of category-cued recall (Experiment 2), both of which are conceptually-driven memory tasks, under one of two divided attention (DA) conditions in which participants simultaneously performed a distracting task. The distracting task was either syllable judgments (dissimilar processes) or semantic judgments (similar processes) on unrelated words. Compared to full attention (FA), in which no distracting task was performed, DA had no effect on category exemplar generation priming overall, but reduced category-cued recall similarly regardless of distractor task. Analyses of distractor task performance also revealed differences between implicit and explicit memory retrieval. The evidence suggests that, whereas explicit memory retrieval requires attentional resources and is disrupted by semantic and phonological distracting tasks, conceptual implicit memory is automatic and unaffected even when distractor and memory tasks involve similar processes.

  14. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    A computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other is a mix of computer programming syntax and human language. In this sense queer code can be understood as both an object and subject of study that intervenes in the world’s ‘becoming’ and how material bodies are produced via human and nonhuman practices. Through mixing the natural and computer language, this article presents a script in six parts from a performative lecture for two persons.

  15. MIMO-OFDM System's Performance Using LDPC Codes for a Mobile Robot

    Science.gov (United States)

    Daoud, Omar; Alani, Omar

    This work deals with the performance of a Sniffer Mobile Robot (SNFRbot)-based spatial multiplexed wireless Orthogonal Frequency Division Multiplexing (OFDM) transmission technology. The use of Multi-Input Multi-Output (MIMO)-OFDM technology increases the wireless transmission rate without increasing transmission power or bandwidth. A generic multilayer architecture of the SNFRbot is proposed with low power and low cost. Some experimental results are presented and show the efficiency of sniffing deadly gases, sensing high temperatures and sending live videos of the monitored situation. Moreover, simulation results show the achieved performance by tackling the Peak-to-Average Power Ratio (PAPR) problem of the used technology using Low Density Parity Check (LDPC) codes, and the effect of combating the PAPR on the bit error rate (BER) and the signal-to-noise ratio (SNR) over a Doppler spread channel.
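The PAPR metric that the LDPC-based scheme targets can be computed directly from its definition; the sketch below builds a small OFDM time-domain symbol via an inverse DFT and evaluates the worst case of identical symbols on all subcarriers.

```python
# Sketch of the PAPR metric: an OFDM time-domain symbol is the inverse DFT of
# the subcarrier symbols, and PAPR is the peak instantaneous power divided by
# the mean power. (Illustrative; the paper's LDPC-based PAPR reduction and
# channel model are not reproduced here.)
import cmath, math

def papr_db(subcarriers):
    n = len(subcarriers)
    time = [sum(x * cmath.exp(2j * math.pi * k * t / n)
                for k, x in enumerate(subcarriers)) / n
            for t in range(n)]                       # inverse DFT
    powers = [abs(s) ** 2 for s in time]
    return 10 * math.log10(max(powers) / (sum(powers) / n))

# Worst case: identical symbols on all subcarriers add coherently at t = 0,
# giving PAPR = 10*log10(N); here N = 8, about 9.03 dB.
assert abs(papr_db([1] * 8) - 10 * math.log10(8)) < 1e-6
```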

  16. Effectiveness of Persona with Personality Traits on Conceptual Design

    DEFF Research Database (Denmark)

    Anvari, Farshid; Richards, Deborah; Hitchens, Michael

    2015-01-01

    Conceptual design is an important skill in Software Engineering. Teaching conceptual design that can deliver a useful product is challenging, particularly when access to real users is limited. This study explores the effects of the use of Holistic Personas (i.e. a persona enriched with personality traits) on students’ performance in creating conceptual designs. Our results indicate that the students were able to identify the personality traits of personas and their ratings of the personalities match closely with the intended personalities. A majority of the participants stated that their designs were tailored to meet the needs of the given personas’ personality traits. Results suggest that the Holistic Personas can help students to take into account personality traits in the conceptual design process. Further studies are warranted to assess the value of incorporating Holistic Personas.

  17. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    Science.gov (United States)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
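A minimal sketch of syndrome-source-coding using a Hamming(7,4) code (an illustrative stand-in, not the codes analyzed in the paper): a sparse 7-bit source block is treated as an error pattern and only its 3-bit syndrome is transmitted; reconstruction via coset leaders is exact whenever the block contains at most one 1-bit.

```python
# Sketch of syndrome-source-coding with a Hamming(7,4) code: the source block
# plays the role of an "error pattern", its syndrome is the compressed data,
# and the decoder returns the minimum-weight pattern in the matching coset.

H = [[(j + 1) >> i & 1 for j in range(7)] for i in range(3)]  # parity checks

def syndrome(block):
    return tuple(sum(h * b for h, b in zip(row, block)) % 2 for row in H)

# Coset leaders: the all-zero block plus the seven single-bit patterns.
LEADERS = {syndrome([0] * 7): [0] * 7}
for p in range(7):
    e = [0] * 7
    e[p] = 1
    LEADERS[syndrome(e)] = e

def compress(block):
    """7 source bits -> 3 syndrome bits."""
    return syndrome(block)

def decompress(s):
    """Minimum-weight reconstruction (distortionless for weight <= 1)."""
    return LEADERS[s]

src = [0, 0, 0, 1, 0, 0, 0]           # sparse source block
assert decompress(compress(src)) == src
```

Blocks of weight two or more incur distortion with this tiny code; the paper's point is that longer codes drive that distortion arbitrarily low as the rate approaches the source entropy.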

  18. Performance and Complexity of Tunable Sparse Network Coding with Gradual Growing Tuning Functions over Wireless Networks

    DEFF Research Database (Denmark)

    Garrido, Pablo; Sørensen, Chres Wiant; Roetter, Daniel Enrique Lucani

    2016-01-01

    Random Linear Network Coding (RLNC) has been shown to be a technique with several benefits, in particular when applied over wireless mesh networks, since it provides robustness against packet losses. On the other hand, Tunable Sparse Network Coding (TSNC) is a promising concept, which leverages a trade-off between computational complexity and goodput. An optimal density tuning function has not been found yet, due to the lack of a closed-form expression that links density, performance and computational cost. In addition, it would be difficult to implement, due to the feedback delay. In this work...
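The density knob at the heart of TSNC can be sketched by drawing GF(2) coding coefficients as Bernoulli(d) variables; the tuning functions and feedback policies studied in the paper are not modelled here.

```python
# Sketch of the density parameter in Tunable Sparse Network Coding: coding
# coefficients over GF(2) are drawn Bernoulli(d), so a low density d yields
# cheap (sparse) combinations early on, and d can be grown toward 0.5 as the
# receiver approaches full rank. (Illustrative only.)
import random

def sparse_coding_vector(n_packets, density, rng):
    """Coefficient vector for one coded packet; density = P(coefficient = 1)."""
    return [1 if rng.random() < density else 0 for _ in range(n_packets)]

rng = random.Random(42)
n = 64
sparse = sum(sum(sparse_coding_vector(n, 0.05, rng)) for _ in range(200)) / 200
dense = sum(sum(sparse_coding_vector(n, 0.5, rng)) for _ in range(200)) / 200
assert sparse < dense      # fewer XORs per coded packet at low density
assert abs(dense - 32) < 4 # mean nonzeros is about n * density
```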

  19. Validating Avionics Conceptual Architectures with Executable Specifications

    Directory of Open Access Journals (Sweden)

    Nils Fischer

    2012-08-01

    Full Text Available Current avionics systems specifications, developed after conceptual design, have a high degree of uncertainty. Since specifications are not sufficiently validated in the early development process and no executable specification exists at aircraft level, system designers cannot evaluate the impact of their design decisions at aircraft or aircraft application level. At the end of the development process of complex systems, e.g. aircraft, an average of about 65 per cent of all specifications have to be changed because they are incorrect, incomplete or too vaguely described. In this paper, a model-based design methodology together with a virtual test environment is described that makes complex high-level system specifications executable and testable during the very early levels of system design. An aircraft communication system and its system context are developed to demonstrate the proposed early validation methodology. Executable specifications for early conceptual system architectures enable system designers to couple functions, architecture elements, resources and performance parameters, often called non-functional parameters. An integrated executable specification at Early Conceptual Architecture Level is developed and used to determine the impact of different system architecture decisions on system behavior and overall performance.

  20. SIEX: a correlated code for the prediction of liquid metal fast breeder reactor (LMFBR) fuel thermal performance

    International Nuclear Information System (INIS)

    Dutt, D.S.; Baker, R.B.

    1975-06-01

    The SIEX computer program is a steady state heat transfer code developed to provide thermal performance calculations for a mixed-oxide fuel element in a fast neutron environment. Fuel restructuring, fuel-cladding heat conduction and fission gas release are modeled to provide assessment of the temperature. Modeling emphasis has been placed on correlations to measurable quantities from EBR-II irradiation tests and the inclusion of these correlations in a physically based computational scheme. SIEX is completely modular in construction allowing the user options for material properties and correlated models. Required code input is limited to geometric and environmental parameters, with a "consistent" set of material properties and correlated models provided by the code. 24 references. (U.S.)