WorldWideScience

Sample records for performing organization code

  1. Fuel performance analysis code 'FAIR'

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1994-01-01

    For modelling nuclear reactor fuel rod behaviour of water cooled reactors under severe power maneuvering and high burnups, a mechanistic fuel performance analysis code FAIR has been developed. The code incorporates a finite element based thermomechanical module, a physically based fission gas release module and relevant models for fuel related phenomena such as pellet cracking, densification and swelling, radial flux redistribution across the pellet due to the build-up of plutonium near the pellet surface, and pellet clad mechanical interaction/stress corrosion cracking (PCMI/SCC) failure of the sheath. The code follows the established principles of fuel rod analysis programmes, such as coupling the thermal and mechanical solutions with the fission gas release calculations, analysing different axial segments of the fuel rod simultaneously, and providing means for local analyses such as clad ridging analysis. The modular nature of the code makes it easy to modify for modelling MOX fuels and thorium based fuels. For performing analysis of fuel rods subjected to very long power histories within a reasonable amount of time, the code has been parallelised and is commissioned on the ANUPAM parallel processing system developed at Bhabha Atomic Research Centre (BARC). (author). 37 refs

  2. Setting live coding performance in wider historical contexts

    OpenAIRE

    Norman, Sally Jane

    2016-01-01

    This paper sets live coding in the wider context of performing arts, construed as the poetic modelling and projection of liveness. Concepts of liveness are multiple, evolving, and scale-dependent: entities considered live from different cultural perspectives range from individual organisms and social groupings to entire ecosystems, and consequently reflect diverse temporal and spatial orders. Concepts of liveness moreover evolve with our tools, which generate and reveal new senses and places ...

  3. Performance analysis of WS-EWC coded optical CDMA networks with/without LDPC codes

    Science.gov (United States)

    Huang, Chun-Ming; Huang, Jen-Fa; Yang, Chao-Chin

    2010-10-01

    An extended Welch-Costas (EWC) code family for wavelength-division-multiplexing/spectral-amplitude coding (WDM/SAC; WS) optical code-division multiple-access (OCDMA) networks is proposed. This system has superior performance compared to previous modified quadratic congruence (MQC) coded OCDMA networks. However, since the performance of such a network is unsatisfactory at higher data bit rates, a class of quasi-cyclic low-density parity-check (QC-LDPC) codes is adopted to improve it. Simulation results show that the performance of the high-speed WS-EWC coded OCDMA network can be greatly improved by using the LDPC codes.

  4. Blood and Books: Performing Code Switching

    Directory of Open Access Journals (Sweden)

    Jeff Friedman

    2008-05-01

    Code switching is a linguistic term that identifies ways individuals use communication modes and registers to negotiate difference in social relations. This essay suggests that arts-based inquiry, in the form of choreography and performance, provides a suitable and efficacious location within which both verbal and nonverbal channels of code switching can be investigated. Blood and Books, a case study of dance choreography within the context of post-colonial Maori performance in Aotearoa/New Zealand, is described and analyzed for its performance of code switching. The essay is framed by a discussion of how arts-based research within tertiary higher education requires careful negotiation in the form of code switching, as performed by the author's reflexive use of vernacular and formal registers in the essay. URN: urn:nbn:de:0114-fqs0802462

  5. Establishing a code of ethics for nuclear operating organizations

    International Nuclear Information System (INIS)

    2007-01-01

    The IAEA Technical Working Group on Training and Qualification of Nuclear Power Plant Personnel (TWG-T&Q) recommended that the IAEA develop a publication on improving the performance of nuclear facility operating organizations through focusing on the ethics and professionalism of personnel at all levels of such organizations. This publication has been prepared in response to that recommendation. The TWG-T&Q made its recommendation based upon an understanding that an organization's code of ethics should apply to behaviours at all levels of the organization, from the board room to the working level. The TWG-T&Q also recognized that having the technical competencies related to nuclear technology is not enough to ensure that an operating organization's performance is at the high standards needed for a sustainable nuclear industry. The values and ethics of individuals and organizational units play an equally important role. This publication is addressed primarily to senior managers of operating organizations, as experience has shown that, in order to succeed, such initiatives need to come from and be continually supported by the highest levels of the organization. This publication was developed under an IAEA project in its 2006-2007 programme entitled Achieving Excellence in the Performance of Nuclear Power Plant Personnel. The principal objectives of this project were: (1) to enhance the capability of Member States to utilize proven practices accumulated, developed and transferred by the Agency for improving personnel performance and maintaining high standards, and (2) to demonstrate how positive attitudes and professionalism, appropriate performance management, adherence to a systematic approach to training, quality management and the use of effective information and knowledge management technologies contribute to success in achieving organizational objectives in a challenging business environment.

  6. Performance Analysis of CRC Codes for Systematic and Nonsystematic Polar Codes with List Decoding

    Directory of Open Access Journals (Sweden)

    Takumi Murata

    2018-01-01

    Successive cancellation list (SCL) decoding of polar codes is an effective approach that can significantly outperform the original successive cancellation (SC) decoding, provided that proper cyclic redundancy-check (CRC) codes are employed at the stage of candidate selection. Previous studies on CRC-assisted polar codes mostly focus on improvement of the decoding algorithms and their implementation, and little attention has been paid to the CRC code structure itself. For CRC-concatenated polar codes, where the CRC code serves as the outer code, the use of a longer CRC code leads to a reduction of the information rate, whereas the use of a shorter CRC code may reduce the error detection probability, thus degrading the frame error rate (FER) performance. Therefore, CRC codes of proper length should be employed in order to optimize the FER performance for a given signal-to-noise ratio (SNR) per information bit. In this paper, we investigate the effect of CRC codes on the FER performance of polar codes with list decoding, in terms of the CRC code length as well as its generator polynomials. Both the original nonsystematic and systematic polar codes are considered, and we also demonstrate that different behavior of the CRC codes is observed depending on whether the inner polar code is systematic or not.
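
    A minimal Python sketch of the role the CRC plays in list decoding follows: the decoder produces several candidate codewords and the CRC check selects among them. The toy CRC-3 polynomial, bit lengths, and hand-made candidate list below are illustrative assumptions, not the codes or decoder studied in the paper.

      # Toy illustration of CRC-aided candidate selection (assumed parameters, not the paper's).

      def crc_remainder(bits, poly):
          """CRC remainder of a bit list divided by a generator polynomial (list with leading 1)."""
          reg = list(bits) + [0] * (len(poly) - 1)   # zero padding for the remainder
          for i in range(len(bits)):
              if reg[i]:
                  for j, p in enumerate(poly):
                      reg[i + j] ^= p
          return reg[-(len(poly) - 1):]

      def crc_encode(info_bits, poly):
          """Append the CRC remainder so the whole word is divisible by the polynomial."""
          return info_bits + crc_remainder(info_bits, poly)

      def crc_select(candidates, poly):
          """Return the first list-decoder candidate whose CRC check passes, if any."""
          for cand in candidates:
              if not any(crc_remainder(cand, poly)):
                  return cand
          return None  # no candidate passes: declare a frame error

      if __name__ == "__main__":
          poly = [1, 0, 1, 1]                      # toy CRC-3 generator x^3 + x + 1 (assumption)
          tx = crc_encode([1, 0, 1, 1, 0, 0, 1, 0], poly)
          corrupted = tx[:]; corrupted[2] ^= 1     # single-bit error, detected by the CRC
          candidates = [corrupted, tx, tx[:3] + [1 - tx[3]] + tx[4:]]
          print("selected:", crc_select(candidates, poly))
          print("matches transmitted:", crc_select(candidates, poly) == tx)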

  7. Modification in the FUDA computer code to predict fuel performance at high burnup

    Energy Technology Data Exchange (ETDEWEB)

    Das, M; Arunakumar, B V; Prasad, P N [Nuclear Power Corp., Mumbai (India)

    1997-08-01

    The computer code FUDA (FUel Design Analysis) participated in the blind exercises organized by the IAEA CRP (Co-ordinated Research Programme) on FUMEX (Fuel Modelling at Extended Burnup). While the code predictions compared well with the experiments at Halden under various parametric and operating conditions, the fission gas release and fission gas pressure were found to be slightly over-predicted, particularly at high burnups. In view of the results of the 6 FUMEX cases, the main models and submodels of the code were reviewed and necessary improvements were made. The new version of the code, FUDA MOD 2, is now able to predict fuel performance parameters for burn-ups up to 50000 MWD/TeU. The validation field of the code has been extended to the prediction of thorium oxide fuel performance. An analysis of local deformations at pellet interfaces and near the end caps is carried out considering the hourglassing of the pellet by a finite element technique. (author). 15 refs, 1 fig.

  8. Modification in the FUDA computer code to predict fuel performance at high burnup

    International Nuclear Information System (INIS)

    Das, M.; Arunakumar, B.V.; Prasad, P.N.

    1997-01-01

    The computer code FUDA (FUel Design Analysis) participated in the blind exercises organized by the IAEA CRP (Co-ordinated Research Programme) on FUMEX (Fuel Modelling at Extended Burnup). While the code predictions compared well with the experiments at Halden under various parametric and operating conditions, the fission gas release and fission gas pressure were found to be slightly over-predicted, particularly at high burnups. In view of the results of the 6 FUMEX cases, the main models and submodels of the code were reviewed and necessary improvements were made. The new version of the code, FUDA MOD 2, is now able to predict fuel performance parameters for burn-ups up to 50000 MWD/TeU. The validation field of the code has been extended to the prediction of thorium oxide fuel performance. An analysis of local deformations at pellet interfaces and near the end caps is carried out considering the hourglassing of the pellet by a finite element technique. (author). 15 refs, 1 fig

  9. Code of ethics as a tool for resolving conflict in the organization

    Directory of Open Access Journals (Sweden)

    Prokopenko O.

    2016-02-01

    This article addresses the selection of tools for resolving conflicts in organizations, an issue of considerable importance and topicality. One such tool is an effectively functioning organizational code of ethics; the more detailed this document is, the more effective it is. Nowadays, more and more organizations have their own codes of ethics. It would therefore be wrong to underestimate the code of ethics as a tool that can be used to resolve conflicts in organizations.

  10. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    Fuel performance codes approximate the complex behaviour of nuclear fuel using an axisymmetric, axially-stacked, one-dimensional radial representation to save computation cost. However, the need for improved modeling of PCMI and, particularly, the importance of multidimensional capability for accurate fuel performance simulation have been identified as safety margins decrease. The finite element (FE) method, a reliable and proven solution in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in numerical simulation of FE-based fuel performance predominantly involves 2-D axisymmetric models and 3-D volumetric models. FRAPCON and FRAPTRAN include 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ) fully thermal-mechanically coupled steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and are typically used to investigate localized behavior. Since 2008, the Idaho National Laboratory (INL) has been developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on an investigation into these codes and a comparison of their models, the requirements and direction of development for a new FE-based fuel performance code can be discussed. A new FE-based fuel performance code should include the typical pellet and cladding models that all codes share. In particular, specific pellet and cladding models such as gaseous swelling and high burnup structure (HBS) models should be developed to improve the accuracy of the code as well as to consider AC conditions. To reduce computation cost, approximate gap and optimized contact models should also be developed.
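
    A minimal Python sketch of the "one-dimensional radial representation" mentioned above follows: steady-state heat conduction in a pellet with uniform heat generation, discretized on a radial grid and checked against the classic centerline temperature rise q_lin/(4*pi*k). The pellet radius, conductivity, linear power, and surface temperature are illustrative assumptions, not values from any of the codes discussed.

      import numpy as np

      R = 0.0041        # pellet radius [m] (assumed)
      k = 3.0           # thermal conductivity [W/m-K], taken constant for simplicity
      q_lin = 20.0e3    # linear heat rate [W/m] (assumed)
      T_s = 700.0       # pellet surface temperature [K] (assumed)
      q = q_lin / (np.pi * R**2)   # volumetric heat generation [W/m^3]

      N = 50            # number of radial intervals
      h = R / N
      r = np.linspace(0.0, R, N + 1)

      A = np.zeros((N + 1, N + 1))
      d = np.zeros(N + 1)

      # Centerline node: energy balance over the control volume [0, h/2].
      A[0, 0], A[0, 1], d[0] = -k, k, -q * h**2 / 4.0

      # Interior nodes: control-volume form of (1/r) d/dr (k r dT/dr) = -q.
      for i in range(1, N):
          rm, rp = r[i] - h / 2.0, r[i] + h / 2.0
          A[i, i - 1] = k * rm / h**2
          A[i, i + 1] = k * rp / h**2
          A[i, i] = -(A[i, i - 1] + A[i, i + 1])
          d[i] = -q * r[i]

      # Surface node: fixed temperature (Dirichlet condition).
      A[N, N], d[N] = 1.0, T_s

      T = np.linalg.solve(A, d)
      dT_numeric = T[0] - T_s
      dT_analytic = q_lin / (4.0 * np.pi * k)
      print(f"centerline temperature rise: {dT_numeric:.1f} K (analytic {dT_analytic:.1f} K)")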

  11. Performance measures for transform data coding.

    Science.gov (United States)

    Pearl, J.; Andrews, H. C.; Pratt, W. K.

    1972-01-01

    This paper develops performance criteria for evaluating transform data coding schemes under computational constraints. Computational constraints that conform with the proposed basis-restricted model give rise to suboptimal coding efficiency characterized by a rate-distortion relation R(D) similar in form to the theoretical rate-distortion function. Numerical examples of this performance measure are presented for Fourier, Walsh, Haar, and Karhunen-Loeve transforms.
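
    A minimal Python sketch of the kind of basis-restricted comparison discussed above follows: it measures the distortion left after keeping only m transform coefficients of a first-order Gauss-Markov source, for the Karhunen-Loeve transform versus no transform. The source parameters and block length are illustrative assumptions, and the sketch does not reproduce the paper's rate-distortion relation R(D).

      import numpy as np

      rng = np.random.default_rng(0)
      N, rho, n_blocks = 16, 0.95, 20000   # block length, AR(1) correlation, sample count (assumed)

      # Covariance of an AR(1) process with correlation coefficient rho and unit variance.
      cov = rho ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))

      # KLT basis: eigenvectors of the covariance, ordered by decreasing eigenvalue.
      eigval, eigvec = np.linalg.eigh(cov)
      klt = eigvec[:, ::-1].T              # rows are analysis vectors

      x = rng.multivariate_normal(np.zeros(N), cov, size=n_blocks)

      def distortion(blocks, basis, m):
          """Mean-squared error when only the m largest-variance coefficients are kept."""
          coeff = blocks @ basis.T
          keep = np.argsort(coeff.var(axis=0))[::-1][:m]
          trunc = np.zeros_like(coeff)
          trunc[:, keep] = coeff[:, keep]
          rec = trunc @ basis              # basis is orthonormal, so its inverse is its transpose
          return np.mean((blocks - rec) ** 2)

      for m in (2, 4, 8):
          print(f"m={m:2d}  KLT MSE={distortion(x, klt, m):.4f}  "
                f"identity MSE={distortion(x, np.eye(N), m):.4f}")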

  12. Developments of fuel performance analysis codes in KEPCO NF

    International Nuclear Information System (INIS)

    Han, H. T.; Choi, J. M.; Jung, C. D.; Yoo, J. S.

    2012-01-01

    KEPCO NF has developed a fuel performance analysis and design code named ROPER, and the utility codes XGCOL and XDNB, in order to perform fuel rod design evaluation for Korean nuclear power plants. The ROPER code is intended to cover the full range of fuel performance evaluation. The XGCOL code is for clad flattening evaluation and the XDNB code is for extensive DNB propagation evaluation. In addition to these, KEPCO NF is now developing a 3-dimensional fuel performance analysis code, named OPER3D, using 3-dimensional FEM for the next generation, within a joint project with CANDU Energy, in order to analyze PCMI behavior and fuel performance under load following operation. Of these, the ROPER code is now undergoing licensing review by the Korean regulatory body and the other two are almost in the final development stage. After development is finished, licensing activities are to be performed. These activities are intended to secure competitiveness, originality and vendor-free ownership of fuel performance codes at KEPCO NF

  13. On the Performance of the Cache Coding Protocol

    Directory of Open Access Journals (Sweden)

    Behnaz Maboudi

    2018-03-01

    Network coding approaches typically consider an unrestricted recoding of coded packets in the relay nodes to increase performance. However, this can expose the system to pollution attacks that cannot be detected during transmission, until the receivers attempt to recover the data. To prevent these attacks while allowing for the benefits of coding in mesh networks, the cache coding protocol was proposed. This protocol only allows recoding at the relays when the relay has received enough coded packets to decode an entire generation of packets. At that point, the relay node recodes and signs the recoded packets with its own private key, allowing the system to detect and minimize the effect of pollution attacks and making the relays accountable for changes on the data. This paper analyzes the delay performance of cache coding to understand the security-performance trade-off of this scheme. We introduce an analytical model for the case of two relays in an erasure channel relying on an absorbing Markov chain and an approximate model to estimate the performance in terms of the number of transmissions before successfully decoding at the receiver. We confirm our analysis using simulation results. We show that cache coding can overcome the security issues of unrestricted recoding with only a moderate decrease in system performance.
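
    A minimal Python sketch of the absorbing-Markov-chain calculation underlying such delay models follows: the expected number of transmissions before successful decoding is read off the fundamental matrix N = (I - Q)^-1. The toy three-state transition matrix is an illustrative assumption, not the paper's two-relay erasure-channel model.

      import numpy as np

      # Transition matrix ordered as [transient states..., absorbing state].
      P = np.array([
          [0.2, 0.5, 0.0, 0.3],   # e.g. "nothing delivered yet"
          [0.0, 0.3, 0.4, 0.3],   # e.g. "part of the generation delivered"
          [0.0, 0.0, 0.4, 0.6],   # e.g. "waiting for the last packet"
          [0.0, 0.0, 0.0, 1.0],   # absorbing: receiver has decoded
      ])

      n_transient = 3
      Q = P[:n_transient, :n_transient]          # transitions among transient states
      N = np.linalg.inv(np.eye(n_transient) - Q) # fundamental matrix
      expected_steps = N.sum(axis=1)             # expected steps to absorption from each state

      print("expected transmissions from each starting state:", expected_steps)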

  14. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    Science.gov (United States)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase induced intensity noise (PIIN). The performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis, by referring to the bit error rate (BER), signal to noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and gives better performance than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.
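
    A minimal Python sketch of the SNR-to-BER mapping commonly used in SAC-OCDMA analyses follows, based on the Gaussian approximation BER = 0.5*erfc(sqrt(SNR/8)). It is not the paper's full MAI/PIIN noise model, and the SNR values below are illustrative assumptions.

      import math

      def sac_ocdma_ber(snr_linear):
          """Bit error rate under the Gaussian approximation used in SAC-OCDMA studies."""
          return 0.5 * math.erfc(math.sqrt(snr_linear / 8.0))

      for snr_db in (15, 20, 25, 30):          # illustrative SNR points, not the paper's results
          snr = 10 ** (snr_db / 10.0)
          print(f"SNR = {snr_db} dB -> BER = {sac_ocdma_ber(snr):.3e}")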

  15. Input data required for specific performance assessment codes

    International Nuclear Information System (INIS)

    Seitz, R.R.; Garcia, R.S.; Starmer, R.J.; Dicke, C.A.; Leonard, P.R.; Maheras, S.J.; Rood, A.S.; Smith, R.W.

    1992-02-01

    The Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory generated this report on input data requirements for computer codes to assist States and compacts in their performance assessments. This report gives generators, developers, operators, and users some guidelines on what input data are required to satisfy 22 common performance assessment codes. Each of the codes is summarized and a matrix table is provided to allow comparison of the various inputs required by the codes. This report does not determine or recommend which codes are preferable.

  16. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    The finite element (FE) method, a reliable and proven solution in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in numerical simulation of FE-based fuel performance predominantly involves 2-D axisymmetric models and 3-D volumetric models. FRAPCON and FRAPTRAN include 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ) fully thermal-mechanically coupled steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and are typically used to investigate localized behavior. Since 2008, the Idaho National Laboratory (INL) has been developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on an investigation into these codes and a comparison of their models, the requirements and direction of development for a new FE-based fuel performance code can be discussed. A new FE-based fuel performance code should include the typical pellet and cladding models that all codes share. In particular, specific pellet and cladding models such as gaseous swelling and high burnup structure (HBS) models should be developed to improve the accuracy of the code as well as to consider AC conditions. To reduce computation cost, approximate gap and optimized contact models should also be developed. Nuclear fuel operates in an extreme environment that induces complex multiphysics phenomena, occurring over distances ranging from inter-atomic spacing to meters, and time scales ranging from microseconds to years. This multiphysics behavior is often tightly coupled, a well known example being the thermomechanical behavior. Adding to this complexity, important aspects of fuel behavior are inherently

  17. Practical moral codes in the transgenic organism debate.

    Science.gov (United States)

    Cooley, D R; Goreham, Gary; Youngs, George A

    2004-01-01

    In one study funded by the United States Department of Agriculture, people from North Dakota were interviewed to discover which moral principles they use in evaluating the morality of transgenic organisms and their introduction into markets. It was found that although the moral codes the human subjects employed were very similar, their views on transgenics were vastly different. In this paper, the codes that were used by the respondents are developed, compared to that of the academically composed Belmont Report, and then modified to create the more practical Common Moral Code. At the end, it is shown that the Common Moral Code has inherent inconsistency flaws that might be resolvable, but would require extensive work on the definition of terms and principles. However, the effort is worthwhile, especially if it results in a common moral code that all those involved in the debate are willing to use in negotiating a resolution to their differences.

  18. Performance Tuning of x86 OpenMP Codes with MAQAO

    Science.gov (United States)

    Barthou, Denis; Charif Rubial, Andres; Jalby, William; Koliai, Souad; Valensi, Cédric

    Failing to find the best optimization sequence for a given application code can lead to compiler-generated code with poor performance or inappropriate code. It is necessary to analyze performance at the level of the generated assembly code to improve on the compilation process. This paper presents a tool for the performance analysis of multithreaded codes (OpenMP programs are supported at the moment). MAQAO relies on static performance evaluation to identify compiler optimizations and assess the performance of loops. It exploits static binary rewriting for reading and instrumenting object files or executables. Static binary instrumentation allows the insertion of probes at the instruction level. Memory accesses can be captured to help tune the code, but such traces need to be compressed. MAQAO can analyze the results and provide hints for tuning the code. We show on some examples how this can help users improve their OpenMP applications.

  19. The UK core performance code package

    International Nuclear Information System (INIS)

    Hutt, P.K.; Gaines, N.; McEllin, M.; White, R.J.; Halsall, M.J.

    1991-01-01

    Over the last few years work has been co-ordinated by Nuclear Electric, originally part of the Central Electricity Generating Board, with contributions from the United Kingdom Atomic Energy Authority and British Nuclear Fuels Limited, to produce a generic, easy-to-use and integrated package of core performance codes able to perform a comprehensive range of calculations for fuel cycle design, safety analysis and on-line operational support for Light Water Reactor and Advanced Gas Cooled Reactor plant. The package consists of modern rationalized generic codes for lattice physics (WIMS), whole reactor calculations (PANTHER), thermal hydraulics (VIPRE) and fuel performance (ENIGMA). These codes, written in FORTRAN77, are highly portable and new developments have followed modern quality assurance standards. These codes can all be run stand-alone, but they are also being integrated within a new UNIX-based interactive system called the Reactor Physics Workbench (RPW). The RPW provides an interactive user interface and a sophisticated data management system. It offers quality assurance features to the user and has facilities for defining complex calculational sequences. The paper reviews the current capabilities of these components and their integration within the package, and outlines future developments underway. Finally, the paper describes the development of an on-line version of this package which is now being commissioned on UK AGR stations. (author)

  20. BER performance comparison of optical CDMA systems with/without turbo codes

    Science.gov (United States)

    Kulkarni, Muralidhar; Chauhan, Vijender S.; Dutta, Yashpal; Sinha, Ravindra K.

    2002-08-01

    In this paper, we have analyzed and simulated the BER performance of a turbo coded optical code-division multiple-access (TC-OCDMA) system. A performance comparison has been made between uncoded OCDMA and TC-OCDMA systems employing various OCDMA address codes (optical orthogonal codes (OOCs), Generalized Multiwavelength Prime codes (GMWPCs), and Generalized Multiwavelength Reed-Solomon codes (GMWRSCs)). The BER performance of TC-OCDMA systems has been analyzed and simulated by varying the code weight of the address code employed by the system. From the simulation results, it is observed that lower-weight address codes can be employed in TC-OCDMA systems to achieve BER performance equivalent to that of uncoded systems employing higher-weight address codes, for a fixed number of active users.

  1. Performance of code 'FAIR' in IAEA CRP on FUMEX

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Kakodkar, A.

    1996-01-01

    A modern fuel performance analysis code, FAIR, has been developed for analysing high burnup fuel pins of water/heavy water cooled reactors. The code employs the finite element method for modelling the thermomechanical behaviour of fuel pins and mechanistic models for the various physical and chemical phenomena affecting the behaviour of nuclear reactor fuel pins. High burnup effects such as pellet thermal conductivity degradation, enhanced fission gas release and radial flux redistribution are incorporated in the code FAIR. The code FAIR is capable of performing statistical analysis of fuel pins using the Monte Carlo technique. The code is implemented on the BARC parallel processing system ANUPAM. The code has recently participated in an International Atomic Energy Agency (IAEA) coordinated research programme (CRP) on fuel modelling at extended burnups (FUMEX). Nineteen agencies from different countries participated in this exercise. In this CRP, spread over a period of three years, a number of high burnup fuel pins irradiated at the Halden reactor were analysed. The first phase of the CRP was a blind code comparison exercise, where the computed results were compared with experimental results. The second phase consisted of modifications to the code based on the experimental results of the first phase and statistical analysis of fuel pins. The performance of the code FAIR in this CRP has been very good. The present report highlights the main features of code FAIR and its performance in the IAEA CRP on FUMEX. 14 refs., 5 tabs., ills

  2. Long Non-Coding RNAs in Metabolic Organs and Energy Homeostasis

    Directory of Open Access Journals (Sweden)

    Maude Giroud

    2017-11-01

    Single-cell organisms can surprisingly exceed humans in the number of protein-coding genes, which are thus not at the origin of the complexity of an organism. In contrast, the relative amount of non-protein-coding sequences increases consistently with organismal complexity. Moreover, the mammalian transcriptome predominantly comprises non-protein-coding RNAs (ncRNAs), of which the long ncRNAs (lncRNAs) constitute the most abundant part. lncRNAs are highly species- and tissue-specific, with very versatile modes of action in accordance with their binding to a large spectrum of molecules and their diverse localization. lncRNAs are transcriptional regulators adding an additional regulatory layer in biological processes and pathophysiological conditions. Here, we review lncRNAs affecting metabolic organs with a focus on the liver, pancreas, skeletal muscle, cardiac muscle, brain, and adipose organ. In addition, we discuss the impact of lncRNAs on metabolic diseases such as obesity and diabetes. In contrast to the substantial number of lncRNA loci in the human genome, the functionally characterized lncRNAs are just the tip of the iceberg. So far, our knowledge concerning lncRNAs in energy homeostasis is still in its infancy, meaning that the rest of the iceberg is a treasure chest yet to be discovered.

  3. On the performance of diagonal lattice space-time codes

    KAUST Repository

    Abediseid, Walid

    2013-11-01

    There has been tremendous work done on designing space-time codes for the quasi-static multiple-input multiple-output (MIMO) channel. All coding designs to date focus on high performance, high rates, low complexity encoding and decoding, or a combination of these criteria [1]-[9]. In this paper, we analyze in detail the performance limits of diagonal lattice space-time codes under lattice decoding. We present both lower and upper bounds on the average decoding error probability. We first derive a new closed-form expression for the lower bound using the so-called sphere lower bound. This bound presents the ultimate performance limit a diagonal lattice space-time code can achieve at any signal-to-noise ratio (SNR). The upper bound is then derived using the union bound, which demonstrates how the average error probability can be minimized by maximizing the minimum product distance of the code. Combining both the lower and the upper bounds on the average error probability yields a simple upper bound on the minimum product distance that any (complex) lattice code can achieve. In the high-SNR regime, we discuss the outage performance of such codes and provide the achievable diversity-multiplexing tradeoff under lattice decoding. © 2013 IEEE.
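
    A minimal Python sketch of the quantity highlighted by the union bound follows: the minimum product distance of a small diagonal code, here built by rotating 2-PAM pairs by an angle theta. The construction and the angles are illustrative assumptions, not the lattice codes analyzed in the paper; they merely show why a zero coordinate difference destroys the product distance.

      import itertools
      import numpy as np

      def min_product_distance(codebook):
          """Smallest product of per-coordinate distances over all distinct codeword pairs."""
          best = np.inf
          for x, y in itertools.combinations(codebook, 2):
              diffs = np.abs(np.asarray(x) - np.asarray(y))
              if np.all(diffs > 0):            # full-diversity pairs only
                  best = min(best, float(np.prod(diffs)))
              else:
                  return 0.0                   # a zero coordinate difference kills diversity
          return best

      def rotated_pam_codebook(theta, pam=(-1.0, 1.0)):
          """Rotate every 2-PAM pair by theta; each rotated vector fills a diagonal codeword."""
          rot = np.array([[np.cos(theta), -np.sin(theta)],
                          [np.sin(theta),  np.cos(theta)]])
          return [rot @ np.array(s) for s in itertools.product(pam, repeat=2)]

      for deg in (0.0, 15.0, 31.7):            # illustrative rotation angles
          dp = min_product_distance(rotated_pam_codebook(np.radians(deg)))
          print(f"rotation {deg:5.1f} deg -> minimum product distance {dp:.4f}")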

  4. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to be able to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.

  5. Performance of FSO-OFDM based on BCH code

    Directory of Open Access Journals (Sweden)

    Jiao Xiao-lu

    2016-01-01

    Compared with the traditional on-off keying (OOK) system, an FSO-OFDM system can resist atmospheric scattering and improve spectrum utilization effectively. Due to the instability of the atmospheric channel, the system is affected by various factors, resulting in a high BER. BCH codes have good error correcting ability, particularly at short and medium code lengths, and their performance is close to the theoretical value. They can not only detect burst errors but also correct random errors. Therefore, a BCH code is applied to the system to reduce the system BER. Finally, a semi-physical simulation has been conducted with MATLAB. The simulation results show that at a BER of 10^-2, the performance of OFDM is 4 dB better than OOK. Under different weather conditions (extension rain, advection fog, dust days), at a BER of 10^-5, the performance of BCH(255,191) channel coding is 4-5 dB better than the uncoded system. All in all, OFDM technology and BCH codes can reduce the system BER.
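
    A minimal Python sketch of a hard-decision, bounded-distance estimate of the benefit of BCH coding follows: a block fails only when more than t channel errors fall within its n bits. BCH(255,191) is taken to correct t = 8 errors (its standard capability); the channel error rates are illustrative assumptions and the sketch is not the paper's semi-physical FSO simulation.

      from math import comb

      def block_error_prob(n, t, p):
          """P(more than t errors in n bits) over a binary symmetric channel with error rate p."""
          return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(t + 1, n + 1))

      n, k, t = 255, 191, 8
      for p in (1e-2, 5e-3, 1e-3):             # illustrative channel bit error rates
          print(f"channel BER {p:.0e} -> post-decoding block error rate {block_error_prob(n, t, p):.3e}")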

  6. Performance of JPEG Image Transmission Using Proposed Asymmetric Turbo Code

    Directory of Open Access Journals (Sweden)

    Siddiqi Mohammad Umar

    2007-01-01

    This paper gives the results of a simulation study on the performance of JPEG image transmission over AWGN and Rayleigh fading channels using typical and proposed asymmetric turbo codes for error control coding. The baseline JPEG algorithm is used to compress a QCIF "Suzie" image. A recursive systematic convolutional (RSC) encoder with generator polynomials (13/11) in decimal and a 3G interleaver are used for the typical WCDMA and CDMA2000 turbo codes. The proposed asymmetric turbo code uses generator polynomials (13/11; 13/9) in decimal and a code-matched interleaver. The effect of the interleaver in the proposed asymmetric turbo code is studied using weight distribution and simulation. The simulation results and performance bound for the proposed asymmetric turbo code, for the frame length and code rate considered, with a Log-MAP decoder over an AWGN channel, are compared with the typical system. From the simulation results, it is observed that image transmission using the proposed asymmetric turbo code performs better than that with the typical system.

  7. Predictive Bias and Sensitivity in NRC Fuel Performance Codes

    Energy Technology Data Exchange (ETDEWEB)

    Geelhood, Kenneth J.; Luscher, Walter G.; Senor, David J.; Cunningham, Mitchel E.; Lanning, Donald D.; Adkins, Harold E.

    2009-10-01

    The latest versions of the fuel performance codes FRAPCON-3 and FRAPTRAN were examined to determine whether the codes are intrinsically conservative. Each individual model and type of code prediction was examined and compared to the data that were used to develop the model. In addition, a brief literature search was performed to determine whether more recent data have become available since the original model development, for model comparison.

  8. Performance evaluation based on data from code reviews

    OpenAIRE

    Andrej, Sekáč

    2016-01-01

    Context. Modern code review tools such as Gerrit have made available great amounts of code review data from different open source projects as well as other commercial projects. Code reviews are used to keep the quality of produced source code under control, but the stored data could also be used for evaluation of the software development process. Objectives. This thesis uses machine learning methods to approximate a review expert's performance evaluation function. Due to limitations in ...

  9. A comparison of thermal algorithms of fuel rod performance code systems

    International Nuclear Information System (INIS)

    Park, C. J.; Park, J. H.; Kang, K. H.; Ryu, H. J.; Moon, J. S.; Jeong, I. H.; Lee, C. Y.; Song, K. C.

    2003-11-01

    The goal of fuel rod performance analysis is to verify the robustness of a fuel rod and its cladding material. Computer simulation of fuel rod performance has become an important part of designing and evaluating new nuclear fuels and claddings. To construct a computing code system for fuel rod performance, several algorithms of existing fuel rod performance code systems are compared and summarized as a preliminary work. Among several code systems, FRAPCON and FEMAXI for LWRs, ELESTRES for CANDU reactors, and LIFE for fast reactors are reviewed. The thermal algorithms of the above codes are investigated, including their methodologies and subroutines. This work will be utilized to construct a computing code system for dry process fuel rod performance

  10. A comparison of thermal algorithms of fuel rod performance code systems

    Energy Technology Data Exchange (ETDEWEB)

    Park, C. J.; Park, J. H.; Kang, K. H.; Ryu, H. J.; Moon, J. S.; Jeong, I. H.; Lee, C. Y.; Song, K. C

    2003-11-01

    The goal of fuel rod performance analysis is to verify the robustness of a fuel rod and its cladding material. Computer simulation of fuel rod performance has become an important part of designing and evaluating new nuclear fuels and claddings. To construct a computing code system for fuel rod performance, several algorithms of existing fuel rod performance code systems are compared and summarized as a preliminary work. Among several code systems, FRAPCON and FEMAXI for LWRs, ELESTRES for CANDU reactors, and LIFE for fast reactors are reviewed. The thermal algorithms of the above codes are investigated, including their methodologies and subroutines. This work will be utilized to construct a computing code system for dry process fuel rod performance.

  11. Performance testing of thermal analysis codes for nuclear fuel casks

    International Nuclear Information System (INIS)

    Sanchez, L.C.

    1987-01-01

    In 1982 Sandia National Laboratories held the First Industry/Government Joint Thermal and Structural Codes Information Exchange and presented the initial stages of an investigation of thermal analysis computer codes for use in the design of nuclear fuel shipping casks. The objective of the investigation was to (1) document publicly available computer codes, (2) assess code capabilities as determined from their users' manuals, and (3) assess code performance on cask-like model problems. Computer codes are required to handle the thermal phenomena of conduction, convection and radiation. Several of the available thermal computer codes were tested on a set of model problems to assess performance on cask-like problems. Solutions obtained with the computer codes for steady-state thermal analysis were in good agreement, and the solutions for transient thermal analysis differed slightly among the computer codes due to modeling differences.

  12. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  13. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  14. Performance Evaluation of Spectral Amplitude Codes for OCDMA PON

    DEFF Research Database (Denmark)

    Binti Othman, Maisara; Jensen, Jesper Bevensee; Zhang, Xu

    2011-01-01

    the MAI effects in OCDMA. The performance has been characterized through received optical power (ROP) sensitivity and dispersion tolerance assessments. The numerical results show that the ZCC code has a slightly better performance compared to the other two codes for the ROP and similar behavior against...

  15. Error-correction coding and decoding bounds, codes, decoders, analysis and applications

    CERN Document Server

    Tomlinson, Martin; Ambroze, Marcel A; Ahmed, Mohammed; Jibril, Mubarak

    2017-01-01

    This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies, from smartphones to secure communications and transactions. Written in a readily understandable style, the book presents the authors’ twenty-five years of research organized into five parts: Part I is concerned with the theoretical performance attainable by using error correcting codes to achieve communications efficiency in digital communications systems. Part II explores the construction of error-correcting codes and explains the different families of codes and how they are designed. Techniques are described for producing the very best codes. Part III addresses the analysis of low-density parity-check (LDPC) codes, primarily to calculate their stopping sets and low-weight codeword spectrum which determines the performance of these codes. Part IV deals with decoders desi...

  16. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate complex codes, on several differently configured servers, could run and compute trivial small scale problems in a commercial cloud infrastructure. Phase 2 focused on proving non-trivial large scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  17. Preserving Envelope Efficiency in Performance Based Code Compliance

    Energy Technology Data Exchange (ETDEWEB)

    Thornton, Brian A. [Thornton Energy Consulting (United States); Sullivan, Greg P. [Efficiency Solutions (United States); Rosenberg, Michael I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Baechler, Michael C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-06-20

    The City of Seattle 2012 Energy Code (Seattle 2014), one of the most progressive in the country, is under revision for its 2015 edition. Additionally, city personnel participate in the development of the next generation of the Washington State Energy Code and the International Energy Code. Seattle has pledged carbon neutrality by 2050 including buildings, transportation and other sectors. The United States Department of Energy (DOE), through Pacific Northwest National Laboratory (PNNL), provided technical assistance to Seattle in order to understand the implications of one potential direction for its code development: limiting trade-offs in which long-lived building envelope components less stringent than the prescriptive code envelope requirements are offset by better-than-code but shorter-lived lighting and heating, ventilation, and air-conditioning (HVAC) components through the total building performance modeled energy compliance path. Weaker building envelopes can permanently limit building energy performance even as lighting and HVAC components are upgraded over time, because retrofitting the envelope is less likely and more expensive. Weaker building envelopes may also increase the required size, cost and complexity of HVAC systems and may adversely affect occupant comfort. This report presents the results of this technical assistance. The use of modeled energy code compliance to trade off envelope components against shorter-lived building components is not unique to Seattle, and the lessons and possible solutions described in this report have implications for other jurisdictions and energy codes.

  18. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    Energy Technology Data Exchange (ETDEWEB)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K. [Cray Inc., St. Paul, MN 55101 (United States); Porter, D. [Minnesota Supercomputing Institute for Advanced Computational Research, Minneapolis, MN USA (United States); O’Neill, B. J.; Nolting, C.; Donnert, J. M. F.; Jones, T. W. [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States); Edmon, P., E-mail: pjm@cray.com, E-mail: nradclif@cray.com, E-mail: kkandalla@cray.com, E-mail: oneill@astro.umn.edu, E-mail: nolt0040@umn.edu, E-mail: donnert@ira.inaf.it, E-mail: twj@umn.edu, E-mail: dhp@umn.edu, E-mail: pedmon@cfa.harvard.edu [Institute for Theory and Computation, Center for Astrophysics, Harvard University, Cambridge, MA 02138 (United States)

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  19. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    International Nuclear Information System (INIS)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.; Porter, D.; O’Neill, B. J.; Nolting, C.; Donnert, J. M. F.; Jones, T. W.; Edmon, P.

    2017-01-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  20. 1 CFR 21.14 - Deviations from standard organization of the Code of Federal Regulations.

    Science.gov (United States)

    2010-01-01

    Title 1, General Provisions (2010 edition), Codification, General Numbering, § 21.14 Deviations from standard organization of the Code of Federal Regulations. (a) Any deviation from standard Code of Federal Regulations designations must be approved in advance...

  1. Organization of Risk Analysis Codes for Living Evaluations (ORACLE)

    International Nuclear Information System (INIS)

    Batt, D.L.; MacDonald, P.E.; Sattison, M.B.; Vesely, E.

    1987-01-01

    ORACLE (Organization of Risk Analysis Codes for Living Evaluations) is an integration concept for using risk-based information in United States Nuclear Regulatory Commission (USNRC) applications. Portions of ORACLE are being developed at the Idaho National Engineering Laboratory for the USNRC. The ORACLE concept consists of related databases, software, user interfaces, processes, and quality control checks allowing a wide variety of regulatory problems and activities to be addressed using current, updated PRA information. The ORACLE concept provides for smooth transitions between one code and the next without pre- or post-processing. (orig.)

  2. On the Performance of the Cache Coding Protocol

    DEFF Research Database (Denmark)

    Maboudi, Behnaz; Sehat, Hadi; Pahlevani, Peyman

    2018-01-01

    Network coding approaches typically consider an unrestricted recoding of coded packets in the relay nodes to increase performance. However, this can expose the system to pollution attacks that cannot be detected during transmission, until the receivers attempt to recover the data. To prevent thes...

  3. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist Hospital, since May 1992. The ACR dictionary files consist of 11 files, one for the organ codes and the others for the pathology codes. The organ code is obtained by typing the organ name or the code number itself among the upper and lower level codes of the selected one, which are simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file is chosen automatically. In a similar fashion to the organ code selection, the proper pathology code is obtained. An example of an obtained ACR code is '131.3661'. This procedure is reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code is achieved by the same program, and incorporation of this program into other data processing programs is possible. This program has the merits of simple operation, accurate and detailed coding, and easy adjustment for other programs. Therefore, this program can be used for automation of routine work in the department of radiology
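
    A hypothetical Python sketch of the lookup flow described above follows: an organ code is validated, its first digit selects the pathology dictionary, and the two parts are joined with a period as in the example '131.3661'. The tiny dictionaries and function names are invented placeholders, not the actual ACR dictionary files or the FoxBASE program.

      # Hypothetical placeholder dictionaries, not the real ACR dictionary files.
      ORGAN_CODES = {"1": {"131": "example organ entry"}, "5": {"51": "example organ entry"}}
      PATHOLOGY_CODES_BY_GROUP = {"1": {"3661": "example pathology entry"},
                                  "5": {"210": "example pathology entry"}}

      def build_acr_code(organ_code: str, pathology_code: str) -> str:
          """Validate both parts against the dictionaries and return 'organ.pathology'."""
          group = organ_code[0]                 # first digit selects the pathology dictionary
          if organ_code not in ORGAN_CODES.get(group, {}):
              raise ValueError(f"unknown organ code {organ_code!r}")
          if pathology_code not in PATHOLOGY_CODES_BY_GROUP.get(group, {}):
              raise ValueError(f"unknown pathology code {pathology_code!r} for group {group}")
          return f"{organ_code}.{pathology_code}"

      print(build_acr_code("131", "3661"))      # -> '131.3661'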

  4. The fuel performance code future

    International Nuclear Information System (INIS)

    Ronchi, C.; Van de Laar, J.

    1988-01-01

    The paper describes the LWR version of the fuel performance code FUTURE, which was recently developed to calculate the fuel response (swelling, cladding deformation, release) to reactor transient conditions, starting from a broad-based description of the processes of major concern. The main physical models assumed are presented together with the scheme of the computer program

  5. Development of LWR fuel performance code FEMAXI-6

    International Nuclear Information System (INIS)

    Suzuki, Motoe

    2006-01-01

    The LWR fuel performance code FEMAXI-6 (Finite Element Method in AXIs-symmetric system) is a representative fuel analysis code in Japan. Its development history, background, design ideas, model features, and future are described, along with the characteristic performance of LWR fuel and analysis codes, what a model is, the development history of FEMAXI, the use of the FEMAXI code, the fuel models, and the special features of the FEMAXI models. As examples of analysis, PCMI (Pellet-Clad Mechanical Interaction), fission gas release, gap bonding, and fission gas bubble swelling are reported. The thermal analysis and dynamic analysis system of FEMAXI-6, the function block at one time step of FEMAXI-6, an analytical example of PCMI in an output increase test by FEMAXI-III, an analysis of fission gas release in the Halden reactor by FEMAXI-V, a comparison of fuel centre temperatures in the Halden reactor, and an analysis of the change in fuel rod diameter in high burnup BWR fuel are shown. (S.Y.)

  6. Performance Analysis of Optical Code Division Multiplex System

    Science.gov (United States)

    Kaur, Sandeep; Bhatia, Kamaljit Singh

    2013-12-01

    This paper presents a pseudo-orthogonal code generator for an optical code division multiple access (OCDMA) system, which helps reduce the need for bandwidth expansion and improves spectral efficiency. In this paper we investigate the performance of a multi-user OCDMA system to achieve data rates of more than 1 Tbit/s.

  7. Code structure for U-Mo fuel performance analysis in high performance research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Gwan Yoon; Cho, Tae Won; Lee, Chul Min; Sohn, Dong Seong [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of); Lee, Kyu Hong; Park, Jong Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    A performance analysis model applicable to research reactor fuel is being developed with available models describing the fuel performance phenomena observed in in-pile tests. We established the calculation algorithm and scheme to best predict fuel performance, using a radio-thermo-mechanically coupled system to consider fuel swelling, interaction layer growth, pore formation in the fuel meat, creep fuel deformation and mass relocation, etc. In this paper, we present the general structure of the performance analysis code for typical research reactor fuel and advanced features such as a model to predict fuel failure induced by a combination of breakaway swelling and pore growth in the fuel meat. A thermo-mechanical code dedicated to the modeling of U-Mo dispersion fuel plates is under development in Korea to satisfy the demand for advanced performance analysis and safety assessment of the plates. The major physical phenomena during irradiation are considered in the code, such as interaction layer formation by fuel-matrix interdiffusion, fission-induced swelling of the fuel particles, mass relocation by fission-induced stress, and pore formation at the interface between the reaction product and the Al matrix.

  8. Structure of fuel performance audit code for SFR metal fuel

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yong Sik; Kim, Hyo Chan [KAERI, Daejeon (Korea, Republic of); Jeong, Hye Dong; Shin, An Dong; Suh, Nam Duk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2012-10-15

    A Sodium-cooled Fast Reactor (SFR) is a promising option for solving the spent fuel problem, but there are still many technical issues to resolve before an SFR can be commercialized. One of these issues is the development of an advanced fuel that addresses safety and economics at the same time. Since the nuclear fuel is the first barrier against the release of radioactive isotopes, the fuel's integrity must be secured. At the Korea Institute of Nuclear Safety (KINS), a new project has been started to develop regulatory technology for the SFR system, including the fuel area. To evaluate fuel integrity and safety during irradiation, a fuel performance code must be used for audit calculations. To develop the new code system, the code structure design and its requirements need to be studied. Various performance models and code systems are reviewed and their characteristics are analyzed in this paper. Based on this study, the fundamental performance models are deduced and the basic code requirements and structure are established.

  9. Performance Analysis of New Binary User Codes for DS-CDMA Communication

    Science.gov (United States)

    Usha, Kamle; Jaya Sankar, Kottareddygari

    2016-03-01

    This paper analyzes new binary spreading codes through their correlation properties and also presents their performance over an additive white Gaussian noise (AWGN) channel. The proposed codes are constructed using Gray and inverse Gray codes: an n-bit Gray codeword is appended with its n-bit inverse Gray counterpart to construct binary user codes of length 2n. Like Walsh codes, these binary user codes are available in sizes that are powers of two; additionally, code sets of length 6 and its even multiples are available. The simple construction technique and the generation of code sets of different sizes are the salient features of the proposed codes. Walsh codes and Gold codes are considered for comparison, as these are popularly used for synchronous and asynchronous multi-user communications, respectively. In the current work the auto- and cross-correlation properties of the proposed codes are compared with those of Walsh codes and Gold codes. The performance of the proposed binary user codes for both synchronous and asynchronous direct-sequence CDMA communication over an AWGN channel is also discussed in this paper. The proposed binary user codes are found to be suitable for both synchronous and asynchronous DS-CDMA communication.
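
    As a rough illustration of the construction described above (not the authors' exact algorithm; the helper names and bit ordering are assumptions), the following Python sketch appends an n-bit Gray codeword to its n-bit inverse-Gray counterpart to form a 2n-chip binary user code.

```python
# Hedged sketch: build 2n-chip binary user codes by appending an n-bit Gray
# codeword to its inverse-Gray counterpart, as outlined in the abstract.
# Function names and the bit ordering are illustrative assumptions.

def gray_encode(k: int) -> int:
    """Standard binary-reflected Gray code of integer k."""
    return k ^ (k >> 1)

def inverse_gray_encode(k: int, n: int) -> int:
    """Inverse Gray transform: cumulative XOR of the bits of the n-bit word k."""
    out, acc = 0, 0
    for i in reversed(range(n)):          # most significant bit first
        acc ^= (k >> i) & 1               # running XOR of the prefix bits
        out = (out << 1) | acc
    return out

def bits(x: int, n: int) -> list[int]:
    return [(x >> i) & 1 for i in reversed(range(n))]

def user_codes(n: int) -> list[list[int]]:
    """One 2n-chip code per n-bit word: Gray bits followed by inverse-Gray bits."""
    return [bits(gray_encode(k), n) + bits(inverse_gray_encode(k, n), n)
            for k in range(2 ** n)]

if __name__ == "__main__":
    for c in user_codes(3):               # 8 codes of length 6
        print(c)
```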

  10. A fuel performance code TRUST VIc and its validation

    Energy Technology Data Exchange (ETDEWEB)

    Ishida, M; Kogai, T [Nippon Nuclear Fuel Development Co. Ltd., Oarai, Ibaraki (Japan)

    1997-08-01

    This paper describes the fuel performance code TRUST V1c, developed to analyze the thermal and mechanical behavior of LWR fuel rods. Submodels in the code include fission product (FP) gas models describing gaseous swelling, gas release from the pellet, and axial gas mixing. The code has an FEM-based structure to handle the interaction between the thermal and mechanical submodels introduced by the gas models. The code is validated against irradiation data on fuel centerline temperature, FGR, pellet porosity and cladding deformation. (author). 9 refs, 8 figs.

  11. A fuel performance code TRUST VIc and its validation

    International Nuclear Information System (INIS)

    Ishida, M.; Kogai, T.

    1997-01-01

    This paper describes the fuel performance code TRUST V1c, developed to analyze the thermal and mechanical behavior of LWR fuel rods. Submodels in the code include fission product (FP) gas models describing gaseous swelling, gas release from the pellet, and axial gas mixing. The code has an FEM-based structure to handle the interaction between the thermal and mechanical submodels introduced by the gas models. The code is validated against irradiation data on fuel centerline temperature, FGR, pellet porosity and cladding deformation. (author). 9 refs, 8 figs

  12. The NMC code: conduct, performance and ethics.

    Science.gov (United States)

    Goldsmith, Jan

    The Code: Standards of Conduct, Performance and Ethics for Nurses and Midwives is a set of key principles that should underpin the practice of all nurses and midwives and remind them of their professional responsibilities. It is not just a tool used in fitness-to-practise cases; it should guide the daily practice of all nurses and midwives. Alongside other standards, guidance and advice from the NMC, the code should be used to support professional development.

  13. On the performance of diagonal lattice space-time codes

    KAUST Repository

    Abediseid, Walid; Alouini, Mohamed-Slim

    2013-01-01

    There has been tremendous work on designing space-time codes for the quasi-static multiple-input multiple-output (MIMO) channel. Code designs to date focus on high performance, high rates, or low-complexity encoding.

  14. The METEOR/TRANSURANUS fuel performance code

    International Nuclear Information System (INIS)

    Struzik, C.; Guerin, Y.

    1996-01-01

    The first calculations for the FUMEX exercise were performed using version 1.1 of the METEOR/TRANSURANUS code. Since then, important improvements have been implemented in several models. In its present state, the code describes fuel rod behaviour under standard PWR conditions. Its validity extends to UO2 and MOX fuels clad in Zircaloy-4. Power transient calculations for UO2 and Gd-doped fuels are possible, but further developments are in progress, and these applications will be fully qualified in version 2.0. A considerable effort is being made to replace semi-empirical models with models that have a sounder physical basis. (authors). 14 refs

  15. Iterative optimization of performance libraries by hierarchical division of codes

    International Nuclear Information System (INIS)

    Donadio, S.

    2007-09-01

    The increasing complexity of hardware features incorporated in modern processors makes high-performance code generation very challenging. Library generators such as ATLAS, FFTW and SPIRAL overcome this issue by empirically searching the space of possible program versions for the one that performs best. This thesis explores a fully automatic solution to adapt a compute-intensive application to the target architecture. By mimicking complex sequences of transformations useful for optimizing real codes, we show that generative programming is a practical tool to implement a new hierarchical compilation approach for the generation of high-performance code relying on state-of-the-art compilers. As opposed to ATLAS, this approach is not application-dependent but can be applied to fairly generic loop structures. Our approach relies on the decomposition of the original loop nest into simpler kernels. These kernels are much simpler to optimize, and working with such codes makes the performance trade-off problem much simpler to express and to solve. Finally, we propose a new approach for the generation of performance libraries based on this decomposition method. We show that our method generates high-performance libraries, in particular for BLAS. (author)

  16. Beyond the Business Model: Incentives for Organizations to Publish Software Source Code

    Science.gov (United States)

    Lindman, Juho; Juutilainen, Juha-Pekka; Rossi, Matti

    The software stack opened under Open Source Software (OSS) licenses is growing rapidly. Commercial actors have released considerable amounts of previously proprietary source code. These actions raise the question of why companies choose a strategy based on giving away software assets. Research on the outbound OSS approach has tried to answer this question with the concept of the “OSS business model”. When studying the reasons for code release, we have observed that the business model concept is too generic to capture the many incentives organizations have. Instead, in this paper we investigate empirically what the companies' incentives are by means of an exploratory case study of three organizations at different stages of their code release. Our results indicate that the companies aim to promote standardization, obtain development resources, gain cost savings, improve the quality of software, increase the trustworthiness of software, or steer OSS communities. We conclude that future research on outbound OSS could benefit from focusing on the heterogeneous incentives for code release rather than on revenue models.

  17. Performance enhancement of optical code-division multiple-access systems using transposed modified Walsh code

    Science.gov (United States)

    Sikder, Somali; Ghosh, Shila

    2018-02-01

    This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.

  18. Optimizing fusion PIC code performance at scale on Cori Phase 2

    Energy Technology Data Exchange (ETDEWEB)

    Koskela, T. S.; Deslippe, J.

    2017-07-23

    In this paper we present the results of optimizing the performance of the gyrokinetic full-f fusion PIC code XGC1 on the Cori Phase Two Knights Landing system. The code has undergone substantial development to enable the use of vector instructions in its most expensive kernels within the NERSC Exascale Science Applications Program. We study the single-node performance of the code on an absolute scale using the roofline methodology to guide optimization efforts. We have obtained 2x speedups in single-node performance by enabling vectorization and performing memory-layout optimizations. On multiple nodes, the code is shown to scale well up to 4000 nodes, nearly half the size of the machine. We discuss some communication bottlenecks that were identified and resolved during the work.

  19. Performance Analysis for Cooperative Communication System with QC-LDPC Codes Constructed with Integer Sequences

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2015-01-01

    Full Text Available This paper presents four different integer sequences for constructing quasi-cyclic low-density parity-check (QC-LDPC) codes on a mathematical basis. The paper introduces the coding principle and the construction procedure. The QC-LDPC codes constructed from the four integer sequences are compared with LDPC codes obtained using the PEG algorithm, array codes, and MacKay codes, respectively. The integer-sequence QC-LDPC codes are then used in coded cooperative communication. Simulation results show that the integer-sequence QC-LDPC codes are effective, and their overall performance is better than that of the other types of LDPC codes in coded cooperative communication. The QC-LDPC code constructed from the Dayan integer sequence gives the best performance.
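
    For orientation, the sketch below shows how a QC-LDPC parity-check matrix is assembled from circulant permutation matrices whose shifts are taken from an integer sequence; the particular sequence and block layout are illustrative assumptions, not the construction analysed in the paper.

```python
import numpy as np

def circulant_permutation(size: int, shift: int) -> np.ndarray:
    """size x size identity matrix with its columns cyclically shifted by 'shift'."""
    return np.roll(np.eye(size, dtype=int), shift % size, axis=1)

def qc_ldpc_parity_check(shifts: list[list[int]], size: int) -> np.ndarray:
    """Assemble H from a grid of shift values, one circulant block per entry."""
    rows = [np.hstack([circulant_permutation(size, s) for s in row])
            for row in shifts]
    return np.vstack(rows)

# Example: shift values taken from a simple integer sequence (illustrative only).
seq = [1, 2, 3, 5, 8, 13]                      # placeholder Fibonacci-like sequence
shift_grid = [seq[0:3], seq[3:6]]              # 2 x 3 grid of circulant blocks
H = qc_ldpc_parity_check(shift_grid, size=7)   # (2*7) x (3*7) parity-check matrix
print(H.shape, "column weight:", H.sum(axis=0)[0])
```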

  20. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corp., Tokyo (Japan)

    2012-07-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of that description, gives detailed explanations of the operation method of the FEMAXI-7 code and its related codes, the methods of input/output, the methods of source code modification, the features of the subroutine modules, and the internal variables, in order to help users perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  1. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2012-07-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of that description, gives detailed explanations of the operation method of the FEMAXI-7 code and its related codes, the methods of input/output, the methods of source code modification, the features of the subroutine modules, and the internal variables, in order to help users perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  2. The JAERI code system for evaluation of BWR ECCS performance

    International Nuclear Information System (INIS)

    Kohsaka, Atsuo; Akimoto, Masayuki; Asahi, Yoshiro; Abe, Kiyoharu; Muramatsu, Ken; Araya, Fumimasa; Sato, Kazuo

    1982-12-01

    Development of separate computer code systems for BWR and PWR ECCS evaluation has been conducted since 1973, taking into account the differences in the reactor cooling system, core structure and ECCS. The first version of the BWR code system, whose development started earlier than that of the PWR, has been completed. The BWR code system is designed to provide computational tools to analyze all phases of LOCAs and to evaluate the performance of the ECCS, including an 'Evaluation Model (EM)' feature in compliance with the requirements of the current Japanese Evaluation Guideline for ECCS. The BWR code system could be used for licensing purposes, i.e. for ECCS performance evaluation or audit calculations to cross-examine the methods and results of applicants or vendors. The BWR code system presented in this report comprises several computer codes, each of which analyzes a particular phase of a LOCA or a system blowdown depending on the range of LOCAs, i.e. large and small breaks in a variety of locations in the reactor system. The system includes ALARM-B1, HYDY-B1 and THYDE-B1 for analysis of the system blowdown for various break sizes, THYDE-B-REFLOOD for analysis of the reflood phase, and SCORCH-B2 for the calculation of the fuel assembly hot-plane temperature. When multiple codes are used to analyze a broad range of LOCAs as stated above, it is very important to evaluate the adequacy and consistency of the codes used to cover the entire break spectrum. The system consistency together with the system performance is discussed for a large commercial BWR. (author)

  3. Performance analysis of LDPC codes on OOK terahertz wireless channels

    International Nuclear Information System (INIS)

    Liu Chun; Wang Chang; Cao Jun-Cheng

    2016-01-01

    Atmospheric absorption, scattering, and scintillation are the major causes of degradation in the transmission quality of terahertz (THz) wireless communications. An error control coding scheme based on low-density parity-check (LDPC) codes with a soft-decision decoding algorithm is proposed to improve the bit-error-rate (BER) performance of an on-off keying (OOK) modulated THz signal transmitted through the atmospheric channel. The THz wave propagation characteristics and the atmospheric channel model are set up. Numerical simulations validate the strong performance of LDPC codes against atmospheric fading and demonstrate their huge potential for future ultra-high-speed (beyond Gbps) THz communications. (paper)

  4. Performance enhancement of successive interference cancellation scheme based on spectral amplitude coding for optical code-division multiple-access systems using Hadamard codes

    Science.gov (United States)

    Eltaif, Tawfig; Shalaby, Hossam M. H.; Shaari, Sahbudin; Hamarsheh, Mohammad M. N.

    2009-04-01

    A successive interference cancellation scheme is applied to optical code-division multiple-access (OCDMA) systems with spectral amplitude coding (SAC). A detailed analysis of this system, with Hadamard codes used as signature sequences, is presented. The system can easily remove the effect of the strongest signal at each stage of the cancellation process. In addition, a simulation of the proposed system is performed in order to validate the theoretical results. The system shows a small bit error rate at a large number of active users compared to the SAC OCDMA system. Our results reveal that the proposed system is efficient in eliminating the effect of multiple-user interference and in enhancing the overall performance.
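
    For context, Hadamard signature sequences of the kind used above can be generated by the Sylvester recursion. The sketch below (an illustration, not the authors' simulation) builds the unipolar code set and shows the fixed in-phase cross-correlation that SAC balanced detection exploits.

```python
import numpy as np

def hadamard(order: int) -> np.ndarray:
    """Sylvester construction: 'order' must be a power of two."""
    H = np.array([[1]])
    while H.shape[0] < order:
        H = np.block([[H, H], [H, -H]])
    return H

# Unipolar (0/1) signature sequences for SAC-OCDMA, dropping the all-ones row.
N = 8
codes = (hadamard(N) + 1) // 2
signatures = codes[1:]                       # rows 1..N-1 are usable signatures

# In-phase cross-correlation between any two distinct signatures equals N/4,
# which spectral-amplitude-coding receivers cancel by balanced detection.
print(signatures @ signatures.T)
```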

  5. Survey of computer codes applicable to waste facility performance evaluations

    International Nuclear Information System (INIS)

    Alsharif, M.; Pung, D.L.; Rivera, A.L.; Dole, L.R.

    1988-01-01

    This study is an effort to review existing information that is useful for developing an integrated model for predicting the performance of a radioactive waste facility. A summary description of 162 computer codes is given. The identified computer programs address the performance of waste packages, waste transport and equilibrium geochemistry, hydrological processes in unsaturated and saturated zones, and general waste facility performance assessment. Some programs also deal with thermal analysis, structural analysis, and special purposes. A number of these computer programs are being used by the US Department of Energy, the US Nuclear Regulatory Commission, and their contractors to analyze various aspects of waste package performance. Fifty-five of these codes were identified as being potentially useful for the analysis of low-level radioactive waste facilities located above the water table. The code summaries include authors, identification data, model types, and pertinent references. 14 refs., 5 tabs

  6. High performance computer code for molecular dynamics simulations

    International Nuclear Information System (INIS)

    Levay, I.; Toekesi, K.

    2007-01-01

    Complete text of publication follows. Molecular Dynamics (MD) simulation is a widely used technique for modeling complicated physical phenomena. Since 2005 we have been developing an MD simulation code for PC computers. The computer code is written in the C++ object-oriented programming language. The aim of our work is twofold: a) to develop a fast computer code for the study of the random walk of guest atoms in a Be crystal, and b) three-dimensional (3D) visualization of the particles' motion. In this case we mimic the motion of the guest atoms in the crystal (diffusion-type motion) and the motion of atoms in the crystal lattice (crystal deformation). Nowadays, it is common to use graphics devices for intensive computational problems. There are several ways to harness this extreme processing performance, but it has never been as easy to program these devices as it is now. CUDA (Compute Unified Device Architecture), introduced by the nVidia Corporation in 2007, is very useful for every processor-hungry application. A unified-architecture GPU includes 96-128 or more stream processors, so the raw calculation performance is 576(!) GFLOPS. It is ten times faster than the fastest dual-core CPU [Fig.1]. Our improved MD simulation software uses this new technology, which speeds it up so that the code runs 10 times faster in the critical calculation segment. Although the GPU is a very powerful tool, it has a strongly parallel structure. This means that we have to create an algorithm that works on several processors without deadlock. Our code currently uses 256 threads and shared and constant on-chip memory instead of global memory, which is about 100 times slower. It is possible to implement the whole algorithm on the GPU, so we do not need to download and upload the data in every iteration. For maximal throughput, every thread runs the same instructions.
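
    As a point of reference only (a serial NumPy sketch, not the C++/CUDA implementation described above), a minimal Lennard-Jones velocity-Verlet step looks as follows; the potential parameters and particle layout are placeholders.

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces, O(N^2) reference loop (the GPU version parallelizes this)."""
    n = len(pos)
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = np.dot(r, r)
            s6 = (sigma ** 2 / d2) ** 3
            fij = 24 * eps * (2 * s6 ** 2 - s6) / d2 * r
            f[i] += fij
            f[j] -= fij
    return f

def velocity_verlet(pos, vel, dt, steps, mass=1.0):
    """Reference MD integrator for a handful of particles (no periodic boundaries)."""
    f = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * f / mass
        pos += dt * vel
        f = lj_forces(pos)
        vel += 0.5 * dt * f / mass
    return pos, vel

# Eight particles on a small cubic lattice (spacing near the LJ minimum), at rest.
pos = np.array([[i, j, k] for i in range(2) for j in range(2) for k in range(2)],
               dtype=float) * 1.2
vel = np.zeros_like(pos)
pos, vel = velocity_verlet(pos, vel, dt=1e-3, steps=100)
print(pos[0], vel[0])
```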

  7. FEMAXI-III, a computer code for fuel rod performance analysis

    International Nuclear Information System (INIS)

    Ito, K.; Iwano, Y.; Ichikawa, M.; Okubo, T.

    1983-01-01

    This paper presents the method of fuel rod thermal-mechanical performance analysis used in the FEMAXI-III code. The code incorporates models describing thermal-mechanical processes such as pellet-cladding thermal expansion, pellet irradiation swelling, densification, relocation and fission gas release as they affect pellet-cladding gap thermal conductance. The code performs the thermal behavior analysis of a full-length fuel rod within the framework of one-dimensional multi-zone modeling. The mechanical effects, including ridge deformation, are rigorously analyzed by applying the axisymmetric finite element method. The finite element geometrical model is confined to a half-pellet-height region under the assumption that pellet-pellet interaction is symmetrical. Eight-node quadratic isoparametric ring elements are adopted for obtaining accurate finite element solutions. Newton-Raphson iteration with an implicit algorithm is applied to analyze non-linear material behaviors accurately and stably. The pellet-cladding interaction mechanism is treated exactly using nodal continuity conditions. The code is applicable to the thermal-mechanical analysis of water reactor fuel rods experiencing variable power histories. (orig.)

  8. SCANAIR: A transient fuel performance code

    International Nuclear Information System (INIS)

    Moal, Alain; Georgenthum, Vincent; Marchand, Olivier

    2014-01-01

    Highlights: • Since the early 1990s, the SCANAIR code has been developed at IRSN. • The software focuses on studying fast transients such as RIA in light water reactors. • The fuel rod modelling is based on a 1.5D approach. • Thermal and thermal-hydraulics, mechanical and gas behaviour resolutions are coupled. • The code is used for safety assessment and integral test analysis. - Abstract: Since the early 1990s, the French “Institut de Radioprotection et de Sûreté Nucléaire” (IRSN) has developed the SCANAIR computer code with a view to analysing pressurised water reactor (PWR) safety. This software specifically focuses on studying fast transients such as reactivity-initiated accidents (RIA) caused by the possible ejection of control rods. The code aims at improving the global understanding of the physical mechanisms governing the thermal-mechanical behaviour of a single rod. It is currently used to analyse integral tests performed in the CABRI and NSRR experimental reactors. The resulting validated code is used to carry out the studies required to evaluate margins with respect to criteria for the different types of fuel rods used in nuclear power plants. Because the phenomena occurring during fast power transients are complex, the simulation in SCANAIR is based on a close coupling between several modules aimed at modelling thermal, thermal-hydraulic, mechanical and gas behaviour. During the first stage of fast power transients, clad deformation is mainly governed by the pellet–clad mechanical interaction (PCMI). At a later stage, heat transfer from pellet to clad brings the cladding material to such high temperatures that a boiling crisis might occur. The significant over-pressurisation of the rod, combined with keeping the cladding material at elevated temperatures for a fairly long period, can lead to ballooning and possible clad failure. A brief introduction describes the context and the historical background and recalls the main phenomena involved under

  9. SCANAIR: A transient fuel performance code

    Energy Technology Data Exchange (ETDEWEB)

    Moal, Alain, E-mail: alain.moal@irsn.fr; Georgenthum, Vincent; Marchand, Olivier

    2014-12-15

    Highlights: • Since the early 1990s, the SCANAIR code has been developed at IRSN. • The software focuses on studying fast transients such as RIA in light water reactors. • The fuel rod modelling is based on a 1.5D approach. • Thermal and thermal-hydraulics, mechanical and gas behaviour resolutions are coupled. • The code is used for safety assessment and integral tests analysis. - Abstract: Since the early 1990s, the French “Institut de Radioprotection et de Sûreté Nucléaire” (IRSN) has developed the SCANAIR computer code with a view to analysing pressurised water reactor (PWR) safety. This software specifically focuses on studying fast transients such as reactivity-initiated accidents (RIA) caused by the possible ejection of control rods. The code aims at improving the global understanding of the physical mechanisms governing the thermal-mechanical behaviour of a single rod. It is currently used to analyse integral tests performed in the CABRI and NSRR experimental reactors. The resulting validated code is used to carry out the studies required to evaluate margins with respect to criteria for the different types of fuel rods used in nuclear power plants. Because the phenomena occurring during fast power transients are complex, the simulation in SCANAIR is based on a close coupling between several modules aimed at modelling thermal, thermal-hydraulic, mechanical and gas behaviour. During the first stage of fast power transients, clad deformation is mainly governed by the pellet–clad mechanical interaction (PCMI). At a later stage, heat transfer from pellet to clad brings the cladding material to such high temperatures that a boiling crisis might occur. The significant over-pressurisation of the rod, combined with keeping the cladding material at elevated temperatures for a fairly long period, can lead to ballooning and possible clad failure. A brief introduction describes the context and the historical background and recalls the main phenomena involved under

  10. High performance mixed optical CDMA system using ZCC code and multiband OFDM

    Directory of Open Access Journals (Sweden)

    Nawawi N. M.

    2017-01-01

    Full Text Available In this paper, we have proposed a high-performance network design based on a mixed optical Code Division Multiple Access (CDMA) system using a Zero Cross Correlation (ZCC) code and multiband Orthogonal Frequency Division Multiplexing (OFDM), called catenated OFDM. In addition, we investigate the related parameters such as effective power, number of users, number of bands, code length and code weight. We then theoretically analyze the system performance comprehensively while considering up to five OFDM bands. The feasibility of the proposed system architecture is verified via numerical analysis. The research results demonstrate that our modulation solution can significantly increase the total number of users, with an improvement of up to 80% for five catenated bands compared to a traditional optical CDMA system, with a code length of 80 and transmission at 622 Mbps. It is also demonstrated that the BER performance strongly depends on the code weight, especially with a small number of users. As the weight increases, the BER performance improves.

  11. High performance mixed optical CDMA system using ZCC code and multiband OFDM

    Science.gov (United States)

    Nawawi, N. M.; Anuar, M. S.; Junita, M. N.; Rashidi, C. B. M.

    2017-11-01

    In this paper, we have proposed a high-performance network design based on a mixed optical Code Division Multiple Access (CDMA) system using a Zero Cross Correlation (ZCC) code and multiband Orthogonal Frequency Division Multiplexing (OFDM), called catenated OFDM. In addition, we investigate the related parameters such as effective power, number of users, number of bands, code length and code weight. We then theoretically analyze the system performance comprehensively while considering up to five OFDM bands. The feasibility of the proposed system architecture is verified via numerical analysis. The research results demonstrate that our modulation solution can significantly increase the total number of users, with an improvement of up to 80% for five catenated bands compared to a traditional optical CDMA system, with a code length of 80 and transmission at 622 Mbps. It is also demonstrated that the BER performance strongly depends on the code weight, especially with a small number of users. As the weight increases, the BER performance improves.

  12. Performance Analysis of Faulty Gallager-B Decoding of QC-LDPC Codes with Applications

    Directory of Open Access Journals (Sweden)

    O. Al Rasheed

    2014-06-01

    Full Text Available In this paper we evaluate the performance of the Gallager-B algorithm, used for decoding low-density parity-check (LDPC) codes, under unreliable message computation. Our analysis is restricted to LDPC codes constructed from circulant matrices (QC-LDPC codes). Using Monte Carlo simulation we investigate the effects of different code parameters on coding system performance, under a binary symmetric communication channel and an independent transient-fault model. One possible application of the presented analysis to designing memory architectures with unreliable components is considered.
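
    For background, a simplified hard-decision bit-flipping decoder in the spirit of Gallager-B (the extrinsic message bookkeeping and the faulty-gate model studied in the paper are omitted) can be sketched as follows.

```python
import numpy as np

def bit_flip_decode(H: np.ndarray, r: np.ndarray, max_iter: int = 20) -> np.ndarray:
    """Hard-decision decoding in the spirit of Gallager-B (simplified sketch).
    H: parity-check matrix, r: received hard-decision bits."""
    x = r.copy()
    for _ in range(max_iter):
        syndrome = H @ x % 2                       # unsatisfied parity checks
        if not syndrome.any():
            break                                  # a valid codeword was found
        # count, for every bit, how many failing checks it participates in
        votes = H.T @ syndrome
        # flip the bits involved in the largest number of failing checks
        x = np.where(votes == votes.max(), x ^ 1, x)
    return x

# Toy example with the (7,4) Hamming code parity-check matrix.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
sent = np.zeros(7, dtype=int)                      # all-zero codeword
received = sent.copy(); received[2] ^= 1           # one transient bit error
print(bit_flip_decode(H, received))                # recovers the all-zero word
```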

  13. Development and validation of a fuel performance analysis code

    International Nuclear Information System (INIS)

    Majalee, Aaditya V.; Chaturvedi, S.

    2015-01-01

    CAD has been developing a computer code 'FRAVIZ' for the calculation of the steady-state thermomechanical behaviour of nuclear reactor fuel rods. It contains four major modules, viz. a thermal module, a fission gas release module, a material properties module and a mechanical module. All four modules are coupled to each other, and feedback from each module is passed to the others to obtain a self-consistent evolution in time. The computer code has been checked against two FUMEX benchmarks. Modelling fuel performance in the Advanced Heavy Water Reactor would require additional inputs related to the fuel and some modifications to the code. (author)

  14. Performance of Product Codes and Related Structures with Iterated Decoding

    DEFF Research Database (Denmark)

    Justesen, Jørn

    2011-01-01

    Several modifications of product codes have been suggested as standards for optical networks. We show that the performance exhibits a threshold that can be estimated from a result about random graphs. For moderate input bit error probabilities, the output error rates for codes of finite length can...

  15. PAPIRUS - a computer code for FBR fuel performance analysis

    International Nuclear Information System (INIS)

    Kobayashi, Y.; Tsuboi, Y.; Sogame, M.

    1991-01-01

    The FBR fuel performance analysis code PAPIRUS has been developed to design fuels for demonstration and future commercial reactors. A pellet structural model was developed to describe the generation, depletion and transport of vacancies and atomic elements in a unified fashion. Comparison of PAPIRUS results with the power-to-melt test data from HEDL showed the validity of the code at initial reactor startup. (author)

  16. Verification of the CONPAS (CONtainment Performance Analysis System) code package

    International Nuclear Information System (INIS)

    Kim, See Darl; Ahn, Kwang Il; Song, Yong Man; Choi, Young; Park, Soo Yong; Kim, Dong Ha; Jin, Young Ho.

    1997-09-01

    CONPAS is a computer code package that integrates the numerical, graphical, and results-oriented aspects of Level 2 probabilistic safety assessment (PSA) for nuclear power plants automatically under a PC Windows environment. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor, (2) Computer, (3) Text Editor, and (4) Mechanistic Code Plotter. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty analysis, importance analysis, sensitivity analysis and data interpretation, and reporting aspects including tabulation and graphics, as well as a user-friendly interface. The computational performance of CONPAS has been verified through a Level 2 PSA for a reference plant. The results of the CONPAS code were compared with an existing Level 2 PSA code (NUCAP+), and the comparison proves that CONPAS is appropriate for Level 2 PSA. (author). 9 refs., 8 tabs., 14 figs

  17. Application of advanced validation concepts to oxide fuel performance codes: LIFE-4 fast-reactor and FRAPCON thermal-reactor fuel performance codes

    Energy Technology Data Exchange (ETDEWEB)

    Unal, C., E-mail: cu@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Williams, B.J. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Yacout, A. [Argonne National Laboratory, 9700 S. Cass Avenue, Lemont, IL 60439 (United States); Higdon, D.M. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States)

    2013-10-15

    Highlights: ► The application of advanced validation techniques (sensitivity, calibration and prediction) to nuclear performance codes FRAPCON and LIFE-4 is the focus of the paper. ► A sensitivity ranking methodology narrows down the number of selected modeling parameters from 61 to 24 for FRAPCON and from 69 to 35 for LIFE-4. ► Fuel creep, fuel thermal conductivity, fission gas transport/release, crack/boundary, and fuel gap conductivity models of LIFE-4 are identified for improvements. ► FRAPCON sensitivity results indicated the importance of the fuel thermal conduction and the fission gas release models. -- Abstract: Evolving nuclear energy programs expect to use enhanced modeling and simulation (M and S) capabilities, using multiscale, multiphysics modeling approaches, to reduce both cost and time from the design through the licensing phases. Interest in the development of the multiscale, multiphysics approach has increased in the last decade because of the need for predictive tools for complex interacting processes as a means of eliminating the limited use of empirically based model development. Complex interacting processes cannot be predicted by analyzing each individual component in isolation. In most cases, the mathematical models of complex processes and their boundary conditions are nonlinear. As a result, the solutions of these mathematical models often require high-performance computing capabilities and resources. The use of multiscale, multiphysics (MS/MP) models in conjunction with high-performance computational software and hardware introduces challenges in validating these predictive tools—traditional methodologies will have to be modified to address these challenges. The advanced MS/MP codes for nuclear fuels and reactors are being developed within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program of the US Department of Energy (DOE) – Nuclear Energy (NE). This paper does not directly address challenges in calibration

  18. The development of the fuel rod transient performance analysis code FTPAC

    International Nuclear Information System (INIS)

    Han Zhijie; Ji Songtao

    2014-01-01

    Fuel rod behavior, especially the integrity of the cladding, plays an important role in fuel safety research for reactor transients and hypothetical accident conditions. In order to study fuel rod performance under transient accidents, FTPAC (Fuel Transient Performance Analysis Code) has been developed for simulating light water reactor fuel rod transient behavior when the power or coolant boundary conditions are changing rapidly. It is composed of temperature, mechanical deformation, cladding oxidation and gas pressure models. An assessment was performed by comparing FTPAC analysis results to experimental data and FRAPTRAN code calculations. The comparison shows that FTPAC gives reasonable agreement in temperature, deformation and gas pressure predictions, and that the application of a slip coefficient is more suitable for simulating the sliding between pellet and cladding when the gap is closed. (authors)

  19. SURE: a system of computer codes for performing sensitivity/uncertainty analyses with the RELAP code

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1983-02-01

    A package of computer codes has been developed to perform a nonlinear uncertainty analysis on transient thermal-hydraulic systems which are modeled with the RELAP computer code. The package was developed around the analyses of experiments in the PWR-BDHT Separate Effects Program at Oak Ridge National Laboratory. The use of FORTRAN programs running interactively on the PDP-10 computer has made the system very easy to use and has provided great flexibility in the choice of processing paths. Several experiments simulating a loss-of-coolant accident in a nuclear reactor have been successfully analyzed. It has been shown that the system can easily be automated to further simplify its use and that conversion of the entire system to a base code other than RELAP is possible

  20. SFACTOR: a computer code for calculating dose equivalent to a target organ per microcurie-day residence of a radionuclide in a source organ

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, D.E. Jr.; Pleasant, J.C.; Killough, G.G.

    1977-11-01

    A computer code SFACTOR was developed to estimate the average dose equivalent S (rem/μCi-day) to each of a specified list of target organs per microcurie-day residence of a radionuclide in source organs in man. Source and target organs of interest are specified in the input data stream, along with the nuclear decay information. The SFACTOR code computes components of the dose equivalent rate from each type of decay present for a particular radionuclide, including alpha, electron, and gamma radiation. For those transuranic isotopes which also decay by spontaneous fission, components of S from the resulting fission fragments, neutrons, betas, and gammas are included in the tabulation. Tabulations of all components of S are provided for an array of 22 source organs and 24 target organs for 52 radionuclides in an adult.

  1. SFACTOR: a computer code for calculating dose equivalent to a target organ per microcurie-day residence of a radionuclide in a source organ

    International Nuclear Information System (INIS)

    Dunning, D.E. Jr.; Pleasant, J.C.; Killough, G.G.

    1977-11-01

    A computer code SFACTOR was developed to estimate the average dose equivalent S (rem/μCi-day) to each of a specified list of target organs per microcurie-day residence of a radionuclide in source organs in man. Source and target organs of interest are specified in the input data stream, along with the nuclear decay information. The SFACTOR code computes components of the dose equivalent rate from each type of decay present for a particular radionuclide, including alpha, electron, and gamma radiation. For those transuranic isotopes which also decay by spontaneous fission, components of S from the resulting fission fragments, neutrons, betas, and gammas are included in the tabulation. Tabulations of all components of S are provided for an array of 22 source organs and 24 target organs for 52 radionuclides in an adult
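
    The quantity tabulated by SFACTOR follows the familiar MIRD-style relation S = k · Σ_i y_i E_i φ_i(T←S) / m_T. The sketch below evaluates this relation for made-up decay data; the energies, yields, absorbed fraction and target mass are placeholders, not values from the report.

```python
# Hedged sketch of an S-factor evaluation in the MIRD style:
#   S (rem/uCi-day) = k * sum_i  y_i * E_i * phi_i(target <- source) / m_target
# All numerical inputs below are placeholders, not data from the SFACTOR report.

UCI_DAY_TO_DIS = 3.7e4 * 86400          # disintegrations per uCi-day
MEV_TO_ERG = 1.602e-6                   # erg per MeV
ERG_PER_G_TO_RAD = 1.0 / 100.0          # rad per (erg/g)

def s_factor(emissions, absorbed_fraction, target_mass_g, quality_factor=1.0):
    """emissions: list of (yield per decay, energy in MeV) pairs."""
    dose_per_decay = sum(y * e * MEV_TO_ERG * absorbed_fraction
                         for y, e in emissions) / target_mass_g      # erg/g per decay
    rad = dose_per_decay * ERG_PER_G_TO_RAD * UCI_DAY_TO_DIS         # rad per uCi-day
    return rad * quality_factor                                      # rem per uCi-day

# Placeholder nuclide: one beta (mean 0.2 MeV) and one gamma (0.4 MeV, yield 0.8).
example = s_factor(emissions=[(1.0, 0.2), (0.8, 0.4)],
                   absorbed_fraction=0.3,     # fraction of emitted energy reaching target
                   target_mass_g=310.0)       # placeholder target-organ mass
print(f"S = {example:.3e} rem per uCi-day")
```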

  2. Sensitivity Analysis of FEAST-Metal Fuel Performance Code: Initial Results

    International Nuclear Information System (INIS)

    Edelmann, Paul Guy; Williams, Brian J.; Unal, Cetin; Yacout, Abdellatif

    2012-01-01

    This memo documents the completion of the LANL milestone, M3FT-12LA0202041, describing methodologies and initial results using FEAST-Metal. The FEAST-Metal code calculations for this work are being conducted at LANL in support of on-going activities related to sensitivity analysis of fuel performance codes. The objective is to identify important macroscopic parameters of interest to modeling and simulation of metallic fuel performance. This report summarizes our preliminary results for the sensitivity analysis using 6 calibration datasets for metallic fuel developed at ANL for EBR-II experiments. Sensitivity ranking methodology was deployed to narrow down the selected parameters for the current study. There are approximately 84 calibration parameters in the FEAST-Metal code, of which 32 were ultimately used in Phase II of this study. Preliminary results of this sensitivity analysis led to the following ranking of FEAST models for future calibration and improvements: fuel conductivity, fission gas transport/release, fuel creep, and precipitation kinetics. More validation data is needed to validate calibrated parameter distributions for future uncertainty quantification studies with FEAST-Metal. Results of this study also served to point out some code deficiencies and possible errors, and these are being investigated in order to determine root causes and to improve upon the existing code models.

  3. High-performance computational fluid dynamics: a custom-code approach

    International Nuclear Information System (INIS)

    Fannon, James; Náraigh, Lennon Ó; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain

    2016-01-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier–Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing. (paper)

  4. High-performance computational fluid dynamics: a custom-code approach

    Science.gov (United States)

    Fannon, James; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain; Náraigh, Lennon Ó.

    2016-07-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier-Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing.

  5. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Science.gov (United States)

    DeTar, Carleton; Gottlieb, Steven; Li, Ruizi; Toussaint, Doug

    2018-03-01

    With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  6. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Directory of Open Access Journals (Sweden)

    DeTar Carleton

    2018-01-01

    Full Text Available With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  7. Research Integrity and Research Ethics in Professional Codes of Ethics: Survey of Terminology Used by Professional Organizations across Research Disciplines

    Science.gov (United States)

    Komić, Dubravka; Marušić, Stjepan Ljudevit; Marušić, Ana

    2015-01-01

    Professional codes of ethics are social contracts among members of a professional group, which aim to instigate, encourage and nurture ethical behaviour and prevent professional misconduct, including misconduct in research and publication. Despite the existence of codes of ethics, research misconduct remains a serious problem. A survey of codes of ethics from 795 professional organizations from the Illinois Institute of Technology’s Codes of Ethics Collection showed that 182 of them (23%) used research integrity and research ethics terminology in their codes, with differences across disciplines: while the terminology was common in professional organizations in social sciences (82%), mental health (71%) and sciences (61%), other organizations had no statements (construction trades, fraternal social organizations, real estate) or only a few of them (management, media, engineering). A subsample of 158 professional organizations we judged to be directly involved in research significantly more often had statements on research integrity/ethics terminology than the whole sample: an average of 10.4% of organizations with a statement (95% CI = 10.4–23.5%) on any of the 27 research integrity/ethics terms, compared to 3.3% (95% CI = 2.1–4.6%), respectively. Statements on research integrity/ethics concepts used prescriptive language in describing the standard of practice. Professional organizations should define research integrity and research ethics issues in their ethics codes and collaborate within and across disciplines to adequately address responsible conduct of research and meet contemporary needs of their communities. PMID:26192805

  8. PERFORMANCE ANALYSIS OF OPTICAL CDMA SYSTEM USING VC CODE FAMILY UNDER VARIOUS OPTICAL PARAMETERS

    Directory of Open Access Journals (Sweden)

    HASSAN YOUSIF AHMED

    2012-06-01

    Full Text Available The intent of this paper is to study the performance of spectral-amplitude-coding optical code-division multiple-access (OCDMA) systems using the Vector Combinatorial (VC) code under various optical parameters. This code can be constructed algebraically, based on Euclidean vectors, for any positive integer number. One of the important properties of this code is that the maximum cross-correlation is always one, which means that multi-user interference (MUI) and phase-induced intensity noise are reduced. Transmitter and receiver structures based on unchirped fiber Bragg gratings (FBGs) using the VC code, and taking into account the effects of intensity, shot and thermal noise sources, are demonstrated. The impact of the fiber distance on the bit error rate (BER) is reported using a commercial optical systems simulator, Virtual Photonic Instrument (VPI™). The VC code is compared mathematically with reported codes which use similar techniques. We analyzed and characterized the fiber link, received power, BER and channel spacing. The performance and optimization of the VC code in a SAC-OCDMA system are reported. By comparing the theoretical and simulation results taken from VPI, we have demonstrated that, for a high number of users, the effective power source is adequate when the VC code is used, even at higher data rates. It is also found that as the channel spacing goes from very narrow to wider, the BER decreases, with the best performance occurring at a spacing bandwidth between 0.8 and 1 nm. We have shown that the SAC system utilizing the VC code significantly improves the performance compared with the reported codes.
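
    The defining property quoted above (a maximum cross-correlation of one) can be checked numerically for any candidate code set; the sketch below uses a toy set of unipolar codewords, not the actual VC construction.

```python
import numpy as np

def max_cross_correlation(codes: np.ndarray) -> int:
    """Largest in-phase inner product between distinct unipolar codewords."""
    gram = codes @ codes.T
    np.fill_diagonal(gram, 0)              # ignore the auto-correlation terms
    return int(gram.max())

# Toy code set (not the VC construction): weight-3 codewords of length 9.
toy = np.array([[1, 1, 1, 0, 0, 0, 0, 0, 0],
                [1, 0, 0, 1, 1, 0, 0, 0, 0],
                [0, 1, 0, 1, 0, 1, 0, 0, 0],
                [0, 0, 1, 0, 1, 1, 0, 0, 0]])
print(max_cross_correlation(toy))          # 1 for this particular set
```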

  9. Research Integrity and Research Ethics in Professional Codes of Ethics: Survey of Terminology Used by Professional Organizations across Research Disciplines.

    Science.gov (United States)

    Komić, Dubravka; Marušić, Stjepan Ljudevit; Marušić, Ana

    2015-01-01

    Professional codes of ethics are social contracts among members of a professional group, which aim to instigate, encourage and nurture ethical behaviour and prevent professional misconduct, including misconduct in research and publication. Despite the existence of codes of ethics, research misconduct remains a serious problem. A survey of codes of ethics from 795 professional organizations from the Illinois Institute of Technology's Codes of Ethics Collection showed that 182 of them (23%) used research integrity and research ethics terminology in their codes, with differences across disciplines: while the terminology was common in professional organizations in social sciences (82%), mental health (71%) and sciences (61%), other organizations had no statements (construction trades, fraternal social organizations, real estate) or only a few of them (management, media, engineering). A subsample of 158 professional organizations we judged to be directly involved in research significantly more often had statements on research integrity/ethics terminology than the whole sample: an average of 10.4% of organizations with a statement (95% CI = 10.4-23.5%) on any of the 27 research integrity/ethics terms, compared to 3.3% (95% CI = 2.1-4.6%), respectively. Professional organizations should define research integrity and research ethics issues in their ethics codes and collaborate within and across disciplines to adequately address responsible conduct of research and meet contemporary needs of their communities.

  10. Code division multiple-access techniques in optical fiber networks. II - Systems performance analysis

    Science.gov (United States)

    Salehi, Jawad A.; Brackett, Charles A.

    1989-08-01

    A technique based on optical orthogonal codes was presented by Salehi (1989) to establish a fiber-optic code-division multiple-access (FO-CDMA) communications system. The results are used to derive the bit error rate of the proposed FO-CDMA system as a function of data rate, code length, code weight, number of users, and receiver threshold. The performance characteristics for a variety of system parameters are discussed. A means of reducing the effective multiple-access interference signal by placing an optical hard-limiter at the front end of the desired optical correlator is presented. Performance calculations are shown for the FO-CDMA system with an ideal optical hard-limiter, and it is shown that using an optical hard-limiter would, in general, improve system performance.

  11. Improving performance of DS-CDMA systems using chaotic complex Bernoulli spreading codes

    Science.gov (United States)

    Farzan Sabahi, Mohammad; Dehghanfard, Ali

    2014-12-01

    The most important goal of a spread-spectrum communication system is to protect communication signals against interference and against the exploitation of information by unintended listeners. In fact, low probability of detection and low probability of intercept are two important parameters for increasing the performance of the system. In Direct-Sequence Code Division Multiple Access (DS-CDMA) systems, these properties are achieved by multiplying the data information by spreading sequences. Chaotic sequences, with their particular properties, have numerous applications in constructing spreading codes. The use of a one-dimensional Bernoulli chaotic sequence as a spreading code has been proposed previously in the literature. The main feature of this sequence is its negative auto-correlation at lag 1, which, with proper design, leads to an increase in the efficiency of communication systems based on these codes. On the other hand, employing complex chaotic sequences as spreading sequences has also been discussed in several papers. In this paper, the use of two-dimensional Bernoulli chaotic sequences is proposed as spreading codes. The performance of a multi-user synchronous and asynchronous DS-CDMA system is evaluated by applying these sequences over Additive White Gaussian Noise (AWGN) and fading channels. Simulation results indicate an improvement in performance in comparison with conventional spreading codes like Gold codes as well as with similar complex chaotic spreading sequences. Similar to one-dimensional Bernoulli chaotic sequences, the proposed sequences also have negative auto-correlation. Besides, the construction of complex sequences with lower average cross-correlation is possible with the proposed method.
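
    A minimal sketch of how a one-dimensional Bernoulli map can be turned into a ±1 spreading sequence is given below; the two-dimensional complex variant proposed in the paper is not reproduced, and the map parameter and thresholding rule are illustrative assumptions.

```python
import numpy as np

def bernoulli_spreading_code(length: int, x0: float = 0.3, mu: float = 1.99) -> np.ndarray:
    """Iterate the Bernoulli (shift) map x -> mu*x mod 1 and threshold to +/-1."""
    x, chips = x0, np.empty(length)
    for i in range(length):
        x = (mu * x) % 1.0                 # Bernoulli map iteration
        chips[i] = 1.0 if x >= 0.5 else -1.0
    return chips

code = bernoulli_spreading_code(63)
# The paper exploits negative lag-1 auto-correlation; here we simply measure it
# for this toy sequence.
lag1 = np.dot(code[:-1], code[1:]) / (len(code) - 1)
print(f"lag-1 auto-correlation: {lag1:+.3f}")
```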

  12. Fusion PIC code performance analysis on the Cori KNL system

    Energy Technology Data Exchange (ETDEWEB)

    Koskela, Tuomas S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Deslippe, Jack [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Friesen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Raman, Karthic [INTEL Corp. (United States)

    2017-05-25

    We study the attainable performance of Particle-In-Cell codes on the Cori KNL system by analyzing a miniature particle-push application based on the fusion PIC code XGC1. We start from the most basic building blocks of a PIC code and build up the complexity to identify the kernels that cost the most in performance, and we focus optimization efforts there. Particle-push kernels operate at high arithmetic intensity and are not likely to be memory-bandwidth or even cache-bandwidth bound on KNL. Therefore, we see only minor benefits from the high-bandwidth memory available on KNL, and achieving good vectorization is shown to be the most beneficial optimization path, with a theoretical yield of up to 8x speedup on KNL. In practice we are able to obtain up to a 4x gain from vectorization due to limitations set by the data layout and memory latency.
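
    To illustrate why the push kernel vectorizes well when particle data are stored as contiguous arrays, the following generic structure-of-arrays sketch (not XGC1 code; the field, step size and charge-to-mass ratio are placeholders) performs a simple kick-drift push.

```python
import numpy as np

# Structure-of-arrays layout: each attribute is a contiguous vector, so the
# push step becomes a handful of element-wise (SIMD-friendly) operations.
def push(x, v, E_at, q_over_m, dt):
    """Advance all particles one step in a given electric field E(x)."""
    v += q_over_m * E_at(x) * dt             # kick: one fused multiply-add per lane
    x += v * dt                               # drift
    return x, v

n = 1_000_000
x = np.random.rand(n)                         # positions in [0, 1)
v = np.zeros(n)                               # velocities
E = lambda pos: np.sin(2 * np.pi * pos)       # toy field; a real PIC code interpolates from a mesh
for _ in range(10):
    x, v = push(x, v, E, q_over_m=1.0, dt=1e-3)
print(x[:3], v[:3])
```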

  13. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartment containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis of Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of only 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses, considering the inherent conservatism in the CONTEMPT-LT code.

  14. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartment containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis of Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of only 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses, considering the inherent conservatism in the CONTEMPT-LT code.

  15. Bearing performance degradation assessment based on time-frequency code features and SOM network

    International Nuclear Information System (INIS)

    Zhang, Yan; Tang, Baoping; Han, Yan; Deng, Lei

    2017-01-01

    Bearing performance degradation assessment and prognostics are extremely important in supporting maintenance decision and guaranteeing the system’s reliability. To achieve this goal, this paper proposes a novel feature extraction method for the degradation assessment and prognostics of bearings. Features of time-frequency codes (TFCs) are extracted from the time-frequency distribution using a hybrid procedure based on short-time Fourier transform (STFT) and non-negative matrix factorization (NMF) theory. An alternative way to design the health indicator is investigated by quantifying the similarity between feature vectors using a self-organizing map (SOM) network. On the basis of this idea, a new health indicator called time-frequency code quantification error (TFCQE) is proposed to assess the performance degradation of the bearing. This indicator is constructed based on the bearing real-time behavior and the SOM model that is previously trained with only the TFC vectors under the normal condition. Vibration signals collected from the bearing run-to-failure tests are used to validate the developed method. The comparison results demonstrate the superiority of the proposed TFCQE indicator over many other traditional features in terms of feature quality metrics, incipient degradation identification and achieving accurate prediction. Highlights • Time-frequency codes are extracted to reflect the signals’ characteristics. • SOM network served as a tool to quantify the similarity between feature vectors. • A new health indicator is proposed to demonstrate the whole stage of degradation development. • The method is useful for extracting the degradation features and detecting the incipient degradation. • The superiority of the proposed method is verified using experimental data. (paper)
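
    A heavily simplified sketch of the processing chain described above (STFT magnitudes, non-negative matrix factorization, then a quantization error against prototypes learned from normal-condition data) is given below; the synthetic signals, the nearest-prototype codebook standing in for a full SOM, and all parameter choices are assumptions made for illustration only.

        # Hedged sketch of the TFC/quantization-error idea: STFT magnitudes are
        # factorized with NMF, and the health indicator is the mean distance of each
        # new feature vector to the nearest prototype learned from "normal" data.
        # A nearest-prototype codebook stands in for the SOM used in the paper.
        import numpy as np
        from scipy.signal import stft
        from sklearn.decomposition import NMF

        def spectrogram(x, fs=20480, nperseg=256):
            _, _, Z = stft(x, fs=fs, nperseg=nperseg)
            return np.abs(Z).T                       # frames x frequency bins

        rng = np.random.default_rng(0)
        t = np.arange(20480) / 20480.0
        healthy = rng.normal(size=t.size)            # placeholder normal vibration
        faulty = healthy + 0.8 * np.sin(2 * np.pi * 3200 * t)   # placeholder defect tone

        nmf = NMF(n_components=8, init="nndsvda", max_iter=500)
        codebook = nmf.fit_transform(spectrogram(healthy))      # normal-condition prototypes

        def health_indicator(x):
            W = nmf.transform(spectrogram(x))        # project onto the normal basis
            d = np.linalg.norm(W[:, None, :] - codebook[None, :, :], axis=2)
            return d.min(axis=1).mean()              # mean nearest-prototype distance

        print("normal signal indicator:", health_indicator(healthy))
        print("faulty signal indicator:", health_indicator(faulty))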

  16. Performance in Public Organizations

    DEFF Research Database (Denmark)

    Andersen, Lotte Bøgh; Boesen, Andreas; Pedersen, Lene Holm

    2016-01-01

    Performance in public organizations is a key concept that requires clarification. Based on a conceptual review of research published in 10 public administration journals, this article proposes six distinctions to describe the systematic differences in performance criteria: From which stakeholder ... of management and performance are classified. The results illustrate how a systematization of the conceptual space of performance in public organizations can help researchers select what to study and what to leave out with greater accuracy while also bringing greater clarity to public debates about performance.

  17. Improving Eleventh Graders’ Reading Comprehension Through Text Coding and Double Entry Organizer Reading Strategies

    Directory of Open Access Journals (Sweden)

    Rocío Mahecha

    2011-07-01

    In this article we report on an innovation project developed with a group of eleventh graders at a public school in Bogotá. Its aim was to encourage students to improve their reading comprehension of texts in English. It was conducted taking into account students' needs, interests and level of English. To do so, we implemented two reading strategies: text coding and double entry organizer. We observed the students' attitudes during two lesson plans, compared their level of comprehension before and after using the reading strategies, and asked them to self-evaluate their performance. At the end, we could see their improvement, how they enjoyed doing the activities, and how they became more confident.

  18. Sensitivity assessment of fuel performance codes for LOCA accident scenario

    Energy Technology Data Exchange (ETDEWEB)

    Abe, Alfredo; Gomes, Daniel; Silva, Antonio Teixeira e; Muniz, Rafael O.R. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Giovedi, Claudia; Martins, Marcelo, E-mail: ayabe@ipen.br, E-mail: claudia.giovedi@labrisco.usp.br [Universidade de Sao Paulo (LABRISCO/USP), Sao Paulo, SP (Brazil). Lab. de Análise, Avaliação e Gerenciamento de Risco

    2017-07-01

    The FRAPCON code predicts fuel rod performance in LWRs (Light Water Reactors) by modeling fuel responses under normal operating conditions and anticipated operational occurrences; the FRAPTRAN code is applied to fuel behavior under fast transient and accident conditions. The codes are well known and applied for different purposes, and one of their uses is sensitivity analysis considering fuel design parameters associated with fabrication; moreover, they can address the effect of physical model biases. The objective of this work was to perform an assessment of fuel manufacturing parameter tolerances and fuel model biases using the FRAPCON and FRAPTRAN codes for a Loss of Coolant Accident (LOCA) scenario. The preliminary analysis considered a direct approach taking into account the most relevant manufacturing tolerances (lower and upper bounds) related to design parameters and physical model biases, without considering their statistical distributions. The simulations were carried out using data available in the open literature related to the series of LOCA experiments performed at the Halden reactor (specifically IFA-650.5). The manufacturing tolerances associated with design parameters considered in this paper were: enrichment, cladding thickness, pellet diameter, pellet density, and filling gas pressure. The physical models considered were: fuel thermal expansion, fission gas release, fuel swelling, irradiation creep, cladding thermal expansion, cladding corrosion, and cladding hydrogen pickup. The results obtained from the sensitivity analysis addressed the impact of manufacturing tolerances and physical models on the fuel cladding burst time observed for the IFA-650.5 experiment. (author)
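
    The direct, one-at-a-time treatment of the tolerance bounds described above can be organized as a simple parameter sweep; in the sketch below, run_burst_time is a hypothetical stand-in for a FRAPCON/FRAPTRAN calculation of the IFA-650.5 case, and all nominal values and tolerances are placeholders rather than the actual design data.

        # Hedged sketch of a one-at-a-time (OAT) sweep over manufacturing tolerances.
        # run_burst_time is a HYPOTHETICAL stand-in for a FRAPCON/FRAPTRAN case; the
        # nominal values and tolerances below are placeholders, not real design data.
        nominal = {"enrichment_pct": 4.95, "clad_thickness_mm": 0.57,
                   "pellet_diameter_mm": 8.19, "pellet_density_pct_td": 95.5,
                   "fill_gas_pressure_mpa": 2.0}
        tolerance = {"enrichment_pct": 0.05, "clad_thickness_mm": 0.02,
                     "pellet_diameter_mm": 0.01, "pellet_density_pct_td": 1.0,
                     "fill_gas_pressure_mpa": 0.1}

        def run_burst_time(params):
            # Placeholder surrogate returning a fake burst time (s) so the sweep runs;
            # replace with a wrapper that builds the input deck and runs the fuel code.
            return (300.0 + 20.0 * (params["clad_thickness_mm"] - 0.57)
                    - 15.0 * (params["fill_gas_pressure_mpa"] - 2.0))

        results = {}
        for name, nom in nominal.items():
            for label, value in (("lower", nom - tolerance[name]),
                                 ("upper", nom + tolerance[name])):
                case = dict(nominal, **{name: value})
                results[(name, label)] = run_burst_time(case)
        for key, burst in sorted(results.items()):
            print(key, "->", round(burst, 2), "s")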

  19. Sensitivity assessment of fuel performance codes for LOCA accident scenario

    International Nuclear Information System (INIS)

    Abe, Alfredo; Gomes, Daniel; Silva, Antonio Teixeira e; Muniz, Rafael O.R.; Giovedi, Claudia; Martins, Marcelo

    2017-01-01

    The FRAPCON code predicts fuel rod performance in LWRs (Light Water Reactors) by modeling fuel responses under normal operating conditions and anticipated operational occurrences; the FRAPTRAN code is applied to fuel behavior under fast transient and accident conditions. The codes are well known and applied for different purposes, and one of their uses is sensitivity analysis considering fuel design parameters associated with fabrication; moreover, they can address the effect of physical model biases. The objective of this work was to perform an assessment of fuel manufacturing parameter tolerances and fuel model biases using the FRAPCON and FRAPTRAN codes for a Loss of Coolant Accident (LOCA) scenario. The preliminary analysis considered a direct approach taking into account the most relevant manufacturing tolerances (lower and upper bounds) related to design parameters and physical model biases, without considering their statistical distributions. The simulations were carried out using data available in the open literature related to the series of LOCA experiments performed at the Halden reactor (specifically IFA-650.5). The manufacturing tolerances associated with design parameters considered in this paper were: enrichment, cladding thickness, pellet diameter, pellet density, and filling gas pressure. The physical models considered were: fuel thermal expansion, fission gas release, fuel swelling, irradiation creep, cladding thermal expansion, cladding corrosion, and cladding hydrogen pickup. The results obtained from the sensitivity analysis addressed the impact of manufacturing tolerances and physical models on the fuel cladding burst time observed for the IFA-650.5 experiment. (author)

  20. Modelling of LOCA Tests with the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L [Idaho National Laboratory; Pastore, Giovanni [Idaho National Laboratory; Novascone, Stephen Rhead [Idaho National Laboratory; Spencer, Benjamin Whiting [Idaho National Laboratory; Hales, Jason Dean [Idaho National Laboratory

    2016-05-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.

  1. Uncertainty and sensitivity analysis using probabilistic system assessment code. 1

    International Nuclear Information System (INIS)

    Honma, Toshimitsu; Sasahara, Takashi.

    1993-10-01

    This report presents the results obtained when applying the probabilistic system assessment code under development to the PSACOIN Level 0 intercomparison exercise organized by the Probabilistic System Assessment Code User Group in the Nuclear Energy Agency (NEA) of the OECD. This exercise is one of a series designed to compare and verify probabilistic codes used in the performance assessment of geological radioactive waste disposal facilities. The computations were performed using the Monte Carlo sampling code PREP and the post-processor code USAMO. The submodels of the waste disposal system were described and coded according to the specification of the exercise. Besides the results required for the exercise, further uncertainty and sensitivity analyses were performed, and the details of these are also included. (author)
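
    The sampling-and-screening workflow carried out with PREP and USAMO can be illustrated generically: draw random samples of the uncertain inputs, evaluate the model, and rank the inputs by their rank correlation with the output. The toy model and parameter ranges below are placeholders, not the PSACOIN Level 0 sub-models.

        # Hedged sketch of Monte Carlo uncertainty/sensitivity analysis: sample the
        # inputs, run the model, rank inputs by Spearman correlation with the output.
        # The toy "dose" model and the parameter ranges are placeholders.
        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(42)
        n = 1000
        samples = {"leach_rate": rng.uniform(1e-5, 1e-3, n),
                   "darcy_velocity": rng.uniform(0.01, 1.0, n),
                   "retardation": rng.uniform(1.0, 100.0, n)}

        def toy_dose(leach_rate, darcy_velocity, retardation):
            return leach_rate * darcy_velocity / retardation   # placeholder model

        dose = toy_dose(**samples)
        print("mean dose:", dose.mean(), "  95th percentile:", np.quantile(dose, 0.95))
        for name, values in samples.items():
            rho, _ = spearmanr(values, dose)
            print(f"{name:15s} Spearman rank correlation = {rho:+.2f}")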

  2. A general purpose code for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Wilcke, W.W.; Rochester Univ., NY

    1984-01-01

    A general-purpose computer code MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility the code is organized like a general purpose computer, operating on a vector describing the time dependent state of the system under simulation. The instruction set of the 'computer' is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations. (orig.)
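
    The "general-purpose computer" organization described above, a state vector transformed by a user-defined instruction set with conditional and iterative execution, can be pictured with a few lines of generic code; the state variables and operations below are invented for illustration and do not correspond to any particular physics problem.

        # Hedged sketch of a MONTHY-like organization: the simulation state is a
        # vector (here a dict), and the user supplies the "instruction set" of
        # operations that are executed iteratively and conditionally.
        import random

        state = {"t": 0.0, "energy": 10.0, "alive": True}

        def decay_step(s):                    # user-defined "instruction"
            s["t"] += 1.0
            s["energy"] *= random.uniform(0.8, 1.0)

        def absorb_check(s):                  # conditional termination
            if s["energy"] < 1.0:
                s["alive"] = False

        program = [decay_step, absorb_check]  # the user-defined instruction set

        random.seed(1)
        while state["alive"]:                 # iterative execution over the state
            for op in program:
                op(state)
        print("history terminated at t =", state["t"])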

  3. User manual for the probabilistic fuel performance code FRP

    International Nuclear Information System (INIS)

    Friis Jensen, J.; Misfeldt, I.

    1980-10-01

    This report describes the use of the probabilistic fuel performance code FRP. Detailed description of both input to and output from the program are given. The use of the program is illustrated by an example. (author)

  4. Performance of Low-Density Parity-Check Coded Modulation

    Science.gov (United States)

    Hamkins, Jon

    2010-01-01

    This paper reports the simulated performance of each of the nine accumulate-repeat-4-jagged-accumulate (AR4JA) low-density parity-check (LDPC) codes [3] when used in conjunction with binary phase-shift-keying (BPSK), quadrature PSK (QPSK), 8-PSK, 16-ary amplitude PSK (16-APSK), and 32-APSK. We also report the performance under various mappings of bits to modulation symbols, 16-APSK and 32-APSK ring scalings, log-likelihood ratio (LLR) approximations, and decoder variations. One of the simple and well-performing LLR approximations can be expressed in a general equation that applies to all of the modulation types.
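
    One widely used LLR approximation for coded modulation is the max-log form, in which each per-bit LLR reduces to a difference of minimum squared Euclidean distances scaled by the noise level; the sketch below applies it to Gray-labelled QPSK and is only a plausible stand-in for the general equation referred to in the abstract, whose exact form and labellings for 8-PSK, 16-APSK and 32-APSK may differ.

        # Hedged sketch: max-log LLR approximation on an AWGN channel,
        #   LLR(b_i) ~ (min_{s: b_i=1}|y-s|^2 - min_{s: b_i=0}|y-s|^2) / N0,
        # shown for Gray-labelled QPSK; the paper's approximation may differ.
        import numpy as np

        # bits (b0, b1) -> constellation point, scaled to unit average energy
        points = {(0, 0): 1 + 1j, (0, 1): -1 + 1j, (1, 1): -1 - 1j, (1, 0): 1 - 1j}
        points = {b: s / np.sqrt(2) for b, s in points.items()}

        def maxlog_llr(y, n0):
            llrs = []
            for i in range(2):                               # bit position
                d0 = min(abs(y - s) ** 2 for b, s in points.items() if b[i] == 0)
                d1 = min(abs(y - s) ** 2 for b, s in points.items() if b[i] == 1)
                llrs.append((d1 - d0) / n0)                  # positive favours bit 0
            return llrs

        print(maxlog_llr(0.9 + 0.2j, n0=0.5))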

  5. Verification testing of the compression performance of the HEVC screen content coding extensions

    Science.gov (United States)

    Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng

    2017-09-01

    This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and using similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements show a very substantial benefit in coding efficiency for the SCC extensions, and provided consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60-90% relative to the JM and 40-80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.
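
    The Bjøntegaard-delta bit-rate figures mentioned above are conventionally obtained by fitting third-order polynomials to the (PSNR, log-rate) points of each codec and integrating over the overlapping quality range; the sketch below follows that common recipe, and the four rate/PSNR points per codec are made up purely for illustration.

        # Hedged sketch of the common Bjontegaard-delta (BD) bit-rate recipe:
        # cubic fit of log(rate) vs PSNR, integrate over the overlapping PSNR
        # range, convert the mean log-rate difference to a percentage.
        import numpy as np

        def bd_rate(rate_ref, psnr_ref, rate_test, psnr_test):
            p_ref = np.polyfit(psnr_ref, np.log(rate_ref), 3)
            p_test = np.polyfit(psnr_test, np.log(rate_test), 3)
            lo = max(min(psnr_ref), min(psnr_test))
            hi = min(max(psnr_ref), max(psnr_test))
            int_ref = np.polyval(np.polyint(p_ref), hi) - np.polyval(np.polyint(p_ref), lo)
            int_test = np.polyval(np.polyint(p_test), hi) - np.polyval(np.polyint(p_test), lo)
            return (np.exp((int_test - int_ref) / (hi - lo)) - 1.0) * 100.0

        # Made-up operating points (kbps, dB); a negative BD-rate means bit-rate savings.
        r1, q1 = [1000, 1800, 3200, 6000], [34.0, 36.5, 39.0, 41.5]   # reference codec
        r2, q2 = [600, 1100, 2000, 3900], [34.2, 36.8, 39.3, 41.8]    # codec under test
        print("BD-rate: %.1f %%" % bd_rate(r1, q1, r2, q2))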

  6. An Efficient Method for Verifying Gyrokinetic Microstability Codes

    Science.gov (United States)

    Bravenec, R.; Candy, J.; Dorland, W.; Holland, C.

    2009-11-01

    Benchmarks for gyrokinetic microstability codes can be developed through successful "apples-to-apples" comparisons among them. Unlike previous efforts, we perform the comparisons for actual discharges, rendering the verification efforts relevant to existing experiments and future devices (ITER). The process requires i) assembling the experimental analyses at multiple times, radii, discharges, and devices, ii) creating the input files ensuring that the input parameters are faithfully translated code-to-code, iii) running the codes, and iv) comparing the results, all in an organized fashion. The purpose of this work is to automate this process as much as possible: At present, a Python routine is used to generate and organize GYRO input files from TRANSP or ONETWO analyses. Another routine translates the GYRO input files into GS2 input files. (Translation software for other codes has not yet been written.) Other Python codes submit the multiple GYRO and GS2 jobs, organize the results, and collect them into a table suitable for plotting. (These separate Python routines could easily be consolidated.) An example of the process -- a linear comparison between GYRO and GS2 for a DIII-D discharge at multiple radii -- will be presented.

  7. A Paradigm Shift in the Implementation of Ethics Codes in Construction Organizations in Hong Kong: Towards an Ethical Behaviour.

    Science.gov (United States)

    Ho, Christabel Man-Fong; Oladinrin, Olugbenga Timo

    2018-01-30

    Due to economic globalization, which has been accompanied by business scandals, scholars and practitioners are increasingly engaged with the implementation of codes of ethics as a regulatory mechanism for stimulating ethical behaviour within an organization. The aim of this study is to examine various organizational practices regarding the effective implementation of codes of ethics within construction contracting companies. Views on ethics management in construction organizations, together with recommendations for improvement, were gleaned through 19 semi-structured interviews involving construction practitioners from various construction companies in Hong Kong. The findings suggest some practices for the effective implementation of codes of ethics in order to diffuse ethical behaviour in an organizational setting, including the introduction of effective reward schemes, the arrangement of ethics training for employees, and leadership responsiveness to reported wrongdoing. Since most of the construction companies in Hong Kong have codes of ethics, emphasis is placed on the practical implementation of the codes within the organizations. Hence, implications are drawn from the recommended measures to guide construction companies and policy makers.

  8. The DIT nuclear fuel assembly physics design code

    International Nuclear Information System (INIS)

    Jonsson, A.

    1988-01-01

    The DIT code is the Combustion Engineering, Inc. (C-E) nuclear fuel assembly design code. It belongs to a class of codes, all similar in structure and strategy, that may be characterized by the spectrum and spatial calculations being performed in two dimensions and in a single job step for the entire assembly. The forerunner of this class of codes is the United Kingdom Atomic Energy Authority WIMS code, the first version of which was completed 25 yr ago. The structure and strategy of assembly spectrum codes have remained remarkably similar to the original concept thus proving its usefulness. As other organizations, including C-E, have developed their own versions of the concept, many important variations have been added that significantly influence the accuracy and performance of the resulting computational tool. Those features, which are unique to the DIT code and which might be of interest to the community of fuel assembly physics design code users and developers, are described and discussed

  9. Performance analysis of multiple interference suppression over asynchronous/synchronous optical code-division multiple-access system based on complementary/prime/shifted coding scheme

    Science.gov (United States)

    Nieh, Ta-Chun; Yang, Chao-Chin; Huang, Jen-Fa

    2011-08-01

    A complete complementary/prime/shifted prime (CPS) code family for the optical code-division multiple-access (OCDMA) system is proposed. Based on the properties of complete complementary (CC) codes, multiple-access interference (MAI) can be suppressed and eliminated in a spectral amplitude coding (SAC) OCDMA system under asynchronous/synchronous transmission. By utilizing the shifted prime (SP) code in the SAC scheme, the hardware implementation of the encoder/decoder can be simplified with a reduced number of optical components, such as arrayed waveguide gratings (AWG) and fiber Bragg gratings (FBG). This system has a superior performance as compared to previous bipolar-bipolar coding OCDMA systems.

  10. Performance analysis of wavelength/spatial coding system with fixed in-phase code matrices in OCDMA network

    Science.gov (United States)

    Tsai, Cheng-Mu; Liang, Tsair-Chun

    2011-12-01

    This paper proposes a wavelength/spatial (W/S) coding system with fixed in-phase code (FIPC) matrices in the optical code-division multiple-access (OCDMA) network. A scheme is presented to form the FIPC matrix, which is applied to construct the W/S OCDMA network. The encoder/decoder in the W/S OCDMA network is fully able to eliminate the multiple-access interference (MAI) at the balanced photo-detectors (PDs), owing to the fixed in-phase cross-correlation. The phase-induced intensity noise (PIIN), which scales with the square of the received power, is markedly suppressed in the receiver by spreading the received power over each PD while the net signal power is kept the same. Simulation results show that the W/S OCDMA network based on the FIPC matrices can not only completely remove the MAI but also effectively suppress the PIIN, upgrading the network performance.

  11. Performance Analysis of Wavelength Multiplexed SAC OCDMA Codes in Beat Noise Mitigation in SAC OCDMA Systems

    Science.gov (United States)

    Alhassan, A. M.; Badruddin, N.; Saad, N. M.; Aljunid, S. A.

    2013-07-01

    In this paper we investigate the use of wavelength multiplexed spectral amplitude coding (WM SAC) codes for beat noise mitigation in coherent-source SAC OCDMA systems. A WM SAC code is a low-weight SAC code in which the whole code structure is repeated diagonally (once or more) in the wavelength domain to achieve the same cardinality as a higher-weight SAC code. Results show that for highly populated networks, the WM SAC codes provide better performance than SAC codes. However, for a small number of active users the situation is reversed. Apart from their promising improvement in performance, these codes are more flexible and impose less complexity on the system design than their SAC counterparts.

  12. Analysis of parallel computing performance of the code MCNP

    International Nuclear Information System (INIS)

    Wang Lei; Wang Kan; Yu Ganglin

    2006-01-01

    Parallel computing can effectively reduce the running time of the code MCNP. With MPI message-passing software, MCNP5 can perform parallel computing on a PC cluster running the Windows operating system. The parallel computing performance of MCNP is influenced by factors such as the type, the complexity level and the parameter configuration of the computing problem. This paper analyzes the parallel computing performance of MCNP with regard to these factors and gives measures to improve it. (authors)

  13. Participation in the international comparison of probabilistic consequence assessment codes organized by OECD/NEA and CEC. Final Report

    International Nuclear Information System (INIS)

    Rossi, J.

    1994-02-01

    Probabilistic Consequence Assessment (PCA) methods are exploited not only in risk evaluation but also to study alternative design features and reactor siting recommendations, and to support radiation safety authorities in establishing acceptable dose criteria. The models are programmed into computer codes for this kind of assessment. To investigate the quality and capability of different models, the OECD/NEA and the CEC organized an international code comparison exercise in which organizations from 15 countries participated. Seven codes took part in the exercise. The objectives of the code comparison exercise were to compare the results produced by the codes, to contribute to PCA code quality assurance, to harmonize the codes, to provide a forum for discussion on various approaches, and to produce a report on the exercise. The project started in 1991 and the results of the calculations were completed in autumn 1992. The international report consists of two parts: the Overview Report for decision makers and the supporting detailed Technical Report. In this report, the results of the project are reviewed from the perspective of a user of VTT's ARANO programme, and directions for its further development are indicated. (orig.) (11 refs., 13 figs., 4 tabs.)

  14. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  15. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  16. Performance Comparison of Containment PT analysis between CAP and CONTEMPT Code

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Yeon Jun; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul [FNC Tech., Seoul (Korea, Republic of); Ha, Sang Jun; Choi, Hoon [KHNP-CENTRAL RESEARCH INSTITUTE, Daejeon (Korea, Republic of)]

    2013-10-15

    CAP, in the form linked with SPACE, computes the containment back-pressure during a LOCA. In the previous safety analysis report (SAR) of Shin-Kori Units 3 and 4, the CONTEMPT series of codes (hereafter referred to simply as 'CONTEMPT') is used to evaluate containment safety during the postulated loss-of-coolant accident (LOCA). In more detail, CONTEMPT-LT/028 was used to calculate the maximum containment PT, while CONTEMPT4/MOD5 was used to calculate the minimum PT. In the minimum PT analysis, CONTEMPT4/MOD5, which provides the containment back-pressure condition, was linked with RELAP5/MOD3.3, which calculates the amount of blowdown into the containment. In this analysis, CONTEMPT4/MOD5 was modified based on KREM. The CONTEMPT code was developed to predict the long-term behavior of water-cooled nuclear reactor containment systems subjected to LOCA conditions. It calculates the time variation of compartment pressures, temperatures, mass and energy inventories, heat structure temperature distributions, energy exchange with adjacent compartments, and the effect of leakage on the containment response. Models are provided for fan coolers and cooling sprays as engineered safety systems. Any compartment may have both a liquid pool region and an air-vapor atmosphere region above the pool. Each region is assumed to have a uniform temperature, but the temperatures of the two regions may be different. As mentioned above, CONTEMPT has similar code features and is therefore expected to show analysis performance similar to that of CAP. In this study, the differences between CAP and two CONTEMPT code versions (CONTEMPT-LT/028 for maximum PT and CONTEMPT4/MOD5 for minimum PT) are identified in detail, and the code performances were compared for the same problem. A code-by-code comparison was carried out to identify the differences in LOCA analysis between the CONTEMPT series and the CAP code. With regard to important factors that affect the transient behavior of the compartment thermodynamic state ...

  17. Performance Comparison of Containment PT analysis between CAP and CONTEMPT Code

    International Nuclear Information System (INIS)

    Choo, Yeon Jun; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2013-01-01

    CAP, in the form linked with SPACE, computes the containment back-pressure during a LOCA. In the previous safety analysis report (SAR) of Shin-Kori Units 3 and 4, the CONTEMPT series of codes (hereafter referred to simply as 'CONTEMPT') is used to evaluate containment safety during the postulated loss-of-coolant accident (LOCA). In more detail, CONTEMPT-LT/028 was used to calculate the maximum containment PT, while CONTEMPT4/MOD5 was used to calculate the minimum PT. In the minimum PT analysis, CONTEMPT4/MOD5, which provides the containment back-pressure condition, was linked with RELAP5/MOD3.3, which calculates the amount of blowdown into the containment. In this analysis, CONTEMPT4/MOD5 was modified based on KREM. The CONTEMPT code was developed to predict the long-term behavior of water-cooled nuclear reactor containment systems subjected to LOCA conditions. It calculates the time variation of compartment pressures, temperatures, mass and energy inventories, heat structure temperature distributions, energy exchange with adjacent compartments, and the effect of leakage on the containment response. Models are provided for fan coolers and cooling sprays as engineered safety systems. Any compartment may have both a liquid pool region and an air-vapor atmosphere region above the pool. Each region is assumed to have a uniform temperature, but the temperatures of the two regions may be different. As mentioned above, CONTEMPT has similar code features and is therefore expected to show analysis performance similar to that of CAP. In this study, the differences between CAP and two CONTEMPT code versions (CONTEMPT-LT/028 for maximum PT and CONTEMPT4/MOD5 for minimum PT) are identified in detail, and the code performances were compared for the same problem. A code-by-code comparison was carried out to identify the differences in LOCA analysis between the CONTEMPT series and the CAP code. With regard to important factors that affect the transient behavior of the compartment thermodynamic state ...

  18. Performance of the dot product function in radiative transfer code SORD

    Science.gov (United States)

    Korkin, Sergey; Lyapustin, Alexei; Sinyuk, Aliaksandr; Holben, Brent

    2016-10-01

    The successive-orders-of-scattering radiative transfer (RT) codes frequently call the scalar (dot) product function. In this paper, we study the performance of several implementations of the dot product in the RT code SORD using 50 scenarios for light scattering in the atmosphere-surface system. In the dot product function, we use the unrolled-loop technique with different unrolling factors. We also consider the intrinsic Fortran functions. We show results for two machines: the ifort compiler under Windows, and pgf90 under Linux. The intrinsic DOT_PRODUCT function showed the best performance with ifort. For pgf90, the dot product implemented with unrolling factor 4 was the fastest. The RT code SORD, together with the interface that runs all the mentioned tests, is publicly available from ftp://maiac.gsfc.nasa.gov/pub/skorkin/SORD_IP_16B (current release) or by email request from the corresponding (first) author.
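
    The comparison reported above (hand-unrolled loops against the language's intrinsic dot product, done in Fortran in the paper) can be mimicked structurally in any language; the Python sketch below only shows what an unrolling factor of 4 looks like next to a naive loop and the library routine, and it makes no claim about reproducing the Fortran timings.

        # Hedged sketch: naive loop, loop unrolled by a factor of 4, and the library
        # routine for the dot product.  SORD itself uses Fortran (DOT_PRODUCT
        # intrinsic vs. hand-unrolled loops); this only illustrates the structure.
        import numpy as np

        def dot_naive(a, b):
            s = 0.0
            for x, y in zip(a, b):
                s += x * y
            return s

        def dot_unroll4(a, b):
            n, i = len(a), 0
            s0 = s1 = s2 = s3 = 0.0
            while i + 3 < n:                  # main body, four products per pass
                s0 += a[i] * b[i]
                s1 += a[i + 1] * b[i + 1]
                s2 += a[i + 2] * b[i + 2]
                s3 += a[i + 3] * b[i + 3]
                i += 4
            for j in range(i, n):             # remainder loop
                s0 += a[j] * b[j]
            return s0 + s1 + s2 + s3

        a = np.random.default_rng(0).random(10001)
        b = np.random.default_rng(1).random(10001)
        assert np.isclose(dot_naive(a, b), dot_unroll4(a, b))
        assert np.isclose(dot_unroll4(a, b), np.dot(a, b))   # library "intrinsic"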

  19. Performance and Complexity Evaluation of Iterative Receiver for Coded MIMO-OFDM Systems

    Directory of Open Access Journals (Sweden)

    Rida El Chall

    2016-01-01

    Multiple-input multiple-output (MIMO) technology in combination with channel coding techniques is a promising solution for reliable high-data-rate transmission in future wireless communication systems. However, these technologies pose significant challenges for the design of an iterative receiver. In this paper, an efficient receiver combining soft-input soft-output (SISO) detection based on a low-complexity K-Best (LC-K-Best) decoder with various forward error correction codes, namely the LTE turbo decoder and an LDPC decoder, is investigated. We first investigate the convergence behavior of the iterative MIMO receivers to determine the required inner and outer iterations. Consequently, the performance of the LC-K-Best-based receiver is evaluated in various LTE channel environments and compared with other MIMO detection schemes. Moreover, the computational complexity of the iterative receiver with different channel coding techniques is evaluated and compared for different modulation orders and coding rates. Simulation results show that the LC-K-Best-based receiver achieves satisfactory performance-complexity trade-offs.

  20. An Examination of the Performance Based Building Code on the Design of a Commercial Building

    Directory of Open Access Journals (Sweden)

    John Greenwood

    2012-11-01

    The Building Code of Australia (BCA) is the principal code under which building approvals in Australia are assessed. The BCA adopted performance-based solutions for building approvals in 1996. Performance-based codes are based upon a set of explicit objectives, stated in terms of a hierarchy of requirements beginning with key general objectives. With this in mind, the research presented in this paper aims to analyse the impact of the introduction of the performance-based code within Western Australia and to gauge the effect and usefulness of alternative design solutions in commercial construction, using a case study project. The research revealed that there are several advantages to the use of alternative designs and that all parties, in general, are in favour of the performance-based building code of Australia. It is suggested that a change in the assessment process to streamline the alternative design path is needed for greater use of the performance-based alternative. With appropriate quality control measures, minor variations to the deemed-to-satisfy provisions could easily be managed by the current and future building surveying profession.

  1. Acquired experience on organizing 3D S.UN.COP: international course to support nuclear license by user training in the areas of scaling, uncertainty, and 3D thermal-hydraulics/neutron-kinetics coupled codes

    Energy Technology Data Exchange (ETDEWEB)

    Petruzzi, Alessandro; D'Auria, Francesco [University of Pisa, San Piero a Grado (Italy). Nuclear Research Group San Piero a Grado (GRNSPG); Galetti, Regina, E-mail: regina@cnen.gov.b [National Commission for Nuclear Energy (CNEN), Rio de Janeiro, RJ (Brazil); Bajs, Tomislav [University of Zagreb (Croatia). Fac. of Electrical Engineering and Computing. Dept. of Power Systems]; Reventos, Francesc [Technical University of Catalonia, Barcelona (Spain). Dept. of Physics and Nuclear Engineering]

    2011-07-01

    Thermal-hydraulic system computer codes are extensively used worldwide for the analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers, vendors, and research organizations. The computer code user represents a source of uncertainty that may significantly affect the results of system code calculations. Code user training and qualification represent an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes the experience gained in applying a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In addition, this paper presents the organization and the main features of the 3D S.UN.COP (scaling, uncertainty, and 3D coupled code calculations) seminars, during which particular emphasis is given to practical applications in connection with the licensing process of best estimate plus uncertainty methodologies, showing the designer, utility and regulatory approaches. (author)

  2. Acquired experience on organizing 3D S.UN.COP: international course to support nuclear license by user training in the areas of scaling, uncertainty, and 3D thermal-hydraulics/neutron-kinetics coupled codes

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco; Galetti, Regina; Bajs, Tomislav; Reventos, Francesc

    2011-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for the analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers, vendors, and research organizations. The computer code user represents a source of uncertainty that may significantly affect the results of system code calculations. Code user training and qualification represent an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes the experience gained in applying a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In addition, this paper presents the organization and the main features of the 3D S.UN.COP (scaling, uncertainty, and 3D coupled code calculations) seminars, during which particular emphasis is given to practical applications in connection with the licensing process of best estimate plus uncertainty methodologies, showing the designer, utility and regulatory approaches. (author)

  3. The Dit nuclear fuel assembly physics design code

    International Nuclear Information System (INIS)

    Jonsson, A.

    1987-01-01

    DIT is the Combustion Engineering, Inc. (C-E) nuclear fuel assembly design code. It belongs to a class of codes, all similar in structure and strategy, which may be characterized by the spectrum and spatial calculations being performed in 2D and in a single job step for the entire assembly. The forerunner of this class of codes is the U.K.A.E.A. WIMS code, the first version of which was completed 25 years ago. The structure and strategy of assembly spectrum codes have remained remarkably similar to the original concept thus proving its usefulness. As other organizations, including C-E, have developed their own versions of the concept, many important variations have been added which significantly influence the accuracy and performance of the resulting computational tool. This paper describes and discusses those features which are unique to the DIT code and which might be of interest to the community of fuel assembly physics design code users and developers

  4. New Technique for Improving Performance of LDPC Codes in the Presence of Trapping Sets

    Directory of Open Access Journals (Sweden)

    Mohamed Adnan Landolsi

    2008-06-01

    Trapping sets are considered the primary factor degrading the performance of low-density parity-check (LDPC) codes in the error-floor region. The effect of trapping sets on the performance of an LDPC code becomes worse as the code size decreases. One approach to tackle this problem is to minimize trapping sets during LDPC code design. However, while trapping sets can be reduced, their complete elimination is infeasible due to the presence of cycles in the underlying LDPC code bipartite graph. In this work, we introduce a new technique based on trapping set neutralization to minimize the negative effect of trapping sets under belief propagation (BP) decoding. Simulation results for random, progressive edge growth (PEG) and MacKay LDPC codes demonstrate the effectiveness of the proposed technique. The hardware cost of the proposed technique is also shown to be minimal.

  5. Source-term model for the SYVAC3-NSURE performance assessment code

    International Nuclear Information System (INIS)

    Rowat, J.H.; Rattan, D.S.; Dolinar, G.M.

    1996-11-01

    Radionuclide contaminants in wastes emplaced in disposal facilities will not remain in those facilities indefinitely. Engineered barriers will eventually degrade, allowing radioactivity to escape from the vault. The radionuclide release rate from a low-level radioactive waste (LLRW) disposal facility, the source term, is a key component in the performance assessment of the disposal system. This report describes the source-term model that has been implemented in Ver. 1.03 of the SYVAC3-NSURE (Systems Variability Analysis Code generation 3-Near Surface Repository) code. NSURE is a performance assessment code that evaluates the impact of near-surface disposal of LLRW through the groundwater pathway. The source-term model described here was developed for the Intrusion Resistant Underground Structure (IRUS) disposal facility, which is a vault that is to be located in the unsaturated overburden at AECL's Chalk River Laboratories. The processes included in the vault model are roof and waste package performance, and diffusion, advection and sorption of radionuclides in the vault backfill. The model presented here was developed for the IRUS vault; however, it is applicable to other near-surface disposal facilities. (author). 40 refs., 6 figs

  6. Validating the BISON fuel performance code to integral LWR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gamble, K.A., E-mail: Kyle.Gamble@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Pastore, G., E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gardner, R.J., E-mail: Russell.Gardner@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Liu, W., E-mail: Wenfeng.Liu@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States); Mai, A., E-mail: Anh.Mai@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States)

    2016-05-15

    Highlights: • The BISON multidimensional fuel performance code is being validated to integral LWR experiments. • Code and solution verification are necessary prerequisites to validation. • Fuel centerline temperature comparisons through all phases of fuel life are very reasonable. • Accuracy in predicting fission gas release is consistent with state-of-the-art modeling and the involved uncertainties. • Rod diameter comparisons are not satisfactory and further investigation is underway. - Abstract: BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Results demonstrate that (1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments, (2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties and (3) comparison

  7. Using National Drug Codes and drug knowledge bases to organize prescription records from multiple sources.

    Science.gov (United States)

    Simonaitis, Linas; McDonald, Clement J

    2009-10-01

    The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the in-patient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
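
    The coverage measurement described above amounts to translating each prescription's product code (directly, or through the formulary master for local codes) to an NDC and asking whether the drug knowledge base table contains it, weighted by prescription volume; the sketch below uses invented codes and counts purely to show the bookkeeping.

        # Hedged sketch of the coverage calculation: map local codes to NDCs via a
        # formulary master, then compute the share of prescription volume whose NDC
        # appears in a drug knowledge base (DKB) table.  All codes/counts are invented.
        formulary_master = {"LOC001": "00071015523", "LOC002": "00093105601"}
        dkb_ndcs = {"00071015523", "00378023501"}

        # (product_code, is_local_code, prescription_count)
        records = [("00071015523", False, 500), ("LOC001", True, 300),
                   ("LOC002", True, 150), ("00999999999", False, 50)]

        covered = total = 0
        for code, is_local, count in records:
            ndc = formulary_master.get(code) if is_local else code
            total += count
            if ndc in dkb_ndcs:
                covered += count
        print(f"DKB coverage by prescription volume: {100.0 * covered / total:.1f}%")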

  8. Performance analysis of LDPC codes on OOK terahertz wireless channels

    Science.gov (United States)

    Chun, Liu; Chang, Wang; Jun-Cheng, Cao

    2016-02-01

    Atmospheric absorption, scattering, and scintillation are the major causes of degradation of the transmission quality of terahertz (THz) wireless communications. An error control coding scheme based on low-density parity-check (LDPC) codes with a soft-decision decoding algorithm is proposed to improve the bit-error-rate (BER) performance of an on-off keying (OOK) modulated THz signal transmitted through the atmospheric channel. The THz wave propagation characteristics and the channel model in the atmosphere are set up. Numerical simulations validate the great performance of LDPC codes against atmospheric fading and demonstrate the huge potential for future ultra-high-speed (beyond Gbps) THz communications. Project supported by the National Key Basic Research Program of China (Grant No. 2014CB339803), the National High Technology Research and Development Program of China (Grant No. 2011AA010205), the National Natural Science Foundation of China (Grant Nos. 61131006, 61321492, and 61204135), the Major National Development Project of Scientific Instrument and Equipment (Grant No. 2011YQ150021), the National Science and Technology Major Project (Grant No. 2011ZX02707), the International Collaboration and Innovation Program on High Mobility Materials Engineering of the Chinese Academy of Sciences, and the Shanghai Municipal Commission of Science and Technology (Grant No. 14530711300).

  9. ATES/heat pump simulations performed with ATESSS code

    Science.gov (United States)

    Vail, L. W.

    1989-01-01

    Modifications to the Aquifer Thermal Energy Storage System Simulator (ATESSS) allow simulation of aquifer thermal energy storage (ATES)/heat pump systems. The heat pump algorithm requires a coefficient of performance (COP) relationship of the form COP = COP_base + alpha * (T_ref - T_base). Initial applications of the modified ATES code to synthetic building load data for two sizes of buildings in two U.S. cities showed an insignificant performance advantage of a series ATES heat pump system over a conventional groundwater heat pump system. The addition of algorithms for a cooling tower and a solar array improved performance slightly. Small values of alpha in the COP relationship are the principal reason for the limited improvement in system performance. Future studies at Pacific Northwest Laboratory (PNL) are planned to investigate methods to increase system performance using alternative system configurations and operating scenarios.
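
    The linear COP relationship required by the heat-pump algorithm is easy to evaluate; the sketch below does so with made-up coefficients, which is enough to see why a small alpha gives the warmer ATES source water only a modest advantage.

        # Hedged sketch of the heat-pump COP relationship used by the modified
        # ATESSS code: COP = COP_base + alpha * (T_ref - T_base).  The coefficient
        # values below are made up for illustration.
        def cop(t_ref, cop_base=3.0, alpha=0.05, t_base=10.0):
            """t_ref and t_base in deg C, alpha in 1/K."""
            return cop_base + alpha * (t_ref - t_base)

        for t in (10.0, 15.0, 20.0):          # progressively warmer source water
            print(f"T_ref = {t:4.1f} C  ->  COP = {cop(t):.2f}")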

  10. On the performance of diagonal lattice space-time codes for the quasi-static MIMO channel

    KAUST Repository

    Abediseid, Walid

    2013-06-01

    There has been tremendous work done on designing space-time codes for the quasi-static multiple-input multiple-output (MIMO) channel. All code designs to date focus on high performance, high rates, low-complexity encoding and decoding, or a combination of these criteria. In this paper, we analyze in detail the performance of diagonal lattice space-time codes under lattice decoding. We present both upper and lower bounds on the average error probability. We derive a new closed-form expression for the lower bound using the so-called sphere-packing bound. This bound represents the ultimate performance limit a diagonal lattice space-time code can achieve at any signal-to-noise ratio (SNR). The upper bound is simply derived using the union bound and demonstrates how the average error probability can be minimized by maximizing the minimum product distance of the code. © 2013 IEEE.
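
    Because the union-bound argument above turns on the minimum product distance of the code, it is sometimes useful to compute that quantity by brute force for a small constellation; the sketch below does so for a made-up rotated two-dimensional integer constellation, which stands in for the componentwise symbols of a diagonal code and is not taken from the paper.

        # Hedged sketch: brute-force minimum product distance,
        #   d_p,min = min_{x != y} prod_i |x_i - y_i|,
        # over a made-up rotated integer constellation (a stand-in for the
        # componentwise symbols of a diagonal lattice space-time code).
        import itertools
        import numpy as np

        theta = 0.5                                   # illustrative rotation angle (rad)
        G = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
        alphabet = [-1, 0, 1]                         # small PAM-like alphabet per layer
        codebook = [G @ np.array(u) for u in itertools.product(alphabet, repeat=2)]

        def min_product_distance(cb):
            return min(np.prod(np.abs(x - y))
                       for x, y in itertools.combinations(cb, 2))

        print("minimum product distance:", min_product_distance(codebook))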

  11. Generating performance portable geoscientific simulation code with Firedrake (Invited)

    Science.gov (United States)

    Ham, D. A.; Bercea, G.; Cotter, C. J.; Kelly, P. H.; Loriant, N.; Luporini, F.; McRae, A. T.; Mitchell, L.; Rathgeber, F.

    2013-12-01

    This presentation will demonstrate how a change in simulation programming paradigm can be exploited to deliver sophisticated simulation capability which is far easier to programme than are conventional models, is capable of exploiting different emerging parallel hardware, and is tailored to the specific needs of geoscientific simulation. Geoscientific simulation represents a grand challenge computational task: many of the largest computers in the world are tasked with this field, and the requirements of resolution and complexity of scientists in this field are far from being sated. However, single thread performance has stalled, even sometimes decreased, over the last decade, and has been replaced by ever more parallel systems: both as conventional multicore CPUs and in the emerging world of accelerators. At the same time, the needs of scientists to couple ever-more complex dynamics and parametrisations into their models makes the model development task vastly more complex. The conventional approach of writing code in low level languages such as Fortran or C/C++ and then hand-coding parallelism for different platforms by adding library calls and directives forces the intermingling of the numerical code with its implementation. This results in an almost impossible set of skill requirements for developers, who must simultaneously be domain science experts, numericists, software engineers and parallelisation specialists. Even more critically, it requires code to be essentially rewritten for each emerging hardware platform. Since new platforms are emerging constantly, and since code owners do not usually control the procurement of the supercomputers on which they must run, this represents an unsustainable development load. The Firedrake system, conversely, offers the developer the opportunity to write PDE discretisations in the high-level mathematical language UFL from the FEniCS project (http://fenicsproject.org). Non-PDE model components, such as parametrisations
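
    As an illustration of the high-level specification style referred to above, a standard Poisson problem written against the Firedrake/UFL interface looks roughly like the sketch below; this is the textbook demo form rather than code from the presentation, it assumes a working Firedrake installation, and the mesh, element and boundary choices are arbitrary.

        # Hedged sketch (standard Poisson demo form) of the UFL/Firedrake style of
        # writing a discretisation; illustrative only, and it needs a Firedrake
        # installation to run.
        from firedrake import *  # noqa: F403 (Firedrake demos conventionally import *)

        mesh = UnitSquareMesh(32, 32)
        V = FunctionSpace(mesh, "CG", 1)              # piecewise-linear elements
        u, v = TrialFunction(V), TestFunction(V)
        f = Constant(1.0)

        a = inner(grad(u), grad(v)) * dx              # weak form of -div(grad u) = f
        L = f * v * dx
        bc = DirichletBC(V, 0.0, (1, 2, 3, 4))        # homogeneous Dirichlet boundary

        uh = Function(V)
        solve(a == L, uh, bcs=[bc])                   # parallel execution is handled by
                                                      # the framework, not by the user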

  12. SNR and BER Models and the Simulation for BER Performance of Selected Spectral Amplitude Codes for OCDMA

    Directory of Open Access Journals (Sweden)

    Abdul Latif Memon

    2014-01-01

    Many encoding schemes are used in OCDMA (Optical Code Division Multiple Access) networks, but SAC (Spectral Amplitude Codes) is widely used. It is considered an effective arrangement for eliminating the dominant noise called MAI (Multiple Access Interference). Various codes are studied and evaluated with respect to their performance against three noises, namely shot noise, thermal noise and PIIN (Phase Induced Intensity Noise). Mathematical models for SNR (Signal to Noise Ratio) and BER (Bit Error Rate) are discussed, where the SNRs are calculated and the BERs are computed using the Gaussian distribution assumption. After analyzing the results mathematically, it is concluded that the ZCC (Zero Cross Correlation) code performs better than the other selected SAC codes and can serve a larger number of active users than the other codes do. At various receiver power levels, the analysis points out that RDC (Random Diagonal Code) also performs better than the other codes. For the power interval between -10 and -20 dBm, the performance of RDC is better than that of ZCC. Their low BER values suggest that these codes should be part of an efficient and cost-effective OCDMA access network in the future.
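
    Under the Gaussian approximation named above, the bit error rate follows directly from the computed SNR; the sketch below uses the relation BER = (1/2) erfc(sqrt(SNR/8)) that is commonly adopted in SAC-OCDMA analyses, which may differ from the exact expressions of this study, and the SNR values are arbitrary.

        # Hedged sketch: BER from SNR under the Gaussian approximation,
        #   BER = 0.5 * erfc(sqrt(SNR / 8)),
        # the form commonly adopted in SAC-OCDMA analyses.  The SNR itself would come
        # from the code-specific noise model (shot + thermal + PIIN), not shown here.
        from math import erfc, sqrt

        def ber_from_snr(snr_linear):
            return 0.5 * erfc(sqrt(snr_linear / 8.0))

        for snr_db in (10, 15, 20, 25):
            snr = 10.0 ** (snr_db / 10.0)
            print(f"SNR = {snr_db:2d} dB  ->  BER = {ber_from_snr(snr):.2e}")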

  13. Development of a general coupling interface for the fuel performance code TRANSURANUS – Tested with the reactor dynamics code DYN3D

    International Nuclear Information System (INIS)

    Holt, L.; Rohde, U.; Seidl, M.; Schubert, A.; Van Uffelen, P.; Macián-Juan, R.

    2015-01-01

    Highlights: • A general coupling interface was developed for couplings of the TRANSURANUS code. • With this new tool simplified fuel behavior models in codes can be replaced. • Applicable e.g. for several reactor types and from normal operation up to DBA. • The general coupling interface was applied to the reactor dynamics code DYN3D. • The new coupled code system DYN3D–TRANSURANUS was successfully tested for RIA. - Abstract: A general interface is presented for coupling the TRANSURANUS fuel performance code with thermal hydraulics system, sub-channel thermal hydraulics, computational fluid dynamics (CFD) or reactor dynamics codes. As first application the reactor dynamics code DYN3D was coupled at assembly level in order to describe the fuel behavior in more detail. In the coupling, DYN3D provides process time, time-dependent rod power and thermal hydraulics conditions to TRANSURANUS, which in case of the two-way coupling approach transfers parameters like fuel temperature and cladding temperature back to DYN3D. Results of the coupled code system are presented for the reactivity transient scenario, initiated by control rod ejection. More precisely, the two-way coupling approach systematically calculates higher maximum values for the node fuel enthalpy. These differences can be explained thanks to the greater detail in fuel behavior modeling. The numerical performance for DYN3D–TRANSURANUS was proved to be fast and stable. The coupled code system can therefore improve the assessment of safety criteria, at a reasonable computational cost
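
    A minimal sketch of the two-way coupling loop described above is given below. The functions dyn3d_step and transuranus_step are hypothetical stand-ins for the real code interfaces, and the exchanged quantities and numbers are assumptions for illustration only.

```python
# Illustrative-only coupling loop: DYN3D supplies rod power and thermal-hydraulic
# conditions, TRANSURANUS returns detailed fuel-rod variables used as feedback.
def dyn3d_step(t, dt, fuel_feedback):
    """Stand-in for a DYN3D time step (hypothetical interface)."""
    rod_power = 40.0e3                                    # W per rod, placeholder
    coolant = {"T_coolant": 560.0, "htc": 3.0e4}          # K, W/(m^2 K), placeholders
    return rod_power, coolant

def transuranus_step(t, dt, rod_power, coolant):
    """Stand-in for a TRANSURANUS step (hypothetical interface)."""
    return {"T_fuel_max": 1200.0, "T_clad": 620.0}        # K, placeholders

t, dt, t_end = 0.0, 1.0e-3, 0.5
fuel_feedback = None                # no detailed feedback before the first step
while t < t_end:
    rod_power, coolant = dyn3d_step(t, dt, fuel_feedback)         # neutronics/TH step
    fuel_feedback = transuranus_step(t, dt, rod_power, coolant)   # detailed fuel behaviour
    t += dt                         # two-way coupling: feedback enters the next DYN3D call
```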

  14. PORST: a computer code to analyze the performance of retrofitted steam turbines

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C.; Hwang, I.T.

    1980-09-01

    The computer code PORST was developed to analyze the performance of a retrofitted steam turbine that is converted from a single generating to a cogenerating unit for purposes of district heating. Two retrofit schemes are considered: one converts a condensing turbine to a backpressure unit; the other allows the crossover extraction of steam between turbine cylinders. The code can analyze the performance of a turbine operating at: (1) valve-wide-open condition before retrofit, (2) partial load before retrofit, (3) valve-wide-open after retrofit, and (4) partial load after retrofit.

  15. On the Performance of a Multi-Edge Type LDPC Code for Coded Modulation

    NARCIS (Netherlands)

    Cronie, H.S.

    2005-01-01

    We present a method to combine error-correction coding and spectral-efficient modulation for transmission over the Additive White Gaussian Noise (AWGN) channel. The code employs signal shaping which can provide a so-called shaping gain. The code belongs to the family of sparse graph codes for which

  16. Dexter - A one-dimensional code for calculating thermionic performance of long converters.

    Science.gov (United States)

    Sawyer, C. D.

    1971-01-01

    This paper describes a versatile code for computing the coupled thermionic electric-thermal performance of long thermionic converters in which the temperature and voltage variations cannot be neglected. The code is capable of accounting for a variety of external electrical connection schemes, coolant flow paths and converter failures by partial shorting. Example problem solutions are given.

  17. Reliability issues and solutions for coding social communication performance in classroom settings.

    Science.gov (United States)

    Olswang, Lesley B; Svensson, Liselotte; Coggins, Truman E; Beilinson, Jill S; Donaldson, Amy L

    2006-10-01

    The purpose of this study was to explore the utility of time-interval analysis for documenting the reliability of coding social communication performance of children in classroom settings. Of particular interest was finding a method for determining whether independent observers could reliably judge both occurrence and duration of ongoing behavioral dimensions for describing social communication performance. Four coders participated in this study. They observed and independently coded 6 social communication behavioral dimensions using handheld computers. The dimensions were mutually exclusive and accounted for all verbal and nonverbal productions during a specified time frame. The technology allowed for coding frequency and duration for each entered code. Data were collected from 20 different 2-min video segments of children in kindergarten through 3rd-grade classrooms. Data were analyzed for interobserver and intraobserver agreements using time-interval sorting and Cohen's kappa. Further, interval size and total observation length were manipulated to determine their influence on reliability. The data revealed interval sorting and kappa to be a suitable method for examining reliability of occurrence and duration of ongoing social communication behavioral dimensions. Nearly all comparisons yielded medium to large kappa values; interval size and length of observation minimally affected results. Implications: The analysis procedure described in this research solves a challenge in reliability: comparing coding by independent observers of both occurrence and duration of behaviors. Results indicate the utility of a new coding taxonomy and technology for application in online observations of social communication in a classroom setting.
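
    As a hedged sketch of the interval-sorting idea described above (not the authors' software), the following resamples two observers' timed code records into fixed intervals and computes Cohen's kappa on the resulting label sequences; the interval size, observation length and example records are illustrative assumptions.

```python
# Resample continuous coded records into fixed intervals and compute Cohen's kappa.
from collections import Counter

def to_intervals(events, interval=1.0, total=120.0):
    """events: list of (start, end, code); one label per interval (code at midpoint)."""
    labels, t = [], 0.0
    while t < total:
        mid = t + interval / 2
        labels.append(next((c for s, e, c in events if s <= mid < e), "none"))
        t += interval
    return labels

def cohens_kappa(a, b):
    """Plain Cohen's kappa between two equally long label sequences."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    p_exp = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

coder1 = [(0, 40, "verbal"), (40, 120, "nonverbal")]    # illustrative coded records
coder2 = [(0, 35, "verbal"), (35, 120, "nonverbal")]
kappa = cohens_kappa(to_intervals(coder1), to_intervals(coder2))
print(f"kappa = {kappa:.2f}")      # agreement on both occurrence and duration
```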

  18. Survey of nuclear fuel-cycle codes

    International Nuclear Information System (INIS)

    Thomas, C.R.; de Saussure, G.; Marable, J.H.

    1981-04-01

    A two-month survey of nuclear fuel-cycle models was undertaken. This report presents the information forthcoming from the survey. Of the nearly thirty codes reviewed in the survey, fifteen of these codes have been identified as potentially useful in fulfilling the tasks of the Nuclear Energy Analysis Division (NEAD) as defined in their FY 1981-1982 Program Plan. Six of the fifteen codes are given individual reviews. The individual reviews address such items as the funding agency, the author and organization, the date of completion of the code, adequacy of documentation, computer requirements, history of use, variables that are input and forecast, type of reactors considered, part of fuel cycle modeled and scope of the code (international or domestic, long-term or short-term, regional or national). The report recommends that the Model Evaluation Team perform an evaluation of the EUREKA uranium mining and milling code

  19. Organization Features and School Performance

    OpenAIRE

    Atkins, Lois Major

    2005-01-01

    The purpose of this study was to determine the odds of school organization features predicting schools meeting district or state performance goals. The school organization features were organizational complexity, shared decision making, and leadership behavior. The dependent variable was school performance, operationally defined as a principal's yes response or no response to the question, "did your school meet district or state performance goals." The independent variables representing...

  20. KUGEL: a thermal, hydraulic, fuel performance, and gaseous fission product release code for pebble bed reactor core analysis

    International Nuclear Information System (INIS)

    Shamasundar, B.I.; Fehrenbach, M.E.

    1981-05-01

    The KUGEL computer code is designed to perform thermal/hydraulic analysis and coated-fuel particle performance calculations for axisymmetric pebble bed reactor (PBR) cores. This computer code was developed as part of a Department of Energy (DOE)-funded study designed to verify the published core performance data on PBRs. The KUGEL code is designed to interface directly with the 2DB code, a two-dimensional neutron diffusion code, to obtain distributions of thermal power, fission rate, fuel burnup, and fast neutron fluence, which are needed for thermal/hydraulic and fuel performance calculations. The code is variably dimensioned so that problem size can be easily varied. An interpolation routine allows variable mesh size to be used between the 2DB output and the two-dimensional thermal/hydraulic calculations

  1. Probabilistic evaluation of fuel element performance by the combined use of a fast running simplistic and a detailed deterministic fuel performance code

    International Nuclear Information System (INIS)

    Misfeldt, I.

    1980-01-01

    A comprehensive evaluation of fuel element performance requires a probabilistic fuel code supported by a well bench-marked deterministic code. This paper presents an analysis of a SGHWR ramp experiment, where the probabilistic fuel code FRP is utilized in combination with the deterministic fuel models FFRS and SLEUTH/SEER. The statistical methods employed in FRP are Monte Carlo simulation or a low-order Taylor approximation. The fast-running simplistic fuel code FFRS is used for the deterministic simulations, whereas simulations with SLEUTH/SEER are used to verify the predictions of FFRS. The ramp test was performed with a SGHWR fuel element, where 9 of the 36 fuel pins failed. There seemed to be good agreement between the deterministic simulations and the experiment, but the statistical evaluation shows that the uncertainty on the important performance parameters is too large for this ''nice'' result. The analysis does therefore indicate a discrepancy between the experiment and the deterministic code predictions. Possible explanations for this disagreement are discussed. (author)
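
    To make the two statistical methods named above concrete, here is a hedged sketch (not the FRP code) that propagates input uncertainty through an arbitrary toy performance parameter, once by Monte Carlo sampling and once by a first-order Taylor approximation of the variance:

```python
# Toy uncertainty propagation: Monte Carlo sampling vs. first-order Taylor expansion.
# g() and all numbers are illustrative placeholders, not a real fuel model.
import math, random

def g(power, gap):
    return 1000.0 + 0.02 * power + 5000.0 * gap     # toy "performance parameter"

mu = {"power": 35.0e3, "gap": 0.05}                 # means: linear power (W/m), gap (mm)
sd = {"power": 2.0e3,  "gap": 0.01}                 # standard deviations

# Monte Carlo simulation
random.seed(0)
samples = [g(random.gauss(mu["power"], sd["power"]),
             random.gauss(mu["gap"], sd["gap"])) for _ in range(20000)]
mc_mean = sum(samples) / len(samples)
mc_std = math.sqrt(sum((s - mc_mean) ** 2 for s in samples) / (len(samples) - 1))

# First-order Taylor approximation: Var(g) ~ sum_i (dg/dx_i)^2 * Var(x_i)
dg_dpower, dg_dgap = 0.02, 5000.0                   # partial derivatives of g
taylor_std = math.sqrt((dg_dpower * sd["power"]) ** 2 + (dg_dgap * sd["gap"]) ** 2)

print(f"Monte Carlo : mean = {mc_mean:.1f}, std = {mc_std:.1f}")
print(f"Taylor (1st): std = {taylor_std:.1f}")      # matches MC here because g is linear
```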

  2. DEXTER: A one-dimensional code for calculating thermionic performance of long converters

    Science.gov (United States)

    Sawyer, C. D.

    1971-01-01

    A versatile code is described for computing the coupled thermionic electric-thermal performance of long thermionic converters in which the temperature and voltage variations cannot be neglected. The code is capable of accounting for a variety of external electrical connection schemes, coolant flow paths and converter failures by partial shorting. Example problem solutions are included along with a user's manual.

  3. Performance, Accuracy and Efficiency Evaluation of a Three-Dimensional Whole-Core Neutron Transport Code AGENT

    International Nuclear Information System (INIS)

    Jevremovic, Tatjana; Hursin, Mathieu; Satvat, Nader; Hopkins, John; Xiao, Shanjie; Gert, Godfree

    2006-01-01

    The AGENT (Arbitrary Geometry Neutron Transport) code, an open-architecture reactor modeling tool, is a deterministic neutron transport code for two- or three-dimensional heterogeneous neutronic design and analysis of whole reactor cores regardless of geometry type and material configuration. The AGENT neutron transport methodology is applicable to all generations of nuclear power and research reactors. It combines three theories: (1) the theory of R-functions, used to generate real three-dimensional whole cores of square, hexagonal or triangular cross section, (2) the planar method of characteristics, used to solve isotropic neutron transport in non-homogenized two-dimensional (2D) reactor slices, and (3) one-dimensional diffusion theory, used to couple the planar and axial neutron tracks through the transverse leakage and angular mesh-wise flux values. The R-function geometrical module allows a sequential building of the layers of geometry and automatic sub-meshing based on the network of domain functions. The simplicity of geometry description and the selection of parameters for accurate treatment of neutron propagation are achieved through the Boolean algebraic, hierarchical organization of simple primitives into complex domains (both being represented with corresponding domain functions). The accuracy is comparable to that of Monte Carlo codes and is obtained by following neutron propagation through real geometrical domains that do not require homogenization or simplification. The efficiency is maintained through a set of acceleration techniques introduced at all important calculation levels. The flux solution incorporates power iteration with two different acceleration techniques: Coarse Mesh Re-balancing (CMR) and Coarse Mesh Finite Difference (CMFD). The stand-alone, originally developed graphical user interface of the AGENT code design environment allows the user to view and verify input data by displaying the geometry and material distribution. The user can also view the output data such
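
    As a hedged aside on the flux solution mentioned above, the sketch below runs a bare power iteration for the dominant eigenvalue (k-effective) of a small generalized eigenvalue problem; the 2x2 operators are arbitrary illustrative numbers and the CMR/CMFD acceleration is omitted.

```python
# Bare power iteration for F*phi = k*M*phi (loss operator M, fission operator F).
# The matrices are illustrative placeholders, not AGENT data.
import numpy as np

M = np.array([[0.30, -0.02],
              [-0.05, 0.25]])
F = np.array([[0.18, 0.10],
              [0.00, 0.00]])

phi = np.ones(2)                          # initial flux guess
k = 1.0                                   # initial k-effective guess
for _ in range(200):
    src_old = F @ phi                     # fission source from current flux
    phi = np.linalg.solve(M, src_old / k) # stand-in for the transport/diffusion sweep
    src_new = F @ phi
    k *= src_new.sum() / src_old.sum()    # eigenvalue update from source ratio
    phi /= np.linalg.norm(phi)            # keep the flux vector bounded
print(f"k-effective ~ {k:.5f}")
```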

  4. How could the replica method improve accuracy of performance assessment of channel coding?

    Energy Technology Data Exchange (ETDEWEB)

    Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama 226-8502 (Japan)], E-mail: kaba@dis.titech.ac.jp

    2009-12-01

    We explore the relation between the techniques of statistical mechanics and information theory for assessing the performance of channel coding. We base our study on a framework developed by Gallager in IEEE Trans. Inform. Theory IT-11, 3 (1965), where the minimum decoding error probability is upper-bounded by an average of a generalized Chernoff's bound over a code ensemble. We show that the resulting bound in the framework can be directly assessed by the replica method, which has been developed in statistical mechanics of disordered systems, whereas in Gallager's original methodology further replacement by another bound utilizing Jensen's inequality is necessary. Our approach associates a seemingly ad hoc restriction with respect to an adjustable parameter for optimizing the bound with a phase transition between two replica symmetric solutions, and can improve the accuracy of performance assessments of general code ensembles including low density parity check codes, although its mathematical justification is still open.
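
    For background (standard notation, not quoted from the paper), the random-coding bound from Gallager's 1965 framework that the abstract's replica analysis starts from can be written as:

```latex
P_e \;\le\; \exp\{-N\,E_r(R)\},\qquad
E_r(R) \;=\; \max_{0\le\rho\le 1}\bigl[E_0(\rho,Q)-\rho R\bigr],\qquad
E_0(\rho,Q) \;=\; -\ln\sum_{y}\Bigl[\sum_{x} Q(x)\,P(y\mid x)^{1/(1+\rho)}\Bigr]^{1+\rho},
```

    where N is the block length, R the rate in nats per channel use, Q the input distribution and P(y|x) the channel law; the abstract's point is that the optimization implicit in this bound can be assessed directly with the replica method rather than through a further Jensen-inequality step.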

  5. The CMSSW benchmarking suite: Using HEP code to measure CPU performance

    International Nuclear Information System (INIS)

    Benelli, G

    2010-01-01

    The demanding computing needs of the CMS experiment require thoughtful planning and management of its computing infrastructure. A key factor in this process is the use of realistic benchmarks when assessing the computing power of the different architectures available. In recent years a discrepancy has been observed between the CPU performance estimates given by the reference benchmark for HEP computing (SPECint) and actual performances of HEP code. Making use of the CPU performance tools from the CMSSW performance suite, comparative CPU performance studies have been carried out on several architectures. A benchmarking suite has been developed and integrated in the CMSSW framework, to allow computing centers and interested third parties to benchmark architectures directly with CMSSW. The CMSSW benchmarking suite can be used out of the box, to test and compare several machines in terms of CPU performance and report with the wanted level of detail the different benchmarking scores (e.g. by processing step) and results. In this talk we describe briefly the CMSSW software performance suite, and in detail the CMSSW benchmarking suite client/server design, the performance data analysis and the available CMSSW benchmark scores. The experience in the use of HEP code for benchmarking will be discussed and CMSSW benchmark results presented.

  6. Oxide fuel pin transient performance analysis and design with the TEMECH code

    International Nuclear Information System (INIS)

    Bard, F.E.; Dutt, S.P.; Hinman, C.A.; Hunter, C.W.; Pitner, A.L.

    1986-01-01

    The TEMECH code is a fast-running, thermal-mechanical-hydraulic, analytical program used to evaluate the transient performance of LMR oxide fuel pins. The code calculates pin deformation and failure probability due to fuel-cladding differential thermal expansion, expansion of fuel upon melting, and fission gas pressurization. The mechanistic fuel model in the code accounts for fuel cracking, crack closure, porosity decrease, and the temperature dependence of fuel creep through the course of the transient. Modeling emphasis has been placed on results obtained from Fuel Cladding Transient Test (FCTT) testing, Transient Fuel Deformation (TFD) tests and TREAT integral fuel pin experiments

  7. Staff Performance Evaluation in Public Organizations

    Directory of Open Access Journals (Sweden)

    Drumea C.

    2014-12-01

    Full Text Available In public organizations, staff performance is difficult to measure in the absence of overall quantitative performance indicators. There are also the qualitative indicators that give an overview of staff motivation, drive, ability, commitment to values and teamwork. These aspects are even harder to illustrate, in the private and public sectors alike. In both cases, measuring staff performance at work, as well as its contribution to the global performance of the organization, is a difficult task which in practice has different approaches. Accordingly, this paper discusses the system indicators and performance triggers used in UN-affiliated international organizations in order to adjust staff classification and benefits to staff performance.

  8. Development and validation of the ENIGMA code for MOX fuel performance modelling

    International Nuclear Information System (INIS)

    Palmer, I.; Rossiter, G.; White, R.J.

    2000-01-01

    The ENIGMA fuel performance code has been under development in the UK since the mid-1980s with contributions made by both the fuel vendor (BNFL) and the utility (British Energy). In recent years it has become the principal code for UO2 fuel licensing for both PWR and AGR reactor systems in the UK and has also been used by BNFL in support of overseas UO2 and MOX fuel business. A significant new programme of work has recently been initiated by BNFL to further develop the code specifically for MOX fuel application. Model development is proceeding hand in hand with a major programme of MOX fuel testing and PIE studies, with the objective of producing a fuel modelling code suitable for mechanistic analysis, as well as for licensing applications. This paper gives an overview of the model developments being undertaken and of the experimental data being used to underpin and to validate the code. The paper provides a summary of the code development programme together with specific examples of new models produced. (author)

  9. Impact of intra-flow network coding on the relay channel performance: an analytical study

    OpenAIRE

    Apavatjrut , Anya; Goursaud , Claire; Jaffrès-Runser , Katia; Gorce , Jean-Marie

    2012-01-01

    One of the most powerful ways to achieve transmission reliability over wireless links is to employ efficient coding techniques. This paper investigates the performance of a transmission over a relay channel where information is protected by two layers of coding. In the first layer, transmission reliability is ensured by fountain coding at the source. The second layer incorporates network coding at the relay node. Thus, fountain coded packets are re-encoded at the relay...
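
    As a hedged toy sketch of the two coding layers named above (not the authors' scheme), the source below emits fountain-style packets built as random XOR combinations of data blocks, and the relay re-encodes received packets by XOR-combining them again; block sizes, degrees and packet choices are arbitrary illustrative assumptions.

```python
# Toy two-layer coding: fountain-style packets at the source, XOR network coding at the relay.
import random

random.seed(0)
BLOCKS = [bytes([i] * 16) for i in range(8)]          # 8 source blocks of 16 bytes

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def fountain_packet():
    """Source layer: XOR a random subset (the packet 'degree') of the source blocks."""
    idx = frozenset(random.sample(range(len(BLOCKS)), k=random.randint(1, 3)))
    payload = bytes(16)
    for i in idx:
        payload = xor(payload, BLOCKS[i])
    return idx, payload

def relay_recombine(p1, p2):
    """Relay layer: network-code two received packets into one forwarded packet."""
    (idx1, pay1), (idx2, pay2) = p1, p2
    return idx1 ^ idx2, xor(pay1, pay2)               # symmetric difference of block sets

pkt_a, pkt_b = fountain_packet(), fountain_packet()
forwarded = relay_recombine(pkt_a, pkt_b)
print("relay forwards a combination of blocks:", sorted(forwarded[0]))
```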

  10. Typical performance of regular low-density parity-check codes over general symmetric channels

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, Toshiyuki [Department of Electronics and Information Engineering, Tokyo Metropolitan University, 1-1 Minami-Osawa, Hachioji-shi, Tokyo 192-0397 (Japan); Saad, David [Neural Computing Research Group, Aston University, Aston Triangle, Birmingham B4 7ET (United Kingdom)

    2003-10-31

    Typical performance of low-density parity-check (LDPC) codes over a general binary-input output-symmetric memoryless channel is investigated using methods of statistical mechanics. Relationship between the free energy in statistical-mechanics approach and the mutual information used in the information-theory literature is established within a general framework; Gallager and MacKay-Neal codes are studied as specific examples of LDPC codes. It is shown that basic properties of these codes known for particular channels, including their potential to saturate Shannon's bound, hold for general symmetric channels. The binary-input additive-white-Gaussian-noise channel and the binary-input Laplace channel are considered as specific channel models.

  11. Typical performance of regular low-density parity-check codes over general symmetric channels

    International Nuclear Information System (INIS)

    Tanaka, Toshiyuki; Saad, David

    2003-01-01

    Typical performance of low-density parity-check (LDPC) codes over a general binary-input output-symmetric memoryless channel is investigated using methods of statistical mechanics. Relationship between the free energy in statistical-mechanics approach and the mutual information used in the information-theory literature is established within a general framework; Gallager and MacKay-Neal codes are studied as specific examples of LDPC codes. It is shown that basic properties of these codes known for particular channels, including their potential to saturate Shannon's bound, hold for general symmetric channels. The binary-input additive-white-Gaussian-noise channel and the binary-input Laplace channel are considered as specific channel models

  12. A ''SuperCode'' for performing systems analysis of tokamak experiments and reactors

    International Nuclear Information System (INIS)

    Haney, S.W.; Barr, W.L.; Crotinger, J.A.; Perkins, L.J.; Solomon, C.J.; Chaniotakis, E.A.; Freidberg, J.P.; Wei, J.; Galambos, J.D.; Mandrekas, J.

    1992-01-01

    A new code, named the ''SUPERCODE,'' has been developed to fill the gap between currently available zero dimensional systems codes and highly sophisticated, multidimensional plasma performance codes. The former are comprehensive in content, fast to execute, but rather simple in terms of the accuracy of the physics and engineering models. The latter contain state-of-the-art plasma physics modelling but are limited in engineering content and time consuming to run. The SUPERCODE upgrades the reliability and accuracy of systems codes by calculating the self-consistent 1 1/2-dimensional MHD-transport plasma evolution in a realistic engineering environment. By a combination of variational techniques and careful formulation, there is only a modest increase in CPU time over 0-D runs, thereby making the SUPERCODE suitable for use as a systems studies tool. In addition, considerable effort has been expended to make the code user- and programming-friendly, as well as operationally flexible, with the hope of encouraging wide usage throughout the fusion community

  13. Performance analysis of a decoding algorithm for algebraic-geometry codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Jensen, Helge Elbrønd; Nielsen, Rasmus Refslund

    1999-01-01

    The fast decoding algorithm for one point algebraic-geometry codes of Sakata, Elbrond Jensen, and Hoholdt corrects all error patterns of weight less than half the Feng-Rao minimum distance. In this correspondence we analyze the performance of the algorithm for heavier error patterns. It turns out...

  14. Development of a general coupling interface for the fuel performance code TRANSURANUS tested with the reactor dynamics code DYN3D

    International Nuclear Information System (INIS)

    Holt, L.; Rohde, U.; Seidl, M.; Schubert, A.; Van Uffelen, P.

    2013-01-01

    Several institutions plan to couple the fuel performance code TRANSURANUS developed by the European Institute for Transuranium Elements with their own codes. One of these codes is the reactor dynamics code DYN3D maintained by the Helmholtz-Zentrum Dresden - Rossendorf. DYN3D was developed originally for VVER type reactors and was extended later to western type reactors. Usually, the fuel rod behavior is modeled in thermal hydraulics and neutronic codes in a simplified manner. The main idea of this coupling is to describe the fuel rod behavior in the frame of core safety analysis in a more detailed way, e.g. including the influence of the high burn-up structure, geometry changes and fission gas release. It makes it possible to benefit from the improved computational power and software achieved over the last two decades. The coupling interface was developed in a general way from the beginning. Hence it can easily be used by other codes as well for a coupling with TRANSURANUS. The user can choose between a one-way as well as a two-way online coupling option. For a one-way online coupling, DYN3D provides only the time-dependent rod power and thermal hydraulics conditions to TRANSURANUS, but the fuel performance code does not transfer any variable back to DYN3D. In a two-way online coupling, TRANSURANUS in addition transfers parameters like fuel temperature and cladding temperature back to DYN3D. This list of variables can be extended easily by geometric and further variables of interest. First results of the code system DYN3D-TRANSURANUS will be presented for a control rod ejection transient in a modern western type reactor. Pre-analyses already show that detailed fuel rod behavior modeling will influence the thermal hydraulics and hence also the neutronics due to the Doppler reactivity effect of the fuel temperature. The coupled code system therefore has the potential to improve the assessment of safety criteria. The developed code system DYN3D-TRANSURANUS can be used also

  15. Development of Pflotran Code for Waste Isolation Pilot Plant Performance Assessment

    Science.gov (United States)

    Zeitler, T.; Day, B. A.; Frederick, J.; Hammond, G. E.; Kim, S.; Sarathi, R.; Stein, E.

    2017-12-01

    The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. There is a current effort to enhance WIPP PA capabilities through the further development of the PFLOTRAN software, a state-of-the-art massively parallel subsurface flow and reactive transport code. Benchmark testing of the individual WIPP-specific process models implemented in PFLOTRAN (e.g., gas generation, chemistry, creep closure, actinide transport, and waste form) has been performed, including results comparisons for PFLOTRAN and existing WIPP PA codes. Additionally, enhancements to the subsurface hydrologic flow model have been made. Repository-scale testing has also been performed for the modified PFLOTRAN code and detailed results will be presented. Ultimately, improvements to the current computational environment will result in greater detail and flexibility in the repository model due to a move from a two-dimensional calculation grid to a three-dimensional representation. The result of the effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future for use in compliance recertification applications (CRAs) submitted to the EPA. Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of

  16. The grout/glass performance assessment code system (GPACS) with verification and benchmarking

    International Nuclear Information System (INIS)

    Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.

    1994-12-01

    GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) glass performance assessment and many other applications including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway

  17. Performance studies of the parallel VIM code

    International Nuclear Information System (INIS)

    Shi, B.; Blomquist, R.N.

    1996-01-01

    In this paper, the authors evaluate the performance of the parallel version of the VIM Monte Carlo code on the IBM SPx at the High Performance Computing Research Facility at ANL. Three test problems with contrasting computational characteristics were used to assess effects on performance. A statistical method for estimating the inefficiencies due to load imbalance and communication is also introduced. VIM is a large-scale continuous energy Monte Carlo radiation transport program and was parallelized using history partitioning, the master/worker approach, and the p4 message passing library. Dynamic load balancing is accomplished when the master processor assigns chunks of histories to workers that have completed a previously assigned task, accommodating variations in the lengths of histories, processor speeds, and worker loads. At the end of each batch (generation), the fission sites and tallies are sent from each worker to the master process, contributing to the parallel inefficiency. All communications are between master and workers, and are serial. The SPx is a scalable 128-node parallel supercomputer with high-performance Omega switches of 63 microsec latency and 35 MBytes/sec bandwidth. For uniform and reproducible performance, they used only the 120 identical regular processors (IBM RS/6000) and excluded the remaining eight planet nodes, which may be loaded by others' jobs
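
    A hedged sketch of the master/worker history partitioning described above, using Python's standard library in place of the p4 message-passing layer (the chunk sizes and worker count are illustrative):

```python
# Chunks of Monte Carlo "histories" are pulled by whichever worker becomes free,
# approximating the dynamic load balancing described in the abstract.
from concurrent.futures import ProcessPoolExecutor, as_completed
import random

def simulate_chunk(n_histories: int, seed: int) -> float:
    """Stand-in for tracking a chunk of histories; returns a toy tally."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n_histories))

if __name__ == "__main__":
    chunks = [(5000, seed) for seed in range(64)]          # one batch split into 64 chunks
    tally = 0.0
    with ProcessPoolExecutor(max_workers=4) as pool:       # four "workers"
        futures = [pool.submit(simulate_chunk, n, s) for n, s in chunks]
        for fut in as_completed(futures):                  # results gathered as workers finish
            tally += fut.result()
    print(f"batch tally = {tally:.1f}")
```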

  18. Coding Instead of Splitting - Algebraic Combinations in Time and Space

    Science.gov (United States)

    2016-06-09

    Performing organization: Massachusetts Institute of Technology, 77 Massachusetts Ave, Cambridge, MA 02139. ...provide simulation results to illustrate the performance of our algorithms. Network Coded Distributed Storage: In distributed cloud storage, fault tolerance

  19. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran.

    Science.gov (United States)

    Mohajjel-Aghdam, Alireza; Hassankhani, Hadi; Zamanzadeh, Vahid; Khameneh, Saied; Moghaddam, Sara

    2013-09-01

    The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care more than routine care. Today, a worldwide definition of the professional ethic code has been established based on human and ethical issues in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethic codes. The aim of this study is to assess knowledge and performance regarding nursing ethic codes from nurses' and patients' perspectives. A descriptive study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics, independent t-test, ANOVA and Pearson correlation coefficient in SPSS 13. Most of the nurses were female, married and educated to BS degree; 86.4% of them were aware of the ethic codes, and 91.9% of nurses and 41.8% of patients reported that nurses respect the ethic codes. Nurses' and patients' perspectives on the ethic codes differed significantly. A significant relationship was found between nurses' knowledge of ethic codes and job satisfaction and complaints about ethical performance. According to the results, attention to teaching ethic codes in the nursing curriculum for students and continuous education for staff is proposed; on the other hand, recognizing failures of the health system, optimizing nursing care, informing patients about nursing ethic codes, promoting patient rights and achieving patient satisfaction can minimize the differences between the two perspectives.

  20. The error performance analysis over cyclic redundancy check codes

    Science.gov (United States)

    Yoon, Hee B.

    1991-06-01

    The burst error is generated in digital communication networks by various unpredictable conditions, which occur at high error rates, for short durations, and can impact services. To completely describe a burst error one has to know the bit pattern. This is impossible in practice on working systems. Therefore, under the memoryless binary symmetric channel (MBSC) assumptions, the performance evaluation or estimation scheme for digital signal 1 (DS1) transmission systems carrying live traffic is an interesting and important problem. This study presents some analytical methods leading to efficient burst-error detection algorithms using cyclic redundancy check (CRC) codes. The definition of burst error is introduced using three different models. Among the three burst error models, the mathematical model is used in this study. The probability density function, f(b), of a burst error of length b is proposed. The performance of CRC-n codes is evaluated and analyzed using f(b) through a computer simulation model of burst errors within a CRC block. The simulation results show that the mean block burst error tends to approach the pattern of burst error generated by random bit errors.
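
    For context, the sketch below shows the basic CRC mechanics that the burst-error analysis above builds on: the sender appends the remainder of the (shifted) message divided by a degree-n generator polynomial, and the receiver flags an error when the recomputed remainder disagrees. The message, burst position and choice of the CRC-16 polynomial are illustrative assumptions.

```python
# Bitwise CRC remainder by binary long division; CRC-16 (x^16 + x^15 + x^2 + 1) shown.
def crc_remainder(bits: str, poly: str) -> str:
    """Remainder of (bits + n zeros) divided by poly, with n = deg(poly)."""
    n = len(poly) - 1
    work = list(bits + "0" * n)
    for i in range(len(bits)):
        if work[i] == "1":
            for j, p in enumerate(poly):
                work[i + j] = str(int(work[i + j]) ^ int(p))
    return "".join(work[-n:])

def crc_check(frame: str, poly: str) -> bool:
    """Receiver side: recompute the FCS of the message part and compare."""
    n = len(poly) - 1
    return crc_remainder(frame[:-n], poly) == frame[-n:]

POLY = "11000000000000101"                        # x^16 + x^15 + x^2 + 1
msg = "1101011011010101"
frame = msg + crc_remainder(msg, POLY)            # message + frame check sequence

print(crc_check(frame, POLY))                     # True: no error detected
corrupted = frame[:5] + "0110" + frame[9:]        # inject a 4-bit burst error
print(crc_check(corrupted, POLY))                 # False: bursts up to 16 bits are detected
```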

  1. Use of advanced simulations in fuel performance codes

    International Nuclear Information System (INIS)

    Van Uffelen, P.

    2015-01-01

    The simulation of the cylindrical fuel rod behaviour in a reactor or a storage pool for spent fuel requires a fuel performance code. Such a tool solves the equations for the heat transfer, the stresses and strains in fuel and cladding, the evolution of several isotopes and the behaviour of various fission products in the fuel rod. The main equations along with their limitations are briefly described. The current approaches adopted for overcoming these limitations and the perspectives are also outlined. (author)

  2. Introduction into scientific work methods-a necessity when performance-based codes are introduced

    DEFF Research Database (Denmark)

    Dederichs, Anne; Sørensen, Lars Schiøtt

    The introduction of performance-based codes in Denmark in 2004 requires new competences from people working with different aspects of fire safety in the industry and the public sector. This abstract presents an attempt at reducing problems with handling and analysing the mathematical methods... and CFD models when applying performance-based codes. This is done within the educational program "Master of Fire Safety Engineering" at the Department of Civil Engineering at the Technical University of Denmark. It was found that the students had general problems with academic methods. Therefore, a new...

  3. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...

  4. Simulation of hydrogen deflagration experiment – Benchmark exercise with lumped-parameter codes

    Energy Technology Data Exchange (ETDEWEB)

    Kljenak, Ivo, E-mail: ivo.kljenak@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Kuznetsov, Mikhail, E-mail: mike.kuznetsov@kit.edu [Karlsruhe Institute of Technology, Kaiserstraße 12, 76131 Karlsruhe (Germany); Kostka, Pal, E-mail: kostka@nubiki.hu [NUBIKI Nuclear Safety Research Institute, Konkoly-Thege Miklós út 29-33, 1121 Budapest (Hungary); Kubišova, Lubica, E-mail: lubica.kubisova@ujd.gov.sk [Nuclear Regulatory Authority of the Slovak Republic, Bajkalská 27, 82007 Bratislava (Slovakia); Maltsev, Mikhail, E-mail: maltsev_MB@aep.ru [JSC Atomenergoproekt, 1, st. Podolskykh Kursantov, Moscow (Russian Federation); Manzini, Giovanni, E-mail: giovanni.manzini@rse-web.it [Ricerca sul Sistema Energetico, Via Rubattino 54, 20134 Milano (Italy); Povilaitis, Mantas, E-mail: mantas.p@mail.lei.lt [Lithuania Energy Institute, Breslaujos g.3, 44403 Kaunas (Lithuania)

    2015-03-15

    Highlights: • Blind and open simulations of hydrogen combustion experiment in large-scale containment-like facility with different lumped-parameter codes. • Simulation of axial as well as radial flame propagation. • Confirmation of adequacy of lumped-parameter codes for safety analyses of actual nuclear power plants. - Abstract: An experiment on hydrogen deflagration (Upward Flame Propagation Experiment – UFPE) was proposed by the Jozef Stefan Institute (Slovenia) and performed in the HYKA A2 facility at the Karlsruhe Institute of Technology (Germany). The experimental results were used to organize a benchmark exercise for lumped-parameter codes. Six organizations (JSI, AEP, LEI, NUBIKI, RSE and UJD SR) participated in the benchmark exercise, using altogether four different computer codes: ANGAR, ASTEC, COCOSYS and ECART. Both blind and open simulations were performed. In general, all the codes provided satisfactory results of the pressure increase, whereas the results of the temperature show a wider dispersal. Concerning the flame axial and radial velocities, the results may be considered satisfactory, given the inherent simplification of the lumped-parameter description compared to the local instantaneous description.

  5. Simulation of hydrogen deflagration experiment – Benchmark exercise with lumped-parameter codes

    International Nuclear Information System (INIS)

    Kljenak, Ivo; Kuznetsov, Mikhail; Kostka, Pal; Kubišova, Lubica; Maltsev, Mikhail; Manzini, Giovanni; Povilaitis, Mantas

    2015-01-01

    Highlights: • Blind and open simulations of hydrogen combustion experiment in large-scale containment-like facility with different lumped-parameter codes. • Simulation of axial as well as radial flame propagation. • Confirmation of adequacy of lumped-parameter codes for safety analyses of actual nuclear power plants. - Abstract: An experiment on hydrogen deflagration (Upward Flame Propagation Experiment – UFPE) was proposed by the Jozef Stefan Institute (Slovenia) and performed in the HYKA A2 facility at the Karlsruhe Institute of Technology (Germany). The experimental results were used to organize a benchmark exercise for lumped-parameter codes. Six organizations (JSI, AEP, LEI, NUBIKI, RSE and UJD SR) participated in the benchmark exercise, using altogether four different computer codes: ANGAR, ASTEC, COCOSYS and ECART. Both blind and open simulations were performed. In general, all the codes provided satisfactory results of the pressure increase, whereas the results of the temperature show a wider dispersal. Concerning the flame axial and radial velocities, the results may be considered satisfactory, given the inherent simplification of the lumped-parameter description compared to the local instantaneous description

  6. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files

  7. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files.

  8. Performance improvement: the organization's quest.

    Science.gov (United States)

    McKinley, C O; Parmer, D E; Saint-Amand, R A; Harbin, C B; Roulston, J C; Ellis, R A; Buchanan, J R; Leonard, R B

    1999-01-01

    In today's health care marketplace, quality has become an expectation. Stakeholders are demanding quality clinical outcomes, and accrediting bodies are requiring clinical performance data. The Roosevelt Institute's quest was to define and quantify quality outcomes, develop an organizational culture of performance improvement, and ensure customer satisfaction. Several of the organization's leaders volunteered to work as a team to develop a specific performance improvement approach tailored to the organization. To date, over 200 employees have received an orientation to the model and its philosophy and nine problem action and process improvement teams have been formed.

  9. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  10. A New Prime Code for Synchronous Optical Code Division Multiple-Access Networks

    Directory of Open Access Journals (Sweden)

    Huda Saleh Abbas

    2018-01-01

    Full Text Available A new spreading code based on a prime code for synchronous optical code-division multiple-access networks that can be used in monitoring applications has been proposed. The new code is referred to as “extended grouped new modified prime code.” This new code has the ability to support more terminal devices than other prime codes. In addition, it patches subsequences with “0s”, leading to lower power consumption. The proposed code has an improved cross-correlation resulting in enhanced BER performance. The code construction and parameters are provided. The operating performance, using incoherent on-off keying modulation and incoherent pulse position modulation systems, has been analyzed. The performance of the code was compared with other prime codes. The results demonstrate an improved performance, and a BER floor of 10⁻⁹ was achieved.

  11. Governmental organization for the regulation of nuclear power plants. A code of practice

    International Nuclear Information System (INIS)

    1978-01-01

    This Code of Practice recommends requirements for a regulatory body responsible for regulating the siting, construction, commissioning, operation and decommissioning of nuclear power plants for safety. It forms part of the Agency's programme, referred to as the NUSS programme, for establishing Codes of Practice and Safety Guides relating to land-based stationary thermal neutron power plants. This Code has been prepared to provide recommendations for Member States embarking on a nuclear power programme and covers: (1) Establishing and maintaining a regulatory body to which is assigned the responsibility for authorizing the siting, construction, commissioning, operation and decommissioning of nuclear power plants after appropriate review and assessment (2) Organizing for and conducting the review and assessment of the safety of nuclear power plants (3) Conducting the necessary regulatory inspections and taking necessary enforcement actions during all stages of the licensing process in order to ensure that the limits and conditions of the licences are being complied with by the applicants/licensees and their contractors (4) Establishing regulations and criteria for nuclear-related health, safety and environmental protection

  12. A good performance watermarking LDPC code used in high-speed optical fiber communication system

    Science.gov (United States)

    Zhang, Wenbo; Li, Chao; Zhang, Xiaoguang; Xi, Lixia; Tang, Xianfeng; He, Wenxue

    2015-07-01

    A watermarking LDPC code, a strategy designed to improve the performance of the traditional LDPC code, was introduced. By inserting some pre-defined watermarking bits into the original LDPC code, we can obtain a more accurate estimate of the noise level in the fiber channel. These bits are then used to modify the probability distribution function (PDF) used in the initialization of the belief propagation (BP) decoding algorithm. This algorithm was tested in a 128 Gb/s PDM-DQPSK optical communication system, and the results showed that the watermarking LDPC code has a better tolerance to polarization mode dispersion (PMD) and nonlinearity than the traditional LDPC code. Also, at the cost of about 2.4% redundancy for the watermarking bits, the decoding efficiency of the watermarking LDPC code is about twice that of the traditional one.
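
    A hedged sketch of the idea described above (BPSK over AWGN assumed; positions and values are illustrative): known watermark bits embedded in the codeword let the receiver estimate the channel noise level, which then feeds the LLRs used to initialise belief-propagation decoding.

```python
# Estimate the noise level from known watermark positions and build initial LLRs.
import math, random

def estimate_sigma(received, positions, watermark_bits):
    """Noise standard deviation estimated from samples whose transmitted bits are known."""
    errs = [received[i] - (1.0 - 2.0 * b)            # BPSK maps 0 -> +1, 1 -> -1
            for i, b in zip(positions, watermark_bits)]
    return math.sqrt(sum(e * e for e in errs) / len(errs))

random.seed(1)
code_bits = [random.randint(0, 1) for _ in range(1024)]
positions = list(range(0, 1024, 42))                 # ~2.4% of the bits, as in the abstract
for i in positions:
    code_bits[i] = 0                                 # pre-defined watermark value

true_sigma = 0.8
rx = [(1.0 - 2.0 * b) + random.gauss(0.0, true_sigma) for b in code_bits]

sigma_hat = estimate_sigma(rx, positions, [0] * len(positions))
llr = [2.0 * y / sigma_hat ** 2 for y in rx]         # initial LLRs for BP decoding
print(f"estimated sigma = {sigma_hat:.3f} (true value {true_sigma})")
```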

  13. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran

    Directory of Open Access Journals (Sweden)

    Sara Moghaddam

    2013-08-01

    Full Text Available Introduction: The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care more than routine care. Today, a worldwide definition of the professional ethic code has been established based on human and ethical issues in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethic codes. The aim of this study is to assess knowledge and performance regarding nursing ethic codes from nurses' and patients' perspectives. Methods: A cross-sectional comparative study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics, independent t-test, ANOVA and Pearson correlation coefficient in SPSS 13. Results: Most of the nurses were female, married and educated to BS degree; 86.4% of them were aware of the ethic codes, and 91.9% of nurses and 41.8% of patients reported that nurses respect the ethic codes. Nurses' and patients' perspectives on the ethic codes differed significantly. A significant relationship was found between nurses' knowledge of ethic codes and job satisfaction and complaints about ethical performance. Conclusion: According to the results, attention to teaching ethic codes in the nursing curriculum for students and continuous education for staff is proposed; on the other hand, recognizing failures of the health system, optimizing nursing care, informing patients about nursing ethic codes, promoting patient rights and achieving patient satisfaction can minimize the differences between the two perspectives.

  14. SCANAIR a transient fuel performance code Part two: Assessment of modelling capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Georgenthum, Vincent, E-mail: vincent.georgenthum@irsn.fr; Moal, Alain; Marchand, Olivier

    2014-12-15

    Highlights: • The SCANAIR code is devoted to the study of irradiated fuel rod behaviour during RIA. • The paper deals with the status of the code validation for PWR rods. • During the PCMI stage there is good agreement between calculations and experiments. • The boiling crisis occurrence is rather well predicted. • The code assessment during the boiling crisis still has to be improved. - Abstract: As part of its research programmes on fuel safety, the French Institut de Radioprotection et de Sûreté Nucléaire develops the SCANAIR code, devoted to the study of irradiated fuel rod behaviour during reactivity-initiated accidents. A first paper focused on the detailed modelling and code description. This second paper deals with the status of the code validation for pressurised water reactor rods, performed with the available experimental results. About 60 integral tests carried out in the CABRI and NSRR experimental reactors and 24 separate tests performed in the PATRICIA facility (devoted to the thermal-hydraulics study) have been recalculated and compared to experimental data. During the first stage of the transient, the pellet-clad mechanical interaction phase, there is good agreement between calculations and experiments: the clad residual elongation and hoop strain of non-failed tests, as well as the failure occurrence and failure enthalpy of failed tests, are correctly calculated. After this first stage, the increase of the cladding temperature can lead to Departure from Nucleate Boiling. During the film boiling regime, the clad temperature can reach very high values (>700 °C). While the boiling crisis occurrence is rather well predicted, the calculation of the clad temperature and the clad hoop strain during this stage still has to be improved.

  15. Challenging the dogma: the hidden layer of non-protein-coding RNAs in complex organisms.

    Science.gov (United States)

    Mattick, John S

    2003-10-01

    The central dogma of biology holds that genetic information normally flows from DNA to RNA to protein. As a consequence it has been generally assumed that genes generally code for proteins, and that proteins fulfil not only most structural and catalytic but also most regulatory functions, in all cells, from microbes to mammals. However, the latter may not be the case in complex organisms. A number of startling observations about the extent of non-protein-coding RNA (ncRNA) transcription in the higher eukaryotes and the range of genetic and epigenetic phenomena that are RNA-directed suggests that the traditional view of the structure of genetic regulatory systems in animals and plants may be incorrect. ncRNA dominates the genomic output of the higher organisms and has been shown to control chromosome architecture, mRNA turnover and the developmental timing of protein expression, and may also regulate transcription and alternative splicing. This paper re-examines the available evidence and suggests a new framework for considering and understanding the genomic programming of biological complexity, autopoietic development and phenotypic variation. Copyright 2003 Wiley Periodicals, Inc.

  16. Construction and performance analysis of variable-weight optical orthogonal codes for asynchronous OCDMA systems

    Science.gov (United States)

    Li, Chuan-qi; Yang, Meng-jie; Zhang, Xiu-rong; Chen, Mei-juan; He, Dong-dong; Fan, Qing-bin

    2014-07-01

    A construction scheme of variable-weight optical orthogonal codes (VW-OOCs) for asynchronous optical code division multiple access (OCDMA) systems is proposed. For a given code weight and corresponding capacity, the code family can be obtained by a Matlab program. The formula for the bit error rate (BER) is derived by taking into account the effects of shot noise, avalanche photodiode (APD) bulk and surface leakage currents, and thermal noise. The OCDMA system with the VW-OOCs is designed and improved. The study shows that the VW-OOCs have excellent BER performance. Whether or not they come from the same code family, the codes with larger weight have lower BER than the other codes under the same conditions. The simulations are consistent with the theoretical BER analysis, and ideal eye diagrams are obtained with the optical hard limiter.

  17. Performance of Turbo Interference Cancellation Receivers in Space-Time Block Coded DS-CDMA Systems

    Directory of Open Access Journals (Sweden)

    Emmanuel Oluremi Bejide

    2008-07-01

    Full Text Available We investigate the performance of turbo interference cancellation receivers in the space time block coded (STBC) direct-sequence code division multiple access (DS-CDMA) system. Depending on the concatenation scheme used, we divide these receivers into the partitioned approach (PA) and the iterative approach (IA) receivers. The performance of both the PA and IA receivers is evaluated in Rayleigh fading channels for the uplink scenario. Numerical results show that the MMSE front-end turbo space-time iterative approach (IA) receiver effectively combats the mixture of MAI and intersymbol interference (ISI). To further investigate the possible achievable data rates in the turbo interference cancellation receivers, we introduce the puncturing of the turbo code through the use of rate compatible punctured turbo codes (RCPTCs). Simulation results suggest that combining interference cancellation, turbo decoding, STBC, and RCPTC can significantly improve the achievable data rates for a synchronous DS-CDMA system for the uplink in Rayleigh flat fading channels.

  18. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is another variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. A much better performance can be provided by using the EDW code compared to existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Both theoretical analysis and simulation show that the EDW code performs much better than the Hadamard and MFH codes.

  19. Performance of asynchronous fiber-optic code division multiple access system based on three-dimensional wavelength/time/space codes and its link analysis.

    Science.gov (United States)

    Singh, Jaswinder

    2010-03-10

    A novel family of three-dimensional (3-D) wavelength/time/space codes for asynchronous optical code-division-multiple-access (CDMA) systems with "zero" off-peak autocorrelation and "unity" cross correlation is reported. Antipodal signaling and differential detection are employed in the system. A maximum of [(W x T+1) x W] codes are generated for unity cross correlation, where W and T are the number of wavelengths and time chips used in the code and are prime. The conditions for violation of the cross-correlation constraint are discussed. The expressions for the number of generated codes are determined for various code dimensions. It is found that the maximum number of codes is generated for S systems. The codes have a code-set-size to code-size ratio greater than W/S. For instance, with a code size of 2065 (59 x 7 x 5), a total of 12,213 users can be supported, and 130 simultaneous users at a bit-error rate (BER) of 10(-9). An arrayed-waveguide-grating-based reconfigurable encoder/decoder design for 2-D implementation for the 3-D codes is presented so that the need for multiple star couplers and fiber ribbons is eliminated. The hardware requirements of the coders used for various modulation/detection schemes are given. The effect of insertion loss in the coders is shown to be significantly reduced with loss compensation by using an amplifier after encoding. An optical CDMA system for four users is simulated and the results presented show the improvement in performance with the use of loss compensation.

  20. Safety analysis of MOX fuels by fuel performance code

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2002-12-01

    Performance of plutonium-rich mixed oxide fuels specified for the Reduced-Moderation Water Reactor (RMWR) has been analysed by a modified fuel performance code. Thermodynamic properties of these fuels up to 120 GWd/t burnup have not been measured and were therefore estimated using existing uranium fuel models. Fission product release, pressure rise inside fuel rods and mechanical loads on fuel cans due to internal pressure have been preliminarily assessed based on an assumed axial power distribution history; the results indicate the integrity of the fuel performance. Detailed evaluation of fuel-cladding interactions due to thermal expansion or swelling of fuel pellets at high burnup will be required for safety analysis of mixed oxide fuels. Thermal conductivity and swelling of plutonium-rich mixed oxide fuels shall be taken into consideration. (T. Tanaka)

  1. Implementation of computer codes for performance assessment of the Republic repository of LLW/ILW Mochovce

    International Nuclear Information System (INIS)

    Hanusik, V.; Kopcani, I.; Gedeon, M.

    2000-01-01

    This paper describes selection and adaptation of computer codes required to assess the effects of radionuclide release from Mochovce Radioactive Waste Disposal Facility. The paper also demonstrates how these codes can be integrated into performance assessment methodology. The considered codes include DUST-MS for source term release, MODFLOW for ground-water flow and BS for transport through biosphere and dose assessment. (author)

  2. Synthetic alienation of microbial organisms by using genetic code engineering: Why and how?

    Science.gov (United States)

    Kubyshkin, Vladimir; Budisa, Nediljko

    2017-08-01

    The main goal of synthetic biology (SB) is the creation of biodiversity applicable to biotechnological needs, while xenobiology (XB) aims to expand the framework of natural chemistries with non-natural building blocks in living cells to accomplish artificial biodiversity. Protein and proteome engineering, which overcomes the limitation of the canonical amino acid repertoire of 20 (+2) amino acids prescribed by the genetic code by using non-canonical amino acids (ncAAs), is one of the main focuses of XB research. Ideally, estranging the genetic code from its current form via systematic introduction of ncAAs should enable the development of bio-containment mechanisms in synthetic cells, potentially endowing them with a "genetic firewall", i.e. orthogonality that prevents genetic information transfer to natural systems. Despite rapid progress over the past two decades, it is not yet possible to completely alienate an organism so that it would use and maintain different genetic code associations permanently. In order to engineer robust bio-contained life forms, the chemical logic behind the establishment of the amino acid repertoire should be considered. Starting from a recent proposal by Hartman and Smith about the establishment of the genetic code in the RNA world, the authors here map possible biotechnological invasion points for engineering bio-contained synthetic cells equipped with non-canonical functionalities. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Extending the application range of a fuel performance code from normal operating to design basis accident conditions

    International Nuclear Information System (INIS)

    Van Uffelen, P.; Gyori, C.; Schubert, A.; Laar, J. van de; Hozer, Z.; Spykman, G.

    2008-01-01

    Two types of fuel performance codes are generally being applied, corresponding to the normal operating conditions and the design basis accident conditions, respectively. In order to simplify the code management and the interface between the codes, and to take advantage of the hardware progress it is favourable to generate a code that can cope with both conditions. In the first part of the present paper, we discuss the needs for creating such a code. The second part of the paper describes an example of model developments carried out by various members of the TRANSURANUS user group for coping with a loss of coolant accident (LOCA). In the third part, the validation of the extended fuel performance code is presented for LOCA conditions, whereas the last section summarises the present status and indicates needs for further developments to enable the code to deal with reactivity initiated accident (RIA) events

  4. Isotopic modelling using the ENIGMA-B fuel performance code

    International Nuclear Information System (INIS)

    Rossiter, G.D.; Cook, P.M.A.; Weston, R.

    2001-01-01

    A number of experimental programmes by BNFL and other MOX fabricators have now shown that the in-pile performance of MOX fuel is generally similar to that of conventional UO2 fuel. Models based on UO2 fuel experience form a good basis for a description of MOX fuel behaviour. However, an area where the performance of MOX fuel is sufficiently different from that of UO2 to warrant model changes is in the radial power and burnup profile. The differences in radial power and burnup profile arise from the presence of significant concentrations of plutonium in MOX fuel, at beginning of life, and their subsequent evolution with burnup. Amongst other effects, plutonium has a greater neutron absorption cross-section than uranium. This paper focuses on the development of a new model for the radial power and burnup profile within a UO2 or MOX fuel rod, in which the underlying fissile isotope concentration distributions are tracked during irradiation. The new model has been incorporated into the ENIGMA-B fuel performance code and has been extended to track the isotopic concentrations of the fission gases, xenon and krypton. The calculated distributions have been validated against results from rod puncture measurements and electron probe micro-analysis (EPMA) linescans, performed during the M501 post irradiation examination (PIE) programme. The predicted gas inventory of the fuel/clad gap is compared with the isotopic composition measured during rod puncture and the measured radial distributions of burnup (from neodymium measurements) and plutonium in the fuel are compared with the calculated distributions. It is shown that there is good agreement between the code predictions and the measurements. (author)
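
    As a hedged, zero-dimensional illustration of what tracking fissile isotope concentrations during irradiation can look like (this is not the ENIGMA-B radial model, and all cross-sections, flux values and number densities below are hypothetical), a simple one-group depletion chain can be stepped explicitly in time:

```python
# Minimal one-group depletion sketch (hypothetical data; not the ENIGMA-B model).
BARN = 1.0e-24  # cm^2

# Hypothetical one-group absorption cross-sections (cm^2) and constant flux (n/cm^2/s).
sigma_a_u235, sigma_a_u238, sigma_a_pu239 = 400 * BARN, 2 * BARN, 1000 * BARN
phi = 3.0e13

# Initial number densities (atoms/cm^3) for a UO2-like fuel (hypothetical).
n = {"U235": 1.0e21, "U238": 2.2e22, "Pu239": 0.0}

dt = 24 * 3600.0                     # one-day explicit Euler time step
for day in range(1000):              # roughly three years of irradiation
    capture_u238 = sigma_a_u238 * phi * n["U238"]   # treated here as Pu-239 production
    n["U235"] += -sigma_a_u235 * phi * n["U235"] * dt
    n["U238"] += -capture_u238 * dt
    n["Pu239"] += (capture_u238 - sigma_a_pu239 * phi * n["Pu239"]) * dt

print({k: f"{v:.3e}" for k, v in n.items()})
```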

  5. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  6. On the performance of diagonal lattice space-time codes for the quasi-static MIMO channel

    KAUST Repository

    Abediseid, Walid; Alouini, Mohamed-Slim

    2013-01-01

    There has been tremendous work done on designing space-time codes for the quasi-static multiple-input multiple-output (MIMO) channel. All the code designs to date focus on either high performance, high rates, low-complexity encoding and decoding

  7. Performance of the OVERFLOW-MLP and LAURA-MLP CFD Codes on the NASA Ames 512 CPU Origin System

    Science.gov (United States)

    Taft, James R.

    2000-01-01

    The shared memory Multi-Level Parallelism (MLP) technique, developed last year at NASA Ames, has been very successful in dramatically improving the performance of important NASA CFD codes. This new and very simple parallel programming technique was first inserted into the OVERFLOW production CFD code in FY 1998. The OVERFLOW-MLP code's parallel performance scaled linearly to 256 CPUs on the NASA Ames 256 CPU Origin 2000 system (steger). Overall performance exceeded 20.1 GFLOP/s, or about 4.5x the performance of a dedicated 16 CPU C90 system. All of this was achieved without any major modification to the original vector based code. The OVERFLOW-MLP code is now in production on the inhouse Origin systems as well as being used offsite at commercial aerospace companies. Partially as a result of this work, NASA Ames has purchased a new 512 CPU Origin 2000 system to further test the limits of parallel performance for NASA codes of interest. This paper presents the performance obtained from the latest optimization efforts on this machine for the LAURA-MLP and OVERFLOW-MLP codes. The Langley Aerothermodynamics Upwind Relaxation Algorithm (LAURA) code is a key simulation tool in the development of the next generation shuttle, interplanetary reentry vehicles, and nearly all "X" plane development. This code sustains about 4-5 GFLOP/s on a dedicated 16 CPU C90. At this rate, expected workloads would require over 100 C90 CPU years of computing over the next few calendar years. It is not feasible to expect that this would be affordable or available to the user community. Dramatic performance gains on cheaper systems are needed. This code is expected to be perhaps the largest consumer of NASA Ames compute cycles per run in the coming year. The OVERFLOW CFD code is extensively used in the government and commercial aerospace communities to evaluate new aircraft designs. It is one of the largest consumers of NASA supercomputing cycles and large simulations of highly resolved full

  8. Code for Internal Dosimetry (CINDY)

    International Nuclear Information System (INIS)

    Strenge, D.L.; Peloquin, R.A.; Sula, M.J.; Johnson, J.R.

    1990-10-01

    The CINDY (Code for Internal Dosimetry) Software Package has been developed by Pacific Northwest Laboratory to address the Department of Energy (DOE) Order 5480.11 by providing the capabilities to calculate organ dose equivalents and effective dose equivalents using the approach of the International Commission on Radiological Protection (ICRP) 30. The code assists in the interpretation of bioassay data, evaluates committed and calendar-year doses from intake or bioassay measurement data, provides output consistent with revised DOE orders, is easy to use, and is generally applicable to DOE sites. Flexible biokinetics models are used to determine organ doses for annual, 50-year, calendar-year, or any other time-point dose necessary for chronic or acute intakes. CINDY is an interactive program that prompts the user to describe the cases to be analyzed and calculates the necessary results for the type of analysis being performed. Four types of analyses may be specified. 92 figs., 10 tabs

  9. High-performance vertical organic transistors.

    Science.gov (United States)

    Kleemann, Hans; Günther, Alrun A; Leo, Karl; Lüssem, Björn

    2013-11-11

    Vertical organic thin-film transistors (VOTFTs) are promising devices to overcome the transconductance and cut-off frequency restrictions of horizontal organic thin-film transistors. The basic physical mechanisms of VOTFT operation, however, are not well understood, and VOTFTs often require complex patterning techniques using self-assembly processes, which impedes future large-area production. In this contribution, high-performance vertical organic transistors comprising pentacene for p-type operation and C60 for n-type operation are presented. The static current-voltage behavior as well as the fundamental scaling laws of such transistors are studied, disclosing a remarkable transistor operation with a behavior limited by injection of charge carriers. The transistors are manufactured by photolithography, in contrast to other VOTFT concepts using self-assembled source electrodes. Fluorinated photoresist and solvent compounds allow for photolithographical patterning directly onto the organic materials, simplifying the fabrication protocol and making VOTFTs a prospective candidate for future high-performance applications of organic transistors. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Reliability in the performance-based concept of fib Model Code 2010

    NARCIS (Netherlands)

    Bigaj-van Vliet, A.; Vrouwenvelder, T.

    2013-01-01

    The design philosophy of the new fib Model Code for Concrete Structures 2010 represents the state of the art with regard to performance-based approach to the design and assessment of concrete structures. Given the random nature of quantities determining structural behaviour, the assessment of

  11. An accurate evaluation of the performance of asynchronous DS-CDMA systems with zero-correlation-zone coding in Rayleigh fading

    Science.gov (United States)

    Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.

    2010-04-01

    An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance for generalized asynchronous DS-CDMA systems, in Gaussian noise with Rayleigh fading. In this paper, and the sequel, new theoretical work has been contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions, with non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system modeling was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence; while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2); where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper, and the sequel, support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding, with minimal computational complexity.
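
    As a hedged aside (not the authors' evaluation framework), the sketch below builds a Walsh-Hadamard code set with the Sylvester construction and counts the sign changes along each row; if the number of sign changes is taken as a proxy for the spectral spreading of a sequence, the N rows exhibit all N values from 0 to N - 1, which is the spread referred to above.

```python
# Walsh-Hadamard codes via the Sylvester construction; count sign transitions per row.
import numpy as np

def hadamard(n):
    """Sylvester-type Hadamard matrix of order n (n must be a power of two)."""
    h = np.array([[1]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h

N = 8
H = hadamard(N)
transitions = [int(np.sum(H[i, 1:] != H[i, :-1])) for i in range(N)]
print(sorted(transitions))   # -> [0, 1, 2, 3, 4, 5, 6, 7]: one value per code sequence
```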

  12. General purpose code for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Wilcke, W.W.

    1983-01-01

    A general-purpose computer code called MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility, the code is organized like a general-purpose computer, operating on a vector describing the time-dependent state of the system under simulation. The instruction set of the computer is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations
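
    A hedged sketch of this organizing idea (a toy, not MONTHY itself): a driver executes a user-defined set of operations, conditionally and iteratively, on a vector describing the time-dependent state of the system. The operation names and the decay example below are hypothetical.

```python
# Toy "programmable" Monte Carlo engine: user-supplied operations act on a state vector.
import random

def step_time(state, rng):            # advance the clock by a random waiting time
    state["t"] += rng.expovariate(1.0)

def decay(state, rng):                # each particle decays with probability 0.1 per step
    state["n"] -= sum(rng.random() < 0.1 for _ in range(state["n"]))

def run(program, state, rng, max_steps=1000):
    """Execute (condition, operation) pairs repeatedly until no condition holds."""
    for _ in range(max_steps):
        fired = False
        for condition, operation in program:
            if condition(state):
                operation(state, rng)
                fired = True
        if not fired:
            break
    return state

rng = random.Random(42)
program = [(lambda s: s["n"] > 0, step_time),
           (lambda s: s["n"] > 0, decay)]
print(run(program, {"t": 0.0, "n": 100}, rng))
```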

  13. Performance of an Error Control System with Turbo Codes in Powerline Communications

    Directory of Open Access Journals (Sweden)

    Balbuena-Campuzano Carlos Alberto

    2014-07-01

    Full Text Available This paper reports the performance of turbo codes as an error control technique in PLC (Powerline Communications) data transmissions. For this system, computer simulations are used for modeling data networks based on the model classified in the technical literature as indoor, with OFDM (Orthogonal Frequency Division Multiplexing) used as the modulation technique. Taking into account the channel, the modulation and the turbo codes, we propose a methodology to minimize the bit error rate (BER) as a function of the average received signal-to-noise ratio (SNR).

  14. Steady State and Transient Fuel Rod Performance Analyses by Pad and Transuranus Codes

    International Nuclear Information System (INIS)

    Slyeptsov, O.; Slyeptsov, S.; Kulish, G.; Ostapov, A.; Chernov, I.

    2013-01-01

    The report, performed under IAEA research contract No.15370/L2, describes the analysis results of WWER and PWR fuel rod performance at steady-state operation and transients by means of the PAD and TRANSURANUS codes. The TRANSURANUS v1m1j09 code, developed by the Institute for Transuranium Elements (ITU), was used based on Licensing Agreement N31302. The PAD 4.0 code, developed by Westinghouse Electric Company, was utilized in the frame of the Ukraine Nuclear Fuel Qualification Project for safety substantiation of the use of Westinghouse fuel assemblies in the mixed core of a WWER-1000 reactor. The experimental data for Russian fuel rod behavior obtained during steady-state operation in the WWER-440 core of reactor Kola-3 and during power transients in the core of the MIR research reactor were taken from the IFPE database of the OECD/NEA and utilized for assessing the codes themselves during simulation of such properties as fuel burnup, fuel centerline temperature (FCT), fuel swelling, cladding strain, fission gas release (FGR) and rod internal pressure (RIP) in the rod burnup range of (41 - 60) GWD/MTU. The experimental data on fuel behavior at steady-state operation during seven reactor cycles presented by AREVA for a standard PWR fuel rod design were used to examine the code's FGR model in the fuel burnup range of (37 - 81) GWD/MTU. (author)

  15. Systemizers Are Better Code-Breakers: Self-Reported Systemizing Predicts Code-Breaking Performance in Expert Hackers and Naïve Participants

    Science.gov (United States)

    Harvey, India; Bolgan, Samuela; Mosca, Daniel; McLean, Colin; Rusconi, Elena

    2016-01-01

    Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders—attention to detail and systemizing—may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings. PMID:27242491

  16. Systemizers are better code-breakers: Self-reported systemizing predicts code-breaking performance in expert hackers and naïve participants

    Directory of Open Access Journals (Sweden)

    India eHarvey

    2016-05-01

    Full Text Available Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders - attention to detail and systemizing - may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e. crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001; Baron-Cohen et al., 2003). They were also tested with behavioural tasks involving code-breaking and a control task involving security x-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the x-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings.

  17. Systemizers Are Better Code-Breakers: Self-Reported Systemizing Predicts Code-Breaking Performance in Expert Hackers and Naïve Participants.

    Science.gov (United States)

    Harvey, India; Bolgan, Samuela; Mosca, Daniel; McLean, Colin; Rusconi, Elena

    2016-01-01

    Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders-attention to detail and systemizing-may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings.

  18. Development of ASME Code Section XI visual examination requirements

    International Nuclear Information System (INIS)

    Cook, J.F.

    1990-01-01

    Section XI of the American Society for Mechanical Engineers Boiler and Pressure Vessel Code (ASME Code) defines three types of nondestructive examinations, visual, surface, and volumetric. Visual examination is important since it is the primary examination method for many safety-related components and systems and is also used as a backup examination for the components and systems which receive surface or volumetric examinations. Recent activity in the Section XI Code organization to improve the rules for visual examinations is reviewed and the technical basis for the new rules, which cover illumination, vision acuity, and performance demonstration, is explained

  19. SWAT2: The improved SWAT code system by incorporating the continuous energy Monte Carlo code MVP

    International Nuclear Information System (INIS)

    Mochizuki, Hiroki; Suyama, Kenya; Okuno, Hiroshi

    2003-01-01

    SWAT is a code system which performs burnup calculations by combining the neutronics calculation code SRAC95 and the one-group burnup calculation code ORIGEN2.1. The SWAT code system can deal with the cell geometries available in SRAC95. However, a precise treatment of resonance absorption by the SRAC95 code using the ultra-fine group cross-section library is not directly applicable to two- or three-dimensional geometry models, because of restrictions in SRAC95. To overcome this problem, SWAT2, which newly introduces the continuous energy Monte Carlo code MVP into SWAT, was developed. This makes burnup calculations with a continuous-energy treatment in any geometry possible. Moreover, using the 147-group cross-section library called the SWAT library, the reactions which are not dealt with by SRAC95 and MVP can be treated. The OECD/NEA burnup credit criticality safety benchmark problems Phase-IB (PWR, a single pin cell model) and Phase-IIIB (BWR, fuel assembly model) were calculated as a verification of SWAT2, and the results were compared with the average values of the calculation results from the burnup calculation codes of the participating organizations. Through the two benchmark problems, it was confirmed that SWAT2 is applicable to burnup calculations in complicated geometries. (author)

  20. Reactivity Insertion Accident (RIA) Capability Status in the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Folsom, Charles Pearson [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pastore, Giovanni [Idaho National Lab. (INL), Idaho Falls, ID (United States); Veeraraghavan, Swetha [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-05-01

    One of the Challenge Problems being considered within CASL relates to modelling and simulation of Light Water Reactor (LWR) fuel under Reactivity Insertion Accident (RIA) conditions. BISON is the fuel performance code used within CASL for LWR fuel under both normal operating and accident conditions, and thus must be capable of addressing the RIA challenge problem. This report outlines required BISON capabilities for RIAs and describes the current status of the code. Information on recent accident capability enhancements, application of BISON to a RIA benchmark exercise, and plans for validation to RIA behavior are included.

  1. The Redox Code.

    Science.gov (United States)

    Jones, Dean P; Sies, Helmut

    2015-09-20

    The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O₂ and H₂O₂ contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine.

  2. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    International Nuclear Information System (INIS)

    McGrail, B.P.; Mahoney, L.A.

    1995-10-01

    Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington, will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in each code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest ranked computer code was found to be the ARES-CT code developed at PNL for the US Department of Energy for evaluation of land disposal sites

  3. Steady-State Calculation of the ATLAS Test Facility Using the SPACE Code

    International Nuclear Information System (INIS)

    Kim, Hyoung Tae; Choi, Ki Yong; Kim, Kyung Doo

    2011-01-01

    The Korean nuclear industry is developing a thermalhydraulic analysis code for safety analysis of pressurized water reactors (PWRs). The new code is called the Safety and Performance Analysis Code for Nuclear Power Plants (SPACE). Several research and industrial organizations including KAERI (Korea Atomic Energy Research Institute) are participating in the collaboration for the development of the SPACE code. One of the main tasks of KAERI is to carry out separate effect tests (SET) and integral effect tests (IET) for code verification and validation (V and V). The IET has been performed with ATLAS (Advanced Thermalhydraulic Test Loop for Accident Simulation) based on the design features of the APR1400 (Advanced Power Reactor of 1400MWe). In the present work the SPACE code input-deck for ATLAS is developed and used for simulation of the steady-state conditions of ATLAS as a preliminary work for IET V and V of the SPACE code

  4. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate (LDPCA) codes in a DSC scheme with feedback. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  5. Performance Analysis of Spectral Amplitude Coding Based OCDMA System with Gain and Splitter Mismatch

    Science.gov (United States)

    Umrani, Fahim A.; Umrani, A. Waheed; Umrani, Naveed A.; Memon, Kehkashan A.; Kalwar, Imtiaz Hussain

    2013-09-01

    This paper presents a practical analysis of optical code-division multiple-access (O-CDMA) systems based on perfect difference codes. The work uses an SNR criterion to select the optimal value of avalanche photodiode (APD) gain and shows how mismatch in the splitters and in the APD gains used in the transmitters and receivers of the network can degrade the BER performance of the system. The investigations also reveal that higher APD gains are not suitable for such systems, even at higher powers. The system performance, with consideration of shot noise, thermal noise, and bulk and surface leakage currents, is also investigated.

  6. Performance of automated and manual coding systems for occupational data: a case study of historical records.

    Science.gov (United States)

    Patel, Mehul D; Rose, Kathryn M; Owens, Cindy R; Bang, Heejung; Kaufman, Jay S

    2012-03-01

    Occupational data are a common source of workplace exposure and socioeconomic information in epidemiologic research. We compared the performance of two occupation coding methods, automated software and a manual coder, using occupation and industry titles from U.S. historical records. We collected parental occupational data from 1920-40s birth certificates, Census records, and city directories on 3,135 deceased individuals in the Atherosclerosis Risk in Communities (ARIC) study. Unique occupation-industry narratives were assigned codes by a manual coder and the Standardized Occupation and Industry Coding software program. We calculated agreement between the coding methods for classification into major Census occupational groups. Automated coding software assigned codes to 71% of occupations and 76% of industries. Of this subset coded by software, 73% of occupation codes and 69% of industry codes matched between automated and manual coding. For major occupational groups, agreement improved to 89% (kappa = 0.86). Automated occupational coding is a cost-efficient alternative to manual coding. However, some manual coding is required to code incomplete information. We found substantial variability between coders in the assignment of occupations, although this variability was not as large for major groups.
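
    For readers unfamiliar with the agreement statistics quoted above, the following generic sketch computes percent agreement and Cohen's kappa for two coders; the occupational labels are hypothetical and this is not the ARIC coding pipeline.

```python
# Percent agreement and Cohen's kappa for two coders (hypothetical example data).
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(coder_a) | set(coder_b)) / n**2
    return observed, (observed - expected) / (1 - expected)

a = ["manager", "clerical", "labourer", "clerical", "manager", "service"]
b = ["manager", "clerical", "labourer", "service",  "manager", "service"]
agreement, kappa = cohens_kappa(a, b)
print(f"agreement = {agreement:.2f}, kappa = {kappa:.2f}")
```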

  7. Performance optimization of PM-16QAM transmission system enabled by real-time self-adaptive coding.

    Science.gov (United States)

    Qu, Zhen; Li, Yao; Mo, Weiyang; Yang, Mingwei; Zhu, Shengxiang; Kilper, Daniel C; Djordjevic, Ivan B

    2017-10-15

    We experimentally demonstrate self-adaptive coded 5×100  Gb/s WDM polarization multiplexed 16 quadrature amplitude modulation transmission over a 100 km fiber link, which is enabled by a real-time control plane. The real-time optical signal-to-noise ratio (OSNR) is measured using an optical performance monitoring device. The OSNR measurement is processed and fed back using control plane logic and messaging to the transmitter side for code adaptation, where the binary data are adaptively encoded with three types of low-density parity-check (LDPC) codes with code rates of 0.8, 0.75, and 0.7 of large girth. The total code-adaptation latency is measured to be 2273 ms. Compared with transmission without adaptation, average net capacity improvements of 102%, 36%, and 7.5% are obtained, respectively, by adaptive LDPC coding.
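
    A hedged sketch of the control-plane idea (the OSNR thresholds below are hypothetical and are not the values used in the experiment): the monitored OSNR is mapped to the highest of the available LDPC code rates, 0.8, 0.75 or 0.7, that is expected to close the link.

```python
# Hypothetical OSNR-to-code-rate mapping for adaptive LDPC coding.
def select_code_rate(osnr_db, thresholds=((20.0, 0.80), (18.0, 0.75), (16.0, 0.70))):
    """Return the highest code rate whose (hypothetical) OSNR threshold is met, else None."""
    for threshold, rate in thresholds:
        if osnr_db >= threshold:
            return rate
    return None  # link cannot be closed at any available rate

for osnr in (22.3, 18.6, 16.4, 14.9):
    print(osnr, "->", select_code_rate(osnr))
```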

  8. Performance of super-orthogonal space-time trellis code in a multipath environment

    CSIR Research Space (South Africa)

    Sokoya, OA

    2007-09-01

    Full Text Available This paper investigates the performance of Super-Orthogonal Space-time Trellis Code (SOSTTC) designed primarily for non-frequency selective (i.e. flat) fading channel but now applied to a frequency selective fading channel. A new decoding trellis...

  9. PCR-free quantitative detection of genetically modified organism from raw materials. An electrochemiluminescence-based bio bar code method.

    Science.gov (United States)

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R

    2008-05-15

    A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.

  10. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM) codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
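
    As a hedged illustration of the basic object (not the authors' concatenated scheme), the sketch below builds a small systematic generator matrix G = [I | P] with a sparse, low-density P over GF(2) and encodes a message; the dimensions and column weight are arbitrary.

```python
# Systematic LDGM encoding over GF(2): G = [I | P] with a sparse random P (toy sizes).
import numpy as np

rng = np.random.default_rng(0)
k, m, col_weight = 8, 4, 2          # message bits, parity bits, ones per column of P

P = np.zeros((k, m), dtype=np.uint8)
for j in range(m):                  # place a few ones in each parity column
    P[rng.choice(k, size=col_weight, replace=False), j] = 1

def encode(message):
    """Systematic codeword: the message followed by parity = message @ P (mod 2)."""
    parity = message @ P % 2
    return np.concatenate([message, parity])

msg = rng.integers(0, 2, size=k, dtype=np.uint8)
print("message :", msg)
print("codeword:", encode(msg))
```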

  11. Contributions of Sensory Coding and Attentional Control to Individual Differences in Performance in Spatial Auditory Selective Attention Tasks.

    Science.gov (United States)

    Dai, Lengshi; Shinn-Cunningham, Barbara G

    2016-01-01

    Listeners with normal hearing thresholds (NHTs) differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in the cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials (ERPs) from the scalp (reflecting cortical responses to sound) and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. These results support the hypothesis that behavioral abilities amongst listeners with NHTs can arise due to both subcortical coding differences and differences in attentional control, depending on stimulus characteristics

  12. Contributions of sensory coding and attentional control to individual differences in performance in spatial auditory selective attention tasks

    Directory of Open Access Journals (Sweden)

    Lengshi Dai

    2016-10-01

    Full Text Available Listeners with normal hearing thresholds differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials from the scalp (ERPs, reflecting cortical responses to sound), and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. These results support the hypothesis that behavioral abilities amongst listeners with normal hearing thresholds can arise due to both subcortical coding differences and differences in attentional control, depending on

  13. Performance Evaluation of a Novel Optimization Sequential Algorithm (SeQ) Code for FTTH Network

    Directory of Open Access Journals (Sweden)

    Fazlina C.A.S.

    2017-01-01

    Full Text Available The SeQ code has advantages such as a variable cross-correlation property at any given number of users and weights, as well as effective suppression of the impact of phase induced intensity noise (PIIN) and a multiple access interference (MAI) cancellation property. The results revealed that, at a system performance level of BER = 10(-9), the SeQ code is capable of achieving 1 Gbps up to 60 km.

  14. The moral code in Islam and organ donation in Western countries: reinterpreting religious scriptures to meet utilitarian medical objectives.

    Science.gov (United States)

    Rady, Mohamed Y; Verheijde, Joseph L

    2014-06-02

    End-of-life organ donation is controversial in Islam. The controversy stems from: (1) scientifically flawed medical criteria of death determination; (2) invasive perimortem procedures for preserving transplantable organs; and (3) incomplete disclosure of information to consenting donors and families. Data from a survey of Muslims residing in Western countries have shown that the interpretation of religious scriptures and advice of faith leaders were major barriers to willingness for organ donation. Transplant advocates have proposed corrective interventions: (1) reinterpreting religious scriptures, (2) reeducating faith leaders, and (3) utilizing media campaigns to overcome religious barriers in Muslim communities. This proposal disregards the intensifying scientific, legal, and ethical controversies in Western societies about the medical criteria of death determination in donors. It would also violate the dignity and inviolability of human life which are pertinent values incorporated in the Islamic moral code. Reinterpreting religious scriptures to serve the utilitarian objectives of a controversial end-of-life practice, perceived to be socially desirable, transgresses the Islamic moral code. It may also have deleterious practical consequences, as donors can suffer harm before death. The negative normative consequences of utilitarian secular moral reasoning reset the Islamic moral code upholding the sanctity and dignity of human life.

  15. JPEG2000 COMPRESSION CODING USING HUMAN VISUAL SYSTEM MODEL

    Institute of Scientific and Technical Information of China (English)

    Xiao Jiang; Wu Chengke

    2005-01-01

    In order to apply the Human Visual System (HVS) model to the JPEG2000 standard, several implementation alternatives are discussed and a new scheme of visual optimization is introduced that modifies the rate-distortion slope. The novelty is that visual weighting is not applied by lifting the coefficients in the wavelet domain but is instead realized through code stream organization. The scheme retains all the features of Embedded Block Coding with Optimized Truncation (EBCOT), such as resolution progressiveness, good robustness against error bit spread and compatibility with lossless compression. Performing better than other methods, it keeps the shortest standard codestream and decompression time and provides the capability of Visual Progressive (VIP) coding.
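
    A hedged sketch of the general idea of visually weighted rate-distortion ordering (not the paper's exact scheme): coding passes are ranked by weighted distortion reduction per bit, so that the code stream organization, rather than scaling of wavelet coefficients, carries the visual weighting. The weights and pass data below are hypothetical.

```python
# Order coding passes by visually weighted R-D slope (hypothetical weights and passes).
# Each pass: (subband, bits, distortion_reduction); weight approximates HVS sensitivity.
visual_weight = {"LL": 1.0, "HL": 0.7, "LH": 0.7, "HH": 0.4}

passes = [("LL", 120, 900.0), ("HL", 100, 500.0), ("HH", 80, 450.0), ("LH", 90, 400.0)]

def weighted_slope(p):
    subband, bits, d_reduction = p
    return (visual_weight[subband] ** 2) * d_reduction / bits   # weighted distortion per bit

# Embed the most visually important passes first in the code stream.
for p in sorted(passes, key=weighted_slope, reverse=True):
    print(p, f"weighted slope = {weighted_slope(p):.2f}")
```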

  16. VINE-A NUMERICAL CODE FOR SIMULATING ASTROPHYSICAL SYSTEMS USING PARTICLES. II. IMPLEMENTATION AND PERFORMANCE CHARACTERISTICS

    International Nuclear Information System (INIS)

    Nelson, Andrew F.; Wetzstein, M.; Naab, T.

    2009-01-01

    We continue our presentation of VINE. In this paper, we begin with a description of relevant architectural properties of the serial and shared memory parallel computers on which VINE is intended to run, and describe their influences on the design of the code itself. We continue with a detailed description of a number of optimizations made to the layout of the particle data in memory and to our implementation of a binary tree used to access that data for use in gravitational force calculations and searches for smoothed particle hydrodynamics (SPH) neighbor particles. We describe the modifications to the code necessary to obtain forces efficiently from special purpose 'GRAPE' hardware, the interfaces required to allow transparent substitution of those forces in the code instead of those obtained from the tree, and the modifications necessary to use both tree and GRAPE together as a fused GRAPE/tree combination. We conclude with an extensive series of performance tests, which demonstrate that the code can be run efficiently and without modification in serial on small workstations or in parallel using the OpenMP compiler directives on large-scale, shared memory parallel machines. We analyze the effects of the code optimizations and estimate that they improve its overall performance by more than an order of magnitude over that obtained by many other tree codes. Scaled parallel performance of the gravity and SPH calculations, together the most costly components of most simulations, is nearly linear up to at least 120 processors on moderate sized test problems using the Origin 3000 architecture, and to the maximum machine sizes available to us on several other architectures. At similar accuracy, performance of VINE, used in GRAPE-tree mode, is approximately a factor 2 slower than that of VINE, used in host-only mode. Further optimizations of the GRAPE/host communications could improve the speed by as much as a factor of 3, but have not yet been implemented in VINE

  17. The added value of international benchmarks for fuel performance codes: an illustration on the basis of TRANSURANUS

    International Nuclear Information System (INIS)

    Van Uffelen, P.; Schubert, A.; Gyeori, C.; Van De Laar, J.

    2009-01-01

    Safety authorities and fuel designers, as well as nuclear research centers, rely heavily on fuel performance codes for predicting the behaviour and lifetime of fuel rods. The simulation tools are developed and validated on the basis of experimental results, some of which are in the public domain, such as the International Fuel Performance Experiments database of the OECD/NEA and IAEA. Publicly available data constitute an excellent basis for assessing the codes themselves, and also for comparing codes that are being developed by independent teams. The present report summarises the advantages gained for the TRANSURANUS code by taking part in previous benchmarks organised by the IAEA, and outlines the preliminary results along with the perspectives of our participation in the current coordinated research project FUMEX-III

  18. Non-Protein Coding RNAs

    CERN Document Server

    Walter, Nils G; Batey, Robert T

    2009-01-01

    This book assembles chapters from experts in the Biophysics of RNA to provide a broadly accessible snapshot of the current status of this rapidly expanding field. The 2006 Nobel Prize in Physiology or Medicine was awarded to the discoverers of RNA interference, highlighting just one example of a large number of non-protein coding RNAs. Because non-protein coding RNAs outnumber protein coding genes in mammals and other higher eukaryotes, it is now thought that the complexity of organisms is correlated with the fraction of their genome that encodes non-protein coding RNAs. Essential biological processes as diverse as cell differentiation, suppression of infecting viruses and parasitic transposons, higher-level organization of eukaryotic chromosomes, and gene expression itself are found to largely be directed by non-protein coding RNAs. The biophysical study of these RNAs employs X-ray crystallography, NMR, ensemble and single molecule fluorescence spectroscopy, optical tweezers, cryo-electron microscopy, and ot...

  19. The aminoacyl-tRNA synthetases had only a marginal role in the origin of the organization of the genetic code: Evidence in favor of the coevolution theory.

    Science.gov (United States)

    Di Giulio, Massimo

    2017-11-07

    The coevolution theory of the origin of the genetic code suggests that the organization of the genetic code coevolved with the biosynthetic relationships between amino acids. The mechanism that allowed this coevolution was based on tRNA-like molecules on which-this theory-would postulate the biosynthetic transformations between amino acids to have occurred. This mechanism makes a prediction on how the role conducted by the aminoacyl-tRNA synthetases (ARSs), in the origin of the genetic code, should have been. Indeed, if the biosynthetic transformations between amino acids occurred on tRNA-like molecules, then there was no need to link amino acids to these molecules because amino acids were already charged on tRNA-like molecules, as the coevolution theory suggests. In spite of the fact that ARSs make the genetic code responsible for the first interaction between a component of nucleic acids and that of proteins, for the coevolution theory the role of ARSs should have been entirely marginal in the genetic code origin. Therefore, I have conducted a further analysis of the distribution of the two classes of ARSs and of their subclasses-in the genetic code table-in order to perform a falsification test of the coevolution theory. Indeed, in the case in which the distribution of ARSs within the genetic code would have been highly significant, then the coevolution theory would be falsified since the mechanism on which it is based would not predict a fundamental role of ARSs in the origin of the genetic code. I found that the statistical significance of the distribution of the two classes of ARSs in the table of the genetic code is low or marginal, whereas that of the subclasses of ARSs statistically significant. However, this is in perfect agreement with the postulates of the coevolution theory. Indeed, the only case of statistical significance-regarding the classes of ARSs-is appreciable for the CAG code, whereas for its complement-the UNN/NUN code-only a marginal

  20. NOAA Weather Radio - EAS Event Codes

    Science.gov (United States)


  1. The Role of Performance Management in Creating and Maintaining a High-Performance Organization

    Directory of Open Access Journals (Sweden)

    André A. de Waal

    2015-04-01

    Full Text Available There is still a good deal of confusion in the literature about how the use of a performance management system affects overall organizational performance. Some researchers find that performance management enhances both the financial and non-financial results of an organization, while others do not find any positive effects or, at most, ambiguous effects. An important step toward getting more clarity in this relationship is to investigate the role performance management plays in creating and maintaining a high-performance organization (HPO). The purpose of this study is to integrate performance management analysis (PMA) and high-performance organization (HPO). A questionnaire combining questions on PMA dimensions and HPO factors was administered to two European-based multinational firms. Based on 468 valid questionnaires, a correlation analysis was performed on the PMA dimensions and the HPO factors in order to test the impact of performance management on the factors of high organizational performance. The results show strong and significant correlations between all the PMA dimensions and all the HPO factors, indicating that a performance management system that fosters performance-driven behavior in the organization is of critical importance to strengthen overall financial and non-financial performance.

  2. Nuclear code abstracts (1975 edition)

    International Nuclear Information System (INIS)

    Akanuma, Makoto; Hirakawa, Takashi

    1976-02-01

    Nuclear Code Abstracts is compiled in the Nuclear Code Committee to exchange information of the nuclear code developments among members of the committee. Enlarging the collection, the present one includes nuclear code abstracts obtained in 1975 through liaison officers of the organizations in Japan participating in the Nuclear Energy Agency's Computer Program Library at Ispra, Italy. The classification of nuclear codes and the format of code abstracts are the same as those in the library. (auth.)

  3. Accounting for results: how conservation organizations report performance information.

    Science.gov (United States)

    Rissman, Adena R; Smail, Robert

    2015-04-01

    Environmental program performance information is in high demand, but little research suggests why conservation organizations differ in reporting performance information. We compared performance measurement and reporting by four private-land conservation organizations: Partners for Fish and Wildlife in the US Fish and Wildlife Service (national government), Forest Stewardship Council-US (national nonprofit organization), Land and Water Conservation Departments (local government), and land trusts (local nonprofit organization). We asked: (1) How did the pattern of performance reporting relationships vary across organizations? (2) Was political conflict among organizations' principals associated with greater performance information? and (3) Did performance information provide evidence of program effectiveness? Based on our typology of performance information, we found that most organizations reported output measures such as land area or number of contracts, some reported outcome indicators such as adherence to performance standards, but few modeled or measured environmental effects. Local government Land and Water Conservation Departments reported the most types of performance information, while local land trusts reported the fewest. The case studies suggest that governance networks influence the pattern and type of performance reporting, that goal conflict among principals is associated with greater performance information, and that performance information provides unreliable causal evidence of program effectiveness. Challenging simple prescriptions to generate more data as evidence, this analysis suggests (1) complex institutional and political contexts for environmental program performance and (2) the need to supplement performance measures with in-depth evaluations that can provide causal inferences about program effectiveness.

  4. Development Of A Parallel Performance Model For The THOR Neutral Particle Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Yessayan, Raffi; Azmy, Yousry; Schunert, Sebastian

    2017-02-01

    The THOR neutral particle transport code enables simulation of complex geometries for various problems, from reactor simulations to nuclear non-proliferation. It is undergoing a thorough V&V effort that requires computational efficiency. This has motivated various improvements including angular parallelization, outer iteration acceleration, and development of peripheral tools. To guide future improvements to the code's efficiency, a better characterization of its parallel performance is useful. A parallel performance model (PPM) can be used to evaluate the benefits of modifications and to identify performance bottlenecks. Using INL's Falcon HPC, the PPM development incorporates an evaluation of network communication behavior over heterogeneous links and a functional characterization of the per-cell/angle/group runtime of each major code component. After several possible sources of variability were evaluated, this work resulted in a communication model and a parallel-portion model. The former's accuracy is bounded by the variability of communication on Falcon, while the latter has an error on the order of 1%.
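
    The record does not give the model's functional form; purely as a hedged illustration of this kind of parallel performance model, a per-process runtime estimate can sum a measured per-cell/angle/group compute cost and a latency-bandwidth communication term. The coefficients below are invented placeholders, not THOR or Falcon measurements:

    ```python
    def estimated_runtime(cells, angles, groups, n_procs,
                          t_cell=2.0e-6,         # seconds per cell/angle/group (placeholder)
                          latency=5.0e-6,        # per-message latency in seconds (placeholder)
                          inv_bandwidth=1.0e-9,  # seconds per byte (placeholder)
                          msgs_per_sweep=4, bytes_per_msg=8192):
        """Toy parallel-performance model: compute portion plus communication portion."""
        work_per_proc = cells * angles * groups / n_procs
        compute = work_per_proc * t_cell
        communication = msgs_per_sweep * (latency + inv_bandwidth * bytes_per_msg)
        return compute + communication

    # Example: how the estimate scales as processes are added.
    for p in (1, 8, 64, 512):
        print(p, f"{estimated_runtime(1_000_000, 48, 16, p):.3f} s")
    ```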

  5. Evaluation of finite element codes for demonstrating the performance of radioactive material packages in hypothetical accident drop scenarios

    International Nuclear Information System (INIS)

    Tso, C.F.; Hueggenberg, R.

    2004-01-01

    Drop testing and analysis are the two methods for demonstrating the performance of packages in hypothetical drop accident scenarios. The exact purpose of the tests and the analyses, and the relative prominence of the two in the license application, may depend on the Competent Authority and will vary between countries. The Finite Element Method (FEM) is a powerful analysis tool. A reliable finite element (FE) code, when used correctly and appropriately, will allow a package's behaviour to be simulated reliably. With improvements in computing power, and in the sophistication and reliability of FE codes, it is likely that FEM calculations will increasingly be used as evidence of drop test performance when seeking Competent Authority approval. What is lacking at the moment, however, is a standardised method of assessing an FE code in order to determine whether it is sufficiently reliable or pessimistic. To this end, the project Evaluation of Codes for Analysing the Drop Test Performance of Radioactive Material Transport Containers, funded by the European Commission Directorate-General XVII (now Directorate-General for Energy and Transport) and jointly performed by Arup and Gesellschaft fuer Nuklear-Behaelter mbH, was carried out in 1998. The work consisted of three components: a survey of existing finite element software, with a view to finding codes that may be capable of analysing the drop test performance of radioactive material packages and to producing an inventory of them; development of a set of benchmark problems to evaluate software used for analysing the drop test performance of packages; and evaluation of the finite element codes by testing them against the benchmarks. This paper presents a summary of this work.

  6. Evaluation of finite element codes for demonstrating the performance of radioactive material packages in hypothetical accident drop scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Tso, C.F. [Arup (United Kingdom); Hueggenberg, R. [Gesellschaft fuer Nuklear-Behaelter mbH (Germany)

    2004-07-01

    Drop testing and analysis are the two methods for demonstrating the performance of packages in hypothetical drop accident scenarios. The exact purpose of the tests and the analyses, and the relative prominence of the two in the license application, may depend on the Competent Authority and will vary between countries. The Finite Element Method (FEM) is a powerful analysis tool. A reliable finite element (FE) code, when used correctly and appropriately, will allow a package's behaviour to be simulated reliably. With improvements in computing power, and in the sophistication and reliability of FE codes, it is likely that FEM calculations will increasingly be used as evidence of drop test performance when seeking Competent Authority approval. What is lacking at the moment, however, is a standardised method of assessing an FE code in order to determine whether it is sufficiently reliable or pessimistic. To this end, the project Evaluation of Codes for Analysing the Drop Test Performance of Radioactive Material Transport Containers, funded by the European Commission Directorate-General XVII (now Directorate-General for Energy and Transport) and jointly performed by Arup and Gesellschaft fuer Nuklear-Behaelter mbH, was carried out in 1998. The work consisted of three components: a survey of existing finite element software, with a view to finding codes that may be capable of analysing the drop test performance of radioactive material packages and to producing an inventory of them; development of a set of benchmark problems to evaluate software used for analysing the drop test performance of packages; and evaluation of the finite element codes by testing them against the benchmarks. This paper presents a summary of this work.

  7. Fire-safety engineering and performance-based codes

    DEFF Research Database (Denmark)

    Sørensen, Lars Schiøtt

    Fire-safety Engineering is written as a textbook for Engineering students at universities and other institutions of higher education that teach in the area of fire. The book can also be used as a work of reference for consulting engineers, building product manufacturers, contractors, building project administrators, etc. The book deals with the following topics: • Historical presentation on the subject of fire • Legislation and building project administration • European fire standardization • Passive and active fire protection • Performance-based Codes • Fire-safety Engineering • Fundamental thermodynamics • Heat exchange during the fire process • Skin burns • Burning rate, energy release rate and design fires • Proposal to Risk-based design fires • Proposal to a Fire scale • Material ignition and flame spread • Fire dynamics in buildings • Combustion products and toxic gases • Smoke inhalation...

  8. Improving 3D-Turbo Code's BER Performance with a BICM System over Rayleigh Fading Channel

    Directory of Open Access Journals (Sweden)

    R. Yao

    2016-12-01

    Full Text Available Classical Turbo code suffers from high error floor due to its small Minimum Hamming Distance (MHD. Newly-proposed 3D-Turbo code can effectively increase the MHD and achieve a lower error floor by adding a rate-1 post encoder. In 3D-Turbo codes, part of the parity bits from the classical Turbo encoder are further encoded through the post encoder. In this paper, a novel Bit-Interleaved Coded Modulation (BICM system is proposed by combining rotated mapping Quadrature Amplitude Modulation (QAM and 3D-Turbo code to improve the Bit Error Rate (BER performance of 3D-Turbo code over Rayleigh fading channel. A key-bit protection scheme and a Two-Dimension (2D iterative soft demodulating-decoding algorithm are developed for the proposed BICM system. Simulation results show that the proposed system can obtain about 0.8-1.0 dB gain at BER of 10^{-6}, compared with the existing BICM system with Gray mapping QAM.

  9. Genome-wide identification of coding and non-coding conserved sequence tags in human and mouse genomes

    Directory of Open Access Journals (Sweden)

    Maggi Giorgio P

    2008-06-01

    Full Text Available Abstract Background The accurate detection of genes and the identification of functional regions is still an open issue in the annotation of genomic sequences. This problem affects new genomes but also those of very well studied organisms such as human and mouse where, despite the great efforts, the inventory of genes and regulatory regions is far from complete. Comparative genomics is an effective approach to address this problem. Unfortunately it is limited by the computational requirements needed to perform genome-wide comparisons and by the problem of discriminating between conserved coding and non-coding sequences. This discrimination is often based on (and thus dependent on) the availability of annotated proteins. Results In this paper we present the results of a comprehensive comparison of human and mouse genomes performed with a new high throughput grid-based system which allows the rapid detection of conserved sequences and accurate assessment of their coding potential. By detecting clusters of coding conserved sequences the system is also suitable to accurately identify potential gene loci. Following this analysis we created a collection of human-mouse conserved sequence tags and carefully compared our results to reliable annotations in order to benchmark the reliability of our classifications. Strikingly, we were able to detect several potential gene loci supported by EST sequences but not corresponding to as yet annotated genes. Conclusion Here we present a new system which allows comprehensive comparison of genomes to detect conserved coding and non-coding sequences and the identification of potential gene loci. Our system does not require the availability of any annotated sequence and is thus suitable for the analysis of new or poorly annotated genomes.

  10. International standard problem (ISP) no. 41 follow up exercise: Containment iodine computer code exercise: parametric studies

    Energy Technology Data Exchange (ETDEWEB)

    Ball, J.; Glowa, G.; Wren, J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Ewig, F. [GRS Koln (Germany); Dickenson, S. [AEAT, (United Kingdom); Billarand, Y.; Cantrel, L. [IPSN (France); Rydl, A. [NRIR (Czech Republic); Royen, J. [OECD/NEA (France)

    2001-11-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I⁻ concentration. The codes used in this exercise were IODE(IPSN), IODE(NRIR), IMPAIR(GRS), INSPECT(AEAT), IMOD(AECL) and LIRIC(AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN (IPSN, France) facility experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (author)

  11. International standard problem (ISP) no. 41 follow up exercise: Containment iodine computer code exercise: parametric studies

    International Nuclear Information System (INIS)

    Ball, J.; Glowa, G.; Wren, J.; Ewig, F.; Dickenson, S.; Billarand, Y.; Cantrel, L.; Rydl, A.; Royen, J.

    2001-11-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I⁻ concentration. The codes used in this exercise were IODE(IPSN), IODE(NRIR), IMPAIR(GRS), INSPECT(AEAT), IMOD(AECL) and LIRIC(AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN (IPSN, France) facility experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (author)

  12. The relationship between family orientation, organization context, organization structure and firm performance

    OpenAIRE

    Joris Meijaard; Lorraine Uhlaner

    2004-01-01

    This study focuses on the prediction of three firm performance indicators, sales growth, innovation performance and profitability, on a sample of small and medium-sized firms in the Netherlands. Predictions from agency theory and the resource based view of organizations lead to alternate hypotheses regarding the direct and indirect effects of family ownership and management on firm performance. Other variables in the analysis include various organization structure variables including standard...

  13. Implementation and Performance Evaluation of Distributed Cloud Storage Solutions using Random Linear Network Coding

    DEFF Research Database (Denmark)

    Fitzek, Frank; Toth, Tamas; Szabados, Áron

    2014-01-01

    This paper advocates the use of random linear network coding for storage in distributed clouds in order to reduce storage and traffic costs in dynamic settings, i.e. when adding and removing numerous storage devices/clouds on-the-fly and when the number of reachable clouds is limited. We introduce various network coding approaches that trade off reliability, storage and traffic costs, and system complexity, relying on probabilistic recoding for cloud regeneration. We compare these approaches with other approaches based on data replication and Reed-Solomon codes. A simulator has been developed to carry out a thorough performance evaluation of the various approaches when relying on different system settings, e.g., finite fields, and network/storage conditions, e.g., storage space used per cloud, limited network use, and limited recoding capabilities. In contrast to standard coding approaches, our...
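
    As an illustration of the random-linear-coding idea described in this record (not the authors' simulator or system), the sketch below stores a file's fragments as random linear combinations over GF(2) and recovers them once the collected coded pieces span the full space; practical systems typically work over larger fields such as GF(2^8):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def gf2_eliminate(coeffs, payloads, k):
        """Gaussian elimination over GF(2); returns the k original fragments,
        or None if the collected pieces do not yet span the full space."""
        A = np.concatenate([coeffs, payloads], axis=1).astype(np.uint8) % 2
        row = 0
        for col in range(k):
            pivots = [r for r in range(row, len(A)) if A[r, col]]
            if not pivots:
                return None                      # need more innovative pieces
            A[[row, pivots[0]]] = A[[pivots[0], row]]
            for r in range(len(A)):
                if r != row and A[r, col]:
                    A[r] ^= A[row]
            row += 1
        return A[:k, k:]

    k, frag_bits = 4, 16
    fragments = rng.integers(0, 2, size=(k, frag_bits), dtype=np.uint8)

    # A "cloud" keeps drawing fresh random combinations until the data is recoverable.
    coeffs = np.empty((0, k), dtype=np.uint8)
    payloads = np.empty((0, frag_bits), dtype=np.uint8)
    recovered = None
    while recovered is None:
        c = rng.integers(0, 2, size=(1, k), dtype=np.uint8)
        coeffs = np.vstack([coeffs, c])
        payloads = np.vstack([payloads, (c @ fragments) % 2])
        recovered = gf2_eliminate(coeffs, payloads, k)

    print("coded pieces needed:", len(coeffs))
    assert np.array_equal(recovered, fragments)
    ```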

  14. Application of the BISON Fuel Performance Code of the FUMEX-III Coordinated Research Project

    International Nuclear Information System (INIS)

    Williamson, R.L.; Novascone, S.R.

    2013-01-01

    Since 1981, the International Atomic Energy Agency (IAEA) has sponsored a series of Coordinated Research Projects (CRP) in the area of nuclear fuel modeling. These projects have typically lasted 3-5 years and have had broad international participation. The objectives of the projects have been to assess the maturity and predictive capability of fuel performance codes, support interaction and information exchange between countries with code development and application needs, build a database of well-defined experiments suitable for code validation, transfer a mature fuel modeling code to developing countries, and provide guidelines for code quality assurance and code application to fuel licensing. The fourth and latest of these projects, known as FUMEX-III (FUel Modeling at EXtended burnup-III), began in 2008 and ended in December of 2011. FUMEX-III was the first of this series of fuel modeling CRPs in which the INL participated. Participants met at the beginning of the project to discuss and select a set of experiments ('priority cases') for consideration during the project. These priority cases were of broad interest to the participants and included reasonably well-documented and reliable data. A meeting was held midway through the project for participants to present and discuss progress on modeling the priority cases. A final meeting was held at the close of the project to present and discuss final results and provide input for a final report. Also in 2008, the INL initiated development of a new multidimensional (2D and 3D) multiphysics nuclear fuel performance code called BISON, with code development progressing steadily during the three-year FUMEX-III project. Interactions with international fuel modeling researchers via FUMEX-III played a significant role in the BISON evolution, particularly influencing the selection of material and behavioral models which are now included in the code. The FUMEX-III cases are generally integral fuel rod experiments occurring

  15. Development and application of the BISON fuel performance code to the analysis of fission gas behaviour

    International Nuclear Information System (INIS)

    Pastore, G.; Hales, J.D.; Novascone, S.R.; Perez, D.M.; Spencer, B.W.; Williamson, R.L.

    2014-01-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that has been under development at Idaho National Laboratory (USA) since 2009. The capabilities of BISON comprise implicit solution of the fully coupled thermo-mechanics and diffusion equations, applicability to a variety of fuel forms, and simulation of both steady-state and transient conditions. The code includes multiphysics constitutive behavior for both fuel and cladding materials, and is designed for efficient use on highly parallel computers. This paper describes the main features of BISON, with emphasis on recent developments in modelling of fission gas behaviour in LWR-UO2 fuel. The code is applied to the simulation of fuel rod irradiation experiments from the OECD/NEA International Fuel Performance Experiments Database. The comparison of the results with the available experimental data of fuel temperature, fission gas release, and cladding diametrical strain during pellet-cladding mechanical interaction is presented, pointing out a promising potential of the BISON code with the new fission gas behaviour model. (authors)

  16. System Performance of Concatenated STBC and Block Turbo Codes in Dispersive Fading Channels

    Directory of Open Access Journals (Sweden)

    Kam Tai Chan

    2005-05-01

    Full Text Available A new scheme of concatenating the block turbo code (BTC with the space-time block code (STBC for an OFDM system in dispersive fading channels is investigated in this paper. The good error correcting capability of BTC and the large diversity gain characteristics of STBC can be achieved simultaneously. The resulting receiver outperforms the iterative convolutional Turbo receiver with the maximum-a-posteriori-probability expectation maximization (MAP-EM) algorithm. Because of its ability to perform the encoding and decoding processes in parallel, the proposed system is easy to implement in real time.

  17. Modification of fuel performance code to evaluate iron-based alloy behavior under LOCA scenario

    Energy Technology Data Exchange (ETDEWEB)

    Giovedi, Claudia; Martins, Marcelo Ramos, E-mail: claudia.giovedi@labrisco.usp.br, E-mail: mrmartin@usp.br [Laboratorio de Analise, Avaliacao e Gerenciamento de Risco (LabRisco/POLI/USP), São Paulo, SP (Brazil); Abe, Alfredo; Muniz, Rafael O.R.; Gomes, Daniel de Souza; Silva, Antonio Teixeira e, E-mail: ayabe@ipen.br, E-mail: dsgomes@ipen.br, E-mail: teixiera@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Accident tolerant fuels (ATF) have been studied since the Fukushima Daiichi accident, in research efforts to develop new materials which, under accident scenarios, could maintain fuel rod integrity for a longer period compared to the cladding and fuel system usually utilized in Pressurized Water Reactors (PWR). The efforts have been focused on new materials applied as cladding, and iron-based alloys appear as a possible candidate. The aim of this paper is to implement modifications in a fuel performance code to evaluate the behavior of iron-based alloys under a Loss-of-Coolant Accident (LOCA) scenario. For this, the properties related to the thermal and mechanical behavior of iron-based alloys were first obtained from the literature, appropriately adapted and introduced in the fuel performance code subroutines. The adopted approach was step-by-step modification, in which different versions of the code were created. The assessment of the implemented modifications was carried out by simulating an experiment available in the open literature (IFA-650.5) related to zirconium-based alloy fuel rods submitted to LOCA conditions. The results obtained for the iron-based alloy were compared to those obtained using the regular version of the fuel performance code for zircaloy-4. The results show that the most important properties to be changed are those in the subroutines related to the mechanical properties of the cladding. The results also show that burst is observed at a later time for fuel rods with iron-based alloy cladding, indicating the potential of this material to be used as cladding for ATF purposes. (author)

  18. Modification of fuel performance code to evaluate iron-based alloy behavior under LOCA scenario

    International Nuclear Information System (INIS)

    Giovedi, Claudia; Martins, Marcelo Ramos; Abe, Alfredo; Muniz, Rafael O.R.; Gomes, Daniel de Souza; Silva, Antonio Teixeira e

    2017-01-01

    Accident tolerant fuels (ATF) have been studied since the Fukushima Daiichi accident, in research efforts to develop new materials which, under accident scenarios, could maintain fuel rod integrity for a longer period compared to the cladding and fuel system usually utilized in Pressurized Water Reactors (PWR). The efforts have been focused on new materials applied as cladding, and iron-based alloys appear as a possible candidate. The aim of this paper is to implement modifications in a fuel performance code to evaluate the behavior of iron-based alloys under a Loss-of-Coolant Accident (LOCA) scenario. For this, the properties related to the thermal and mechanical behavior of iron-based alloys were first obtained from the literature, appropriately adapted and introduced in the fuel performance code subroutines. The adopted approach was step-by-step modification, in which different versions of the code were created. The assessment of the implemented modifications was carried out by simulating an experiment available in the open literature (IFA-650.5) related to zirconium-based alloy fuel rods submitted to LOCA conditions. The results obtained for the iron-based alloy were compared to those obtained using the regular version of the fuel performance code for zircaloy-4. The results show that the most important properties to be changed are those in the subroutines related to the mechanical properties of the cladding. The results also show that burst is observed at a later time for fuel rods with iron-based alloy cladding, indicating the potential of this material to be used as cladding for ATF purposes. (author)

  19. The Impact of a Learning Organization on Performance: Focusing on Knowledge Performance and Financial Performance

    Science.gov (United States)

    Kim, Kyoungshin; Watkins, Karen E.; Lu, Zhenqiu

    2017-01-01

    Purpose: The purpose of this study is to examine the relationships among a learning organization, knowledge and financial performance using the Dimensions of the Learning Organization Questionnaire and its abbreviated version. Design/methodology/approach: This study used a secondary data set and performed second-order factor analysis and…

  20. Performance Analysis of DPSK Signals with Selection Combining and Convolutional Coding in Fading Channel

    National Research Council Canada - National Science Library

    Ong, Choon

    1998-01-01

    The performance analysis of a differential phase shift keyed (DPSK) communications system, operating in a Rayleigh fading environment, employing convolutional coding and diversity processing is presented...

  1. 3D Analysis of Cooling Performance with Loss of Offsite Power Using GOTHIC Code

    International Nuclear Information System (INIS)

    Oh, Kye Min; Heo, Gyun Young; Na, In Sik; Choi, Yu Jung

    2010-01-01

    The GOTHIC code enables analysis of one-dimensional or multi-dimensional problems for evaluating the cooling performance under loss of offsite power. The conventional GOTHIC analysis models heat transfer between the plant containment and the outside of the fan cooler tubes by modeling the fan cooler and the component cooling water inside the tubes separately, in order to analyze the probability of boiling. In this paper, we suggest an approach that reduces the multi-step procedure for evaluating cooling performance under loss of offsite power, or heat transfer in complex geometrical structures, to a single procedure, and we verify its applicability with respect to the heat transfer differences caused by changes in containment atmosphere humidity, by representing the component cooling water in the tubes and the air of the Reactor Containment Fan Cooler in the containment with multiple nodes, whereas the conventional component model uses only one node.

  2. Organization of cytokeratin cytoskeleton and germ plasm in the vegetal cortex of Xenopus laevis oocytes depends on coding and non-coding RNAs: Three-dimensional and ultrastructural analysis

    International Nuclear Information System (INIS)

    Kloc, Malgorzata; Bilinski, Szczepan; Dougherty, Matthew T.

    2007-01-01

    Recent studies discovered a novel structural role of RNA in maintaining the integrity of the mitotic spindle and cellular cytoskeleton. In Xenopus laevis, non-coding Xlsirts and coding VegT RNAs play a structural role in anchoring localized RNAs, maintaining the organization of the cytokeratin cytoskeleton and germinal granules in the oocyte vegetal cortex and in subsequent development of the germline in the embryo. We studied the ultrastructural effects of antisense oligonucleotide driven ablation of Xlsirts and VegT RNAs on the organization of the cytokeratin, germ plasm and other components of the vegetal cortex. We developed a novel method to immunolabel and visualize cytokeratin at the electron microscopy level, which allowed us to reconstruct the ultrastructural organization of the cytokeratin network relative to the components of the vegetal cortex in Xenopus oocytes. The removal of Xlsirts and VegT RNAs not only disrupts the cytokeratin cytoskeleton but also has a profound transcript-specific effect on the anchoring and distribution of germ plasm islands and their germinal granules and the arrangement of yolk platelets within the vegetal cortex. We suggest that the cytokeratin cytoskeleton plays a role in anchoring of germ plasm islands within the vegetal cortex and germinal granules within the germ plasm islands

  3. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. Innovative works for improving the coding tree to further reduce encoding time are presented in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  4. Performance Analysis of a New Coded TH-CDMA Scheme in Dispersive Infrared Channel with Additive Gaussian Noise

    Science.gov (United States)

    Hamdi, Mazda; Kenari, Masoumeh Nasiri

    2013-06-01

    We consider a time-hopping based multiple access scheme introduced in [1] for communication over dispersive infrared links, and evaluate its performance for correlator and matched filter receivers. In the investigated time-hopping code division multiple access (TH-CDMA) method, the transmitter employs a low-rate convolutional encoder. In this method, the bit interval is divided into Nc chips, and the output of the encoder, along with a PN sequence assigned to the user, determines the position of the chip in which the optical pulse is transmitted. We evaluate the multiple access performance of the system for the correlation receiver, considering background noise, which is modeled as white Gaussian noise due to its large intensity. For the correlation receiver, the results show that for a fixed processing gain, at high transmit power, where the multiple access interference has the dominant effect, the performance improves with the coding gain. But at low transmit power, where an increase in coding gain leads to a decrease in the chip time, and consequently to more corruption due to channel dispersion, there exists an optimum value for the coding gain. However, for the matched filter, the performance always improves with the coding gain. The results show that the matched filter receiver outperforms the correlation receiver in the considered cases. Our results show that, for the same bandwidth and bit rate, the proposed system outperforms other multiple access techniques, such as conventional CDMA and the time hopping scheme.
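
    The chip-position mapping described above can be illustrated with a toy sketch: a rate-1/2 convolutional encoder output is combined with the user's PN sequence to pick which of the Nc chips in each bit interval carries the optical pulse. The combining rule below (modulo-Nc addition of the encoded symbol and a PN offset) is an assumption for illustration, not the exact scheme of the cited paper:

    ```python
    import numpy as np

    def conv_encode(bits, g1=0b111, g2=0b101, k=3):
        """Rate-1/2 convolutional encoder (constraint length 3, generators 7/5 octal)."""
        state = 0
        out = []
        for b in bits:
            state = ((state << 1) | b) & ((1 << k) - 1)
            out.append((bin(state & g1).count("1") % 2, bin(state & g2).count("1") % 2))
        return out

    def chip_positions(bits, pn_offsets, n_chips=16):
        """Map each encoded bit pair plus a per-bit PN offset to a pulse position
        (assumed combining rule: (2*c1 + c2 + offset) mod n_chips)."""
        encoded = conv_encode(bits)
        return [(2 * c1 + c2 + off) % n_chips for (c1, c2), off in zip(encoded, pn_offsets)]

    rng = np.random.default_rng(1)
    data = rng.integers(0, 2, size=8).tolist()
    pn = rng.integers(0, 16, size=8).tolist()   # user-specific PN sequence (placeholder)
    print(chip_positions(data, pn))
    ```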

  5. Verification of the 2.00 WAPPA-B [Waste Package Performance Assessment-B version] code

    International Nuclear Information System (INIS)

    Tylock, B.; Jansen, G.; Raines, G.E.

    1987-07-01

    The old version of the Waste Package Performance Assessment (WAPPA) code has been modified into a new code version, 2.00 WAPPA-B. The input files and the results for two benchmarks at repository conditions are fully documented in the appendixes of the EA reference report. The 2.00 WAPPA-B version of the code is suitable for computation of barrier failure due to uniform corrosion; however, an improved sub-version, 2.01 WAPPA-B, is recommended for general use due to minor errors found in 2.00 WAPPA-B during its verification procedures. The input files and input echoes have been modified to include behavior of both radionuclides and elements, but the 2.00 WAPPA-B version of the WAPPA code is not recommended for computation of radionuclide releases. The 2.00 WAPPA-B version computes only mass balances and the initial presence of radionuclides that can be released. Future code development in the 3.00 WAPPA-C version will include radionuclide release computations. 19 refs., 10 figs., 1 tab

  6. Soft-Decision-Data Reshuffle to Mitigate Pulsed Radio Frequency Interference Impact on Low-Density-Parity-Check Code Performance

    Science.gov (United States)

    Ni, Jianjun David

    2011-01-01

    This presentation briefly discusses a research effort on mitigation techniques of pulsed radio frequency interference (RFI) on a Low-Density-Parity-Check (LDPC) code. This problem is of considerable interest in the context of providing reliable communications to the space vehicle which might suffer severe degradation due to pulsed RFI sources such as large radars. The LDPC code is one of the modern forward-error-correction (FEC) codes which have the decoding performance to approach the Shannon Limit. The LDPC code studied here is the AR4JA (2048, 1024) code recommended by the Consultative Committee for Space Data Systems (CCSDS) and it has been chosen for some spacecraft design. Even though this code is designed as a powerful FEC code in the additive white Gaussian noise channel, simulation data and test results show that the performance of this LDPC decoder is severely degraded when exposed to the pulsed RFI specified in the spacecraft's transponder specifications. An analysis work (through modeling and simulation) has been conducted to evaluate the impact of the pulsed RFI and a few implemental techniques have been investigated to mitigate the pulsed RFI impact by reshuffling the soft-decision-data available at the input of the LDPC decoder. The simulation results show that the LDPC decoding performance of codeword error rate (CWER) under pulsed RFI can be improved up to four orders of magnitude through a simple soft-decision-data reshuffle scheme. This study reveals that an error floor of LDPC decoding performance appears around CWER=1E-4 when the proposed technique is applied to mitigate the pulsed RFI impact. The mechanism causing this error floor remains unknown, further investigation is necessary.
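
    The general idea of soft-decision-data reshuffling (not necessarily the exact scheme evaluated in this study) is to down-weight or neutralize the soft values that coincide with an RFI pulse before they reach the LDPC decoder, so corrupted symbols no longer masquerade as high-confidence inputs. A hedged sketch, assuming the pulse timing is known at the receiver:

    ```python
    import numpy as np

    def reshuffle_soft_decisions(llrs, pulse_mask, erase_value=0.0):
        """Neutralize soft decisions hit by a pulsed-RFI event.

        llrs       : log-likelihood ratios at the LDPC decoder input
        pulse_mask : boolean array, True where an RFI pulse was active
        Flagged symbols are treated as erasures (LLR ~ 0), one simple instance of
        soft-decision-data reshuffling; the cited work evaluates its own scheme.
        """
        cleaned = np.array(llrs, dtype=float)
        cleaned[np.asarray(pulse_mask, dtype=bool)] = erase_value
        return cleaned

    rng = np.random.default_rng(2)
    llrs = rng.normal(loc=4.0, scale=1.5, size=2048)              # mostly confident bits
    pulse = np.zeros(2048, dtype=bool)
    pulse[500:700] = True                                         # a radar pulse corrupts a burst
    llrs[pulse] = rng.normal(loc=0.0, scale=8.0, size=pulse.sum())  # unreliable values
    decoder_input = reshuffle_soft_decisions(llrs, pulse)
    ```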

  7. Performance Comparison of Assorted Color Spaces for Multilevel Block Truncation Coding based Face Recognition

    OpenAIRE

    H.B. Kekre; Sudeep Thepade; Karan Dhamejani; Sanchit Khandelwal; Adnan Azmi

    2012-01-01

    The paper presents a performance analysis of Multilevel Block Truncation Coding based Face Recognition among widely used color spaces. In [1], Multilevel Block Truncation Coding was applied on the RGB color space up to four levels for face recognition. Better results were obtained when the proposed technique was implemented using Kekre’s LUV (K’LUV) color space [25]. This was the motivation to test the proposed technique using assorted color spaces. For experimental analysis, two face databas...

  8. Measurement of reactivity coefficients for code validation

    International Nuclear Information System (INIS)

    Nuding, Matthias; Loetsch, Thomas

    2005-01-01

    In the year 2003, measurements in the cold reactor state were performed at the NPP KKI 2 in order to validate the codes that are used for reactor core calculations and especially for the proof of the shutdown margin, which is produced by calculations only. For full power states, code verification is quite easy because the calculations can be compared with different measured values, e.g. with the activation values determined by the aeroball system. For cold reactor states, however, the data base is smaller, especially for reactor cores that are quite 'inhomogeneous' and have rather high Pu-fiss and 235U contents. At the same time, the cold reactor state is important regarding the shutdown margin. For these reasons, the measurements mentioned above have been performed in order to check the accuracy of the codes that have been used by the operator and by our organization for many years. Basically, boron concentrations and control rod worths for different configurations have been measured. The results of the calculations show a very good agreement with the measured values. Therefore, it can be stated that the operator's as well as our code system is suitable for routine use, e.g. during licensing procedures (Authors)

  9. Performance analysis of simultaneous dense coding protocol under decoherence

    Science.gov (United States)

    Huang, Zhiming; Zhang, Cai; Situ, Haozhen

    2017-09-01

    The simultaneous dense coding (SDC) protocol is useful in designing quantum protocols. We analyze the performance of the SDC protocol under the influence of noisy quantum channels. Six kinds of paradigmatic Markovian noise along with one kind of non-Markovian noise are considered. The joint success probability of both receivers and the success probabilities of one receiver are calculated for three different locking operators. Some interesting properties have been found, such as invariance and symmetry. Among the three locking operators we consider, the SWAP gate is most resistant to noise and results in the same success probabilities for both receivers.

  10. WWER-440 fuel rod performance analysis with PIN-Micro and TRANSURANUS codes

    International Nuclear Information System (INIS)

    Vitkova, M.; Manolova, M.; Stefanova, S.; Simeonova, V.; Passage, G.; Lassmann, K.

    1994-01-01

    PIN-micro and TRANSURANUS codes were used to analyse the WWER-440 fuel rod behaviour under normal operating conditions. The two highest loaded fuel rods of the fuel assemblies irradiated in WWER-440 with different power histories were selected. A set of the most probable average values of all geometrical and technological parameters was used. A comparison between the PIN-micro and TRANSURANUS codes was performed using identical input data. The results for inner gas pressure, gap size, local linear heat rate, fuel central temperature and fission gas release as a function of time calculated for the selected fuel rods are presented. The following conclusions were drawn: 1) The PIN-micro code predicts adequately the thermal and mechanical behaviour of the two fuel rods; 2) The comparison of the results obtained by PIN-micro and TRANSURANUS shows a reasonable agreement, and the discrepancies could be explained by the lack of a thorough WWER-oriented verification of TRANSURANUS; 3) The advanced TRANSURANUS code could be successfully applied for WWER fuel rod thermal and mechanical analysis after incorporation of all necessary WWER-specific material properties and models for the Zr+1%Nb cladding and for the fuel rod as a whole, and after validation against WWER experimental and operational data. 1 tab., 10 figs., 10 refs

  11. WWER-440 fuel rod performance analysis with PIN-Micro and TRANSURANUS codes

    Energy Technology Data Exchange (ETDEWEB)

    Vitkova, M; Manolova, M; Stefanova, S; Simeonova, V; Passage, G [Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. za Yadrena Izsledvaniya i Yadrena Energetika; Kharalampieva, Ts [Kombinat Atomna Energetika, Kozloduj (Bulgaria); Lassmann, K [European Atomic Energy Community, Karlsruhe (Germany). European Inst. for Transuranium Elements

    1994-12-31

    PIN-micro and TRANSURANUS codes were used to analyse the WWER-440 fuel rod behaviour under normal operating conditions. The two highest loaded fuel rods of the fuel assemblies irradiated in WWER-440 with different power histories were selected. A set of the most probable average values of all geometrical and technological parameters was used. A comparison between the PIN-micro and TRANSURANUS codes was performed using identical input data. The results for inner gas pressure, gap size, local linear heat rate, fuel central temperature and fission gas release as a function of time calculated for the selected fuel rods are presented. The following conclusions were drawn: (1) The PIN-micro code predicts adequately the thermal and mechanical behaviour of the two fuel rods; (2) The comparison of the results obtained by PIN-micro and TRANSURANUS shows a reasonable agreement, and the discrepancies could be explained by the lack of a thorough WWER-oriented verification of TRANSURANUS; (3) The advanced TRANSURANUS code could be successfully applied for WWER fuel rod thermal and mechanical analysis after incorporation of all necessary WWER-specific material properties and models for the Zr+1%Nb cladding and for the fuel rod as a whole, and after validation against WWER experimental and operational data. 1 tab., 10 figs., 10 refs.

  12. PLAN FOR PERFORMANCE ADMINISTRATION IN PYRAMIDAL STRUCTURE ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Domingo Alarcón Ortiz

    2013-11-01

    Full Text Available Performance administration has become a current strategy for evaluating management within organizations, but its implementation often lacks an action plan resulting from the valuation of the climate and leadership styles embedded in the culture of the organization. This paper proposes a model action plan for performance management, which has been implemented, executed and evaluated in organizations with a pyramidal organizational structure where a diagnosis of the cultural climate and the leadership styles recurring in the organization has previously been made.

  13. THE CONCEPT OF PERFORMANCE IN BUSINESS ORGANIZATIONS – CASE STUDY ON THE EMPLOYEE PERFORMANCE IN ROMANIAN BUSINESS ORGANIZATIONS

    OpenAIRE

    COSMIN OCTAVIAN DOBRIN; GHEORGHE N. POPESCU; VERONICA ADRIANA POPESCU; CRISTINA RALUCA POPESCU

    2012-01-01

    The general economic recession generated by the world financial crisis, with direct implications on funding, economic actors' interactions, the economic and social environment, technological progress and knowledge development, imposes a rethinking of the concept of performance in business organizations worldwide. Our study aims to present the importance of performance at the level of business organizations from a historical and multidisciplinary perspective, in order to determine the b...

  14. A probabilistic analysis of PWR and BWR fuel rod performance using the code CASINO-SLEUTH

    International Nuclear Information System (INIS)

    Bull, A.J.

    1987-01-01

    This paper presents a brief description of the Monte Carlo and response surface techniques used in the code, and a probabilistic analysis of fuel rod performance in PWR and BWR applications. The analysis shows that fission gas release predictions are very sensitive to changes in certain of the code's inputs, identifies the most dominant input parameters and compares their effects in the two cases. (orig./HP)

  15. Influence of Code Size Variation on the Performance of 2D Hybrid ZCC/MD in OCDMA System

    Directory of Open Access Journals (Sweden)

    Matem Rima.

    2018-01-01

    Full Text Available Several two-dimensional OCDMA codes have been developed in order to overcome many problems in optical networks: enhancing cardinality, suppressing Multiple Access Interference (MAI) and mitigating Phase Induced Intensity Noise (PIIN). This paper proposes a new 2D hybrid ZCC/MD code combining 1D ZCC spectral encoding, with code length M, and 1D MD spatial spreading, with code length N. According to the numerical results, the spatial spreading code length N offers good cardinality, so it is the main factor enhancing the performance of the system compared to the spectral code length M.
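
    As a generic illustration of such a 2D hybrid construction (the placeholder vectors below are not the actual ZCC or MD sequences defined in the paper), each user's codeword can be viewed as the outer product of a spectral sequence of length M and a spatial sequence of length N, so the pulse at wavelength i on spatial branch j is lit only when both 1D codes have a one in the corresponding positions:

    ```python
    import numpy as np

    # Placeholder 1D codes: illustrative 0/1 vectors, NOT real ZCC or MD sequences.
    spectral = np.array([1, 0, 1, 0, 0, 1])   # length M = 6 (wavelength positions)
    spatial = np.array([0, 1, 0, 1])          # length N = 4 (spatial/fiber positions)

    # 2D codeword: N x M matrix with entry (j, i) = spatial[j] * spectral[i]
    codeword_2d = np.outer(spatial, spectral)
    print(codeword_2d)
    print("code weight:", codeword_2d.sum())
    ```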

  16. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  17. The development of the Nuclear Electric core performance and fault transient analysis code package in support of Sizewell B

    International Nuclear Information System (INIS)

    Hall, P.; Hutt, P.

    1994-01-01

    This paper describes Nuclear Electric's (NE) development of an integrated code package in support of all its reactors including Sizewell B, designed for the provision of fuel management design, core performance studies, operational support and fault transient analysis. The package uses the NE general purpose three-dimensional transient reactor physics code PANTHER, with cross-sections derived in the PWR case from the LWRWIMS LWR lattice neutronics code. The package also includes ENIGMA, a generic fuel performance code, and, for PWR application, VIPRE-01, a subchannel thermal hydraulics code, RELAP5, the system thermal hydraulics transient code, and SCORPIO, an on-line surveillance system. The paper describes the capabilities and validation of the elements of this package for PWR, how they are coupled within the package, and the way in which they are being applied for Sizewell B to on-line surveillance and fault transient analysis. (Author)

  18. 45 CFR 162.1011 - Valid code sets.

    Science.gov (United States)

    2010-10-01

    45 CFR 162.1011 (Administrative Requirements, Code Sets), Valid code sets: Each code set is valid within the dates specified by the organization responsible for maintaining that code set.
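
    Operationally, the rule quoted above reduces to a date-range check against the maintaining organization's published validity window. A minimal sketch; the code-set names and dates are invented examples, not actual HIPAA code-set data:

    ```python
    from datetime import date

    # Hypothetical validity windows published by the maintaining organizations.
    CODE_SET_VALIDITY = {
        "EXAMPLE-DIAGNOSIS-CODES-V1": (date(2004, 1, 1), date(2015, 9, 30)),
        "EXAMPLE-DIAGNOSIS-CODES-V2": (date(2015, 10, 1), None),  # None = still current
    }

    def code_set_is_valid(name: str, on: date) -> bool:
        """A code set is valid only within the dates specified by its maintainer."""
        start, end = CODE_SET_VALIDITY[name]
        return start <= on and (end is None or on <= end)

    print(code_set_is_valid("EXAMPLE-DIAGNOSIS-CODES-V1", date(2016, 1, 1)))  # False
    print(code_set_is_valid("EXAMPLE-DIAGNOSIS-CODES-V2", date(2016, 1, 1)))  # True
    ```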

  19. Indonesian Private University Lecturer Performance Improvement Model to Improve a Sustainable Organization Performance

    Science.gov (United States)

    Suryaman

    2018-01-01

    Lecturer performance will affect the quality and carrying capacity of the sustainability of an organization, in this case the university. There are many models developed to measure the performance of lecturers, but few discuss the influence of lecturer performance itself on the sustainability of an organization. This study was conducted in…

  20. Assessment of stainless steel 348 fuel rod performance against literature available data using TRANSURANUS code

    Directory of Open Access Journals (Sweden)

    Giovedi Claudia

    2016-01-01

    Full Text Available Early pressurized water reactors were originally designed to operate using stainless steel as the cladding material, but during their lifetime this material was replaced by zirconium-based alloys. However, after the Fukushima Daiichi accident, the problems related to zirconium-based alloys, due to hydrogen production and explosion under severe accidents, highlighted the importance of assessing different materials. In this sense, initiatives such as the ATF (Accident Tolerant Fuel) program are considering different materials as fuel cladding, and one candidate is iron-based alloy. In order to assess the performance of fuel rods manufactured using an iron-based alloy as cladding material, it was necessary to select a specific stainless steel (type 348) and to properly modify conventional fuel performance codes developed in the last decades. Then, 348 stainless steel mechanical and physical properties were introduced in the TRANSURANUS code. The aim of this paper is to present the results obtained concerning the verification of the modified TRANSURANUS code version against data collected from the open literature, related to reactors which operated using stainless steel as cladding. Considering that some data were not available, some assumptions had to be made. Important differences relative to conventional fuel rods were taken into account. The obtained results regarding the cladding behavior are in agreement with the available information. This constitutes evidence of the modified TRANSURANUS code's capability to analyse fuel rods manufactured using 348 stainless steel as cladding material.

  1. A Linear Algebra Framework for Static High Performance Fortran Code Distribution

    Directory of Open Access Journals (Sweden)

    Corinne Ancourt

    1997-01-01

    Full Text Available High Performance Fortran (HPF) was developed to support data parallel programming for single-instruction multiple-data (SIMD) and multiple-instruction multiple-data (MIMD) machines with distributed memory. The programmer is provided a familiar uniform logical address space and specifies the data distribution by directives. The compiler then exploits these directives to allocate arrays in the local memories, to assign computations to elementary processors, and to migrate data between processors when required. We show here that linear algebra is a powerful framework to encode HPF directives and to synthesize distributed code with space-efficient array allocation, tight loop bounds, and vectorized communications for INDEPENDENT loops. The generated code includes traditional optimizations such as guard elimination, message vectorization and aggregation, and overlap analysis. The systematic use of an affine framework makes it possible to prove the compilation scheme correct.
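
    A small sketch of the kind of mapping such a compiler has to synthesize: under a one-dimensional BLOCK distribution, the owner processor and local index of each global array element follow from integer arithmetic on the distribution parameters. This is a generic illustration of the idea, not the paper's linear-algebra formulation:

    ```python
    def block_mapping(global_index, array_size, n_procs):
        """Owner and local index of a global element under a 1-D BLOCK distribution,
        analogous to HPF's  !HPF$ DISTRIBUTE a(BLOCK)  directive."""
        block = -(-array_size // n_procs)      # ceil(array_size / n_procs)
        owner = global_index // block
        local_index = global_index % block
        return owner, local_index

    # Example: distribute a(0:99) over 4 processors.
    for i in (0, 24, 25, 99):
        print(i, block_mapping(i, 100, 4))
    ```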

  2. Performance management practices in public sector organizations : Impact on performance

    NARCIS (Netherlands)

    Verbeeten, Frank H.M.

    2008-01-01

    Purpose - The aim of this study is to investigate whether performance management practices affect performance in public sector organizations. Design/methodology/approach - Theoretically, the research project is based on economic as well as behavioral theories. The study distinguishes amongst

  3. PERFORMANCE EVALUATION OF TURBO CODED OFDM SYSTEMS AND APPLICATION OF TURBO DECODING FOR IMPULSIVE CHANNEL

    Directory of Open Access Journals (Sweden)

    Savitha H. M.

    2010-09-01

    Full Text Available A comparison of the performance of hard and soft-decision turbo coded Orthogonal Frequency Division Multiplexing systems with Quadrature Phase Shift Keying (QPSK) and 16-Quadrature Amplitude Modulation (16-QAM) is considered in the first section of this paper. The results show that the soft-decision method greatly outperforms the hard-decision method. The complexity of the demapper is reduced with the use of a simplified algorithm for 16-QAM demapping. In the later part of the paper, we consider the transmission of data over an additive white class A noise (AWAN) channel, using turbo coded QPSK and 16-QAM systems. We propose a novel turbo decoding scheme for the AWAN channel. Also, we compare the performance of turbo coded systems with QPSK and 16-QAM on the AWAN channel with two different channel values: one computed as per additive white Gaussian noise (AWGN) channel conditions and the other as per AWAN channel conditions. The results show that the use of an appropriate channel value in turbo decoding helps to combat the impulsive noise more effectively. The proposed model for the AWAN channel exhibits comparable Bit Error Rate (BER) performance as compared to the AWGN channel.

  4. Enabling Ethical Code Embeddedness in Construction Organizations: A Review of Process Assessment Approach.

    Science.gov (United States)

    Oladinrin, Olugbenga Timo; Ho, Christabel Man-Fong

    2016-08-01

    Several researchers have identified codes of ethics (CoEs) as tools that stimulate positive ethical behavior by shaping the organisational decision-making process, but few have considered the information needed for code implementation. Beyond being a legal and moral responsibility, ethical behavior needs to become an organisational priority, which requires an alignment process that integrates employee behavior with the organisation's ethical standards. This paper discusses processes for the responsible implementation of CoEs based on an extensive review of the literature. The internationally recognized European Foundation for Quality Management Excellence Model (EFQM model) is proposed as a suitable framework for assessing an organisation's ethical performance, including CoE embeddedness. The findings presented herein have both practical and research implications. They will encourage construction practitioners to shift their attention from ethical policies to possible enablers of CoE implementation and serve as a foundation for further research on ethical performance evaluation using the EFQM model. This is the first paper to discuss the model's use in the context of ethics in construction practice.

  5. Application and analysis of performance of dqpsk advanced modulation format in spectral amplitude coding ocdma

    International Nuclear Information System (INIS)

    Memon, A.

    2015-01-01

    SAC (Spectral Amplitude Coding) is a technique of OCDMA (Optical Code Division Multiple Access) that encodes and decodes data bits by utilizing spectral components of a broadband source. Usually the OOK (On-Off Keying) modulation format is used in this encoding scheme. To make the SAC OCDMA network spectrally efficient, the advanced modulation format DQPSK (Differential Quaternary Phase Shift Keying) is applied, simulated and analyzed; an m-sequence code is encoded in the simulated setup. Performance for various lengths of the m-sequence code is also analyzed and displayed in pictorial form. The results of the simulation are evaluated with the help of the electrical constellation diagram, eye diagram and bit error rate graph. All the graphs indicate better transmission quality when the advanced modulation format DQPSK is used in the SAC OCDMA network as compared with OOK. (author)
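
    The m-sequences used as spreading codes in setups like this are maximal-length sequences generated by a linear-feedback shift register. The sketch below generates one of period 2^n - 1 for a primitive degree-5 feedback; it is a generic illustration, and the specific code lengths studied in the paper are not reproduced here:

    ```python
    def m_sequence(n_bits=5, taps=(3, 5)):
        """Maximal-length (m-) sequence from a Fibonacci LFSR.

        The new chip is the XOR of the chips 'taps' positions back; taps (3, 5)
        realize the recurrence b(n) = b(n-3) XOR b(n-5), whose feedback polynomial
        is primitive of degree 5, so the period is 2**5 - 1 = 31 chips.
        """
        bits = [1] * n_bits                        # any non-zero seed works
        while len(bits) < 2 ** n_bits - 1:
            bits.append(sum(bits[-t] for t in taps) % 2)
        return bits

    code = m_sequence()
    print(len(code), sum(code))                    # 31 chips, 16 ones (balance property)
    ```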

  6. ASME nuclear codes and standards risk management strategic planning

    International Nuclear Information System (INIS)

    Hill, Ralph S. III; Balkey, Kenneth R.; Erler, Bryan A.; Wesley Rowley, C.

    2007-01-01

    This paper is prepared in honor and in memory of the late Professor Emeritus Yasuhide Asada to recognize his contributions to ASME Nuclear Codes and Standards initiatives, particularly those related to risk-informed technology and System Based Code developments. For nearly two decades, numerous risk-informed initiatives have been completed or are under development within the ASME Nuclear Codes and Standards organization. In order to properly manage the numerous initiatives currently underway or planned for the future, the ASME Board on Nuclear Codes and Standards (BNCS) has an established Risk Management Strategic Plan (Plan) that is maintained and updated by the ASME BNCS Risk Management Task Group. This paper presents the latest approved version of the plan beginning with a background of applications completed to date, including the recent probabilistic risk assessment (PRA) standards developments for nuclear power plant applications. The paper discusses planned applications within ASME Nuclear Codes and Standards that will require expansion of the ASME PRA Standard to support new advanced light water reactor and next generation reactor developments, such as for high temperature gas-cooled reactors. Emerging regulatory developments related to risk-informed, performance- based approaches are summarized. A long-term vision for the potential development and evolution to a nuclear systems code that adopts a risk-informed approach across a facility life-cycle (design, construction, operation, maintenance, and closure) is also summarized. Finally, near term and long term actions are defined across the ASME Nuclear Codes and Standards organizations related to risk management, including related U.S. regulatory activities. (author)

  7. Parents' Assessments of Disability in Their Children Using World Health Organization International Classification of Functioning, Disability and Health, Child and Youth Version Joined Body Functions and Activity Codes Related to Everyday Life

    DEFF Research Database (Denmark)

    Illum, Niels Ove; Gradel, Kim Oren

    2017-01-01

    AIM: To help parents assess disability in their own children using World Health Organization (WHO) International Classification of Functioning, Disability and Health, Child and Youth Version (ICF-CY) code qualifier scoring and to assess the validity and reliability of the data sets obtained. METHOD: Parents of 162 children with spina bifida, spinal muscular atrophy, muscular disorders, cerebral palsy, visual impairment, hearing impairment, mental disability, or disability following brain tumours performed scoring for 26 body functions qualifiers (b codes) and activities and participation qualifiers... ...of 1.01 and 1.00. The mean corresponding outfit MNSQ was 1.05 and 1.01. The ICF-CY code τ thresholds and category measures were continuous when assessed and reassessed by parents. Participating children had a mean of 56 code scores (range: 26-130) before and a mean of 55.9 scores (range: 25-125) after...

  8. Doping Polymer Semiconductors by Organic Salts: Toward High-Performance Solution-Processed Organic Field-Effect Transistors.

    Science.gov (United States)

    Hu, Yuanyuan; Rengert, Zachary D; McDowell, Caitlin; Ford, Michael J; Wang, Ming; Karki, Akchheta; Lill, Alexander T; Bazan, Guillermo C; Nguyen, Thuc-Quyen

    2018-04-24

    Solution-processed organic field-effect transistors (OFETs) were fabricated with the addition of an organic salt, trityl tetrakis(pentafluorophenyl)borate (TrTPFB), into thin films of donor-acceptor copolymer semiconductors. The performance of OFETs is significantly enhanced after the organic salt is incorporated. TrTPFB is confirmed to p-dope the organic semiconductors used in this study, and the doping efficiency as well as doping physics was investigated. In addition, systematic electrical and structural characterizations reveal how the doping enhances the performance of OFETs. Furthermore, it is shown that this organic salt doping method is feasible for both p- and n-doping by using different organic salts and, thus, can be utilized to achieve high-performance OFETs and organic complementary circuits.

  9. Combinatorial neural codes from a mathematical coding theory perspective.

    Science.gov (United States)

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
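
    To make the error-correction comparison concrete: for any binary code, the number of errors that nearest-codeword decoding can correct is fixed by the minimum pairwise Hamming distance, t = (d_min - 1) // 2. The sketch below measures this for a random binary code standing in for the paper's random comparison codes; the code size and length are arbitrary illustrative values, and an actual receptive field code would instead be built from overlapping receptive fields.

        # Minimum Hamming distance of a random binary code: d_min fixes how many
        # errors nearest-codeword decoding can correct, t = (d_min - 1) // 2.
        import numpy as np
        from itertools import combinations

        rng = np.random.default_rng(1)
        codewords = rng.integers(0, 2, size=(32, 100))   # 32 codewords over 100 "neurons"

        d_min = min(int((a != b).sum()) for a, b in combinations(codewords, 2))
        print("d_min =", d_min, " correctable errors t =", (d_min - 1) // 2)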

  10. Performance Management in Healthcare Organizations: Concept and Practicum.

    Science.gov (United States)

    Dimitropoulos, Panagiotis E

    2017-01-01

    Organizational performance can create and sustain competitive advantages for corporations and even improve their sustainability and future prospects. Health care organizations present a sector where performance management is structured by multiple dimensions. The scope of this study is to analyze the issue of performance management in healthcare organizations and specifically the implementation of the Balanced Scorecard (BSC) methodology on organizations providing health services. The study provides a discussion on the BSC development process, the steps that management has to take in order to prepare the implementation of the BSC and finally discusses a practical example of a scorecard with specific strategic goals and performance indicators. Managers of healthcare organizations and specifically those providing services to the elderly and the general population could use the propositions of the study as a roadmap for processing, analyzing, evaluating and implementing the balanced scorecard approach in their organizations' daily operations. BSC methodology can give an advantage in terms of enhanced stakeholder management and preservation within a highly volatile and competitive economic environment.

  11. COMPASS: A source term code for investigating capillary barrier performance

    International Nuclear Information System (INIS)

    Zhou, Wei; Apted, J.J.

    1996-01-01

    A computer code, COMPASS, based on a compartment-model approach, is developed to calculate the near-field source term of a high-level waste repository under unsaturated conditions. COMPASS is applied to evaluate the expected performance of Richard's (capillary) barriers as backfills to divert infiltrating groundwater at Yucca Mountain. Comparing the release rates of four typical nuclides with and without the Richard's barrier, it is shown that the Richard's barrier significantly decreases the peak release rates from the Engineered Barrier System (EBS) into the host rock
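
    As an illustration of the compartment-model idea (not of COMPASS itself), the sketch below steps a toy two-compartment system: inventory dissolves from the waste form into EBS water and is then released to the host rock, with radioactive decay in both compartments. All rate constants are placeholders chosen only to make the example run.

        # Toy compartment model for near-field release (illustrative constants only):
        # waste-form inventory dissolves at rate k_d into EBS water, which releases
        # to the host rock at rate k_r; both compartments decay with constant lam.
        import numpy as np

        def release_history(N0=1.0, k_d=1e-4, k_r=1e-3, lam=2.9e-5, years=50_000, dt=1.0):
            n_wf, n_ebs = N0, 0.0              # inventories in waste form and EBS water
            t, release = [], []
            for step in range(int(years / dt)):
                diss = k_d * n_wf * dt                   # dissolved this step
                out = k_r * n_ebs * dt                   # released to host rock this step
                n_wf += -diss - lam * n_wf * dt
                n_ebs += diss - out - lam * n_ebs * dt
                t.append(step * dt)
                release.append(out / dt)                 # release rate to host rock
            return np.array(t), np.array(release)

        t, rate = release_history()
        print("peak release rate (1/yr):", rate.max())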

  12. Performance analysis of 2D asynchronous hard-limiting optical code-division multiple access system through atmospheric scattering channel

    Science.gov (United States)

    Zhao, Yaqin; Zhong, Xin; Wu, Di; Zhang, Ye; Ren, Guanghui; Wu, Zhilu

    2013-09-01

    Optical code-division multiple access (OCDMA) systems usually allocate orthogonal or quasi-orthogonal codes to the active users. When transmitting through an atmospheric scattering channel, the coding pulses are broadened and the orthogonality of the codes is degraded. In the truly asynchronous case, namely when both the chips and the bits are asynchronous among the active users, the pulse broadening significantly degrades system performance. In this paper, we evaluate the performance of a 2D asynchronous hard-limiting wireless OCDMA system through an atmospheric scattering channel. The probability density function of multiple access interference in the truly asynchronous case is given. The bit error rate decreases as the ratio of the chip period to the root mean square delay spread increases, and the channel limits the bit rate to different levels when the chip period varies.

  13. Enhancing the performance of the light field microscope using wavefront coding.

    Science.gov (United States)

    Cohen, Noy; Yang, Samuel; Andalman, Aaron; Broxton, Michael; Grosenick, Logan; Deisseroth, Karl; Horowitz, Mark; Levoy, Marc

    2014-10-06

    Light field microscopy has been proposed as a new high-speed volumetric computational imaging method that enables reconstruction of 3-D volumes from captured projections of the 4-D light field. Recently, a detailed physical optics model of the light field microscope has been derived, which led to the development of a deconvolution algorithm that reconstructs 3-D volumes with high spatial resolution. However, the spatial resolution of the reconstructions has been shown to be non-uniform across depth, with some z planes showing high resolution and others, particularly at the center of the imaged volume, showing very low resolution. In this paper, we enhance the performance of the light field microscope using wavefront coding techniques. By including phase masks in the optical path of the microscope we are able to address this non-uniform resolution limitation. We have also found that superior control over the performance of the light field microscope can be achieved by using two phase masks rather than one, placed at the objective's back focal plane and at the microscope's native image plane. We present an extended optical model for our wavefront coded light field microscope and develop a performance metric based on Fisher information, which we use to choose adequate phase masks parameters. We validate our approach using both simulated data and experimental resolution measurements of a USAF 1951 resolution target; and demonstrate the utility for biological applications with in vivo volumetric calcium imaging of larval zebrafish brain.

  14. Growth performance, immune status and organ morphometry in ...

    African Journals Online (AJOL)

    Growth performance, immune status and organ morphometry in broilers fed Bacillus subtilis-supplemented diet. ... In conclusion, B. subtilis-type probiotics contributed positively to better growth performance, improved immune system and modulated morphology of lymphoid organs and gut mucosa in broilers. Keywords: ...

  15. Sub-Transport Layer Coding

    DEFF Research Database (Denmark)

    Hansen, Jonas; Krigslund, Jeppe; Roetter, Daniel Enrique Lucani

    2014-01-01

    Packet losses in wireless networks dramatically curb the performance of TCP. This paper introduces a simple coding shim that aids IP-layer traffic in lossy environments while being transparent to transport layer protocols. The proposed coding approach enables erasure correction while being oblivious to the congestion control algorithms of the utilised transport layer protocol. Although our coding shim is indifferent towards the transport layer protocol, we focus on the performance of TCP when run on top of our proposed coding mechanism due to its widespread use. The coding shim provides gains...
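
    The erasure-correction idea behind such a shim can be illustrated with the simplest possible scheme: one XOR parity packet per small generation of packets, which lets the receiver rebuild any single lost packet without a transport-layer retransmission. This is only a conceptual sketch; the shim described in the paper uses a more capable coding scheme.

        # Conceptual erasure-correction sketch: one XOR parity packet per generation
        # lets the receiver recover any single lost data packet without retransmission.
        def encode(generation: list[bytes]) -> bytes:
            parity = bytearray(len(generation[0]))
            for pkt in generation:
                for i, b in enumerate(pkt):
                    parity[i] ^= b
            return bytes(parity)

        def recover(received: dict[int, bytes], parity: bytes, size: int) -> dict[int, bytes]:
            missing = [i for i in range(size) if i not in received]
            if len(missing) == 1:                       # a single erasure is correctable
                rec = bytearray(parity)
                for pkt in received.values():
                    for i, b in enumerate(pkt):
                        rec[i] ^= b
                received[missing[0]] = bytes(rec)
            return received

        gen = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
        par = encode(gen)
        lossy = {0: gen[0], 1: gen[1], 3: gen[3]}       # packet 2 lost in the network
        print(recover(lossy, par, 4)[2])                # b'pkt2'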

  16. Business Ethics: International Analysis of Codes of Ethics and Conduct

    Directory of Open Access Journals (Sweden)

    Josmar Andrade

    2017-03-01

    Full Text Available Codes of ethics and codes of conduct formalize an ideal of expected behavior patterns for managers and employees of organizations, providing standards and orientation that state the company's interactions with the community, through products/services, the sales force, marketing communications, investments, and relationships with other stakeholders, influencing company reputation and overall Marketing performance. The objective of this study is to analyze the differences in the codes of ethics of the largest companies based in Brazil and in Portugal, given their cultural and linguistic similarities. Findings show that the use of codes of ethics is more common in Brazil than in Portugal and that codes of ethics are substantially more extensive and cover a larger number of categories in Brazilian companies, reflecting the organizations' mission and perception of stakeholders' concerns and priorities. We conclude that ethical issues severely impact company reputation and, in a comprehensive sense, overall Marketing performance. Marketing professionals should be systematically aware of how company core values are transmitted to different audiences, including the use of codes of ethics to communicate both with internal and external publics.

  17. Analyses with the FSTATE code: fuel performance in destructive in-pile experiments

    International Nuclear Information System (INIS)

    Bauer, T.H.; Meek, C.C.

    1982-01-01

    Thermal-mechanical analysis of a fuel pin is an essential part of the evaluation of fuel behavior during hypothetical accident transients. The FSTATE code has been developed to provide this required computational ability in situations lacking azimuthal symmetry about the fuel-pin axis by performing 2-dimensional thermal, mechanical, and fission gas release and redistribution computations for a wide range of possible transient conditions. In this paper recent code developments are described and application is made to in-pile experiments undertaken to study fast-reactor fuel under accident conditions. Three accident simulations, including a fast and slow ramp-rate overpower as well as a loss-of-cooling accident sequence, are used as representative examples, and the interpretation of FSTATE computations relative to experimental observations is made

  18. Performance of the coupled thermalhydraulics/neutron kinetics code R/P/C on workstation clusters and multiprocessor systems

    International Nuclear Information System (INIS)

    Hammer, C.; Paffrath, M.; Boeer, R.; Finnemann, H.; Jackson, C.J.

    1996-01-01

    The light water reactor core simulation code PANBOX has been coupled with the transient analysis code RELAP5 for the purpose of performing plant safety analyses with a three-dimensional (3-D) neutron kinetics model. The system has been parallelized to improve the computational efficiency. The paper describes the features of this system with emphasis on performance aspects. Performance results are given for different types of parallelization, i.e., for using an automatic parallelizing compiler, using the portable PVM platform on a workstation cluster, using PVM on a shared memory multiprocessor, and for using machine dependent interfaces. (author)

  19. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Science.gov (United States)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited by the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared in performance using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the

  20. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
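
    The described workflow (take a list of inputs, write an input file, run the external application, read its outputs back) can be mirrored in a few lines of Python, sketched below for illustration only; the actual DLLExternalCode is a compiled DLL driven by GoldSim, and the file names and the external_code command here are hypothetical placeholders.

        # Sketch of the described workflow: write an input file from a list of inputs,
        # run an external code, and read its outputs back. The file names, the
        # "external_code" command and its output format are hypothetical placeholders.
        import subprocess
        from pathlib import Path

        def run_external(inputs: list[float], workdir: str = ".") -> list[float]:
            wd = Path(workdir)
            # 1. create the input file expected by the external application
            (wd / "case.inp").write_text("\n".join(f"{x:.6e}" for x in inputs))
            # 2. run the external code and wait for it to finish
            subprocess.run(["external_code", "case.inp"], cwd=wd, check=True)
            # 3. read the outputs the external application wrote to file
            return [float(line) for line in (wd / "case.out").read_text().split()]

        # outputs = run_external([1.0, 2.5, 0.3])   # returned to the calling model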

  1. Application and Analysis of Performance of DQPSK Advanced Modulation Format in Spectral Amplitude Coding OCDMA

    Directory of Open Access Journals (Sweden)

    Abdul Latif Memon

    2014-04-01

    Full Text Available SAC (Spectral Amplitude Coding) is a technique of OCDMA (Optical Code Division Multiple Access) to encode and decode data bits by utilizing spectral components of the broadband source. Usually the OOK (On-Off Keying) modulation format is used in this encoding scheme. To make the SAC OCDMA network spectrally efficient, the advanced modulation format DQPSK (Differential Quaternary Phase Shift Keying) is applied, simulated and analyzed. An m-sequence code is encoded in the simulated setup. Performance for various lengths of the m-sequence code is also analyzed and displayed in pictorial form. The results of the simulation are evaluated with the help of the electrical constellation diagram, eye diagram and bit error rate graph. All the graphs indicate better transmission quality when the advanced DQPSK modulation format is used in the SAC OCDMA network as compared with OOK

  2. Development of a coupled code system based on system transient code, RETRAN, and 3-D neutronics code, MASTER

    International Nuclear Information System (INIS)

    Kim, K. D.; Jung, J. J.; Lee, S. W.; Cho, B. O.; Ji, S. K.; Kim, Y. H.; Seong, C. K.

    2002-01-01

    A coupled code system, RETRAN/MASTER, has been developed for best-estimate simulations of interactions between reactor core neutron kinetics and plant thermal-hydraulics by incorporating the 3-D reactor core kinetics analysis code MASTER into the system transient code RETRAN. The soundness of the consolidated code system is confirmed by simulating the MSLB benchmark problem developed by OECD/NEA to verify the performance of coupled kinetics and system transient codes

  3. Quantitative code accuracy evaluation of ISP33

    Energy Technology Data Exchange (ETDEWEB)

    Kalli, H.; Miwrrin, A. [Lappeenranta Univ. of Technology (Finland)]; Purhonen, H. [VTT Energy, Lappeenranta (Finland)]; [and others]

    1995-09-01

    Aiming at quantifying code accuracy, a methodology based on the Fast Fourier Transform has been developed at the University of Pisa, Italy. The paper deals with a short presentation of the methodology and its application to pre-test and post-test calculations submitted to the International Standard Problem ISP33. This was a double-blind natural circulation exercise with a stepwise reduced primary coolant inventory, performed in the PACTEL facility in Finland. PACTEL is a 1/305 volumetrically scaled, full-height simulator of the Russian type VVER-440 pressurized water reactor, with horizontal steam generators and loop seals in both cold and hot legs. Fifteen foreign organizations participated in ISP33, with 21 blind calculations and 20 post-test calculations; altogether, 10 different thermal hydraulic codes and code versions were used. The results of the application of the methodology to nine selected measured quantities are summarized.
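
    A minimal sketch of an FFT-based accuracy figure of merit is given below, assuming the commonly used average-amplitude definition AA = sum|FFT(calc - exp)| / sum|FFT(exp)| (smaller is better). The signals are synthetic placeholders rather than ISP33 data, and the full methodology defines additional figures of merit beyond this one.

        # Sketch of an FFT-based accuracy figure of merit, assuming the usual
        # "average amplitude" definition AA = sum|FFT(calc - exp)| / sum|FFT(exp)|.
        # Signals here are synthetic placeholders, not ISP33 data.
        import numpy as np

        def average_amplitude(exp: np.ndarray, calc: np.ndarray) -> float:
            err_spec = np.abs(np.fft.rfft(calc - exp))
            exp_spec = np.abs(np.fft.rfft(exp))
            return err_spec.sum() / exp_spec.sum()   # smaller AA means better accuracy

        t = np.linspace(0.0, 100.0, 512)
        experiment = 15.0 * np.exp(-t / 40.0)        # e.g. a measured pressure trend
        calculation = 15.5 * np.exp(-t / 38.0)       # the code prediction
        print(f"AA = {average_amplitude(experiment, calculation):.3f}")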

  4. How to Achieve Quality Business Performance in an Organization

    Directory of Open Access Journals (Sweden)

    Merima Bekrić

    2013-02-01

    Full Text Available RQ: The research question is the influence of quality management on quality business performance. Purpose: Determine which top management approaches are essential for business excellence and which leadership approaches achieve quality business performance. Method: Analysis of articles from the journal Organization from the last 5 years and one article from the Journal of Universal Excellence to obtain data. Results: Characteristics of managers and individual employees influence the quality of an organization's performance and business excellence. This is rudimentary for an organization to perform successfully. The results show the weaknesses that an organization can improve on. Organization: The research study facilitates improving the long-term success of an organization. The results can assist in further decision-making and timely responses to changes in its internal and external environment. Society: Better business performance contributes to the wider environment, as this also ensures the stability of an organization. Originality: A different approach in viewing management issues and searching for improvements. There are not many review articles on this topic. Limitations: The analysis was conducted with only ten articles.

  5. Impact of Performance Management in Public and Private Organizations

    DEFF Research Database (Denmark)

    Hvidman, Ulrik; Andersen, Simon Calmar

    2014-01-01

    Recent theoretical developments suggest that management actions have different impacts on outcomes in public and private organizations. This proposition is important to public organizations' widespread import of private sector management tools such as performance management. This article examines how performance management influences performance outcomes in otherwise similar public and private organizations. Showing that the factors expected to diminish the impact of performance management parallel the organizational characteristics of public organizations, we hypothesize that this type of management is less effective in public organizations. A difference-in-differences model based on survey data on management in Danish public and private schools, combined with administrative data of students' test scores, confirms the hypothesis. The results have important implications for the transfer...

  6. Early Experiences Writing Performance Portable OpenMP 4 Codes

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, Wayne [ORNL; Hernandez, Oscar R [ORNL

    2016-01-01

    In this paper, we evaluate the recently available directives in OpenMP 4 to parallelize a computational kernel using both the traditional shared memory approach and the newer accelerator targeting capabilities. In addition, we explore various transformations that attempt to increase application performance portability, and examine the expressiveness and performance implications of using these approaches. For example, we want to understand if the target map directives in OpenMP 4 improve data locality when mapped to a shared memory system, as opposed to the traditional first touch policy approach in traditional OpenMP. To that end, we use recent Cray and Intel compilers to measure the performance variations of a simple application kernel when executed on the OLCF's Titan supercomputer with NVIDIA GPUs and the Beacon system with Intel Xeon Phi accelerators attached. To better understand these trade-offs, we compare our results from traditional OpenMP shared memory implementations to the newer accelerator programming model when it is used to target both the CPU and an attached heterogeneous device. We believe the results and lessons learned as presented in this paper will be useful to the larger user community by providing guidelines that can assist programmers in the development of performance portable code.

  7. User's manual for the vertical axis wind turbine performance computer code DARTER

    Energy Technology Data Exchange (ETDEWEB)

    Klimas, P. C.; French, R. E.

    1980-05-01

    The computer code DARTER (DARrieus Turbine, Elemental Reynolds number) is an aerodynamic performance/loads prediction scheme based upon the conservation of momentum principle. It is the latest evolution in a sequence which began with a model developed by Templin of NRC, Canada, and progressed through the Sandia National Laboratories-developed SIMOSS (SImple MOmentum, Single Streamtube) and DART (DARrieus Turbine) to DARTER.

  8. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross-correlation (CC) and practical code length to support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also suppress the effect of phase-induced intensity noise (PIIN). In this paper, we propose new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC, based on the Jordan block matrix and constructed with simple algebraic methods. Four sets of DEU code families based on the code weight W and number of users N for the combinations (even, even), (even, odd), (odd, odd) and (odd, even) are constructed. This combination gives the DEU code more flexibility in the selection of code weight and number of users. These features make this code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms reported codes. In addition, simulation results taken from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU has better performance and could support longer spans with high data rates.

  9. Effect of beat noise on the performance of two-dimensional time-spreading/wavelength-hopping optical code-division multiple-access systems

    Science.gov (United States)

    Bazan, T.; Harle, D.; Andonovic, I.; Meenakshi, M.

    2005-03-01

    The effect of beat noise on optical code-division multiple-access (OCDMA) systems using a range of two-dimensional (2-D) time-spreading/wavelength-hopping (TW) code families is presented. A derivation of a general formula for the error probability of the system is given. The properties of the 2-D codes--namely, the structure, length, and cross-correlation characteristics--are found to have a great influence on system performance. Improved performance can be obtained by use of real-time dynamic thresholding.

  10. Development of an object-oriented simulation code for repository performance assessment

    International Nuclear Information System (INIS)

    Tsujimoto, Keiichi; Ahn, J.

    1999-01-01

    As understanding of the mechanisms of radioactivity confinement by a deep geologic repository improves at the individual process level, it has become imperative to evaluate the consequences of individual processes for the performance of the whole repository system. For this goal, the authors have developed a model for radionuclide transport in, and release from, the repository region by incorporating multiple-member decay chains and multiple waste canisters. A computer code has been developed with C++, an object-oriented language. By utilizing the feature that a geologic repository consists of thousands of objects of the same kind, such as the waste canister, the repository region is divided into multiple compartments and objects for simulation of radionuclide transport. Massive computational tasks are distributed over, and executed by, multiple networked workstations, with the help of parallel virtual machine (PVM) technology. The temporal change of the mass distribution of 28 radionuclides in the repository region over a time period of 100 million yr has been successfully obtained by the code

  11. Organic migration forms of radionuclides and performance assessment

    International Nuclear Information System (INIS)

    Xu Gouqing

    2010-01-01

    Much attention has been paid to inorganic migration forms of radionuclides in groundwater in previous performance assessments, while organic migration forms are seldom noted. This may call the confidence level of performance assessments into question. This paper mainly discusses the distribution of organic substances in groundwater and their potential effect on performance assessment. The results obtained in recent years show that clay rocks are generally impermeable to water, but in some cases interstitial water may be observed in them, and the concentrations of DOC, HA and FA are rather higher than those in granitic groundwater. The concentration of DOC is relatively low in granitic groundwater, but the effect of organic migration forms of radionuclides in granitic groundwater on performance assessment has not yet been finally determined, and it is necessary to make further investigations. (authors)

  12. Test Code Quality and Its Relation to Issue Handling Performance

    NARCIS (Netherlands)

    Athanasiou, D.; Nugroho, A.; Visser, J.; Zaidman, A.

    2014-01-01

    Automated testing is a basic principle of agile development. Its benefits include early defect detection, defect cause localization and removal of fear to apply changes to the code. Therefore, maintaining high quality test code is essential. This study introduces a model that assesses test code

  13. High efficiency video coding: coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at similar compressed bit rates as for HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in a clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  14. Performance of Different OCDMA Codes with FWM and XPM Nonlinear Effects

    Science.gov (United States)

    Rana, Shivani; Gupta, Amit

    2017-08-01

    In this paper, a 1 Gb/s non-linear optical code division multiple access system has been simulated and modeled. To reduce multiple user interference, a multi-diagonal (MD) code, which possesses the property of having zero cross-correlation, has been deployed. The MD code shows better results than Walsh-Hadamard and multi-weight codes under the nonlinear effects of four-wave mixing (FWM) and cross-phase modulation (XPM). The simulation results reveal that the effect of FWM is reduced when MD codes are employed as compared to other codes.

  15. In-core fuel management code package validation for BWRs

    International Nuclear Information System (INIS)

    1995-12-01

    The main goal of the present CRP (Coordinated Research Programme) was to develop benchmarks which are appropriate to check and improve the fuel management computer code packages and their procedures. Therefore, benchmark specifications were established which included a set of realistic data for running in-core fuel management codes. Secondly, the results of measurements and/or operating data were also provided to verify and compare with these parameters as calculated by the in-core fuel management codes or code packages. For the BWR it was established that the Mexican Laguna Verde 1 BWR would serve as the model for providing data on the benchmark specifications. It was decided to provide results for the first 2 cycles of Unit 1 of the Laguna Verde reactor. The analyses of the above benchmarks are performed in two stages. In the first stage, the lattice parameters are generated as a function of burnup at different voids and with and without control rod. These lattice parameters form the input for 3-dimensional diffusion theory codes for over-all reactor analysis. The lattice calculations were performed using different methods, such as Monte Carlo, 2-D integral transport theory methods, the Supercell Model, and a transport-diffusion model with proper correction for burnable absorber. Thus the variety of results should provide adequate information for any institute or organization to develop competence to analyze in-core fuel management codes. 15 refs, figs and tabs

  16. Development and Application of a Code for Internal Exposure (CINEX) based on the CINDY code

    International Nuclear Information System (INIS)

    Kravchik, T.; Duchan, N.; Sarah, R.; Gabay, Y.; Kol, R.

    2004-01-01

    Internal exposure to radioactive materials at the NRCN is evaluated using the CINDY (Code for Internal Dosimetry) package. The code was developed by the Pacific Northwest Laboratory to assist the interpretation of bioassay data, provide bioassay projections and evaluate committed and calendar-year doses from intake or bioassay measurement data. It provides capabilities to calculate organ dose and effective dose equivalents using the International Commission on Radiological Protection (ICRP) 30 approach. The CINDY code operates under the DOS operating system, and consequently its operation requires a relatively long procedure which also involves a lot of manual typing that can lead to human error. A new code has been developed at the NRCN, the CINEX (Code for Internal Exposure), which is an Excel application and leads to a significant reduction in calculation time (of the order of 5-10 times) and in the risk of human error. The code uses a database containing tables which were constructed with CINDY and contain the bioassay values predicted by the ICRP 30 model after an intake of a unit of activity of each isotope. Using the database, the code then calculates the appropriate intake and consequently the committed effective dose and organ dose. Calculations with the CINEX code were compared to similar calculations with the CINDY code. The discrepancies were less than 5%, which is the rounding error of the CINDY code. Attached is a table which compares parameters calculated with the CINEX and the CINDY codes (for class Y uranium). The CINEX is now used at the NRCN to calculate occupational intakes and doses for workers handling radioactive materials
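
    The calculation the abstract describes can be sketched as follows: a measured bioassay value divided by the model-predicted value per unit intake gives the intake, and the intake multiplied by a committed-dose coefficient gives the committed effective dose. The retention table and dose coefficient below are illustrative placeholders, not values from CINDY, CINEX or ICRP 30.

        # Sketch of the intake/dose evaluation described above. The retention table and
        # dose coefficient are illustrative placeholders, not CINDY/CINEX values.
        def estimate_intake(measured_bq: float, days_after_intake: float,
                            retention_per_unit_intake: dict[float, float]) -> float:
            """Intake (Bq) = measurement / model-predicted value per unit intake."""
            # pick the tabulated time closest to the measurement time
            t = min(retention_per_unit_intake, key=lambda d: abs(d - days_after_intake))
            return measured_bq / retention_per_unit_intake[t]

        # hypothetical table: fraction of a unit intake expected in the bioassay sample
        retention_table = {1.0: 2.0e-2, 7.0: 8.0e-3, 30.0: 2.5e-3, 180.0: 6.0e-4}
        dose_coefficient_sv_per_bq = 5.0e-7        # placeholder committed-dose factor

        intake = estimate_intake(measured_bq=1.2, days_after_intake=30.0,
                                 retention_per_unit_intake=retention_table)
        committed_dose_msv = intake * dose_coefficient_sv_per_bq * 1e3
        print(f"intake = {intake:.1f} Bq, committed dose = {committed_dose_msv:.3f} mSv")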

  17. LiveCode mobile development

    CERN Document Server

    Lavieri, Edward D

    2013-01-01

    A practical guide written in a tutorial style, "LiveCode Mobile Development Hotshot" walks you step-by-step through 10 individual projects. Every project is divided into sub-tasks to make learning more organized and easy to follow, with explanations, diagrams, screenshots, and downloadable material. This book is great for anyone who wants to develop mobile applications using LiveCode. You should be familiar with LiveCode and have access to a smartphone. You are not expected to know how to create graphics or audio clips.

  18. System performances of optical space code-division multiple-access-based fiber-optic two-dimensional parallel data link.

    Science.gov (United States)

    Nakamura, M; Kitayama, K

    1998-05-10

    Optical space code-division multiple access is a scheme to multiplex and link data between two-dimensional processors such as smart pixels and spatial light modulators or arrays of optical sources like vertical-cavity surface-emitting lasers. We examine the multiplexing characteristics of optical space code-division multiple access by using optical orthogonal signature patterns. The probability density function of interference noise in interfering optical orthogonal signature patterns is calculated. The bit-error rate is derived from the result and plotted as a function of receiver threshold, code length, code weight, and number of users. Furthermore, we propose a prethresholding method to suppress the interference noise, and we experimentally verify that the method works effectively in improving system performance.
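
    A hedged sketch of the kind of bit-error-rate expression referred to above: in a common chip-synchronous binomial approximation (not necessarily the exact derivation used in the paper), each of the N - 1 interferers hits the desired user's signature pattern with some probability p, and an error occurs when interference alone reaches the receiver threshold while the desired user sends a zero. The weight, pattern size and user count below are arbitrary illustrative values.

        # Hedged sketch: binomial (chip-synchronous) approximation of the BER versus
        # receiver threshold for a code-division scheme with on-off signature patterns.
        # p is the per-interferer hit probability; the 1/2 factor reflects errors
        # occurring only on transmitted zeros. A generic textbook approximation, not
        # the paper's exact model.
        from math import comb

        def ber(n_users: int, weight: int, positions: int, threshold: int) -> float:
            p = weight**2 / (2.0 * positions)     # approximate hit probability
            k = n_users - 1                       # number of interferers
            return 0.5 * sum(comb(k, i) * p**i * (1 - p)**(k - i)
                             for i in range(threshold, k + 1))

        for th in (1, 3, 5, 7):
            print(th, f"{ber(n_users=20, weight=7, positions=500, threshold=th):.2e}")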

  19. Anti-Counterfeiting Quick Response Code with Emission Color of Invisible Metal-Organic Frameworks as Encoding Information.

    Science.gov (United States)

    Wang, Yong-Mei; Tian, Xue-Tao; Zhang, Hui; Yang, Zhong-Rui; Yin, Xue-Bo

    2018-06-08

    Counterfeiting is a global epidemic that is compelling the development of new anti-counterfeiting strategies. Herein, we report a novel multiple anti-counterfeiting encoding strategy of invisible fluorescent quick response (QR) codes with emission color as the information storage unit. The strategy requires red, green, and blue (RGB) light-emitting materials for different emission colors as encrypting information, single excitation for all of the emissions for practicability, and ultraviolet (UV) excitation for invisibility under daylight. Therefore, RGB light-emitting nanoscale metal-organic frameworks (NMOFs) are designed as inks to construct the colorful light-emitting boxes for information encryption, while three black vertex boxes are used for positioning. Full-color emissions are obtained by mixing the trichromatic NMOF inks through an inkjet printer. The encrypted information capacity is easily adjusted by the number of light-emitting boxes with the infinite emission colors. The information is decoded with specific excitation light at 275 nm, making the QR codes invisible under daylight. The composition of the inks, invisibility, inkjet printing, and the abundant encrypted information all contribute to multiple anti-counterfeiting. The proposed QR code pattern holds great potential for advanced anti-counterfeiting.

  20. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved monte carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE

  1. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved monte carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE.

  2. Development of new two-dimensional spectral/spatial code based on dynamic cyclic shift code for OCDMA system

    Science.gov (United States)

    Jellali, Nabiha; Najjar, Monia; Ferchichi, Moez; Rezig, Houria

    2017-07-01

    In this paper, a new two-dimensional spectral/spatial code family, named two-dimensional dynamic cyclic shift (2D-DCS) codes, is introduced. The 2D-DCS codes are derived from the dynamic cyclic shift code for the spectral and spatial coding. The proposed system can fully eliminate the multiple access interference (MAI) by using the MAI cancellation property. The effects of shot noise, phase-induced intensity noise and thermal noise are used to analyze the code performance. In comparison with existing two-dimensional (2D) codes, such as 2D perfect difference (2D-PD), 2D Extended Enhanced Double Weight (2D-Extended-EDW) and 2D hybrid (2D-FCC/MDW) codes, the numerical results show that our proposed codes have the best performance. By keeping the same code length and increasing the spatial code, the performance of our 2D-DCS system is enhanced: it provides higher data rates while using lower transmitted power and a smaller spectral width.

  3. Lawrence Livermore National Laboratory Probabilistic Seismic Hazard Codes Validation

    International Nuclear Information System (INIS)

    Savy, J B

    2003-01-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time-period. LLNL has been developing the methodology and codes in support of the Nuclear Regulatory Commission (NRC) needs for reviews of site licensing of nuclear power plants since 1978. A number of existing computer codes have been validated, yet they can still lead to ranges of hazard estimates in some cases. Until now, the seismic hazard community had not agreed on any specific method for evaluation of these codes. The Earthquake Engineering Research Institute (EERI) and the Pacific Earthquake Engineering Research (PEER) Center organized an exercise in testing of existing codes with the aim of developing a series of standard tests that future developers could use to evaluate and calibrate their own codes. Seven code developers participated in the exercise, on a voluntary basis. Lawrence Livermore National Laboratory participated with some support from the NRC. The final product of the study will include a series of criteria for judging the validity of the results provided by a computer code. This EERI/PEER project was first planned to be completed by June of 2003. As the group neared completion of the tests, the managing team decided that new tests were necessary. As a result, the present report documents only the work performed to this point. It demonstrates that the computer codes developed by LLNL perform all calculations correctly and as intended. Differences exist between the results of the codes tested; these are attributed to a series of assumptions, on the parameters and models, that the developers had to make. The managing team is planning a new series of tests to help in reaching a consensus on these assumptions

  4. AECL international standard problem ISP-41 FU/1 follow-up exercise (Phase 1): Containment Iodine Computer Code Exercise: Parametric Studies

    International Nuclear Information System (INIS)

    Ball, J.; Glowa, G.; Wren, J.; Ewig, F.; Dickenson, S.; Billarand, Y.; Cantrel, L.; Rydl, A.; Royen, J.

    2001-06-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I- concentration. The codes used in this exercise were IODE (IPSN), IODE (NRIR), IMPAIR (GRS), INSPECT (AEAT), IMOD (AECL) and LIRIC (AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN facility (IPSN, France) experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (authors)

  5. Overview of the geochemical code MINTEQ: applications to performance assessment for low-level wastes

    International Nuclear Information System (INIS)

    Graham, M.J.; Peterson, S.R.

    1985-09-01

    The MINTEQ geochemical computer code, developed at Pacific Northwest Laboratory, integrates many of the capabilities of its two immediate predecessors, WATEQ3 and MINEQL. MINTEQ can be used to perform the calculations necessary to simulate (model) the contact of low-level waste solutions with heterogeneous sediments or the interaction of ground water with solidified low-level wastes. The code is capable of performing calculations of ion speciation/solubility, adsorption, oxidation-reduction, gas phase equilibria, and precipitation/dissolution of solid phases. Under the Special Waste Form Lysimeters-Arid program, the composition of effluents (leachates) from column and batch experiments, using laboratory-scale waste forms, will be used to develop a geochemical model of the interaction of ground water with commercial solidified low-level wastes. The wastes being evaluated include power reactor waste streams that have been solidified in cement, vinyl ester-styrene, and bitumen. The thermodynamic database for the code is being upgraded before the geochemical modeling is performed. Thermodynamic data for cobalt, antimony, cerium, and cesium solid phases and aqueous species are being added to the database. The need to add these data was identified from the characterization of the waste streams. The geochemical model developed from the laboratory data will then be applied to predict the release from a field-lysimeter facility that contains full-scale waste samples. The contaminant concentrations migrating from the wastes predicted using MINTEQ will be compared to the long-term lysimeter data. This comparison will constitute a partial field validation of the geochemical model. 28 refs

  6. The coevolution of genes and genetic codes: Crick's frozen accident revisited.

    Science.gov (United States)

    Sella, Guy; Ardell, David H

    2006-09-01

    The standard genetic code is the nearly universal system for the translation of genes into proteins. The code exhibits two salient structural characteristics: it possesses a distinct organization that makes it extremely robust to errors in replication and translation, and it is highly redundant. The origin of these properties has intrigued researchers since the code was first discovered. One suggestion, which is the subject of this review, is that the code's organization is the outcome of the coevolution of genes and genetic codes. In 1968, Francis Crick explored the possible implications of coevolution at different stages of code evolution. Although he argues that coevolution was likely to influence the evolution of the code, he concludes that it falls short of explaining the organization of the code we see today. The recent application of mathematical modeling to study the effects of errors on the course of coevolution, suggests a different conclusion. It shows that coevolution readily generates genetic codes that are highly redundant and similar in their error-correcting organization to the standard code. We review this recent work and suggest that further affirmation of the role of coevolution can be attained by investigating the extent to which the outcome of coevolution is robust to other influences that were present during the evolution of the code.

  7. Selection of a computer code for Hanford low-level waste engineered-system performance assessment. Revision 1

    International Nuclear Information System (INIS)

    McGrail, B.P.; Bacon, D.H.

    1998-02-01

    Planned performance assessments for the proposed disposal of low-activity waste (LAW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. The available computer codes with suitable capabilities at the time Revision 0 of this document was prepared were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LAW glass corrosion and the mobility of radionuclides. This analysis was repeated in this report but updated to include additional processes that have been found to be important since Revision 0 was issued and to include additional codes that have been released. The highest ranked computer code was found to be the STORM code developed at PNNL for the US Department of Energy for evaluation of arid land disposal sites

  8. PERFORMANCE MANAGEMENT APPROACHES IN ECONOMIC ORGANIZATIONS USING INFORMATION TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Anca Mehedintu

    2012-03-01

    Full Text Available Performance management includes activities that ensure that goals are consistently being met in an effective and efficient manner. Performance management can focus on the performance of an organization, a department, employee, or even the processes to build a product or service, as well as many other areas. In these days of globalization and intensive use of information technology, the organizations must define and implement an appropriate strategy that would support their medium-term development, stability and competitiveness. This is achieved through a coherent and interrelated set of activities for understanding the customer expectations and the level at which the offer of organization add value to customers and satisfy their needs, define their internal organization to allow timely response to market demands without losing focus on client, tracking strategy and business model for the accomplishment of the organization mission, aligning the existing IT project management or under development implementation in organization with the strategic management of organization etc. Strategic Management determines the improvement of processes, effective use of resources, focus on critical areas in terms of finance, creating opportunities for innovation and technological progress, improvement of the supply mechanism and the duty to promote personal interaction and negotiation at all levels, continuous assessment of organization and its technological trends, analyze the market potential and competence field etc. Strategic management system will not give good results if the strategy is not defined by a set of operational objectives clearly at all levels. Business performance is based on a set of analytical processes of business, supported by information technology that defines the strategic goals that can be measured by performance indicators. Enterprise Performance Management creates a powerful and precise environment, characterized by data consistency, efficiency analysis

  9. Prediction Capability of SPACE Code about the Loop Seal Clearing on ATLAS SBLOCA

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Sung Won; Lee, Jong Hyuk; Chung, Bub Dong; Kim, Kyung Doo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2016-10-15

    The most probable break size for loop seal reforming was determined to be 4 inches by pre-calculations conducted with RELAP5 and MARS. Many organizations have participated with various system analysis codes: for example, RELAP5, MARS, and TRACE. KAERI also participated with the SPACE code. The SPACE code has been developed for use in the design and safety analysis of nuclear thermal-hydraulic systems. KHNP and other organizations have collaborated over the last 10 years, and the code is currently under certification procedures. SPACE has the capability to analyze the droplet field with a full governing equation set: continuity, momentum, and energy. The SPACE code has participated in the PKL-3 benchmark program as an international activity. The DSP-04 benchmark problem is also an application of SPACE as part of the domestic activities. The cold leg top slot break accident of the APR1400 reactor has been modeled and examined with the SPACE code. The benchmark experiment, as part of the DSP-04 program, has been performed with the ATLAS facility. The break size has been selected as 4 inches in APR1400, and the corresponding scaled-down break size has been modeled in the SPACE code. Loop seal reforming occurred at all 4 loops, but the PCT showed no significant behavior.

  10. A self-organized internal models architecture for coding sensory-motor schemes

    Directory of Open Access Journals (Sweden)

    Esaú Escobar Juárez

    2016-04-01

    Full Text Available Cognitive robotics research draws inspiration from theories and models on cognition, as conceived by neuroscience or cognitive psychology, to investigate biologically plausible computational models in artificial agents. In this field, the theoretical framework of Grounded Cognition provides epistemological and methodological grounds for the computational modeling of cognition. It has been stressed in the literature that simulation, prediction, and multi-modal integration are key aspects of cognition and that computational architectures capable of putting them into play in a biologically plausible way are a necessity. Research in this direction has brought extensive empirical evidence suggesting that Internal Models are suitable mechanisms for sensory-motor integration. However, current Internal Models architectures show several drawbacks, mainly due to the lack of a unified substrate allowing for a true sensory-motor integration space, enabling flexible and scalable ways to model cognition under the embodiment hypothesis constraints. We propose the Self-Organized Internal Models Architecture (SOIMA), a computational cognitive architecture coded by means of a network of self-organized maps, implementing coupled internal models that allow modeling multi-modal sensory-motor schemes. Our approach addresses integrally the issues of current implementations of Internal Models. We discuss the design and features of the architecture, and provide empirical results on a humanoid robot that demonstrate the benefits and potentialities of the SOIMA concept for studying cognition in artificial agents.
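
    As an illustration of the substrate SOIMA builds on (not of the SOIMA architecture itself), the sketch below performs one Kohonen self-organizing-map training step: find the best-matching unit for an input vector and pull the weights of its grid neighbourhood toward that input. The map size, learning rate and neighbourhood width are arbitrary illustrative values.

        # Minimal Kohonen self-organizing map update, shown only to illustrate the
        # self-organized-map substrate, not the SOIMA architecture itself.
        import numpy as np

        rng = np.random.default_rng(0)
        grid_w, grid_h, dim = 10, 10, 3              # 10x10 map of 3-D weight vectors
        weights = rng.random((grid_w, grid_h, dim))
        coords = np.stack(np.meshgrid(np.arange(grid_w), np.arange(grid_h),
                                      indexing="ij"), axis=-1)

        def som_step(x, lr=0.1, sigma=2.0):
            """One training step: locate the best-matching unit, update its neighbourhood."""
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(dists.argmin(), dists.shape)   # best-matching unit
            grid_d2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
            h = np.exp(-grid_d2 / (2 * sigma**2))                 # neighbourhood kernel
            weights[...] += lr * h[..., None] * (x - weights)

        for _ in range(1000):
            som_step(rng.random(3))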

  11. Code on the safety of nuclear power plants: Governmental organization

    International Nuclear Information System (INIS)

    1988-01-01

    This Code recommends requirements for a regulatory body responsible for regulating the siting, design, construction, commissioning, operation and decommissioning of nuclear power plants for safety. It forms part of the Agency's programme for establishing Codes and Safety Guides relating to land based stationary thermal neutron power plants

  12. Informational Closed-Loop Coding-Decoding Control Concept as the Base of the Living or Organized Systems Theory

    Science.gov (United States)

    Kirvelis, Dobilas; Beitas, Kastytis

    2008-10-01

    The aim of this work is to show that the essence of life and living systems is their organization as bioinformational technology based on informational anticipatory control. Principal paradigmatic and structural schemes of the functional organization of life (organisms and their systems) are constructed on the basis of systemic analysis and synthesis of the main phenomenological features of the living world. Life is based on functional elements that implement engineering procedures of closed-loop coding-decoding control (CL-CDC). The phenomenon of natural bioinformational control appeared and developed on Earth 3-4 billion years ago, when life originated as a result of chemical and later biological evolution. The informatics paradigm considers the physical and chemical transformations of energy and matter in organized systems as flows that are controlled, and the signals as means for purposive informational control programs. Social and technical technological systems as informational control systems are a later phenomenon, engineered by man. Information emerges in organized systems as a necessary component of control technology. Generalized schemes of functional organization at the levels of the cell, the organism and the brain neocortex, as the highest biosystem with CL-CDC, are presented. The CL-CDC concept expands the understanding of bioinformatics.

  13. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with regulatory auditing codes have been accomplished for the establishment of a self-reliable, technology-based regulatory auditing system. A unified auditing system code was also realized by implementing CANDU-specific models and correlations. As part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through exercises of plant application. Education-training seminars and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications.

  14. Performance Modeling and Optimization of a High Energy CollidingBeam Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-06-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems errors are higher, at 29%, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.
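
    The following is a rough, hedged sketch of the kind of analytical model described: the time of an all-to-all personalized exchange estimated from a latency/bandwidth term plus a node-adapter contention term. All constants are placeholder assumptions, not the micro-benchmark values measured in the paper.

      def aapc_time(p, msg_bytes, latency=5e-6, bandwidth=1.0e9,
                    cores_per_node=4, adapter_bandwidth=2.0e9):
          """Crude all-to-all personalized communication (AAPC) cost model.

          p                 -- number of processes
          msg_bytes         -- bytes each process sends to every other process
          latency           -- per-message startup cost in seconds        [assumed]
          bandwidth         -- per-link bandwidth in bytes/s              [assumed]
          adapter_bandwidth -- shared node-adapter bandwidth in bytes/s   [assumed]
          """
          # baseline: p-1 point-to-point messages per process
          per_link = (p - 1) * (latency + msg_bytes / bandwidth)
          # contention: traffic from all cores on a node funnels through one adapter
          node_traffic = cores_per_node * (p - 1) * msg_bytes
          adapter = node_traffic / adapter_bandwidth
          return max(per_link, adapter)

      # example: 1024 processes exchanging 8 kB with each partner
      print(f"estimated AAPC time: {aapc_time(1024, 8192):.4f} s")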

  15. Performance Modeling and Optimization of a High Energy Colliding Beam Simulation Code

    International Nuclear Information System (INIS)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-01-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems errors are higher, at 29%, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.

  16. A model of R-D performance evaluation for Rate-Distortion-Complexity evaluation of H.264 video coding

    DEFF Research Database (Denmark)

    Wu, Mo; Forchhammer, Søren

    2007-01-01

    This paper considers a method for evaluation of the Rate-Distortion-Complexity (R-D-C) performance of video coding. A statistical model of the transformed coefficients is used to estimate the Rate-Distortion (R-D) performance. A model framework for the rate, distortion and slope of the R-D curve for inter and intra frames is presented. Assumptions are given for analyzing an R-D model for fast R-D-C evaluation. The theoretical expressions are combined with H.264 video coding, and confirmed by experimental results. The complexity framework is applied to the integer motion estimation.
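
    As a hedged illustration of what a statistical R-D model provides, the sketch below evaluates the textbook Gaussian-source distortion-rate bound D(R) = sigma^2 * 2^(-2R) and its slope; this is a generic example, not the specific coefficient model of the paper.

      import math

      def distortion(rate_bits, variance=1.0):
          """Classic distortion-rate bound for a Gaussian source: D(R) = sigma^2 * 2**(-2R)."""
          return variance * 2.0 ** (-2.0 * rate_bits)

      def rd_slope(rate_bits, variance=1.0):
          """Slope dD/dR of the R-D curve, used to trade rate against distortion."""
          return -2.0 * math.log(2.0) * distortion(rate_bits, variance)

      for r in (0.5, 1.0, 2.0, 4.0):
          print(f"R={r:.1f} bits  D={distortion(r):.4f}  slope={rd_slope(r):.4f}")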

  17. Performance Measures of Diagnostic Codes for Detecting Opioid Overdose in the Emergency Department.

    Science.gov (United States)

    Rowe, Christopher; Vittinghoff, Eric; Santos, Glenn-Milo; Behar, Emily; Turner, Caitlin; Coffin, Phillip O

    2017-04-01

    Opioid overdose mortality has tripled in the United States since 2000 and opioids are responsible for more than half of all drug overdose deaths, which reached an all-time high in 2014. Opioid overdoses resulting in death, however, represent only a small fraction of all opioid overdose events and efforts to improve surveillance of this public health problem should include tracking nonfatal overdose events. International Classification of Disease (ICD) diagnosis codes, increasingly used for the surveillance of nonfatal drug overdose events, have not been rigorously assessed for validity in capturing overdose events. The present study aimed to validate the use of ICD, 9th revision, Clinical Modification (ICD-9-CM) codes in identifying opioid overdose events in the emergency department (ED) by examining multiple performance measures, including sensitivity and specificity. Data on ED visits from January 1, 2012, to December 31, 2014, including clinical determination of whether the visit constituted an opioid overdose event, were abstracted from electronic medical records for patients prescribed long-term opioids for pain from any of six safety net primary care clinics in San Francisco, California. Combinations of ICD-9-CM codes were validated in the detection of overdose events as determined by medical chart review. Both sensitivity and specificity of different combinations of ICD-9-CM codes were calculated. Unadjusted logistic regression models with robust standard errors and accounting for clustering by patient were used to explore whether overdose ED visits with certain characteristics were more or less likely to be assigned an opioid poisoning ICD-9-CM code by the documenting physician. Forty-four (1.4%) of 3,203 ED visits among 804 patients were determined to be opioid overdose events. Opioid-poisoning ICD-9-CM codes (E850.2-E850.2, 965.00-965.09) identified overdose ED visits with a sensitivity of 25.0% (95% confidence interval [CI] = 13.6% to 37.8%) and
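
    The performance measures examined in such studies follow directly from a two-by-two confusion matrix. The sketch below computes them in Python; the counts are illustrative only (chosen so that 11 of 44 true overdose visits are flagged, matching the reported 25.0% sensitivity), not the study's actual cross-tabulation.

      def diagnostic_performance(tp, fp, fn, tn):
          """Standard performance measures for a binary diagnostic code."""
          sensitivity = tp / (tp + fn)        # proportion of true overdose visits flagged by the code
          specificity = tn / (tn + fp)        # proportion of non-overdose visits not flagged
          ppv = tp / (tp + fp) if tp + fp else float("nan")   # positive predictive value
          npv = tn / (tn + fn) if tn + fn else float("nan")   # negative predictive value
          return sensitivity, specificity, ppv, npv

      # illustrative counts only: 44 true overdose visits out of 3,203 as in the study,
      # but the split between flagged and unflagged visits below is hypothetical
      sens, spec, ppv, npv = diagnostic_performance(tp=11, fp=5, fn=33, tn=3154)
      print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%} NPV={npv:.1%}")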

  18. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Burr Alister

    2009-01-01

    Full Text Available This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and the turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.
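
    A minimal sketch of the subband-adaptive idea: each subband is assigned the densest constellation whose SNR threshold it clears. The thresholds below are hypothetical placeholders; in the paper they would depend on the turbo code rate and the chosen BER target.

      # (modulation name, bits per symbol, minimum SNR in dB) -- thresholds are illustrative assumptions
      SCHEMES = [("64QAM", 6, 22.0), ("16QAM", 4, 16.0), ("8AMPM", 3, 12.0),
                 ("QPSK", 2, 7.0), ("BPSK", 1, 3.0)]

      def pick_modulation(snr_db):
          """Return the densest constellation whose (assumed) SNR threshold is met."""
          for name, bits, threshold in SCHEMES:
              if snr_db >= threshold:
                  return name, bits
          return "no transmission", 0

      def allocate_subbands(subband_snrs_db):
          """Assign a modulation scheme to every subband of the OFDM symbol."""
          return [pick_modulation(snr) for snr in subband_snrs_db]

      # example: a frequency-selective channel seen across 8 subbands
      print(allocate_subbands([25.1, 18.3, 9.5, 4.2, 1.0, 14.7, 20.0, 6.8]))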

  19. ETR/ITER systems code

    Energy Technology Data Exchange (ETDEWEB)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L. (ed.)

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.

  20. ETR/ITER systems code

    International Nuclear Information System (INIS)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs

  1. CBP Phase I Code Integration

    International Nuclear Information System (INIS)

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-01-01

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown and Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown and Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface

  2. CBP PHASE I CODE INTEGRATION

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-09-30

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface was

  3. Modular organization and hospital performance.

    Science.gov (United States)

    Kuntz, Ludwig; Vera, Antonio

    2007-02-01

    The concept of modularization represents a modern form of organization, which contains the vertical disaggregation of the firm and the use of market mechanisms within hierarchies. The objective of this paper is to examine whether the use of modular structures has a positive effect on hospital performance. The empirical section makes use of multiple regression analyses and leads to the main result that modularization does not have a positive effect on hospital performance. However, the analysis also finds out positive efficiency effects of two central ideas of modularization, namely process orientation and internal market mechanisms.

  4. Twelve gordian knots when developing an organizational code of ethics

    NARCIS (Netherlands)

    Kaptein, Muel; Wempe, Johan

    1998-01-01

    Following the example of the many organizations in the United States which have a code of ethics, an increasing interest on the part of companies, trade organizations, (semi-)governmental organizations and professions in the Netherlands to develop codes of ethics can be witnessed. We have been able

  5. The Relationship between Learning Organization Dimensions and Library Performance

    Science.gov (United States)

    Haley, Qing Kong

    2010-01-01

    The purpose of this research was to examine the relationship between learning organization dimensions and academic library performance. It studied whether differences existed in learning organization dimensions given the predictor variables of performance indicators, library resources, and demographics of the academic library. This research…

  6. Impact of the Revised Malaysian Code on Corporate Governance on Audit Committee Attributes and Firm Performance

    OpenAIRE

    KALLAMU, Basiru Salisu

    2016-01-01

    Abstract. Using a sample of 37 finance companies listed under the finance segment of Bursa Malaysia, we examined the impact of the revision to Malaysian code on corporate governance on audit committee attributes and firm performance. Our result suggests that audit committee attributes significantly improved after the Code was revised. In addition, the coefficient for audit committee and risk committee interlock has a significant negative relationship with Tobin’s Q in the period before the re...

  7. Performance and complexity of tunable sparse network coding with gradual growing tuning functions over wireless networks

    OpenAIRE

    Garrido Ortiz, Pablo; Sørensen, Chres W.; Lucani Roetter, Daniel Enrique; Agüero Calvo, Ramón

    2016-01-01

    Random Linear Network Coding (RLNC) has been shown to be a technique with several benefits, in particular when applied over wireless mesh networks, since it provides robustness against packet losses. On the other hand, Tunable Sparse Network Coding (TSNC) is a promising concept, which leverages a trade-off between computational complexity and goodput. An optimal density tuning function has not been found yet, due to the lack of a closed-form expression that links density, performance and comp...
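
    As a hedged illustration of a "gradually growing" tuning function, the sketch below lets the coding density rise with the fraction of degrees of freedom already collected at the receiver. The functional form and constants are assumptions for illustration; the abstract's point is precisely that the optimal function is not yet known.

      def coding_density(received_dof, generation_size, d_min=0.05, d_max=1.0, growth=3.0):
          """Return the fraction of packets mixed into each coded packet.

          received_dof    -- degrees of freedom already decoded at the receiver
          generation_size -- packets per generation
          d_min, d_max    -- sparsest and densest allowed mixing fractions   [assumed]
          growth          -- how aggressively density rises near the end     [assumed]
          """
          progress = min(received_dof / generation_size, 1.0)
          return d_min + (d_max - d_min) * progress ** growth

      # density stays low (cheap encoding/decoding) early on and approaches full RLNC
      # density only when the last few degrees of freedom become the bottleneck
      for dof in (0, 8, 16, 24, 30, 32):
          print(dof, round(coding_density(dof, 32), 3))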

  8. [Comparative study of three Western models of deontological codes for dentists].

    Science.gov (United States)

    Macpherson Mayol, Ignacio; Roqué Sánchez, María Victoria; Gonzalvo-Cirac, Margarita; de Ribot, Eduard

    2013-01-01

    We performed a comparative analysis of the codes of ethics of three official organizations in dental professional ethics: the Code of Ethics for Dentists in the European Union, drawn up by the Council of European Dentists (CED); the Código Español de Ética y Deontología Dental, published by the Consejo General de Colegios de Odontólogos y Estomatólogos de España (CGCOE); and the Principles of Ethics and Code of Professional Conduct of the American Dental Association (ADA). The analysis of the structure of the codes reveals different approaches to professional ethics according to the ethical and legislative tradition from which they derive. While there are common elements inherent in Western culture, there are nuances in the grounds, layout and wording of the articles that allow one to deduce the ethical foundations underlying each code, and that reflect the real problems encountered by dentists in the practice of their profession.

  9. Software Design Document for the AMP Nuclear Fuel Performance Code

    International Nuclear Information System (INIS)

    Philip, Bobby; Clarno, Kevin T.; Cochran, Bill

    2010-01-01

    The purpose of this document is to describe the design of the AMP nuclear fuel performance code. It provides an overview of the decomposition into separable components, an overview of what those components will do, and the strategic basis for the design. The primary components of a computational physics code include a user interface, physics packages, material properties, mathematics solvers, and computational infrastructure. Some capability from established off-the-shelf (OTS) packages will be leveraged in the development of AMP, but the primary physics components will be entirely new. The material properties required by these physics operators include many highly non-linear properties, which will be replicated from FRAPCON and LIFE where applicable, as well as some computationally-intensive operations, such as gap conductance, which depends upon the plenum pressure. Because there is extensive capability in off-the-shelf leadership class computational solvers, AMP will leverage the Trilinos, PETSc, and SUNDIALS packages. The computational infrastructure includes a build system, mesh database, and other building blocks of a computational physics package. The user interface will be developed through a collaborative effort with the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Capability Transfer program element as much as possible and will be discussed in detail in a future document.

  10. Assessment of MARMOT. A Mesoscale Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Tonks, M. R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Zhang, Y. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chakraborty, P. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bai, X. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Fromm, B. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yu, J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Teague, M. C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andersson, D. A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-04-01

    MARMOT is the mesoscale fuel performance code under development as part of the US DOE Nuclear Energy Advanced Modeling and Simulation Program. In this report, we provide a high-level summary of MARMOT, its capabilities, and its current state of validation. The purpose of MARMOT is to predict the coevolution of microstructure and material properties of nuclear fuel and cladding. It accomplishes this using the phase field method coupled to solid mechanics and heat conduction. MARMOT is based on the Multiphysics Object-Oriented Simulation Environment (MOOSE), and much of its basic capability in the areas of the phase field method, mechanics, and heat conduction comes directly from MOOSE modules. However, additional capability specific to fuel and cladding is available in MARMOT. While some validation of MARMOT has been completed in the areas of fission gas behavior and grain growth, much more validation needs to be conducted. However, new mesoscale data need to be obtained in order to complete this validation.

  11. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, Peter Andrew

    2011-12-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This Report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  12. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC)

    International Nuclear Information System (INIS)

    Schultz, Peter Andrew

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M and S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V and V) is required throughout the system to establish evidence-based metrics for the level of confidence in M and S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This Report outlines the nature of the V and V challenge at the subcontinuum scale, an approach to incorporate V and V concepts into subcontinuum scale modeling and simulation (M and S), and a plan to incrementally incorporate effective V and V into subcontinuum scale M and S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  13. Performance Based Plastic Design of Concentrically Braced Frame attuned with Indian Standard code and its Seismic Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Sejal Purvang Dalal

    2015-12-01

    Full Text Available In the Performance Based Plastic Design method, the failure mode is predetermined, which has made the method well known throughout the world. But due to the lack of proper guidelines and a simple stepwise methodology, it is not very popular in India. In this paper, a stepwise design procedure for Performance Based Plastic Design of a Concentrically Braced Frame attuned with the Indian Standard code is presented. A comparative seismic performance evaluation of a six storey concentrically braced frame designed using the displacement based Performance Based Plastic Design (PBPD) method and the currently used force based Limit State Design (LSD) method has also been carried out by nonlinear static pushover analysis and time history analysis under three different ground motions. Results show that the Performance Based Plastic Design method is superior to the current design in terms of displacement and acceleration response. Also, total collapse of the frame is prevented in the PBPD frame.

  14. Combining independent de novo assemblies optimizes the coding transcriptome for nonconventional model eukaryotic organisms.

    Science.gov (United States)

    Cerveau, Nicolas; Jackson, Daniel J

    2016-12-09

    Next-generation sequencing (NGS) technologies are arguably the most revolutionary technical development to join the list of tools available to molecular biologists since PCR. For researchers working with nonconventional model organisms one major problem with the currently dominant NGS platform (Illumina) stems from the obligatory fragmentation of nucleic acid material that occurs prior to sequencing during library preparation. This step creates a significant bioinformatic challenge for accurate de novo assembly of novel transcriptome data. This challenge becomes apparent when a variety of modern assembly tools (of which there is no shortage) are applied to the same raw NGS dataset. With the same assembly parameters these tools can generate markedly different assembly outputs. In this study we present an approach that generates an optimized consensus de novo assembly of eukaryotic coding transcriptomes. This approach does not represent a new assembler, rather it combines the outputs of a variety of established assembly packages, and removes redundancy via a series of clustering steps. We test and validate our approach using Illumina datasets from six phylogenetically diverse eukaryotes (three metazoans, two plants and a yeast) and two simulated datasets derived from metazoan reference genome annotations. All of these datasets were assembled using three currently popular assembly packages (CLC, Trinity and IDBA-tran). In addition, we experimentally demonstrate that transcripts unique to one particular assembly package are likely to be bioinformatic artefacts. For all eight datasets our pipeline generates more concise transcriptomes that in fact possess more unique annotatable protein domains than any of the three individual assemblers we employed. Another measure of assembly completeness (using the purpose built BUSCO databases) also confirmed that our approach yields more information. Our approach yields coding transcriptome assemblies that are more likely to be
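
    A toy sketch of the redundancy-removal idea: transcripts from several assemblers are pooled and exact duplicates or fully contained fragments are dropped, longest-first. Real pipelines cluster at a chosen identity threshold with dedicated tools; this brute-force version is only meant to make the clustering step concrete, and the sequences are invented.

      def merge_assemblies(*assemblies):
          """Pool transcripts from several assemblers and drop redundant ones.

          Each assembly is a dict {transcript_id: sequence}. A transcript is treated as
          redundant if an identical or longer transcript containing it is already kept.
          """
          pooled = [(tid, seq) for asm in assemblies for tid, seq in asm.items()]
          # longest first, so shorter contained fragments are discarded
          pooled.sort(key=lambda item: len(item[1]), reverse=True)
          kept = {}
          for tid, seq in pooled:
              if not any(seq in kept_seq for kept_seq in kept.values()):
                  kept[tid] = seq
          return kept

      clc     = {"clc_1": "ATGGCTAGCTAGGATCC", "clc_2": "ATGAAACCC"}
      trinity = {"tri_1": "GCTAGCTAGGATCC", "tri_2": "ATGAAACCCGGT"}
      idba    = {"idba_1": "ATGGCTAGCTAGGATCC"}
      print(sorted(merge_assemblies(clc, trinity, idba)))   # contained/duplicate entries removed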

  15. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Lei Ye

    2009-01-01

    Full Text Available This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.

  16. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes derived from multiple component codes. We then show that several recently proposed classes of LDPC codes, such as convolutional and spatially-coupled codes, can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.
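
    To make the notion of a local (component) code concrete, the sketch below checks the bits attached to one generalized check node against the parity-check matrix of a (7,4) Hamming code: the super-check is satisfied exactly when the local syndrome is zero. This is a toy constraint check, not the MAP component decoding used in the paper.

      import numpy as np

      # Parity-check matrix of the (7,4) Hamming code used as a local (component) code
      H_LOCAL = np.array([[1, 0, 1, 0, 1, 0, 1],
                          [0, 1, 1, 0, 0, 1, 1],
                          [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

      def local_syndrome(bits):
          """Syndrome of the 7 bits attached to one generalized (super) check node."""
          return H_LOCAL.dot(np.asarray(bits, dtype=np.uint8)) % 2

      def satisfies_local_code(bits):
          """A GLDPC super-check is satisfied when its local Hamming syndrome is zero."""
          return not local_syndrome(bits).any()

      print(satisfies_local_code([0, 0, 0, 0, 0, 0, 0]))   # True  (all-zero codeword)
      print(satisfies_local_code([1, 1, 0, 0, 0, 0, 0]))   # False (error pattern)
      print(satisfies_local_code([1, 1, 1, 0, 0, 0, 0]))   # True  (weight-3 Hamming codeword)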

  17. Data exchange between zero dimensional code and physics platform in the CFETR integrated system code

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Guoliang [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Shi, Nan [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China); Zhou, Yifu; Mao, Shifeng [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Jian, Xiang [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, School of Electrical and Electronics Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Chen, Jiale [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China); Liu, Li; Chan, Vincent [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Ye, Minyou, E-mail: yemy@ustc.edu.cn [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China)

    2016-11-01

    Highlights: • The workflow of the zero dimensional code and the multi-dimension physics platform of the CFETR integrated system code is introduced. • The iteration process among the codes in the physics platform. • The data transfer between the zero dimensional code and the physical platform, including data iteration and validation, and justification of performance parameters. - Abstract: The China Fusion Engineering Test Reactor (CFETR) integrated system code contains three parts: a zero dimensional code, a physics platform and an engineering platform. We use the zero dimensional code to identify a set of preliminary physics and engineering parameters for CFETR, which is used as input to initiate multi-dimension studies using the physics and engineering platform for design, verification and validation. Effective data exchange between the zero dimensional code and the physical platform is critical for the optimization of the CFETR design. For example, in evaluating the impact of impurity radiation on core performance, an open field line code is used to calculate the impurity transport from the first-wall boundary to the pedestal. The impurity particles in the pedestal are used as boundary conditions in a transport code for calculating impurity transport in the core plasma and the impact of core radiation on core performance. Comparison of the results from the multi-dimensional study to those from the zero dimensional code is used to further refine the controlled radiation model. The data transfer between the zero dimensional code and the physical platform, including data iteration and validation, and justification of performance parameters, will be presented in this paper.
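
    A schematic sketch of the data-exchange loop described above: a zero dimensional scoping model proposes an operating point, a stand-in for the multi-dimensional platform re-evaluates one derived quantity, and the exchange repeats until the two agree. All functions and numbers are placeholders; the real platform couples equilibrium, transport and edge codes.

      def zero_d_model(radiated_fraction):
          """Stand-in 0-D scoping model: returns a trial operating point."""
          fusion_power = 1000.0 * (1.0 - radiated_fraction)   # MW, illustrative scaling only
          return {"fusion_power_MW": fusion_power, "radiated_fraction": radiated_fraction}

      def physics_platform(point):
          """Stand-in for the multi-dimensional platform: re-evaluates core radiation."""
          # pretend the detailed impurity-transport result pulls the radiated fraction
          # toward 0.3 regardless of the 0-D guess (purely illustrative behaviour)
          return 0.5 * point["radiated_fraction"] + 0.5 * 0.3

      def iterate(initial_guess=0.1, tol=1e-4, max_iter=50):
          """Fixed-point iteration between the 0-D code and the physics platform."""
          frac = initial_guess
          for i in range(max_iter):
              point = zero_d_model(frac)
              updated = physics_platform(point)
              if abs(updated - frac) < tol:
                  return point, i
              frac = updated
          raise RuntimeError("0-D / platform iteration did not converge")

      point, iterations = iterate()
      print(point, "after", iterations, "iterations")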

  18. Performance Analysis of a De-correlated Modified Code Tracking Loop for Synchronous DS-CDMA System under Multiuser Environment

    Science.gov (United States)

    Wu, Ya-Ting; Wong, Wai-Ki; Leung, Shu-Hung; Zhu, Yue-Sheng

    This paper presents the performance analysis of a De-correlated Modified Code Tracking Loop (D-MCTL) for synchronous direct-sequence code-division multiple-access (DS-CDMA) systems under multiuser environment. Previous studies have shown that the imbalance of multiple access interference (MAI) in the time lead and time lag portions of the signal causes tracking bias or instability problem in the traditional correlating tracking loop like delay lock loop (DLL) or modified code tracking loop (MCTL). In this paper, we exploit the de-correlating technique to combat the MAI at the on-time code position of the MCTL. Unlike applying the same technique to DLL which requires an extensive search algorithm to compensate the noise imbalance which may introduce small tracking bias under low signal-to-noise ratio (SNR), the proposed D-MCTL has much lower computational complexity and exhibits zero tracking bias for the whole range of SNR, regardless of the number of interfering users. Furthermore, performance analysis and simulations based on Gold codes show that the proposed scheme has better mean square tracking error, mean-time-to-lose-lock and near-far resistance than the other tracking schemes, including traditional DLL (T-DLL), traditional MCTL (T-MCTL) and modified de-correlated DLL (MD-DLL).

  19. Public management: organizations, governance, and performance

    National Research Council Canada - National Science Library

    O'Toole, Laurence J; Meier, Kenneth J

    2011-01-01

    ...? Can their separable contributions to performance be estimated? The fate of public policies in today's world lies in the hands of public organizations, which in turn are often intertwined with others in latticed patterns of governance...

  20. Design and performance of coded aperture optical elements for the CESR-TA x-ray beam size monitor

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, J.P.; Chatterjee, A.; Conolly, C.; Edwards, E.; Ehrlichman, M.P. [Cornell University, Ithaca, NY 14853 (United States); Flanagan, J.W. [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Department of Accelerator Science, Graduate University for Advanced Studies (SOKENDAI), Tsukuba (Japan); Fontes, E. [Cornell University, Ithaca, NY 14853 (United States); Heltsley, B.K., E-mail: bkh2@cornell.edu [Cornell University, Ithaca, NY 14853 (United States); Lyndaker, A.; Peterson, D.P.; Rider, N.T.; Rubin, D.L.; Seeley, R.; Shanks, J. [Cornell University, Ithaca, NY 14853 (United States)

    2014-12-11

    We describe the design and performance of optical elements for an x-ray beam size monitor (xBSM), a device measuring e+ and e− beam sizes in the CESR-TA storage ring. The device can measure vertical beam sizes of 10–100 μm on a turn-by-turn, bunch-by-bunch basis at e± beam energies of ∼2–5 GeV. X-rays produced by a hard-bend magnet pass through a single- or multiple-slit (coded aperture) optical element onto a detector. The coded aperture slit pattern and the thickness of masking material forming that pattern can both be tuned for optimal resolving power. We describe several such optical elements and show how well predictions of simple models track measured performances. - Highlights: • We characterize the optical element performance of an e± x-ray beam size monitor. • We standardize beam size resolving power measurements to reference conditions. • Standardized resolving power measurements compare favorably to model predictions. • Key model features include simulation of photon-counting statistics and image fitting. • Results validate a coded aperture design optimized for the x-ray spectrum encountered.

  1. Performance and Complexity Co-evaluation of the Advanced Video Coding Standard for Cost-Effective Multimedia Communications

    Directory of Open Access Journals (Sweden)

    Saponara Sergio

    2004-01-01

    Full Text Available The advanced video codec (AVC) standard, recently defined by a joint video team (JVT) of ITU-T and ISO/IEC, is introduced in this paper together with its performance and complexity co-evaluation. While the basic framework is similar to the motion-compensated hybrid scheme of previous video coding standards, additional tools improve the compression efficiency at the expense of an increased implementation cost. As a first step to bridge the gap between the algorithmic design of a complex multimedia system and its cost-effective realization, a high-level co-evaluation approach is proposed and applied to a real-life AVC design. An exhaustive analysis of the codec compression efficiency versus complexity (memory and computational costs) project space is carried out at the early algorithmic design phase. If all new coding features are used, the improved AVC compression efficiency (up to 50% compared to current video coding technology) comes with a complexity increase of a factor of 2 for the decoder and larger than one order of magnitude for the encoder. This represents a challenge for resource-constrained multimedia systems such as wireless devices or high-volume consumer electronics. The analysis also highlights important properties of the AVC framework allowing for complexity reduction at the high system level: when combining the new coding features, the implementation complexity accumulates, while the global compression efficiency saturates. Thus, a proper use of the AVC tools maintains the same performance as the most complex configuration while considerably reducing complexity. The reported results provide inputs to assist the profile definition in the standard, highlight the AVC bottlenecks, and select optimal trade-offs between algorithmic performance and complexity.

  2. Performance Evaluation of HARQ Technique with UMTS Turbo Code

    Directory of Open Access Journals (Sweden)

    S. S. Brkić

    2011-11-01

    Full Text Available The hybrid automatic repeat request (HARQ) technique is an error control principle which combines an error correcting code and an automatic repeat request (ARQ) procedure within the same transmission system. In this paper, using a Monte Carlo simulation process, the characteristics of the HARQ technique are determined for the case of the Universal Mobile Telecommunication System (UMTS) turbo code.
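
    A minimal Monte Carlo sketch of the ARQ side of the technique: each word is retransmitted until it is decoded or a retransmission budget is exhausted, and throughput and residual error are estimated by simulation. The per-attempt word error probability is an assumed placeholder standing in for the turbo-decoded channel, and the combining gain of true HARQ is ignored in this simplified model.

      import random

      def simulate_arq(word_error_prob=0.2, max_transmissions=4, n_words=100_000, seed=42):
          """Monte Carlo estimate of throughput and residual word error rate.

          Each attempt is assumed to fail independently with probability word_error_prob
          (a placeholder for the turbo-decoded word error rate); real HARQ with soft
          combining would lower the error probability on every retransmission.
          """
          rng = random.Random(seed)
          transmissions = 0
          failures = 0
          for _ in range(n_words):
              delivered = False
              for _attempt in range(max_transmissions):
                  transmissions += 1
                  if rng.random() > word_error_prob:   # word decoded correctly
                      delivered = True
                      break
              if not delivered:
                  failures += 1                        # retransmission budget exhausted
          return n_words / transmissions, failures / n_words

      throughput, residual_wer = simulate_arq()
      print(f"throughput ~ {throughput:.3f} words/transmission, residual WER ~ {residual_wer:.1e}")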

  3. Changing Metrics of Organ Procurement Organization Performance in Order to Increase Organ Donation Rates in the United States.

    Science.gov (United States)

    Goldberg, D; Kallan, M J; Fu, L; Ciccarone, M; Ramirez, J; Rosenberg, P; Arnold, J; Segal, G; Moritsugu, K P; Nathan, H; Hasz, R; Abt, P L

    2017-12-01

    The shortage of deceased-donor organs is compounded by donation metrics that fail to account for the total pool of possible donors, leading to ambiguous donor statistics. We sought to assess potential metrics of organ procurement organizations (OPOs) utilizing data from the Nationwide Inpatient Sample (NIS) from 2009-2012 and State Inpatient Databases (SIDs) from 2008-2014. A possible donor was defined as a ventilated inpatient death ≤75 years of age, without multi-organ system failure, sepsis, or cancer, whose cause of death was consistent with organ donation. These estimates were compared to patient-level data from chart review from two large OPOs. Among 2,907,658 inpatient deaths from 2009-2012, 96,028 (3.3%) were a "possible deceased-organ donor." The two proposed metrics of OPO performance were: (1) donation percentage (percentage of possible deceased-donors who become actual donors; range: 20.0-57.0%); and (2) organs transplanted per possible donor (range: 0.52-1.74). These metrics allow for comparisons of OPO performance and geographic-level donation rates, and identify areas in greatest need of interventions to improve donation rates. We demonstrate that administrative data can be used to identify possible deceased donors in the US and could be a data source for CMS to implement new OPO performance metrics in a standardized fashion. © 2017 The American Society of Transplantation and the American Society of Transplant Surgeons.
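
    Both proposed metrics are simple ratios over the estimated pool of possible donors and can be computed directly from counts; the numbers in the example below are illustrative, not data from the study.

      def opo_metrics(possible_donors, actual_donors, organs_transplanted):
          """Compute the two proposed OPO performance metrics."""
          donation_percentage = 100.0 * actual_donors / possible_donors
          organs_per_possible_donor = organs_transplanted / possible_donors
          return donation_percentage, organs_per_possible_donor

      # illustrative counts for one hypothetical OPO service area
      pct, opd = opo_metrics(possible_donors=500, actual_donors=180, organs_transplanted=560)
      print(f"donation percentage = {pct:.1f}%, organs per possible donor = {opd:.2f}")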

  4. Impact of optical hard limiter on the performance of an optical overlapped-code division multiple access system

    Science.gov (United States)

    Inaty, Elie; Raad, Robert; Tablieh, Nicole

    2011-08-01

    In this paper, a closed-form expression for the multiple access interference (MAI) limited bit error rate (BER) is provided for the multiwavelength optical code-division multiple-access system when the system is working above the nominal transmission rate limit imposed by the passive encoding-decoding operation. This system is known in the literature as the optical overlapped code division multiple access (OV-CDMA) system. A unified analytical framework is presented emphasizing the impact of an optical hard limiter (OHL) on the BER performance of such a system. Results show that the performance of the OV-CDMA system may be highly improved when using OHL preprocessing at the receiver side.

  5. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes.

  6. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes

  7. The relationship between family orientation, organization context, organization structure and firm performance

    NARCIS (Netherlands)

    Meijaard, J.; Uhlaner, L.M.

    2004-01-01

    This study focuses on the prediction of three firm performance indicators, sales growth, innovation performance and profitability, on a sample of small and medium-sized firms in the Netherlands. Predictions from agency theory and the resource based view of organizations lead to alternate hypotheses

  8. An overview of the geochemical code MINTEQ: Applications to performance assessment for low-level wastes

    International Nuclear Information System (INIS)

    Peterson, S.R.; Opitz, B.E.; Graham, M.J.; Eary, L.E.

    1987-03-01

    The MINTEQ geochemical computer code, developed at the Pacific Northwest Laboratory (PNL), integrates many of the capabilities of its two immediate predecessors, MINEQL and WATEQ3. The MINTEQ code will be used in the Special Waste Form Lysimeters-Arid program to perform the calculations necessary to simulate (model) the contact of low-level waste solutions with heterogeneous sediments or the interaction of ground water with solidified low-level wastes. The code can calculate ion speciation/solubility, adsorption, oxidation-reduction, gas phase equilibria, and precipitation/dissolution of solid phases. Under the Special Waste Form Lysimeters-Arid program, the composition of effluents (leachates) from column and batch experiments, using laboratory-scale waste forms, will be used to develop a geochemical model of the interaction of ground water with commercial, solidified low-level wastes. The wastes being evaluated include power-reactor waste streams that have been solidified in cement, vinyl ester-styrene, and bitumen. The thermodynamic database for the code was upgraded preparatory to performing the geochemical modeling. Thermodynamic data for solid phases and aqueous species containing Sb, Ce, Cs, or Co were added to the MINTEQ database. The need to add these data was identified from the characterization of the waste streams. The geochemical model developed from the laboratory data will then be applied to predict the release from a field-lysimeter facility that contains full-scale waste samples. The contaminant concentrations migrating from the waste forms predicted using MINTEQ will be compared to the long-term lysimeter data. This comparison will constitute a partial field validation of the geochemical model.

  9. Governance codes: facts or fictions? a study of governance codes in colombia1,2

    Directory of Open Access Journals (Sweden)

    Julián Benavides Franco

    2010-10-01

    Full Text Available This article studies the effects on accounting performance and financing decisions of Colombian firms after issuing a corporate governance code. We assemble a database of Colombian issuers and test the hypotheses of improved performance and higher leverage after issuing a code. The results show that the firms' return on assets after the code introduction improves by more than 1%; the effect is amplified by the code quality. Additionally, the firms' leverage increased by more than 5% when the code quality was factored into the analysis. These results suggest that the controlling parties' commitment to self-restraint, by reducing their private benefits and/or the expropriation of non-controlling parties through the code introduction, is indeed an effective measure, and that the financial markets agree, increasing the supply of funds to the firms.

  10. Comparative performance evaluation of transform coding in image pre-processing

    Science.gov (United States)

    Menon, Vignesh V.; NB, Harikrishnan; Narayanan, Gayathri; CK, Niveditha

    2017-07-01

    We are in the midst of a communication transformation which drives the development, as well as the dissemination, of innovative communication systems with ever-increasing fidelity and resolution. Considerable research has been devoted to image processing techniques, driven by a growing demand for faster and easier encoding, storage and transmission of visual information. In this paper, the researchers intend to throw light on techniques which can be used at the transmitter end in order to ease the transmission and reconstruction of images. The researchers investigate the performance of different image transform coding schemes used in pre-processing, their comparison and effectiveness, the necessary and sufficient conditions, and their properties and implementation complexity. Motivated by prior advancements in image processing techniques, the researchers compare various contemporary image pre-processing frameworks - Compressed Sensing, Singular Value Decomposition, and the Integer Wavelet Transform - on performance. The paper exposes the potential of the Integer Wavelet Transform to be an efficient pre-processing scheme.
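
    As a concrete example of one of the compared pre-processing schemes, the sketch below forms a rank-k approximation of a grayscale image matrix with the Singular Value Decomposition. The synthetic test image and the chosen rank are arbitrary assumptions.

      import numpy as np

      def svd_compress(image, k):
          """Keep only the k largest singular values of a grayscale image matrix."""
          u, s, vt = np.linalg.svd(image, full_matrices=False)
          approx = (u[:, :k] * s[:k]) @ vt[:k, :]
          # storage for the truncated factors vs. the raw pixels
          stored = k * (image.shape[0] + image.shape[1] + 1)
          ratio = stored / image.size
          return approx, ratio

      # synthetic 64x64 test "image" with smooth structure plus a little noise
      rng = np.random.default_rng(0)
      x = np.linspace(0, 1, 64)
      image = np.outer(np.sin(4 * np.pi * x), np.cos(2 * np.pi * x)) + 0.05 * rng.standard_normal((64, 64))

      approx, ratio = svd_compress(image, k=8)
      mse = np.mean((image - approx) ** 2)
      print(f"rank-8 approximation: storage ratio {ratio:.2f}, MSE {mse:.5f}")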

  11. A fast and compact Fuel Rod Performance Simulator code for predictive, interpretive and educational purpose

    International Nuclear Information System (INIS)

    Lorenzen, J.

    1990-01-01

    A new Fuel Rod Performance Simulator code, FRPS, has been developed, tested and benchmarked and is now available in different versions. The user may choose between the batch version INTERPIN, which produces results in the form of listings or predefined plots, and the interactive simulator code SIMSIM, which steps through a power history under the control of the user. Both versions are presently running on minicomputers and PCs using EGA graphics. A third version is the implementation in a Studsvik Compact Simulator, with FRPS being one of its various modules and receiving the dynamic inputs from the simulator.

  12. Experimental demonstration of the transmission performance for LDPC-coded multiband OFDM ultra-wideband over fiber system

    Science.gov (United States)

    He, Jing; Wen, Xuejie; Chen, Ming; Chen, Lin; Su, Jinshu

    2015-01-01

    To improve the transmission performance of multiband orthogonal frequency division multiplexing (MB-OFDM) ultra-wideband (UWB) over optical fiber, a pre-coding scheme based on low-density parity-check (LDPC) codes is adopted and experimentally demonstrated in an intensity-modulation and direct-detection MB-OFDM UWB over fiber system. Meanwhile, a symbol synchronization and pilot-aided channel estimation scheme is implemented in the receiver of the MB-OFDM UWB over fiber system. The experimental results show that the LDPC pre-coding scheme can work effectively in the MB-OFDM UWB over fiber system. After 70 km standard single-mode fiber (SSMF) transmission, at a bit error rate of 1 × 10-3, the receiver sensitivity is improved by about 4 dB when the LDPC code rate is 75%.

  13. Code comparison results for the loft LP-FP-2 experiment

    International Nuclear Information System (INIS)

    Merilo, M.; Mecham, D.C.

    1991-01-01

    Computer code calculations are compared with thermal-hydraulic and fission product release, transport, and deposition data obtained from the OECD-LOFT LP-FP-2 experiment. Except for the MAAP code, which is a fully integrated severe accident code, the thermal-hydraulic and fission product behavior were calculated with different codes. Six organizations participated in the thermal-hydraulic portion of the code comparison exercise. These calculations were performed with RELAP5, SCDAP/RELAP5, and MAAP. The comparisons show generally well developed capabilities to determine the thermal-hydraulic conditions during the early stages of a severe core damage accident. Four participants submitted detailed fission product behavior calculations. Except for MAAP, as stated previously, the fission product inventory, core damage, fission product release, transport and deposition were calculated independently with different codes. Much larger differences than observed for the thermal-hydraulic comparison were evident. The fission product inventory calculations were generally in good agreement with each other. Large differences were observed for release fractions and amounts of deposition. Net release calculations from the primary system were generally accurate within a factor of two or three for the more important fission products.

  14. Performance Evaluation of Wavelet-Coded OFDM on a 4.9 Gbps W-Band Radio-over-Fiber Link

    DEFF Research Database (Denmark)

    Cavalcante, Lucas Costa Pereira; Rommel, Simon; Dinis, Rui

    2017-01-01

    Future generation mobile communications running on mm-wave frequencies will require great robustness against frequency selective channels. In this work we evaluate the transmission performance of 4.9 Gbps Wavelet-Coded OFDM signals on a 10 km fiber plus 58 m wireless Radio-over-Fiber link using a mm-wave radio frequency carrier. The results show that a 2×128 Wavelet-Coded OFDM system achieves a bit-error rate of 1e-4 with nearly 2.5 dB less signal-to-noise ratio than a convolutional coded OFDM system with equivalent spectral efficiency for 8 GHz-wide signals with 512 sub-carriers on a carrier...

  15. Current Status of the LIFE Fast Reactors Fuel Performance Codes

    International Nuclear Information System (INIS)

    Yacout, A.M.; Billone, M.C.

    2013-01-01

    The LIFE-4 (Rev. 1) code was calibrated and validated using data from (U,Pu)O2 mixed-oxide fuel pins and UO2 blanket rods which were irradiation tested under steady-state and transient conditions. It integrates a broad material and fuel-pin irradiation database into a consistent framework for use and extrapolation of the database to reactor design applications. The code is available and running on different computer platforms (UNIX and PC), and detailed documentation of the code's models, routines, and calibration and validation data sets is available. The LIFE-METAL code is based on LIFE-4 with modifications to include key phenomena applicable to metallic fuel, and metallic fuel properties; it was calibrated with a large database from irradiations in EBR-II, and further effort is planned for calibration and detailed documentation. Recent activities with the codes are related to reactor design studies and support of licensing efforts for the 4S and KAERI SFR designs. Future activities are related to re-assessment of the codes' calibration and validation and the inclusion of models for advanced fuels (transmutation fuels).

  16. Comparison of safety assessment codes for near-surface disposal of LILW with the compartment model: SAGE and VR-KHNP

    International Nuclear Information System (INIS)

    Kim, H. J.; Park, J. W.; Park, J. B.; Kim, C. L.

    2004-01-01

    The Safety Assessment Groundwater Evaluation (SAGE) and Virtual Repository for KHNP (VR-KHNP) codes for performance assessment of the LILW disposal repository were developed through joint collaboration between KHNP and foreign consulting organizations. In both codes, the disposal facility consists of a series of compartments that represent the waste form, the engineered barrier system, and the unsaturated and saturated zones.

  17. Round robin performance testing of organic photovoltaic devices

    DEFF Research Database (Denmark)

    Gevorgyan, Suren; Zubillaga, Oihana; de Seoane, José María Vega

    2014-01-01

    This study addresses the issue of poor intercomparability of measurements of organic photovoltaic (OPV) devices among different laboratories. We present a round robin performance testing of novel OPV devices among 16 laboratories, organized within the framework of European Research Infrastructure...

  18. Cooperative Coding and Caching for Streaming Data in Multihop Wireless Networks

    Directory of Open Access Journals (Sweden)

    Liu Jiangchuan

    2010-01-01

    Full Text Available This paper studies the distributed caching managements for the current flourish of the streaming applications in multihop wireless networks. Many caching managements to date use a randomized network coding approach, which provides an elegant solution for ubiquitous data accesses in such systems. However, the encoding, essentially a combination operation, makes the coded data difficult to be changed. In particular, to accommodate new data, the system may have to first decode all the combined data segments, remove some unimportant ones, and then re-encode the data segments again. This procedure is clearly expensive for continuously evolving data storage. As such, we introduce a novel Cooperative Coding and Caching (C3) scheme, which allows decoding-free data removal through a triangle-like codeword organization. Its decoding performance is very close to the conventional network coding with only a sublinear overhead. Our scheme offers a promising solution to the caching management for streaming data.
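
    One plausible reading of the "triangle-like codeword organization" is sketched below as a toy illustration (plain real arithmetic instead of finite-field operations, and not the actual C3 construction): if coded block i mixes only data segments 1..i, the newest segment can be removed by discarding the last coded block, with no decoding or re-encoding of the remaining blocks.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 4
      data = rng.integers(0, 256, size=(n, 8)).astype(float)           # four cached data segments
      coeff = np.tril(rng.integers(1, 5, size=(n, n))).astype(float)   # lower-triangular mixing matrix
      coded = coeff @ data                                             # stored coded blocks

      # drop the newest segment: discard the last coded block, nothing is decoded
      coded_short, coeff_short = coded[:-1], coeff[:-1, :-1]
      recovered = np.linalg.solve(coeff_short, coded_short)            # remaining segments still decodable
      print(np.allclose(recovered, data[:-1]))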

  19. Rate-adaptive BCH coding for Slepian-Wolf coding of highly correlated sources

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Salmistraro, Matteo; Larsen, Knud J.

    2012-01-01

    This paper considers using BCH codes for distributed source coding using feedback. The focus is on coding using short block lengths for a binary source, X, having a high correlation between each symbol to be coded and a side information, Y, such that the marginal probability of each symbol Xi in X, given Y, is highly skewed. In the analysis, noiseless feedback and noiseless communication are assumed. A rate-adaptive BCH code is presented and applied to distributed source coding. Simulation results for a fixed error probability show that rate-adaptive BCH achieves better performance than LDPCA (Low-Density Parity-Check Accumulate) codes for high correlation between the source symbols and the side information.

  20. Zebra: An advanced PWR lattice code

    Energy Technology Data Exchange (ETDEWEB)

    Cao, L.; Wu, H.; Zheng, Y. [School of Nuclear Science and Technology, Xi'an Jiaotong Univ., No. 28, Xianning West Road, Xi'an, Shaanxi, 710049 (China)

    2012-07-01

    This paper presents an overview of an advanced PWR lattice code, ZEBRA, developed at the NECP laboratory at Xi'an Jiaotong University. The multi-group cross-section library is generated from the ENDF/B-VII library by NJOY and the 361-group SHEM structure is employed. The resonance calculation module is developed based on the sub-group method. The transport solver is the Auto-MOC code, a self-developed code based on the Method of Characteristics and the customization of AutoCAD software. The whole code is well organized in a modular software structure. Some numerical results obtained during the validation of the code demonstrate that the code has good precision and high efficiency. (authors)

  1. Zebra: An advanced PWR lattice code

    International Nuclear Information System (INIS)

    Cao, L.; Wu, H.; Zheng, Y.

    2012-01-01

    This paper presents an overview of an advanced PWR lattice code, ZEBRA, developed at the NECP laboratory at Xi'an Jiaotong University. The multi-group cross-section library is generated from the ENDF/B-VII library by NJOY and the 361-group SHEM structure is employed. The resonance calculation module is developed based on the sub-group method. The transport solver is the Auto-MOC code, a self-developed code based on the Method of Characteristics and the customization of AutoCAD software. The whole code is well organized in a modular software structure. Some numerical results obtained during the validation of the code demonstrate that the code has good precision and high efficiency. (authors)

  2. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  3. Organ dose conversion coefficients based on a voxel mouse model and MCNP code for external photon irradiation.

    Science.gov (United States)

    Zhang, Xiaomin; Xie, Xiangdong; Cheng, Jie; Ning, Jing; Yuan, Yong; Pan, Jie; Yang, Guoshan

    2012-01-01

    A set of conversion coefficients from kerma free-in-air to organ absorbed dose for external photon beams from 10 keV to 10 MeV is presented, based on a newly developed voxel mouse model, for the purpose of radiation effect evaluation. The voxel mouse model was developed from colour images of successive cryosections of a normal nude male mouse, in which 14 organs or tissues were segmented manually and filled with different colours, each colour being tagged with a specific ID number for implementation of the mouse model in the Monte Carlo N-Particle code (MCNP). Monte Carlo simulation with MCNP was carried out to obtain organ dose conversion coefficients for 22 external monoenergetic photon beams between 10 keV and 10 MeV under five different irradiation geometries (left lateral, right lateral, dorsal-ventral, ventral-dorsal, and isotropic). The organ dose conversion coefficients are presented in tables and compared with published data based on a rat model to investigate the effect of body size and weight on the organ dose. The calculated results and comparisons show that the organ dose conversion coefficients exhibit a similar trend as a function of photon energy for most organs, except for the bone and skin, and that the organ dose is sensitive to body size and weight at photon energies below approximately 0.1 MeV.
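
    For orientation, the quantity tabulated in such studies is the organ absorbed dose per unit kerma free-in-air. The toy calculation below uses made-up placeholder numbers (not values from the paper) to show how such a coefficient would be formed from a tallied energy deposition.

      def dose_conversion_coefficient(energy_dep_MeV, organ_mass_g, kerma_free_in_air_Gy):
          # organ absorbed dose (Gy = J/kg) per unit kerma free-in-air, i.e. Gy/Gy
          MeV_to_J = 1.602176634e-13
          organ_dose_Gy = energy_dep_MeV * MeV_to_J / (organ_mass_g * 1.0e-3)
          return organ_dose_Gy / kerma_free_in_air_Gy

      # hypothetical liver tally from a Monte Carlo run (placeholder numbers)
      print(dose_conversion_coefficient(energy_dep_MeV=2.0e6,
                                        organ_mass_g=1.5,
                                        kerma_free_in_air_Gy=1.2e-4))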

  4. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Control modules C4, C6

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U. S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume is part of the manual related to the control modules for the newest updated version of this computational package.

  5. Coded communications with nonideal interleaving

    Science.gov (United States)

    Laufer, Shaul

    1991-02-01

    Burst error channels - a type of block interference channels - feature increasing capacity but decreasing cutoff rate as the memory rate increases. Despite the large capacity, there is degradation in the performance of practical coding schemes when the memory length is excessive. A short-coding error parameter (SCEP) was introduced, which expresses a bound on the average decoding-error probability for codes shorter than the block interference length. The performance of a coded slow frequency-hopping communication channel is analyzed for worst-case partial band jamming and nonideal interleaving, by deriving expressions for the capacity and cutoff rate. The capacity and cutoff rate, respectively, are shown to approach and depart from those of a memoryless channel corresponding to the transmission of a single code letter per hop. For multiaccess communications over a slot-synchronized collision channel without feedback, the channel was considered as a block interference channel with memory length equal to the number of letters transmitted in each slot. The effects of an asymmetrical background noise and a reduced collision error rate were studied, as aspects of real communications. The performance of specific convolutional and Reed-Solomon codes was examined for slow frequency-hopping systems with nonideal interleaving. An upper bound is presented for the performance of a Viterbi decoder for a convolutional code with nonideal interleaving, and a soft decision diversity combining technique is introduced.

  6. Measuring Student Performance in General Organic Chemistry

    Science.gov (United States)

    Austin, Ara C.; Ben-Daat, Hagit; Zhu, Mary; Atkinson, Robert; Barrows, Nathan; Gould, Ian R.

    2015-01-01

    Student performance in general organic chemistry courses is determined by a wide range of factors including cognitive ability, motivation and cultural capital. Previous work on cognitive factors has tended to focus on specific areas rather than exploring performance across all problem types and cognitive skills. In this study, we have categorized…

  7. Distributed space-time coding

    CERN Document Server

    Jing, Yindi

    2014-01-01

    Distributed Space-Time Coding (DSTC) is a cooperative relaying scheme that enables high reliability in wireless networks. This brief presents the basic concept of DSTC, its achievable performance, generalizations, code design, and differential use. Recent results on training design and channel estimation for DSTC and the performance of training-based DSTC are also discussed.

  8. Performance evaluations of advanced massively parallel platforms based on gyrokinetic toroidal five-dimensional Eulerian code GT5D

    International Nuclear Information System (INIS)

    Idomura, Yasuhiro; Jolliet, Sebastien

    2010-01-01

    A gyrokinetic toroidal five dimensional Eulerian code GT5D is ported to six advanced massively parallel platforms and comprehensive benchmark tests are performed. A parallelisation technique based on physical properties of the gyrokinetic equation is presented. By extending the parallelisation technique with a hybrid parallel model, the scalability of the code is improved on platforms with multi-core processors. In the benchmark tests, good scalability is confirmed up to several thousand cores on all platforms, and a maximum sustained performance of ∼18.6 Tflops is achieved using 16384 cores of BX900. (author)

  9. THREEDANT: A code to perform three-dimensional, neutral particle transport calculations

    International Nuclear Information System (INIS)

    Alcouffe, R.E.

    1994-01-01

    The THREEDANT code solves the three-dimensional neutral particle transport equation in its first-order, multigroup, discrete ordinates form. The code allows an unlimited number of groups (depending upon the cross section set), angular quadrature up to S-100, and unlimited Pn order, again depending upon the cross section set. The code has three options for spatial differencing: diamond with set-to-zero fixup, adaptive weighted diamond, and linear modal. The geometry options are XYZ and RZΘ, with a special XYZ option based upon a volume fraction method. This allows objects or bodies of any shape to be modelled as input, which gives the code as much geometric description flexibility as the Monte Carlo code MCNP. The transport equation is solved by source iteration accelerated by the DSA method. Both inner and outer iterations are so accelerated. Some results are presented which demonstrate the effectiveness of these techniques. The code is available on several types of computing platforms.
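
    As a much-reduced illustration of the solution strategy described above, the sketch below implements unaccelerated source iteration with diamond differencing for a one-group, one-dimensional slab with isotropic scattering and vacuum boundaries. This is an assumption-laden toy, not the THREEDANT implementation, and it omits the DSA acceleration that the code applies to both inner and outer iterations.

      import numpy as np

      def source_iteration(sigma_t, sigma_s, q, width, nx=100, n_angles=8, tol=1e-8, max_it=500):
          # unaccelerated source iteration, diamond-difference in space, S_N in angle
          dx = width / nx
          mu, w = np.polynomial.legendre.leggauss(n_angles)   # quadrature weights sum to 2
          phi = np.zeros(nx)
          for it in range(max_it):
              src = 0.5 * (sigma_s * phi + q)                 # isotropic emission density
              phi_new = np.zeros(nx)
              for m, wm in zip(mu, w):
                  psi_in = 0.0                                # vacuum boundary
                  cells = range(nx) if m > 0 else range(nx - 1, -1, -1)
                  for i in cells:
                      psi_c = (src[i] * dx + 2 * abs(m) * psi_in) / (sigma_t * dx + 2 * abs(m))
                      psi_in = 2 * psi_c - psi_in             # diamond-difference closure
                      phi_new[i] += wm * psi_c
              if np.max(np.abs(phi_new - phi)) < tol * np.max(np.abs(phi_new)):
                  return phi_new, it + 1
              phi = phi_new
          return phi, max_it

      phi, iters = source_iteration(sigma_t=1.0, sigma_s=0.9, q=np.ones(100), width=10.0)
      print(f"converged in {iters} iterations; midplane flux ~ {phi[50]:.3f}")

    With a scattering ratio of 0.9 the plain iteration above converges slowly; a diffusion synthetic acceleration step after each sweep is what makes production codes practical for highly scattering problems.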

  10. Assessment of the system code DRUFAN/ATHLET using results of LOBI tests

    International Nuclear Information System (INIS)

    Burwell, J.M.; Kirmse, R.E.; Kyncl, M.; Malhotra, P.K.

    1989-09-01

    Four post-test analyses have been performed by GRS within the Shared Cost Action Programme (SCAP) sponsored by the Commission of the European Communities (contract 3015-86-07 EL ISP D) and by the Bundesminister fuer Forschung und Technologie of the Federal Republic of Germany (Research project RS 739). The four tests were mutually selected by the contractors (CEA, GRS, IKE, Univ. Pisa) of activity No. 3 and by the project organizer. Some of the tests were selected to be analyzed by more than one participant in order to allow comparison between analytical results obtained with different codes or obtained by different code users. DRUFAN/ATHLET verification analyses were also performed by IKE. The four tests selected for the GRS activity are: - A2-77A (Natural Circulation Test), analysis with ATHLET - A1-76 (Steam Generator Performance Test), analysis with DRUFAN - BL-01 (Intermediate Leak), analysis with ATHLET - A2-81 (Small Leak), analysis with ATHLET. This final report contains the results of the four post-test analyses, including the comparison between measured and calculated quantities, the description of the applied codes, the selected model of the LOBI facility and the conclusions drawn for the improvement of the codes' models.

  11. PERFORMANCE IN ORGANIZATIONS IN A HUMAN RESOURCE PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    LOGOFĂTU MONICA

    2017-08-01

    Full Text Available In the present turbulent financial and economic conditions, a major challenge for the general management of organizations, and in particular for strategic human resource management, is to establish a clear, coherent and consistent framework for measuring organizational performance and economic efficiency. This paper aims to conduct an exploratory review of the literature concerning the measurement of organizational performance. Based on the results of this research, the paper proposes a multi-dimensional model for measuring organizational performance, providing a mechanism that will allow quantification of performance based on selected criteria. The model attempts to eliminate the inconsistencies and incongruities of organizational effectiveness models developed by specialists from the organization theory area, performance measurement models developed by specialists from the management accounting area, and models for measuring efficiency and effectiveness developed by specialists from the strategic management and entrepreneurship areas.

  12. MEASURING PERFORMANCE IN ORGANIZATIONS FROM MULTI-DIMENSIONAL PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    ȘTEFĂNESCU CRISTIAN

    2017-08-01

    Full Text Available In the present turbulent financial and economic conditions, a major challenge for the general management of organizations, and in particular for strategic human resource management, is to establish a clear, coherent and consistent framework for measuring organizational performance and economic efficiency. This paper aims to conduct an exploratory review of the literature concerning the measurement of organizational performance. Based on the results of this research, the paper proposes a multi-dimensional model for measuring organizational performance, providing a mechanism that will allow quantification of performance based on selected criteria. The model attempts to eliminate the inconsistencies and incongruities of organizational effectiveness models developed by specialists from the organization theory area, performance measurement models developed by specialists from the management accounting area, and models for measuring efficiency and effectiveness developed by specialists from the strategic management and entrepreneurship areas.

  13. Performance-based building codes: a call for injury prevention indicators that bridge health and building sectors.

    Science.gov (United States)

    Edwards, N

    2008-10-01

    The international introduction of performance-based building codes calls for a re-examination of indicators used to monitor their implementation. Indicators used in the building sector have a business orientation, target the life cycle of buildings, and guide asset management. In contrast, indicators used in the health sector focus on injury prevention, have a behavioural orientation, lack specificity with respect to features of the built environment, and do not take into account patterns of building use or building longevity. Suggestions for metrics that bridge the building and health sectors are discussed. The need for integrated surveillance systems in health and building sectors is outlined. It is time to reconsider commonly used epidemiological indicators in the field of injury prevention and determine their utility to address the accountability requirements of performance-based codes.

  14. Performance of Multilevel Coding Schemes with Different Decoding Methods and Mapping Strategies in Mobile Fading Channels

    Institute of Scientific and Technical Information of China (English)

    YUAN Dongfeng; WANG Chengxiang; YAO Qi; CAO Zhigang

    2001-01-01

    Based on the "capacity rule", the performance of multilevel coding (MLC) schemes with different set partitioning strategies and decoding methods in AWGN and Rayleigh fading channels is investigated, in which BCH codes are chosen as component codes and 8ASK modulation is used. Numerical results indicate that the MLC scheme with the UP strategy can obtain optimal performance in AWGN channels and BP is the best mapping strategy for Rayleigh fading channels. The BP strategy shows good robustness in both kinds of channels for realizing an optimum MLC system. Multistage decoding (MSD) is a sub-optimal decoding method of MLC for both channels. For the Ungerboeck partitioning (UP) and mixed partitioning (MP) strategies, MSD is strongly recommended for the MLC system, while for the BP strategy, PDL is suggested as a simple decoding method compared with MSD.

  15. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    a computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other is a mix of a computer programming syntax and human language. In this sense queer code can be understood as both an object and subject of study that intervenes in the world’s ‘becoming' and how material bodies are produced via human and nonhuman practices. Through mixing the natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  16. MIMO-OFDM System's Performance Using LDPC Codes for a Mobile Robot

    Science.gov (United States)

    Daoud, Omar; Alani, Omar

    This work deals with the performance of a Sniffer Mobile Robot (SNFRbot)-based spatially multiplexed wireless Orthogonal Frequency Division Multiplexing (OFDM) transmission technology. The use of Multi-Input Multi-Output (MIMO)-OFDM technology increases the wireless transmission rate without increasing transmission power or bandwidth. A generic multilayer architecture of the SNFRbot is proposed with low power and low cost. Some experimental results are presented and show the efficiency of sniffing deadly gases, sensing high temperatures and sending live videos of the monitored situation. Moreover, simulation results show the performance achieved by tackling the Peak-to-Average Power Ratio (PAPR) problem of the used technology using Low Density Parity Check (LDPC) codes, and the effect of combating the PAPR on the bit error rate (BER) and the signal-to-noise ratio (SNR) over a Doppler spread channel.

  17. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    Science.gov (United States)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
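
    The core idea above, compressing a block by its syndrome and decompressing to the minimum-weight sequence with that syndrome, can be illustrated with the (7,4) Hamming code. This is an assumed toy example, not the construction analyzed in the paper; it is lossless here exactly when the source block has weight at most one.

      import itertools
      import numpy as np

      H = np.array([[1, 0, 1, 0, 1, 0, 1],
                    [0, 1, 1, 0, 0, 1, 1],
                    [0, 0, 0, 1, 1, 1, 1]])   # parity-check matrix of the (7,4) Hamming code

      def compress(x):
          # 7 source bits -> 3 syndrome bits
          return H @ x % 2

      def decompress(s):
          # brute-force minimum-weight coset leader (fine for a 7-bit toy)
          for weight in range(8):
              for ones in itertools.combinations(range(7), weight):
                  x = np.zeros(7, dtype=int)
                  x[list(ones)] = 1
                  if np.array_equal(H @ x % 2, s):
                      return x

      x = np.array([0, 0, 0, 0, 1, 0, 0])      # sparse "error pattern" source block
      s = compress(x)
      print(s, np.array_equal(decompress(s), x))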

  18. Performance and Complexity of Tunable Sparse Network Coding with Gradual Growing Tuning Functions over Wireless Networks

    DEFF Research Database (Denmark)

    Garrido, Pablo; Sørensen, Chres Wiant; Roetter, Daniel Enrique Lucani

    2016-01-01

    Random Linear Network Coding (RLNC) has been shown to be a technique with several benefits, in particular when applied over wireless mesh networks, since it provides robustness against packet losses. On the other hand, Tunable Sparse Network Coding (TSNC) is a promising concept, which leverages a trade-off between computational complexity and goodput. An optimal density tuning function has not been found yet, due to the lack of a closed-form expression that links density, performance and computational cost. In addition, it would be difficult to implement, due to the feedback delay. In this work...

  19. Code on the safety of nuclear power plants: Quality assurance

    International Nuclear Information System (INIS)

    1988-01-01

    This revised Code provides the principles and objectives for the establishment and implementation of quality assurance programmes applied to both the overall and each of the constituent activities associated with a nuclear power plant project. The quality assurance principles enumerated in the present Code can be usefully applied to nuclear facilities other than nuclear power plants. The quality assurance programme encompasses: (1) the activities that are necessary to achieve the appropriate quality of the respective item or service; and (2) the activities that are necessary for verifying that the required quality is achieved and that objective evidence is produced to that effect. Quality assurance is an essential aspect of good management and the quality assurance programme is the main management tool for a disciplined approach to all activities affecting quality, including, where appropriate, verification that each task has been satisfactorily performed and that necessary corrective actions have been implemented. The principles and objectives provided by the Code are applicable by all those responsible for the nuclear power plant, by plant designers, suppliers, architect-engineers, plant constructors, plant operators and other organizations participating in activities affecting quality. The Code is a revision of the previous Code of Practice (1978) on the same subject of interest to regulatory bodies and experts in quality assurance for design, siting and operation of nuclear power plants. Contents: Definitions; 1. Introduction; 2. Quality assurance programmes; 3. Organization; 4. Document control; 5. Design control; 6. Procurement control; 7. Control of items; 8. Process control; 9. Inspection and test control; 10. Non-conformance control; 11. Corrective actions; 12, Records; 13. Audits

  20. SIEX: a correlated code for the prediction of liquid metal fast breeder reactor (LMFBR) fuel thermal performance

    International Nuclear Information System (INIS)

    Dutt, D.S.; Baker, R.B.

    1975-06-01

    The SIEX computer program is a steady state heat transfer code developed to provide thermal performance calculations for a mixed-oxide fuel element in a fast neutron environment. Fuel restructuring, fuel-cladding heat conduction and fission gas release are modeled to provide assessment of the temperature. Modeling emphasis has been placed on correlations to measurable quantities from EBR-II irradiation tests and the inclusion of these correlations in a physically based computational scheme. SIEX is completely modular in construction allowing the user options for material properties and correlated models. Required code input is limited to geometric and environmental parameters, with a ''consistent'' set of material properties and correlated models provided by the code. 24 references. (U.S.)

  1. Constructing LDPC Codes from Loop-Free Encoding Modules

    Science.gov (United States)

    Divsalar, Dariush; Dolinar, Samuel; Jones, Christopher; Thorpe, Jeremy; Andrews, Kenneth

    2009-01-01

    A method of constructing certain low-density parity-check (LDPC) codes by use of relatively simple loop-free coding modules has been developed. The subclasses of LDPC codes to which the method applies includes accumulate-repeat-accumulate (ARA) codes, accumulate-repeat-check-accumulate codes, and the codes described in Accumulate-Repeat-Accumulate-Accumulate Codes (NPO-41305), NASA Tech Briefs, Vol. 31, No. 9 (September 2007), page 90. All of the affected codes can be characterized as serial/parallel (hybrid) concatenations of such relatively simple modules as accumulators, repetition codes, differentiators, and punctured single-parity check codes. These are error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels. These codes can also be characterized as hybrid turbolike codes that have projected graph or protograph representations (for example see figure); these characteristics make it possible to design high-speed iterative decoders that utilize belief-propagation algorithms. The present method comprises two related submethods for constructing LDPC codes from simple loop-free modules with circulant permutations. The first submethod is an iterative encoding method based on the erasure-decoding algorithm. The computations required by this method are well organized because they involve a parity-check matrix having a block-circulant structure. The second submethod involves the use of block-circulant generator matrices. The encoders of this method are very similar to those of recursive convolutional codes. Some encoders according to this second submethod have been implemented in a small field-programmable gate array that operates at a speed of 100 megasymbols per second. By use of density evolution (a computational- simulation technique for analyzing performances of LDPC codes), it has been shown through some examples that as the block size goes to infinity, low iterative decoding thresholds close to
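
    A minimal sketch of the block-circulant structure mentioned above is given below; the base matrix and lifting size are invented for illustration and do not correspond to the ARA designs in the article. Each nonnegative protograph entry is replaced by a cyclically shifted identity matrix, and it is this block-circulant form of the parity-check matrix that keeps the encoder computations well organized.

      import numpy as np

      def circulant(shift, size):
          # size x size cyclic permutation matrix
          return np.roll(np.eye(size, dtype=int), shift, axis=1)

      def lift_protograph(base, size):
          # base[i][j] = circulant shift, or -1 for an all-zero block
          blocks = [[circulant(s, size) if s >= 0 else np.zeros((size, size), dtype=int)
                     for s in row] for row in base]
          return np.block(blocks)

      base = [[0, 2, 1, -1],
              [3, -1, 0, 1]]            # 2x4 base graph, assumed for illustration
      H = lift_protograph(base, size=5)
      print(H.shape)                    # (10, 20): the protograph lifted by a factor of 5
      print(sorted(set(H.sum(axis=0)))) # column weights inherited from the protograph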

  2. Application of the coupled code Athlet-Quabox/Cubbox for the extreme scenarios of the OECD/NRC BWR turbine trip benchmark and its performance on multi-processor computers

    International Nuclear Information System (INIS)

    Langenbuch, S.; Schmidt, K.D.; Velkov, K.

    2003-01-01

    The OECD/NRC BWR Turbine Trip (TT) Benchmark is investigated to perform code-to-code comparison of coupled codes, including a comparison to measured data which are available from turbine trip experiments at Peach Bottom 2. This Benchmark problem for a BWR over-pressure transient represents a challenging application of coupled codes which integrate 3-dimensional neutron kinetics into thermal-hydraulic system codes for best-estimate simulation of plant transients. This transient represents a typical application of coupled codes, which are usually run on powerful workstations using a single CPU. Nowadays, multi-processor systems are much more readily available: powerful workstations already provide 4 to 8 CPUs, and computer centres give access to multi-processor systems with CPU counts on the order of 16 up to several hundred. Therefore, the performance of the coupled code Athlet-Quabox/Cubbox on multi-processor systems is studied. Different cases of application lead to changing requirements for code efficiency, because the amount of computer time spent in different parts of the code varies. This paper presents the main results of the coupled code Athlet-Quabox/Cubbox for the extreme scenarios of the BWR TT Benchmark, together with evaluations of the code performance on multi-processor computers. (authors)

  3. Description and application of the AERIN Code at LLNL

    International Nuclear Information System (INIS)

    King, W.C.

    1986-01-01

    The AERIN code was written at the Lawrence Livermore National Laboratory in 1976 to compute the organ burdens and absorbed dose resulting from a chronic or acute inhalation of transuranic isotopes. The code was revised in 1982 to reflect the concepts of ICRP-30. This paper will describe the AERIN code and how it has been used at LLNL to study more than 80 cases of internal deposition and obtain estimates of internal dose. A comparison with the computed values of the committed organ dose is made with ICRP-30 values. The benefits of using the code are described. 3 refs., 3 figs., 6 tabs

  4. Performance analysis of linear codes under maximum-likelihood decoding: a tutorial

    National Research Council Canada - National Science Library

    Sason, Igal; Shamai, Shlomo

    2006-01-01

    ..., upper and lower bounds on the error probability of linear codes under ML decoding are surveyed and applied to codes and ensembles of codes on graphs. For upper bounds, we discuss various bounds where focus is put on Gallager bounding techniques and their relation to a variety of other reported bounds. Within the class of lower bounds, we ad...

  5. Modified BTC Algorithm for Audio Signal Coding

    Directory of Open Access Journals (Sweden)

    TOMIC, S.

    2016-11-01

    Full Text Available This paper describes a modification of a well-known image coding algorithm, named Block Truncation Coding (BTC), and its application in audio signal coding. The BTC algorithm was originally designed for black and white image coding. Since black and white images and audio signals have different statistical characteristics, the application of this image coding algorithm to audio signals presents a novelty and a challenge. Several implementation modifications are described in this paper, while the original idea of the algorithm is preserved. The main modifications are performed in the area of signal quantization, by designing more adequate quantizers for audio signal processing. The result is a novel audio coding algorithm, whose performance is presented and analyzed in this research. The performance analysis indicates that this novel algorithm can be successfully applied in audio signal coding.
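
    For reference, classical BTC encodes each block with a one-bit-per-sample bitmap plus two output levels chosen to preserve the block mean and variance. The sketch below shows the unmodified algorithm on a short 1-D block; the paper's modified quantizers for audio are not reproduced here.

      import numpy as np

      def btc_encode(block):
          m, s = block.mean(), block.std()
          bitmap = block >= m                          # 1 bit per sample
          q, n = bitmap.sum(), block.size
          # two output levels preserving the first two moments of the block
          low  = m - s * np.sqrt(q / (n - q)) if q not in (0, n) else m
          high = m + s * np.sqrt((n - q) / q) if q not in (0, n) else m
          return bitmap, low, high

      def btc_decode(bitmap, low, high):
          return np.where(bitmap, high, low)

      block = np.array([3.0, 7.0, 2.0, 8.0, 6.0, 1.0, 9.0, 4.0])
      bitmap, lo, hi = btc_encode(block)
      rec = btc_decode(bitmap, lo, hi)
      print(rec, abs(rec.mean() - block.mean()) < 1e-12)   # block mean is preserved exactly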

  6. Comparison of Analytical and Measured Performance Results on Network Coding in IEEE 802.11 Ad-Hoc Networks

    DEFF Research Database (Denmark)

    Zhao, Fang; Médard, Muriel; Hundebøll, Martin

    2012-01-01

    CATWOMAN that can run on standard WiFi hardware. We present an analytical model to evaluate the performance of COPE in simple networks, and our results show the excellent predictive quality of this model. By closely examining the performance in two simple topologies, we observe that the coding gain results...

  7. Transient and fuel performance analysis with VTT's coupled code system

    International Nuclear Information System (INIS)

    Daavittila, A.; Hamalainen, A.; Raty, H.

    2005-01-01

    VTT (Technical Research Centre of Finland) maintains and further develops a comprehensive safety analysis code system ranging from basic neutronics libraries to 3-dimensional transient analysis and fuel behaviour analysis codes. The code system is based on various types of couplings between the relevant physical phenomena. The main tools for analyses of reactor transients are presently the 3-dimensional reactor dynamics code HEXTRAN for cores with a hexagonal fuel assembly geometry and TRAB-3D for cores with a quadratic fuel assembly geometry. HEXTRAN has been applied to safety analyses of VVER-type reactors since the early 1990s. TRAB-3D is the latest addition to the code system, and has been applied to BWR and PWR analyses in recent years. In this paper it is shown that TRAB-3D accurately calculated the power distribution during the Olkiluoto-1 load rejection test. The results from the 3-dimensional analysis can be used as boundary conditions for more detailed fuel rod analysis. For this purpose, a general flow model, GENFLO, developed at VTT, has been coupled with the USNRC's FRAPTRAN fuel accident behaviour model. The example case for FRAPTRAN-GENFLO is an ATWS at a BWR plant. The basis for the analysis is an oscillation incident in the Olkiluoto-1 BWR during reactor startup on February 22, 1987. It is shown that the new coupled code FRAPTRAN/GENFLO is quite a promising tool that can handle such flow situations and give a detailed analysis of reactor transients.

  8. Multimodal Code-pairing and Switching of Visual-verbal Texts in Selected Nigerian Stand-up Comedy Performances

    Directory of Open Access Journals (Sweden)

    Mufutau Temitayo Lamidi

    2017-10-01

    Full Text Available This study examines multimodal pairing and switching of codes as features of visual-verbal texts and how they are used as strategies for evoking humour in Nigerian stand-up comedy performances, an area that has not attracted much scholarly attention. Data were obtained through purposive random sampling and analysed through content analysis. Six DVDs (Vols. 3, 7, 8 & 28 of Nite of a Thousand Laughs; Vols. 27 & 28 of AY LIVE Happiness Edition and 6 video clips (downloaded from the Internet all totalling 8 hours and 20 minutes of play were selected for the study. Incongruity, Layered Meaning and Visual Semiotics serve as theoretical framework. The study identifies different multimodal strategies such as code-pairing and integration in different forms of oral codes, gestures, costume, and symbols; intertextuality; incongruous translations/ deliberate misinterpretations; and mimicry, quotes and paralanguage used to elicit laughter. It suggests that these features are also useful in other speech-making events, and concludes that the integration of codes and modes of communication serves as an effective strategy in evoking humour and laughter in stand-up comedy

  9. A multiobjective approach to the genetic code adaptability problem.

    Science.gov (United States)

    de Oliveira, Lariza Laura; de Oliveira, Paulo S L; Tinós, Renato

    2015-02-19

    The organization of the canonical code has intrigued researchers since it was first described. If we consider all codes mapping the 64 codons into 20 amino acids and one stop codon, there are more than 1.51×10^84 possible genetic codes. The main question related to the organization of the genetic code is why exactly the canonical code was selected among this huge number of possible genetic codes. Many researchers argue that the organization of the canonical code is a product of natural selection and that the code's robustness against mutations supports this hypothesis. In order to investigate the natural selection hypothesis, some researchers employ optimization algorithms to identify regions of the genetic code space where the best codes, according to a given evaluation function, can be found (the engineering approach). The optimization process uses only one objective to evaluate the codes, generally based on the robustness for an amino acid property. Only one objective is also employed in the statistical approach for the comparison of the canonical code with random codes. We propose a multiobjective approach where two or more objectives are considered simultaneously to evaluate the genetic codes. In order to test our hypothesis that the multiobjective approach is useful for the analysis of the genetic code adaptability, we implemented a multiobjective optimization algorithm where two objectives are simultaneously optimized. Using as objectives the robustness against mutation with respect to the amino acid property polar requirement (objective 1) and the robustness with respect to hydropathy index or molecular volume (objective 2), we found solutions closer to the canonical genetic code in terms of robustness, when compared with the results using only one objective reported by other authors. Using more objectives, more optimal solutions are obtained and, as a consequence, more information can be used to investigate the adaptability of the genetic code. The multiobjective approach
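
    The single-objective building block of such studies is a robustness score, typically the mean squared change of an amino-acid property over all single-point codon mutations. The sketch below evaluates that kind of score for a deliberately tiny toy code: a made-up two-letter alphabet and invented property values, not the canonical 64-codon code or real polar-requirement data.

      from itertools import product

      def robustness(code, prop, alphabet):
          # mean squared property change over all single-point codon mutations;
          # stop codons (mapped to None) are skipped as source or target
          total, count = 0.0, 0
          for codon, aa in code.items():
              if aa is None:
                  continue
              for pos, letter in product(range(len(codon)), alphabet):
                  if letter == codon[pos]:
                      continue
                  mutant = code[codon[:pos] + letter + codon[pos + 1:]]
                  if mutant is None:
                      continue
                  total += (prop[aa] - prop[mutant]) ** 2
                  count += 1
          return total / count

      # toy "genetic code" on a binary alphabet with 2-letter codons
      toy_code = {"00": "A", "01": "A", "10": "B", "11": None}   # None marks a stop codon
      toy_prop = {"A": 7.0, "B": 4.9}                            # invented property values
      print(robustness(toy_code, toy_prop, alphabet="01"))

    A multiobjective search would evaluate several such scores (one per amino-acid property) for each candidate code and keep the non-dominated set rather than optimizing a single score.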

  10. STUDY OF PERFORMANCES OF ORGANIC SOLAR CELLS BY ...

    African Journals Online (AJOL)

    30 June 2011 ... results of the analysis of the performance of organic solar cells using what is called materials data mining. ... Keywords: organic solar cells, gap energy, efficiency, PCA. Author Correspondence ... this omission is unfortunate because the type of data available always influences the direction of the research.

  11. Conservative performance analysis of a PWR nuclear fuel rod using the FRAPCON code

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Fabio Branco Vaz de; Sabundjian, Gaiane, E-mail: fabio@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    In this paper, some preliminary results of the sensitivity and conservative analysis of a hypothetical pressurized water reactor fuel rod are presented, using the FRAPCON code as a basic preparation tool for the future transient analysis, which will be carried out with the FRAPTRAN code. Emphasis is given to the evaluation of the cladding behavior, since it is one of the critical containment barriers for the fission products generated during fuel irradiation. Sensitivity analyses were performed by varying the values of some parameters, mainly related to the thermal cycle conditions, and by taking an intermediate value between the realistic and conservative conditions for the linear heat generation rate, as given in the literature. Time lengths were taken from a typical nuclear power plant operational cycle, adjusted to obtain a chosen burnup. Curves of fuel and cladding temperatures, and also of their mechanical and oxidation behavior, as a function of reactor operation time, are presented for each of the nodes considered over the nuclear fuel rod. Analyzing the curves, it was possible to observe the influence of the thermal cycle on the fuel rod performance in this preliminary step towards the accident/transient analysis. (author)

  12. Stego Keys Performance on Feature Based Coding Method in Text Domain

    Directory of Open Access Journals (Sweden)

    Din Roshidi

    2017-01-01

    Full Text Available A critical factor in the embedding process of any text steganography method is the key used, known as the stego key. This factor influences the success of the embedding process by which a text steganography method hides a message from third parties or any adversary. One of the important aspects of the embedding process in a text steganography method is the fitness performance of the stego key. Three parameters of the fitness performance of the stego key have been identified: capacity ratio, embedded fitness ratio and saving space ratio. The better the capacity ratio, embedded fitness ratio and saving space ratio offered by a stego key, the more message content can be hidden. Therefore, the main objective of this paper is to analyze three feature-based coding methods, namely CALP, VERT and QUAD, as stego keys in text steganography in terms of their capacity ratio, embedded fitness ratio and saving space ratio. It is found that the CALP method gives good performance compared to the VERT and QUAD methods.

  13. High performance optical encryption based on computational ghost imaging with QR code and compressive sensing technique

    Science.gov (United States)

    Zhao, Shengmei; Wang, Le; Liang, Wenqiang; Cheng, Weiwen; Gong, Longyan

    2015-10-01

    In this paper, we propose a high performance optical encryption (OE) scheme based on computational ghost imaging (GI) with QR code and compressive sensing (CS) techniques, named the QR-CGI-OE scheme. N random phase screens, generated by Alice, serve as a secret key shared with her authorized user, Bob. The information is first encoded by Alice with a QR code, and the QR-coded image is then encrypted with the aid of a computational ghost imaging optical system. Here, the measurement results from the GI optical system's bucket detector are the encrypted information and are transmitted to Bob. With the key, Bob decrypts the encrypted information to obtain the QR-coded image with GI and CS techniques, and further recovers the information by QR decoding. The experimental and numerically simulated results show that authorized users can recover the original image completely, whereas eavesdroppers cannot acquire any information about the image even when the eavesdropping ratio (ER) is up to 60% at the given number of measurements. For the proposed scheme, the number of bits sent from Alice to Bob is reduced considerably and the robustness is enhanced significantly. Meanwhile, the number of measurements in the GI system is reduced and the quality of the reconstructed QR-coded image is improved.
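
    A stripped-down sketch of the measurement model is given below, with several assumptions: an 8×8 stand-in object, random intensity patterns standing in for the paper's random phase screens, and plain correlation-based reconstruction instead of QR coding and compressive-sensing recovery. The patterns play the role of the shared key, and the bucket values are the only data sent to Bob; the printed correlation indicates how well the estimate tracks the object.

      import numpy as np

      rng = np.random.default_rng(1)
      obj = (rng.random((8, 8)) > 0.5).astype(float)        # stand-in for a QR-coded image
      n_meas = 4000
      patterns = rng.random((n_meas, 8, 8))                 # shared random patterns (the "key")
      bucket = np.einsum("kij,ij->k", patterns, obj)        # bucket-detector values sent to Bob

      # traditional GI estimate: correlate bucket fluctuations with the known patterns
      recon = np.einsum("k,kij->ij", bucket - bucket.mean(), patterns) / n_meas
      print(np.corrcoef(recon.ravel(), obj.ravel())[0, 1])  # correlation with the original object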

  14. Cooperative Coding and Caching for Streaming Data in Multihop Wireless Networks

    Directory of Open Access Journals (Sweden)

    Dan Wang

    2010-01-01

    Full Text Available This paper studies the distributed caching managements for the current flourish of the streaming applications in multihop wireless networks. Many caching managements to date use randomized network coding approach, which provides an elegant solution for ubiquitous data accesses in such systems. However, the encoding, essentially a combination operation, makes the coded data difficult to be changed. In particular, to accommodate new data, the system may have to first decode all the combined data segments, remove some unimportant ones, and then reencode the data segments again. This procedure is clearly expensive for continuously evolving data storage. As such, we introduce a novel Cooperative Coding and Caching (C3 scheme, which allows decoding-free data removal through a triangle-like codeword organization. Its decoding performance is very close to the conventional network coding with only a sublinear overhead. Our scheme offers a promising solution to the caching management for streaming data.

  15. Confidence building on the total system performance assessment code, MASCOT-K for permanent disposal of HLW in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Y. S.; Kim, S. G.; Kang, C. H

    2002-12-01

    To perform Total System Performance Assessment(TSPA) of a potential HLW repository, it is necessary to develop the TSPA code. KAERI has developed the one-dimensional PSA code MASCOT-K since 1997 and verified special modules dedicated for the dissolution of spent nuclear fuel. In the second R and D phase, MASCOT-K is once again verified as a part of the confidence building for TSPA. The AMBER code based on the totally different mathematical approach, compartment theory is used together with MASCOT-K to assess the annual individual doses for given K- and Q- scenarios. Results indicate that both AMBER and MASCOT-K simulate the annual individual doses to a potential biosphere. And the MASCOT-K is more flexible to describe the natural barrier such as a fracture for sensitivity studies. In the third R and D phase, MASCOT-K will be actively used to check whether the proposed KAERI reference disposal concept is solid or not.

  16. Confidence building on the total system performance assessment code, MASCOT-K for permanent disposal of HLW in Korea

    International Nuclear Information System (INIS)

    Hwang, Y. S.; Kim, S. G.; Kang, C. H.

    2002-12-01

    To perform Total System Performance Assessment(TSPA) of a potential HLW repository, it is necessary to develop the TSPA code. KAERI has developed the one-dimensional PSA code MASCOT-K since 1997 and verified special modules dedicated for the dissolution of spent nuclear fuel. In the second R and D phase, MASCOT-K is once again verified as a part of the confidence building for TSPA. The AMBER code based on the totally different mathematical approach, compartment theory is used together with MASCOT-K to assess the annual individual doses for given K- and Q- scenarios. Results indicate that both AMBER and MASCOT-K simulate the annual individual doses to a potential biosphere. And the MASCOT-K is more flexible to describe the natural barrier such as a fracture for sensitivity studies. In the third R and D phase, MASCOT-K will be actively used to check whether the proposed KAERI reference disposal concept is solid or not

  17. Construction and performance research on variable-length codes for multirate OCDMA multimedia networks

    Science.gov (United States)

    Li, Chuan-qi; Yang, Meng-jie; Luo, De-jun; Lu, Ye; Kong, Yi-pu; Zhang, Dong-chuang

    2014-09-01

    A new kind of variable-length code with good correlation properties for multirate asynchronous optical code division multiple access (OCDMA) multimedia networks is proposed, called non-repetition interval (NRI) codes. The NRI codes can be constructed by structuring interval sets with no repetition, and the code length depends on the number of users and the code weight. According to the structural characteristics of NRI codes, the formula for the bit error rate (BER) is derived. Compared with other variable-length codes, the NRI codes have a lower BER. A multirate OCDMA multimedia simulation system is designed and built, in which longer codes are assigned to users who need lower speeds, while shorter codes are assigned to users who need higher speeds. Analysis of the eye diagrams shows that users with lower speeds have lower BERs, which is consistent with the actual demands of multimedia data transport.

  18. An evaluation of TRAC-PF1/MOD1 computer code performance during posttest simulations of Semiscale MOD-2C feedwater line break transients

    International Nuclear Information System (INIS)

    Hall, D.G.; Watkins, J.C.

    1987-01-01

    This report documents an evaluation of the TRAC-PF1/MOD1 reactor safety analysis computer code during computer simulations of feedwater line break transients. The experimental data base for the evaluation included the results of three bottom feedwater line break tests performed in the Semiscale Mod-2C test facility. The tests modeled 14.3% (S-FS-7), 50% (S-FS-11), and 100% (S-FS-6B) breaks. The test facility and the TRAC-PF1/MOD1 model used in the calculations are described. Evaluations of the accuracy of the calculations are presented in the form of comparisons of measured and calculated histories of selected parameters associated with the primary and secondary systems. In addition to evaluating the accuracy of the code calculations, the computational performance of the code during the simulations was assessed. A conclusion was reached that the code is capable of making feedwater line break transient calculations efficiently, but there is room for significant improvements in the simulations that were performed. Recommendations are made for follow-on investigations to determine how to improve future feedwater line break calculations and for code improvements to make the code easier to use

  19. Assessment of the prediction capability of the TRANSURANUS fuel performance code on the basis of power ramp tested LWR fuel rods

    International Nuclear Information System (INIS)

    Pastore, G.; Botazzoli, P.; Di Marcello, V.; Luzzi, L.

    2009-01-01

    The present work is aimed at assessing the prediction capability of the TRANSURANUS code for the performance analysis of LWR fuel rods under power ramp conditions. The analysis refers to all the power ramp tested fuel rods belonging to the Studsvik PWR Super-Ramp and BWR Inter-Ramp Irradiation Projects, and is focused on some integral quantities (i.e., burn-up, fission gas release, cladding creep-down and failure due to pellet-cladding interaction) through a systematic comparison between the code predictions and the experimental data. To this end, a suitable setup of the code is established on the basis of previous works. Besides, with reference to literature indications, a sensitivity study is carried out, which considers the 'ITU model' for fission gas burst release and modifications in the treatment of the fuel solid swelling and the cladding stress corrosion cracking. The performed analyses allow the identification of some issues which could be useful for the future development of the code. Keywords: Light Water Reactors, Fuel Rod Performance, Power Ramps, Fission Gas Burst Release, Fuel Swelling, Pellet Cladding Interaction, Stress Corrosion Cracking

  20. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.

  1. International Training Program: 3D S. Un. Cop - Scaling, Uncertainty and 3D Thermal-Hydraulics/Neutron-Kinetics Coupled Codes Seminar

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Bajs, T.; Reventos, F.

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as follow-up of the proposal to IAEA for the Permanent Training Course for System Code Users (D'Auria, 1998). Four seminars have been held at University of Pisa (2003, 2004), at The Pennsylvania State University (2004) and at University of Zagreb (2005). It was recognized that such courses represented both a source of continuing education for current code users and a mean for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2005 was successfully held with the participation of 19 persons coming from 9 countries and 14 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 15 scientists were involved in the organization of the seminar, presenting theoretical aspects of the proposed methodologies and

  2. Results from the Veterans Health Administration ICD-10-CM/PCS Coding Pilot Study.

    Science.gov (United States)

    Weems, Shelley; Heller, Pamela; Fenton, Susan H

    2015-01-01

    The Veterans Health Administration (VHA) of the US Department of Veterans Affairs has been preparing for the October 1, 2015, conversion to the International Classification of Diseases, Tenth Revision, Clinical Modification and Procedural Coding System (ICD-10-CM/PCS) for more than four years. The VHA's Office of Informatics and Analytics ICD-10 Program Management Office established an ICD-10 Learning Lab to explore expected operational challenges. This study was conducted to determine the effects of the classification system conversion on coding productivity. ICD codes are integral to VHA business processes and are used for purposes such as clinical studies, performance measurement, workload capture, cost determination, Veterans Equitable Resource Allocation (VERA) determination, morbidity and mortality classification, indexing of hospital records by disease and operations, data storage and retrieval, research purposes, and reimbursement. The data collection for this study occurred in multiple VHA sites across several months using standardized methods. It is commonly accepted that coding productivity will decrease with the implementation of ICD-10-CM/PCS. The findings of this study suggest that the decrease will be more significant for inpatient coding productivity (64.5 percent productivity decrease) than for ambulatory care coding productivity (6.7 percent productivity decrease). This study reveals the following important points regarding ICD-10-CM/PCS coding productivity: 1. Ambulatory care ICD-10-CM coding productivity is not expected to decrease as significantly as inpatient ICD-10-CM/PCS coding productivity. 2. Coder training and type of record (inpatient versus outpatient) affect coding productivity. 3. Inpatient coding productivity is decreased when a procedure requiring ICD-10-PCS coding is present. It is highly recommended that organizations perform their own analyses to determine the effects of ICD-10-CM/PCS implementation on coding productivity.

  3. International training program: 3D S.UN.COP - Scaling, uncertainty and 3D thermal-hydraulics/neutron-kinetics coupled codes seminar

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Bajs, T.; Reventos, F.

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of user effect. The 3D S.UN.COP 2005 (Scaling, Uncertainty and 3D COuPled code calculations) seminar has been organized by University of Pisa and University of Zagreb as follow-up of the proposal to IAEA for the Permanent Training Course for System Code Users (D'Auria, 1998). It was recognized that such a course represented both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The seminar-training was successfully held with the participation of 19 persons coming from 9 countries and 14 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 15 scientists were involved in the organization of the seminar, presenting theoretical aspects of the proposed methodologies and holding the training and the final examination. A certificate (LA Code User grade) was released

  4. Interface requirements to couple thermal hydraulics codes to severe accident codes: ICARE/CATHARE

    Energy Technology Data Exchange (ETDEWEB)

    Camous, F.; Jacq, F.; Chatelard, P. [IPSN/DRS/SEMAR CE-Cadarache, St Paul Lez Durance (France)] [and others

    1997-07-01

    In order to describe with the same code the whole sequence of severe LWR accidents, up to the vessel failure, the Institute for Protection and Nuclear Safety (IPSN) has performed a coupling of the severe accident code ICARE2 to the thermal-hydraulics code CATHARE2. The resulting code, ICARE/CATHARE, is designed to be as pertinent as possible in all the phases of the accident. This paper is mainly devoted to the description of the ICARE2-CATHARE2 coupling.

  5. Monolayer-Mediated Growth of Organic Semiconductor Films with Improved Device Performance.

    Science.gov (United States)

    Huang, Lizhen; Hu, Xiaorong; Chi, Lifeng

    2015-09-15

    Increased interest in wearable and smart electronics is driving numerous research works on organic electronics. The control of film growth and patterning is of great importance when targeting high-performance organic semiconductor devices. In this Feature Article, we summarize our recent work focusing on the growth, crystallization, and device operation of organic semiconductors intermediated by ultrathin organic films (in most cases, only a monolayer). The site-selective growth, modified crystallization and morphology, and improved device performance of organic semiconductor films are demonstrated with the help of the inducing layers, including patterned and uniform Langmuir-Blodgett monolayers, crystalline ultrathin organic films, and self-assembled polymer brush films. The introduction of the inducing layers could dramatically change the diffusion of the organic semiconductors on the surface and the interactions between the active layer and the inducing layer, leading to improved aggregation/crystallization behavior and device performance.

  6. Blahut-Arimoto algorithm and code design for action-dependent source coding problems

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Simeone, Osvaldo; Popovski, Petar

    2013-01-01

    The source coding problem with action-dependent side information at the decoder has recently been introduced to model data acquisition in resource-constrained systems. In this paper, an efficient Blahut-Arimoto-type algorithm for the numerical computation of the rate-distortion-cost function for this problem is proposed. Moreover, a simplified two-stage code structure based on multiplexing is put forth, whereby the first stage encodes the actions and the second stage is composed of an array of classical Wyner-Ziv codes, one for each action. Leveraging this structure, specific coding/decoding strategies are designed based on LDGM codes and message passing. Through numerical examples, the proposed code design is shown to achieve performance close to the rate-distortion-cost function.
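
    As a point of reference for the Blahut-Arimoto-type computation mentioned above, the sketch below implements the classical Blahut-Arimoto iteration for the ordinary rate-distortion function, without the action and side-information extensions of the paper; the binary-source example and the slope values are illustrative assumptions.

        import numpy as np

        def blahut_arimoto_rd(p_x, dist, s, n_iter=200, tol=1e-9):
            """Classical Blahut-Arimoto iteration for the rate-distortion function.
            p_x  : source distribution over the source alphabet (1-D array)
            dist : distortion matrix d[x, xhat]
            s    : Lagrange multiplier (slope of the R(D) curve), s > 0
            Returns one (rate in bits, distortion) point on the R(D) curve."""
            q = np.full(dist.shape[1], 1.0 / dist.shape[1])     # reproduction marginal q(xhat)
            for _ in range(n_iter):
                w = q[None, :] * np.exp(-s * dist)              # unnormalized test channel
                p_cond = w / w.sum(axis=1, keepdims=True)       # p(xhat | x)
                q_new = p_x @ p_cond                            # updated marginal
                if np.max(np.abs(q_new - q)) < tol:
                    q = q_new
                    break
                q = q_new
            joint = p_x[:, None] * p_cond
            q_marg = joint.sum(axis=0)
            distortion = np.sum(joint * dist)
            rate = np.sum(joint * np.log2(p_cond / q_marg[None, :]))
            return rate, distortion

        # Binary source with Hamming distortion; sweeping the slope traces the R(D) curve.
        p_x = np.array([0.5, 0.5])
        dist = np.array([[0.0, 1.0], [1.0, 0.0]])
        for s in (1.0, 2.0, 4.0):
            print(blahut_arimoto_rd(p_x, dist, s))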

  7. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    Science.gov (United States)

    Lee, L.-N.

    1977-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively modest coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal-byte-error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.

  8. The Analysis and the Performance Simulation of the Capacity of Bit-interleaved Coded Modulation System

    Directory of Open Access Journals (Sweden)

    Hongwei ZHAO

    2014-09-01

    Full Text Available In this paper, the capacity of the BICM system over AWGN channels is first analyzed; the curves of BICM capacity versus SNR are also obtained by Monte-Carlo simulations and compared with the curves of the CM capacity. Based on the analysis results, we simulate the error performances of the BICM system with LDPC codes. Simulation results show that the capacity of the BICM system with LDPC codes is enormously influenced by the mapping methods. Given a certain modulation method, the BICM system can obtain about 2-3 dB gain with Gray mapping compared with Non-Gray mapping. Meanwhile, the simulation results also demonstrate the correctness of the theoretical analysis.
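
    The Monte-Carlo capacity estimation referred to above can be illustrated with the following Python sketch, which estimates the coded-modulation (CM) capacity of a uniformly used constellation over an AWGN channel; a BICM capacity curve would repeat the same computation per bit level of the mapping. The QPSK constellation, SNR points and sample size are illustrative assumptions, not the simulation setup of the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def cm_capacity_mc(constellation, snr_db, n_samples=200_000):
            """Monte-Carlo estimate of the coded-modulation (CM) capacity, in bits per
            channel use, of a uniformly used constellation over a complex AWGN channel."""
            x_set = constellation / np.sqrt(np.mean(np.abs(constellation) ** 2))  # unit symbol energy
            m = len(x_set)
            n0 = 10.0 ** (-snr_db / 10.0)                       # noise variance for Es = 1
            x = x_set[rng.integers(0, m, n_samples)]
            noise = np.sqrt(n0 / 2) * (rng.standard_normal(n_samples)
                                       + 1j * rng.standard_normal(n_samples))
            y = x + noise
            d_all = np.abs(y[:, None] - x_set[None, :]) ** 2    # |y - x'|^2 for every candidate symbol
            d_tx = np.abs(y - x) ** 2                           # |y - x|^2 for the transmitted symbol
            log_sum = np.log2(np.sum(np.exp(-(d_all - d_tx[:, None]) / n0), axis=1))
            return np.log2(m) - np.mean(log_sum)

        # Gray-mapped QPSK as an example constellation.
        qpsk = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j])
        for snr in (0, 5, 10):
            print(snr, "dB:", round(cm_capacity_mc(qpsk, snr), 3))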

  9. On the Organizational Dynamics of the Genetic Code

    KAUST Repository

    Zhang, Zhang

    2011-06-07

    The organization of the canonical genetic code needs to be thoroughly illuminated. Here we reorder the four nucleotides—adenine, thymine, guanine and cytosine—according to their emergence in evolution, and apply the organizational rules to devising an algebraic representation for the canonical genetic code. Under a framework of the devised code, we quantify codon and amino acid usages from a large collection of 917 prokaryotic genome sequences, and associate the usages with its intrinsic structure and classification schemes as well as amino acid physicochemical properties. Our results show that the algebraic representation of the code is structurally equivalent to a content-centric organization of the code and that codon and amino acid usages under different classification schemes were correlated closely with GC content, implying a set of rules governing composition dynamics across a wide variety of prokaryotic genome sequences. These results also indicate that codons and amino acids are not randomly allocated in the code, where the six-fold degenerate codons and their amino acids have important balancing roles for error minimization. Therefore, the content-centric code is of great usefulness in deciphering its hitherto unknown regularities as well as the dynamics of nucleotide, codon, and amino acid compositions.
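
    The codon-usage and GC-content statistics discussed above reduce, on the counting side, to simple bookkeeping over coding sequences. The sketch below shows that bookkeeping on a hypothetical toy sequence; it is not the 917-genome analysis pipeline of the study.

        from collections import Counter

        def codon_usage_and_gc(cds):
            """Codon usage and GC content for one in-frame protein-coding sequence
            over the alphabet A, C, G, T (toy bookkeeping, not the genome pipeline)."""
            cds = cds.upper()
            codons = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
            usage = Counter(codons)
            gc = (cds.count("G") + cds.count("C")) / len(cds)
            return usage, gc

        # Hypothetical miniature CDS, for illustration only.
        usage, gc = codon_usage_and_gc("ATGGCGGCCAAATAA")
        print(usage.most_common(), round(gc, 3))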

  10. Development of realistic thermal hydraulic system analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, B. D; Kim, K. D. [and others

    2002-05-01

    A realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphic User Interface) feature was developed to enhance user convenience. To develop the coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial and regulatory organizations and universities. MARS has been widely used for the safety research of existing PWRs, advanced PWRs, CANDU and research reactors, the pre-test analysis of TH experiments, and other applications.

  11. Development of realistic thermal hydraulic system analysis code

    International Nuclear Information System (INIS)

    Lee, Won Jae; Chung, B. D; Kim, K. D.

    2002-05-01

    A realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphic User Interface) feature was developed to enhance user convenience. To develop the coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial and regulatory organizations and universities. MARS has been widely used for the safety research of existing PWRs, advanced PWRs, CANDU and research reactors, the pre-test analysis of TH experiments, and other applications

  12. On the Organizational Dynamics of the Genetic Code

    KAUST Repository

    Zhang, Zhang; Yu, Jun

    2011-01-01

    The organization of the canonical genetic code needs to be thoroughly illuminated. Here we reorder the four nucleotides—adenine, thymine, guanine and cytosine—according to their emergence in evolution, and apply the organizational rules to devising an algebraic representation for the canonical genetic code. Under a framework of the devised code, we quantify codon and amino acid usages from a large collection of 917 prokaryotic genome sequences, and associate the usages with its intrinsic structure and classification schemes as well as amino acid physicochemical properties. Our results show that the algebraic representation of the code is structurally equivalent to a content-centric organization of the code and that codon and amino acid usages under different classification schemes were correlated closely with GC content, implying a set of rules governing composition dynamics across a wide variety of prokaryotic genome sequences. These results also indicate that codons and amino acids are not randomly allocated in the code, where the six-fold degenerate codons and their amino acids have important balancing roles for error minimization. Therefore, the content-centric code is of great usefulness in deciphering its hitherto unknown regularities as well as the dynamics of nucleotide, codon, and amino acid compositions.

  13. Inter-comparison of Computer Codes for TRISO-based Fuel Micro-Modeling and Performance Assessment

    International Nuclear Information System (INIS)

    Boer, Brian; Keun Jo, Chang; Wu, Wen; Ougouag, Abderrafi M.; McEachren, Donald; Venneri, Francesco

    2010-01-01

    The Next Generation Nuclear Plant (NGNP), the Deep Burn Pebble Bed Reactor (DB-PBR) and the Deep Burn Prismatic Block Reactor (DB-PMR) are all based on fuels that use TRISO particles as their fundamental constituent. The TRISO particle properties include very high durability in radiation environments, hence the designs' reliance on the TRISO to form the principal barrier to radioactive materials release. This durability forms the basis for the selection of this fuel type for applications such as Deep Burn (DB), which require exposures up to four times those expected for light water reactors. It follows that the study and prediction of the durability of TRISO particles must be carried out as part of the safety and overall performance characterization of all the designs mentioned above. Such evaluations have been carried out independently by the performers of the DB project using independently developed codes. These codes, PASTA, PISA and COPA, incorporate models for stress analysis on the various layers of the TRISO particle (and of the intervening matrix material for some of them), models for fission product release, migration and accumulation within the SiC layer of the TRISO particle, just next to the layer, models for free oxygen and CO formation and migration to the same location, models for the temperature field within the various layers of the TRISO particle, and models for the prediction of failure rates. All these models may be either internal to the code or external. This large number of models, together with the possibility of different constitutive data and model formulations and of a variety of solution techniques, makes it highly unlikely that the codes would give identical results when modeling identical situations. The purpose of this paper is to present the results of an inter-comparison between the codes and to identify areas of agreement and areas that need reconciliation. The inter-comparison has been carried out by the cooperating

  14. Evaluating the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks

    International Nuclear Information System (INIS)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solís Sánches, L. O.; Miranda, R. Castañeda; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2013-01-01

    In this work the performance of two neutron spectrum unfolding codes, one based on iterative procedures and one on artificial neural networks, is evaluated. The first code, based on traditional iterative procedures and called Neutron spectrometry and dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and the responses of 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum through a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron spectrometry and dosimetry with artificial neural networks (NSDann), was designed using neural network technology. The artificial intelligence approach of the neural network does not solve mathematical equations; by using the knowledge stored in the synaptic weights of a properly trained neural network, the code is capable of unfolding the neutron spectrum and simultaneously calculating 15 dosimetric quantities, requiring as input only the count rates measured with a Bonner sphere system. Similarities of the NSDUAZ and NSDann codes are that they follow the same easy and intuitive user philosophy and were designed with a graphical interface under the LabVIEW programming environment. Both codes unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. Differences between these codes are the following: the NSDUAZ code was designed using classical iterative approaches and needs an initial guess spectrum in order to initiate the iterative procedure; in NSDUAZ, a programming routine was designed to calculate the responses of 7 IAEA survey instruments using fluence-to-dose conversion coefficients. The NSDann code uses artificial neural networks to solve the ill-conditioned equation system of the neutron spectrometry problem through the synaptic weights of a properly trained neural network. Contrary to iterative procedures, in neural
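
    To make the iterative unfolding idea concrete, the following Python sketch applies a generic multiplicative (MLEM-style) update to a toy response matrix. It stands in for iterative unfolding algorithms such as SPUNIT but is not the NSDUAZ implementation, and the 3-sphere, 4-bin response matrix is an assumption for illustration only.

        import numpy as np

        def iterative_unfold(response, counts, guess, n_iter=500):
            """Generic multiplicative (MLEM-style) iterative unfolding update.
            response : R[i, j], reading of sphere i per unit fluence in energy bin j
            counts   : measured count rates C[i] from the Bonner sphere system
            guess    : initial guess spectrum phi[j] (e.g. taken from a compendium)"""
            phi = np.asarray(guess, dtype=float).copy()
            norm = response.sum(axis=0)                          # sum_i R[i, j]
            for _ in range(n_iter):
                predicted = response @ phi                       # M_i = sum_j R_ij phi_j
                ratio = counts / np.maximum(predicted, 1e-30)
                phi *= (response.T @ ratio) / np.maximum(norm, 1e-30)
            return phi

        # Toy example with a hypothetical 3-sphere, 4-bin response matrix.
        R = np.array([[0.9, 0.4, 0.1, 0.0],
                      [0.3, 0.8, 0.5, 0.2],
                      [0.0, 0.2, 0.7, 0.9]])
        true_phi = np.array([1.0, 2.0, 0.5, 0.2])
        C = R @ true_phi
        print(iterative_unfold(R, C, guess=np.ones(4)).round(3))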

  15. Evaluating the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks

    Science.gov (United States)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solís Sánches, L. O.; Miranda, R. Castañeda; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2013-07-01

    In this work the performance of two neutron spectrum unfolding codes, one based on iterative procedures and one on artificial neural networks, is evaluated. The first code, based on traditional iterative procedures and called Neutron spectrometry and dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and the responses of 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum through a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron spectrometry and dosimetry with artificial neural networks (NSDann), was designed using neural network technology. The artificial intelligence approach of the neural network does not solve mathematical equations; by using the knowledge stored in the synaptic weights of a properly trained neural network, the code is capable of unfolding the neutron spectrum and simultaneously calculating 15 dosimetric quantities, requiring as input only the count rates measured with a Bonner sphere system. Similarities of the NSDUAZ and NSDann codes are that they follow the same easy and intuitive user philosophy and were designed with a graphical interface under the LabVIEW programming environment. Both codes unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. Differences between these codes are the following: the NSDUAZ code was designed using classical iterative approaches and needs an initial guess spectrum in order to initiate the iterative procedure; in NSDUAZ, a programming routine was designed to calculate the responses of 7 IAEA survey instruments using fluence-to-dose conversion coefficients. The NSDann code uses artificial neural networks to solve the ill-conditioned equation system of the neutron spectrometry problem through the synaptic weights of a properly trained neural network. Contrary to iterative procedures, in neural

  16. Evaluating the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz-Rodriguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solis Sanches, L. O.; Miranda, R. Castaneda; Cervantes Viramontes, J. M. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica. Av. Ramon Lopez Velarde 801. Col. Centro Zacatecas, Zac (Mexico); Vega-Carrillo, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica. Av. Ramon Lopez Velarde 801. Col. Centro Zacatecas, Zac., Mexico. and Unidad Academica de Estudios Nucleares. C. Cip (Mexico)

    2013-07-03

    In this work the performance of two neutron spectrum unfolding codes, one based on iterative procedures and one on artificial neural networks, is evaluated. The first code, based on traditional iterative procedures and called Neutron spectrometry and dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and the responses of 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum through a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron spectrometry and dosimetry with artificial neural networks (NSDann), was designed using neural network technology. The artificial intelligence approach of the neural network does not solve mathematical equations; by using the knowledge stored in the synaptic weights of a properly trained neural network, the code is capable of unfolding the neutron spectrum and simultaneously calculating 15 dosimetric quantities, requiring as input only the count rates measured with a Bonner sphere system. Similarities of the NSDUAZ and NSDann codes are that they follow the same easy and intuitive user philosophy and were designed with a graphical interface under the LabVIEW programming environment. Both codes unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. Differences between these codes are the following: the NSDUAZ code was designed using classical iterative approaches and needs an initial guess spectrum in order to initiate the iterative procedure; in NSDUAZ, a programming routine was designed to calculate the responses of 7 IAEA survey instruments using fluence-to-dose conversion coefficients. The NSDann code uses artificial neural networks to solve the ill-conditioned equation system of the neutron spectrometry problem through the synaptic weights of a properly trained neural network. Contrary to iterative procedures, in

  17. A new two dimensional spectral/spatial multi-diagonal code for noncoherent optical code division multiple access (OCDMA) systems

    Science.gov (United States)

    Kadhim, Rasim Azeez; Fadhil, Hilal Adnan; Aljunid, S. A.; Razalli, Mohamad Shahrazel

    2014-10-01

    A new two-dimensional code family, namely two-dimensional multi-diagonal (2D-MD) codes, is proposed for spectral/spatial non-coherent OCDMA systems based on the one-dimensional MD code. Since the MD code has the property of zero cross correlation, the proposed 2D-MD code also has this property, so that the multiple-access interference (MAI) is fully eliminated and the phase-induced intensity noise (PIIN) is suppressed with the proposed code. Code performance is analyzed in terms of bit error rate (BER) while considering the effect of shot noise, PIIN, and thermal noise. The performance of the proposed code is compared with the related MD, modified quadratic congruence (MQC), two-dimensional perfect difference (2D-PD) and two-dimensional diluted perfect difference (2D-DPD) codes. The analytical and the simulation results reveal that the proposed 2D-MD code outperforms the other codes. Moreover, a large number of simultaneous users can be accommodated at low BER and high data rate.
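
    The zero cross correlation property mentioned above can be checked directly on the code matrix. The sketch below builds a toy unipolar code family in which every user occupies disjoint, diagonally placed chip positions, mimicking the multi-diagonal idea, and verifies that all off-diagonal cross correlations vanish; it is an illustrative construction, not the 2D-MD code of the paper.

        import numpy as np

        def diagonal_zcc_codes(n_users, weight):
            """Toy family of unipolar codes with zero cross correlation: every user gets
            'weight' chips in positions no other user occupies, placed diagonally."""
            length = n_users * weight
            codes = np.zeros((n_users, length), dtype=int)
            for u in range(n_users):
                codes[u, u::n_users] = 1          # one chip in each block of n_users chips
            return codes

        codes = diagonal_zcc_codes(n_users=4, weight=3)
        cross = codes @ codes.T                   # in-phase correlation between all user pairs
        print(codes)
        print(cross)                              # diagonal = code weight, off-diagonal = 0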

  18. Distributed Video Coding for Multiview and Video-plus-depth Coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo

    The interest in Distributed Video Coding (DVC) systems has grown considerably in the academic world in recent years. With DVC the correlation between frames is exploited at the decoder (joint decoding). The encoder codes the frame independently, performing relatively simple operations. Therefore, with DVC the complexity is shifted from encoder to decoder, making the coding architecture a viable solution for encoders with limited resources. DVC may empower new applications which can benefit from this reversed coding architecture. Multiview Distributed Video Coding (M-DVC) is the application of DVC to multiview video. A key element is the Side Information (SI) generation, i.e., the estimation of the to-be-decoded frame. Another key element is the Residual estimation, indicating the reliability of the SI, which is used to calculate the parameters of the correlation noise model between the SI and the original frame. In this thesis new methods for Inter-camera SI generation are analyzed in the Stereo

  19. Knowledge management performance methodology regarding manufacturing organizations

    Science.gov (United States)

    Istrate, C.; Herghiligiu, I. V.

    2016-08-01

    The current business situation is extremely complicated. Businesses must adapt to change in order (a) to survive in increasingly dynamic markets and (b) to meet customers’ new requests for complex, customized and innovative products. In modern manufacturing organizations a substantial improvement in the management of knowledge can be seen. This occurs because organizations have realized that knowledge, and an efficient management of knowledge, generates the highest value. It could even be said that manufacturing organizations were and remain the biggest beneficiaries of KM science. Knowledge management performance (KMP) evaluation in manufacturing organizations can be considered extremely important because, without measuring it, organizations are unable to properly assess (a) what goals, targets and activities must have continuity, (b) what must be improved and (c) what must be completed. Therefore a proper KM will generate multiple competitive advantages for organizations. This paper presents a methodological framework regarding the importance of KMP in manufacturing organizations. The framework was developed using bibliographical research and a panel of specialists as research methods. The purpose of this paper is to improve the KMP evaluation process and to provide a viable tool for manufacturing organization managers.

  20. How Does Intrinsic and Extrinsic Motivation Drive Performance Culture in Organizations?

    Science.gov (United States)

    Turner, Arielle

    2017-01-01

    The performance culture of an organization is impacted by the motivation of an organization's employee. Determining whether or not an employee's motivation is intrinsic or extrinsic is helpful for organizations to see what is more of a drive in their performance. The following article reviews literature on the subject of employee motivation to…

  1. Maize cultivar performance under diverse organic production systems

    Science.gov (United States)

    Maize cultivar performance can vary widely among different production systems. The need for high-performing hybrids for organic systems with wide adaptation to various macroenvironments is becoming increasingly important. The goal of this study was to characterize inbred lines developed by distinc...

  2. Low Complexity List Decoding for Polar Codes with Multiple CRC Codes

    Directory of Open Access Journals (Sweden)

    Jong-Hwan Kim

    2017-04-01

    Full Text Available Polar codes are the first family of error correcting codes that provably achieve the capacity of symmetric binary-input discrete memoryless channels with low complexity. Since the development of polar codes, there have been many studies to improve their finite-length performance. As a result, polar codes are now adopted as a channel code for the control channel of 5G new radio of the 3rd generation partnership project. However, decoder implementation remains one of the major practical problems, and low-complexity decoding has been studied. This paper addresses low-complexity successive cancellation list decoding for polar codes utilizing multiple cyclic redundancy check (CRC) codes. While some research uses multiple CRC codes to reduce memory and time complexity, we consider the operational complexity of decoding and reduce it by optimizing CRC positions in combination with a modified decoding operation. As a result, the proposed scheme obtains not only complexity reduction from early stopping of decoding, but also additional reduction from the reduced number of decoding paths.
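
    The role of a CRC in list decoding is to pick, among the surviving paths, the one that passes the check. The sketch below shows that selection step in isolation, using a short 3-bit CRC polynomial and a hand-made candidate list in place of a real successive cancellation list decoder; the polynomial, message and candidate paths are assumptions for illustration.

        def crc_remainder(bits, poly=(1, 0, 1, 1)):
            """Bitwise CRC remainder for a short generator polynomial (here x^3 + x + 1,
            written MSB first)."""
            deg = len(poly) - 1
            reg = list(bits) + [0] * deg              # message followed by zero padding
            for i in range(len(bits)):
                if reg[i]:
                    for j, p in enumerate(poly):
                        reg[i + j] ^= p
            return reg[-deg:]

        def crc_select(candidates, poly=(1, 0, 1, 1)):
            """CRC-aided selection step of a list decoder: return the first surviving
            path whose appended CRC checks out, else fall back to the most likely path."""
            deg = len(poly) - 1
            for path in candidates:
                if crc_remainder(path, poly) == [0] * deg:
                    return path
            return candidates[0]

        # Hypothetical list of surviving paths (info bits + 3 CRC bits) from an SCL decoder.
        msg = [1, 1, 0, 1, 0]
        codeword = msg + crc_remainder(msg)           # append the CRC to the message
        candidates = [[1, 0, 0, 1, 0, 1, 1, 0], codeword, [0, 1, 1, 0, 1, 0, 0, 0]]
        print(crc_select(candidates) == codeword)     # True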

  3. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine; LaChance, Jeffrey L.; Horne, Douglas B.

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis into their effectiveness. A Hazardous and Operability study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from HAZOP defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  4. An Analysis of the Relationship between IFAC Code of Ethics and CPI

    Directory of Open Access Journals (Sweden)

    Ayşe İrem Keskin

    2015-11-01

    Full Text Available Abstract: A code of ethics has become a significant concept in the business world, which is why professional organizations have developed their own codes of ethics over time. In this study, the compatibility classification of the accounting code of ethics of the IFAC (The International Federation of Accountants) is first carried out on the basis of action plans assessing the levels of usage by the 175 IFAC national accounting organizations. The classification shows that 60.6% of the member organizations apply the IFAC code in general, while the remaining 39.4% do not apply the code at all. With this classification, the hypothesis that “The national accounting organizations in highly corrupt countries would be less likely to adopt the IFAC ethics code than those in very clean countries” is tested using the “Corruption Perception Index (CPI)” data. The findings are found to support this hypothesis.

  5. Graphical User Interface for the NASA FLOPS Aircraft Performance and Sizing Code

    Science.gov (United States)

    Lavelle, Thomas M.; Curlett, Brian P.

    1994-01-01

    XFLOPS is an X-Windows/Motif graphical user interface for the aircraft performance and sizing code FLOPS. This new interface simplifies entering data and analyzing results, thereby reducing analysis time and errors. Data entry is simpler because input windows are used for each of the FLOPS namelists. These windows contain fields to input the variable's values along with help information describing the variable's function. Analyzing results is simpler because output data are displayed rapidly. This is accomplished in two ways. First, because the output file has been indexed, users can view particular sections with the click of a mouse button. Second, because menu picks have been created, users can plot engine and aircraft performance data. In addition, XFLOPS has a built-in help system and complete on-line documentation for FLOPS.

  6. Building energy performance analysis by an in-house developed dynamic simulation code: An investigation for different case studies

    International Nuclear Information System (INIS)

    Buonomano, Annamaria; Palombo, Adolfo

    2014-01-01

    Highlights: • A new dynamic simulation code for building energy performance analysis is presented. • The thermal behavior of each building element is modeled by a thermal RC network. • The physical models implemented in the code are illustrated. • The code was validated by the BESTEST standard procedure. • We investigate residential buildings, offices and stores in different climates. - Abstract: A novel dynamic simulation model for the building envelope energy performance analysis is presented in this paper. This tool supports the investigation of many new building technologies for increasing system energy efficiency and can be used for scientific research purposes. In addition to the yearly heating and cooling load and energy demand, the obtained output is the dynamic temperature profile of indoor air and surfaces and the dynamic profile of the thermal fluxes through the building elements. The presented simulation model is also validated through the BESTEST standard procedure. Several new case studies are developed for assessing, through the presented code, the energy performance of three different building envelopes under several different weather conditions. In particular, dwelling and commercial buildings are analysed. Light and heavyweight envelopes as well as different glazed surface areas have been used for every case study. With the achieved results, interesting design and operating guidelines can be obtained. Such data have also been compared with those calculated by TRNSYS and EnergyPlus. The detected deviations of the obtained results from those of such standard tools are almost always lower than 10%
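
    A thermal RC network of the kind highlighted above reduces, in its simplest form, to a lumped energy balance on a zone node. The sketch below integrates a one-resistance, one-capacitance (1R1C) model with explicit Euler; the resistance, capacitance and gain values are assumed for illustration and the model is far coarser than the in-house code described in the paper.

        import numpy as np

        def simulate_1r1c(t_out, q_gain, r_env=0.005, c_zone=5.0e6, t0=20.0, dt=3600.0):
            """One-resistance, one-capacitance (1R1C) zone model integrated with explicit
            Euler. r_env [K/W], c_zone [J/K] and the gains are assumed illustrative values."""
            t_in = np.empty(len(t_out))
            t = t0
            for k, (to, q) in enumerate(zip(t_out, q_gain)):
                dT_dt = ((to - t) / r_env + q) / c_zone   # energy balance on the zone node
                t += dT_dt * dt
                t_in[k] = t
            return t_in

        # One day with a sinusoidal outdoor temperature and constant 500 W internal gains.
        hours = np.arange(24)
        t_out = 10.0 + 5.0 * np.sin(2.0 * np.pi * hours / 24.0)
        print(simulate_1r1c(t_out, np.full(24, 500.0)).round(2))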

  7. Compilation of the nuclear codes available in CTA

    International Nuclear Information System (INIS)

    D'Oliveira, A.B.; Moura Neto, C. de; Amorim, E.S. do; Ferreira, W.J.

    1979-07-01

    The present work is a compilation of some nuclear codes available in the Divisao de Estudos Avancados of the Instituto de Atividades Espaciais (EAV/IAE/CTA). The codes are organized according to the classification given by the Argonne National Laboratory. For each code, the author, institution of origin, abstract, programming language and existing bibliography are given. (Author) [pt

  8. Development and Application of Subchannel Analysis Code Technology for Advanced Reactor Systems

    International Nuclear Information System (INIS)

    Hwang, Dae Hyun; Seo, K. W.

    2006-01-01

    A study has been performed for the development and assessment of a subchannel analysis code which is intended for the analysis of advanced reactor conditions with various configurations of the reactor core and several kinds of reactor coolant fluids. The subchannel analysis code was developed on the basis of the MATRA code which is being developed at KAERI. A GUI (Graphic User Interface) system was adopted in order to reduce input errors and to enhance user convenience. The property calculation modules of the subchannel code were complemented to include various fluids such as heavy liquid metal, gas, refrigerant, and supercritical water. The subchannel code was applied to calculate the local thermal hydraulic conditions inside the non-square test bundles which were employed for the analysis of CHF. The applicability of the subchannel code was evaluated for a high temperature gas cooled reactor condition and supercritical pressure conditions with water and Freon. A subchannel analysis has been conducted for the European ADS (Accelerator-Driven Subcritical System) with Pb-Bi coolant through the international cooperation work between KAERI and FZK, Germany. In addition, the prediction capability of the subchannel code was evaluated for subchannel void distribution data by participating in an international code benchmark program organized by OECD/NRC

  9. Development and Application of Subchannel Analysis Code Technology for Advanced Reactor Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae Hyun; Seo, K. W

    2006-01-15

    A study has been performed for the development and assessment of a subchannel analysis code which is intended for the analysis of advanced reactor conditions with various configurations of the reactor core and several kinds of reactor coolant fluids. The subchannel analysis code was developed on the basis of the MATRA code which is being developed at KAERI. A GUI (Graphic User Interface) system was adopted in order to reduce input errors and to enhance user convenience. The property calculation modules of the subchannel code were complemented to include various fluids such as heavy liquid metal, gas, refrigerant, and supercritical water. The subchannel code was applied to calculate the local thermal hydraulic conditions inside the non-square test bundles which were employed for the analysis of CHF. The applicability of the subchannel code was evaluated for a high temperature gas cooled reactor condition and supercritical pressure conditions with water and Freon. A subchannel analysis has been conducted for the European ADS (Accelerator-Driven Subcritical System) with Pb-Bi coolant through the international cooperation work between KAERI and FZK, Germany. In addition, the prediction capability of the subchannel code was evaluated for subchannel void distribution data by participating in an international code benchmark program organized by OECD/NRC.

  10. The ASME Boiler and Pressure Vessel Code: overview

    International Nuclear Information System (INIS)

    Farr, J.R.

    1987-01-01

    To become familiar with the Boiler and Pressure Vessel Code of the American Society of Mechanical Engineers, it is necessary to understand the history, organization, and operation of the Boiler Code Committee as well as to become familiar with the important aspects of each Section of the Code. This chapter will review the background and contents of the Code as well as give a review of the salient contents of most sections. (author)

  11. LFSC - Linac Feedback Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, Valentin; /Fermilab

    2008-05-01

    The computer program LFSC (Linac Feedback Simulation Code) is a numerical tool for simulating beam-based feedback in high performance linacs. The code LFSC is based on the earlier version developed by a collective of authors at SLAC (L.Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulation of the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback on timescales corresponding to 5-100 Hz and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab on the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses the LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data etc. The Matlab environment provides a flexible system for graphical output.

  12. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.

  13. IAMBUS, a computer code for the design and performance prediction of fast breeder fuel rods

    International Nuclear Information System (INIS)

    Toebbe, H.

    1990-05-01

    IAMBUS is a computer code for the thermal and mechanical design, in-pile performance prediction and post-irradiation analysis of fast breeder fuel rods. The code deals with steady, non-steady and transient operating conditions and enables prediction of the in-pile behavior of fuel rods in power reactors as well as in experimental rigs. Great effort went into the development of a realistic account of non-steady fuel rod operating conditions. The main emphasis is placed on characterizing the mechanical interaction taking place between the cladding tube and the fuel as a result of contact pressure and friction forces, with due consideration of the axial and radial crack configuration within the fuel as well as the gradual transition at the elastic/plastic interface with respect to fuel behavior. IAMBUS can be readily adapted to various fuel and cladding materials. The specific models and material correlations of the reference version deal with the actual in-pile behavior and physical properties of the KNK II and SNR 300 related fuel rod design, confirmed by comparison of the fuel performance model with post-irradiation data. The comparison comprises steady, non-steady and transient irradiation experiments within the German/Belgian fuel rod irradiation program. The code is further validated by comparison of model predictions with post-irradiation data of standard fuel and breeder rods of Phenix and PFR as well as selected LWR fuel rods in non-steady operating conditions

  14. Design of convolutional tornado code

    Science.gov (United States)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in a burst-erasure environment and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and to reduce computational complexity by removing the multi-level structure. The simulation results show that the cTN code can provide better packet-loss protection performance with lower computational complexity than the tTN code.

  15. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
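
    Stabilized linear inversion of the type applied by INVERT can be sketched, in its most generic form, as Tikhonov-regularized least squares. The forward operator, noise level and regularization weight below are hypothetical, and the sketch is not the INVERT source code.

        import numpy as np

        def stabilized_inverse(G, d, lam=1e-2):
            """Tikhonov-regularized least squares, m = (G^T G + lam I)^-1 G^T d, as a
            generic illustration of stabilized linear inversion."""
            n = G.shape[1]
            return np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ d)

        # Hypothetical forward operator relating subsurface density-anomaly cells to
        # surface Bouguer gravity readings, with a small amount of observation noise.
        rng = np.random.default_rng(1)
        G = rng.random((30, 10))
        m_true = np.linspace(0.0, 1.0, 10)
        d = G @ m_true + 0.01 * rng.standard_normal(30)
        print(stabilized_inverse(G, d).round(2))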

  16. Analytical validation of the CACECO containment analysis code

    International Nuclear Information System (INIS)

    Peak, R.D.

    1979-08-01

    The CACECO containment analysis code was developed to predict the thermodynamic responses of LMFBR containment facilities to a variety of accidents. This report covers the verification of the CACECO code by problems that can be solved by hand calculations or by reference to textbook and literature examples. The verification concentrates on the accuracy of the material and energy balances maintained by the code and on the independence of the four cells analyzed by the code so that the user can be assured that the code analyses are numerically correct and independent of the organization of the input data submitted to the code

  17. ABOUT THE ORGANIZATION OF THE LEGAL FOUNDATIONS OF THE NEW EDITION OF THE UKRAINIAN AIR CODE

    Directory of Open Access Journals (Sweden)

    R. T. Baran

    2009-06-01

    Full Text Available The authors’ own scientific and practical approaches to drafting the clauses of the new Air Code of Ukraine are proposed. The conceptual bases of the organization and legal regulation of the legislative provisions are presented, particularly those concerning the chapters that regulate the conditions and order of use of the airspace of Ukraine, the organizational and economic aspects of airport activities, etc. Models for structuring the organizational subsystems of the commercial and state sectors of the airspace, as well as forms of organizational and managerial structures, managerial methods and airport economic systems, are also proposed.

  18. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    Science.gov (United States)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for coded cooperation system with joint iterative decoding in the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles including both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation well combines cooperation gain and channel coding gain, and outperforms the coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
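
    A girth-4 (length-4) cycle in a Tanner graph exists exactly when two rows of the parity-check matrix share ones in at least two columns. The sketch below applies this generic test to small binary matrices; it does not use the exponent-matrix conditions that the paper exploits for the joint QC-LDPC design, and the example matrices are assumptions for illustration.

        import numpy as np

        def has_girth4_cycle(H):
            """A length-4 cycle exists in the Tanner graph exactly when two rows of the
            binary parity-check matrix share ones in two or more columns."""
            overlap = H @ H.T          # overlap[i, j] = number of columns where rows i and j both have a 1
            np.fill_diagonal(overlap, 0)
            return bool((overlap >= 2).any())

        H_bad = np.array([[1, 1, 0, 0],
                          [1, 1, 0, 1],           # rows 0 and 1 overlap in two columns -> 4-cycle
                          [0, 0, 1, 1]])
        H_good = np.array([[1, 1, 0, 0],
                           [0, 1, 1, 0],
                           [0, 0, 1, 1]])
        print(has_girth4_cycle(H_bad), has_girth4_cycle(H_good))   # True False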

  19. Parallelization of Subchannel Analysis Code MATRA

    International Nuclear Information System (INIS)

    Kim, Seongjin; Hwang, Daehyun; Kwon, Hyouk

    2014-01-01

    A stand-alone calculation with the MATRA code takes considerable computing time for the thermal margin calculations, and substantially more time is needed to solve whole-core pin-by-pin problems. In addition, improving the computation speed of the MATRA code is strongly required to satisfy the overall performance of multi-physics coupling calculations. Therefore, a parallel approach to improve and optimize the computability of the MATRA code is proposed and verified in this study. The parallel algorithm is embodied in the MATRA code using the MPI communication method, and modification of the previous code structure was minimized. The improvement is confirmed by comparing the results between the single- and multiple-processor algorithms. The speedup and efficiency are also evaluated when increasing the number of processors. The parallel algorithm was implemented in the subchannel code MATRA using MPI. The performance of the parallel algorithm was verified by comparing the results with those from MATRA run on a single processor. It is also noted that the performance of the MATRA code was greatly improved by implementing the parallel algorithm for the 1/8-core and whole-core problems
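
    The decomposition idea behind such an MPI parallelization can be sketched as follows; mpi4py is used here for brevity even though MATRA itself is a Fortran code, and the subchannel count and the placeholder per-channel computation are assumptions, not MATRA's actual scheme.

        # Run with, e.g.: mpiexec -n 4 python subchannel_mpi_sketch.py
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        n_channels = 1000                                # hypothetical number of subchannels
        my_channels = np.array_split(np.arange(n_channels), size)[rank]

        # Placeholder for the per-subchannel thermal-hydraulic solve on this rank.
        local_enthalpy = 1280.0 + 0.05 * my_channels

        # Global reduction, e.g. the hottest subchannel enthalpy needed by every rank.
        global_max = comm.allreduce(local_enthalpy.max(), op=MPI.MAX)
        if rank == 0:
            print("maximum subchannel enthalpy:", global_max)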

  20. Achievable Performance of Zero-Delay Variable-Rate Coding in Rate-Constrained Networked Control Systems with Channel Delay

    DEFF Research Database (Denmark)

    Barforooshan, Mohsen; Østergaard, Jan; Stavrou, Fotios

    2017-01-01

    This paper presents an upper bound on the minimum data rate required to achieve a prescribed closed-loop performance level in networked control systems (NCSs). The considered feedback loop includes a linear time-invariant (LTI) plant with a single measurement output and a single control input. Moreover, in this NCS, a causal but otherwise unconstrained feedback system carries out zero-delay variable-rate coding and control. Between the encoder and decoder, data is exchanged over a rate-limited noiseless digital channel with a known constant time delay. Here we propose a linear source-coding scheme

  1. Opportunistic Adaptive Transmission for Network Coding Using Nonbinary LDPC Codes

    Directory of Open Access Journals (Sweden)

    Cocco Giuseppe

    2010-01-01

    Full Text Available Network coding makes it possible to exploit the spatial diversity naturally present in mobile wireless networks and can be seen as an example of cooperative communication at the link layer and above. Such a promising technique needs to rely on a suitable physical layer in order to achieve its best performance. In this paper, we present an opportunistic packet scheduling method based on physical layer considerations. We extend the channel adaptation proposed for the broadcast phase of asymmetric two-way bidirectional relaying to a generic number of sinks and apply it to a network context. The method consists of adapting the information rate for each receiving node according to its channel status and independently of the other nodes. In this way, a higher network throughput can be achieved at the expense of a slightly higher complexity at the transmitter. This configuration allows rate adaptation to be performed while fully preserving the benefits of channel and network coding. We carry out an information-theoretical analysis of this approach and of the one typically used in network coding. Numerical results based on nonbinary LDPC codes confirm the effectiveness of our approach with respect to previously proposed opportunistic scheduling techniques.

  2. Comparison of the ENIGMA code with experimental data on thermal performance, stable fission gas and iodine release at high burnup

    Energy Technology Data Exchange (ETDEWEB)

    Killeen, J C [Nuclear Electric plc, Barnwood (United Kingdom)

    1997-08-01

    The predictions of the ENIGMA code have been compared with data from high burn-up fuel experiments from the Halden and RISO reactors. The experiments modelled were IFA-504 and IFA-558 from Halden and the test II-5 from the RISO power burnup test series. The code has modelled the fuel thermal performance well and has provided a good measure of iodine release from pre-interlinked fuel. After interlinkage the iodine predictions remain a good fit for one experiment, but there is significant overprediction for a second experiment (IFA-558). Stable fission gas release is also well modelled and the predictions are within the expected uncertainty band throughout the burn-up range. This report presents code predictions for stable fission gas release to 40GWd/tU, iodine release measurements to 50GWd/tU and thermal performance (fuel centre temperature) to 55GWd/tU. Fuel ratings of up to 38kW/m were modelled at the high burn-up levels. The code is shown to accurately or conservatively predict all these parameters. (author). 1 ref., 6 figs.

  3. Advanced thermohydraulic simulation code for transients in LMFBRs (SSC-L code)

    Energy Technology Data Exchange (ETDEWEB)

    Agrawal, A.K.

    1978-02-01

    Physical models for various processes that are encountered in preaccident and transient simulation of thermohydraulic transients in the entire liquid metal fast breeder reactor (LMFBR) plant are described in this report. A computer code, SSC-L, was written as a part of the Super System Code (SSC) development project for the 'loop'-type designs of LMFBRs. This code has the self-starting capability, i.e., preaccident or steady-state calculations are performed internally. These results then serve as the starting point for the transient simulation.

  4. Advanced thermohydraulic simulation code for transients in LMFBRs (SSC-L code)

    International Nuclear Information System (INIS)

    Agrawal, A.K.

    1978-02-01

    Physical models for various processes that are encountered in preaccident and transient simulation of thermohydraulic transients in the entire liquid metal fast breeder reactor (LMFBR) plant are described in this report. A computer code, SSC-L, was written as a part of the Super System Code (SSC) development project for the 'loop'-type designs of LMFBRs. This code has the self-starting capability, i.e., preaccident or steady-state calculations are performed internally. These results then serve as the starting point for the transient simulation

  5. Building codes : obstacle or opportunity?

    Science.gov (United States)

    Alberto Goetzl; David B. McKeever

    1999-01-01

    Building codes are critically important in the use of wood products for construction. The codes contain regulations that are prescriptive or performance related for various kinds of buildings and construction types. A prescriptive standard might dictate that a particular type of material be used in a given application. A performance standard requires that a particular...

  6. Interface requirements for coupling a containment code to a reactor system thermal hydraulic codes

    International Nuclear Information System (INIS)

    Baratta, A.J.

    1997-01-01

    To perform a complete analysis of a reactor transient, not only the primary system response but the containment response must also be accounted for. Such transients and accidents as a loss of coolant accident in both pressurized water and boiling water reactors and inadvertent operation of safety relief valves all challenge the containment and may influence flows because of containment feedback. More recently, the advanced reactor designs put forth by General Electric and Westinghouse in the US and by Framatome and Seimens in Europe rely on the containment to act as the ultimate heat sink. Techniques used by analysts and engineers to analyze the interaction of the containment and the primary system were usually iterative in nature. Codes such as RELAP or RETRAN were used to analyze the primary system response and CONTAIN or CONTEMPT the containment response. The analysis was performed by first running the system code and representing the containment as a fixed pressure boundary condition. The flows were usually from the primary system to the containment initially and generally under choked conditions. Once the mass flows and timing are determined from the system codes, these conditions were input into the containment code. The resulting pressures and temperatures were then calculated and the containment performance analyzed. The disadvantage of this approach becomes evident when one performs an analysis of a rapid depressurization or a long term accident sequence in which feedback from the containment can occur. For example, in a BWR main steam line break transient, the containment heats up and becomes a source of energy for the primary system. Recent advances in programming and computer technology are available to provide an alternative approach. The author and other researchers have developed linkage codes capable of transferring data between codes at each time step allowing discrete codes to be coupled together
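
    The time-step-wise data exchange described above can be illustrated with a toy explicit coupling loop in which both "codes" are reduced to one-line placeholder models; the coefficients and pressures are arbitrary assumptions, and the sketch only shows the direction of the data flow, not the physics of RELAP/CONTAIN-class codes.

        # Toy explicit coupling: each time step the "system code" gets the containment
        # back-pressure and returns a break mass flow; the "containment code" gets that
        # flow and returns an updated pressure. All coefficients are arbitrary assumptions.
        def system_code_step(p_containment, p_primary):
            return max(0.0, 2.0e-4 * (p_primary - p_containment))     # break flow [kg/s], toy law

        def containment_code_step(p_containment, m_dot, dt, stiffness=50.0):
            return p_containment + stiffness * m_dot * dt             # toy pressure build-up [Pa]

        p_primary, p_cont, dt = 7.0e6, 1.0e5, 0.1
        for step in range(10):
            m_dot = system_code_step(p_cont, p_primary)        # system code -> containment code
            p_cont = containment_code_step(p_cont, m_dot, dt)  # containment code -> system code
            p_primary -= 1.0e3 * m_dot * dt                    # primary blows down as mass is lost
        print(round(p_cont), round(p_primary))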

  7. Interface requirements for coupling a containment code to a reactor system thermal hydraulic codes

    Energy Technology Data Exchange (ETDEWEB)

    Baratta, A.J.

    1997-07-01

    To perform a complete analysis of a reactor transient, not only the primary system response but also the containment response must be accounted for. Such transients and accidents as a loss of coolant accident in both pressurized water and boiling water reactors and inadvertent operation of safety relief valves all challenge the containment and may influence flows because of containment feedback. More recently, the advanced reactor designs put forth by General Electric and Westinghouse in the US and by Framatome and Siemens in Europe rely on the containment to act as the ultimate heat sink. Techniques used by analysts and engineers to analyze the interaction of the containment and the primary system were usually iterative in nature. Codes such as RELAP or RETRAN were used to analyze the primary system response and CONTAIN or CONTEMPT the containment response. The analysis was performed by first running the system code and representing the containment as a fixed pressure boundary condition. The flows were usually from the primary system to the containment initially and generally under choked conditions. Once the mass flows and timing were determined from the system codes, these conditions were input into the containment code. The resulting pressures and temperatures were then calculated and the containment performance analyzed. The disadvantage of this approach becomes evident when one performs an analysis of a rapid depressurization or a long term accident sequence in which feedback from the containment can occur. For example, in a BWR main steam line break transient, the containment heats up and becomes a source of energy for the primary system. Recent advances in programming and computer technology are available to provide an alternative approach. The author and other researchers have developed linkage codes capable of transferring data between codes at each time step, allowing discrete codes to be coupled together.

  8. Organic vegetable proteins and oil in feed for organic rainbow trout (Oncorhynchus mykiss)

    DEFF Research Database (Denmark)

    Lund, Ivar; Dalsgaard, Anne Johanne Tang; Jokumsen, Alfred

    The demand for organic trout is increasing, stressing the need for organic, vegetable feed ingredients as replacement for fish meal, as the principles of organic aquaculture encourage the development of feed that does not deplete global fish stocks. In addition, the organic code of practice does not allow addition of artificial amino acids to the feed, and optimization of the amino acid profile of organically based diets must therefore derive from the protein sources alone. The aim of this study was to evaluate the digestibility and growth performance of organic vegetable dietary ingredients as replacement for fish meal and fish oil in feed for organic rainbow trout (Oncorhynchus mykiss). Six iso-energetic and iso-nitrogenous diets were prepared, comprising a fish meal and fish oil based control diet and three diets in which the inclusion of fish meal was gradually reduced from 59 to 35...

  9. Comparative analysis of thermodynamic performance and optimization of organic flash cycle (OFC) and organic Rankine cycle (ORC)

    International Nuclear Information System (INIS)

    Lee, Ho Yong; Park, Sang Hee; Kim, Kyoung Hoon

    2016-01-01

    A comparative thermodynamic performance and optimization analysis of basic organic flash cycle (OFCB), organic flash cycle with two-phase expander (OFCT), and organic Rankine cycle (ORC) activated by low-temperature sensible energy is carried out in the subcritical pressure regions. The three substances of R245fa, R123, and o-xylene are considered as the working fluids. Effects of cycle type, working fluid, and evaporation and source temperatures are systemically investigated on the system performance such as net power production, thermal and exergy efficiencies, and exergy destruction ratios at each component of the systems. Results show that the cycle type or working fluid which shows optimum performance depends on the source temperature, and organic flash cycle shows a potential for efficient recovery of low grade energy source.

  10. Network Coding Fundamentals and Applications

    CERN Document Server

    Medard, Muriel

    2011-01-01

    Network coding is a field of information and coding theory and is a method of attaining maximum information flow in a network. This book is an ideal introduction for the communications and network engineer, working in research and development, who needs an intuitive introduction to network coding and to the increased performance and reliability it offers in many applications. This book is an ideal introduction for the research and development communications and network engineer who needs an intuitive introduction to the theory and wishes to understand the increased performance and reliabil
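
    The basic idea the book builds on can be shown in a few lines: an intermediate node forwards a combination (here a simple XOR over GF(2)) of two packets, and a receiver that already holds one packet recovers the other. The packet sizes and contents below are arbitrary illustrative values, not an example from the book.

        # Minimal illustration of network coding over GF(2): the bottleneck link
        # carries the XOR of two packets instead of forwarding them one by one.
        import os

        a = bytearray(os.urandom(8))     # packet from source A
        b = bytearray(os.urandom(8))     # packet from source B
        coded = bytearray(x ^ y for x, y in zip(a, b))   # what the coded link carries

        # A receiver that already obtained packet a recovers b from the coded packet:
        b_recovered = bytearray(x ^ y for x, y in zip(coded, a))
        assert b_recovered == b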

  11. Performance Analysis of an Astrophysical Simulation Code on the Intel Xeon Phi Architecture

    OpenAIRE

    Noormofidi, Vahid; Atlas, Susan R.; Duan, Huaiyu

    2015-01-01

    We have developed the astrophysical simulation code XFLAT to study neutrino oscillations in supernovae. XFLAT is designed to utilize multiple levels of parallelism through MPI, OpenMP, and SIMD instructions (vectorization). It can run on both CPU and Xeon Phi co-processors based on the Intel Many Integrated Core Architecture (MIC). We analyze the performance of XFLAT on configurations with CPU only, Xeon Phi only and both CPU and Xeon Phi. We also investigate the impact of I/O and the multi-n...

  12. Description of the COMRADEX code

    International Nuclear Information System (INIS)

    Spangler, G.W.; Boling, M.; Rhoades, W.A.; Willis, C.A.

    1967-01-01

    The COMRADEX Code is discussed briefly and instructions are provided for the use of the code. The subject code was developed for calculating doses from hypothetical power reactor accidents. It permits the user to analyze four successive levels of containment with time-varying leak rates. Filtration, cleanup, fallout and plateout in each containment shell can also be analyzed. The doses calculated include the direct gamma dose from the containment building, the internal doses to as many as 14 organs including the thyroid, bone, lung, etc. from inhaling the contaminated air, and the external gamma doses from the cloud. While further improvements are needed, such as a provision for calculating doses from fallout, rainout and washout, the present code capabilities have a wide range of applicability for reactor accident analysis
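
    As a very loose, hypothetical illustration of the kind of multi-compartment transport the record describes (four successive containment levels with time-varying leak rates), the sketch below moves activity through a chain of compartments with a simple explicit Euler scheme; the leak rates, times and units are invented and no dose conversion is attempted.

        # Toy activity-transport chain: activity A[i] in containment level i leaks
        # into level i+1 with a time-varying rate lam(i, t); values are illustrative.

        def lam(i, t):
            base = [0.5, 0.2, 0.1, 0.05]                 # hypothetical leak rates (1/h)
            return base[i] * (1.0 if t < 2.0 else 0.3)   # rates drop after 2 h

        A = [1.0, 0.0, 0.0, 0.0, 0.0]   # last entry: activity released to environment
        dt, t = 0.001, 0.0
        while t < 10.0:                  # hours
            flows = [lam(i, t) * A[i] * dt for i in range(4)]
            for i in range(4):
                A[i] -= flows[i]
                A[i + 1] += flows[i]
            t += dt
        print([round(x, 4) for x in A])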

  13. First vapor explosion calculations performed with MC3D thermal-hydraulic code

    Energy Technology Data Exchange (ETDEWEB)

    Brayer, C.; Berthoud, G. [CEA Centre d`Etudes de Grenoble, 38 (France). Direction des Reacteurs Nucleaires

    1998-01-01

    This paper presents the first calculations performed with the `explosion` module of the multiphase computer code MC3D, which is devoted to the fine fragmentation and explosion phase of a fuel coolant interaction. A complete description of the physical laws included in this module is given. The fragmentation models, taking into account two fragmentation mechanisms, a thermal one and a hydrodynamic one, are also developed here. Results of calculations performed to test the numerical behavior of MC3D and to test the explosion models in 1D and 2D are also presented. (author)

  14. Codes of conduct: An extra suave instrument of EU governance?

    DEFF Research Database (Denmark)

    Borras, Susana

    able to coordinate actors successfully (effectiveness)? and secondly, under what conditions are codes of conduct able to generate democratically legitimate political processes? The paper examines carefully a recent case study, the “Code of Conduct for the Recruitment of Researchers” (CCRR). The code...... establishes a specific set of voluntary norms and principles that shall guide the recruiting process of researchers by European research organizations (universities, public research organizations and firms) in the 33 countries of the single market minded initiative of the European Research Area. A series...

  15. Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes

    Science.gov (United States)

    Harrington, James William

    Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g. , factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. I also present
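
    To make the toric-code part of the abstract concrete, the sketch below places qubits on the edges of an L x L torus, applies independent bit-flip (X) errors, and extracts the Z-plaquette syndrome; a decoder (e.g. minimum-weight matching) would then pair up the violated plaquettes. The lattice size, error rate and edge-labelling convention are illustrative assumptions, not the models simulated in the thesis.

        # Syndrome extraction for the toric code under independent bit-flip noise.
        import numpy as np

        L, p = 8, 0.05
        rng = np.random.default_rng(0)

        ex_h = rng.random((L, L)) < p    # X errors on horizontal edges
        ex_v = rng.random((L, L)) < p    # X errors on vertical edges

        # A Z plaquette is violated when an odd number of its four edges carry an X error.
        syndrome = np.zeros((L, L), dtype=int)
        for i in range(L):
            for j in range(L):
                syndrome[i, j] = (ex_h[i, j] ^ ex_h[(i + 1) % L, j]
                                  ^ ex_v[i, j] ^ ex_v[i, (j + 1) % L])
        print("violated plaquettes:", syndrome.sum())   # always an even number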

  16. Human performance improvement in organizations: Potential application for the nuclear industry

    International Nuclear Information System (INIS)

    2005-11-01

    This publication is primarily intended for managers and specialists in nuclear facility operating organizations working in the area of human performance improvement. It is intended to provide them with practical information they can use to improve human performance in their organizations. While some of the information provided in this publication is based upon the experience of nuclear facility operating organizations, most of it comes from human performance improvement initiatives in non-nuclear organizations and industries. The nuclear industry has a long tradition of sharing good management practices in order to foster continuous improvement. However, it is not always realized that many of the practices that are now well established initially came from non-nuclear industries and were subsequently adapted for application to nuclear power plant operating organizations. There is, therefore, good reason to periodically review non-nuclear industry practices for ideas that might have direct or indirect application to the nuclear industry in order to potentially gain benefits such as the following: new approaches to certain problem areas, insights into new or impending challenges, improvements in existing practices, benchmarking of opportunities, development of learning organizations and avoidance of collective blind spots. The preparation of this report was an activity of the project on Effective Training to Achieve Excellence in the Performance of NPP Personnel. The objective of this project is to enhance the capability of Member States to utilize proven practices developed and transferred by the IAEA for improving personnel performance. The expected outcome from this project is the increased use by organizations in Member States of proven engineering and management practices and methodologies developed and transferred by the IAEA to improve personnel performance

  17. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    International Nuclear Information System (INIS)

    Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-01-01

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.
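
    The multislice algorithm at the heart of such codes reduces to a short loop in which the wave function is multiplied by a slice transmission function and then propagated in reciprocal space. The plain-NumPy sketch below shows only that loop, with made-up slice potentials and sampling parameters; it is not the STEMsalabim implementation, which adds frozen-lattice configurations, probe scanning and the MPI/thread parallelization described above.

        import numpy as np

        N, n_slices = 256, 20
        dz, lam_e, dx = 2.0, 0.0197, 0.1       # slice thickness, wavelength, sampling (Å)

        k = np.fft.fftfreq(N, d=dx)
        kx, ky = np.meshgrid(k, k, indexing="ij")
        propagator = np.exp(-1j * np.pi * lam_e * dz * (kx**2 + ky**2))   # Fresnel factor

        rng = np.random.default_rng(1)
        phase = 0.05 * rng.standard_normal((n_slices, N, N))   # stand-in projected potentials
        transmission = np.exp(1j * phase)

        psi = np.ones((N, N), dtype=complex)                   # plane-wave illumination
        for n in range(n_slices):
            psi = np.fft.ifft2(np.fft.fft2(transmission[n] * psi) * propagator)

        diffraction = np.abs(np.fft.fftshift(np.fft.fft2(psi)))**2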

  18. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    Energy Technology Data Exchange (ETDEWEB)

    Oelerich, Jan Oliver, E-mail: jan.oliver.oelerich@physik.uni-marburg.de; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-06-15

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.

  19. Performance analysis of organic Rankine cycles using different working fluids

    Directory of Open Access Journals (Sweden)

    Zhu Qidi

    2015-01-01

    Full Text Available Low-grade heat from renewable or waste energy sources can be effectively recovered to generate power by an organic Rankine cycle (ORC), in which the working fluid has an important impact on its performance. The thermodynamic processes of ORCs using different types of organic fluids were analyzed in this paper. The relationships between the ORC’s performance parameters (including evaporation pressure, condensing pressure, outlet temperature of hot fluid, net power, thermal efficiency, exergy efficiency, total cycle irreversible loss, and total heat-recovery efficiency) and the critical temperatures of organic fluids were established based on the property of the hot fluid through the evaporator in a specific working condition, and then were verified at varied evaporation temperatures and inlet temperatures of the hot fluid. Here we find that the performance parameters vary monotonically with the critical temperatures of organic fluids. The values of the performance parameters of the ORC using wet fluids are more widely scattered with respect to the critical temperatures, compared with those of the ORC using dry/isentropic fluids. The inlet temperature of the hot fluid affects the relative distribution of the exergy efficiency, whereas the evaporation temperature only has an impact on the performance parameters when using wet fluids.
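
    A minimal state-point calculation of the kind underlying such comparisons is sketched below for a single working fluid, assuming the CoolProp property library is available and using idealized (isentropic, no-pressure-drop) components with arbitrary temperatures. It is not the model of the paper, but it shows how thermal efficiency follows from the fluid properties.

        from CoolProp.CoolProp import PropsSI   # assumed dependency

        fluid, T_evap, T_cond = "R245fa", 380.0, 308.0        # illustrative temperatures (K)
        p_evap = PropsSI("P", "T", T_evap, "Q", 1, fluid)
        p_cond = PropsSI("P", "T", T_cond, "Q", 0, fluid)

        h1 = PropsSI("H", "T", T_cond, "Q", 0, fluid)          # saturated liquid at condenser
        s1 = PropsSI("S", "T", T_cond, "Q", 0, fluid)
        h2 = PropsSI("H", "P", p_evap, "S", s1, fluid)         # after isentropic pump
        h3 = PropsSI("H", "T", T_evap, "Q", 1, fluid)          # saturated vapor at evaporator
        s3 = PropsSI("S", "T", T_evap, "Q", 1, fluid)
        h4 = PropsSI("H", "P", p_cond, "S", s3, fluid)         # after isentropic expansion

        w_net = (h3 - h4) - (h2 - h1)
        q_in = h3 - h2
        print(f"{fluid}: thermal efficiency = {w_net / q_in:.3f}")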

  20. IRSN Code of Ethics and Professional Conduct. Annex VII [TSO Mission Statement and Code of Ethics

    International Nuclear Information System (INIS)

    2018-01-01

    IRSN has adopted, in 2013, a Code of Ethics and Professional Conduct, the contents of which are summarized. As a preamble, it is indicated that the Code, which was adopted in 2013 by the Ethics Commission of IRSN and the Board of IRSN, complies with relevant constitutional and legal requirements. The introduction to the Code presents the role and missions of IRSN in the French system, as well as the various conditions and constraints that frame its action, in particular with respect to ethical issues. It states that the Code sets principles and establishes guidance for addressing these constraints and resolving conflicts that may arise, thus constituting references for the Institute and its staff, and helping IRSN’s partners in their interaction with the Institute. The stipulations of the Code are organized in four articles, reproduced and translated.

  1. LFSC - Linac Feedback Simulation Code

    International Nuclear Information System (INIS)

    Ivanov, Valentin; Fermilab

    2008-01-01

    The computer program LFSC ( ) is a numerical tool for simulating beam-based feedback in high-performance linacs. The code LFSC is based on the earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations of the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback on timescales corresponding to 5-100 Hz and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab on the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data, etc. The Matlab environment provides a flexible system for graphical output
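
    A toy version of the pulse-to-pulse feedback idea (not the LFSC algorithm, and written in Python rather than Matlab for illustration) is shown below: a slowly drifting beam position is read back with noise on every pulse and a gain-weighted correction is applied to the next pulse. The gain, drift and noise levels are arbitrary.

        import random

        random.seed(0)
        gain, correction, drift = 0.3, 0.0, 0.0
        for pulse in range(500):             # e.g. a 5-100 Hz pulse train
            drift += 0.02                    # slow, ground-motion-like drift (a.u.)
            measured = drift + correction + random.gauss(0.0, 0.1)   # noisy BPM reading
            correction -= gain * measured    # feedback update applied to the next pulse
        print("residual offset:", round(drift + correction, 3))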

  2. Basic concept of common reactor physics code systems. Final report of working party on common reactor physics code systems (CCS)

    International Nuclear Information System (INIS)

    2004-03-01

    A working party on common reactor physics code systems was organized for two years (2001-2002) under the Research Committee on Reactor Physics of JAERI. This final report is a compilation of the working party's activity during those two years. The objective of the working party was to clarify the basic concept of common reactor physics code systems, in order to improve the convenience of reactor physics code systems for reactor physics researchers in Japan across their various fields of research and development. We held four meetings during the two years, investigated the status of reactor physics code systems and innovative software technologies, and discussed the basic concept of common reactor physics code systems. (author)

  3. Information privacy in organizations: empowering creative and extrarole performance.

    Science.gov (United States)

    Alge, Bradley J; Ballinger, Gary A; Tangirala, Subrahmaniam; Oakley, James L

    2006-01-01

    This article examines the relationship of employee perceptions of information privacy in their work organizations and important psychological and behavioral outcomes. A model is presented in which information privacy predicts psychological empowerment, which in turn predicts discretionary behaviors on the job, including creative performance and organizational citizenship behavior (OCB). Results from 2 studies (Study 1: single organization, N=310; Study 2: multiple organizations, N=303) confirm that information privacy entails judgments of information gathering control, information handling control, and legitimacy. Moreover, a model linking information privacy to empowerment and empowerment to creative performance and OCBs was supported. Findings are discussed in light of organizational attempts to control employees through the gathering and handling of their personal information. (c) 2006 APA, all rights reserved.

  4. Development of Coolant Radioactivity Interpretation Code

    International Nuclear Information System (INIS)

    Kim, Kiyoung; Jung, Youngsuk; Kim, Kyounghyun; Kim, Jangwook

    2013-01-01

    In Korea, coolant radioactivity analysis has been performed using computer codes from foreign companies, such as CADE (Westinghouse) and IODYNE and CESIUM (ABB-CE). However, these codes are too conservative and have involved considerable errors. Furthermore, since they are DOS-based programs, they are not easy to operate. It was therefore necessary to develop an enhanced analysis algorithm, applying an analytical method that reflects the changed operational environments of domestic nuclear power plants, together with fuel failure evaluation software designed with user convenience in mind. We have developed a nuclear fuel failure evaluation code able to estimate the number of failed fuel rods and the burn-up of failed fuel during a nuclear power plant operating cycle. The Coolant Radioactivity Interpretation Code (CRIC) for LWRs has been developed as the output of the project 'Development of Fuel Reliability Enhanced Technique' organized by the Korea Institute of Energy Technology Evaluation and Planning (KETEP). CRIC is Windows-based software able to evaluate the number of failed fuel rods and the burn-up of the failed fuel region by analyzing the coolant radioactivity of an LWR in operation. CRIC is based on the fission product release model commonly known as the 'three region model' (pellet region, gap region, and coolant region), and we are verifying the CRIC results against cases of domestic fuel failures. CRIC users are able to estimate the number of failed fuel rods and the burn-up and region of the failed fuel, taking into account the enrichment and power distribution of the fuel region, by using the operational cycle data, coolant activity data, fuel loading pattern, and Cs-134/Cs-137 ratio as a function of burn-up and U-235 enrichment provided in the code. With the development of CRIC, a unique in-house fuel failure evaluation code has been secured, and it is expected to be significant in that the code reflects a proprietary technique for quantitatively

  5. The impact of Knowledge Management Infrastructure on Performance Effectiveness in Jordanian Organizations

    Directory of Open Access Journals (Sweden)

    Nasser Mohammad Soud Jaradat, Dr.

    2014-06-01

    Full Text Available This study aims to determine the impact of knowledge management infrastructure on the performance effectiveness of Jordanian organizations that need knowledge to perform their work and tasks. The study sample includes some public and private organizations working in Jordan and dealing with knowledge subjects. The findings indicated that there was a strong effect of knowledge management infrastructure on performance effectiveness. Organizations should establish knowledge directorates to discover and transmit knowledge to workers with a view to improving the creativity and distinctiveness of organizations.

  6. Simulating the performance of a distance-3 surface code in a linear ion trap

    Science.gov (United States)

    Trout, Colin J.; Li, Muyuan; Gutiérrez, Mauricio; Wu, Yukai; Wang, Sheng-Tao; Duan, Luming; Brown, Kenneth R.

    2018-04-01

    We explore the feasibility of implementing a small surface code with 9 data qubits and 8 ancilla qubits, commonly referred to as surface-17, using a linear chain of 171Yb+ ions. Two-qubit gates can be performed between any two ions in the chain with gate time increasing linearly with ion distance. Measurement of the ion state by fluorescence requires that the ancilla qubits be physically separated from the data qubits to avoid errors on the data due to scattered photons. We minimize the time required to measure one round of stabilizers by optimizing the mapping of the two-dimensional surface code to the linear chain of ions. We develop a physically motivated Pauli error model that allows for fast simulation and captures the key sources of noise in an ion trap quantum computer including gate imperfections and ion heating. Our simulations showed a consistent requirement of a two-qubit gate fidelity of ≥99.9% for the logical memory to have a better fidelity than physical two-qubit operations. Finally, we perform an analysis of the error subsets from the importance sampling method used to bound the logical error rates to gain insight into which error sources are particularly detrimental to error correction.

  7. Introduction of SCIENCE code package

    International Nuclear Information System (INIS)

    Lu Haoliang; Li Jinggang; Zhu Ya'nan; Bai Ning

    2012-01-01

    The SCIENCE code package is a set of neutronics tools based on 2D assembly calculations and 3D core calculations. It is made up of APOLLO2F, SMART and SQUALE and is used to perform the nuclear design and loading pattern analysis for the reactors in operation or under construction at China Guangdong Nuclear Power Group. The purpose of this paper is to briefly present the physical and numerical models used in each computational code of the SCIENCE code package, including a description of the general structure of the package, the coupling relationship between the APOLLO2-F transport lattice code and the SMART core nodal code, and the SQUALE code used for processing the core maps. (authors)

  8. Zero-forcing pre-coding for MIMO WiMAX transceivers: Performance analysis and implementation issues

    Science.gov (United States)

    Cattoni, A. F.; Le Moullec, Y.; Sacchi, C.

    Next generation wireless communication networks are expected to achieve ever increasing data rates. Multi-User Multiple-Input-Multiple-Output (MU-MIMO) is a key technique to obtain the expected performance, because such a technique combines the high capacity achievable using MIMO channel with the benefits of space division multiple access. In MU-MIMO systems, the base stations transmit signals to two or more users over the same channel, for this reason every user can experience inter-user interference. This paper provides a capacity analysis of an online, interference-based pre-coding algorithm able to mitigate the multi-user interference of the MU-MIMO systems in the context of a realistic WiMAX application scenario. Simulation results show that pre-coding can significantly increase the channel capacity. Furthermore, the paper presents several feasibility considerations for implementation of the analyzed technique in a possible FPGA-based software defined radio.
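
    The zero-forcing precoder itself is a one-line pseudo-inverse. The sketch below, with an arbitrary i.i.d. Rayleigh channel and a simple total-power normalization, shows that the effective channel H W becomes diagonal, i.e. the inter-user interference is nulled; it does not reproduce the WiMAX simulation setup or the FPGA considerations of the paper.

        import numpy as np

        rng = np.random.default_rng(3)
        K, Nt = 4, 8                                     # users, transmit antennas
        H = (rng.standard_normal((K, Nt))
             + 1j * rng.standard_normal((K, Nt))) / np.sqrt(2)

        # Zero-forcing precoder: right pseudo-inverse of the channel matrix
        W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
        W = W / np.linalg.norm(W)                        # crude total-power normalization

        s = (2 * rng.integers(0, 2, K) - 1).astype(complex)   # one BPSK symbol per user
        y = H @ (W @ s)                                       # noise-free received samples
        print(np.round(np.abs(H @ W), 2))   # ~diagonal: inter-user interference removed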

  9. Development of integrated computer code for analysis of risk reduction strategy

    International Nuclear Information System (INIS)

    Kim, Dong Ha; Kim, See Darl; Kim, Hee Dong

    2002-05-01

    The development of the MIDAS/TH integrated severe accident code was performed in three main areas: 1) addition of new models derived from the national experimental programs and models for APR-1400 Korea next generation reactor, 2) improvement of the existing models using the recently available results, and 3) code restructuring for user friendliness. The unique MIDAS/TH models include: 1) a kinetics module for core power calculation during ATWS, 2) a gap cooling module between the molten corium pool and the reactor vessel wall, 3) a penetration tube failure module, 4) a PAR analysis module, and 5) a look-up table for the pressure and dynamic load during steam explosion. The improved models include: 1) a debris dispersal module considering the cavity geometry during DCH, 2) hydrogen burn and deflagration-to-detonation transition criteria, 3) a peak pressure estimation module for hydrogen detonation, and 4) the heat transfer module between the molten corium pool and the overlying water. The sparger and the ex-vessel heat transfer module were assessed. To enhance user friendliness, code restructuring was performed. In addition, a sample of severe accident analysis results was organized under the preliminary database structure

  10. Parallelization of 2-D lattice Boltzmann codes

    International Nuclear Information System (INIS)

    Suzuki, Soichiro; Kaburaki, Hideo; Yokokawa, Mitsuo.

    1996-03-01

    Lattice Boltzmann (LB) codes to simulate two dimensional fluid flow are developed on vector parallel computer Fujitsu VPP500 and scalar parallel computer Intel Paragon XP/S. While a 2-D domain decomposition method is used for the scalar parallel LB code, a 1-D domain decomposition method is used for the vector parallel LB code to be vectorized along with the axis perpendicular to the direction of the decomposition. High parallel efficiency of 95.1% by the vector parallel calculation on 16 processors with 1152x1152 grid and 88.6% by the scalar parallel calculation on 100 processors with 800x800 grid are obtained. The performance models are developed to analyze the performance of the LB codes. It is shown by our performance models that the execution speed of the vector parallel code is about one hundred times faster than that of the scalar parallel code with the same number of processors up to 100 processors. We also analyze the scalability in keeping the available memory size of one processor element at maximum. Our performance model predicts that the execution time of the vector parallel code increases about 3% on 500 processors. Although the 1-D domain decomposition method has in general a drawback in the interprocessor communication, the vector parallel LB code is still suitable for the large scale and/or high resolution simulations. (author)
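
    The 1-D domain decomposition used by the vector-parallel code amounts to giving each processor a strip of rows plus one ghost row on each side and exchanging those ghost rows every time step. The mpi4py sketch below shows only that communication skeleton (assuming mpi4py is available and the global row count divides evenly among the ranks); the collision and streaming steps of the LB scheme are omitted.

        # Run with e.g.: mpirun -n 4 python halo_exchange.py
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        ny_global, nx = 800, 800
        ny = ny_global // size                      # rows owned by this rank
        f = np.zeros((ny + 2, nx))                  # one ghost row above and below
        f[1:-1, :] = rank                           # stand-in for distribution functions

        up, down = (rank - 1) % size, (rank + 1) % size   # periodic neighbours

        # exchange ghost rows; one Sendrecv per direction avoids deadlock
        comm.Sendrecv(f[1, :].copy(),  dest=up,   recvbuf=f[-1, :], source=down)
        comm.Sendrecv(f[-2, :].copy(), dest=down, recvbuf=f[0, :],  source=up)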

  11. Parallelization of 2-D lattice Boltzmann codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Soichiro; Kaburaki, Hideo; Yokokawa, Mitsuo

    1996-03-01

    Lattice Boltzmann (LB) codes to simulate two dimensional fluid flow are developed on vector parallel computer Fujitsu VPP500 and scalar parallel computer Intel Paragon XP/S. While a 2-D domain decomposition method is used for the scalar parallel LB code, a 1-D domain decomposition method is used for the vector parallel LB code to be vectorized along with the axis perpendicular to the direction of the decomposition. High parallel efficiency of 95.1% by the vector parallel calculation on 16 processors with 1152x1152 grid and 88.6% by the scalar parallel calculation on 100 processors with 800x800 grid are obtained. The performance models are developed to analyze the performance of the LB codes. It is shown by our performance models that the execution speed of the vector parallel code is about one hundred times faster than that of the scalar parallel code with the same number of processors up to 100 processors. We also analyze the scalability in keeping the available memory size of one processor element at maximum. Our performance model predicts that the execution time of the vector parallel code increases about 3% on 500 processors. Although the 1-D domain decomposition method has in general a drawback in the interprocessor communication, the vector parallel LB code is still suitable for the large scale and/or high resolution simulations. (author).

  12. Nondestructive testing standards and the ASME code

    International Nuclear Information System (INIS)

    Spanner, J.C.

    1991-04-01

    Nondestructive testing (NDT) requirements and standards are an important part of the ASME Boiler and Pressure Vessel Code. In this paper, the evolution of these requirements and standards is reviewed in the context of the unique technical and legal stature of the ASME Code. The coherent and consistent manner by which the ASME Code rules are organized is described, and the interrelationship between the various ASME Code sections, the piping codes, and the ASTM Standards is discussed. Significant changes occurred in ASME Sections V and XI during the 1980s, and these are highlighted along with projections and comments regarding future trends and changes in these important documents. 4 refs., 8 tabs

  13. Final Report for the FUMEX-III Exercise with the TRANSURANUS Code

    International Nuclear Information System (INIS)

    Van Uffelen, P.; Schubert, A.; Van de Laar, J.; Di Marcello, V.

    2013-01-01

    This report describes the results of the fourth round robin exercise organized by the IAEA for LWR fuel behaviour codes. The previous exercise was organized by the IAEA in 2002-2007. ITU contributed to the IAEA FUMEX-II co-ordinated research project (CRP) by: - The development of a high burn-up TRANSURANUS-WWER Version - Verification of the TRANSURANUS-WWER Version for WWER-1000 reactors - Further verification of the TRANSURANUS Code by selected irradiations from the IFPE Database - Transfer of the latest ITU knowledge in the following areas: high burnup effects and MOX behaviour, to the extent permitted by confidentiality. Within the FUMEX-III project this work continues by: - Knowledge transfer and release of the TRANSURANUS code to safety authorities in several neighbouring countries of the European Union - Further verification of the TRANSURANUS Code by selected irradiations from the IFPE Database: extending the verification of the TRANSURANUS code on the basis of the cases related to the behaviour of high burnup UO2 fuel, Gd-containing UO2 fuel and MOX fuel under normal operating conditions in LWRs, including WWER. The simulation of the pellet-cladding mechanical interaction received particular attention. - Extending the TRANSURANUS code for the simulation of fuel rods under accident conditions, such as a loss of coolant accident (LOCA) and a reactivity initiated accident (RIA), for which various code improvements have been implemented and other changes are still under development. During the entire period of the CRP, ITU has carried out all priority cases, except for IFA-519, for which insufficient data have been provided. All these results are considered in this report. However, in order to limit the redundancy of work of the various TRANSURANUS users involved in the FUMEX-III CRP, ITU has co-ordinated part of the work of the partners in Bulgaria, Romania and Italy. More precisely, the work on the WWER version of the code has been carried out in collaboration

  14. Performance in Public Organizations: Clarifying the Conceptual Space

    DEFF Research Database (Denmark)

    Andersen, Lotte Bøgh; Boesen, Andreas; Holm Pedersen, Lene

    2016-01-01

    's perspective is performance being assessed? Are the criteria formal or informal? Are the criteria subjective? Which process focus and product focus do they have, if any? What is the unit of analysis? Based on these distinctions, the performance criteria of existing studies used in an empirical review...... of management and performance are classified. The results illustrate how a systematization of the conceptual space of performance in public organizations can help researchers select what to study and what to leave out with greater accuracy while also bringing greater clarity to public debates about performance....

  15. Organization of cross-section data in the Monte Carlo code SPARTAN

    International Nuclear Information System (INIS)

    Bending, R.C.

    1974-01-01

    The Monte Carlo code SPARTAN is a general-purpose code intended for neutron or gamma transport calculations. The code is designed to accept physics data from a number of external libraries (which may be used singly or in combination) and to use this data with as little alteration as possible. Data obtained from one or several libraries are placed in an interface file on magnetic tape or disk, using a general hierarchical structure which allows particular data items to be accessed in a straightforward way. The interface file, with or without additional data from cards, is regarded as a data source for the main Monte Carlo calculation. A summary is given of the functional forms, sampling distributions, and particle interaction laws available at present, and some of the mathematical methods used are described. 5 references. (U.S.)

  16. Do senior management cultures affect performance? Evidence from Italian public healthcare organizations.

    Science.gov (United States)

    Prenestini, Anna; Lega, Federico

    2013-01-01

    Healthcare organizations are often characterized by diffuse power, ambiguous goals, and a plurality of actors. In this complex and pluralistic context, senior healthcare managers are expected to provide strategic direction and lead their organizations toward their goals and performance targets. The present work explores the relationship between senior management team culture and performance by investigating Italian public healthcare organizations in the Tuscany region. Our assessment of senior management culture was accomplished through the use of an established framework and a corresponding tool, the competing values framework, which supports the idea that specific aspects of performance are related to a dominant management culture. Organizational performance was assessed using a wide range of measures collected by a multidimensional performance evaluation system, which was developed in Tuscany to measure the performance of its 12 local health authorities (LHAs) and four teaching hospitals (THs). Usable responses were received from 80 senior managers of 11 different healthcare organizations (two THs and nine LHAs). Our findings show that Tuscan healthcare organizations are characterized by various dominant cultures: developmental, clan, rational, and hierarchical. These variations in dominant culture were associated with performance measures. The implications for management theory, professional practice, and public policy are discussed.

  17. Performance Assessment and analysis of national building codes with fire safety in all wards of a hospital

    Directory of Open Access Journals (Sweden)

    M. Mahdinia

    2009-04-01

    Full Text Available Background and objective: Fire safety is an important problem in hospitals. Limited mobility, lack of awareness, and the special situation of the residents are the reasons for this. In many countries, such as Iran, fire protection regulations have been compiled within the framework of national building codes. Current building codes do not create sufficient safety for patients in hospitals under all conditions, and many of the advanced countries in the world are making efforts to establish performance-based building codes. This study was carried out with the goal of determining the fire risk level in the wards of a hospital and assessing the efficiency of the national building codes. Methods: Fire risk assessment was done using the "engineering fire risk assessment method", with a checklist for data gathering. In this manner, the risk was calculated in all compartments; in the next stage, to survey the effect of the building codes, the risk level was calculated again under the supposition that all compartments conform to the building code requirements. Results: The results of the present study reveal that the risk level in all wards is greater than one, and since only a risk of less than one is acceptable, the minimum safety conditions were not achieved in most wards. The results also show that the national building codes, under the different conditions, are not sufficiently effective at creating suitable safety. Conclusion: In order to achieve a fire safety design with sufficient efficiency, the use of risk-assessment-based design methods is a suitable choice.

  18. Design and performance analysis for several new classes of codes for optical synchronous CDMA and for arbitrary-medium time-hopping synchronous CDMA communication systems

    Science.gov (United States)

    Kostic, Zoran; Titlebaum, Edward L.

    1994-08-01

    New families of spread-spectrum codes are constructed, that are applicable to optical synchronous code-division multiple-access (CDMA) communications as well as to arbitrary-medium time-hopping synchronous CDMA communications. Proposed constructions are based on the mappings from integer sequences into binary sequences. We use the concept of number theoretic quadratic congruences and a subset of Reed-Solomon codes similar to the one utilized in the Welch-Costas frequency-hop (FH) patterns. The properties of the codes are as good as or better than the properties of existing codes for synchronous CDMA communications: Both the number of code-sequences within a single code family and the number of code families with good properties are significantly increased when compared to the known code designs. Possible applications are presented. To evaluate the performance of the proposed codes, a new class of hit arrays called cyclical hit arrays is recalled, which give insight into the previously unknown properties of the few classes of number theoretic FH patterns. Cyclical hit arrays and the proposed mappings are used to determine the exact probability distribution functions of random variables that represent interference between users of a time-hopping or optical CDMA system. Expressions for the bit error probability in multi-user CDMA systems are derived as a function of the number of simultaneous CDMA system users, the length of signature sequences and the threshold of a matched filter detector. The performance results are compared with the results for some previously known codes.
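
    The quadratic congruence construction mentioned above is easy to reproduce: each code is the placement sequence y_a(k) = a*k^2 mod p for a prime p, and code quality is judged by the number of coincidences ("hits") between shifted sequences. The sketch below builds one such family for an arbitrary small prime and measures the worst-case cross-correlation; it illustrates the construction only, not the papers' full code designs or the bit-error-rate analysis.

        p = 11                                            # illustrative prime modulus

        def qc_sequence(a):
            # quadratic congruence placement sequence y_a(k) = a*k^2 mod p
            return [(a * k * k) % p for k in range(p)]

        def max_hits(s1, s2):
            # maximum number of coincidences between s1 and any cyclic shift of s2
            return max(sum(s1[k] == s2[(k + t) % p] for k in range(p))
                       for t in range(p))

        family = {a: qc_sequence(a) for a in range(1, p)}
        worst = max(max_hits(family[a], family[b])
                    for a in family for b in family if a != b)
        print("worst-case cross-correlation (hits):", worst)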

  19. From the ethical code to the international convention. A critical panorama of the World Tourism Organization from the cosmopolitanism perspective

    Directory of Open Access Journals (Sweden)

    José L. López-González

    2018-01-01

    Full Text Available This work addresses a critical analysis of the nature of cosmopolitan order encouraged by the World Tourism Organization. Given the growth of tourism activity and the challenges that it poses, this specialized agency of the United Nations has formally promoted equitable, responsible and sustainable development of tourism. However, the criticism of the principles from which the organization shapes tourism has put the focus on the objectives pursued. The conversion of these principles, included in the Global Code of Ethics for Tourism, adopted in 1999, into an international convention, approved in September 2017, presents a new scene that this work outlines from the cosmopolitanism perspective.

  20. Performance Analysis of a Decoding Algorithm for Algebraic Geometry Codes

    DEFF Research Database (Denmark)

    Jensen, Helge Elbrønd; Nielsen, Rasmus Refslund; Høholdt, Tom

    1998-01-01

    We analyse the known decoding algorithms for algebraic geometry codes in the case where the number of errors is greater than or equal to [(dFR-1)/2]+1, where dFR is the Feng-Rao distance.

  1. LPIC++. A parallel one-dimensional relativistic electromagnetic particle-in-cell code for simulating laser-plasma-interaction

    International Nuclear Information System (INIS)

    Lichters, R.; Pfund, R.E.W.; Meyer-ter-Vehn, J.

    1997-08-01

    The code LPIC++ presented here is based on a one-dimensional, electromagnetic, relativistic PIC code that was originally developed by one of the authors during a PhD thesis at the Max-Planck-Institut fuer Quantenoptik for kinetic simulations of high harmonic generation from overdense plasma surfaces. The code uses essentially the algorithm of Birdsall and Langdon and Villasenor and Bunemann. It is written in C++ in order to be easily extendable and has been parallelized to be able to grow in power linearly with the size of accessible hardware, e.g. massively parallel machines like Cray T3E. The parallel LPIC++ version uses PVM for communication between processors. PVM is public-domain software and can be downloaded from the World Wide Web. A particular strength of LPIC++ lies in its clear program and data structure, which uses chained lists for the organization of grid cells and enables dynamic adjustment of spatial domain sizes in a very convenient way, and therefore easy balancing of processor loads. Also, particles belonging to one cell are linked in a chained list and are immediately accessible from this cell. In addition to this convenient type of data organization in a PIC code, the code shows excellent performance in both its single processor and parallel version. (orig.)

  2. Challenge: Code of environmental law; Herausforderung Umweltgesetzbuch

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-07-15

    At the meeting ''Challenge: Code of environmental law'', held on 16 February 2007 in Berlin (Federal Republic of Germany) and organized by the Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (Berlin, Federal Republic of Germany), the following lectures were given: (a) The new code of environmental law as a contribution to more modernity and efficiency in environmental politics (Sigmar Gabriel); (b) The code of environmental law from the view of the economy (Martin Wansleben); (c) Significance of the code of environmental law from the view of jurisprudence (Michael Kloepfer); (d) Targets, content and utility of the code of environmental law: summary of the panel discussion (Tanja Goenner, Klaus Mittelbach, Juergen Resch, Hans-Joachim Koch, Alfred Wirtz, Andreas Troge (moderator)); (e) Considerations on the codification of water law in the code of environmental law (Helge Wendenburg); (f) Considerations on the codification of water law: summary of the discussion; (g) Considerations on the codification of nature conservation law (Jochen Flasbarth); (h) Considerations on the codification of nature conservation law: summary of the discussion.

  3. Differentially Encoded LDPC Codes—Part II: General Case and Code Optimization

    Directory of Open Access Journals (Sweden)

    Jing Li (Tiffany

    2008-04-01

    Full Text Available This two-part series of papers studies the theory and practice of differentially encoded low-density parity-check (DE-LDPC) codes, especially in the context of noncoherent detection. Part I showed that a special class of DE-LDPC codes, product accumulate codes, perform very well with both coherent and noncoherent detection. The analysis here reveals that a conventional LDPC code, however, is not well suited to differential coding and does not, in general, deliver a desirable performance when detected noncoherently. Through extrinsic information transfer (EXIT) analysis and a modified “convergence-constraint” density evolution (DE) method developed here, we provide a characterization of the type of LDPC degree profiles that work in harmony with differential detection (or a recursive inner code in general), and demonstrate how to optimize these LDPC codes. The convergence-constraint method provides a useful extension to the conventional “threshold-constraint” method, and can match an outer LDPC code to any given inner code with the imperfectness of the inner decoder taken into consideration.

  4. Cross-index to DOE-prescribed occupational safety codes and standards

    International Nuclear Information System (INIS)

    1982-01-01

    A compilation of detailed information from more than three hundred and fifty DOE-prescribed or OSHA-referenced industrial safety codes and standards is presented. Condensed data from individual code portions are listed according to reference code, section, paragraph and page. A glossary of the letter initials/abbreviations for the organizations or documents whose codes or standards are contained in this Cross-Index is also provided.

  5. SERPENT Monte Carlo reactor physics code

    International Nuclear Information System (INIS)

    Leppaenen, J.

    2010-01-01

    SERPENT is a three-dimensional continuous-energy Monte Carlo reactor physics burnup calculation code, developed at VTT Technical Research Centre of Finland since 2004. The code is specialized in lattice physics applications, but the universe-based geometry description allows transport simulation to be carried out in complicated three-dimensional geometries as well. The suggested applications of SERPENT include generation of homogenized multi-group constants for deterministic reactor simulator calculations, fuel cycle studies involving detailed assembly-level burnup calculations, validation of deterministic lattice transport codes, research reactor applications, educational purposes and demonstration of reactor physics phenomena. The Serpent code has been publicly distributed by the OECD/NEA Data Bank since May 2009 and RSICC in the U. S. since March 2010. The code is being used in some 35 organizations in 20 countries around the world. This paper presents an overview of the methods and capabilities of the Serpent code, with examples in the modelling of WWER-440 reactor physics. (Author)

  6. Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation

    Science.gov (United States)

    Pinilla, Samuel; Poveda, Juan; Arguello, Henry

    2018-03-01

    Phase retrieval is a problem present in many applications such as optics, astronomical imaging, computational biology and X-ray crystallography. Recent work has shown that the phase can be better recovered when the acquisition architecture includes a coded aperture, which modulates the signal before diffraction, such that the underlying signal is recovered from coded diffraction patterns. Moreover, this type of modulation effect, before the diffraction operation, can be obtained using a phase coded aperture placed just after the sample under study. However, a practical implementation of a phase coded aperture in an X-ray application is not feasible, because it is computationally modeled as a matrix with complex entries, which requires changing the phase of the diffracted beams. In fact, changing the phase implies finding a material that allows deviating the direction of an X-ray beam, which can considerably increase the implementation costs. Hence, this paper describes a low cost coded X-ray diffraction system based on block-unblock coded apertures that enables phase reconstruction. The proposed system approximates the phase coded aperture with a block-unblock coded aperture by using the detour-phase method. Moreover, the SAXS/WAXS X-ray crystallography software was used to simulate the diffraction patterns of a real crystal structure called Rhombic Dodecahedron. Additionally, several simulations were carried out to analyze the performance of block-unblock approximations in recovering the phase, using the simulated diffraction patterns. Furthermore, the quality of the reconstructions was measured in terms of the Peak Signal to Noise Ratio (PSNR). Results show that the performance of the block-unblock phase coded aperture approximation decreases by at most 12.5% compared with the phase coded apertures. Moreover, the quality of the reconstructions using the boolean approximations is up to 2.5 dB lower in PSNR than that of the phase coded aperture reconstructions.
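
    The forward model behind coded diffraction patterns with a block-unblock aperture is compact: the object is multiplied by a binary 0/1 mask and the far-field (Fourier) intensity is recorded, once per mask. The sketch below generates such measurements for a random complex object; a phase-retrieval algorithm would then try to recover the object from the patterns and the known masks. The object, masks and sizes are illustrative, not the Rhombic Dodecahedron simulation of the paper.

        import numpy as np

        rng = np.random.default_rng(7)
        N, n_masks = 64, 4
        obj = rng.random((N, N)) * np.exp(1j * 2 * np.pi * rng.random((N, N)))

        masks = rng.integers(0, 2, size=(n_masks, N, N))        # block (0) / unblock (1)
        patterns = [np.abs(np.fft.fft2(m * obj))**2 for m in masks]   # recorded intensities
        # phase retrieval would now estimate `obj` from `patterns` and the known masks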

  7. Optical code division multiple access secure communications systems with rapid reconfigurable polarization shift key user code

    Science.gov (United States)

    Gao, Kaiqiang; Wu, Chongqing; Sheng, Xinzhi; Shang, Chao; Liu, Lanlan; Wang, Jian

    2015-09-01

    An optical code division multiple access (OCDMA) secure communications system scheme with a rapidly reconfigurable polarization shift key (Pol-SK) bipolar user code is proposed and demonstrated. Compared to fixed-code OCDMA, constantly changing the user code greatly improves the anti-eavesdropping performance. A Pol-SK OCDMA experiment with a 10 Gchip/s user code and a 1.25 Gb/s user payload data rate has been realized, which shows that this scheme has good tolerance and can be readily implemented.

  8. Performance Theories for Sentence Coding: Some Quantitative Models

    Science.gov (United States)

    Aaronson, Doris; And Others

    1977-01-01

    This study deals with the patterns of word-by-word reading times over a sentence when the subject must code the linguistic information sufficiently for immediate verbatim recall. A class of quantitative models is considered that would account for reading times at phrase breaks. (Author/RM)

  9. Accounting for External Turbulence of Logistics Organizations via Performance Measurement Systems

    DEFF Research Database (Denmark)

    Bühler, Andreas; Wallenburg, Carl Marcus; Wieland, Andreas

    2016-01-01

    Purpose: This paper aims to investigate the role of upper management in designing performance measurement systems (PMS) that account for external turbulence of the organization and to show how this PMS design for turbulence impacts organizational resilience and distribution service performance....... Design/methodology/approach: Hypotheses are developed by integrating management accounting and strategic management perspectives into supply chain management and subsequently tested based on data from 431 logistics organizations (i.e. both logistics companies and internal logistics departments...... distribution service performance. Originality/value: This paper is the first to introduce the concept of PMS design for turbulence to the literature and to show that it is relevant for supply chain risk management by fostering the capabilities and the performance of logistics organizations. Further...

  10. Preliminary investigation study of code of developed country for developing Korean fuel cycle code

    International Nuclear Information System (INIS)

    Jeong, Chang Joon; Ko, Won Il; Lee, Ho Hee; Cho, Dong Keun; Park, Chang Je

    2012-01-01

    In order to develop a Korean fuel cycle code, analyses have been performed with the fuel cycle codes that are used in advanced countries. Also, recommendations were proposed for future development. The fuel cycle codes are as follows: VISTA, which has been developed by the IAEA; DANESS, developed by ANL and LISTO; and VISION, developed by INL for the Advanced Fuel Cycle Initiative (AFCI) system analysis. Recommendations were proposed for the software, program scheme, material flow model, isotope decay model, environmental impact analysis model, and economics analysis model. The items described will be used for the development of the Korean nuclear fuel cycle code in the future

  11. Spread-spectrum communication using binary spatiotemporal chaotic codes

    International Nuclear Information System (INIS)

    Wang Xingang; Zhan Meng; Gong Xiaofeng; Lai, C.H.; Lai, Y.-C.

    2005-01-01

    We propose a scheme to generate binary code for baseband spread-spectrum communication by using a chain of coupled chaotic maps. We compare the performances of this type of spatiotemporal chaotic code with those of a conventional code used frequently in digital communication, the Gold code, and demonstrate that our code is comparable or even superior to the Gold code in several key aspects: security, bit error rate, code generation speed, and the number of possible code sequences. As the field of communicating with chaos faces doubts in terms of performance comparison with conventional digital communication schemes, our work gives a clear message that communicating with chaos can be advantageous and it deserves further attention from the nonlinear science community
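
    A minimal sketch of the code-generation idea, assuming a ring of logistic maps with diffusive coupling and a simple threshold to obtain +/-1 chips, is given below; the coupling strength, map parameter and code length are arbitrary, and the resulting sequences are only meant to illustrate how binary spreading codes with low cross-correlation can be drawn from a spatiotemporal chaotic system.

        import numpy as np

        L_sites, N_chips, eps, r = 16, 1024, 0.1, 4.0
        rng = np.random.default_rng(11)
        x = rng.random(L_sites)

        def f(u):
            return r * u * (1.0 - u)          # fully chaotic logistic map

        chips = np.empty((N_chips, L_sites))
        for n in range(N_chips):
            fx = f(x)
            x = (1 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
            chips[n] = np.where(x > 0.5, 1.0, -1.0)   # threshold to +/-1 chips

        codes = chips.T                        # one spreading code per lattice site
        xcorr = abs(codes[0] @ codes[1]) / N_chips
        print("normalized cross-correlation between two codes:", round(xcorr, 3))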

  12. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

    Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low-weight codewords, which gives a further advantage in the error floor region.

  13. The data requirements for the verification and validation of a fuel performance code - the transuranus perspective

    International Nuclear Information System (INIS)

    Schubert, A.; Di Marcello, V.; Rondinella, V.; Van De Laar, J.; Van Uffelen, P.

    2013-01-01

    In general, the verification and validation (V and V) of a fuel performance code like TRANSURANUS consists of three basic steps: a) verifying the correctness and numerical stability of the sub-models; b) comparing the sub-models with experimental data; c) comparing the results of the integral fuel performance code with experimental data. Only the second and third steps of the V and V rely on experimental information. This scheme can be further detailed according to the physical origin of the data: on the one hand, in-reactor ('in-pile') experimental data are generated in the course of the irradiation; on the other hand, ex-reactor ('out-of-pile') experimental data are obtained, for instance, from various post-irradiation examinations (PIE) or dedicated experiments with fresh samples. For both categories, we will first discuss the V and V of sub-models of TRANSURANUS related to separate aspects of the fuel behaviour: this includes the radial variation of the composition and fissile isotopes, the thermal properties of the fuel (e.g. thermal conductivity, melting temperature, etc.), the mechanical properties of fuel and cladding (e.g. elastic constants, creep properties), as well as the models for the fission product behaviour. Secondly, the integral code verification will be addressed, as it treats various aspects of the fuel behaviour, including the geometrical changes in the fuel and the gas pressure and composition of the free volume in the rod. (authors)
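
    No TRANSURANUS models or data are reproduced here, but step c) above amounts to comparing integral code predictions against post-irradiation measurements with some error metric; a minimal, generic sketch with made-up numbers could look like this.

```python
import numpy as np

# Hypothetical predicted vs. measured fission gas release fractions (%) for a
# handful of rods; the values are illustrative only, not TRANSURANUS results.
predicted = np.array([2.1, 5.4, 11.0, 18.5])
measured  = np.array([2.4, 4.9, 12.3, 17.0])

relative_error = (predicted - measured) / measured
rms_relative = np.sqrt(np.mean(relative_error**2))
within_factor_2 = np.all((predicted / measured > 0.5) & (predicted / measured < 2.0))

print(f"RMS relative error: {rms_relative:.2%}")
print(f"All predictions within a factor of 2 of measurement: {within_factor_2}")
```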

  14. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase the number of dimensions seen by the network using a linear mapping. Receivers can trade off computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof.
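
    The central idea — outer encoding in a high field size while intermediate nodes only recode with cheap GF(2) operations — can be illustrated with a minimal sketch. The field size, packet lengths and random coefficients below are assumptions for illustration, no expansion-dimension bookkeeping or decoding is shown, and this is not the Fulcrum implementation itself.

```python
import numpy as np

def gf256_mul(a, b):
    """Multiply two bytes in GF(2^8) with the polynomial 0x11B (peasant method)."""
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B
        b >>= 1
    return p

def outer_encode(source_packets, n_coded, rng):
    """High-field (GF(2^8)) random linear combinations of the source packets."""
    coded = []
    for _ in range(n_coded):
        coeffs = rng.integers(0, 256, len(source_packets))
        pkt = np.zeros_like(source_packets[0])
        for c, sp in zip(coeffs, source_packets):
            pkt ^= np.array([gf256_mul(int(c), int(byte)) for byte in sp], dtype=np.uint8)
        coded.append(pkt)
    return coded

def inner_recode(coded_packets, rng):
    """GF(2) recoding at an intermediate node: XOR of a random subset (cheap)."""
    mask = rng.integers(0, 2, len(coded_packets)).astype(bool)
    out = np.zeros_like(coded_packets[0])
    for keep, pkt in zip(mask, coded_packets):
        if keep:
            out ^= pkt
    return out

rng = np.random.default_rng(42)
source = [rng.integers(0, 256, 16, dtype=np.uint8) for _ in range(4)]
expanded = outer_encode(source, n_coded=4 + 2, rng=rng)   # 2 extra "expansion" packets
relayed = inner_recode(expanded, rng)                     # XOR-only at the relay
print(len(expanded), relayed.shape)                       # 6 (16,)
```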

  15. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, Current Dental Terminology (CDT) codes are most commonly used by dentists to submit claims, whereas Current Procedural Terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD-9-CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD-9-CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.
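
    As a purely illustrative data structure (not part of the article, and using placeholder rather than real codes), one claim line can pair the CPT procedure code describing what was done with the ICD-9-CM diagnosis code giving the reason it was done:

```python
from dataclasses import dataclass

@dataclass
class ClaimLine:
    """One claim line: what was done (CPT) and why it was medically necessary (ICD-9-CM)."""
    cpt_code: str        # procedure/service performed (placeholder, not a real code)
    icd9_code: str       # diagnosis establishing medical necessity (placeholder)
    description: str

claim = [
    ClaimLine(cpt_code="XXXXX", icd9_code="YYY.Y",
              description="Biopsy of oral lesion (placeholder codes for illustration)"),
]
for line in claim:
    print(f"{line.cpt_code} justified by {line.icd9_code}: {line.description}")
```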

  16. Iterative linear solvers in a 2D radiation-hydrodynamics code: Methods and performance

    International Nuclear Information System (INIS)

    Baldwin, C.; Brown, P.N.; Falgout, R.; Graziani, F.; Jones, J.

    1999-01-01

    Computer codes containing both hydrodynamics and radiation play a central role in simulating both astrophysical and inertial confinement fusion (ICF) phenomena. A crucial aspect of these codes is that they require an implicit solution of the radiation diffusion equations. The authors present in this paper the results of a comparison of five different linear solvers on a range of complex radiation and radiation-hydrodynamics problems. The linear solvers used are diagonally scaled conjugate gradient, GMRES with incomplete LU preconditioning, conjugate gradient with incomplete Cholesky preconditioning, multigrid, and multigrid-preconditioned conjugate gradient. These problems involve shock propagation, opacities varying over 5-6 orders of magnitude, tabular equations of state, and dynamic ALE (Arbitrary Lagrangian Eulerian) meshes. They perform a problem size scalability study by comparing linear solver performance over a wide range of problem sizes, from 1,000 to 100,000 zones. The fundamental question they address in this paper is: is it more efficient to invert the matrix in many inexpensive steps (like diagonally scaled conjugate gradient) or in fewer expensive steps (like multigrid)? In addition, what is the answer to this question as a function of problem size, and is the answer problem dependent? They find that the diagonally scaled conjugate gradient method performs poorly with the growth of problem size, increasing in both iteration count and overall CPU time with the size of the problem and also increasing for larger time steps. For all problems considered, the multigrid algorithms scale almost perfectly (i.e., the iteration count is approximately independent of problem size and problem time step). For pure radiation flow problems (i.e., no hydrodynamics), they see speedups in CPU time of factors of ~15-30 for the largest problems when comparing the multigrid solvers relative to diagonally scaled conjugate gradient.
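
    The trade-off posed above starts from the cheapest preconditioner considered, diagonal (Jacobi) scaling. A self-contained sketch of diagonally scaled conjugate gradient on a small symmetric positive definite test matrix is given below; the test matrix is an assumption for illustration and is not one of the paper's radiation-hydrodynamics problems.

```python
import numpy as np

def diag_scaled_cg(A, b, tol=1e-8, max_iter=1000):
    """Conjugate gradient preconditioned by the inverse of diag(A) (Jacobi scaling)."""
    d_inv = 1.0 / np.diag(A)
    x = np.zeros_like(b)
    r = b - A @ x
    z = d_inv * r                      # preconditioned residual
    p = z.copy()
    rz = r @ z
    for it in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, it + 1
        z = d_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# 1D diffusion-like test matrix (tridiagonal, symmetric positive definite).
n = 200
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x, iters = diag_scaled_cg(A, b)
print(iters, np.linalg.norm(A @ x - b))
```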

  17. The HELIOS-2 lattice physics code

    International Nuclear Information System (INIS)

    Wemple, C.A.; Gheorghiu, H-N.M.; Stamm'ler, R.J.J.; Villarino, E.A.

    2008-01-01

    Major advances have been made in the HELIOS code, resulting in the impending release of a new version, HELIOS-2. The new code includes a method of characteristics (MOC) transport solver to supplement the existing collision probabilities (CP) solver. A 177-group, ENDF/B-VII nuclear data library has been developed for inclusion with the new code package. Computational tests have been performed to verify the performance of the MOC solver against the CP solver, and validation testing against computational and measured benchmarks is underway. Results to date of the verification and validation testing are presented, demonstrating the excellent performance of the new transport solver and nuclear data library. (Author)

  18. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  19. Individual Training, Performance Improvement, and the Future for Organizations

    Science.gov (United States)

    Kaufman, Roger

    2015-01-01

    Human competence is a vital element for any organization that expects to survive and then thrive. Developing individual performance ability is necessary but not sufficient because trained people alone will not make an organization successful. We must determine what people should deliver and why it should be delivered in order to add measurable…

  20. Final Technical Report: Hydrogen Codes and Standards Outreach

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Karen I.

    2007-05-12

    This project contributed significantly to the development of new codes and standards, both domestically and internationally. The NHA collaborated with codes and standards development organizations to identify technical areas of expertise that would be required to produce the codes and standards that industry and DOE felt were required to facilitate commercialization of hydrogen and fuel cell technologies and infrastructure. NHA staff participated directly in technical committees and working groups where issues could be discussed with the appropriate industry groups. In other cases, the NHA recommended specific industry experts to serve on technical committees and working groups where the need for this specific industry expertise would be on-going, and where this approach was likely to contribute to timely completion of the effort. The project also facilitated dialog between codes and standards development organizations, hydrogen and fuel cell experts, the government and national labs, researchers, code officials, industry associations, as well as the public regarding the timeframes for needed codes and standards, industry consensus on technical issues, procedures for implementing changes, and general principles of hydrogen safety. The project facilitated hands-on learning, as participants in several NHA workshops and technical meetings were able to experience hydrogen vehicles, witness hydrogen refueling demonstrations, see metal hydride storage cartridges in operation, and view other hydrogen energy products.