WorldWideScience

Sample records for significantly improve coding

  1. Deep Learning Methods for Improved Decoding of Linear Codes

    Science.gov (United States)

    Nachmani, Eliya; Marciano, Elad; Lugosch, Loren; Gross, Warren J.; Burshtein, David; Be'ery, Yair

    2018-02-01

    The problem of low-complexity, close-to-optimal channel decoding of linear codes with short to moderate block length is considered. It is shown that deep learning methods can be used to improve a standard belief propagation decoder, despite the large example space. Similar improvements are obtained for the min-sum algorithm. It is also shown that tying the parameters of the decoders across iterations, so as to form a recurrent neural network architecture, can be implemented with comparable results. The advantage is that significantly fewer parameters are required. We also introduce a recurrent neural decoder architecture based on the method of successive relaxation. Improvements over standard belief propagation are also observed on sparser Tanner graph representations of the codes. Furthermore, we demonstrate that the neural belief propagation decoder can be used to improve the performance, or alternatively reduce the computational complexity, of a close-to-optimal decoder of short BCH codes.
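
    The following sketch (not the authors' implementation) illustrates the weighted min-sum idea the abstract refers to: standard min-sum message passing in which each check-to-variable message is scaled by a per-edge weight that a neural decoder would learn. The (7,4) Hamming code and the fixed weight value are stand-ins.

```python
# Minimal sketch of a weighted ("neural") min-sum decoder. The per-edge weights
# would be trained in the neural decoder; here they are fixed constants, and a
# (7,4) Hamming code stands in for the codes studied in the paper.
import numpy as np

H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])              # parity-check matrix
edges = [tuple(e) for e in np.argwhere(H == 1)]    # (check, variable) pairs

def weighted_min_sum(llr, weights, iterations=5):
    """Decode channel LLRs; check-to-variable messages are scaled by weights."""
    m, n = H.shape
    v2c = {e: llr[e[1]] for e in edges}            # variable-to-check messages
    c2v = {e: 0.0 for e in edges}
    for _ in range(iterations):
        for c, v in edges:                         # check-node update (min-sum)
            others = [v2c[(c, u)] for u in range(n) if H[c, u] and u != v]
            sign = np.prod(np.sign(others))
            c2v[(c, v)] = weights[(c, v)] * sign * min(abs(x) for x in others)
        for c, v in edges:                         # variable-node update
            v2c[(c, v)] = llr[v] + sum(c2v[(d, v)] for d in range(m)
                                       if H[d, v] and d != c)
    total = llr + np.array([sum(c2v[(d, v)] for d in range(m) if H[d, v])
                            for v in range(n)])
    return (total < 0).astype(int)                 # hard decision on output LLRs

# Example: all-zero codeword over BPSK/AWGN with uniform damping weights.
rng = np.random.default_rng(0)
noise_var = 0.5
llr = (2.0 / noise_var) * (1.0 + np.sqrt(noise_var) * rng.standard_normal(7))
weights = {e: 0.8 for e in edges}
print(weighted_min_sum(llr, weights))              # usually the all-zero word
```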

  2. Accurate coding in sepsis: clinical significance and financial implications.

    Science.gov (United States)

    Chin, Y T; Scattergood, N; Thornber, M; Thomas, S

    2016-09-01

    Sepsis is a major healthcare problem and a leading cause of death worldwide. UK hospital mortality statistics and payments for patient episodes of care are calculated from clinical coding data. The accuracy of these data depends on the quality of coding. This study aimed to investigate whether patients with significant bacteraemia are coded for sepsis and to estimate the financial costs of miscoding. Of 54 patients over a one-month period with a significant bacteraemia, only 19% had been coded for sepsis. This is likely to lead to falsely high calculated hospital mortality. Furthermore, this resulted in an underpayment of £21,000 for one month alone. Copyright © 2016 The Healthcare Infection Society. All rights reserved.

  3. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with the regulatory auditing code have been accomplished to establish a self-reliant, technology-based regulatory auditing system. A unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through the exercise of plant applications. Education and training seminars and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can also be utilized as a base technology for GEN IV reactor applications.

  4. Improvement of MARS code reflood model

    International Nuclear Information System (INIS)

    Hwang, Moonkyu; Chung, Bub-Dong

    2011-01-01

    A specifically designed heat transfer model for the reflood process, which normally occurs at low flow and low pressure, was originally incorporated in the MARS code. The model is essentially identical to that of the RELAP5/MOD3.3 code. The model, however, is known to have underestimated the peak cladding temperature (PCT) with an earlier turn-over. In this study, the original MARS code reflood model is improved. Based on extensive sensitivity studies for both the hydraulic and wall heat transfer models, it is found that the dispersed flow film boiling (DFFB) wall heat transfer is the most influential process determining the PCT, whereas the interfacial drag model most affects the quenching time through the liquid carryover phenomenon. The model proposed by Bajorek and Young is incorporated for the DFFB wall heat transfer. Both space grid and droplet enhancement models are incorporated. Inverted annular film boiling (IAFB) is modeled by using the original PSI model of the code. The flow transition between DFFB and IAFB is modeled using the TRACE code interpolation. A gas velocity threshold is also added to limit the top-down quenching effect. Assessment calculations are performed for the original and modified MARS codes for the Flecht-Seaset test and the RBHT test. Improvements are observed in terms of the PCT and quenching time predictions in the Flecht-Seaset assessment. In the case of the RBHT assessment, the improvement over the original MARS code is found to be marginal. A space grid effect, however, is clearly seen from the modified version of the MARS code. (author)

  5. Recent Improvements in the SHIELD-HIT Code

    DEFF Research Database (Denmark)

    Hansen, David Christoffer; Lühr, Armin Christian; Herrmann, Rochus

    2012-01-01

    Purpose: The SHIELD-HIT Monte Carlo particle transport code has previously been used to study a wide range of problems for heavy-ion treatment and has been benchmarked extensively against other Monte Carlo codes and experimental data. Here, an improved version of SHIELD-HIT is developed ...

  6. Improvements to SOIL: An Eulerian hydrodynamics code

    International Nuclear Information System (INIS)

    Davis, C.G.

    1988-04-01

    Possible improvements to SOIL, an Eulerian hydrodynamics code that can do coupled radiation diffusion and strength of materials, are presented in this report. Our research is based on the inspection of other Eulerian codes and theoretical reports on hydrodynamics. Several conclusions from the present study suggest that some improvements are in order, such as second-order advection, adaptive meshes, and speedup of the code by vectorization and/or multitasking. 29 refs., 2 figs

  7. Improvement of Parallel Algorithm for MATRA Code

    International Nuclear Information System (INIS)

    Kim, Seong-Jin; Seo, Kyong-Won; Kwon, Hyouk; Hwang, Dae-Hyun

    2014-01-01

    A feasibility study to parallelize the MATRA code was conducted at KAERI early this year. As a result, a parallel algorithm for the MATRA code has been developed to decrease the considerable computing time required to solve a big-size problem, such as a whole-core pin-by-pin problem of a general PWR, and to improve the overall performance of multi-physics coupling calculations. It was shown that the performance of the MATRA code was greatly improved by implementing the parallel algorithm using MPI communication. For 1/8-core and whole-core problems of the SMART reactor, a speedup of about 10 was evaluated when 25 processors were used. However, it was also shown that the performance deteriorated as the axial node number increased. In this paper, the procedure of communication between processors is optimized to improve the previous parallel algorithm. To address the performance deterioration of the parallelized MATRA code, a new communication algorithm between processors is presented. It is shown that the speedup is improved and remains stable regardless of the axial node number.
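
    As a rough illustration only, the sketch below shows the generic pattern of distributing subchannels over MPI ranks and gathering the partial solutions; it is not the MATRA communication scheme, and solve_channel() is a hypothetical placeholder.

```python
# Generic sketch of distributing subchannels over MPI ranks and gathering the
# partial solutions; this is not the MATRA scheme, and solve_channel() is a
# hypothetical placeholder for the per-subchannel axial solution.
from mpi4py import MPI
import numpy as np

def solve_channel(channel_id, n_axial):
    """Stand-in for the axial solution of one subchannel."""
    return np.full(n_axial, float(channel_id))

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_channels, n_axial = 1000, 40
my_channels = range(rank, n_channels, size)        # round-robin decomposition
my_results = {c: solve_channel(c, n_axial) for c in my_channels}

# Rank 0 collects every rank's partial solution; the cost of exchanges like
# this is what grows as the axial node number (message size) increases.
all_results = comm.gather(my_results, root=0)
if rank == 0:
    merged = {c: v for part in all_results for c, v in part.items()}
    print(len(merged), "channels solved on", size, "ranks")
```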

  8. Latest improvements on TRACPWR six-equations thermohydraulic code

    International Nuclear Information System (INIS)

    Rivero, N.; Batuecas, T.; Martinez, R.; Munoz, J.; Lenhardt, G.; Serrano, P.

    1999-01-01

    The paper presents the latest improvements to TRACPWR, aimed at adapting the code to present trends in computer platforms, architectures and training requirements, as well as extending the scope of the code itself and its applicability to technologies other than the Westinghouse PWR. Firstly, the major features of TRACPWR as a best-estimate and real-time simulation code are summarized; then the areas where TRACPWR is being improved are presented. These areas comprise: (1) Architecture: integrating the TRACPWR and RELAP5 codes, (2) Code scope enhancement: modelling Mid-Loop operation, (3) Code speed-up: applying parallelization techniques, (4) Code platform downswing: porting to the Windows NT platform, (5) On-line performance: allowing simulation initialisation from a Plant Process Computer, and (6) Code scope extension: using the code for modelling VVER and PHWR technology. (author)

  9. Improving coding accuracy in an academic practice.

    Science.gov (United States)

    Nguyen, Dana; O'Mara, Heather; Powell, Robert

    2017-01-01

    Practice management has become an increasingly important component of graduate medical education. This applies to every practice environment: private, academic, and military. One of the most critical aspects of practice management is documentation and coding for physician services, as they directly affect the financial success of any practice. Our quality improvement project aimed to implement a new and innovative method for teaching billing and coding in a longitudinal fashion in a family medicine residency. We hypothesized that implementation of a new teaching strategy would increase coding accuracy rates among residents and faculty. Design: single group, pretest-posttest. Setting: military family medicine residency clinic. Study population: 7 faculty physicians and 18 resident physicians participated as learners in the project. Educational intervention: monthly structured coding learning sessions in the academic curriculum that involved learner-presented cases, small-group case review, and large-group discussion. Outcome measures: overall coding accuracy (compliance) percentage and coding accuracy per year group for the subjects who were able to participate longitudinally. Statistical tests used: average coding accuracy for the population; paired t test to assess improvement between the two intervention periods, both aggregate and by year group. Overall coding accuracy rates remained stable over the course of time regardless of the modality of the educational intervention. A paired t test was conducted to compare coding accuracy rates at baseline (mean (M)=26.4%, SD=10%) to accuracy rates after all educational interventions were complete (M=26.8%, SD=12%); t(24)=-0.127, P=.90. Didactic teaching and small-group discussion sessions did not improve overall coding accuracy in a residency practice. Future interventions could focus on educating providers at the individual level.
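
    For readers unfamiliar with the statistic reported above, this is a minimal paired t test on made-up pre/post accuracy rates, not the study's data.

```python
# Minimal paired t test of the kind reported above, on made-up per-provider
# coding-accuracy fractions before and after the educational intervention.
from scipy import stats

before = [0.22, 0.31, 0.18, 0.27, 0.35, 0.24, 0.29]
after = [0.25, 0.28, 0.20, 0.26, 0.37, 0.23, 0.30]

result = stats.ttest_rel(after, before)    # paired (dependent-samples) t test
print(f"t = {result.statistic:.3f}, p = {result.pvalue:.3f}")
```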

  10. Improved Algorithms Speed It Up for Codes

    International Nuclear Information System (INIS)

    Hazi, A

    2005-01-01

    Huge computers, huge codes, complex problems to solve. The longer it takes to run a code, the more it costs. One way to speed things up and save time and money is through hardware improvements--faster processors, different system designs, bigger computers. But another side of supercomputing can reap savings in time and speed: software improvements to make codes--particularly the mathematical algorithms that form them--run faster and more efficiently. Speed up math? Is that really possible? According to Livermore physicist Eugene Brooks, the answer is a resounding yes. ''Sure, you get great speed-ups by improving hardware,'' says Brooks, the deputy leader for Computational Physics in N Division, which is part of Livermore's Physics and Advanced Technologies (PAT) Directorate. ''But the real bonus comes on the software side, where improvements in software can lead to orders of magnitude improvement in run times.'' Brooks knows whereof he speaks. Working with Laboratory physicist Abraham Szoeke and others, he has been instrumental in devising ways to shrink the running time of what has, historically, been a tough computational nut to crack: radiation transport codes based on the statistical or Monte Carlo method of calculation. And Brooks is not the only one. Others around the Laboratory, including physicists Andrew Williamson, Randolph Hood, and Jeff Grossman, have come up with innovative ways to speed up Monte Carlo calculations using pure mathematics

  11. Improvement of QR Code Recognition Based on Pillbox Filter Analysis

    Directory of Open Access Journals (Sweden)

    Jia-Shing Sheu

    2013-04-01

    Full Text Available The objective of this paper is to improve the recognition of captured QR code images affected by blur, through pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as low image quality. Focus is an important factor that affects the quality of the image. This study discusses out-of-focus QR code images and aims to improve the recognition of the contents of the QR code image. Many studies have used the pillbox filter (circular averaging filter) method to simulate an out-of-focus image. This method is also used in this investigation to improve the recognition of a captured QR code image. A blurred QR code image is separated into nine blur levels. In the experiment, four different quantitative approaches are used to reconstruct and decode an out-of-focus QR code image. The nine QR code images reconstructed with these methods are then compared. The final experimental results indicate improvements in identification.
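
    A minimal sketch of the pillbox (circular averaging) blur used to simulate defocus is given below; the paper's reconstruction and decoding steps are not reproduced, and a random binary pattern stands in for a captured QR code image.

```python
# Sketch of the pillbox (circular averaging) filter used to simulate an
# out-of-focus capture; the paper's reconstruction/decoding steps are omitted
# and a random binary pattern stands in for a captured QR code image.
import numpy as np
from scipy.signal import convolve2d

def pillbox_kernel(radius):
    """Uniform disk of the given pixel radius, normalized to sum to 1."""
    r = int(np.ceil(radius))
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    disk = (x**2 + y**2 <= radius**2).astype(float)
    return disk / disk.sum()

def simulate_defocus(image, radius):
    return convolve2d(image, pillbox_kernel(radius), mode="same", boundary="symm")

rng = np.random.default_rng(1)
qr_like = rng.integers(0, 2, size=(64, 64)).astype(float)
blur_levels = [simulate_defocus(qr_like, r) for r in range(1, 10)]  # nine levels
print([round(float(b.std()), 3) for b in blur_levels])  # contrast drops with blur
```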

  12. Improved Iterative Decoding of Network-Channel Codes for Multiple-Access Relay Channel.

    Science.gov (United States)

    Majumder, Saikat; Verma, Shrish

    2015-01-01

    Cooperative communication using relay nodes is one of the most effective means of exploiting space diversity for low-cost nodes in wireless networks. In cooperative communication, users, besides communicating their own information, also relay the information of other users. In this paper we investigate a scheme where cooperation is achieved using a common relay node which performs network coding to provide space diversity for two information nodes transmitting to a base station. We propose a scheme which uses a Reed-Solomon error-correcting code for encoding the information bits at the user nodes and a convolutional code as the network code, instead of XOR-based network coding. Based on this encoder, we propose iterative soft decoding of the joint network-channel code by treating it as a concatenated Reed-Solomon/convolutional code. Simulation results show significant improvement in performance compared to an existing scheme based on compound codes.

  13. Improved decoding for a concatenated coding system

    DEFF Research Database (Denmark)

    Paaske, Erik

    1990-01-01

    The concatenated coding system recommended by the CCSDS (Consultative Committee for Space Data Systems) uses an outer (255,223) Reed-Solomon (RS) code based on 8-bit symbols, followed by a block interleaver and an inner rate-1/2 convolutional code with memory 6. Viterbi decoding is assumed. Two new decoding procedures based on repeated decoding trials and exchange of information between the two decoders and the deinterleaver are proposed. In the first one, where the improvement is 0.3-0.4 dB, only the RS decoder performs repeated trials. In the second one, where the improvement is 0.5-0.6 dB, both decoders perform repeated decoding trials and decoding information is exchanged between them...

  14. Improvement of blow down model for LEAP code

    International Nuclear Information System (INIS)

    Itooka, Satoshi; Fujimata, Kazuhiro

    2003-03-01

    At the Japan Nuclear Cycle Development Institute, the analysis method for overheating tube rupture was studied for the accident of sodium-water reactions in the steam generator of a fast breeder reactor, and the evaluation of heat transfer conditions in the tube was carried out based on a study of critical heat flux (CHF) and post-CHF heat transfer equations in light water reactors. In this study, the improvement of the blow down model for the LEAP code was carried out taking into consideration the above-mentioned evaluation of heat transfer conditions. The improvements to the LEAP code were the following items: the addition of critical heat flux (CHF) correlations by the formula of Katto and the formula of Tong; the addition of post-CHF heat transfer equations by the formula of Condie-Bengston IV and the formula of Groeneveld 5.9; the extension of the physical properties of water and steam to the critical conditions of water; the expansion of the total number of sections and the improvement of the input form; and the addition of a function to control the valve setting by a PID control model. Calculations and verification were performed with the improved LEAP code in order to confirm the code functions. (author)

  15. Recent developments in KTF. Code optimization and improved numerics

    International Nuclear Information System (INIS)

    Jimenez, Javier; Avramova, Maria; Sanchez, Victor Hugo; Ivanov, Kostadin

    2012-01-01

    The rapid increase of computer power in the last decade has facilitated the development of high-fidelity simulations in nuclear engineering, allowing a more realistic and accurate optimization as well as safety assessment of reactor cores and power plants compared to the legacy codes. Thermal-hydraulic subchannel codes together with time-dependent neutron transport codes are the options of choice for an accurate prediction of local safety parameters. Moreover, fast-running codes with the best physical models are needed for high-fidelity coupled thermal-hydraulic / neutron-kinetic solutions. Hence at KIT, different subchannel codes such as SUBCHANFLOW and KTF are being improved, validated and coupled with different neutron kinetics solutions. KTF is a subchannel code developed for best-estimate analysis of both Pressurized Water Reactors (PWR) and BWRs. It is based on the Pennsylvania State University (PSU) version of COBRA-TF (Coolant Boiling in Rod Arrays - Two Fluids), named CTF. In this paper, the investigations devoted to the enhancement of the code numerics and informatics structure are presented and discussed. The gain in code speed-up is demonstrated by some examples, and finally an outlook on further activities concentrating on code improvements is given. (orig.)

  16. Recent developments in KTF. Code optimization and improved numerics

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Javier; Avramova, Maria; Sanchez, Victor Hugo; Ivanov, Kostadin [Karlsruhe Institute of Technology (KIT) (Germany). Inst. for Neutron Physics and Reactor Technology (INR)

    2012-11-01

    The rapid increase of computer power in the last decade has facilitated the development of high-fidelity simulations in nuclear engineering, allowing a more realistic and accurate optimization as well as safety assessment of reactor cores and power plants compared to the legacy codes. Thermal-hydraulic subchannel codes together with time-dependent neutron transport codes are the options of choice for an accurate prediction of local safety parameters. Moreover, fast-running codes with the best physical models are needed for high-fidelity coupled thermal-hydraulic / neutron-kinetic solutions. Hence at KIT, different subchannel codes such as SUBCHANFLOW and KTF are being improved, validated and coupled with different neutron kinetics solutions. KTF is a subchannel code developed for best-estimate analysis of both Pressurized Water Reactors (PWR) and BWRs. It is based on the Pennsylvania State University (PSU) version of COBRA-TF (Coolant Boiling in Rod Arrays - Two Fluids), named CTF. In this paper, the investigations devoted to the enhancement of the code numerics and informatics structure are presented and discussed. The gain in code speed-up is demonstrated by some examples, and finally an outlook on further activities concentrating on code improvements is given. (orig.)

  17. Improved mesh generator for the POISSON Group Codes

    International Nuclear Information System (INIS)

    Gupta, R.C.

    1987-01-01

    This paper describes the improved mesh generator of the POISSON Group Codes. These improvements enable one to have full control over the way the mesh is generated and in particular the way the mesh density is distributed throughout this model. A higher mesh density in certain regions coupled with a successively lower mesh density in others keeps the accuracy of the field computation high and the requirements on the computer time and computer memory low. The mesh is generated with the help of codes AUTOMESH and LATTICE; both have gone through a major upgrade. Modifications have also been made in the POISSON part of these codes. We shall present an example of a superconducting dipole magnet to explain how to use this code. The results of field computations are found to be reliable within a few parts in a hundred thousand even in such complex geometries

  18. Improvements, verifications and validations of the BOW code

    International Nuclear Information System (INIS)

    Yu, S.D.; Tayal, M.; Singh, P.N.

    1995-01-01

    The BOW code calculates the lateral deflections of a fuel element, consisting of sheath and pellets, due to temperature gradients, hydraulic drag and gravity. The fuel element is subjected to restraint from endplates, neighboring fuel elements and the pressure tube. Many new features have been added to the BOW code since its original release in 1985. This paper outlines the major improvements made to the code and verification/validation results. (author)

  19. Improved side information generation for distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2008-01-01

    As a new coding paradigm, distributed video coding (DVC) deals with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. The performance of DVC highly depends on the quality of side information. With a better side information generation method, fewer bits will be requested from the encoder and more reliable decoded frames will be obtained. In this paper, a side information generation method is introduced to further improve the rate-distortion (RD) performance of transform domain distributed video coding. This algorithm...

  20. Decoding of interleaved Reed-Solomon codes using improved power decoding

    DEFF Research Database (Denmark)

    Puchinger, Sven; Rosenkilde ne Nielsen, Johan

    2017-01-01

    We propose a new partial decoding algorithm for m-interleaved Reed-Solomon (IRS) codes that can decode, with high probability, a random error of relative weight 1 − R^(m/(m+1)) at all code rates R, in time polynomial in the code length n. For m > 2, this is an asymptotic improvement over the previous state-of-the-art for all rates, and the first improvement for R > 1/3 in the last 20 years. The method combines collaborative decoding of IRS codes with power decoding up to the Johnson radius.
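
    A quick numeric check of the quoted radius against the classical unique-decoding radius (1-R)/2 and the Johnson radius 1-sqrt(R):

```python
# Numeric illustration of the relative decoding radius 1 - R^(m/(m+1)) quoted
# above, compared with the unique-decoding radius (1-R)/2 and the Johnson
# radius 1 - sqrt(R), for a few rates R and interleaving degrees m.
for R in (0.2, 0.5, 0.8):
    for m in (2, 3, 5):
        irs_power = 1 - R ** (m / (m + 1))
        print(f"R={R:.1f} m={m}: IRS power {irs_power:.3f}, "
              f"unique {(1 - R) / 2:.3f}, Johnson {1 - R ** 0.5:.3f}")
```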

  1. Improved Design of Unequal Error Protection LDPC Codes

    Directory of Open Access Journals (Sweden)

    Sandberg Sara

    2010-01-01

    Full Text Available We propose an improved method for designing unequal error protection (UEP) low-density parity-check (LDPC) codes. The method is based on density evolution. The degree distribution with the best UEP properties is found, under the constraint that the threshold should not exceed the threshold of a non-UEP code plus some threshold offset. For different codeword lengths and different construction algorithms, we search for good threshold offsets for the UEP code design. The choice of the threshold offset is based on the average a posteriori variable node mutual information. Simulations reveal the counterintuitive result that the short-to-medium length codes designed with a suitable threshold offset all outperform the corresponding non-UEP codes in terms of average bit-error rate. The proposed codes are also compared to other UEP-LDPC codes found in the literature.
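
    The paper's design relies on full density evolution with a threshold offset; as a much simpler illustration of what a decoding threshold is, the sketch below runs the binary-erasure-channel recursion for a given edge-perspective degree distribution.

```python
# Simplified stand-in for the density evolution used in the paper: the binary
# erasure channel recursion x <- eps * lambda(1 - rho(1 - x)) and a bisection
# search for the decoding threshold of an edge-perspective degree distribution.
def bec_threshold(lam, rho, iters=5000, tol=1e-10):
    """lam, rho: dicts mapping degree -> edge fraction."""
    def converges(eps):
        x = eps
        for _ in range(iters):
            rho_val = sum(r * (1 - x) ** (d - 1) for d, r in rho.items())
            x = eps * sum(l * (1 - rho_val) ** (d - 1) for d, l in lam.items())
        return x < tol
    lo, hi = 0.0, 1.0
    while hi - lo > 1e-6:                      # bisect on the channel erasure rate
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if converges(mid) else (lo, mid)
    return lo

print(bec_threshold({3: 1.0}, {6: 1.0}))       # (3,6)-regular LDPC, approx. 0.429
```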

  2. Use of diagnosis codes for detection of clinically significant opioid poisoning in the emergency department: A retrospective analysis of a surveillance case definition.

    Science.gov (United States)

    Reardon, Joseph M; Harmon, Katherine J; Schult, Genevieve C; Staton, Catherine A; Waller, Anna E

    2016-02-08

    Although fatal opioid poisonings tripled from 1999 to 2008, data describing nonfatal poisonings are rare. Public health authorities are in need of tools to track opioid poisonings in near real time. We determined the utility of ICD-9-CM diagnosis codes for identifying clinically significant opioid poisonings in a state-wide emergency department (ED) surveillance system. We sampled visits from four hospitals from July 2009 to June 2012 with diagnosis codes of 965.00, 965.01, 965.02 and 965.09 (poisoning by opiates and related narcotics) and/or an external cause of injury code of E850.0-E850.2 (accidental poisoning by opiates and related narcotics), and developed a novel case definition to determine in which cases opioid poisoning prompted the ED visit. We calculated the percentage of visits coded for opioid poisoning that were clinically significant and compared it to the percentage of visits coded for poisoning by non-opioid agents in which there was actually poisoning by an opioid agent. We created a multivariate regression model to determine if other collected triage data can improve the positive predictive value of diagnosis codes alone for detecting clinically significant opioid poisoning. 70.1 % of visits (Standard Error 2.4 %) coded for opioid poisoning were primarily prompted by opioid poisoning. The remainder of visits represented opioid exposure in the setting of other primary diseases. Among non-opioid poisoning codes reviewed, up to 36 % were reclassified as an opioid poisoning. In multivariate analysis, only naloxone use improved the positive predictive value of ICD-9-CM codes for identifying clinically significant opioid poisoning, but was associated with a high false negative rate. This surveillance mechanism identifies many clinically significant opioid overdoses with a high positive predictive value. With further validation, it may help target control measures such as prescriber education and pharmacy monitoring.
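
    The headline figure is a positive predictive value; as a reminder of the arithmetic, with illustrative counts only (not the study's data):

```python
# Arithmetic behind the reported positive predictive value and sensitivity;
# the counts are illustrative only, not the study's data.
def ppv(true_pos, false_pos):
    return true_pos / (true_pos + false_pos)

def sensitivity(true_pos, false_neg):
    return true_pos / (true_pos + false_neg)

# e.g. 70 of 100 code-flagged visits were clinically significant poisonings,
# while 30 clinically significant poisonings were missed by the codes.
print(f"PPV = {ppv(70, 30):.2f}, sensitivity = {sensitivity(70, 30):.2f}")
```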

  3. Improved Power Decoding of One-Point Hermitian Codes

    DEFF Research Database (Denmark)

    Puchinger, Sven; Bouw, Irene; Rosenkilde, Johan Sebastian Heesemann

    2017-01-01

    We propose a new partial decoding algorithm for one-point Hermitian codes that can decode up to the same number of errors as the Guruswami–Sudan decoder. Simulations suggest that it has a similar failure probability as the latter one. The algorithm is based on a recent generalization of the power decoding algorithm for Reed–Solomon codes and does not require an expensive root-finding step. In addition, it promises improvements for decoding interleaved Hermitian codes...

  4. Improvements of MCOR: A Monte Carlo depletion code system for fuel assembly reference calculations

    Energy Technology Data Exchange (ETDEWEB)

    Tippayakul, C.; Ivanov, K. [Pennsylvania State Univ., Univ. Park (United States); Misu, S. [AREVA NP GmbH, An AREVA and SIEMENS Company, Erlangen (Germany)

    2006-07-01

    This paper presents the improvements of MCOR, a Monte Carlo depletion code system for fuel assembly reference calculations. The improvements of MCOR were initiated by the cooperation between the Penn State Univ. and AREVA NP to enhance the original Penn State Univ. MCOR version in order to be used as a new Monte Carlo depletion analysis tool. Essentially, a new depletion module using KORIGEN is utilized to replace the existing ORIGEN-S depletion module in MCOR. Furthermore, the online burnup cross section generation by the Monte Carlo calculation is implemented in the improved version instead of using the burnup cross section library pre-generated by a transport code. Other code features have also been added to make the new MCOR version easier to use. This paper, in addition, presents the result comparisons of the original and the improved MCOR versions against CASMO-4 and OCTOPUS. It was observed in the comparisons that there were quite significant improvements of the results in terms of k-infinity, fission rate distributions and isotopic contents. (authors)

  5. Training and support to improve ICD coding quality: A controlled before-and-after impact evaluation.

    Science.gov (United States)

    Dyers, Robin; Ward, Grant; Du Plooy, Shane; Fourie, Stephanus; Evans, Juliet; Mahomed, Hassan

    2017-05-24

    The proposed National Health Insurance policy for South Africa (SA) requires hospitals to maintain high-quality International Statistical Classification of Diseases (ICD) codes for patient records. While considerable strides had been made to improve ICD coding coverage by digitising the discharge process in the Western Cape Province, further intervention was required to improve data quality. The aim of this controlled before-and-after study was to evaluate the impact of a clinician training and support initiative to improve ICD coding quality, by comparing ICD coding quality between two central hospitals in the Western Cape before and after the implementation of a training and support initiative for clinicians at one of the sites. The difference in differences in data quality between the intervention site and the control site was calculated. Multiple logistic regression was also used to determine the odds of data quality improvement after the intervention and to adjust for potential differences between the groups. The intervention had a positive impact of 38.0% on ICD coding completeness over and above changes that occurred at the control site. Relative to the baseline, patient records at the intervention site had a 6.6 (95% confidence interval 3.5 - 16.2) adjusted odds ratio of having a complete set of ICD codes for an admission episode after the introduction of the training and support package. The findings on the impact on ICD coding accuracy were not significant. There is sufficient pragmatic evidence that a training and support package will have a considerable positive impact on ICD coding completeness in the SA setting.
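
    A minimal sketch of the two analyses named above, on synthetic record-level data: a difference-in-differences on coding completeness and an adjusted odds ratio from a logistic regression with a site-by-period interaction. The simulated effect sizes are arbitrary, not the study's.

```python
# Sketch of the two analyses described above on synthetic record-level data:
# a difference-in-differences on coding completeness and an adjusted odds
# ratio from a logistic regression with a site-by-period interaction term.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 4000
site = rng.integers(0, 2, n)            # 1 = intervention hospital
period = rng.integers(0, 2, n)          # 1 = after the training package
p = 0.40 + 0.02 * site + 0.05 * period + 0.38 * site * period   # arbitrary effects
complete = (rng.random(n) < p).astype(float)   # record has a full ICD code set

def rate(mask):
    return complete[mask].mean()

did = (rate((site == 1) & (period == 1)) - rate((site == 1) & (period == 0))) - \
      (rate((site == 0) & (period == 1)) - rate((site == 0) & (period == 0)))
print(f"difference in differences: {did:.3f}")

X = sm.add_constant(np.column_stack([site, period, site * period]))
fit = sm.Logit(complete, X).fit(disp=0)
print(f"adjusted OR for the interaction term: {np.exp(fit.params[3]):.2f}")
```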

  6. Improvements to the code alpha

    International Nuclear Information System (INIS)

    Aleman, J.R.; Domenech, J.

    1992-01-01

    The computer code ALFA was created for the calculation of the heat transfer coefficient and the hydraulic friction factor at various points of a nuclear power plant with a VVER-type reactor. In this paper the algorithm of the program is described. At the same time, several improvements introduced to the original model are highlighted. Results are reported for a wide range of temperatures and pressures within a cooling channel of the core.

  7. Training and support to improve ICD coding quality: A controlled before-and-after impact evaluation

    Directory of Open Access Journals (Sweden)

    Robin Dyers

    2017-06-01

    Full Text Available Background. The proposed National Health Insurance policy for South Africa (SA) requires hospitals to maintain high-quality International Statistical Classification of Diseases (ICD) codes for patient records. While considerable strides had been made to improve ICD coding coverage by digitising the discharge process in the Western Cape Province, further intervention was required to improve data quality. The aim of this controlled before-and-after study was to evaluate the impact of a clinician training and support initiative to improve ICD coding quality. Objective. To compare ICD coding quality between two central hospitals in the Western Cape before and after the implementation of a training and support initiative for clinicians at one of the sites. Methods. The difference in differences in data quality between the intervention site and the control site was calculated. Multiple logistic regression was also used to determine the odds of data quality improvement after the intervention and to adjust for potential differences between the groups. Results. The intervention had a positive impact of 38.0% on ICD coding completeness over and above changes that occurred at the control site. Relative to the baseline, patient records at the intervention site had a 6.6 (95% confidence interval 3.5 - 16.2) adjusted odds ratio of having a complete set of ICD codes for an admission episode after the introduction of the training and support package. The findings on impact on ICD coding accuracy were not significant. Conclusion. There is sufficient pragmatic evidence that a training and support package will have a considerable positive impact on ICD coding completeness in the SA setting.

  8. Improvement of a combustion model in MELCOR code

    International Nuclear Information System (INIS)

    Ogino, Masao; Hashimoto, Takashi

    1999-01-01

    NUPEC has been improving a hydrogen combustion model in the MELCOR code for severe accident analysis. In the proposed combustion model, the flame velocity in a node was predicted using five different flame front shapes: fireball, prism, bubble, spherical jet, and plane jet. For validation of the proposed model, the results of the Battelle multi-compartment hydrogen combustion tests were used. The test cases selected for the study were Hx-6, 13, 14, 20 and Ix-2, which had two, three or four compartments under homogeneous hydrogen concentrations of 5 to 10 vol%. On the whole, the proposed model could predict the combustion behavior in multi-compartment containment geometry well. The MELCOR code, incorporating the present combustion model, can simulate combustion behavior during a severe accident with acceptable computing time and some degree of accuracy. The applicability study of the improved MELCOR code for actual reactor plants will be continued. (author)

  9. Improvements and validation of the transient analysis code MOREL for molten salt reactors

    International Nuclear Information System (INIS)

    Zhuang Kun; Zheng Youqi; Cao Liangzhi; Hu Tianliang; Wu Hongchun

    2017-01-01

    The liquid fuel salt used in molten salt reactors (MSRs) serves as the fuel and coolant simultaneously. On the one hand, the delayed neutron precursors circulate in the whole primary loop and part of them decay outside the core. On the other hand, the fission heat is carried off directly by the fuel flow. These two features require a new analysis method coupling fluid flow, heat transfer and neutronics. In this paper, the recent update of the MOREL code is presented. The update includes: (1) an improved quasi-static method for the kinetics equation with a convection term is developed; (2) a multi-channel thermal-hydraulic model is developed based on the geometric features of MSRs; (3) the Variational Nodal Method is used to solve the neutron diffusion equation instead of the original analytic basis function expansion nodal method. The update brings a significant improvement in the efficiency of the MOREL code, and the capability of the MOREL code is extended to real core simulation with feedback. The numerical results and experimental data gained from the Molten Salt Reactor Experiment (MSRE) are used to verify and validate the updated MOREL code. The results agree well with the experimental data, which proves that the new development of the MOREL code is correct and effective. (author)

  10. Improvement of group collapsing in TRANSX code

    International Nuclear Information System (INIS)

    Jeong, Hyun Tae; Kim, Young Cheol; Kim, Young In; Kim, Young Kyun

    1996-07-01

    The cross section generating and processing computer code TRANSX version 2.15 in the K-CORE system, being developed by the KAERI LMR core design technology development team, produces various cross section input files appropriate for the flux calculation options from the cross section library MATXS. In this report, the group collapsing function of TRANSX has been improved to utilize the zone-averaged flux file RZFLUX, written in double precision, as the flux weighting function. As a result, an iterative calculation system using the double-precision RZFLUX, consisting of the cross section data library file MATXS, the effective cross section producing and processing code TRANSX, and the transport theory calculation code TWODANT, has been set up and verified through a sample model calculation. 4 refs. (Author)

  11. Recent SCDAP/RELAP5 code applications and improvements

    International Nuclear Information System (INIS)

    Harvego, E.A.; Ghan, L.S.; Knudson, D.L.; Siefken, L.J.

    1998-01-01

    This paper summarizes (1) a recent application of the severe accident analysis code SCDAP/RELAP5/MOD3.1, and (2) development and assessment activities associated with the release of SCDAP/RELAP5/MOD3.2. The Nuclear Regulatory Commission (NRC) has been evaluating the integrity of steam generator tubes during severe accidents. MOD3.1 has been used to support that evaluation. Studies indicate that the pressurizer surge line will fail before any steam generator tubes are damaged. Thus, core decay energy would be released as steam through the surge line and the tube wall would be spared from exposure to prolonged flow of high-temperature steam. The latest code version, MOD3.2, contains several improvements to models that address both the early phase and late phase of a severe accident. The impact of these improvements on the overall code capabilities has been assessed. Results of the assessment are summarized in this paper

  12. Software quality and process improvement in scientific simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, J.; Webster, R. [Los Alamos National Lab., NM (United States)

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.

  13. Evaluation of three coding schemes designed for improved data communication

    Science.gov (United States)

    Snelsire, R. W.

    1974-01-01

    Three coding schemes designed for improved data communication are evaluated. Four block codes are evaluated relative to a quality function, which depends on both the amount of data rejected and the error rate. The Viterbi maximum-likelihood decoding algorithm is reviewed as a decoding procedure. The evaluation is obtained by simulating the system on a digital computer. Short-constraint-length, rate-1/2 quick-look codes are studied, and their performance is compared to that of general nonsystematic codes.

  14. Improved predictions of nuclear reaction rates with the TALYS reaction code for astrophysical applications

    International Nuclear Information System (INIS)

    Goriely, S.; Hilaire, S.; Koning, A.J

    2008-01-01

    Context. Nuclear reaction rates for astrophysical applications are traditionally determined on the basis of Hauser-Feshbach reaction codes. These codes adopt a number of approximations that have never been tested, such as a simplified width fluctuation correction, the neglect of delayed or multiple-particle emission during the electromagnetic decay cascade, or the absence of the pre-equilibrium contribution at increasing incident energies. Aims. The reaction code TALYS has been recently updated to estimate the Maxwellian-averaged reaction rates that are of astrophysical relevance. These new developments enable the reaction rates to be calculated with increased accuracy and reliability and the approximations of previous codes to be investigated. Methods. The TALYS predictions for the thermonuclear rates of relevance to astrophysics are detailed and compared with those derived by widely used codes for the same nuclear ingredients. Results. It is shown that TALYS predictions may differ significantly from those of previous codes, in particular for nuclei for which no or little nuclear data is available. The pre-equilibrium process is shown to influence the astrophysics rates of exotic neutron-rich nuclei significantly. For the first time, the Maxwellian-averaged (n, 2n) reaction rate is calculated for all nuclei and its competition with the radiative capture rate is discussed. Conclusions. The TALYS code provides a new tool to estimate all nuclear reaction rates of relevance to astrophysics with improved accuracy and reliability. (authors)
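
    For reference, the Maxwellian-averaged rate that such statistical-model codes evaluate has the standard form below (reduced mass mu, temperature T); TALYS-specific corrections are not shown.

```latex
% Standard Maxwellian-averaged reaction rate for reduced mass \mu and temperature T
N_A \langle \sigma v \rangle (T) =
  N_A \left( \frac{8}{\pi \mu} \right)^{1/2} (kT)^{-3/2}
  \int_0^{\infty} \sigma(E)\, E \, \exp\!\left( -\frac{E}{kT} \right) \mathrm{d}E
```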

  15. Improvement of multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3

    International Nuclear Information System (INIS)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok

    1998-09-01

    The MARS (Multi-dimensional Analysis of Reactor Safety) code is a multi-dimensional, best-estimate thermal-hydraulic system analysis code. This report describes the new features that have been improved in the MARS 1.3 code since the release of MARS 1.3 in July 1998. The new features include: implementation of a point kinetics model in the 3D module; unification of the heat structure model; extension of the control function to the 3D module variables; and improvement of the 3D module input check function. Each of the items has been implemented in the developmental version of the MARS 1.3.1 code and then independently verified and assessed. The effectiveness of the new features is well verified and it is shown that these improvements greatly extend the code capability and enhance the user friendliness. Relevant input data changes are also described. In addition to the improvements, this report briefly summarizes the future code developmental activities that are being carried out or planned, such as coupling of MARS 1.3 with the containment code CONTEMPT and the three-dimensional reactor kinetics code MASTER 2.0. (author). 8 refs

  16. Improvement of multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok

    1998-09-01

    The MARS (Multi-dimensional Analysis of Reactor Safety) code is a multi-dimensional, best-estimate thermal-hydraulic system analysis code. This report describes the new features that have been improved in the MARS 1.3 code since the release of MARS 1.3 in July 1998. The new features include: implementation of a point kinetics model in the 3D module; unification of the heat structure model; extension of the control function to the 3D module variables; and improvement of the 3D module input check function. Each of the items has been implemented in the developmental version of the MARS 1.3.1 code and then independently verified and assessed. The effectiveness of the new features is well verified and it is shown that these improvements greatly extend the code capability and enhance the user friendliness. Relevant input data changes are also described. In addition to the improvements, this report briefly summarizes the future code developmental activities that are being carried out or planned, such as coupling of MARS 1.3 with the containment code CONTEMPT and the three-dimensional reactor kinetics code MASTER 2.0. (author). 8 refs.

  17. Improvement and test calculation on basic code or sodium-water reaction jet

    Energy Technology Data Exchange (ETDEWEB)

    Saito, Yoshinori; Itooka, Satoshi [Advanced Reactor Engineering Center, Hitachi Works, Hitachi Ltd., Hitachi, Ibaraki (Japan); Okabe, Ayao; Fujimata, Kazuhiro; Sakurai, Tomoo [Consulting Engineering Dept., Hitachi Engineering Co., Ltd., Hitachi, Ibaraki (Japan)

    1999-03-01

    In selecting a reasonable DBL (design basis water leak rate) for a steam generator (SG), it is necessary to improve the analytical method for estimating the sodium temperature during failure propagation due to overheating. An improvement of the basic code for the sodium-water reaction (SWR) jet was performed for an actual-scale SG. The improvement points of the code are as follows: (1) introduction of advanced models such as heat transfer between the jet and structure (tube array), the cooling effect of the structure, and heat transfer between analytic cells, and (2) model improvement for heat transfer between two-phase flow and porous media. Test calculations using the improved code (LEAP-JET ver.1.30) were carried out with the conditions of the SWAT-3 Run-19 test and an actual-scale SG. It is confirmed that the calculated SWR jet behavior is reasonable, as is the influence of each model on the analysis results. Code integration with the blow down analysis code (LEAP-BLOW) was also studied. It was found suitable to improve LEAP-JET as one of the LEAP-BLOW models and to integrate it into that code. In addition, the setting of boundary conditions was improved and an interface program to transfer the analytical results of LEAP-BLOW was developed, in order to account simply for the cooling effect of the coolant in the tube. However, verification of the code with the new SWAT-1 and SWAT-3 test data planned for the future is necessary because LEAP-JET is still under development, and further advancement needs to be planned. (author)

  18. Improvement and test calculation on basic code or sodium-water reaction jet

    International Nuclear Information System (INIS)

    Saito, Yoshinori; Itooka, Satoshi; Okabe, Ayao; Fujimata, Kazuhiro; Sakurai, Tomoo

    1999-03-01

    In selecting a reasonable DBL (design basis water leak rate) for a steam generator (SG), it is necessary to improve the analytical method for estimating the sodium temperature during failure propagation due to overheating. An improvement of the basic code for the sodium-water reaction (SWR) jet was performed for an actual-scale SG. The improvement points of the code are as follows: (1) introduction of advanced models such as heat transfer between the jet and structure (tube array), the cooling effect of the structure, and heat transfer between analytic cells, and (2) model improvement for heat transfer between two-phase flow and porous media. Test calculations using the improved code (LEAP-JET ver.1.30) were carried out with the conditions of the SWAT-3 Run-19 test and an actual-scale SG. It is confirmed that the calculated SWR jet behavior is reasonable, as is the influence of each model on the analysis results. Code integration with the blow down analysis code (LEAP-BLOW) was also studied. It was found suitable to improve LEAP-JET as one of the LEAP-BLOW models and to integrate it into that code. In addition, the setting of boundary conditions was improved and an interface program to transfer the analytical results of LEAP-BLOW was developed, in order to account simply for the cooling effect of the coolant in the tube. However, verification of the code with the new SWAT-1 and SWAT-3 test data planned for the future is necessary because LEAP-JET is still under development, and further advancement needs to be planned. (author)

  19. A New Image Encryption Technique Combining Hill Cipher Method, Morse Code and Least Significant Bit Algorithm

    Science.gov (United States)

    Nofriansyah, Dicky; Defit, Sarjon; Nurcahyo, Gunadi W.; Ganefri, G.; Ridwan, R.; Saleh Ahmar, Ansari; Rahim, Robbi

    2018-01-01

    Cybercrime is one of the most serious threats. Among the efforts made to reduce cybercrime is the search for new techniques for securing data, such as combinations of cryptography, steganography and watermarking. Cryptography and steganography are growing areas of data security. A combination of cryptography and steganography is one effort to improve data integrity. New techniques are created by combining several algorithms, one of which is the incorporation of the Hill cipher method and Morse code. Morse code is one of the communication codes used in the Scouting field. This code consists of dots and lines. This is a new concept combining modern and classical approaches to maintain data integrity. The combination of these three methods is expected to generate new algorithms to improve the security of data, especially images.
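
    A minimal sketch of two of the three building blocks named in the abstract, a 2x2 Hill cipher over the A-Z alphabet followed by least-significant-bit embedding of the ciphertext into image pixels; the Morse-code stage and the paper's exact construction are not reproduced, and the key, message and cover image are arbitrary examples.

```python
# Sketch of two of the building blocks named above: a 2x2 Hill cipher over the
# A-Z alphabet followed by least-significant-bit (LSB) embedding of the
# ciphertext into image pixels. The Morse-code stage is omitted and the key,
# message and cover image are arbitrary examples.
import numpy as np

KEY = np.array([[3, 3], [2, 5]])                 # invertible mod 26 (det = 9)

def hill_encrypt(text):
    nums = [ord(c) - ord('A') for c in text.upper() if c.isalpha()]
    if len(nums) % 2:
        nums.append(ord('X') - ord('A'))         # pad to an even length
    blocks = np.array(nums).reshape(-1, 2).T     # one column per letter pair
    cipher = (KEY @ blocks) % 26
    return ''.join(chr(int(v) + ord('A')) for v in cipher.T.flatten())

def lsb_embed(image, message):
    bits = np.unpackbits(np.frombuffer(message.encode('ascii'), dtype=np.uint8))
    flat = image.flatten()                       # flatten() returns a copy
    flat[:bits.size] = (flat[:bits.size] & np.uint8(0xFE)) | bits
    return flat.reshape(image.shape)

cover = np.full((16, 16), 128, dtype=np.uint8)   # stand-in grayscale image
cipher_text = hill_encrypt("HELPME")
stego = lsb_embed(cover, cipher_text)
print(cipher_text, "max pixel change:", int(np.abs(stego.astype(int) - cover).max()))
```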

  20. Improvement of JRR-4 core management code system

    International Nuclear Information System (INIS)

    Izumo, H.; Watanabe, S.; Nagatomi, H.; Hori, N.

    2000-01-01

    In the modification of JRR-4, the fuel was changed from 93% high-enriched uranium aluminized fuel to 20% low-enriched uranium silicide fuel, in conformity with the framework of the reduced enrichment program for JAERI research reactors. Along with this change, the JRR-4 core management code system, which estimates the excess reactivity of the core, fuel burn-up and so on, was improved as well. It had been difficult for users to operate the former code system because its input-output was in text form. In the new code system (COMMAS-JRR), however, users are able to operate the code system without using the difficult text-form input. The estimated excess reactivity of the JRR-4 LEU fuel core showed very good agreement with the measured values. The strong points of this new code system are that it can be operated simply through window-form screens on a personal workstation equipped with a graphical user interface (GUI), and that it accurately estimates the specific characteristics of the LEU core. (author)

  1. Developing improved MD codes for understanding processive cellulases

    International Nuclear Information System (INIS)

    Crowley, M F; Nimlos, M R; Himmel, M E; Uberbacher, E C; Brooks III, C L; Walker, R C

    2008-01-01

    The mechanism of action of cellulose-degrading enzymes is illuminated through a multidisciplinary collaboration that uses molecular dynamics (MD) simulations and expands the capabilities of MD codes to allow simulations of enzymes and substrates on petascale computational facilities. There is a class of glycoside hydrolase enzymes called cellulases that are thought to decrystallize and processively depolymerize cellulose using biochemical processes that are largely not understood. Understanding the mechanisms involved and improving the efficiency of this hydrolysis process through computational models and protein engineering presents a compelling grand challenge. A detailed understanding of cellulose structure, dynamics and enzyme function at the molecular level is required to direct protein engineers to the right modifications or to understand if natural thermodynamic or kinetic limits are in play. Much can be learned about processivity by conducting carefully designed molecular dynamics (MD) simulations of the binding and catalytic domains of cellulases with various substrate configurations, solvation models and thermodynamic protocols. Most of these numerical experiments, however, will require significant modification of existing code and algorithms in order to efficiently use current (terascale) and future (petascale) hardware to the degree of parallelism necessary to simulate a system of the size proposed here. This work will develop MD codes that can efficiently use terascale and petascale systems, not just for simple classical MD simulations, but also for more advanced methods, including umbrella sampling with complex restraints and reaction coordinates, transition path sampling, steered molecular dynamics, and quantum mechanical/molecular mechanical simulations of systems the size of cellulose degrading enzymes acting on cellulose

  2. Improved virtual channel noise model for transform domain Wyner-Ziv video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2009-01-01

    Distributed video coding (DVC) has been proposed as a new video coding paradigm to deal with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. A virtual channel noise model is utilized at the decoder to estimate the noise distribution between the side information frame and the original frame. This is one of the most important aspects influencing the coding performance of DVC. Noise models with different granularity have been proposed. In this paper, an improved noise model for transform domain Wyner-Ziv video coding is proposed, which utilizes cross-band correlation to estimate the Laplacian parameters more accurately. Experimental results show that the proposed noise model can improve the rate-distortion (RD) performance.
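
    A minimal per-band fit of the Laplacian noise model is sketched below: the scale is estimated from the residual between the original band and the side-information band. The cross-band refinement proposed in the paper is not reproduced, and the bands are synthetic.

```python
# Minimal per-band Laplacian noise-model fit: estimate the scale parameter from
# the residual between the original band and the side-information band. The
# cross-band refinement proposed in the paper is not reproduced here.
import numpy as np

def laplacian_alpha(original_band, side_info_band):
    """Return alpha for the model p(x) = (alpha/2) * exp(-alpha*|x|)."""
    residual = original_band.astype(float) - side_info_band.astype(float)
    b = np.mean(np.abs(residual))        # maximum-likelihood estimate of the scale
    return 1.0 / max(b, 1e-9)            # alpha = 1 / b

rng = np.random.default_rng(3)
orig = rng.laplace(0.0, 4.0, size=(72, 88))              # synthetic coefficient band
side = orig + rng.laplace(0.0, 2.0, size=orig.shape)     # noisy side information
print(f"estimated alpha = {laplacian_alpha(orig, side):.3f}")   # about 1/2
```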

  3. Improvement on reaction model for sodium-water reaction jet code and application analysis

    International Nuclear Information System (INIS)

    Itooka, Satoshi; Saito, Yoshinori; Okabe, Ayao; Fujimata, Kazuhiro; Murata, Shuuichi

    2000-03-01

    In selecting a reasonable DBL for a steam generator (SG), it is necessary to improve the analytical method for estimating the sodium temperature during failure propagation due to overheating. An improvement of the sodium-water reaction (SWR) jet code (LEAP-JET ver.1.30) and an application analysis to the water injection tests for confirmation of the code's validity were performed. In the improvement of the code, a gas-liquid interface area density model was introduced to develop a chemical reaction model with little dependence on the calculation mesh size. Test calculations using the improved code (LEAP-JET ver.1.40) were carried out with the conditions of the SWAT-3 Run-19 test and an actual-scale SG. It is confirmed that the SWR jet behavior in the results and the influence of the model on the analysis results are reasonable. For the application analysis to the water injection tests, water injection behavior and SWR jet behavior analyses for the new SWAT-1 (SWAT-1R) and SWAT-3 (SWAT-3R) tests were performed using the LEAP-BLOW code and the LEAP-JET code. In the application analysis with the LEAP-BLOW code, a parameter survey study was performed. As a result, the injection nozzle diameter needed to simulate the water leak rate was confirmed. In the application analysis with the LEAP-JET code, the temperature behavior of the SWR jet was investigated. (author)

  4. Development of improved methods for the LWR lattice physics code EPRI-CELL

    International Nuclear Information System (INIS)

    Williams, M.L.; Wright, R.Q.; Barhen, J.

    1982-07-01

    A number of improvements have been made by ORNL to the lattice physics code EPRI-CELL (E-C), which is widely used by utilities for the analysis of power reactors. The code modifications were made mainly in the thermal and epithermal routines and resulted in improved reactor physics approximations and more efficient running times. The improvements in the thermal flux calculation included implementation of a group-dependent rebalance procedure to accelerate the iterative process and a more rigorous calculation of interval-to-interval collision probabilities. The epithermal resonance shielding methods used in the code have been extensively studied to determine their major approximations and to examine the sensitivity of computed results to these approximations. The study has resulted in several improvements in the original methodology

  5. Improved accuracy of co-morbidity coding over time after the introduction of ICD-10 administrative data.

    Science.gov (United States)

    Januel, Jean-Marie; Luthi, Jean-Christophe; Quan, Hude; Borst, François; Taffé, Patrick; Ghali, William A; Burnand, Bernard

    2011-08-18

    Co-morbidity information derived from administrative data needs to be validated to allow its regular use. We assessed the evolution in the accuracy of coding for Charlson and Elixhauser co-morbidities at three time points over a 5-year period, following the introduction of International Classification of Diseases, 10th Revision (ICD-10), coding of hospital discharges. Cross-sectional time trend evaluation study of coding accuracy using hospital chart data of 3499 randomly selected patients who were discharged in 1999, 2001 and 2003 from two teaching and one non-teaching hospital in Switzerland. We measured sensitivity, positive predictive value and kappa values for agreement between administrative data coded with ICD-10 and chart data as the 'reference standard' for recording 36 co-morbidities. For the 17 Charlson co-morbidities, the sensitivity - median (min-max) - was 36.5% (17.4-64.1) in 1999, 42.5% (22.2-64.6) in 2001 and 42.8% (8.4-75.6) in 2003. For the 29 Elixhauser co-morbidities, the sensitivity was 34.2% (1.9-64.1) in 1999, 38.6% (10.5-66.5) in 2001 and 41.6% (5.1-76.5) in 2003. Between 1999 and 2003, sensitivity estimates increased for 30 co-morbidities and decreased for 6 co-morbidities. The increase in sensitivities was statistically significant for six conditions and the decrease significant for one. Kappa values increased for 29 co-morbidities and decreased for seven. The accuracy of administrative data in recording clinical conditions improved slightly between 1999 and 2003. These findings are of relevance to all jurisdictions introducing new coding systems, because they demonstrate a phenomenon of improved administrative data accuracy that may relate to a coding 'learning curve' with the new coding system.
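
    The agreement measures used in the study, computed from a 2x2 table of administrative coding versus chart review (illustrative counts only, not the study's data):

```python
# Agreement measures used above, computed from a 2x2 table of administrative
# coding (test) versus chart review (reference standard); illustrative counts.
def agreement(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    ppv = tp / (tp + fp)
    p_obs = (tp + tn) / n                                            # observed agreement
    p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2   # agreement by chance
    kappa = (p_obs - p_exp) / (1 - p_exp)
    return sens, ppv, kappa

sens, ppv, kappa = agreement(tp=120, fp=30, fn=180, tn=870)
print(f"sensitivity={sens:.2f}, PPV={ppv:.2f}, kappa={kappa:.2f}")
```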

  6. Improvement of level-1 PSA computer code package

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Woon; Park, C. K.; Kim, K. Y.; Han, S. H.; Jung, W. D.; Chang, S. C.; Yang, J. E.; Sung, T. Y.; Kang, D. I.; Park, J. H.; Lee, Y. H.; Kim, S. H.; Hwang, M. J.; Choi, S. Y.

    1997-07-01

    This year is the fifth (final) year of phase I of the Government-sponsored Mid- and Long-term Nuclear Power Technology Development Project. The scope of this subproject, titled 'The improvement of level-1 PSA computer codes', is divided into two main activities: (1) improvement of level-1 PSA methodology, and (2) development of a methodology for applying PSA techniques to the operation and maintenance of nuclear power plants. The level-1 PSA code KIRAP has been converted to the PC-Windows environment. To improve the efficiency of performing PSA, a fast cutset generation algorithm and an analytical technique for handling logical loops in fault tree modeling have been developed. Using about 30 foreign generic data sources, a generic component reliability database (GDB) has been developed, taking into account the dependency among source data. A computer program which handles the dependency among data sources has also been developed, based on a three-stage Bayesian updating technique. Common cause failure (CCF) analysis methods have been reviewed and a CCF database has been established; impact vectors can be estimated from this CCF database. A computer code, called MPRIDP, which handles the CCF database has also been developed, and a CCF analysis reflecting the plant-specific defensive strategy against CCF events has been performed. A risk monitor computer program, called Risk Monster, is being developed for application to the operation and maintenance of nuclear power plants. The PSA application technique has been applied to review the feasibility study of on-line maintenance and to the prioritization of in-service testing (IST) of motor-operated valves (MOVs). Finally, root cause analysis (RCA) and reliability-centered maintenance (RCM) technologies have been adopted and applied to improving the reliability of the emergency diesel generators (EDGs) of nuclear power plants. To help the RCA and RCM analyses, two software programs, EPIS and RAM Pro, have been developed. (author). 129 refs., 20 tabs., 60 figs.

  7. Improvement of level-1 PSA computer code package

    International Nuclear Information System (INIS)

    Kim, Tae Woon; Park, C. K.; Kim, K. Y.; Han, S. H.; Jung, W. D.; Chang, S. C.; Yang, J. E.; Sung, T. Y.; Kang, D. I.; Park, J. H.; Lee, Y. H.; Kim, S. H.; Hwang, M. J.; Choi, S. Y.

    1997-07-01

    This year is the fifth (final) year of phase I of the Government-sponsored Mid- and Long-term Nuclear Power Technology Development Project. The scope of this subproject, titled 'The improvement of level-1 PSA Computer Codes', is divided into two main activities: (1) improvement of level-1 PSA methodology, and (2) development of a methodology for applying PSA techniques to the operation and maintenance of nuclear power plants. The level-1 PSA code KIRAP has been converted to the PC-Windows environment. To improve the efficiency of performing PSA, a fast cutset generation algorithm and an analytical technique for handling logical loops in fault tree modeling have been developed. Using about 30 foreign generic data sources, a generic component reliability database (GDB) has been developed, taking into account the dependency among source data. A computer program which handles the dependency among data sources has also been developed, based on a three-stage Bayesian updating technique. Common cause failure (CCF) analysis methods have been reviewed and a CCF database has been established; impact vectors can be estimated from this CCF database. A computer code, called MPRIDP, which handles the CCF database has also been developed, and a CCF analysis reflecting the plant-specific defensive strategy against CCF events has been performed. A risk monitor computer program, called Risk Monster, is being developed for application to the operation and maintenance of nuclear power plants. The PSA application technique has been applied to review the feasibility study of on-line maintenance and to the prioritization of in-service testing (IST) of motor-operated valves (MOVs). Finally, root cause analysis (RCA) and reliability-centered maintenance (RCM) technologies have been adopted and applied to improving the reliability of the emergency diesel generators (EDGs) of nuclear power plants. To help the RCA and RCM analyses, two software programs, EPIS and RAM Pro, have been developed. (author). 129 refs., 20 tabs., 60 figs

  8. APPLE-3: improvement of APPLE for neutron and gamma-ray flux, spectrum and reaction rate plotting code, and of its code manual

    International Nuclear Information System (INIS)

    Kawasaki, Hiromitu; Maki, Koichi; Seki, Yasushi.

    1991-03-01

    The code APPLE was produced in 1976 for calculating and plotting the tritium breeding ratio and tritium production rate distributions. The code was improved as 'APPLE-2' in 1982 to calculate and plot not only the tritium breeding ratio but also the distributions of neutron and gamma-ray fluxes, their spectra, nuclear heating rates and other reaction rates, and dose rate distributions during operation and after shutdown. The APPLE-2 code can calculate and plot these nuclear properties from the neutron and gamma-ray fluxes computed by ANISN (a one-dimensional transport code), DOT3.5 (a two-dimensional transport code) and MORSE (a three-dimensional Monte Carlo code). We have revised the APPLE-2 code as 'APPLE-3' by adding many functions requested by users in line with recent progress in fusion reactor nuclear design. During the modification of APPLE-2, a number of inconsistencies were also found between the code manual and the input data of the code. In the present report, the new functions added to APPLE-2 and the improved users' manual are explained. (author)

  9. Improvements to the nuclear model code GNASH for cross section calculations at higher energies

    International Nuclear Information System (INIS)

    Young, P.G.; Chadwick, M.B.

    1994-01-01

    The nuclear model code GNASH, which in the past has been used predominantly for incident particle energies below 20 MeV, has been modified extensively for calculations at higher energies. The model extensions and improvements are described in this paper, and their significance is illustrated by comparing calculations with experimental data for incident energies up to 160 MeV

  10. Determining coding CpG islands by identifying regions significant for pattern statistics on Markov chains.

    Science.gov (United States)

    Singer, Meromit; Engström, Alexander; Schönhuth, Alexander; Pachter, Lior

    2011-09-23

    Recent experimental and computational work confirms that CpGs can be unmethylated inside coding exons, thereby showing that codons may be subjected to both genomic and epigenomic constraint. It is therefore of interest to identify coding CpG islands (CCGIs) that are regions inside exons enriched for CpGs. The difficulty in identifying such islands is that coding exons exhibit sequence biases determined by codon usage and constraints that must be taken into account. We present a method for finding CCGIs that showcases a novel approach we have developed for identifying regions of interest that are significant (with respect to a Markov chain) for the counts of any pattern. Our method begins with the exact computation of tail probabilities for the number of CpGs in all regions contained in coding exons, and then applies a greedy algorithm for selecting islands from among the regions. We show that the greedy algorithm provably optimizes a biologically motivated criterion for selecting islands while controlling the false discovery rate. We applied this approach to the human genome (hg18) and annotated CpG islands in coding exons. The statistical criterion we apply to evaluating islands reduces the number of false positives in existing annotations, while our approach to defining islands reveals significant numbers of undiscovered CCGIs in coding exons. Many of these appear to be examples of functional epigenetic specialization in coding exons.
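
    A minimal sketch may help make the underlying idea concrete: a candidate region is scored by the tail probability of its CpG count under a first-order Markov background model. The paper computes these tail probabilities exactly and then applies the greedy island selection; the sketch below, with made-up transition probabilities, only estimates the tail probability by Monte Carlo sampling and is not the authors' implementation.

        # Illustrative sketch (not the authors' code): score a candidate region by the
        # probability of seeing at least the observed number of "CG" dinucleotides in a
        # random sequence of the same length drawn from a first-order Markov background
        # model. The exact computation in the paper is replaced here by sampling.
        import random

        def count_cpg(seq):
            return sum(1 for i in range(len(seq) - 1) if seq[i:i + 2] == "CG")

        def markov_tail_pvalue(observed, length, start, trans, n_samples=10000, seed=0):
            """Estimate P(#CpG >= observed) under the Markov background model."""
            rng = random.Random(seed)
            bases = "ACGT"
            hits = 0
            for _ in range(n_samples):
                prev = rng.choices(bases, weights=[start[b] for b in bases])[0]
                count = 0
                for _ in range(length - 1):
                    nxt = rng.choices(bases, weights=[trans[prev][b] for b in bases])[0]
                    if prev == "C" and nxt == "G":
                        count += 1
                    prev = nxt
                if count >= observed:
                    hits += 1
            return hits / n_samples

        # Hypothetical exon-like background model, mildly depleted in CpG.
        start = {b: 0.25 for b in "ACGT"}
        trans = {b: ({"A": 0.27, "C": 0.25, "G": 0.21, "T": 0.27} if b == "C"
                     else {"A": 0.27, "C": 0.25, "G": 0.25, "T": 0.23}) for b in "ACGT"}
        region = "CGCGTACGCGGACGTTCGCG"
        p = markov_tail_pvalue(count_cpg(region), len(region), start, trans)
        print(f"observed CpGs: {count_cpg(region)}, estimated tail p-value: {p:.4f}")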

  11. Improving the quality of clinical coding: a comprehensive audit model

    Directory of Open Access Journals (Sweden)

    Hamid Moghaddasi

    2014-04-01

    Full Text Available Introduction: The review of medical records with the aim of assessing the quality of codes has long been conducted in different countries. Auditing medical coding, as an instructive approach, can help to review the quality of codes objectively using defined attributes, and this in turn leads to improvement of the quality of codes. Method: The current study aimed to present a model for auditing the quality of clinical codes. The audit model was formed after reviewing other audit models and considering their strengths and weaknesses. A clear definition was presented for each quality attribute, and more detailed criteria were then set for assessing the quality of codes. Results: The audit tool, based on the quality attributes of legibility, relevancy, completeness, accuracy, definition and timeliness, led to the development of an audit model for assessing the quality of medical coding. The Delphi technique was then used to confirm the validity of the model. Conclusion: The inclusive audit model designed here could provide a reliable and valid basis for assessing the quality of codes, since it considers more quality attributes and defines them clearly. The inter-observer check suggested in the auditing method is of particular importance for ensuring the reliability of coding.

  12. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AVS2

  13. Large-Signal Code TESLA: Improvements in the Implementation and in the Model

    National Research Council Canada - National Science Library

    Chernyavskiy, Igor A; Vlasov, Alexander N; Anderson, Jr., Thomas M; Cooke, Simon J; Levush, Baruch; Nguyen, Khanh T

    2006-01-01

    We describe the latest improvements made in the large-signal code TESLA, which include transformation of the code to a Fortran-90/95 version with dynamical memory allocation and extension of the model...

  14. Improved entropy encoding for high efficient video coding standard

    Directory of Open Access Journals (Sweden)

    B.S. Sunil Kumar

    2018-03-01

    Full Text Available The High Efficiency Video Coding (HEVC) standard has better coding efficiency, but its encoding performance has to be improved to meet the demands of growing multimedia applications. This paper improves the standard entropy encoding by introducing optimized weighting parameters, so that a higher rate of compression can be accomplished than with the standard entropy encoding. The optimization is performed using the recently introduced firefly algorithm. The experimentation is carried out using eight benchmark video sequences, and the PSNR for varying data transmission rates is investigated. A comparative analysis based on the performance statistics is made with the standard entropy encoding. From the obtained results, it is clear that the originality of the decoded video sequence is preserved far better by the proposed method, even though the compression rate is increased. Keywords: Entropy, Encoding, HEVC, PSNR, Compression
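
    The firefly algorithm mentioned above is a generic population-based metaheuristic, so its role here can be illustrated independently of HEVC. The sketch below is a bare-bones firefly loop minimizing a placeholder cost; in the paper the objective would be a rate-distortion measure obtained by actually encoding the test sequences, and all parameter names and values here are illustrative assumptions.

        # Generic firefly-algorithm sketch for tuning a small set of weighting
        # parameters. The cost function is a stand-in, not the HEVC entropy coder.
        import math
        import random

        def placeholder_cost(weights):
            # Pretend the "ideal" weights are all 0.5; lower cost is better.
            return sum((w - 0.5) ** 2 for w in weights)

        def firefly_optimize(cost, dim, n_fireflies=15, n_iter=50,
                             alpha=0.2, beta0=1.0, gamma=1.0, seed=0):
            rng = random.Random(seed)
            pop = [[rng.random() for _ in range(dim)] for _ in range(n_fireflies)]
            costs = [cost(x) for x in pop]
            for _ in range(n_iter):
                for i in range(n_fireflies):
                    for j in range(n_fireflies):
                        if costs[j] < costs[i]:   # firefly j is "brighter" (lower cost)
                            r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                            beta = beta0 * math.exp(-gamma * r2)
                            pop[i] = [a + beta * (b - a) + alpha * (rng.random() - 0.5)
                                      for a, b in zip(pop[i], pop[j])]
                            costs[i] = cost(pop[i])
            best = min(range(n_fireflies), key=lambda k: costs[k])
            return pop[best], costs[best]

        best_w, best_c = firefly_optimize(placeholder_cost, dim=4)
        print("best weights:", [round(w, 3) for w in best_w], "cost:", round(best_c, 5))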

  15. Improvement of FLOWER code and its application in Daya Bay

    International Nuclear Information System (INIS)

    Zhang Shaodong; Zhang Yongxing

    1995-01-01

    FLOWER, a computer code recommended by the USNRC for assessing environmental impact in tidal regions, was adapted and improved so that it can deal with the influence of the drift current along the seashore on the dilution of contaminants and heat at the bay mouth. The code outputs were also extended with more intermediate results, such as average concentrations and temperature values for all tides considered. Finally, the modified code was applied to the dispersion calculation of heat and liquid effluents from the Daya Bay Nuclear Power Plant, and the impacts of routine operation of the plant on Daya Bay sea waters were evaluated

  16. An improved UO2 thermal conductivity model in the ELESTRES computer code

    International Nuclear Information System (INIS)

    Chassie, G.G.; Tochaie, M.; Xu, Z.

    2010-01-01

    This paper describes the improved UO2 thermal conductivity model for use in the ELESTRES (ELEment Simulation and sTRESses) computer code. The ELESTRES computer code models the thermal, mechanical and microstructural behaviour of a CANDU® fuel element under normal operating conditions. The main purpose of the code is to calculate fuel temperatures, fission gas release, internal gas pressure, fuel pellet deformation, and fuel sheath strains for fuel element design and assessment. It is also used to provide initial conditions for evaluating fuel behaviour during high temperature transients. The thermal conductivity of UO2 fuel is one of the key parameters that affect ELESTRES calculations. The existing ELESTRES thermal conductivity model has been assessed and improved based on a large amount of thermal conductivity data from measurements of irradiated and un-irradiated UO2 fuel with different densities. The UO2 thermal conductivity data cover 90% to 99% theoretical density of UO2, temperature up to 3027 K, and burnup up to 1224 MW·h/kg U. The improved thermal conductivity model, which is recommended for a full implementation in the ELESTRES computer code, has reduced the ELESTRES code prediction biases of temperature, fission gas release, and fuel sheath strains when compared with the available experimental data. This improved thermal conductivity model has also been checked with a test version of ELESTRES over the full ranges of fuel temperature, fuel burnup, and fuel density expected in CANDU fuel. (author)

  17. Improvement of Off-site Dose Assessment Code for Operating Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Juyub; Kim, Juyoul; Shin, Kwangyoung [FNC Technology Co. Ltd., Yongin (Korea, Republic of); You, Songjae; Moon, Jongyi [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-05-15

    The XOQDOQ code, which calculates atmospheric dispersion factors, is also included in INDAC. Research on the improvement of the off-site dose assessment system for operating nuclear power plants was performed by KINS in 2011. As a result, the following improvements were derived: separation of dose assessment for new and existing facilities, an update of food ingestion data, consideration of multi-unit operation, and so on. In order to reflect these results, INDAC is being modified. INDAC is an integrated dose assessment code for operating nuclear power plants and consists of three main modules: XOQDOQ, GASDOS and LIQDOS. The modules are being modified in order to improve the accuracy of the assessment and the usability. Assessment points for multi-unit release can be calculated with the improved code, and the method of dose assessment for multi-unit release has been modified, so that the dose assessment result for a multi-unit site becomes more realistic by relieving excessive conservatism. Finally, as the accuracy of the calculation modules has been improved, the reliability of the dose assessment results has been strengthened.

  18. Improved Side Information Generation for Distributed Video Coding by Exploiting Spatial and Temporal Correlations

    Directory of Open Access Journals (Sweden)

    Ye Shuiming

    2009-01-01

    Full Text Available Distributed video coding (DVC) is a video coding paradigm allowing low complexity encoding for emerging applications such as wireless video surveillance. Side information (SI) generation is a key function in the DVC decoder, and plays a key role in determining the performance of the codec. This paper proposes an improved SI generation for DVC, which exploits both spatial and temporal correlations in the sequences. Partially decoded Wyner-Ziv (WZ) frames, based on initial SI by motion compensated temporal interpolation, are exploited to improve the performance of the whole SI generation. More specifically, an enhanced temporal frame interpolation is proposed, including motion vector refinement and smoothing, optimal compensation mode selection, and a new matching criterion for motion estimation. The improved SI technique is also applied to a new hybrid spatial and temporal error concealment scheme to conceal errors in WZ frames. Simulation results show that the proposed scheme can achieve up to 1.0 dB improvement in rate distortion performance in WZ frames for video with high motion, when compared to state-of-the-art DVC. In addition, both the objective and perceptual qualities of the corrupted sequences are significantly improved by the proposed hybrid error concealment scheme, outperforming both spatial and temporal concealments alone.

  19. Vectorization and improvement of nuclear codes. 3. DGR, STREAM V3.1, Cella, GGR

    Energy Technology Data Exchange (ETDEWEB)

    Nemoto, Toshiyuki; Eguchi, Norikuni; Watanabe, Hideo; Machida, Masahiko; Yokokawa, Mitsuo; Fujii, Minoru [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1995-01-01

    Four nuclear codes were vectorized and improved in order to realize high-speed performance on the VP2600 supercomputer at the Computing and Information Systems Center of JAERI in fiscal year 1993: the molecular dynamics simulation code DGR, which simulates irradiation damage in diamond crystals; the three-dimensional non-steady compressible fluid dynamics code STREAM V3.1; the two-dimensional fluid simulation code Cella, based on a cellular automaton model; and the molecular dynamics code GGR, which simulates irradiation damage in black carbon crystals. The speed-up ratios of the vectorized codes relative to scalar mode on the VP2600 are 2.8, 6.8-14.8, 15-16 and 1.23 for DGR, STREAM V3.1, Cella and GGR, respectively. In this report, we present the vectorization techniques, their effects, evaluations of the numerical results and the techniques used for the improvements. (author).

  20. Quality improvement of International Classification of Diseases, 9th revision, diagnosis coding in radiation oncology: single-institution prospective study at University of California, San Francisco.

    Science.gov (United States)

    Chen, Chien P; Braunstein, Steve; Mourad, Michelle; Hsu, I-Chow J; Haas-Kogan, Daphne; Roach, Mack; Fogh, Shannon E

    2015-01-01

    Accurate International Classification of Diseases (ICD) diagnosis coding is critical for patient care, billing purposes, and research endeavors. In this single-institution study, we evaluated our baseline ICD-9 (9th revision) diagnosis coding accuracy, identified the most common errors contributing to inaccurate coding, and implemented a multimodality strategy to improve radiation oncology coding. We prospectively studied ICD-9 coding accuracy in our radiation therapy-specific electronic medical record system. Baseline ICD-9 coding accuracy was obtained from a chart review targeting the ICD-9 coding accuracy of all patients treated at our institution between March and June of 2010. To improve performance, an educational session highlighted common coding errors, and a user-friendly software tool (RadOnc ICD Search, version 1.0) for coding radiation oncology-specific diagnoses was implemented. We then prospectively analyzed ICD-9 coding accuracy for all patients treated from July 2010 to June 2011, with the goal of maintaining 80% or higher coding accuracy. Data on coding accuracy were analyzed and fed back monthly to individual providers. Baseline coding accuracy for physicians was 463 of 661 (70%) cases. Only 46% of physicians had coding accuracy above 80%. The most common errors involved metastatic cases, in which primary or secondary site ICD-9 codes were either incorrect or missing, and special procedures such as stereotactic radiosurgery cases. After implementing our project, overall coding accuracy rose to 92% (range, 86%-96%). The median accuracy for all physicians was 93% (range, 77%-100%), with only 1 attending having accuracy below 80%. Incorrect primary and secondary ICD-9 codes in metastatic cases showed the most significant improvement (10% vs 2% after intervention). Identifying common coding errors and implementing both education and systems changes led to significantly improved coding accuracy. This quality assurance project highlights the potential problem

  1. The improvement of the heat transfer model for sodium-water reaction jet code

    International Nuclear Information System (INIS)

    Hashiguchi, Yoshirou; Yamamoto, Hajime; Kamoshida, Norio; Murata, Shuuichi

    2001-02-01

    For confirming a reasonable DBL (design base leak) for a steam generator (SG), it is necessary to evaluate the phenomena of the sodium-water reaction (SWR) in an actual steam generator realistically. The heat transfer model of the sodium-water reaction (SWR) jet code (LEAP-JET ver.1.40) was improved, and application analyses of the water injection tests were performed to confirm the validity of the code. In the code improvement, a heat transfer model between the fluid inside a tube and the tube wall was introduced in place of the previous model, which was a heat capacity model lumping together the heat capacities of the tube wall and the inside fluid. The fluid inside the heat exchange tube can now be treated as either water or sodium, and typical heat transfer correlations used in SG design were also introduced in the new heat transfer model. Further work was carried out to improve the stability of the calculation over long calculation times. Test calculations with the improved code (LEAP-JET ver.1.50) were carried out for the conditions of the SWAT-1R·Run-HT-2 test. It was confirmed that the predicted SWR jet behavior and the influence of the heat transfer model on the results are reasonable. In addition, the user's manual of the improved code (LEAP-JET ver.1.50) was revised with an additional I/O manual and explanations of the heat transfer model and the new variable names. (author)

  2. Improved Iterative Hard- and Soft-Reliability Based Majority-Logic Decoding Algorithms for Non-Binary Low-Density Parity-Check Codes

    Science.gov (United States)

    Xiong, Chenrong; Yan, Zhiyuan

    2014-10-01

    Non-binary low-density parity-check (LDPC) codes have some advantages over their binary counterparts, but unfortunately their decoding complexity is a significant challenge. The iterative hard- and soft-reliability based majority-logic decoding algorithms are attractive for non-binary LDPC codes, since they involve only finite field additions and multiplications as well as integer operations and hence have significantly lower complexity than other algorithms. In this paper, we propose two improvements to the majority-logic decoding algorithms. Instead of the accumulation of reliability information in the existing majority-logic decoding algorithms, our first improvement is a new reliability information update. The new update not only results in better error performance and fewer iterations on average, but also further reduces computational complexity. Since existing majority-logic decoding algorithms tend to have a high error floor for codes whose parity check matrices have low column weights, our second improvement is a re-selection scheme, which leads to much lower error floors, at the expense of more finite field operations and integer operations, by identifying periodic points, re-selecting intermediate hard decisions, and changing reliability information.
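
    The paper's algorithms operate over GF(q) and accumulate reliability information, which is too involved to reproduce here. As a rough illustration of the check-voting and flipping structure that majority-logic decoding builds on, the following is only a simplified binary hard-decision bit-flipping decoder applied to a small Hamming code; it is an assumed simplification, not the proposed algorithm.

        # Simplified binary hard-decision bit-flipping decoder: flip all bits that
        # participate in the largest number of unsatisfied parity checks.
        def bit_flip_decode(H, y, max_iter=50):
            """H: parity-check matrix as list of 0/1 rows; y: received hard bits."""
            x = list(y)
            n = len(x)
            for _ in range(max_iter):
                syndromes = [sum(h[j] & x[j] for j in range(n)) % 2 for h in H]
                if not any(syndromes):
                    return x, True                      # all checks satisfied
                votes = [sum(s for h, s in zip(H, syndromes) if h[j]) for j in range(n)]
                worst = max(votes)
                if worst == 0:
                    break
                x = [b ^ 1 if v == worst else b for b, v in zip(x, votes)]
            return x, False

        # (7,4) Hamming code parity-check matrix; all-zero codeword with one bit error.
        H = [[1, 0, 1, 0, 1, 0, 1],
             [0, 1, 1, 0, 0, 1, 1],
             [0, 0, 0, 1, 1, 1, 1]]
        received = [0, 0, 0, 0, 0, 1, 0]
        decoded, ok = bit_flip_decode(H, received)
        print(decoded, ok)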

  3. Evaluating the Coding and Workload Accounting Improvement Initiative of Madigan Army Medical Center

    National Research Council Canada - National Science Library

    Bewley, Lee W; Bender, Brian J

    2007-01-01

    ... documentation, provider coding accuracy and education, and clinic electronic medical record (AHLTA) usage. The desired end state of the CWAI is improved medical documentation and coding accuracy at MAMC...

  4. Improvement of the computing speed of the FBR fuel pin bundle deformation analysis code 'BAMBOO'

    International Nuclear Information System (INIS)

    Ito, Masahiro; Uwaba, Tomoyuki

    2005-04-01

    JNC has developed a coupled analysis system consisting of the fuel pin bundle deformation analysis code 'BAMBOO' and the thermal hydraulics analysis code 'ASFRE-IV' for the purpose of evaluating the integrity of a subassembly under the BDI condition. This coupled analysis took much computation time because it requires convergent calculations to obtain numerically stationary solutions for the thermal and mechanical behaviors. We improved the computation time of the BAMBOO code analysis to make the coupled analysis practicable. 'BAMBOO' is a FEM code, and as such its matrix calculations consume a large memory area to temporarily store intermediate results in the solution of simultaneous linear equations. The code used the hard disk drive (HDD) as a virtual memory area to save the random access memory (RAM) of the computer. However, the use of the HDD increased the computation time because input/output (I/O) processing with the HDD took much time in data accesses. We improved the code so that it performs I/O processing only in RAM during the matrix calculations and runs efficiently on high-performance computers. This improvement considerably increased the CPU occupation rate during the simulation and reduced the total simulation time of the BAMBOO code to about one-seventh of that before the improvement. (author)

  5. Recent Improvements to the IMPACT-T Parallel Particle Tracking Code

    International Nuclear Information System (INIS)

    Qiang, J.; Pogorelov, I.V.; Ryne, R.

    2006-01-01

    The IMPACT-T code is a parallel three-dimensional quasi-static beam dynamics code for modeling high brightness beams in photoinjectors and RF linacs. Developed under the US DOE Scientific Discovery through Advanced Computing (SciDAC) program, it includes several key features including a self-consistent calculation of 3D space-charge forces using a shifted and integrated Green function method, multiple energy bins for beams with large energy spread, and models for treating RF standing wave and traveling wave structures. In this paper, we report on recent improvements to the IMPACT-T code including modeling traveling wave structures, short-range transverse and longitudinal wakefields, and longitudinal coherent synchrotron radiation through bending magnets

  6. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Burr Alister

    2009-01-01

    Full Text Available Abstract This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed, and the turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.

  7. A simple clinical coding strategy to improve recording of child maltreatment concerns: an audit study.

    Science.gov (United States)

    McGovern, Andrew Peter; Woodman, Jenny; Allister, Janice; van Vlymen, Jeremy; Liyanage, Harshana; Jones, Simon; Rafi, Imran; de Lusignan, Simon; Gilbert, Ruth

    2015-01-14

    Recording concerns about child maltreatment, including minor concerns, is recommended by the General Medical Council (GMC) and National Institute for Health and Clinical Excellence (NICE) but there is evidence of substantial under-recording. To determine whether a simple coding strategy improved recording of maltreatment-related concerns in electronic primary care records. Clinical audit of rates of maltreatment-related coding before (January 2010-December 2011) and after (January-December 2012) implementation of a simple coding strategy in 11 English family practices. The strategy included encouraging general practitioners to use, always and as a minimum, the Read code 'Child is cause for concern'. A total of 25,106 children aged 0-18 years were registered with these practices. We also undertook a qualitative service evaluation to investigate barriers to recording. Outcomes were recording of 1) any maltreatment-related codes, 2) child protection proceedings and 3) the child being a cause for concern. We found increased recording of any maltreatment-related code (rate ratio 1.4; 95% CI 1.1-1.6), child protection procedures (RR 1.4; 95% CI 1.1-1.6) and cause for concern (RR 2.5; 95% CI 1.8-3.4) after implementation of the coding strategy. Clinicians cited the simplicity of the coding strategy as the most important factor assisting implementation. This simple coding strategy improved clinicians' recording of maltreatment-related concerns in a small sample of practices with some 'buy-in'. Further research should investigate how recording can best support the doctor-patient relationship. HOW THIS FITS IN: Recording concerns about child maltreatment, including minor concerns, is recommended by the General Medical Council (GMC) and National Institute for Health and Clinical Excellence (NICE), but there is evidence of substantial under-recording. We describe a simple clinical coding strategy that helped general practitioners to improve recording of maltreatment-related concerns

  8. A simple clinical coding strategy to improve recording of child maltreatment concerns: an audit study

    Directory of Open Access Journals (Sweden)

    Andrew Peter McGovern

    2015-01-01

    Full Text Available Background Recording concerns about child maltreatment, including minor concerns, is recommended by the General Medical Council (GMC) and National Institute for Health and Clinical Excellence (NICE) but there is evidence of substantial under-recording. Aim To determine whether a simple coding strategy improved recording of maltreatment-related concerns in electronic primary care records. Design and Setting Clinical audit of rates of maltreatment-related coding before (January 2010–December 2011) and after (January–December 2012) implementation of a simple coding strategy in 11 English family practices. The strategy included encouraging general practitioners to use, always and as a minimum, the Read code ‘Child is cause for concern’. A total of 25,106 children aged 0–18 years were registered with these practices. We also undertook a qualitative service evaluation to investigate barriers to recording. Method Outcomes were recording of 1) any maltreatment-related codes, 2) child protection proceedings and 3) the child being a cause for concern. Results We found increased recording of any maltreatment-related code (rate ratio 1.4; 95% CI 1.1–1.6), child protection procedures (RR 1.4; 95% CI 1.1–1.6) and cause for concern (RR 2.5; 95% CI 1.8–3.4) after implementation of the coding strategy. Clinicians cited the simplicity of the coding strategy as the most important factor assisting implementation. Conclusion This simple coding strategy improved clinicians’ recording of maltreatment-related concerns in a small sample of practices with some ‘buy-in’. Further research should investigate how recording can best support the doctor–patient relationship. How this fits in Recording concerns about child maltreatment, including minor concerns, is recommended by the General Medical Council (GMC) and National Institute for Health and Clinical Excellence (NICE), but there is evidence of substantial under-recording. We describe a simple clinical coding

  9. An improved thermal model for the computer code NAIAD

    International Nuclear Information System (INIS)

    Rainbow, M.T.

    1982-12-01

    An improved thermal model, based on the concept of heat slabs, has been incorporated as an option into the thermal hydraulic computer code NAIAD. The heat slabs are one-dimensional thermal conduction models with temperature independent thermal properties which may be internal and/or external to the fluid. Thermal energy may be added to or removed from the fluid via heat slabs and passed across the external boundary of external heat slabs at a rate which is a linear function of the external surface temperatures. The code input for the new option has been restructured to simplify data preparation. A full description of current input requirements is presented

  10. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Lei Ye

    2009-01-01

    Full Text Available This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed, and the turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.
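
    The subband-adaptive idea itself is easy to state in code: estimate the SNR of each subband and map it to a modulation scheme and turbo code rate through switching thresholds. The sketch below uses the five modulation schemes and two code rates named in the abstract, but the threshold values and the SNR estimates are placeholders, not those derived in the paper.

        # Illustrative per-subband mode selection by SNR thresholds (placeholder values).
        MODES = [  # (min SNR in dB, modulation, bits/symbol, turbo code rate)
            (-float("inf"), "no transmission", 0, None),
            (3.0,  "BPSK",  1, "1/3"),
            (6.0,  "QPSK",  2, "1/3"),
            (10.0, "8AMPM", 3, "1/2"),
            (14.0, "16QAM", 4, "1/2"),
            (20.0, "64QAM", 6, "1/2"),
        ]

        def select_mode(snr_db):
            chosen = MODES[0]
            for mode in MODES:
                if snr_db >= mode[0]:
                    chosen = mode
            return chosen

        subband_snrs = [2.1, 7.5, 12.3, 22.8]   # example per-subband SNR estimates
        for k, snr in enumerate(subband_snrs):
            _, mod, bits, rate = select_mode(snr)
            print(f"subband {k}: SNR {snr:5.1f} dB -> {mod}, {bits} bit/symbol, rate {rate}")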

  11. GAMERA - The New Magnetospheric Code

    Science.gov (United States)

    Lyon, J.; Sorathia, K.; Zhang, B.; Merkin, V. G.; Wiltberger, M. J.; Daldorff, L. K. S.

    2017-12-01

    The Lyon-Fedder-Mobarry (LFM) code has been a main-line magnetospheric simulation code for 30 years. The code base, designed in the age of memory-to-memory vector machines, is still in wide use for science production but needs upgrading to ensure long-term sustainability. In this presentation, we will discuss our recent efforts to update and improve that code base and also highlight some recent results. The new project, GAMERA (Grid Agnostic MHD for Extended Research Applications), has kept the original design characteristics of the LFM and made significant improvements. The original design included high-order numerical differencing with very aggressive limiting, the ability to use arbitrary, but logically rectangular, grids, and maintenance of div B = 0 through the use of the Yee grid. Significant improvements include high-order upwinding and a non-clipping limiter. One other improvement with wider applicability is an improved averaging technique for the singularities in polar and spherical grids. The new code adopts a hybrid structure: multi-threaded OpenMP with an overarching MPI layer for large-scale and coupled applications. The MPI layer uses a combination of standard MPI and the Global Array Toolkit from PNL to provide a lightweight mechanism for coupling codes together concurrently. The single-processor code is highly efficient and can run magnetospheric simulations at the default CCMC resolution faster than real time on a MacBook Pro. We have run the new code through the Athena suite of tests, and the results compare favorably with the codes available to the astrophysics community. LFM/GAMERA has been applied to many different situations ranging from the inner and outer heliosphere to the magnetospheres of Venus, the Earth, Jupiter and Saturn. We present example results for the Earth's magnetosphere, including a coupled ring current (RCM), the magnetospheres of Jupiter and Saturn, and the inner heliosphere.

  12. Research on the improvement of nuclear safety -Improvement of level 1 PSA computer code package-

    International Nuclear Information System (INIS)

    Park, Chang Kyoo; Kim, Tae Woon; Kim, Kil Yoo; Han, Sang Hoon; Jung, Won Dae; Jang, Seung Chul; Yang, Joon Un; Choi, Yung; Sung, Tae Yong; Son, Yung Suk; Park, Won Suk; Jung, Kwang Sub; Kang Dae Il; Park, Jin Heui; Hwang, Mi Jung; Hah, Jae Joo

    1995-07-01

    This year is the third year of the Government-sponsored mid- and long-term nuclear power technology development project. The scope of this subproject, titled 'The improvement of level-1 PSA computer codes', is divided into three main activities: (1) methodology development in underdeveloped fields such as risk assessment technology for plant shutdown and low power situations, (2) computer code package development for level-1 PSA, and (3) applications of new technologies to reactor safety assessment. First, in the area of shutdown risk assessment technology development, the plant outage experiences of domestic plants are reviewed and plant operating states (POS) are defined. A sample core damage frequency is estimated for the overdraining event during RCS low water inventory, i.e. mid-loop operation. Human reliability analysis and thermal hydraulic support analysis are identified as being needed to reduce uncertainty. Two design improvement alternatives are evaluated using PSA techniques for the mid-loop operation situation: one is the use of the containment spray system as a backup to the shutdown cooling system, and the other is the installation of two independent level indication systems. Procedure change is identified as the more preferable option compared with hardware modification from the core damage frequency point of view. Next, the level-1 PSA code KIRAP is converted to the PC-Windows environment. For the improvement of efficiency in performing PSA, a fast cutset generation algorithm and an analytical technique for handling logical loops in fault tree modeling are developed. 48 figs, 15 tabs, 59 refs. (Author)

  13. Research on the improvement of nuclear safety -Improvement of level 1 PSA computer code package-

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Kyoo; Kim, Tae Woon; Kim, Kil Yoo; Han, Sang Hoon; Jung, Won Dae; Jang, Seung Chul; Yang, Joon Un; Choi, Yung; Sung, Tae Yong; Son, Yung Suk; Park, Won Suk; Jung, Kwang Sub; Kang Dae Il; Park, Jin Heui; Hwang, Mi Jung; Hah, Jae Joo [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    This year is the third year of the Government-sponsored mid- and long-term nuclear power technology development project. The scope of this subproject, titled 'The improvement of level-1 PSA computer codes', is divided into three main activities: (1) methodology development in underdeveloped fields such as risk assessment technology for plant shutdown and low power situations, (2) computer code package development for level-1 PSA, and (3) applications of new technologies to reactor safety assessment. First, in the area of shutdown risk assessment technology development, the plant outage experiences of domestic plants are reviewed and plant operating states (POS) are defined. A sample core damage frequency is estimated for the overdraining event during RCS low water inventory, i.e. mid-loop operation. Human reliability analysis and thermal hydraulic support analysis are identified as being needed to reduce uncertainty. Two design improvement alternatives are evaluated using PSA techniques for the mid-loop operation situation: one is the use of the containment spray system as a backup to the shutdown cooling system, and the other is the installation of two independent level indication systems. Procedure change is identified as the more preferable option compared with hardware modification from the core damage frequency point of view. Next, the level-1 PSA code KIRAP is converted to the PC-Windows environment. For the improvement of efficiency in performing PSA, a fast cutset generation algorithm and an analytical technique for handling logical loops in fault tree modeling are developed. 48 figs, 15 tabs, 59 refs. (Author).

  14. Development and improvement of safety analysis code for geological disposal

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    In order to confirm the long-term safety of geological disposal, a probabilistic safety assessment code and other analysis codes, which can evaluate the probability of each event and its influence on the engineered and natural barriers, were introduced. We confirmed the basic functions of those codes and studied the relation between those functions and the FEP/PID items that should be taken into consideration in the safety assessment. We are planning to develop a 'Nuclide Migration Assessment System' for the purpose of improving the efficiency of assessment work, preventing human error in the analyses, and assuring the quality of the analysis environment and of the analysis work for the safety assessment. As the first step, we defined the system requirements and decided the system composition and the functions to be implemented in it based on those requirements. (author)

  15. Improvement of calculation method for temperature coefficient of HTTR by neutronics calculation code based on diffusion theory. Analysis for temperature coefficient by SRAC code system

    International Nuclear Information System (INIS)

    Goto, Minoru; Takamatsu, Kuniyoshi

    2007-03-01

    The HTTR temperature coefficients required for the core dynamics calculations had previously been calculated from HTTR core calculation results obtained with a diffusion code and then corrected using core calculation results from the Monte Carlo code MVP. This calculation method for the temperature coefficients was considered to have some issues to be improved, so the method was revised so that the temperature coefficients can be obtained without corrections by the Monte Carlo code. Specifically, the lattice model that had been used for the calculation of the temperature coefficients was revised from the point of view of the neutron spectrum obtained in the lattice calculations. The HTTR core calculations were then performed with the diffusion code using group constants generated by lattice calculations with the improved lattice model; both the core and lattice calculations were performed with the SRAC code system. The HTTR core dynamics calculation was performed with the temperature coefficient obtained from these core calculation results. As a result, the core dynamics calculation showed good agreement with the experimental data, and a valid temperature coefficient could be calculated with the diffusion code alone, without corrections by the Monte Carlo code. (author)
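
    For readers unfamiliar with the quantity being discussed, an isothermal temperature coefficient of reactivity can be formed from two multiplication factors calculated at different temperatures, as in the short sketch below; the k-eff and temperature values are hypothetical, not HTTR results.

        # Generic definition used to turn two core calculations at different fuel
        # temperatures into a temperature coefficient of reactivity (hypothetical data).
        def reactivity(k_eff):
            return (k_eff - 1.0) / k_eff

        def temperature_coefficient(k1, t1, k2, t2):
            """Temperature coefficient in (dk/k) per K between states (k1, t1) and (k2, t2)."""
            return (reactivity(k2) - reactivity(k1)) / (t2 - t1)

        k_cold, t_cold = 1.0350, 300.0   # hypothetical cold-state calculation
        k_hot,  t_hot  = 1.0215, 600.0   # hypothetical hot-state calculation
        alpha = temperature_coefficient(k_cold, t_cold, k_hot, t_hot)
        print(f"temperature coefficient ~ {alpha:.2e} (dk/k)/K")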

  16. Methods tuned on the physical problem. A way to improve numerical codes

    International Nuclear Information System (INIS)

    Ixaru, L.Gr.

    2010-01-01

    We consider the question of how numerical methods tuned to the physical problem can contribute to enhancing the performance of the codes. We illustrate this with two simple cases: the solution of the time-independent one-dimensional Schroedinger equation, and the computation of integrals with oscillatory integrands. In both cases the tuned versions bring a massive gain in accuracy at negligible extra cost. We present two simple problems where successive levels of tuning significantly enhance the accuracy at negligible extra cost. These problems should be seen only as illustrations of how codes can be improved; we must also mention that in many cases tuned versions still have to be developed. As one suggestion, quadrature formulae exist which involve the integrand and a number of its successive derivatives, but no formula is available when some of these derivatives are missing, for example when y and y'' are available but y' is not. A direct application is the case in which the integrand involves the solution of the Schroedinger equation by the Numerov method. (author)
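
    As a concrete reference point for the Schroedinger example, the standard (untuned) Numerov scheme for y'' = f(x)y can be written in a few lines. The sketch below applies it to the harmonic oscillator ground state as a check; it represents only the baseline that a tuned version would refine, and the step size and interval are arbitrary choices.

        # Standard Numerov three-point integration of y'' = f(x) * y.
        import math

        def numerov(f, x0, h, n_steps, y0, y1):
            xs = [x0, x0 + h]
            ys = [y0, y1]
            c = h * h / 12.0
            for n in range(1, n_steps):
                x_next = xs[n] + h
                y_next = (2.0 * ys[n] * (1.0 + 5.0 * c * f(xs[n]))
                          - ys[n - 1] * (1.0 - c * f(xs[n - 1]))) / (1.0 - c * f(x_next))
                xs.append(x_next)
                ys.append(y_next)
            return xs, ys

        # Harmonic oscillator ground state (hbar = m = omega = 1): V = x^2/2, E = 1/2,
        # exact solution y(x) = exp(-x^2/2), so y(0) should come out close to 1.
        E = 0.5
        f = lambda x: 2.0 * (0.5 * x * x - E)   # y'' = 2 (V - E) y
        h, x0 = 0.01, -6.0
        xs, ys = numerov(f, x0, h, 1200,
                         math.exp(-x0 * x0 / 2.0), math.exp(-(x0 + h) ** 2 / 2.0))
        i0 = min(range(len(xs)), key=lambda i: abs(xs[i]))   # grid point closest to x = 0
        print(f"y(0) from Numerov: {ys[i0]:.6f} (exact: 1.000000)")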

  17. Improvement of Level-1 PSA computer code package -A study for nuclear safety improvement-

    International Nuclear Information System (INIS)

    Park, Chang Kyu; Kim, Tae Woon; Ha, Jae Joo; Han, Sang Hoon; Cho, Yeong Kyun; Jeong, Won Dae; Jang, Seung Cheol; Choi, Young; Seong, Tae Yong; Kang, Dae Il; Hwang, Mi Jeong; Choi, Seon Yeong; An, Kwang Il

    1994-07-01

    This year is the second year of the Government-sponsored Mid- and Long-Term Nuclear Power Technology Development Project. The scope of this subproject, titled 'The Improvement of Level-1 PSA Computer Codes', is divided into three main activities: (1) methodology development in under-developed fields such as risk assessment technology for plant shutdown and external events, (2) computer code package development for Level-1 PSA, and (3) applications of new technologies to reactor safety assessment. First, in the area of PSA methodology development, foreign PSA reports on shutdown and external events have been reviewed and various PSA methodologies have been compared. The Level-1 PSA code KIRAP and the CCF analysis code COCOA have been converted from KOS to Windows. A human reliability database has also been established this year. In the area of new technology applications, fuzzy set theory and entropy theory are used to estimate component life and to develop a new measure of uncertainty importance. Finally, in the field of applying PSA techniques to reactor regulation, a strategic study on the development of the dynamic risk management tool PEPSI has been performed, and the inspection and test priorities of motor-operated valves have been determined based on risk importance worths. (Author)

  18. An improved method for storing and retrieving tabulated data in a scalar Monte Carlo code

    International Nuclear Information System (INIS)

    Hollenbach, D.F.; Reynolds, K.H.; Dodds, H.L.; Landers, N.F.; Petrie, L.M.

    1990-01-01

    The KENO-Va code is a production-level criticality safety code used to calculate the k-eff of a system. The code is stochastic in nature, using a Monte Carlo algorithm to track individual particles one at a time through the system. The advent of computers with vector processors has generated interest in improving KENO-Va to take advantage of the potential speed-up associated with these new processors. Unfortunately, the original Monte Carlo algorithm and the method of storing and retrieving cross-section data are not adaptable to vector processing. This paper discusses an alternate method for storing and retrieving data that not only is readily vectorizable but also improves the efficiency of the current scalar code

  19. Improved diffusion coefficients generated from Monte Carlo codes

    International Nuclear Information System (INIS)

    Herman, B. R.; Forget, B.; Smith, K.; Aviles, B. N.

    2013-01-01

    Monte Carlo codes are becoming more widely used for reactor analysis. Some of these applications involve the generation of diffusion theory parameters including macroscopic cross sections and diffusion coefficients. Two approximations used to generate diffusion coefficients are assessed using the Monte Carlo code MC21. The first is the method of homogenization; whether to weight either fine-group transport cross sections or fine-group diffusion coefficients when collapsing to few-group diffusion coefficients. The second is a fundamental approximation made to the energy-dependent P1 equations to derive the energy-dependent diffusion equations. Standard Monte Carlo codes usually generate a flux-weighted transport cross section with no correction to the diffusion approximation. Results indicate that this causes noticeable tilting in reconstructed pin powers in simple test lattices with L2 norm error of 3.6%. This error is reduced significantly to 0.27% when weighting fine-group diffusion coefficients by the flux and applying a correction to the diffusion approximation. Noticeable tilting in reconstructed fluxes and pin powers was reduced when applying these corrections. (authors)
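
    The two collapse options compared in the first approximation can be written out directly. The sketch below does so for hypothetical fine-group fluxes and transport cross sections (not MC21 output), using the elementary relation D_g = 1/(3*Sigma_tr,g); it illustrates only why the two weightings give different few-group diffusion coefficients.

        # Two ways of collapsing fine-group data to a few-group diffusion coefficient.
        def collapse_transport_xs(phis, sigma_tr):
            """Flux-weight the transport cross section, then take D = 1/(3*Sigma_tr)."""
            sig = sum(p * s for p, s in zip(phis, sigma_tr)) / sum(phis)
            return 1.0 / (3.0 * sig)

        def collapse_diffusion_coeff(phis, sigma_tr):
            """Flux-weight the fine-group diffusion coefficients D_g = 1/(3*Sigma_tr,g)."""
            d = [1.0 / (3.0 * s) for s in sigma_tr]
            return sum(p * dg for p, dg in zip(phis, d)) / sum(phis)

        phi      = [0.2, 0.5, 0.3]      # hypothetical fine-group fluxes
        sigma_tr = [0.30, 0.45, 0.80]   # hypothetical transport cross sections (1/cm)
        print("D from collapsed Sigma_tr:", round(collapse_transport_xs(phi, sigma_tr), 4))
        print("D from collapsed D_g     :", round(collapse_diffusion_coeff(phi, sigma_tr), 4))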

  20. Iterative List Decoding of Concatenated Source-Channel Codes

    Directory of Open Access Journals (Sweden)

    Hedayat Ahmadreza

    2005-01-01

    Full Text Available Whenever variable-length entropy codes are used in the presence of a noisy channel, any channel errors will propagate and cause significant harm. Despite using channel codes, some residual errors always remain, whose effect will get magnified by error propagation. Mitigating this undesirable effect is of great practical interest. One approach is to use the residual redundancy of variable length codes for joint source-channel decoding. In this paper, we improve the performance of residual redundancy source-channel decoding via an iterative list decoder made possible by a nonbinary outer CRC code. We show that the list decoding of VLC's is beneficial for entropy codes that contain redundancy. Such codes are used in state-of-the-art video coders, for example. The proposed list decoder improves the overall performance significantly in AWGN and fully interleaved Rayleigh fading channels.

  1. An improved Zircaloy-steam reaction model for use with the March 2 (Meltdown Accident Response Characteristics) code

    International Nuclear Information System (INIS)

    Manahan, M.P.

    1983-01-01

    An improved Zircaloy-steam oxidation reaction model has been incorporated into the MARCH 2 code which includes: (1) improved physical modeling for solid-state process oxidation, (2) improved geometric modeling for gaseous diffusion oxidation, (3) chemisorption/dissociation retardation due to high hydrogen partial pressures, and (4) laminar and turbulent flow conditions. Several accident sequences have been analyzed using the model, and for the sequences considered, the results indicate that the integrated and averaged variables are not significantly altered for the current level of fuel modeling; however, localized variables such as nodal temperature and oxide thickness are affected

  2. Distributed Video Coding: Iterative Improvements

    DEFF Research Database (Denmark)

    Luong, Huynh Van

    Nowadays, emerging applications such as wireless visual sensor networks and wireless video surveillance are requiring lightweight video encoding with high coding efficiency and error-resilience. Distributed Video Coding (DVC) is a new coding paradigm which exploits the source statistics...... and noise modeling and also learn from the previous decoded Wyner-Ziv (WZ) frames, side information and noise learning (SING) is proposed. The SING scheme introduces an optical flow technique to compensate the weaknesses of the block based SI generation and also utilizes clustering of DCT blocks to capture...... cross band correlation and increase local adaptivity in noise modeling. During decoding, the updated information is used to iteratively reestimate the motion and reconstruction in the proposed motion and reconstruction reestimation (MORE) scheme. The MORE scheme not only reestimates the motion vectors...

  3. Source Code Stylometry Improvements in Python

    Science.gov (United States)

    2017-12-14

    [List-of-figures fragment: Fig. 2, corresponding abstract syntax tree from the de-anonymizing programmers' paper (Caliskan-Islam et al. 2015).] Just as a person can be identified via their handwriting or an author identified by their style or prose, programmers can be identified by their code. Provided a labelled training set of code samples (example in Fig. 1), the techniques used in stylometry can identify the author of a piece of code or even

  4. BetaShape: A new code for improved analytical calculations of beta spectra

    Directory of Open Access Journals (Sweden)

    Mougeot Xavier

    2017-01-01

    Full Text Available The new code BetaShape has been developed in order to improve the nuclear data related to beta decays. An analytical model was adopted, except for the relativistic electron wave functions, in order to ensure fast calculations. Output quantities are mean energies, log ft values, and beta and neutrino spectra for single and multiple transitions. The uncertainties from the input parameters, read from an ENSDF file, are propagated. A database of experimental shape factors is included. A comparison over the entire ENSDF database with the standard code currently used in nuclear data evaluations shows consistent results for the vast majority of the transitions and highlights the improvements that can be expected with the use of BetaShape.

  5. Accuracy improvement of SPACE code using the optimization for CHF subroutine

    International Nuclear Information System (INIS)

    Yang, Chang Keun; Kim, Yo Han; Park, Jong Eun; Ha, Sang Jun

    2010-01-01

    Typically, a subroutine to calculate the critical heat flux (CHF) is included in codes for the safety analysis of nuclear power plants. The CHF subroutine evaluates the CHF for arbitrary conditions (temperature, pressure, flow rate, power, etc.). When a safety analysis of a nuclear power plant is performed, the CHF is one of the most important parameters. However, the subroutines used in most codes, such as the Biasi method, predict values that differ from experimental data, and most CHF subroutines are applicable only within their specified ranges of pressure, mass flow, void fraction, etc. Even when an accurate CHF subroutine is used in a high-quality nuclear safety analysis code, there is no assurance that the values it predicts are acceptable outside its application area. To overcome this difficulty, various approaches to estimating the CHF were examined during the development of the SPACE code, and the Six Sigma technique was adopted for the examination, as described in this study. The objective of this study is to improve the CHF prediction accuracy of the nuclear power plant safety analysis code using a CHF database and the Six Sigma technique. Through the study, it was concluded that the Six Sigma technique is useful for quantifying the deviation of the predicted values from the experimental data, and that the CHF prediction method implemented in the SPACE code has good predictive capability compared with other methods
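
    The kind of deviation statistic such an exercise rests on can be illustrated with predicted-to-measured (P/M) ratios. The sketch below computes their mean and standard deviation for placeholder data and placeholder predictions; it is not the SPACE code's CHF model or its experimental database.

        # Quantify how a CHF predictor deviates from an experimental database using
        # predicted-to-measured (P/M) ratio statistics (all values hypothetical).
        import math

        def pm_statistics(predicted, measured):
            ratios = [p / m for p, m in zip(predicted, measured)]
            n = len(ratios)
            mean = sum(ratios) / n
            std = math.sqrt(sum((r - mean) ** 2 for r in ratios) / (n - 1))
            return mean, std

        measured_chf  = [1.95, 2.40, 3.10, 1.60, 2.85]   # MW/m^2, hypothetical
        predicted_chf = [1.80, 2.55, 2.95, 1.75, 2.60]   # MW/m^2, hypothetical
        mean, std = pm_statistics(predicted_chf, measured_chf)
        print(f"P/M mean = {mean:.3f}, P/M std = {std:.3f}")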

  6. Coding of significant comorbidities and complications for stroke in rehabilitation.

    Science.gov (United States)

    Murray, Joanne; Pfeiffer, Rhonda; Scholten, Ingrid

    2017-09-01

    Comorbidities and complications of stroke have implications for level of care and hospital resources. It is critical, therefore, that hospital morbidity data accurately reflect the prevalence of these additional diagnoses. This study aimed to measure and describe the concordance between stroke clinicians/researchers and medical record coders when recording stroke and related diagnoses. Diagnoses recorded prospectively, according to defined criteria by a clinical research team, were compared with the coding of stroke comorbidities and complications as per the Australian Coding Standards (ACS) from the separations of 100 inpatients from three rehabilitation facilities in South Australia. Percentage agreement, kappa coefficient, sensitivity and specificity values were calculated. Kappa coefficients for agreement of prospective diagnoses with coding ranged from 0.08 to 0.819. The diagnoses with the highest agreement were stroke, aspiration pneumonia (nil cases), aphasia and dysphagia. The diagnoses with the lowest agreement were apraxia, cognitive impairment, constipation and dehydration. Not all stroke comorbidities are represented accurately in hospital morbidity datasets. Education of stroke clinicians about the current ACS may clarify expectations about medical record documentation for coding purposes which in turn may result in more accurate morbidity data and therefore costings for the rehabilitation sector.
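
    The agreement statistics reported in this study can be computed from a simple 2x2 table of coder versus reference-standard diagnoses. The sketch below shows the standard formulas with hypothetical counts, not the study's data.

        # Percentage agreement, Cohen's kappa, sensitivity and specificity from a
        # 2x2 table: tp/fp/fn/tn are counts of coder vs. reference-standard diagnoses.
        def agreement_stats(tp, fp, fn, tn):
            n = tp + fp + fn + tn
            po = (tp + tn) / n                                   # observed agreement
            p_yes = ((tp + fp) / n) * ((tp + fn) / n)
            p_no = ((fn + tn) / n) * ((fp + tn) / n)
            pe = p_yes + p_no                                    # chance agreement
            kappa = (po - pe) / (1 - pe)
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            return po, kappa, sensitivity, specificity

        po, kappa, sens, spec = agreement_stats(tp=18, fp=4, fn=12, tn=66)  # hypothetical
        print(f"agreement={po:.2f}, kappa={kappa:.2f}, "
              f"sensitivity={sens:.2f}, specificity={spec:.2f}")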

  7. UEP LT Codes with Intermediate Feedback

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Popovski, Petar; Østergaard, Jan

    2013-01-01

    We analyze a class of rateless codes, called Luby transform (LT) codes with unequal error protection (UEP). We show that while these codes successfully provide UEP, there is a significant price in terms of redundancy in the lower prioritized segments. We propose a modification with a single intermediate feedback message. Our analysis shows a dramatic improvement in the decoding performance of the lower prioritized segment....

  8. Validation of ICD-9-CM coding algorithm for improved identification of hypoglycemia visits

    Directory of Open Access Journals (Sweden)

    Lieberman Rebecca M

    2008-04-01

    Background: Accurate identification of hypoglycemia cases by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes will help to describe epidemiology, monitor trends, and propose interventions for this important complication in patients with diabetes. Prior hypoglycemia studies utilized incomplete search strategies and may be methodologically flawed. We sought to validate a new ICD-9-CM coding algorithm for accurate identification of hypoglycemia visits. Methods: This was a multicenter, retrospective cohort study using a structured medical record review at three academic emergency departments from July 1, 2005 to June 30, 2006. We prospectively derived a coding algorithm to identify hypoglycemia visits using ICD-9-CM codes (250.3, 250.8, 251.0, 251.1, 251.2, 270.3, 775.0, 775.6, and 962.3). We confirmed, by chart review, the hypoglycemia cases identified by the candidate ICD-9-CM codes during the study period. The case definition for hypoglycemia was a documented blood glucose ≤3.9 mmol/l or an emergency physician charted diagnosis of hypoglycemia. We evaluated the individual components and calculated the positive predictive value. Results: We reviewed 636 charts identified by the candidate ICD-9-CM codes and confirmed 436 (64%) cases of hypoglycemia by chart review. Diabetes with other specified manifestations (250.8), often excluded in prior hypoglycemia analyses, identified 83% of hypoglycemia visits, and unspecified hypoglycemia (251.2) identified 13% of hypoglycemia visits. The absence of any predetermined co-diagnosis codes improved the positive predictive value of code 250.8 from 62% to 92%, while excluding only 10 (2%) true hypoglycemia visits. Although prior analyses included only the first-listed ICD-9 code, more than one-quarter of identified hypoglycemia visits were outside this primary diagnosis field. Overall, the proposed algorithm had 89% positive predictive value (95% confidence interval, 86–92 for
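
    A minimal sketch of the kind of code-based case identification the abstract describes, assuming a per-visit list of ICD-9-CM codes; the candidate code set is taken from the abstract, while the predetermined co-diagnosis exclusion list for code 250.8 is a hypothetical placeholder.

        # Hedged sketch: flag candidate hypoglycemia visits from ICD-9-CM code lists.
        # The EXCLUSION set below is a placeholder; the study used a predetermined
        # list of co-diagnosis codes that is not reproduced in the abstract.
        CANDIDATE = {"250.3", "250.8", "251.0", "251.1", "251.2",
                     "270.3", "775.0", "775.6", "962.3"}
        EXCLUSION = {"250.1", "250.2"}  # hypothetical co-diagnosis codes

        def is_candidate_hypoglycemia_visit(visit_codes):
            """Return True if any listed ICD-9-CM code suggests a hypoglycemia visit."""
            codes = set(visit_codes)
            for code in codes & CANDIDATE:
                # 250.8 only counts when no predetermined co-diagnosis code appears
                if code == "250.8" and codes & EXCLUSION:
                    continue
                return True
            return False

        print(is_candidate_hypoglycemia_visit(["250.8", "401.9"]))    # True
        print(is_candidate_hypoglycemia_visit(["250.8", "250.1"]))    # False (excluded)
        print(is_candidate_hypoglycemia_visit(["401.9"]))             # False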

  9. An Improved BeiDou-2 Satellite-Induced Code Bias Estimation Method

    Directory of Open Access Journals (Sweden)

    Jingyang Fu

    2018-04-01

    Unlike GPS, GLONASS, Galileo and BeiDou-3, the code observations of BeiDou-2 (BDS) IGSO and MEO satellites are known to contain a code multipath bias (CMB) that originates at the satellite end and can exceed 1 m. In order to mitigate its adverse effects on absolute precise applications that use the code measurements, we propose in this paper an improved correction model to estimate the CMB. Unlike the traditional model, which treats the correction values as orbit-type dependent (estimating two sets of values for IGSO and MEO, respectively) and models the CMB as a piecewise linear function with an elevation node separation of 10°, we estimate the corrections for each BDS IGSO and MEO satellite individually on the one hand, and use a denser elevation node separation of 5° to model the CMB variations on the other. Institutions such as IGS-MGEX currently operate over 120 stations providing daily BDS observations. These large amounts of data provide adequate support for refining the CMB estimation satellite by satellite in our improved model. One month of BDS observations from MGEX is used to assess the performance of the improved CMB model by means of precise point positioning (PPP). Experimental results show that, for satellites on the same orbit type, obvious differences can be found in the CMB at the same node and frequency. Results show that the new correction model can improve the wide-lane (WL) ambiguity usage rate for WL fractional cycle bias estimation, shorten the WL and narrow-lane (NL) time to first fix (TTFF) in PPP ambiguity resolution (AR) and improve the PPP positioning accuracy. With our improved correction model, the usage of WL ambiguities is increased from 94.1% to 96.0%, and the WL and NL TTFF of PPP AR are shortened from 10.6 to 9.3 min and from 67.9 to 63.3 min, respectively, compared with the traditional correction model. In addition, both the traditional and improved CMB models have
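
    As an illustration of the elevation-dependent, piecewise-linear correction strategy described above (with the denser 5° node spacing), the sketch below interpolates a per-satellite correction table and applies it to a code observation; the satellite IDs and node values are invented placeholders, not estimates from the paper.

        # Hedged sketch: apply an elevation-dependent code multipath bias (CMB)
        # correction, modelled as a piecewise linear function with 5-degree nodes.
        # The correction tables here are invented placeholders for illustration.
        import numpy as np

        ELEV_NODES_DEG = np.arange(0, 91, 5)          # 0, 5, ..., 90 degrees

        # per-satellite, per-frequency node corrections in metres (placeholder values)
        CMB_TABLE = {
            ("C06", "B1"): 0.3 * np.cos(np.radians(ELEV_NODES_DEG)),
            ("C14", "B1"): 0.5 * np.cos(np.radians(ELEV_NODES_DEG)) ** 2,
        }

        def corrected_code(pseudorange_m, sat, freq, elevation_deg):
            """Subtract the interpolated CMB correction from a code observation."""
            nodes = CMB_TABLE[(sat, freq)]
            cmb = np.interp(elevation_deg, ELEV_NODES_DEG, nodes)  # piecewise linear
            return pseudorange_m - cmb

        print(corrected_code(21_345_678.123, "C06", "B1", elevation_deg=37.2))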

  10. Developmental assessment of the SCDAP/RELAP5 code

    International Nuclear Information System (INIS)

    Harvego, E.A.; Slefken, L.J.; Coryell, E.W.

    1997-01-01

    The development and assessment of the late-phase damage progression models in the current version (designated MOD3.2) of the SCDAP/RELAP5 code are described. The SCDAP/RELAP5 code is being developed at the Idaho National Engineering and Environmental Laboratory under the primary sponsorship of the US Nuclear Regulatory Commission (NRC) to provide best-estimate transient simulations of light water reactor coolant systems (RCS) during severe accident conditions. Recent modeling improvements made to the MOD3.2 version of the code include (1) molten pool formation and heat up, including the transient start-up of natural circulation heat transfer, (2) in-core molten pool thermal-mechanical crust failure, (3) the melting and relocation of upper plenum structures, and (4) improvements in the modeling of lower plenum debris behavior and the potential for failure of the lower head. Finally, to eliminate abrupt transitions between core damage states and provide more realistic predictions of late phase accident progression phenomena, a transition smoothing methodology was developed and implemented that results in the calculation of a gradual transition from an intact core geometry through the different core damage states leading to molten pool formation. A wide range of experiments and modeling tools were used to assess the capabilities of MOD3.2. The results of the SCDAP/RELAP5/MOD3.2 assessment indicate that modeling improvements have significantly enhanced the code capabilities and performance in several areas compared to the earlier code version. New models for transition smoothing between core damage states, and modeling improvements/additions for cladding oxide failure, molten pool behavior, and molten pool crust failure have significantly improved the code usability for a wide range of applications and have significantly improved the prediction of hydrogen production, molten pool melt mass and core melt relocation time

  11. Code Help: Can This Unique State Regulatory Intervention Improve Emergency Department Crowding?

    Science.gov (United States)

    Michael, Sean S; Broach, John P; Kotkowski, Kevin A; Brush, D Eric; Volturo, Gregory A; Reznek, Martin A

    2018-05-01

    Emergency department (ED) crowding adversely affects multiple facets of high-quality care. The Commonwealth of Massachusetts mandates specific hospital action plans to reduce ED boarding via a mechanism termed "Code Help." Because implementation appears inconsistent even when hospital conditions should have triggered its activation, we hypothesized that compliance with the Code Help policy would be associated with reduction in ED boarding time and total ED length of stay (LOS) for admitted patients, compared to patients seen when the Code Help policy was not followed. This was a retrospective analysis of data collected from electronic patient-care timestamp events and from a prospective Code Help registry for consecutive adult patients admitted from the ED at a single academic center during a 15-month period. For each patient, we determined whether the concurrent hospital status complied with the Code Help policy or violated it at the time of admission decision. We then compared ED boarding time and overall ED LOS for patients cared for during periods of Code Help policy compliance and during periods of Code Help policy violation, both with reference to patients cared for during normal operations. Of 89,587 adult patients who presented to the ED during the study period, 24,017 (26.8%) were admitted to an acute care or critical care bed. Boarding time ranged from zero to 67 hours 30 minutes (median 4 hours 31 minutes). Total ED LOS for admitted patients ranged from 11 minutes to 85 hours 25 minutes (median nine hours). Patients admitted during periods of Code Help policy violation experienced significantly longer boarding times (median 20 minutes longer) and total ED LOS (median 46 minutes longer), compared to patients admitted under normal operations. However, patients admitted during Code Help policy compliance did not experience a significant increase in either metric, compared to normal operations. In this single-center experience, implementation of the

  12. Testing Quick Response (QR) Codes as an Innovation to Improve Feedback Among Geographically-Separated Clerkship Sites.

    Science.gov (United States)

    Snyder, Matthew J; Nguyen, Dana R; Womack, Jasmyne J; Bunt, Christopher W; Westerfield, Katie L; Bell, Adriane E; Ledford, Christy J W

    2018-03-01

    Collection of feedback regarding medical student clinical experiences for formative or summative purposes remains a challenge across clinical settings. The purpose of this study was to determine whether the use of a quick response (QR) code-linked online feedback form improves the frequency and efficiency of rater feedback. In 2016, we compared paper-based feedback forms, an online feedback form, and a QR code-linked online feedback form at 15 family medicine clerkship sites across the United States. Outcome measures included usability, number of feedback submissions per student, number of unique raters providing feedback, and timeliness of feedback provided to the clerkship director. The feedback method was significantly associated with usability, with QR code scoring the highest, and paper second. Accessing feedback via QR code was associated with the shortest time to prepare feedback. Across four rotations, separate repeated measures analyses of variance showed no effect of feedback system on the number of submissions per student or the number of unique raters. The results of this study demonstrate that preceptors in the family medicine clerkship rate QR code-linked feedback as a high usability platform. Additionally, this platform resulted in faster form completion than paper or online forms. An overarching finding of this study is that feedback forms must be portable and easily accessible. Potential implementation barriers and the social norm for providing feedback in this manner need to be considered.
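
    To make the mechanism concrete, the sketch below generates a QR code image pointing at a hypothetical online feedback form URL using the third-party Python qrcode package; the URL and filename are placeholders and this is not the instrument used in the study.

        # Hedged sketch: generate a QR code that links to an online feedback form.
        # Requires the third-party packages qrcode and Pillow (pip install qrcode[pil]).
        import qrcode

        FORM_URL = "https://example.org/clerkship-feedback?site=15"  # placeholder URL

        img = qrcode.make(FORM_URL)          # build the QR code image
        img.save("feedback_form_qr.png")     # print this image on pocket cards/posters
        print("Saved QR code for", FORM_URL)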

  13. Use of computer codes to improve nuclear power plant operation

    International Nuclear Information System (INIS)

    Misak, J.; Polak, V.; Filo, J.; Gatas, J.

    1985-01-01

    For safety and economic reasons, the scope for carrying out experiments on operational nuclear power plants (NPPs) is very limited and any changes in technical equipment and operating parameters or conditions have to be supported by theoretical calculations. In the Nuclear Power Plant Scientific Research Institute (NIIAEhS), computer codes are systematically used to analyse actual operating events, assess safety aspects of changes in equipment and operating conditions, optimize the conditions, preparation and analysis of NPP startup trials and review and amend operating instructions. In addition, calculation codes are gradually being introduced into power plant computer systems to perform real time processing of the parameters being measured. The paper describes a number of specific examples of the use of calculation codes for the thermohydraulic analysis of operating and accident conditions aimed at improving the operation of WWER-440 units at the Jaslovske Bohunice V-1 and V-2 nuclear power plants. These examples confirm that computer calculations are an effective way of solving operating problems and of further increasing the level of safety and economic efficiency of NPP operation. (author)

  14. Improving Code Quality of the Compact Muon Solenoid Electromagnetic Calorimeter Control Software to Increase System Maintainability

    CERN Multimedia

    Holme, Oliver; Dissertori, Günther; Djambazov, Lubomir; Lustermann, Werner; Zelepoukine, Serguei

    2013-01-01

    The Detector Control System (DCS) software of the Electromagnetic Calorimeter (ECAL) of the Compact Muon Solenoid (CMS) experiment at CERN is designed primarily to enable safe and efficient operation of the detector during Large Hadron Collider (LHC) data-taking periods. Through a manual analysis of the code and the adoption of ConQAT [1], a software quality assessment toolkit, the CMS ECAL DCS team has made significant progress in reducing complexity and improving code quality, with observable results in terms of a reduction in the effort dedicated to software maintenance. This paper explains the methodology followed, including the motivation to adopt ConQAT, the specific details of how this toolkit was used and the outcomes that have been achieved. [1] ConQAT, Continuous Quality Assessment Toolkit; https://www.conqat.org/

  15. Remodularizing Java Programs for Improved Locality of Feature Implementations in Source Code

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    Explicit traceability between features and source code is known to help programmers to understand and modify programs during maintenance tasks. However, the complex relations between features and their implementations are not evident from the source code of object-oriented Java programs. Consequently, the implementations of individual features are difficult to locate, comprehend, and modify in isolation. In this paper, we present a novel remodularization approach that improves the representation of features in the source code of Java programs. Both forward and reverse restructurings are supported through on-demand bidirectional restructuring between feature-oriented and object-oriented decompositions. The approach includes a feature location phase based on tracing program execution, a feature representation phase that reallocates classes into a new package structure based on single...

  16. Improving performance of DS-CDMA systems using chaotic complex Bernoulli spreading codes

    Science.gov (United States)

    Farzan Sabahi, Mohammad; Dehghanfard, Ali

    2014-12-01

    The most important goal of a spread-spectrum communication system is to protect communication signals against interference and against exploitation of the information by unintended listeners. In fact, low probability of detection and low probability of intercept are two important parameters for increasing the performance of the system. In Direct Sequence Code Division Multiple Access (DS-CDMA) systems, these properties are achieved by multiplying the data by spreading sequences. Chaotic sequences, with their particular properties, have numerous applications in constructing spreading codes. The use of a one-dimensional Bernoulli chaotic sequence as a spreading code has been proposed previously in the literature. The main feature of this sequence is its negative auto-correlation at lag 1, which, with proper design, leads to increased efficiency of communication systems based on these codes. On the other hand, employing complex chaotic sequences as spreading sequences has also been discussed in several papers. In this paper, the use of two-dimensional Bernoulli chaotic sequences is proposed as spreading codes. The performance of multi-user synchronous and asynchronous DS-CDMA systems is evaluated by applying these sequences under Additive White Gaussian Noise (AWGN) and fading channels. Simulation results indicate an improvement in performance in comparison with conventional spreading codes such as Gold codes as well as similar complex chaotic spreading sequences. Similar to one-dimensional Bernoulli chaotic sequences, the proposed sequences also have negative auto-correlation. Besides, construction of complex sequences with lower average cross-correlation is possible with the proposed method.
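
    The sketch below, which is only loosely inspired by the abstract, generates a spreading sequence from a simple one-dimensional Bernoulli-type map and estimates its lag-1 autocorrelation; the map, threshold and parameters are illustrative and may differ from the authors' construction.

        # Hedged sketch: generate a chaotic spreading sequence from a Bernoulli-type
        # map and estimate its normalized autocorrelation at lag 1. The map and
        # parameters are illustrative; the paper's actual construction may differ.
        import numpy as np

        def bernoulli_spreading_code(x0=0.371, beta=1.99, length=127):
            x, chips = x0, []
            for _ in range(length):
                x = (beta * x) % 1.0                       # Bernoulli (shift-type) map
                chips.append(1.0 if x >= 0.5 else -1.0)    # binarize to +/-1 chips
            return np.array(chips)

        code = bernoulli_spreading_code()
        lag1 = np.mean(code[:-1] * code[1:])    # normalized autocorrelation at lag 1
        print(f"lag-1 autocorrelation estimate: {lag1:+.3f}")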

  17. Improved core-edge tokamak transport simulations with the CORSICA 2 code

    International Nuclear Information System (INIS)

    Tarditi, A.; Cohen, R.H.; Crotinger, J.A.

    1996-01-01

    The CORSICA 2 code models the nonlinear transport between the core and the edge of a tokamak plasma. The code couples a 2D axisymmetric edge/SOL model (UEDGE) to a 1D model for the radial core transport in toroidal flux coordinates (the transport module from the CORSICA 1 code). The core density and temperature profiles are joined to the flux-surface average profiles from the 2D code sufficiently inside the magnetic separatrix, at a flux surface on which the edge profiles are approximately constant. In the present version of the code, the deuterium density and electron and ion temperatures are coupled. The electron density is determined by imposing quasi-neutrality, both in the core and in the edge. The model allows the core-edge coupling of multiple ion densities while retaining a single temperature (corresponding to the equilibration value) for all ion species. Applications of CORSICA 2 to modeling the DIII-D tokamak are discussed. This work will focus on the simulation of the L-H transition, coupling a single ion species (deuterium) and the two (electron and ion) temperatures. These simulations will employ a new self-consistent model for the L-H transition that is being implemented in the UEDGE code. Applications to the modeling of ITER ignition scenarios are also discussed. This will involve coupling a second density species (the thermal alphas), bringing the total number of coupled variables up to four. Finally, the progress in evolving the magnetic geometry is discussed. Currently, this geometry is calculated by CORSICA's MHD equilibrium module (TEQ) at the beginning of the run and fixed thereafter. However, CORSICA 1 can evolve this geometry quasistatically, and this quasistatic treatment is being extended to include the edge/SOL geometry. Recent improvements for code speed-up are also presented.

  18. Improvements to the COBRA-TF (EPRI) computer code for steam generator analysis. Final report

    International Nuclear Information System (INIS)

    Stewart, C.W.; Barnhart, J.S.; Koontz, A.S.

    1980-09-01

    The COBRA-TF (EPRI) code has been improved and extended for pressurized water reactor steam generator analysis. New features and models have been added in the areas of subcooled boiling and heat transfer, turbulence, numerics, and global steam generator modeling. The code's new capabilities are qualified against selected experimental data and demonstrated for typical global and microscale steam generator analysis

  19. Improved predictions of nuclear reaction rates for astrophysics applications with the TALYS reaction code

    International Nuclear Information System (INIS)

    Goriely, S.; Hilaire, S.; Koning, A.J.

    2008-01-01

    Nuclear reaction rates for astrophysics applications are traditionally determined on the basis of Hauser-Feshbach reaction codes, like MOST. These codes use simplified schemes to calculate the capture reaction cross section on a given target nucleus, not only in its ground state but also for the different thermally populated states of the stellar plasma at a given temperature. Such schemes include a number of approximations that have never been tested, such as an approximate width fluctuation correction, the neglect of delayed particle emission during the electromagnetic decay cascade or the absence of the pre-equilibrium contribution at increasing incident energies. New developments have been brought to the reaction code TALYS to estimate the Maxwellian-averaged reaction rates of astrophysics relevance. These new developments make it possible to calculate the reaction cross sections and the corresponding astrophysics rates with improved accuracy. The TALYS predictions for the thermonuclear rates of astrophysics relevance are presented and compared with those obtained with the MOST code on the basis of the same nuclear ingredients for nuclear structure properties, optical model potential, nuclear level densities and γ-ray strength. It is shown that, in particular, the pre-equilibrium process significantly influences the astrophysics rates of exotic neutron-rich nuclei. The reciprocity theorem traditionally used in astrophysics to determine photo-rates is also shown not to be valid for exotic nuclei. The predictions obtained with different nuclear inputs are also analyzed to provide an estimate of the theoretical uncertainties still affecting the reaction rate prediction far away from the experimentally known regions. (authors)
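
    For context, the Maxwellian-averaged rate referred to above is ⟨σv⟩ = √(8/(πμ)) (kT)^(-3/2) ∫ σ(E) E exp(−E/kT) dE; the sketch below evaluates this integral numerically for an invented toy cross section, purely to illustrate the averaging step (it is not TALYS output).

        # Hedged sketch: numerical Maxwellian average <sigma*v> for a toy cross
        # section. sigma(E) below is invented for illustration; real astrophysical
        # rates come from reaction codes such as TALYS with proper nuclear inputs.
        import numpy as np

        K_B = 8.617333262e-11        # Boltzmann constant, MeV/K
        AMU = 931.49410242           # atomic mass unit, MeV/c^2
        C_CM = 2.99792458e10         # speed of light, cm/s

        def maxwellian_average(sigma_cm2, mu_amu, temperature_K, e_MeV):
            """<sigma*v> in cm^3/s for cross sections sigma_cm2(E) on grid e_MeV."""
            kT = K_B * temperature_K                 # MeV
            mu = mu_amu * AMU / C_CM ** 2            # reduced mass, MeV s^2/cm^2
            f = sigma_cm2 * e_MeV * np.exp(-e_MeV / kT)
            integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(e_MeV))  # trapezoid rule
            return np.sqrt(8.0 / (np.pi * mu)) * kT ** -1.5 * integral

        E = np.linspace(1e-3, 5.0, 2000)                 # MeV
        sigma_toy = 1e-25 * np.exp(-3.0 / np.sqrt(E))    # invented cross section, cm^2
        print(maxwellian_average(sigma_toy, mu_amu=0.9, temperature_K=3e9))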

  20. Uniform emergency codes: will they improve safety?

    Science.gov (United States)

    2005-01-01

    There are pros and cons to uniform code systems, according to emergency medicine experts. Uniformity can be a benefit when ED nurses and other staff work at several facilities. It's critical that your staff understand not only what the codes stand for, but what they must do when codes are called. If your state institutes a new system, be sure to hold regular drills to familiarize your ED staff.

  1. Performance Analysis of CRC Codes for Systematic and Nonsystematic Polar Codes with List Decoding

    Directory of Open Access Journals (Sweden)

    Takumi Murata

    2018-01-01

    Successive cancellation list (SCL) decoding of polar codes is an effective approach that can significantly outperform the original successive cancellation (SC) decoding, provided that proper cyclic redundancy-check (CRC) codes are employed at the stage of candidate selection. Previous studies on CRC-assisted polar codes mostly focus on improvement of the decoding algorithms as well as their implementation, and little attention has been paid to the CRC code structure itself. For the CRC-concatenated polar codes with a CRC code as their outer code, the use of a longer CRC code leads to reduction of the information rate, whereas the use of a shorter CRC code may reduce the error detection probability, thus degrading the frame error rate (FER) performance. Therefore, CRC codes of proper length should be employed in order to optimize the FER performance for a given signal-to-noise ratio (SNR) per information bit. In this paper, we investigate the effect of CRC codes on the FER performance of polar codes with list decoding in terms of the CRC code length as well as its generator polynomials. Both the original nonsystematic and systematic polar codes are considered, and we also demonstrate that different behaviors of CRC codes should be observed depending on whether the inner polar code is systematic or not.
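
    To illustrate the role the CRC plays in list decoding, the sketch below appends CRC bits from a chosen generator polynomial to an information block and then checks a candidate, which is what selects the surviving path in CRC-aided SCL decoding; the 8-bit polynomial used here is only an example, not one recommended by the paper.

        # Hedged sketch: bitwise CRC encoding/checking as used for candidate selection
        # in CRC-aided successive cancellation list (SCL) decoding. Illustrative only.

        def crc_remainder(bits, poly):
            """CRC remainder of `bits` for generator `poly` (MSB first, leading 1)."""
            work = list(bits) + [0] * (len(poly) - 1)      # append zero padding
            for i in range(len(bits)):
                if work[i]:                                # XOR the polynomial in
                    for j, p in enumerate(poly):
                        work[i + j] ^= p
            return work[-(len(poly) - 1):]

        POLY8 = [1, 0, 0, 0, 0, 0, 1, 1, 1]   # example x^8 + x^2 + x + 1 (CRC-8)

        info = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]
        codeword = info + crc_remainder(info, POLY8)       # CRC-concatenated block

        def passes_crc(candidate, poly):
            return not any(crc_remainder(candidate, poly))

        print(passes_crc(codeword, POLY8))                              # True
        print(passes_crc(codeword[:-1] + [codeword[-1] ^ 1], POLY8))    # False (bit flip)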

  2. High efficiency video coding coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at similar compressed bit rates as HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in a clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it also serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  3. Thermomechanical DART code improvements for LEU VHD dispersion and monolithic fuel element analysis

    International Nuclear Information System (INIS)

    Taboada, H.; Saliba, R.; Moscarda, M.V.; Rest, J.

    2005-01-01

    A collaboration agreement between ANL/US DOE and CNEA Argentina in the area of Low Enriched Uranium Advanced Fuels has been in place since October 16, 1997 under the Implementation Arrangement for Technical Exchange and Cooperation in the Area of Peaceful Uses of Nuclear Energy. An annex concerning DART code optimization has been operative since February 8, 1999. Previously, as a part of this annex, a visual FASTDART version and a DART THERMAL version were presented during the RERTR 2000, 2002 and 2003 meetings. During this past year the following activities were completed: optimization of the DART TM code Al diffusion parameters by testing predictions against reliable data from RERTR experiments, and improvements to the 3-D thermo-mechanical version of the code for modeling the irradiation behavior of LEU U-Mo monolithic fuel. Concerning the first point, by optimizing the parameters of the theoretical expression for Al diffusion through the interaction product, reasonable agreement was reached between DART temperature calculations and reliable RERTR PIE data. The 3-D thermomechanical code complex is based upon a finite element thermal-elastic code named TERMELAS, with irradiation behavior provided by the DART code. An adequate and progressive process of coupling the calculations of both codes at each time step is currently being developed. Consistent thermal calculations between the two codes have been achieved. This is the first stage in benchmarking and validating the coupling process against RERTR PIE data. (author)

  4. Basic research and industrialization of CANDU advanced fuel - A research for the improvement of RFSP code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Hyo; Jang, Chang Sun; Han, Tae Young [Seoul National University, Seoul (Korea)

    2000-03-01

    The objective of this project is to improve the RFSP code by adopting a three-dimensional, two-neutron-energy-group model and an accelerated iterative solution scheme (FDM3D) for the two-group diffusion equations. The major contents of this research are the derivation of the finite difference form of the three-dimensional two-group neutron diffusion equation, the application of accelerated iterative solution schemes to the finite difference diffusion equation, and the validation of the improved RFSP code (FDM3D) through benchmark tests. We have shown that the SOR/Chebyshev two-parameter method and the BICG-STAB/Wielandt method are more effective than the scheme used in RFSP in terms of computing speed. The SOR/Chebyshev two-parameter method shows better efficiency than the BICG-STAB/Wielandt method. Because the calculation efficiency of the latter depends on the right choice of pre-conditioner, however, more studies are considered necessary to improve the efficiency of this method and to validate it. We have incorporated the new efficient method into the existing RFSP so that the resulting RFSP becomes much faster and more accurate. RFSP currently uses the POWDERPUFS code as its main lattice code, which is adequate for the neutron energy group model of RFSP. Because of this, we cannot take full advantage of the advanced RFSP without adopting the lattice code WIMS-AECL, which can generate exact two-group constants. Therefore, we suggest developing a new CANDU design and analysis code which incorporates WIMS-AECL into FDM3D. 16 refs., 10 figs., 23 tabs. (Author)
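
    A minimal sketch of the successive over-relaxation (SOR) idea mentioned above, applied to a one-dimensional finite-difference diffusion problem; the problem data and relaxation factor are invented for illustration, and the Chebyshev two-parameter acceleration and the two-group, three-dimensional treatment of FDM3D are not reproduced here.

        # Hedged sketch: plain SOR iteration for a 1-D finite-difference diffusion
        # problem -D*phi'' + sigma_a*phi = S with phi = 0 at the boundaries.
        # Illustrative only; FDM3D is 3-D, two-group and Chebyshev-accelerated.
        import numpy as np

        def sor_diffusion_1d(n=50, D=1.0, sigma_a=0.02, source=1.0, h=1.0,
                             omega=1.6, tol=1e-8, max_iter=10_000):
            diag = 2.0 * D / h ** 2 + sigma_a
            offd = -D / h ** 2
            phi = np.zeros(n)
            for it in range(max_iter):
                max_change = 0.0
                for i in range(n):
                    left = phi[i - 1] if i > 0 else 0.0
                    right = phi[i + 1] if i < n - 1 else 0.0
                    gs = (source - offd * (left + right)) / diag   # Gauss-Seidel update
                    new = phi[i] + omega * (gs - phi[i])           # over-relaxation
                    max_change = max(max_change, abs(new - phi[i]))
                    phi[i] = new
                if max_change < tol:
                    return phi, it + 1
            return phi, max_iter

        phi, iters = sor_diffusion_1d()
        print(f"converged in {iters} iterations, peak flux {phi.max():.3f}")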

  5. Use the Bar Code System to Improve Accuracy of the Patient and Sample Identification.

    Science.gov (United States)

    Chuang, Shu-Hsia; Yeh, Huy-Pzu; Chi, Kun-Hung; Ku, Hsueh-Chen

    2018-01-01

    Timely and correct sample collection is highly related to patient safety. The sample error rate was 11.1% between January and April 2016 because of mislabeled patient information and wrong sample containers. We developed a bar code based "Specimen Identification System" through a process reengineering of TRM, using bar code scanners, adding sample container instructions, and a mobile app. In conclusion, the bar code system improved patient safety and created a greener environment.

  6. Methods for Using Small Non-Coding RNAs to Improve Recombinant Protein Expression in Mammalian Cells

    Directory of Open Access Journals (Sweden)

    Sarah Inwood

    2018-01-01

    The ability to produce recombinant proteins by utilizing different “cell factories” revolutionized the biotherapeutic and pharmaceutical industry. Chinese hamster ovary (CHO) cells are the dominant industrial producer, especially for antibodies. Human embryonic kidney (HEK) cells, while not as widely used as CHO cells, are used where CHO cells are unable to meet the needs for expression, such as growth factors. Therefore, improving recombinant protein expression from mammalian cells is a priority, and continuing effort is being devoted to this topic. Non-coding RNAs are RNA segments that are not translated into a protein and often have a regulatory role. Since their discovery, major progress has been made towards understanding their functions. Non-coding RNA has been investigated extensively in relation to disease, especially cancer, and recently it has also been used as a method for engineering cells to improve their protein expression capability. In this review, we provide information about methods used to identify non-coding RNAs with the potential of improving recombinant protein expression in mammalian cell lines.

  7. The hierarchy-by-interval approach to identifying important models that need improvement in severe-accident simulation codes

    International Nuclear Information System (INIS)

    Heames, T.J.; Khatib-Rahbar, M.; Kelly, J.E.

    1995-01-01

    The hierarchy-by-interval (HBI) methodology was developed to determine an appropriate phenomena identification and ranking table for an independent peer review of severe-accident computer codes. The methodology is described, and the results of a specific code review are presented. Use of this systematic and structured approach ensures that important code models that need improvement are identified and prioritized, which allows code sponsors to more effectively direct limited resources in future code development. In addition, critical phenomenological areas that need more fundamental work, such as experimentation, are identified

  8. How could the replica method improve accuracy of performance assessment of channel coding?

    Energy Technology Data Exchange (ETDEWEB)

    Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of technology, Yokohama 226-8502 (Japan)], E-mail: kaba@dis.titech.ac.jp

    2009-12-01

    We explore the relation between the techniques of statistical mechanics and information theory for assessing the performance of channel coding. We base our study on a framework developed by Gallager in IEEE Trans. Inform. Theory IT-11, 3 (1965), where the minimum decoding error probability is upper-bounded by an average of a generalized Chernoff's bound over a code ensemble. We show that the resulting bound in the framework can be directly assessed by the replica method, which has been developed in statistical mechanics of disordered systems, whereas in Gallager's original methodology further replacement by another bound utilizing Jensen's inequality is necessary. Our approach associates a seemingly ad hoc restriction with respect to an adjustable parameter for optimizing the bound with a phase transition between two replica symmetric solutions, and can improve the accuracy of performance assessments of general code ensembles including low density parity check codes, although its mathematical justification is still open.

  9. Improving radiopharmaceutical supply chain safety by implementing bar code technology.

    Science.gov (United States)

    Matanza, David; Hallouard, François; Rioufol, Catherine; Fessi, Hatem; Fraysse, Marc

    2014-11-01

    The aim of this study was to describe and evaluate an approach for improving radiopharmaceutical supply chain safety by implementing bar code technology. We first evaluated the current situation of our radiopharmaceutical supply chain and, by means of the ALARM protocol, analysed two dispensing errors that occurred in our department. Thereafter, we implemented a bar code system to secure selected key stages of the radiopharmaceutical supply chain. Finally, we evaluated the cost of this implementation, covering overtime, overheads and additional radiation exposure to workers. An analysis of the events that occurred revealed a lack of identification of prepared or dispensed drugs. Moreover, the evaluation of the current radiopharmaceutical supply chain showed that the dispensation and injection steps needed to be further secured. The bar code system was used to reinforce product identification at three selected key stages: at usable stock entry; at preparation-dispensation; and during administration, allowing conformity between the labelling of the delivered product (identity and activity) and the prescription to be checked. The extra time needed for all these steps had no impact on the number and successful conduct of examinations. The investment cost was limited (2600 euros for new material and 30 euros a year for additional supplies) because of pre-existing computing equipment. With regard to radiation exposure to workers, there was an insignificant additional exposure of the hands with this new organization, because of the labelling and scanning of radiolabelled preparation vials. Implementation of bar code technology is now an essential part of a global securing approach towards optimum patient management.

  10. One way quantum repeaters with quantum Reed-Solomon codes

    OpenAIRE

    Muralidharan, Sreraman; Zou, Chang-Ling; Li, Linshu; Jiang, Liang

    2018-01-01

    We show that quantum Reed-Solomon codes constructed from classical Reed-Solomon codes can approach the capacity on the quantum erasure channel of $d$-level systems for large dimension $d$. We study the performance of one-way quantum repeaters with these codes and obtain a significant improvement in key generation rate compared to previously investigated encoding schemes with quantum parity codes and quantum polynomial codes. We also compare the three generation of quantum repeaters using quan...

  11. Greater physician involvement improves coding outcomes in endobronchial ultrasound-guided transbronchial needle aspiration procedures.

    Science.gov (United States)

    Pillai, Anilkumar; Medford, Andrew R L

    2013-01-01

    Correct coding is essential for accurate reimbursement for clinical activity. Published data confirm that significant aberrations in coding occur, leading to considerable financial inaccuracies especially in interventional procedures such as endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA). Previous data reported a 15% coding error for EBUS-TBNA in a U.K. service. We hypothesised that greater physician involvement with coders would reduce EBUS-TBNA coding errors and financial disparity. The study was done as a prospective cohort study in the tertiary EBUS-TBNA service in Bristol. 165 consecutive patients between October 2009 and March 2012 underwent EBUS-TBNA for evaluation of unexplained mediastinal adenopathy on computed tomography. The chief coder was prospectively electronically informed of all procedures and cross-checked on a prospective database and by Trust Informatics. Cost and coding analysis was performed using the 2010-2011 tariffs. All 165 procedures (100%) were coded correctly as verified by Trust Informatics. This compares favourably with the 14.4% coding inaccuracy rate for EBUS-TBNA in a previous U.K. prospective cohort study [odds ratio 201.1 (1.1-357.5), p = 0.006]. Projected income loss was GBP 40,000 per year in the previous study, compared to a GBP 492,195 income here with no coding-attributable loss in revenue. Greater physician engagement with coders prevents coding errors and financial losses which can be significant especially in interventional specialties. The intervention can be as cheap, quick and simple as a prospective email to the coding team with cross-checks by Trust Informatics and against a procedural database. We suggest that all specialties should engage more with their coders using such a simple intervention to prevent revenue losses. Copyright © 2013 S. Karger AG, Basel.

  12. Practical experience with software tools to assess and improve the quality of existing nuclear analysis and safety codes

    International Nuclear Information System (INIS)

    Marshall, N.H.; Marwil, E.S.; Matthews, S.D.; Stacey, B.J.

    1990-01-01

    Within the constraints of schedule and budget, software tools and techniques were applied to existing FORTRAN codes determining software quality metrics and improving the code quality. Specifically discussed are INEL experiences in applying pretty printers, cross-reference analyzers, and computer aided software engineering (CASE) tools and techniques. These have provided management with measures of the risk potential for individual program modules so that rational decisions can be made on resource allocation. Selected program modules have been modified to reduce the complexity, achieve higher functional independence, and improve the code vectorization. (orig.)

  13. LDPC coded OFDM over the atmospheric turbulence channel.

    Science.gov (United States)

    Djordjevic, Ivan B; Vasic, Bane; Neifeld, Mark A

    2007-05-14

    Low-density parity-check (LDPC) coded optical orthogonal frequency division multiplexing (OFDM) is shown to significantly outperform LDPC coded on-off keying (OOK) over the atmospheric turbulence channel in terms of both coding gain and spectral efficiency. In the regime of strong turbulence at a bit-error rate of 10^-5, the coding gain improvement of the LDPC coded single-sideband unclipped-OFDM system with 64 sub-carriers is larger than the coding gain of the LDPC coded OOK system by 20.2 dB for quadrature-phase-shift keying (QPSK) and by 23.4 dB for binary-phase-shift keying (BPSK).

  14. Improving 3D-Turbo Code's BER Performance with a BICM System over Rayleigh Fading Channel

    Directory of Open Access Journals (Sweden)

    R. Yao

    2016-12-01

    Classical Turbo codes suffer from a high error floor due to their small Minimum Hamming Distance (MHD). The newly-proposed 3D-Turbo code can effectively increase the MHD and achieve a lower error floor by adding a rate-1 post encoder. In 3D-Turbo codes, part of the parity bits from the classical Turbo encoder are further encoded through the post encoder. In this paper, a novel Bit-Interleaved Coded Modulation (BICM) system is proposed by combining rotated-mapping Quadrature Amplitude Modulation (QAM) and a 3D-Turbo code to improve the Bit Error Rate (BER) performance of 3D-Turbo codes over the Rayleigh fading channel. A key-bit protection scheme and a Two-Dimension (2D) iterative soft demodulating-decoding algorithm are developed for the proposed BICM system. Simulation results show that the proposed system can obtain about 0.8-1.0 dB gain at a BER of 10^{-6}, compared with the existing BICM system with Gray mapping QAM.

  15. One-way quantum repeaters with quantum Reed-Solomon codes

    Science.gov (United States)

    Muralidharan, Sreraman; Zou, Chang-Ling; Li, Linshu; Jiang, Liang

    2018-05-01

    We show that quantum Reed-Solomon codes constructed from classical Reed-Solomon codes can approach the capacity on the quantum erasure channel of d-level systems for large dimension d. We study the performance of one-way quantum repeaters with these codes and obtain a significant improvement in key generation rate compared to previously investigated encoding schemes with quantum parity codes and quantum polynomial codes. We also compare the three generations of quantum repeaters using quantum Reed-Solomon codes and identify parameter regimes where each generation performs the best.

  16. A New Prime Code for Synchronous Optical Code Division Multiple-Access Networks

    Directory of Open Access Journals (Sweden)

    Huda Saleh Abbas

    2018-01-01

    A new spreading code based on a prime code for synchronous optical code-division multiple-access networks that can be used in monitoring applications has been proposed. The new code is referred to as the “extended grouped new modified prime code.” This new code has the ability to support more terminal devices than other prime codes. In addition, it pads subsequences with “0s”, leading to lower power consumption. The proposed code has an improved cross-correlation resulting in enhanced BER performance. The code construction and parameters are provided. The operating performance, using incoherent on-off keying modulation and incoherent pulse position modulation systems, has been analyzed. The performance of the code was compared with other prime codes. The results demonstrate an improved performance, and a BER floor of 10^-9 was achieved.

  17. Surviving "Payment by Results": a simple method of improving clinical coding in burn specialised services in the United Kingdom.

    Science.gov (United States)

    Wallis, Katy L; Malic, Claudia C; Littlewood, Sonia L; Judkins, Keith; Phipps, Alan R

    2009-03-01

    Coding inpatient episodes plays an important role in determining the financial remuneration of a clinical service. Insufficient or incomplete data may have very significant consequences for its viability. We created a document that improves the coding process in our Burns Centre. At Yorkshire Regional Burns Centre an inpatient summary sheet was designed to prospectively record and present essential information on a daily basis, for use in the coding process. The level of care was also recorded. A 3-month audit was conducted to assess the efficacy of the new forms. Forty-nine patients were admitted to the Burns Centre with a mean age of 27.6 years and TBSA ranging from 0.5% to 65%. The total stay in the Burns Centre was 758 days, of which 22% were at level B3-B5 and 39% at level B2. The use of the new discharge document identified potential income of about GBP 500,000 at our local daily tariffs for high dependency and intensive care. The new form is able to ensure a high quality of coding with a possible direct impact on the financial resources accrued for burn care.

  18. EchoSeed Model 6733 Iodine-125 brachytherapy source: Improved dosimetric characterization using the MCNP5 Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Mosleh-Shirazi, M. A.; Hadad, K.; Faghihi, R.; Baradaran-Ghahfarokhi, M.; Naghshnezhad, Z.; Meigooni, A. S. [Center for Research in Medical Physics and Biomedical Engineering and Physics Unit, Radiotherapy Department, Shiraz University of Medical Sciences, Shiraz 71936-13311 (Iran, Islamic Republic of); Radiation Research Center and Medical Radiation Department, School of Engineering, Shiraz University, Shiraz 71936-13311 (Iran, Islamic Republic of); Comprehensive Cancer Center of Nevada, Las Vegas, Nevada 89169 (United States)

    2012-08-15

    This study primarily aimed to obtain the dosimetric characteristics of the Model 6733 ¹²⁵I seed (EchoSeed) with improved precision and accuracy using a more up-to-date Monte Carlo code and data (MCNP5) compared to previously published results, including an uncertainty analysis. Its secondary aim was to compare the results obtained using the MCNP5, MCNP4c2, and PTRAN codes for simulation of this low-energy photon-emitting source. The EchoSeed geometry and chemical compositions together with a published ¹²⁵I spectrum were used to perform dosimetric characterization of this source as per the updated AAPM TG-43 protocol. These simulations were performed in liquid water material in order to obtain the clinically applicable dosimetric parameters for this source model. Dose rate constants in liquid water, derived from MCNP4c2 and MCNP5 simulations, were found to be 0.993 cGy h⁻¹ U⁻¹ (±1.73%) and 0.965 cGy h⁻¹ U⁻¹ (±1.68%), respectively. Overall, the MCNP5-derived radial dose and 2D anisotropy function results were generally closer to the measured data (within ±4%) than MCNP4c and the published data for the PTRAN code (Version 7.43), while the opposite was seen for the dose rate constant. The generally improved MCNP5 Monte Carlo simulation may be attributed to a more recent and accurate cross-section library. However, some of the data points in the results obtained from the above-mentioned Monte Carlo codes showed no statistically significant differences. Derived dosimetric characteristics in liquid water are provided for clinical applications of this source model.

  19. Performance analysis of WS-EWC coded optical CDMA networks with/without LDPC codes

    Science.gov (United States)

    Huang, Chun-Ming; Huang, Jen-Fa; Yang, Chao-Chin

    2010-10-01

    One extended Welch-Costas (EWC) code family for the wavelength-division-multiplexing/spectral-amplitude coding (WDM/SAC; WS) optical code-division multiple-access (OCDMA) networks is proposed. This system has a superior performance as compared to the previous modified quadratic congruence (MQC) coded OCDMA networks. However, since the performance of such a network is unsatisfactory when the data bit rate is higher, one class of quasi-cyclic low-density parity-check (QC-LDPC) code is adopted to improve that. Simulation results show that the performance of the high-speed WS-EWC coded OCDMA network can be greatly improved by using the LDPC codes.

  20. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  1. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. Innovative work on improving the coding tree to further reduce encoding time is presented in this paper. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  2. APC-II: an electron beam propagation code

    International Nuclear Information System (INIS)

    Iwan, D.C.; Freeman, J.R.

    1984-05-01

    The computer code APC-II simulates the propagation of a relativistic electron beam through air. APC-II is an updated version of the APC envelope model code. It incorporates an improved conductivity model which significantly extends the range of stable calculations. A number of test cases show that these new models are capable of reproducing the simulations of the original APC code. As a result of a major restructuring and reprogramming of the code, APC-II is now friendly to both the occasional user and the experienced user who wishes to make modifications. Most of the code is in standard ANSI Fortran 77 so that it can be easily transported between machines.

  3. VIPRE-01: A thermal-hydraulic code for reactor cores

    International Nuclear Information System (INIS)

    Cuta, J.M.; Koontz, A.S.; Stewart, C.W.; Montgomery, S.D.; Nomura, K.K.

    1989-08-01

    The VIPRE-01 thermal hydraulics code for PWR and BWR analysis has undergone significant modifications and error correction. This manual for the updated code, designated as VIPRE-01 Mod-02, describes improvements that eliminate problems of slow convergence with the drift flux model in transient simulation. To update the VIPRE-01 code and its documentation the drift flux model of two-phase flow was implemented and error corrections developed during VIPRE-01 application were included. The project team modified the existing VIPRE-01 equations into drift flux model equations by developing additional terms. They also developed and implemented corrections for the errors identified during the last four years. They then validated the modified code against standard test data using selected test cases. The project team prepared documentation revisions reflecting code improvements and corrections to replace the corresponding sections in the original VIPRE documents. The revised VIPRE code, designated VIPRE-01 Mod-02, incorporates improvements that eliminate many shortcomings of the previous version. During the validation, the code produced satisfactory output compared with test data. The revised documentation is in the form of binder pages to replace existing pages in three of the original manuals

  4. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing to standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) Improved graphical display of model results. 2) Improved error analysis and reporting. 3) Increase in the default maximum model mesh size from 301 to 501 nodes. 4) The ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  5. Preparation of functions of computer code GENGTC and improvement for two-dimensional heat transfer calculations for irradiation capsules

    International Nuclear Information System (INIS)

    Nomura, Yasushi; Someya, Hiroyuki; Ito, Haruhiko.

    1992-11-01

    Capsules for irradiation tests in the JMTR (Japan Materials Testing Reactor) consist of irradiation specimens surrounded by a cladding tube, holders, an inner tube and a container tube (from 30 mm to 65 mm in diameter), and the annular gaps between these structural materials in the capsule are filled with liquids or gases. Cooling of the capsule is provided by the reactor primary coolant flowing down outside the capsule. Most of the heat generated by fission in fuel specimens and gamma absorption in structural materials is directed radially to the outer surface of the capsule container. In thermal performance calculations for capsule design, a one(r)-dimensional heat transfer computer code entitled GENGTC (Generalized Gap Temperature Calculation), originally developed at Oak Ridge National Laboratory, U.S.A., has frequently been used. In designing a capsule, many parametric calculations are needed with respect to changes in materials and gap sizes, and in some cases two(r,z)-dimensional heat transfer calculations are needed for irradiation test capsules with short-length fuel rods. Recently the authors improved the original one-dimensional code GENGTC (1) to simplify the preparation of input data, and (2) to perform automatic parametric survey calculations based on design temperatures, etc. Moreover, the computer code has been improved to perform r-z two-dimensional heat transfer calculations. This report describes the preparation of the one-dimensional code GENGTC and the improvements for the two-dimensional code GENGTC-2, together with their code manuals. (author)

  6. The statistical significance of error probability as determined from decoding simulations for long codes

    Science.gov (United States)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
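
    As a concrete companion to the point above, the sketch below computes an exact (Clopper-Pearson) binomial confidence interval for an error probability estimated from only two observed decoding errors in many trials; this is the standard interval, not the extended notion introduced in the paper.

        # Hedged sketch: exact (Clopper-Pearson) confidence interval for an error
        # probability estimated from very few observed decoding errors. Illustrative
        # only; the paper develops an extension of the confidence-interval notion.
        from scipy.stats import beta

        def clopper_pearson(errors, trials, confidence=0.95):
            alpha = 1.0 - confidence
            lower = beta.ppf(alpha / 2, errors, trials - errors + 1) if errors > 0 else 0.0
            upper = beta.ppf(1 - alpha / 2, errors + 1, trials - errors)
            return lower, upper

        # two decoding errors observed in ten million simulated decoding trials
        lo, hi = clopper_pearson(errors=2, trials=10_000_000)
        print(f"point estimate 2e-7, 95% CI ({lo:.2e}, {hi:.2e})")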

  7. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...

  8. Colour-coded three-dimensional reconstruction from spiral CT data sets: Improvement from the physical point of view

    International Nuclear Information System (INIS)

    Wunderlich, A.P.; Lenz, M.; Kirsten, R.; Gerhardt, P.

    1993-01-01

    This paper demonstrates the possibility of improving the spatial depth impression of colour-coded three-dimensional reconstructions by modulation of colour saturation. Patients were observed with spiral computed tomography (slice thickness 10 mm, table feed 10 mm/s, reconstruction of overlapping axial images at 2 mm increment). Interesting anatomical and pathological objects (vessels, organs, tumours, metastases) were segmented, colour-coded, and reconstructed three-dimensionally. Spatial depth impression of the coloured objects could be improved by modulating not only the brightness, but also the colour saturation. (orig.) [de

  9. Does a colour-coded blood pressure diary improve blood pressure control for patients in general practice: the CoCo trial.

    Science.gov (United States)

    Steurer-Stey, Claudia; Zoller, Marco; Chmiel Moshinsky, Corinne; Senn, Oliver; Rosemann, Thomas

    2010-04-14

    Insufficient blood pressure control is a frequent problem despite the existence of effective treatment. Insufficient adherence to self-monitoring as well as to therapy is a common reason. Blood pressure self-measurement at home (Home Blood Pressure Measurement, HBPM) has positive effects on treatment adherence and is helpful in achieving the target blood pressure. Only a few studies have investigated whether adherence to HBPM can be improved through simple measures, resulting also in better blood pressure control. Objective: improvement of self-monitoring and improved blood pressure control by using a new colour-coded blood pressure diary. Primary outcome: change in systolic and/or diastolic blood pressure 6 months after using the new colour-coded blood pressure diary. Secondary outcome: adherence to blood pressure self-measurement (number of measurements/entries). Randomised controlled study. 138 adult patients in primary care with uncontrolled hypertension despite therapy. The control group uses a conventional blood pressure diary; the intervention group uses the new colour-coded blood pressure diary (green, yellow, red according to a traffic light system). EXPECTED RESULTS/CONCLUSION: The visual separation and entries in three colour-coded areas reflecting risk (green: blood pressure in the target range <140/<90 mmHg, yellow: blood pressure >140/>90 mmHg, red: blood pressure in danger zone >180 mmHg/>110 mmHg) lead to better self-monitoring compared with the conventional (non-colour-coded) blood pressure booklet. The colour-coded, visualised information supports improved perception (awareness and interpretation) of blood pressure and triggers correct behaviour, in terms of improved adherence to the recommended treatment as well as better communication between patients and doctors, resulting in improved blood pressure control. ClinicalTrials.gov ID NCT01013467.
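
    A small sketch of the traffic-light classification implied by the diary described above; the thresholds follow the green/yellow/red bands given in the abstract and the function is illustrative rather than part of the trial.

        # Hedged sketch: classify a home blood pressure reading into the colour-coded
        # bands of a traffic-light diary. Thresholds follow the abstract; illustrative only.
        def bp_colour(systolic_mmHg, diastolic_mmHg):
            if systolic_mmHg > 180 or diastolic_mmHg > 110:
                return "red"     # danger zone
            if systolic_mmHg > 140 or diastolic_mmHg > 90:
                return "yellow"  # above target
            return "green"       # within the target range

        for reading in [(128, 82), (152, 94), (184, 101)]:
            print(reading, "->", bp_colour(*reading))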

  10. Application of improved air transport data and wall transmission/reflection data in the SKYSHINE code to typical BWR turbine skyshine

    Energy Technology Data Exchange (ETDEWEB)

    Tayama, Ryuichi; Hayashi, Katsumi [Hitachi Engineering Co. Ltd., Ibaraki (Japan); Hirayama, Hideo [High Energy Accelerator Research Organization, Tsukuba, Ibaraki (Japan); Sakamoto, Yukio [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Harima, Yoshiko; Ishikawa, Satoshi [CRC Research Institute Inc., Tokyo (Japan); Hayashida, Yoshihisa [Toshiba Corp., Kawasaki, Kanagawa (Japan); Nemoto, Makoto [Visible Information Center, Tokai, Ibaraki (Japan); Sato, Osamu [Mitsubishi Research Inst., Inc., Tokyo (Japan)

    2000-03-01

    Three basic sets of data, i.e. the air transport data and the material transmission/reflection data included in the SKYSHINE program, have been improved using up-to-date data and methods, and applied to skyshine dose calculations for a typical BWR turbine building. The direct and skyshine dose rates obtained with the original SKYSHINE code show good agreement with MCNP Monte Carlo calculations except for distances less than 0.1 km. The results of the improved SKYSHINE code also agree with the MCNP code within 10-20%. The discrepancy of 10-20% can be attributed to the improved concrete transmission data at small incident and exit angles. We are still improving the three sets of data and investigating different calculational models to obtain more accurate results. (author)

  11. Using self-similarity compensation for improving inter-layer prediction in scalable 3D holoscopic video coding

    Science.gov (United States)

    Conti, Caroline; Nunes, Paulo; Ducla Soares, Luís.

    2013-09-01

    Holoscopic imaging, also known as integral imaging, has been recently attracting the attention of the research community, as a promising glassless 3D technology due to its ability to create a more realistic depth illusion than the current stereoscopic or multiview solutions. However, in order to gradually introduce this technology into the consumer market and to efficiently deliver 3D holoscopic content to end-users, backward compatibility with legacy displays is essential. Consequently, to enable 3D holoscopic content to be delivered and presented on legacy displays, a display scalable 3D holoscopic coding approach is required. Hence, this paper presents a display scalable architecture for 3D holoscopic video coding with a three-layer approach, where each layer represents a different level of display scalability: Layer 0 - a single 2D view; Layer 1 - 3D stereo or multiview; and Layer 2 - the full 3D holoscopic content. In this context, a prediction method is proposed, which combines inter-layer prediction, aiming to exploit the existing redundancy between the multiview and the 3D holoscopic layers, with self-similarity compensated prediction (previously proposed by the authors for non-scalable 3D holoscopic video coding), aiming to exploit the spatial redundancy inherent to the 3D holoscopic enhancement layer. Experimental results show that the proposed combined prediction can improve significantly the rate-distortion performance of scalable 3D holoscopic video coding with respect to the authors' previously proposed solutions, where only inter-layer or only self-similarity prediction is used.

  12. Rate adaptive multilevel coded modulation with high coding gain in intensity modulation direct detection optical communication

    Science.gov (United States)

    Xiao, Fei; Liu, Bo; Zhang, Lijia; Xin, Xiangjun; Zhang, Qi; Tian, Qinghua; Tian, Feng; Wang, Yongjun; Rao, Lan; Ullah, Rahat; Zhao, Feng; Li, Deng'ao

    2018-02-01

    A rate-adaptive multilevel coded modulation (RA-MLC) scheme based on a fixed code length, together with a corresponding decoding scheme, is proposed. The RA-MLC scheme combines multilevel coded modulation with binary linear block codes at the transmitter. Bit division, coding, optional interleaving, and modulation are carried out according to a preset rule, and the signal is then transmitted through a standard single-mode fiber span of 100 km. The receiver improves decoding accuracy by passing soft information between the different layers, which enhances the performance. Simulations are carried out for an intensity modulation-direct detection optical communication system using MATLAB®. Results show that the RA-MLC scheme can achieve a bit error rate of 1E-5 when the optical signal-to-noise ratio is 20.7 dB. It also reduces the number of decoders by 72% and realizes 22-rate adaptation without significantly increasing the computing time. The coding gain is increased by 7.3 dB at BER = 1E-3.

  13. Code Team Training: Demonstrating Adherence to AHA Guidelines During Pediatric Code Blue Activations.

    Science.gov (United States)

    Stewart, Claire; Shoemaker, Jamie; Keller-Smith, Rachel; Edmunds, Katherine; Davis, Andrew; Tegtmeyer, Ken

    2017-10-16

    Pediatric code blue activations are infrequent events with a high mortality rate despite the best effort of code teams. The best method for training these code teams is debatable; however, it is clear that training is needed to assure adherence to American Heart Association (AHA) Resuscitation Guidelines and to prevent the decay that invariably occurs after Pediatric Advanced Life Support training. The objectives of this project were to train a multidisciplinary, multidepartmental code team and to measure this team's adherence to AHA guidelines during code simulation. Multidisciplinary code team training sessions were held using high-fidelity, in situ simulation. Sessions were held several times per month. Each session was filmed and reviewed for adherence to 5 AHA guidelines: chest compression rate, ventilation rate, chest compression fraction, use of a backboard, and use of a team leader. After the first study period, modifications were made to the code team including implementation of just-in-time training and alteration of the compression team. Thirty-eight sessions were completed, with 31 eligible for video analysis. During the first study period, 1 session adhered to all AHA guidelines. During the second study period, after alteration of the code team and implementation of just-in-time training, no sessions adhered to all AHA guidelines; however, there was an improvement in percentage of sessions adhering to ventilation rate and chest compression rate and an improvement in median ventilation rate. We present a method for training a large code team drawn from multiple hospital departments and a method of assessing code team performance. Despite subjective improvement in code team positioning, communication, and role completion and some improvement in ventilation rate and chest compression rate, we failed to consistently demonstrate improvement in adherence to all guidelines.

  14. Improved SAFARI-1 research reactor irradiation position modeling in OSCAR-3 code system

    International Nuclear Information System (INIS)

    Moloko, L. E.; Belal, M. G. A. H.

    2009-01-01

    The demand on the availability of irradiation positions in the SAFARI-1 reactor is continuously increasing due to the commercial pressure to produce isotopes more efficiently. This calls for calculational techniques and modeling methods to be improved regularly to optimize irradiation services. The irradiation position models are improved using the OSCAR-3 code system, and results are compared to experimental measurements. It is concluded that the irradiation position models are essential if realistic core follow and reload studies are to be performed and most importantly, for the realization of improved agreement between experimental data and calculated results. (authors)

  15. COSINE software development based on code generation technology

    International Nuclear Information System (INIS)

    Ren Hao; Mo Wentao; Liu Shuo; Zhao Guang

    2013-01-01

    Code generation technology can significantly improve the quality and productivity of software development and reduce software development risk. At present, code generators are usually based on UML model-driven technology, which cannot satisfy the development demands of nuclear power calculation software. The features of scientific computing programs were analyzed, and a FORTRAN code generator (FCG) based on C# was developed in this paper. FCG can automatically generate FORTRAN code for module variable definitions according to input metadata. FCG can also generate memory allocation interfaces for dynamic variables as well as data access interfaces. FCG was applied to the development of the core and system integrated engine for design and analysis (COSINE) software. The results show that FCG can greatly improve the development efficiency of nuclear power calculation software and reduce the defect rate of software development. (authors)
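
    As a rough sketch of the metadata-driven generation idea described above (the actual FCG is written in C# and its metadata format is not given in the record, so the field names and output layout below are hypothetical), a few lines of Python can emit FORTRAN module variable declarations and a matching allocation routine from a variable list:

```python
# Hypothetical metadata: name, FORTRAN type, and rank (0 = scalar) of each module variable.
variables = [
    {"name": "power", "type": "real(8)", "rank": 1},
    {"name": "n_node", "type": "integer", "rank": 0},
    {"name": "temp", "type": "real(8)", "rank": 2},
]

def generate_module(mod_name, variables):
    """Emit a FORTRAN module with variable declarations and an allocation subroutine."""
    lines = [f"module {mod_name}", "  implicit none"]
    for v in variables:
        dims = "" if v["rank"] == 0 else \
            ", allocatable, dimension(" + ",".join([":"] * v["rank"]) + ")"
        lines.append(f"  {v['type']}{dims} :: {v['name']}")
    lines += ["contains", f"  subroutine alloc_{mod_name}(n)", "    integer, intent(in) :: n"]
    for v in variables:
        if v["rank"] > 0:
            shape = ",".join(["n"] * v["rank"])
            lines.append(f"    allocate({v['name']}({shape}))")
    lines += [f"  end subroutine alloc_{mod_name}", f"end module {mod_name}"]
    return "\n".join(lines)

print(generate_module("core_data", variables))
```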

  16. Validation and Improvement of the FEMAXI-JNES Code by Using PIE Data at Extended Burnup. Final Report for FUMEX-III

    International Nuclear Information System (INIS)

    Hirose, Tsutomu; Miura, Hiromichi; Kitamura, Toshiya; Kamimura, Katsuichiro

    2013-01-01

    The Japan Nuclear Energy Safety Organization (JNES) has participated in the IAEA FUMEX-III Coordinated Research Project (CRP) on the Improvement of Computer Codes Used for Fuel Behaviour Simulation for the following purposes: 1. To cooperate with member states and exchange information and expertise for the understanding and improvement of fuel modelling; 2. To develop and improve the FEMAXI-JNES code as an audit code for the Japanese safety licensing review of fuel rod design, especially for high-burnup fuel and MOX fuel; 3. To set the standard models for the FEMAXI-JNES code to provide best-estimate predictions of the thermal and mechanical performance of LWR fuel rods. This is JNES's final report for the FUMEX-III CRP. During the period of the CRP, JNES modified the pellet swelling and fission gas release models, and demonstrated the predictive capability for fuel centerline temperature, fission gas release, fuel rod internal gas pressure, cladding diametral deformation and cladding elongation by comparing integral code predictions of these parameters with experimental (measured) data from the OECD/NEA IFPE database. (author)

  17. Implementation, evaluation and improvement of the diffusion code package developed by the RISØ Research Center

    International Nuclear Information System (INIS)

    Koide, M.C.M.

    1983-01-01

    The evaluation and improvement of the diffusion code package developed by the RISØ Research Center of Denmark have been performed. The improvements made to the package consisted of improving the presentation of its manuals. In order to reduce the processing time of the codes, an analytical boundary condition capable of representing the effects of the baffle and the reflector on the flux distribution has been calculated. This boundary condition was obtained for a one-dimensional medium in the framework of two-group diffusion theory. The results showed that the application of this boundary condition produces very accurate results and an appreciable saving of processing time. (author) [pt

  18. Does a colour-coded blood pressure diary improve blood pressure control for patients in general practice: The CoCo trial

    Directory of Open Access Journals (Sweden)

    Senn Oliver

    2010-04-01

    Full Text Available Abstract Background Insufficient blood pressure control is a frequent problem despite the existence of effective treatment. Insufficient adherence to self-monitoring as well as to therapy is a common reason. Blood pressure self-measurement at home (Home Blood Pressure Measurement, HBPM) has positive effects on treatment adherence and is helpful in achieving the target blood pressure. Only a few studies have investigated whether adherence to HBPM can be improved through simple measures resulting also in better blood pressure control. Objective Improvement of self-monitoring and improved blood pressure control by using a new colour-coded blood pressure diary. Outcome Primary outcome: Change in systolic and/or diastolic blood pressure 6 months after using the new colour-coded blood pressure diary. Secondary outcome: Adherence to blood pressure self-measurement (number of measurements/entries). Methods/Design Randomised controlled study. Population: 138 adult patients in primary care with uncontrolled hypertension despite therapy. The control group uses a conventional blood pressure diary; the intervention group uses the new colour-coded blood pressure diary (green, yellow, red according to a traffic light system). Expected results/conclusion The visual separation and entries in three colour-coded areas reflecting risk (green: blood pressure in the target range ≤ 140/≤ 90 mmHg; yellow: blood pressure >140/>90 mmHg; red: blood pressure in the danger zone > 180 mmHg/>110 mmHg) lead to better self-monitoring compared with the conventional (non-colour-coded) blood pressure booklet. The colour-coded, visualised information supports improved perception (awareness and interpretation) of blood pressure and triggers correct behaviour, in the sense of improved adherence to the recommended treatment as well as better communication between patients and doctors, resulting in improved blood pressure control. Trial registration ClinicalTrials.gov ID NCT01013467

  19. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    Science.gov (United States)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest high efficiency video coding (HEVC) standard significantly increases the encoding complexity for improving its coding efficiency. Due to the limited computational capability of handheld devices, complexity constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Considering the direct proportionality between encoding time and computational complexity, the computational complexity is measured in terms of encoding time. First, complexity is mapped to a target in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, the optimal mode combination scheme that is chosen through offline statistics is developed at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (as low as 10%) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, an average gain of 0.63 and 0.17 dB in BDPSNR is observed for 18 sequences when the target complexity is around 40%.
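
    The budget-tracking idea behind such complexity control can be illustrated with a toy controller (this is only a sketch of mapping a time budget to per-block mode subsets; the mode sets, costs, and decision rule below are made up and are not those of the paper):

```python
# Hypothetical mode subsets with made-up relative per-block encoding costs.
AVG_COST = {"fast": 0.2, "medium": 0.6, "full": 1.0}

def pick_mode_set(budget, spent, blocks_left):
    """Choose the richest mode subset whose projected total time stays within the budget."""
    remaining = budget - spent
    for mode_set in ("full", "medium"):
        if blocks_left * AVG_COST[mode_set] <= remaining:
            return mode_set
    return "fast"

# Example: 100 blocks, target complexity of 40% of the full-search encoding time.
budget, spent = 0.4 * 100 * AVG_COST["full"], 0.0
for blocks_left in range(100, 0, -1):
    mode_set = pick_mode_set(budget, spent, blocks_left)
    spent += AVG_COST[mode_set]                         # pretend the block was encoded
print(f"spent {spent:.1f} of budget {budget:.1f} with adaptive mode selection")
```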

  20. An improved version of the HULLAC code

    Energy Technology Data Exchange (ETDEWEB)

    Busquet, M.; Bar-Shalom, A.; Klapisch, M.; Oreg, J. [ARTEP (a contractor to the Naval Research Lab.), Washington, DC (United States)]

    2006-06-15

    Accurate and detailed atomic structure codes are needed for simulation of spectrally resolved X-ray output of laser driven target. As such, the HULLAC code has already been presented several times. First of all, an overhaul was performed, modernizing many parts to make them easier to understand and adding many comments. The source, in Fortran-77, was compiled and checked on many different systems with different compilers. In the new version, we have added the possibility to directly compute the relativistic configuration averages, skipping the fine structure. However in this case configuration interactions can be accounted for only within each non-relativistic configuration. Therefore we added the possibility of a mixed description, where not all configurations are described at the fine structure level. Recently, cooperation was proposed to anyone interested in extending or developing the code. HULLAC is now ready to be distributed on a basis of collaboration.

  1. An improved version of the HULLAC code

    International Nuclear Information System (INIS)

    Busquet, M.; Bar-Shalom, A.; Klapisch, M.; Oreg, J.

    2006-01-01

    Accurate and detailed atomic structure codes are needed for simulation of spectrally resolved X-ray output of laser driven target. As such, the HULLAC code has already been presented several times. First of all, an overhaul was performed, modernizing many parts to make them easier to understand and adding many comments. The source, in Fortran-77, was compiled and checked on many different systems with different compilers. In the new version, we have added the possibility to directly compute the relativistic configuration averages, skipping the fine structure. However in this case configuration interactions can be accounted for only within each non-relativistic configuration. Therefore we added the possibility of a mixed description, where not all configurations are described at the fine structure level. Recently, cooperation was proposed to anyone interested in extending or developing the code. HULLAC is now ready to be distributed on a basis of collaboration

  2. New quantum codes constructed from quaternary BCH codes

    Science.gov (United States)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, new families of QECCs are obtained, and the constructed QECCs have larger distances than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  3. Improving the accuracy of operation coding in surgical discharge summaries

    Science.gov (United States)

    Martinou, Eirini; Shouls, Genevieve; Betambeau, Nadine

    2014-01-01

    Procedural coding in surgical discharge summaries is extremely important; as well as communicating to healthcare staff which procedures have been performed, it also provides information that is used by the hospital's coding department. The OPCS code (Office of Population, Censuses and Surveys Classification of Surgical Operations and Procedures) is used to generate the tariff that allows the hospital to be reimbursed for the procedure. We felt that the OPCS coding on discharge summaries was often incorrect within our breast and endocrine surgery department. A baseline measurement over two months demonstrated that 32% of operations had been incorrectly coded, resulting in an incorrect tariff being applied and an estimated loss to the Trust of £17,000. We developed a simple but specific OPCS coding table in collaboration with the clinical coding team and breast surgeons that summarised all operations performed within our department. This table was disseminated across the team, specifically to the junior doctors who most frequently complete the discharge summaries. Re-audit showed 100% of operations were accurately coded, demonstrating the effectiveness of the coding table. We suggest that specifically designed coding tables be introduced across each surgical department to ensure accurate OPCS codes are used to produce better quality surgical discharge summaries and to ensure correct reimbursement to the Trust. PMID:26734286

  4. The role of the PIRT process in identifying code improvements and executing code development

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1997-01-01

    In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a low probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications

  5. Improved Intra-coding Methods for H.264/AVC

    Directory of Open Access Journals (Sweden)

    Li Song

    2009-01-01

    Full Text Available The H.264/AVC design adopts a multidirectional spatial prediction model to reduce spatial redundancy, where neighboring pixels are used as a prediction for the samples in a data block to be encoded. In this paper, a recursive prediction scheme and an enhanced block-matching algorithm (BMA) prediction scheme are designed and integrated into the state-of-the-art H.264/AVC framework to provide a new intra coding model. Extensive experiments demonstrate that the coding efficiency can be increased on average by 0.27 dB in comparison with the conventional H.264 coding model.
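
    The multidirectional spatial prediction that the record refers to can be illustrated with a minimal sketch of three basic H.264-style intra modes (vertical, horizontal, DC) for a 4x4 block, choosing the mode with the smallest sum of absolute differences; the recursive and BMA schemes proposed in the paper are not reproduced here.

```python
import numpy as np

def intra_predict_4x4(top, left):
    """Return three basic intra predictions for a 4x4 block from its reconstructed neighbours."""
    vertical = np.tile(top, (4, 1))                          # copy the row above downwards
    horizontal = np.tile(left.reshape(4, 1), (1, 4))         # copy the left column rightwards
    dc = np.full((4, 4), (top.sum() + left.sum() + 4) // 8)  # rounded mean of the neighbours
    return {"V": vertical, "H": horizontal, "DC": dc}

def best_mode(block, top, left):
    """Pick the prediction mode with the smallest sum of absolute differences (SAD)."""
    preds = intra_predict_4x4(top, left)
    return min(preds, key=lambda m: np.abs(block - preds[m]).sum())

block = np.array([[52, 55, 61, 66]] * 4)      # block with strong vertical structure
top = np.array([52, 55, 61, 66])
left = np.array([70, 80, 90, 100])
print(best_mode(block, top, left))            # prints "V"
```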

  6. MICROX-2: an improved two-region flux spectrum code for the efficient calculation of group cross sections

    International Nuclear Information System (INIS)

    Mathews, D.; Koch, P.

    1979-12-01

    The MICROX-2 code is an improved version of the MICROX code. The improvements allow MICROX-2 to be used for the efficient and rigorous preparation of broad group neutron cross sections for poorly moderated systems such as fast breeder reactors in addition to the well moderated thermal reactors for which MICROX was designed. MICROX-2 is an integral transport theory code which solves the neutron slowing down and thermalization equations on a detailed energy grid for two-region lattice cells. The fluxes in the two regions are coupled by transport corrected collision probabilities. The inner region may include two different types of grains (particles). Neutron leakage effects are treated by performing B1 slowing down and P0 plus DB2 thermalization calculations in each region. Cell averaged diffusion coefficients are prepared with the Benoist cell homogenization prescription.

  7. The impact of three discharge coding methods on the accuracy of diagnostic coding and hospital reimbursement for inpatient medical care.

    Science.gov (United States)

    Tsopra, Rosy; Peckham, Daniel; Beirne, Paul; Rodger, Kirsty; Callister, Matthew; White, Helen; Jais, Jean-Philippe; Ghosh, Dipansu; Whitaker, Paul; Clifton, Ian J; Wyatt, Jeremy C

    2018-07-01

    Coding of diagnoses is important for patient care, hospital management and research. However, coding accuracy is often poor and may reflect the methods of coding. This study investigates the impact of three alternative coding methods on the inaccuracy of diagnosis codes and hospital reimbursement. Comparisons of coding inaccuracy were made between a list of coded diagnoses obtained by a coder using (i) the discharge summary alone, (ii) case notes and discharge summary, and (iii) discharge summary with the addition of medical input. For each method, inaccuracy was determined for the primary and secondary diagnoses, the Healthcare Resource Group (HRG) and the estimated hospital reimbursement. These data were then compared with a gold standard derived by a consultant and a coder. 107 consecutive patient discharges were analysed. Inaccuracy of diagnosis codes was highest when a coder used the discharge summary alone, and decreased significantly when the coder used the case notes (70% vs 58% respectively, p …) or coded from the discharge summary with medical support (70% vs 60% respectively, p …); the corresponding figure was … for coding with case notes, and 35% for coding with medical support. The three coding methods resulted in an annual estimated loss of hospital remuneration of between £1.8 M and £16.5 M. The accuracy of diagnosis codes and the percentage of correct HRGs improved when coders used either case notes or medical support in addition to the discharge summary. Further emphasis needs to be placed on improving the standard of information recorded in discharge summaries. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. FISP5 - an extended and improved version of the fission product inventory code FISP

    International Nuclear Information System (INIS)

    Tobias, A.

    1978-05-01

    In order to accommodate the UKFPDD-1 fission product data library, the CEGB fission product inventory code FISP4 has been modified and extended. The opportunity was taken to revise the algorithm used for calculating the nuclide concentrations during irradiation, in order to reduce the problem of rounding errors which arise as a result of the computer's limitation to a finite word length. The resulting code, FISP5, is also shown to offer a considerable improvement in execution time in comparison with FISP4. Details of the revised algorithm are given, together with a brief users' guide to FISP5. (author)

  9. Numerical method improvement for a subchannel code

    Energy Technology Data Exchange (ETDEWEB)

    Ding, W.J.; Gou, J.L.; Shan, J.Q. [Xi' an Jiaotong Univ., Shaanxi (China). School of Nuclear Science and Technology

    2016-07-15

    Previous studies showed that subchannel codes spend most of their CPU time solving the matrix formed by the conservation equations. Traditional matrix solution methods such as Gaussian elimination and Gauss-Seidel iteration cannot meet the computational efficiency requirement. Therefore, a new algorithm for solving the block penta-diagonal matrix is designed based on Stone's incomplete LU (ILU) decomposition method. In the new algorithm, the original block penta-diagonal matrix is decomposed into a block upper triangular matrix and a block lower triangular matrix as well as a nonzero small matrix. After that, the LU algorithm is applied to solve the matrix until convergence. In order to compare the computational efficiency, the newly designed algorithm is applied to the ATHAS code in this paper. The calculation results show that more than 80 % of the total CPU time can be saved with the new ILU algorithm for a 324-channel PWR assembly problem, compared with the original ATHAS code.
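
    The record describes replacing direct elimination with an incomplete-LU-based solve of the block penta-diagonal matrix of the conservation equations. A rough scalar analogue (not the ATHAS implementation; it uses SciPy's generic spilu/GMRES on a synthetic penta-diagonal matrix rather than the custom block algorithm) looks like this:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spilu, gmres, LinearOperator

n = 2000
# Diagonally dominant penta-diagonal test matrix, standing in for the (block-structured)
# subchannel conservation-equation matrix.
diagonals = [np.full(n - 2, -1.0), np.full(n - 1, -2.0), np.full(n, 8.0),
             np.full(n - 1, -2.0), np.full(n - 2, -1.0)]
A = sp.diags(diagonals, offsets=[-2, -1, 0, 1, 2], format="csc")
b = np.ones(n)

# Incomplete LU factorization used as a preconditioner for an iterative (GMRES) solve.
ilu = spilu(A, drop_tol=1e-5)
M = LinearOperator(A.shape, matvec=ilu.solve)

x, info = gmres(A, b, M=M)
assert info == 0                               # 0 means the iteration converged
print("residual norm:", np.linalg.norm(A @ x - b))
```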

  10. Improved choked flow model for MARS code

    International Nuclear Information System (INIS)

    Chung, Moon Sun; Lee, Won Jae; Ha, Kwi Seok; Hwang, Moon Kyu

    2002-01-01

    The choked flow calculation is improved by using a new sound speed criterion for bubbly flow that is derived by the characteristic analysis of a hyperbolic two-fluid model. This model is based on the notion of surface tension for the interfacial pressure jump terms in the momentum equations. The real eigenvalues obtained as the closed-form solution of the characteristic polynomial represent the sound speed in the bubbly flow regime, which agrees well with the existing experimental data. The present sound speed shows more reasonable results in the extreme cases than Nguyen's model does. The choked flow criterion derived from the present sound speed is employed in the MARS code and assessed by using the Marviken choked flow tests. The assessment results, obtained without any adjustment by discharge coefficients, demonstrate more accurate predictions of the choked flow rate in the bubbly flow regime than the earlier choked flow calculations. By calculating a typical PWR SBLOCA problem, we confirm that the present model can reproduce reasonable transients of an integral reactor system.

  11. Improved inter-assembly heat transfer modeling under low flow conditions for the Super System Code (SSC)

    International Nuclear Information System (INIS)

    Horak, W.C.; Guppy, J.G.

    1984-01-01

    The Super System Code (SSC) was developed at the Brookhaven National Laboratory (BNL) for the thermal hydraulic analysis of natural circulation transients, operational transients, and other system wide transients in nuclear power plants. SSC is a generic, best estimate code that models the in-vessel components, heat transport loops, plant protection systems and plant control systems. SSC also simulates the balance of plant when interfaced with the MINET code. SSC has been validated against both numerical and experimental data bases and is now used by several outside users. An important area of interest in LMFBR transient analysis is the prediction of the response of the reactor core under low flow conditions, such as experienced during a natural circulation event. Under these circumstances there are many physical phenomena which must be modeled to provide an adequate representation by a computer code simulation. The present version of SSC contains numerous models which account for most of the major phenomena. However, one area where the present model in SSC is being improved is in the representation of heat transfer and buoyancy effects under low flow operation. To properly improve the present version, the addition of models to represent certain inter-assembly effects is required

  12. SWAT2: The improved SWAT code system by incorporating the continuous energy Monte Carlo code MVP

    International Nuclear Information System (INIS)

    Mochizuki, Hiroki; Suyama, Kenya; Okuno, Hiroshi

    2003-01-01

    SWAT is a code system which performs burnup calculations by combining the neutronics calculation code SRAC95 with the one-group burnup calculation code ORIGEN2.1. The SWAT code system can deal with the cell geometries available in SRAC95. However, a precise treatment of resonance absorption by the SRAC95 code using the ultra-fine group cross section library is not directly applicable to two- or three-dimensional geometry models, because of restrictions in SRAC95. To overcome this problem, SWAT2, which newly introduces the continuous energy Monte Carlo code MVP into SWAT, was developed. Thereby, burnup calculations with continuous energy treatment in any geometry became possible. Moreover, using the 147-group cross section library called the SWAT library, reactions which are not dealt with by SRAC95 and MVP can be treated. The OECD/NEA burnup credit criticality safety benchmark problems Phase-IB (PWR, a single pin cell model) and Phase-IIIB (BWR, fuel assembly model) were calculated as a verification of SWAT2, and the results were compared with the average values of the burnup calculation results submitted by each participating organization. Through the two benchmark problems, it was confirmed that SWAT2 is applicable to burnup calculations of complicated geometries. (author)

  13. Improvements to the National Transport Code Collaboration Data Server

    Science.gov (United States)

    Alexander, David A.

    2001-10-01

    The data server of the National Transport Code Collaboration Project provides a universal network interface to interpolated or raw transport data accessible by a universal set of names. Data can be acquired from a local copy of the International Multi-Tokamak (ITER) profile database as well as from TRANSP trees of MDS Plus data systems on the net. Data are provided to the user's network client via a CORBA interface, thus providing stateful data server instances, which have the advantage of remembering the desired interpolation, data set, etc. This paper will review the status of the data server and discuss the recent improvements made to it, such as the modularization of the data server and the addition of hdf5 and MDS Plus data file writing capability.

  14. An improved version of the MICROX-2 code

    Energy Technology Data Exchange (ETDEWEB)

    Mathews, D. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-11-01

    The MICROX-2 code prepares broad group neutron cross sections for use in diffusion- and/or transport-theory codes from an input library of fine group and pointwise cross sections. The neutron weighting spectrum is obtained by solving the B1 neutron balance equations at about 10000 energies in a one-dimensional (planar, spherical or cylindrical), two-region unit cell. The regions are coupled by collision probabilities based upon spatially flat neutron emission. Energy dependent Dancoff factors and bucklings correct the one-dimensional calculations for multi-dimensional lattice effects. A critical buckling search option is also included. The inner region may include two different types of fuel particles (grains). This report describes the present PSI FORTRAN 90 version of the MICROX-2 code which operates on CRAY computers and IBM PCs. The equations which are solved in the various energy ranges are given along with descriptions of various changes that have been made in the present PSI version of the code. A completely re-written description of the user input is also included. (author) 7 figs., 4 tabs., 59 refs.

  15. Feedback in LT codes for prioritized and non-prioritized data

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Popovski, Petar; Østergaard, Jan

    2012-01-01

    In this paper, feedback in LT codes is investigated. The considered type of feedback is acknowledgments, where information on which symbols have been decoded is given to the transmitter. Our analysis reveals that acknowledgments have very low potential in LT codes with standard degree distributions. Motivated by this, we analyze the impact of acknowledgments on multi-layer LT codes. In this case, feedback proves advantageous. By using only a single feedback message, it is possible to achieve a significant performance improvement compared to traditional LT codes.
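
    The effect of acknowledgment feedback on LT encoding can be sketched as follows (a toy with an arbitrary degree distribution, not the multi-layer construction analyzed in the paper): once the receiver acknowledges decoded input symbols, the transmitter excludes them from newly generated encoded symbols.

```python
import random

# Toy degree distribution (not the robust soliton distribution used in practice).
DEGREES, WEIGHTS = [1, 2, 3, 4], [0.2, 0.4, 0.3, 0.1]

def lt_encode_symbol(data, acked, rng):
    """Produce one LT-encoded symbol as the XOR of 'degree' random input symbols,
    excluding inputs the receiver has already acknowledged as decoded."""
    pool = [i for i in range(len(data)) if i not in acked] or list(range(len(data)))
    degree = min(rng.choices(DEGREES, weights=WEIGHTS)[0], len(pool))
    neighbors = rng.sample(pool, degree)
    value = 0
    for i in neighbors:
        value ^= data[i]
    return neighbors, value

rng = random.Random(1)
data = [rng.randrange(256) for _ in range(20)]   # 20 one-byte input symbols
acked = {0, 1, 2, 3, 4}                          # inputs reported as decoded via feedback
neighbors, value = lt_encode_symbol(data, acked, rng)
print("encoded over inputs", neighbors)          # none of 0..4 should appear
```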

  16. The role of the PIRT process in identifying code improvements and executing code development

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G.E. [Idaho National Engineering Lab., Idaho Falls, ID (United States); Boyack, B.E. [Los Alamos National Lab., NM (United States)

    1997-07-01

    In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a "low" probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications.

  17. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate (LDPCA) codes in a DSC scheme with feedback. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  18. Vectorization and improvement of nuclear codes (MEUDAS4, FORCE, STREAM V2.6, HEATING7-VP, SCDAP/RELAP5/MOD2.5, NBI3DGFN)

    International Nuclear Information System (INIS)

    Nemoto, Toshiyuki; Suzuki, Koichiro; Isobe, Nobuo; Machida, Masahiko; Osanai, Seiji; Yokokawa, Mitsuo

    1992-09-01

    Eight nuclear codes have been vectorized and modified to improve their performance. These codes are magnetic fluid equilibrium code MEUDAS4 (CR and FFT versions), the magnetic field analysis code FORCE, the three-dimensional heat fluid analysis code STREAM V2.6, the three-dimensional heat analysis code HEATING 7-VP, the severe accident transient analysis code SCDAP/RELAP 5/MOD 2.5 for light water reactors, the ion beam orbital analysis code NBI3DGFN, and a free electron laser analysis code. The speedup ratios of the vectorized versions to the original ones in scalar mode are 2.3-4.9, 1.9-5.4, 2.6-6.2, and 1.9 for the MEUDAS4, STREAM, FORCE, and free electron laser analysis code, respectively. The definition method of the computational regions in the HEATING7-VP is improved. The SCDAP/RELAP5/MOD2.5 is modified to use extended memory regions of the computer. In this report, outlines of the codes, techniques used in the vectorization and reorganization of the codes, verification of computed results, and improvement on the performance are presented. (author)

  19. Development and using computer codes for improvement of defect assembly detection on Russian WWER NPPs

    International Nuclear Information System (INIS)

    Likhanskii, V.; Evdokimov, I.; Zborovskii, V.; Kanukova, V.; Sorokin, A.; Taran, M.; Ugrumov, A.; Riabinin, Y.

    2009-01-01

    Diagnostic methods of fuel failure detection for improving radiation safety and shortening fuel reload time at Russian WWERs are currently in development. The work includes the creation of new computer tools to increase the effectiveness of fuel monitoring and the reliability of leakage tests. The reliability of failure detection can be noticeably improved when an integrated approach including the following methods is applied. The first is fuel failure analysis under operating conditions. The analysis is performed with the pilot version of the expert system, which has been developed on the basis of the mechanistic code RTOP-CA. The second stage of failure monitoring is 'sipping' tests in the mast of the refueling machine. The leakage tests are the final stage of failure monitoring. A new technique with pressure cycling in specialized casks was introduced to meet the requirements of higher reliability in the detection/confirmation of leakages. Measurements of the activity release kinetics during the pressure cycling and handling of the acquired data with the RTOP-LT code make it possible to evaluate the defect size in a leaking fuel assembly. The mechanistic codes RTOP-CA and RTOP-LT were verified against specialized experimental data, and the codes have been certified by the Russian regulatory authority Rostechnadzor. The pressure cycling method in specialized casks now has official status and is utilized at all Russian WWER units. Some results of the application of the integrated approach to fuel failure monitoring at several Russian NPPs with WWER units are reported in the present paper. Predictions of the current version of the expert system are compared with the results of the leakage tests and with the estimates of the defect size obtained by the pressure cycling technique. Using the RTOP-CA code, the activity level is assessed for the following fuel campaign if a leaking fuel assembly is to be reloaded into the core. A project of the automated computer system on the basis of

  20. Efficient convolutional sparse coding

    Science.gov (United States)

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M^3 N) to O(M N log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
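
    The efficiency claim rests on the fact that, in the frequency domain, convolution becomes element-wise multiplication, so the main linear system decouples. The sketch below shows that frequency-domain solve for a single filter and a simple quadratic penalty only; the full ADMM iteration with the sparsity term and a multi-filter dictionary, as described in the record, is considerably more involved.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256
s = rng.standard_normal(N)                 # signal
d = np.zeros(N)
d[:8] = rng.standard_normal(8)             # one short dictionary filter, zero-padded to length N

rho = 0.1
D, S = np.fft.fft(d), np.fft.fft(s)

# argmin_x 0.5*||d (*) x - s||^2 + 0.5*rho*||x||^2 with circular convolution (*),
# solved element-wise in the Fourier domain in O(N log N).
X = np.conj(D) * S / (np.abs(D) ** 2 + rho)
x = np.real(np.fft.ifft(X))

# Check against the equivalent dense normal equations (O(N^3)).
C = np.array([np.roll(d, k) for k in range(N)]).T    # circular convolution matrix
x_dense = np.linalg.solve(C.T @ C + rho * np.eye(N), C.T @ s)
print(np.allclose(x, x_dense))                       # True
```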

  1. Significant issues and changes for ANSI/ASME OM-1 1981, part 1, ASME OMc code-1994, and ASME OM Code-1995, Appendix I, inservice testing of pressure relief devices in light water reactor power plants

    Energy Technology Data Exchange (ETDEWEB)

    Seniuk, P.J.

    1996-12-01

    This paper identifies significant changes to the ANSI/ASME OM-1 1981, Part 1, and ASME OMc Code-1994 and ASME OM Code-1995, Appendix I, "Inservice Testing of Pressure Relief Devices in Light-Water Reactor Power Plants". The paper describes changes to different Code editions and presents insights into the direction of the code committee and selected topics to be considered by the ASME O&M Working Group on pressure relief devices. These topics include scope issues, thermal relief valve issues, as-found and as-left set-pressure determinations, exclusions from testing, and cold setpoint bench testing. The purpose of this paper is to describe some significant issues being addressed by the O&M Working Group on Pressure Relief Devices (OM-1). The writer is currently the chair of OM-1, and the statements expressed herein represent his personal opinion.

  2. Significant issues and changes for ANSI/ASME OM-1 1981, part 1, ASME OMc code-1994, and ASME OM Code-1995, Appendix I, inservice testing of pressure relief devices in light water reactor power plants

    International Nuclear Information System (INIS)

    Seniuk, P.J.

    1996-01-01

    This paper identifies significant changes to the ANSI/ASME OM-1 1981, Part 1, and ASME OMc Code-1994 and ASME OM Code-1995, Appendix I, "Inservice Testing of Pressure Relief Devices in Light-Water Reactor Power Plants". The paper describes changes to different Code editions and presents insights into the direction of the code committee and selected topics to be considered by the ASME O&M Working Group on pressure relief devices. These topics include scope issues, thermal relief valve issues, as-found and as-left set-pressure determinations, exclusions from testing, and cold setpoint bench testing. The purpose of this paper is to describe some significant issues being addressed by the O&M Working Group on Pressure Relief Devices (OM-1). The writer is currently the chair of OM-1, and the statements expressed herein represent his personal opinion.

  3. Streamlining of the RELAP5-3D Code

    International Nuclear Information System (INIS)

    Mesina, George L; Hykes, Joshua; Guillen, Donna Post

    2007-01-01

    RELAP5-3D is widely used by the nuclear community to simulate general thermal hydraulic systems and has proven to be so versatile that the spectrum of transient two-phase problems that can be analyzed has increased substantially over time. To accommodate the many new types of problems that are analyzed by RELAP5-3D, both the physics and numerical methods of the code have been continuously improved. In the area of computational methods and mathematical techniques, many upgrades and improvements have been made to decrease code run time and increase solution accuracy. These include vectorization, parallelization, use of improved equation solvers for thermal hydraulics and neutron kinetics, and incorporation of improved library utilities. In the area of applied nuclear engineering, expanded capabilities include boron and level tracking models, a radiation/conduction enclosure model, feedwater heater and compressor components, fluids and corresponding correlations for modeling Generation IV reactor designs, and coupling to computational fluid dynamics solvers. Ongoing and proposed future developments include improvements to the two-phase pump model, conversion to FORTRAN 90, and coupling to more computer programs. This paper summarizes the general improvements made to RELAP5-3D, with an emphasis on streamlining the code infrastructure for improved maintenance and development. With all these past, present and planned developments, it is necessary to modify the code infrastructure to incorporate modifications in a consistent and maintainable manner. Modifying a complex code such as RELAP5-3D to incorporate new models, upgrade numerics, and optimize existing code becomes more difficult as the code grows larger. The difficulty of this as well as the chance of introducing errors is significantly reduced when the code is structured. To streamline the code into a structured program, a commercial restructuring tool, FOR_STRUCT, was applied to the RELAP5-3D source files. The

  4. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  5. Probabilistic evaluations for CANTUP computer code analysis improvement

    International Nuclear Information System (INIS)

    Florea, S.; Pavelescu, M.

    2004-01-01

    Structural analysis with the finite element method is today a usual way to evaluate and predict the behavior of structural assemblies subject to harsh conditions in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working in harsh conditions, in which, besides corrosive and thermal aggression, long-term irradiation interferes, with implicit consequences for the evolution of material properties. That inevitably leads to scatter in the time-dependent material properties, whose dynamic evolution is subject to a great degree of uncertainty. These are the reasons for developing, in association with deterministic evaluations with computer codes, probabilistic and statistical methods in order to predict the structural component response. This work initiates the extension of the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting with the deterministic analysis performed with the CANTUP computer code, a code developed to predict the long-term mechanical behavior of the pressure tube - calandria tube assembly. For this purpose the structure of the deterministic CANTUP computer code has been reviewed. The code has been adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran Power Station platform. In order to perform probabilistic evaluations, a part was added to the deterministic code which, using a subroutine from the IMSL library of Microsoft Developer Studio - Fortran Power Station, generates pseudo-random values of a specified quantity. A normal distribution around the deterministic value with a 5% standard deviation was simulated for the Young's modulus material property in order to verify the statistical calculation of the creep behavior. The tube deflection and effective stresses were the properties subject to probabilistic evaluation. All the values of these properties obtained for all the values for
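
    The sampling step described above can be sketched generically (the CANTUP creep and deflection models are not public in this record, so a textbook beam-deflection formula and an illustrative nominal modulus stand in for them):

```python
import numpy as np

rng = np.random.default_rng(42)

E_nominal = 97e9          # illustrative nominal Young's modulus, Pa
sigma = 0.05 * E_nominal  # 5% standard deviation, as in the record
n_samples = 10_000
E_samples = rng.normal(E_nominal, sigma, n_samples)

def deflection(E, load=1.0e4, length=6.0, inertia=1.0e-5):
    """Stand-in structural response (mid-span deflection of a uniformly loaded beam);
    the real evaluation would call the CANTUP creep/deflection model."""
    return 5 * load * length**4 / (384 * E * inertia)

w = deflection(E_samples)
print(f"mean deflection {w.mean():.4e} m, std {w.std():.4e} m "
      f"({100 * w.std() / w.mean():.1f}% scatter)")
```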

  6. A Method for Improving the Progressive Image Coding Algorithms

    Directory of Open Access Journals (Sweden)

    Ovidiu COSMA

    2014-12-01

    Full Text Available This article presents a method for increasing the performance of the progressive coding algorithms for the subbands of images, by representing the coefficients with a code that reduces the truncation error.

  7. Resistor-logic demultiplexers for nanoelectronics based on constant-weight codes.

    Science.gov (United States)

    Kuekes, Philip J; Robinett, Warren; Roth, Ron M; Seroussi, Gadiel; Snider, Gregory S; Stanley Williams, R

    2006-02-28

    The voltage margin of a resistor-logic demultiplexer can be improved significantly by basing its connection pattern on a constant-weight code. Each distinct code determines a unique demultiplexer, and therefore a large family of circuits is defined. We consider using these demultiplexers for building nanoscale crossbar memories, and determine the voltage margin of the memory system based on a particular code. We determine a purely code-theoretic criterion for selecting codes that will yield memories with large voltage margins, which is to minimize the ratio of the maximum to the minimum Hamming distance between distinct codewords. For the specific example of a 64 × 64 crossbar, we discuss what codes provide optimal performance for a memory.
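
    The selection criterion described in the record (minimize the ratio of the maximum to the minimum Hamming distance between distinct codewords) is easy to evaluate exhaustively for small constant-weight codes; the sketch below enumerates all weight-w words of length n and computes that ratio for the full code, leaving the search over codeword subsets to the reader.

```python
from itertools import combinations

def constant_weight_words(n, w):
    """All binary words of length n and Hamming weight w, as frozensets of 1-positions."""
    return [frozenset(c) for c in combinations(range(n), w)]

def distance_ratio(code):
    """Ratio of maximum to minimum pairwise Hamming distance (smaller is better
    under the selection criterion in the record)."""
    dists = [len(a ^ b) for a, b in combinations(code, 2)]   # |symmetric difference| = Hamming distance
    return max(dists) / min(dists)

for n, w in [(6, 3), (8, 4)]:
    code = constant_weight_words(n, w)
    print(f"n={n}, w={w}: {len(code)} codewords, max/min distance ratio = {distance_ratio(code):.2f}")
```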

  8. Advanced codes and methods supporting improved fuel cycle economics - 5493

    International Nuclear Information System (INIS)

    Curca-Tivig, F.; Maupin, K.; Thareau, S.

    2015-01-01

    AREVA's code development program was practically completed in 2014. The basic codes supporting a new generation of advanced methods are the following. GALILEO is a state-of-the-art fuel rod performance code for PWR and BWR applications. Development is complete, and implementation has started in France and the U.S.A. ARCADIA-1 is a state-of-the-art neutronics/ thermal-hydraulics/ thermal-mechanics code system for PWR applications. Development is complete, and implementation has started in Europe and in the U.S.A. The system thermal-hydraulic codes S-RELAP5 and CATHARE-2 are not really new but still state-of-the-art in the domain. S-RELAP5 was completely restructured and re-coded such that its life cycle increases by further decades. CATHARE-2 will be replaced in the future by the new CATHARE-3. The new AREVA codes and methods are largely based on first principles modeling with an extremely broad international verification and validation data base. This enables AREVA and its customers to access more predictable licensing processes in a fast evolving regulatory environment (new safety criteria, requests for enlarged qualification databases, statistical applications, uncertainty propagation...). In this context, the advanced codes and methods and the associated verification and validation represent the key to avoiding penalties on products, on operational limits, or on methodologies themselves.

  9. Delay-Aware Program Codes Dissemination Scheme in Internet of Everything

    Directory of Open Access Journals (Sweden)

    Yixuan Xu

    2016-01-01

    Full Text Available Due to recent advancements in big data, connection technologies, and smart devices, our environment is transforming into an "Internet of Everything" (IoE) environment. These smart devices can obtain new or special functions by reprogramming: they upgrade their software by receiving new versions of program codes. However, bulk code dissemination suffers from large delay, energy consumption, and number of retransmissions because of the unreliability of wireless links. In this paper, a delay-aware program dissemination (DAPD) scheme is proposed to disseminate program codes in a fast, reliable, and energy-efficient manner. We observe that although total energy is limited in a wireless sensor network, there exists residual energy in nodes deployed far from the base station. Therefore, the DAPD scheme improves the performance of bulk code dissemination through the following two aspects. (1) Since a high transmitting power can significantly improve the quality of wireless links, the transmitting power of sensors with more residual energy is increased to improve link quality. (2) Since the performance of correlated dissemination tends to degrade in a highly dynamic environment, link correlation is autonomously updated in DAPD during code dissemination to maintain the improvements brought by correlated dissemination. Theoretical analysis and experimental results show that, compared with previous work, the DAPD scheme improves dissemination performance in terms of completion time, transmission cost, and the efficiency of energy utilization.

  10. Evaluating the benefits of commercial building energy codes and improving federal incentives for code adoption.

    Science.gov (United States)

    Gilbraith, Nathaniel; Azevedo, Inês L; Jaramillo, Paulina

    2014-12-16

    The federal government has the goal of decreasing commercial building energy consumption and pollutant emissions by incentivizing the adoption of commercial building energy codes. Quantitative estimates of code benefits at the state level that can inform the size and allocation of these incentives are not available. We estimate the state-level climate, environmental, and health benefits (i.e., social benefits) and reductions in energy bills (private benefits) of a more stringent code (ASHRAE 90.1-2010) relative to a baseline code (ASHRAE 90.1-2007). We find that reductions in site energy use intensity range from 93 MJ/m(2) of new construction per year (California) to 270 MJ/m(2) of new construction per year (North Dakota). Total annual benefits from more stringent codes total $506 million for all states, where $372 million are from reductions in energy bills, and $134 million are from social benefits. These total benefits range from $0.6 million in Wyoming to $49 million in Texas. Private benefits range from $0.38 per square meter in Washington State to $1.06 per square meter in New Hampshire. Social benefits range from $0.2 per square meter annually in California to $2.5 per square meter in Ohio. Reductions in human/environmental damages and future climate damages account for nearly equal shares of social benefits.

  11. Improving system modeling accuracy with Monte Carlo codes

    International Nuclear Information System (INIS)

    Johnson, A.S.

    1996-01-01

    The use of computer codes based on Monte Carlo methods to perform criticality calculations has become commonplace. Although results frequently published in the literature report calculated k_eff values to four decimal places, people who use the codes in their everyday work say that they only believe the first two decimal places of any result. The lack of confidence in the computed k_eff values may be due to the tendency of the reported standard deviation to underestimate errors associated with the Monte Carlo process. The standard deviation as reported by the codes is the standard deviation of the mean of the k_eff values for individual generations in the computer simulation, not the standard deviation of the computed k_eff value compared with the physical system. A more subtle problem with the standard deviation of the mean as reported by the codes is that all the k_eff values from the separate generations are not statistically independent, since the k_eff of a given generation is a function of the k_eff of the previous generation, which is ultimately based on the starting source. To produce a standard deviation that is more representative of the physical system, statistically independent values of k_eff are needed.
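    The remedy implied here, forming statistically independent estimates of k_eff instead of relying on the per-generation standard deviation of the mean, can be sketched with a simple batch-means calculation. The AR(1) synthetic generation data and the batch size are assumptions for illustration only.

```python
# Illustrative batch-means estimate of the k_eff uncertainty: consecutive
# generations are averaged into batches so that the batch means are roughly
# independent. The AR(1) synthetic data and the batch size are assumptions.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-generation k_eff values with generation-to-generation correlation.
keff = np.empty(2000)
keff[0] = 1.0
for i in range(1, keff.size):
    keff[i] = 1.0 + 0.8 * (keff[i - 1] - 1.0) + rng.normal(0.0, 0.001)

def batch_means(samples, batch_size=50):
    """Standard deviation of the mean computed from (nearly) independent batch means."""
    n_batches = samples.size // batch_size
    means = samples[: n_batches * batch_size].reshape(n_batches, batch_size).mean(axis=1)
    return means.mean(), means.std(ddof=1) / np.sqrt(n_batches)

naive_sigma = keff.std(ddof=1) / np.sqrt(keff.size)   # treats generations as independent
mean_keff, batch_sigma = batch_means(keff)
print(f"k_eff = {mean_keff:.5f}, naive sigma = {naive_sigma:.2e}, batch sigma = {batch_sigma:.2e}")
```

    On positively correlated data of this kind the batch-means uncertainty comes out noticeably larger than the naive per-generation estimate, which is the point the record makes.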

  12. Coding in Muscle Disease.

    Science.gov (United States)

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  13. Do code of conduct audits improve chemical safety in garment factories? Lessons on corporate social responsibility in the supply chain from Fair Wear Foundation.

    Science.gov (United States)

    Lindholm, Henrik; Egels-Zandén, Niklas; Rudén, Christina

    2016-10-01

    In managing chemical risks to the environment and human health in supply chains, voluntary corporate social responsibility (CSR) measures, such as auditing code of conduct compliance, play an important role. This study examined how well suppliers' chemical health and safety performance complies with buyers' CSR policies and whether audited factories improve their performance. CSR audits (n = 288) of garment factories conducted by Fair Wear Foundation (FWF), an independent non-profit organization, were analyzed using descriptive statistics and statistical modeling. Forty-three per cent of factories did not comply with the FWF code of conduct, i.e. received remarks on chemical safety. Only among factories audited 10 or more times was there a significant increase in the number of factories receiving no remarks. Compliance with chemical safety requirements in garment supply chains is low, and auditing is statistically correlated with improvements only at factories that have undergone numerous audits.

  14. Research on pre-processing of QR Code

    Science.gov (United States)

    Sun, Haixing; Xia, Haojie; Dong, Ning

    2013-10-01

    QR codes encode many kinds of information because of their advantages: large storage capacity, high reliability, omnidirectional ultra-high-speed reading, small printing size, and highly efficient representation of Chinese characters. In order to obtain a clearer binarized image from a complex background and improve the recognition rate of QR codes, this paper investigates pre-processing methods for QR codes (Quick Response Codes) and presents algorithms and results of image pre-processing for QR code recognition. The conventional method is improved by adapting Sauvola's adaptive text-binarization method. Additionally, a QR code extraction step that adapts to different image sizes and a flexible image correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
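    The Sauvola-style adaptive binarization referred to above is available in common image-processing libraries; a minimal sketch using scikit-image is shown below, with the file names, window size, and k parameter as illustrative assumptions.

```python
# Minimal sketch of Sauvola-style adaptive binarization for a QR code photo.
# The file names, window size and k parameter are illustrative assumptions.
from skimage import io
from skimage.color import rgb2gray
from skimage.filters import threshold_sauvola

image = rgb2gray(io.imread("qr_photo.png"))      # hypothetical colour photo of a QR code
threshold = threshold_sauvola(image, window_size=25, k=0.2)
binary = image > threshold                       # window-local adaptive threshold

io.imsave("qr_binary.png", (binary * 255).astype("uint8"))
```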

  15. Governance codes: facts or fictions? A study of governance codes in Colombia

    Directory of Open Access Journals (Sweden)

    Julián Benavides Franco

    2010-10-01

    Full Text Available This article studies the effects on accounting performance and financing decisions of Colombian firms after issuing a corporate governance code. We assemble a database of Colombian issuers and test the hypotheses of improved performance and higher leverage after issuing a code. The results show that the firms’ return on assets after the code introduction improves in excess of 1%; the effect is amplified by the code quality. Additionally, the firms’ leverage increased, in excess of 5%, when the code quality was factored into the analysis. These results suggest that the controlling parties’ commitment to self-restraint, by reducing their private benefits and/or the expropriation of non-controlling parties through the code introduction, is indeed an effective measure, and that the financial markets agree, increasing the supply of funds to the firms.

  16. Development of best estimate auditing code for CANDU thermal-hydraulic safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Won Jae; Hwang, Moon Kyu; Lim, Hong Sik [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-04-01

    The main purpose of this study is to develop a thermal hydraulic auditing code for the CANDU reactor by modifying the models of the existing PWR auditing tool, i.e. RELAP5/MOD3. The study was performed by reconsidering the previous code assessment works and identifying the phenomena for the essential accident scenarios. Improvement areas of model development for the auditing tool were identified based on the code comparison and PIRT results. Nine models have been improved significantly for the analysis of LOCA and non-LOCA events. Conceptual problems and separate effect assessments have been performed to verify the model improvements. A linking calculation with CONTAIN 2.0 has also been enabled to establish the unified auditing code system. Analyses of a CANDU plant real transient and a hypothetical LOCA have been performed using the improved version. It has been concluded that the developed version can be utilized for the auditing analysis of LOCA and non-LOCA events for the CANDU reactor. 25 refs., 84 figs., 36 tabs. (Author)

  17. ASME nuclear codes and standards: Recent technical initiatives

    International Nuclear Information System (INIS)

    Feigel, R. E.

    1995-01-01

    Although nuclear power construction is currently in a hiatus in the US, ASME and its volunteer committees remain committed to continual improvements in the technical requirements of its nuclear codes. This paper provides an overview of several significant recent revisions to ASME's nuclear codes. Additionally, other important initiatives currently being addressed by ASME committees are described. With the largest population of operating light water nuclear plants in the world and worldwide use of its nuclear codes, ASME continues to support technical advancements in its nuclear codes and standards. While revisions of various magnitude are an ongoing process, several recent revisions embody significant changes based on state-of-the-art design philosophy and substantial industry experience. In the design area, a significant revision has recently been approved which will significantly reduce conservatisms in seismic piping design as well as provide simplified design rules. Major revisions have also been made to the requirements for nuclear material manufacturers and suppliers, which should result in a clearer understanding of this difficult administrative area of the code. In the area of Section XI inservice rules, substantial studies are underway to investigate the application of probabilistic, risk-based inspection in lieu of the current deterministic inspection philosophy. While much work is still required in this area, it is an important potential application of the emerging field of risk-based inspection.

  18. Reliability improvement: where do we go from here. The role of codes and standards

    International Nuclear Information System (INIS)

    Davidson, R.H.

    1976-01-01

    The role of codes and standards in contributing to future reliability improvement is discussed. The Nuclear Plant Reliability Data System is examined. It is suggested that two systems of this type are needed: one system should focus on component and system reliability, while the other should focus on system availability, capacity factor, and fixed outage rate assessment.

  19. Improvement of Computer Codes Used for Fuel Behaviour Simulation (FUMEX-III). Report of a Coordinated Research Project 2008-2012

    International Nuclear Information System (INIS)

    2013-03-01

    It is fundamental to the future of nuclear power that reactors can be run safely and economically to compete with other forms of power generation. As a consequence, it is essential to develop the understanding of fuel performance and to embody that knowledge in codes to provide best estimate predictions of fuel behaviour. This in turn leads to a better understanding of fuel performance, a reduction in operating margins, flexibility in fuel management and improved operating economics. The IAEA has therefore embarked on a series of programmes addressing different aspects of fuel behaviour modelling with the following objectives: - To assess the maturity and prediction capabilities of fuel performance codes, and to support interaction and information exchange between countries with code development and application needs (FUMEX series); - To build a database of well defined experiments suitable for code validation in association with the OECD Nuclear Energy Agency (OECD/NEA); - To transfer a mature fuel modelling code to developing countries, to support teams in these countries in their efforts to adapt the code to the requirements of particular reactors, and to provide guidance on applying the code to reactor operation and safety assessments; - To provide guidelines for code quality assurance, code licensing and code application to fuel licensing. This report describes the results of the coordinated research project on the ''Improvement of computer codes used for fuel behaviour simulation (FUMEX-III)''. This programme was initiated in 2008 and completed in 2012. It followed previous programmes on fuel modelling: D-COM 1982-1984, FUMEX 1993-1996 and FUMEX-II 2002-2006. The participants used a mixture of data derived from commercial and experimental irradiation histories, in particular data designed to investigate the mechanical interactions occurring in fuel during normal, transient and severe transient operation. All participants carried out calculations on priority

  20. Recent improvements to TRIGLAV code

    International Nuclear Information System (INIS)

    Zagar, T.; Ravnik, M.; Persic, A.

    1998-01-01

    The TRIGLAV code was developed for TRIGA research reactor calculations and is based on the two-dimensional diffusion equation. The main purpose of the program is calculation of the fuel element burn-up. Calculated core burn-up and excess reactivity results are compared with experimental values. A new control rod model is introduced and tested in this paper. Calculated integral control rod worths and integral reactivity curves are presented and compared with measured values. A comparison with measured fuel element worth values is presented as a test of the two-dimensional flux distribution calculations. (author)

  1. Improved lossless intra coding for H.264/MPEG-4 AVC.

    Science.gov (United States)

    Lee, Yung-Lyul; Han, Ki-Hun; Sullivan, Gary J

    2006-09-01

    A new lossless intra coding method based on sample-by-sample differential pulse code modulation (DPCM) is presented as an enhancement of the H.264/MPEG-4 AVC standard. The H.264/AVC design includes a multidirectional spatial prediction method to reduce spatial redundancy by using neighboring samples as a prediction for the samples in a block of data to be encoded. In the new lossless intra coding method, the spatial prediction is performed based on samplewise DPCM instead of in the block-based manner used in the current H.264/AVC standard, while the block structure is retained for the residual difference entropy coding process. We show that the new method, based on samplewise DPCM, does not have a major complexity penalty, despite its apparent pipeline dependencies. Experiments show that the new lossless intra coding method reduces the bit rate by approximately 12% in comparison with the lossless intra coding method previously included in the H.264/AVC standard. As a result, the new method is currently being adopted into the H.264/AVC standard in a new enhancement project.
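    The contrast with block-based prediction can be illustrated for the horizontal mode: each sample is predicted from its immediately preceding reconstructed neighbour rather than from the block boundary. The sketch below is a simplified illustration, not the H.264/AVC reference implementation.

```python
# Minimal sketch of sample-wise horizontal DPCM for lossless intra coding
# (illustrative only; not the H.264/AVC reference implementation).
import numpy as np

def samplewise_dpcm_encode(block, left_column):
    """Predict each sample from its left neighbour; the first column is
    predicted from the reconstructed column to the left of the block."""
    residual = np.empty_like(block)
    residual[:, 0] = block[:, 0] - left_column
    residual[:, 1:] = block[:, 1:] - block[:, :-1]   # sample-by-sample prediction
    return residual

def samplewise_dpcm_decode(residual, left_column):
    block = np.empty_like(residual)
    block[:, 0] = residual[:, 0] + left_column
    for j in range(1, residual.shape[1]):
        block[:, j] = block[:, j - 1] + residual[:, j]
    return block

block = np.array([[52, 55, 61, 66]], dtype=np.int32)
left = np.array([50], dtype=np.int32)
res = samplewise_dpcm_encode(block, left)
assert np.array_equal(samplewise_dpcm_decode(res, left), block)   # exactly lossless
```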

  2. Advanced Best-Estimate Methodologies for Thermal-Hydraulics Stability Analyses with TRACG code and Improvements on Operating Boiling Water Reactors

    International Nuclear Information System (INIS)

    Vedovi, J.; Trueba, M.; Ibarra, L; Espino, M.; Hoang, H.

    2016-01-01

    In recent years GE Hitachi has introduced two advanced methodologies to address thermal-hydraulic instabilities in Boiling Water Reactors (BWRs): the “Detect and Suppress Solution - Confirmation Density (DSS-CD)” and the “GEH Simplified Stability Solution (GS3).” These two methodologies are based on Best-Estimate Plus Uncertainty (BEPU) analyses and provide significant improvements in safety, plant maneuvering and fuel economics with respect to existing solutions. The DSS-CD and GS3 solutions have recently been approved by the United States Nuclear Regulatory Commission. This paper describes the main characteristics of these two stability methodologies and shares the experience of their recent implementation in operating BWRs. The BEPU approach provided a much deeper understanding of the parameters affecting instabilities in operating BWRs and allowed for better calculation of plant setpoints by improving plant manoeuvring restrictions and reducing manual operator actions. The DSS-CD and GS3 methodologies are both based on safety analyses performed with the best-estimate system code TRACG. The assessment of uncertainty is performed following the Code Scaling, Applicability and Uncertainty (CSAU) methodology documented in NUREG/CR-5249. The two solutions have already been implemented in 18 BWR units combined, with 7 more units in the process of transitioning. The main results demonstrate a significant decrease (>0.1) in the stability-based Operating Limit Minimum Critical Power Ratio (OLMCPR), which can result in significant fuel savings, and an increase in allowable stability plant setpoints that addresses instability events such as the one that occurred at the Fermi 2 plant in 2015 and can help prevent unnecessary scrams. The paper also describes the advantages of reduced plant manoeuvring resulting from transitioning to these solutions; in particular, the history of a BWR/6 transition to DSS-CD is discussed.

  3. APPLE-2: an improved version of APPLE code for plotting neutron and gamma ray spectra and reaction rates

    International Nuclear Information System (INIS)

    Kawasaki, Hiromitsu; Seki, Yasushi.

    1982-07-01

    A computer code APPLE-2, which plots the spatial distribution of the energy spectra of multi-group neutron and/or gamma ray fluxes and reaction rates, has been developed. This code is an improved version of the previously developed APPLE code and has the following features: (1) It plots energy spectra of neutron and/or gamma ray fluxes calculated by ANISN, DOT and MORSE. (2) It calculates and plots the spatial distribution of neutron and gamma ray fluxes and various types of reaction rates, such as nuclear heating rates, operational dose rates, and displacement damage rates. (3) Input data specification is greatly simplified by the use of standard response libraries and by close coupling with radiation transport calculation codes. (4) Plotting outputs are given in camera-ready form. (author)

  4. Comparing the coding of complications in Queensland and Victorian admitted patient data.

    Science.gov (United States)

    Michel, Jude L; Cheng, Diana; Jackson, Terri J

    2011-08-01

    The aim was to examine differences between Queensland and Victorian coding of hospital-acquired conditions and to suggest ways to improve the usefulness of these data in the monitoring of patient safety events. A secondary analysis of admitted patient episode data collected in Queensland and Victoria was performed, comparing the depth of coding and patterns in the coding of ten commonly coded complications of five elective procedures. Comparison of the mean complication codes assigned per episode revealed that Victoria assigns more valid codes than Queensland for all procedures, with the difference between the states being statistically significant in all cases. The proportion of the codes flagged as complications was consistently lower for Queensland when comparing 10 common complications for each of the five selected elective procedures. The estimated complication rates for the five procedures showed Victoria to have an apparently higher complication rate than Queensland for 35 of the 50 complications examined. Our findings demonstrate that the coding of complications is more comprehensive in Victoria than in Queensland. It is known that inconsistencies exist between states in routine hospital data quality. Comparative use of patient safety indicators should be viewed with caution until standards are improved across Australia. More exploration of data quality issues is needed to identify areas for improvement.

  5. Recent improvements of the TNG statistical model code

    International Nuclear Information System (INIS)

    Shibata, K.; Fu, C.Y.

    1986-08-01

    The applicability of the nuclear model code TNG to cross-section evaluations has been extended. The new TNG is capable of using variable bins for outgoing particle energies. Moreover, three additional quantities can now be calculated: capture gamma-ray spectrum, the precompound mode of the (n,γ) reaction, and fission cross section. In this report, the new features of the code are described together with some sample calculations and a brief explanation of the input data. 15 refs., 6 figs., 2 tabs

  6. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    Science.gov (United States)

    Springate, David A; Kontopantelis, Evangelos; Ashcroft, Darren M; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David

    2014-01-01

    Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.

  7. Parallelization of Subchannel Analysis Code MATRA

    International Nuclear Information System (INIS)

    Kim, Seongjin; Hwang, Daehyun; Kwon, Hyouk

    2014-01-01

    A stand-alone calculation with the MATRA code requires an acceptable computing time for thermal margin calculations, whereas a considerably longer time is needed to solve whole-core pin-by-pin problems. In addition, improving the computation speed of the MATRA code is strongly required to satisfy the overall performance of multi-physics coupling calculations. Therefore, a parallel approach to improve and optimize the computability of the MATRA code is proposed and verified in this study. The parallel algorithm was embodied in the MATRA code using the MPI communication method, and modification of the previous code structure was minimized. The improvement was confirmed by comparing the results between the single- and multiple-processor algorithms. The speedup and efficiency were also evaluated with increasing numbers of processors. The parallel algorithm was implemented in the subchannel code MATRA using MPI. The performance of the parallel algorithm was verified by comparing the results with those from MATRA with a single processor. It was also noticed that the performance of the MATRA code was greatly improved by implementing the parallel algorithm for the 1/8-core and whole-core problems.
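    The kind of domain decomposition described, distributing subchannels across processors and combining local results through MPI, can be sketched with mpi4py as below; the channel count, the placeholder local solve, and the reduction of a global residual are illustrative assumptions, not the actual MATRA implementation.

```python
# Minimal sketch of distributing subchannel work across MPI ranks (mpi4py).
# The channel count, local solve and residual are illustrative assumptions,
# not the actual MATRA implementation.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_channels = 1000
# Block-partition the subchannels across the available processors.
my_channels = np.array_split(np.arange(n_channels), size)[rank]

def solve_local(channels):
    """Placeholder for the per-channel thermal-hydraulic solve."""
    return np.abs(np.sin(channels)).sum()      # dummy local residual

local_residual = solve_local(my_channels)
# Combine local residuals so every rank can test global convergence.
global_residual = comm.allreduce(local_residual, op=MPI.SUM)

if rank == 0:
    print(f"global residual across {size} ranks: {global_residual:.3f}")
```

    Run with, e.g., `mpiexec -n 4 python matra_sketch.py` (a hypothetical file name) to see the same global residual reported regardless of the number of ranks.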

  8. Progressive significance map and its application to error-resilient image transmission.

    Science.gov (United States)

    Hu, Yang; Pearlman, William A; Li, Xin

    2012-07-01

    Set partition coding (SPC) has shown tremendous success in image compression. Despite its popularity, the lack of error resilience remains a significant challenge to the transmission of images in error-prone environments. In this paper, we propose a novel data representation called the progressive significance map (prog-sig-map) for error-resilient SPC. It structures the significance map (sig-map) into two parts: a high-level summation sig-map and a low-level complementary sig-map (comp-sig-map). Such a structured representation of the sig-map allows us to improve its error-resilient property at the price of only a slight sacrifice in compression efficiency. For example, we have found that a fixed-length coding of the comp-sig-map in the prog-sig-map renders 64% of the coded bitstream insensitive to bit errors, compared with 40% with that of the conventional sig-map. Simulation results have shown that the prog-sig-map can achieve highly competitive rate-distortion performance for binary symmetric channels while maintaining low computational complexity. Moreover, we note that prog-sig-map is complementary to existing independent packetization and channel-coding-based error-resilient approaches and readily lends itself to other source coding applications such as distributed video coding.
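    A highly simplified illustration of the underlying idea, splitting a significance map into a coarse summary layer plus fixed-length refinement patterns so that bit errors stay localized, is sketched below. The grouping and the fixed 4-bit patterns are assumptions for illustration; the actual prog-sig-map construction in the paper is more elaborate.

```python
# Schematic split of a significance map into a summary layer and a
# fixed-length refinement layer. Grouping and 4-bit patterns are illustrative
# assumptions, not the exact prog-sig-map construction of the paper.
import numpy as np

def split_sig_map(sig_map, group=4):
    """Return (summary, refinement): the summary flags groups containing any
    significant coefficient; the refinement stores each flagged group as a
    fixed-length pattern, which limits error propagation."""
    # Assumes the map length is a multiple of the group size.
    bits = np.asarray(sig_map, dtype=np.uint8).reshape(-1, group)
    summary = (bits.sum(axis=1) > 0).astype(np.uint8)              # 1 bit per group
    refinement = [tuple(int(b) for b in g) for g, s in zip(bits, summary) if s]
    return summary, refinement

summary, refinement = split_sig_map([0, 0, 1, 0,  0, 0, 0, 0,  1, 1, 0, 0])
print(summary.tolist())   # [1, 0, 1]
print(refinement)         # [(0, 0, 1, 0), (1, 1, 0, 0)]
```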

  9. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback into LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.
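    For readers unfamiliar with LT codes, the basic encoding step that the proposed degree distributions modify is sketched below; the toy degree distribution is an illustrative assumption, not one of the feedback-aware distributions proposed in the paper.

```python
# Minimal LT-style encoding sketch: each output symbol XORs a randomly chosen
# set of source symbols, with the set size drawn from a degree distribution.
# The distribution below is a toy choice, not the paper's feedback-aware design.
import random

def lt_encode_symbol(source, rng, degrees=(1, 2, 3, 4), weights=(0.1, 0.5, 0.3, 0.1)):
    d = rng.choices(degrees, weights=weights, k=1)[0]   # sample the degree
    idx = rng.sample(range(len(source)), d)             # pick d distinct source symbols
    sym = 0
    for i in idx:
        sym ^= source[i]                                # XOR-combine them
    return idx, sym

rng = random.Random(7)
source = [0x12, 0x34, 0x56, 0x78, 0x9A]
for _ in range(3):
    print(lt_encode_symbol(source, rng))
```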

  10. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Full Text Available Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
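    As a small illustration of the bookkeeping involved, the minimum distance of a ternary linear code given by a generator matrix can be checked by brute force over all $q^k$ codewords; the generator matrix below is an arbitrary example, not one of the 22 new codes.

```python
# Brute-force check of the minimum distance of a small ternary linear code.
# The generator matrix is an arbitrary illustrative choice, not one of the
# codes reported in the paper.
import itertools
import numpy as np

G = np.array([[1, 0, 1, 1, 2],
              [0, 1, 1, 2, 1]])          # a [5, 2] code over GF(3)

def min_distance(G, q=3):
    k, n = G.shape
    best = n
    for msg in itertools.product(range(q), repeat=k):
        if not any(msg):
            continue                      # skip the zero codeword
        codeword = np.mod(np.dot(msg, G), q)
        best = min(best, int(np.count_nonzero(codeword)))
    return best

print(min_distance(G))                    # -> 3 for this generator matrix
```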

  11. Do code of conduct audits improve chemical safety in garment factories? Lessons on corporate social responsibility in the supply chain from Fair Wear Foundation

    Science.gov (United States)

    2016-01-01

    Background: In managing chemical risks to the environment and human health in supply chains, voluntary corporate social responsibility (CSR) measures, such as auditing code of conduct compliance, play an important role. Objectives: To examine how well suppliers’ chemical health and safety performance complies with buyers’ CSR policies and whether audited factories improve their performance. Methods: CSR audits (n = 288) of garment factories conducted by Fair Wear Foundation (FWF), an independent non-profit organization, were analyzed using descriptive statistics and statistical modeling. Results: Forty-three per cent of factories did not comply with the FWF code of conduct, i.e. received remarks on chemical safety. Only among factories audited 10 or more times was there a significant increase in the number of factories receiving no remarks. Conclusions: Compliance with chemical safety requirements in garment supply chains is low and auditing is statistically correlated with improvements only at factories that have undergone numerous audits. PMID:27611103

  12. The Significance of the 2014 Corporate Governance Code of the Bank of Russia

    Directory of Open Access Journals (Sweden)

    Anna Vladislavovna Shashkova

    2014-01-01

    Full Text Available The present article focuses on corporate governance in Russia, as well as on the approval in 2014 of the Code of Corporate Governance by the Bank of Russia and by the Russian Government. The article also explains the well-known foreign term Compliance. Compliance is a system based on binding rules of conduct contained in regulations which are mandatory for the company. In order to follow best practices and implement local acts on the most important issues for the company, many foreign companies as well as large Russian companies have formed special Compliance departments. Taking into account such international experience and international corporate governance principles, the Bank of Russia elaborated the Corporate Governance Code, approved by the Russian Government in February 2014. The Corporate Governance Code regulates a number of the most important issues of corporate governance, such as shareholders' rights and fair treatment of shareholders; the Board of Directors; the Corporate Secretary of the Company; the system of remuneration of members of the Board of Directors, executive bodies and other key executives of the company; the system of risk management and internal control; disclosure of information about the company and the information policy of the company; and major corporate actions. The most important issue analyzed by the author is the composition of the Board of Directors: the presence of independent directors in the company. According to the author, the new Corporate Governance Code reflects the latest trends as well as the current situation of corporate governance in Russia today.

  13. Improvements in practical applicability of NSHEX: nodal transport calculation code for three-dimensional hexagonal-Z geometry

    International Nuclear Information System (INIS)

    Sugino, Kazuteru

    1998-07-01

    As a tool to perform fast reactor core calculations with high accuracy, NSHEX, the nodal transport calculation code for three-dimensional hexagonal-Z geometry, is under development. To improve the practical applicability of NSHEX, for instance in its application to safety analysis and commercial reactor core design studies, we investigated the basic theory used in it, improved the program performance, and evaluated its applicability to the analysis of commercial reactor cores. The current studies show the following: (1) An improvement in the treatment of radial leakage in the radial nodal coupling equation improved calculational convergence for safety analysis calculations, so the applicability of NSHEX to safety analysis was improved. (2) A comparison of results from NSHEX and the standard core calculation code confirmed consistency between them. (3) An evaluation of the effect of different calculational conditions found that calculations with appropriate nodal expansion orders and Sn orders correspond to those under the most detailed conditions; however, further investigation is required to reduce the uncertainty in calculational results due to the treatment of high-order flux moments. (4) A whole-core version of NSHEX enabling calculation for any FBR core geometry has been developed, which improved the general applicability of NSHEX. (5) An investigation of the applicability of the rebalance method to acceleration showed that it improved calculational convergence and was effective. (J.P.N.)

  14. Coded aperture solution for improving the performance of traffic enforcement cameras

    Science.gov (United States)

    Masoudifar, Mina; Pourreza, Hamid Reza

    2016-10-01

    A coded aperture camera is proposed for automatic license plate recognition (ALPR) systems. It captures images using a noncircular aperture. The aperture pattern is designed for the rapid acquisition of high-resolution images while preserving high spatial frequencies of defocused regions. It is obtained by minimizing an objective function, which computes the expected value of perceptual deblurring error. The imaging conditions and camera sensor specifications are also considered in the proposed function. The designed aperture improves the depth of field (DoF) and subsequently ALPR performance. The captured images can be directly analyzed by the ALPR software up to a specific depth, which is 13 m in our case, though it is 11 m for the circular aperture. Moreover, since the deblurring results of images captured by our aperture yield fewer artifacts than those captured by the circular aperture, images can be first deblurred and then analyzed by the ALPR software. In this way, the DoF and recognition rate can be improved at the same time. Our case study shows that the proposed camera can improve the DoF up to 17 m while it is limited to 11 m in the conventional aperture.

  15. Significant improvement in the thermal annealing process of optical resonators

    Science.gov (United States)

    Salzenstein, Patrice; Zarubin, Mikhail

    2017-05-01

    Thermal annealing performed during processing improves the surface roughness of optical resonators by reducing stresses at the periphery of their surface, thus allowing higher Q-factors. After a preliminary realization, the design of the oven and the electronic method were significantly improved thanks to nichrome resistance-alloy wires and chopped basalt fibers used for thermal isolation during the annealing process. Q-factors can then be improved.

  16. Improved gap conductance model for the TRAC code

    International Nuclear Information System (INIS)

    Hatch, S.W.; Mandell, D.A.

    1980-01-01

    The purpose of the present work, as indicated earlier, is to improve the present constant fuel-clad spacing model in TRAC-P1A without significantly increasing the computer costs. It is realized that the simple model proposed may not be accurate enough for some cases, but for the initial calculations made, the DELTAR model improves the predictions over the constant Δr results of TRAC-P1A, and the additional computing costs are negligible.

  17. SCDAP/RELAP5 code development and assessment

    International Nuclear Information System (INIS)

    Allison, C.M.; Hohorst, J.K.

    1996-01-01

    The SCDAP/RELAP5 computer code is designed to describe the overall reactor coolant system thermal-hydraulic response, core damage progression, and fission product release during severe accidents. The code is being developed at the Idaho National Engineering Laboratory under the primary sponsorship of the Office of Nuclear Regulatory Research of the U.S. Nuclear Regulatory Commission. The current version of the code is SCDAP/RELAP5/MOD3.1e. Although MOD3.1e contains a number of significant improvements since the initial version of MOD3.1 was released, new models to treat the behavior of the fuel and cladding during reflood have had the most dramatic impact on the code's calculations. This paper provides a brief description of the new reflood models, presents highlights of the assessment of the current version of MOD3.1, and discusses future SCDAP/RELAP5/MOD3.2 model development activities

  18. Omega-3 fatty acid therapy dose-dependently and significantly decreased triglycerides and improved flow-mediated dilation, however, did not significantly improve insulin sensitivity in patients with hypertriglyceridemia.

    Science.gov (United States)

    Oh, Pyung Chun; Koh, Kwang Kon; Sakuma, Ichiro; Lim, Soo; Lee, Yonghee; Lee, Seungik; Lee, Kyounghoon; Han, Seung Hwan; Shin, Eak Kyun

    2014-10-20

    Experimental studies demonstrate that higher intake of omega-3 fatty acids (n-3 FA) improves insulin sensitivity; however, we previously reported that n-3 FA 2 g therapy, the most commonly used dosage, did not significantly improve insulin sensitivity despite reducing triglycerides by 21% in patients. Therefore, we investigated the effects of different dosages of n-3 FA in patients with hypertriglyceridemia. This was a randomized, single-blind, placebo-controlled, parallel study. Age, sex, and body mass index were matched among groups. All patients were recommended to maintain a low-fat diet. Forty-four patients (about 18 with metabolic syndrome/type 2 diabetes mellitus) in each group were given placebo or n-3 FA 1 g (O1), 2 g (O2), or 4 g (O4) daily for 2 months. n-3 FA therapy dose-dependently and significantly decreased triglycerides and triglycerides/HDL cholesterol and improved flow-mediated dilation, compared with placebo (by ANOVA). However, each n-3 FA therapy did not significantly decrease high-sensitivity C-reactive protein and fibrinogen, compared with placebo. O1 significantly increased insulin levels and decreased insulin sensitivity (determined by QUICKI), and O2 significantly decreased plasma adiponectin levels relative to baseline measurements. Of note, when compared with placebo, each n-3 FA therapy did not significantly change insulin, glucose, adiponectin, or glycated hemoglobin levels or insulin sensitivity (by ANOVA). We observed similar results in a subgroup of patients with the metabolic syndrome. n-3 FA therapy dose-dependently and significantly decreased triglycerides and improved flow-mediated dilation. Nonetheless, n-3 FA therapy did not significantly improve acute-phase reactants or insulin sensitivity in patients with hypertriglyceridemia, regardless of dosage. Copyright © 2014. Published by Elsevier Ireland Ltd.

  19. Improved Encrypted-Signals-Based Reversible Data Hiding Using Code Division Multiplexing and Value Expansion

    Directory of Open Access Journals (Sweden)

    Xianyi Chen

    2018-01-01

    Full Text Available Compared to the encrypted-image-based reversible data hiding (EIRDH) method, the encrypted-signals-based reversible data hiding (ESRDH) technique is a novel way to achieve a greater embedding rate and better quality of the decrypted signals. Motivated by ESRDH using signal energy transfer, we propose an improved ESRDH method using code division multiplexing and value expansion. First, each pixel of the original image is divided into several parts containing one small signal and multiple equal signals. Next, all signals are encrypted by Paillier encryption. Then a large number of secret bits are embedded into the encrypted signals using code division multiplexing and value expansion. Since the sum of the elements in any spreading sequence is equal to 0, lossless quality of the directly decrypted signals can be achieved using code division multiplexing on the encrypted equal signals. Although the visual quality is reduced, high-capacity data hiding can be accomplished by conducting value expansion on the encrypted small signal. The experimental results show that our method is better than other methods in terms of the embedding rate and average PSNR.
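    A plaintext-domain sketch of the code-division-multiplexing step, zero-sum spreading sequences superimposed on a host vector and recovered by correlation, is given below. The Paillier encryption layer is omitted, and the sequences, gain, and host values are illustrative assumptions rather than the method of the paper.

```python
# Plaintext-domain sketch of code-division-multiplexing embedding: secret bits
# are added as +/- spreading sequences whose elements sum to zero, so the host
# mean is preserved. The Paillier layer of the paper is omitted, and the
# sequences, gain and host values are illustrative assumptions.
import numpy as np

# Two orthogonal zero-sum spreading sequences (rows of a 4x4 Hadamard matrix,
# excluding the all-ones row).
S = np.array([[1, -1, 1, -1],
              [1, 1, -1, -1]])

def embed(host, bits, gain=1):
    chips = sum(((2 * b - 1) * gain) * s for b, s in zip(bits, S))
    return host + chips

def extract(stego, host):
    corr = S @ (stego - host)          # correlate against each spreading sequence
    return (corr > 0).astype(int)

host = np.array([120, 118, 121, 119])
bits = [1, 0]
stego = embed(host, bits)
assert list(extract(stego, host)) == bits
print(stego)                           # host with the spread secret bits superimposed
```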

  20. NR-code: Nonlinear reconstruction code

    Science.gov (United States)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  1. Recent improvements and new features in the Westinghouse lattice physics codes

    International Nuclear Information System (INIS)

    Huria, H.C.; Buechel, R.J.

    1995-01-01

    Westinghouse has been using the ANC three-dimensional, two-energy-group nodal model for nuclear analysis and fuel management calculations for standard pressurized water reactor (PWR) reload design analysis since 1988. The cross sections are obtained from PHOENIX-P, a modified version of the PHOENIX lattice physics code for all square-assembly PWR cores. The PHOENIX-H code was developed for modeling both the VVER-1000 and VVER-440 fuel lattice configurations. The PHOENIX-H code has evolved from PHOENIX-P, the primary difference being in the neutronic solution modules. The PHOENIX-P code determines the assembly flux distribution using integral transport theory-based pin-cell nodal coupling followed by two-dimensional discrete ordinates solution in x-y geometry. The PHOENIX-H code uses the two-dimensional heterogeneous response method. The other infrastructure is identical in both the codes, and they share the same 42-group cross-section library

  2. Improvement of Secret Image Invisibility in Circulation Image with Dyadic Wavelet Based Data Hiding with Run-Length Coded Secret Images of Which Location of Codes are Determined with Random Number

    OpenAIRE

    Kohei Arai; Yuji Yamada

    2011-01-01

    An attempt is made to improve secret image invisibility in circulation images using dyadic-wavelet-based data hiding with run-length-coded secret images whose code locations are determined by random numbers. Through experiments, it is confirmed that the secret images are almost invisible in the circulation images. The robustness of the proposed data hiding method against data compression of the circulation images is also discussed. Data hiding performance in terms of invisibility of secret images...

  3. An improved thermal-hydraulic modeling of the Jules Horowitz Reactor using the CATHARE2 system code

    Energy Technology Data Exchange (ETDEWEB)

    Pegonen, R., E-mail: pegonen@kth.se [KTH Royal Institute of Technology, Roslagstullsbacken 21, SE-10691 Stockholm (Sweden); Bourdon, S.; Gonnier, C. [CEA, DEN, DER, SRJH, CEA Cadarache, 13108 Saint-Paul-lez-Durance Cedex (France); Anglart, H. [KTH Royal Institute of Technology, Roslagstullsbacken 21, SE-10691 Stockholm (Sweden)

    2017-01-15

    Highlights: • An improved thermal-hydraulic modeling of the JHR reactor is described. • Thermal-hydraulics of the JHR is analyzed during a loss of flow accident. • The heat exchanger approach gives more realistic and less conservative results. - Abstract: The newest European high performance material testing reactor, the Jules Horowitz Reactor, will support current and future nuclear reactor designs. The reactor is under construction at the CEA Cadarache research center in southern France and is expected to achieve first criticality at the end of this decade. This paper presents an improved thermal-hydraulic modeling of the reactor using solely the CATHARE2 system code. Until now, the CATHARE2 code simulated the full reactor with a simplified approach for the core, and the boundary conditions were transferred to the three-dimensional FLICA4 core simulation. A new, more realistic methodology is utilized for the thermal-hydraulic simulation of the reactor during a loss of flow accident.

  4. Application of FFTBM with signal mirroring to improve accuracy assessment of MELCOR code

    International Nuclear Information System (INIS)

    Saghafi, Mahdi; Ghofrani, Mohammad Bagher; D’Auria, Francesco

    2016-01-01

    Highlights: • FFTBM-SM is an improved Fast Fourier Transform Based Method with signal mirroring. • FFTBM-SM has been applied to accuracy assessment of MELCOR code predictions. • The case studied was a Station Black-Out accident in the PSB-VVER integral test facility. • FFTBM-SM eliminates fluctuations of accuracy indices when signals change sharply. • Accuracy assessment is performed in a more realistic and consistent way by FFTBM-SM. - Abstract: This paper deals with the application of the Fast Fourier Transform Based Method (FFTBM) with signal mirroring (FFTBM-SM) to assess the accuracy of the MELCOR code. This provides deeper insights into how the accuracy of the MELCOR code in predictions of thermal-hydraulic parameters varies during transients. The case studied was the modeling of a Station Black-Out (SBO) accident in the PSB-VVER integral test facility with the MELCOR code. The accuracy of this thermal-hydraulic modeling was previously quantified using the original FFTBM in a small number of time intervals, based on phenomenological windows of the SBO accident. Accuracy indices calculated by the original FFTBM in a series of time intervals fluctuate unreasonably when the investigated signals sharply increase or decrease. In the current study, the accuracy of the MELCOR code is quantified using FFTBM-SM in a series of increasing time intervals, and the results are compared to those with the original FFTBM. Also, differences between the accuracy indices of the original FFTBM and FFTBM-SM are investigated and correction factors are calculated to eliminate unphysical effects in the original FFTBM. The main findings are: (1) replacing the limited number of phenomena-based time intervals by a series of increasing time intervals provides deeper insights about the accuracy variation of the MELCOR calculations, and (2) application of FFTBM-SM for accuracy evaluation of the MELCOR predictions provides more reliable results than the original FFTBM by eliminating the fluctuations of accuracy indices when experimental signals sharply increase or

  5. Application of FFTBM with signal mirroring to improve accuracy assessment of MELCOR code

    Energy Technology Data Exchange (ETDEWEB)

    Saghafi, Mahdi [Department of Energy Engineering, Sharif University of Technology, Azadi Avenue, Tehran (Iran, Islamic Republic of); Ghofrani, Mohammad Bagher, E-mail: ghofrani@sharif.edu [Department of Energy Engineering, Sharif University of Technology, Azadi Avenue, Tehran (Iran, Islamic Republic of); D’Auria, Francesco [San Piero a Grado Nuclear Research Group (GRNSPG), University of Pisa, Via Livornese 1291, San Piero a Grado, Pisa (Italy)

    2016-11-15

    Highlights: • FFTBM-SM is an improved Fast Fourier Transform Based Method with signal mirroring. • FFTBM-SM has been applied to accuracy assessment of MELCOR code predictions. • The case studied was a Station Black-Out accident in the PSB-VVER integral test facility. • FFTBM-SM eliminates fluctuations of accuracy indices when signals change sharply. • Accuracy assessment is performed in a more realistic and consistent way by FFTBM-SM. - Abstract: This paper deals with the application of the Fast Fourier Transform Based Method (FFTBM) with signal mirroring (FFTBM-SM) to assess the accuracy of the MELCOR code. This provides deeper insights into how the accuracy of the MELCOR code in predictions of thermal-hydraulic parameters varies during transients. The case studied was the modeling of a Station Black-Out (SBO) accident in the PSB-VVER integral test facility with the MELCOR code. The accuracy of this thermal-hydraulic modeling was previously quantified using the original FFTBM in a small number of time intervals, based on phenomenological windows of the SBO accident. Accuracy indices calculated by the original FFTBM in a series of time intervals fluctuate unreasonably when the investigated signals sharply increase or decrease. In the current study, the accuracy of the MELCOR code is quantified using FFTBM-SM in a series of increasing time intervals, and the results are compared to those with the original FFTBM. Also, differences between the accuracy indices of the original FFTBM and FFTBM-SM are investigated and correction factors are calculated to eliminate unphysical effects in the original FFTBM. The main findings are: (1) replacing the limited number of phenomena-based time intervals by a series of increasing time intervals provides deeper insights about the accuracy variation of the MELCOR calculations, and (2) application of FFTBM-SM for accuracy evaluation of the MELCOR predictions provides more reliable results than the original FFTBM by eliminating the fluctuations of accuracy indices when experimental signals sharply increase or
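    The basic quantities involved, an FFT-based average-amplitude index evaluated on signals extended by their mirror image to suppress the edge discontinuity, can be sketched as follows. The synthetic signals and the bare-bones AA formula are assumptions for illustration; the full FFTBM/FFTBM-SM procedure contains further detail.

```python
# Illustrative sketch of an FFT-based average-amplitude (AA) accuracy index
# computed on mirrored signals. The synthetic data and this bare-bones AA
# formula are assumptions; the full FFTBM/FFTBM-SM procedure has more detail.
import numpy as np

def mirror(signal):
    """Append the time-reversed signal so the extended record has no jump
    at its end (the 'signal mirroring' idea)."""
    return np.concatenate([signal, signal[::-1]])

def aa_index(calc, exp):
    err = np.fft.rfft(mirror(calc - exp))
    ref = np.fft.rfft(mirror(exp))
    return np.sum(np.abs(err)) / np.sum(np.abs(ref))

t = np.linspace(0.0, 100.0, 1001)
exp = 150.0 - 0.8 * t                              # synthetic "experimental" signal
calc = 150.0 - 0.78 * t + 1.5 * np.sin(0.2 * t)    # synthetic code prediction

print(f"AA = {aa_index(calc, exp):.3f}   (lower means better agreement)")
```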

  6. Fast GPU-based Monte Carlo code for SPECT/CT reconstructions generates improved 177Lu images.

    Science.gov (United States)

    Rydén, T; Heydorn Lagerlöf, J; Hemmingsson, J; Marin, I; Svensson, J; Båth, M; Gjertsson, P; Bernhardt, P

    2018-01-04

    clearly improved with MC-based OSEM reconstruction, e.g., the activity recovery was 88% for the largest sphere, while it was 66% for AC-OSEM and 79% for RRC-OSEM. The GPU-based MC code generated an MC-based SPECT/CT reconstruction within a few minutes, and reconstructed patient images of ¹⁷⁷Lu-DOTATATE treatments revealed clearly improved resolution and contrast.

  7. Improvement of MARS code through the removal of bit-packed words and multiple use of DLLs (Dynamic Link Library)

    Energy Technology Data Exchange (ETDEWEB)

    Jung, B. D.; Jung, J. J.; Ha, K. S.; Hwang, M. K.; Lee, Y. S.; Lee, W. J. [KAERI, Taejon (Korea, Republic of)

    1999-10-01

    The readability of the MARS code has been enhanced greatly by replacing the bit-packed words with several logical and integer words and by recoding the related subroutines, which had complicated bit operations and packed words. Functional improvements of the code have been achieved through the multiple use of dynamic link libraries (DLLs) for the containment analysis module CONTEMPT4 and the multidimensional kinetics analysis module MASTER. The establishment of the integrated analysis system, MARS/CONTEMPT/MASTER, was validated through a verification calculation for a postulated problem. MARS user-friendly features are also improved by displaying 2D contour maps of 3D module data on-line. In addition to the on-line graphics, the MARS windows menus were upgraded to include the on-line manual, preview of input and output, and a link to the MARS web site. As a result, the readability, applicability, and user-friendly features of the MARS code have been greatly enhanced.

  8. Improvement of MARS code through the removal of bit-packed words and multiple use of DLLs (Dynamic Link Library)

    International Nuclear Information System (INIS)

    Jung, B. D.; Jung, J. J.; Ha, K. S.; Hwang, M. K.; Lee, Y. S.; Lee, W. J.

    1999-01-01

    The readability of the MARS code has been enhanced greatly by replacing the bit-packed words with several logical and integer words and by recoding the related subroutines, which had complicated bit operations and packed words. Functional improvements of the code have been achieved through the multiple use of dynamic link libraries (DLLs) for the containment analysis module CONTEMPT4 and the multidimensional kinetics analysis module MASTER. The establishment of the integrated analysis system, MARS/CONTEMPT/MASTER, was validated through a verification calculation for a postulated problem. MARS user-friendly features are also improved by displaying 2D contour maps of 3D module data on-line. In addition to the on-line graphics, the MARS windows menus were upgraded to include the on-line manual, preview of input and output, and a link to the MARS web site. As a result, the readability, applicability, and user-friendly features of the MARS code have been greatly enhanced.

  9. Improved response function calculations for scintillation detectors using an extended version of the MCNP code

    CERN Document Server

    Schweda, K

    2002-01-01

    The analysis of (e,e'n) experiments at the Darmstadt superconducting electron linear accelerator S-DALINAC required the calculation of neutron response functions for the NE213 liquid scintillation detectors used. In an open geometry, these response functions can be obtained using the Monte Carlo codes NRESP7 and NEFF7. However, for more complex geometries, an extended version of the Monte Carlo code MCNP exists. This extended version of the MCNP code was improved upon by adding individual light-output functions for charged particles. In addition, more than one volume can be defined as a scintillator, thus allowing the simultaneous calculation of the response for multiple detector setups. With the implementation of ¹²C(n,n'3α) reactions, all relevant reactions for neutron energies E_n < 20 MeV are now taken into consideration. The results of these calculations were compared to experimental data using monoenergetic neutrons in an open geometry and a ²⁵²Cf neutron source in th...

  10. Significant Improvement of Catalytic Efficiencies in Ionic Liquids

    International Nuclear Information System (INIS)

    Song, Choong Eui; Yoon, Mi Young; Choi, Doo Seong

    2005-01-01

    The use of ionic liquids as reaction media can confer many advantages upon catalytic reactions over reactions in organic solvents. In ionic liquids, catalysts having polar or ionic character can easily be immobilized without additional structural modification, and thus the ionic solutions containing the catalyst can easily be separated from the reagents and reaction products and then be reused. More interestingly, switching from an organic solvent to an ionic liquid often results in a significant improvement in catalytic performance (e.g., rate acceleration, (enantio)selectivity improvement and an increase in catalyst stability). In this review, some recent interesting results which nicely demonstrate this positive 'ionic liquid effect' on catalysis are discussed

  11. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    A scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth the current level. The coding scheme paves the way for true electronic mail, in which handwritten, typed, or printed messages or diagrams are sent virtually instantaneously - between buildings or between continents. The scheme, called the Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of the resulting delivered messages is improved over messages transmitted by conventional coding. The coding scheme is compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme is automatically translated to word-processor form.

  12. QR Code: An Interactive Mobile Advertising Tool

    Directory of Open Access Journals (Sweden)

    Ela Sibel Bayrak Meydanoglu

    2013-10-01

    Full Text Available Easy and rapid interaction between consumers and marketers enabled by mobile technology has prompted an increase in the usage of mobile media as an interactive marketing tool in recent years. One of the mobile technologies that can be used in interactive marketing for advertising is the QR code (Quick Response Code). Interactive advertising brings some advantages for the companies that apply it. For example, interaction with consumers provides significant information about consumers' preferences. Marketers can use information obtained from consumers for various marketing activities such as customizing advertisement messages, determining the target audience, and improving future products and services. QR codes used in marketing campaigns can provide links to specific websites in which, through various tools (e.g. questionnaires, voting), information about the needs and wants of customers is collected. The aim of this basic research is to illustrate the contribution of QR codes to the realization of the advantages gained by interactive advertising.

  13. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    Science.gov (United States)

    Lee, L.-N.

    1977-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively modest coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal-byte-error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.

  14. Coding and Billing in Surgical Education: A Systems-Based Practice Education Program.

    Science.gov (United States)

    Ghaderi, Kimeya F; Schmidt, Scott T; Drolet, Brian C

    Despite increased emphasis on systems-based practice through the Accreditation Council for Graduate Medical Education core competencies, few studies have examined what surgical residents know about coding and billing. We sought to create and measure the effectiveness of a multifaceted approach to improving resident knowledge and performance in documenting and coding outpatient encounters. We identified knowledge gaps and barriers to documentation and coding in the outpatient setting and implemented a series of educational and workflow interventions with a group of 12 residents in a surgical clinic at a tertiary care center. To measure the effect of this program, we compared billing codes for 1 year before the intervention (FY2012) with prospectively collected data from the postintervention period (FY2013). All related documentation and coding were verified by study-blinded auditors. Interventions took place at the outpatient surgical clinic at Rhode Island Hospital, a tertiary-care center. A cohort of 12 plastic surgery residents ranging from postgraduate year 2 through postgraduate year 6 participated in the interventional sequence. A total of 1285 patient encounters in the preintervention group were compared with 1170 encounters in the postintervention group. Using evaluation and management (E&M) codes as a measure of documentation and coding, we demonstrated a significant and durable increase in billing with supporting clinical documentation after the intervention. For established patient visits, the monthly average E&M code level increased from 2.14 to 3.05, demonstrating improved coding and billing of outpatient clinic encounters. Using externally audited coding data, we demonstrate significantly increased rates of higher-complexity E&M coding in a stable patient population based on improved documentation and billing awareness by the residents. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  15. DESIGN AND IMPLEMENTATION OF IMPROVED SUPERIMPOSED CYCLIC OPTICAL ORTHOGONAL CODES (SCOOC) BASED OPTICAL ENCODER/DECODER STRUCTURE FOR 1GBPS OPTICAL CDMA SYSTEM

    Directory of Open Access Journals (Sweden)

    GURJIT KAUR

    2010-12-01

    Full Text Available In this paper, an improved form of two-dimensional optical orthogonal codes is introduced for an optical CDMA system using just six lasers. This new technique not only reduces the length of the code but also improves the bit error rate (BER) performance of the system. The uniqueness of this coding architecture is that two adjacent codes not only differ in their time slots but also have different wavelength combinations. The encoder and decoder structure has been designed with the help of filters and optical delay lines. An OCDMA system at a 1 Gbps bit rate is designed for the above codes, and its performance is evaluated and compared for various parameters, i.e. number of simultaneous users, bit error rate, and quality factor. The OCDMA system can accommodate 25 users for a permissible BER of 10⁻⁹ with −15 dB received power at a 1 Gbps bit rate. If the received power is kept low, i.e. −22 dB, the OCDMA system can support 16 users with an extremely low BER of 1.58 × 10⁻⁴¹ at a 1 Gbps bit rate.

  16. Short-term hydro generation scheduling of Xiluodu and Xiangjiaba cascade hydropower stations using improved binary-real coded bee colony optimization algorithm

    International Nuclear Information System (INIS)

    Lu, Peng; Zhou, Jianzhong; Wang, Chao; Qiao, Qi; Mo, Li

    2015-01-01

    Highlights: • STHGS problem is decomposed into two parallel sub-problems of UC and ELD. • Binary coded BCO is used to solve UC sub-problem with 0–1 discrete variables. • Real coded BCO is used to solve ELD sub-problem with continuous variables. • Some heuristic repairing strategies are designed to handle various constraints. • The STHGS of Xiluodu and Xiangjiaba cascade stations is solved by IB-RBCO. - Abstract: Short-term hydro generation scheduling (STHGS) of cascade hydropower stations is a typical nonlinear mixed-integer optimization problem that minimizes the total water consumption while simultaneously meeting the grid requirements and other hydraulic and electrical constraints. In this paper, the STHGS problem is decomposed into two parallel sub-problems of unit commitment (UC) and economic load dispatch (ELD), and the improved binary-real coded bee colony optimization (IB-RBCO) algorithm is proposed to solve them. First, the improved binary coded BCO is used to solve the UC sub-problem with 0–1 discrete variables, and a heuristic repairing strategy for unit state constraints is applied to generate a feasible unit commitment schedule. Then, the improved real coded BCO is used to solve the ELD sub-problem with continuous variables, and an effective method is introduced to handle various unit operation constraints. In particular, the new updating strategy of the DE/best/2/bin method with a dynamic parameter control mechanism is applied to the real coded BCO to improve the search ability of IB-RBCO. Finally, to verify the feasibility and effectiveness of the proposed IB-RBCO method, it is applied to solve the STHGS problem of the Xiluodu and Xiangjiaba cascaded hydropower stations, and the simulation results are compared with other intelligence algorithms. The simulation results demonstrate that the proposed IB-RBCO method can obtain higher-quality solutions with less water consumption and shorter calculation time when facing the complex STHGS problem.
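
    The DE/best/2/bin update mentioned above can be sketched in a few lines. The following Python/NumPy fragment shows only the mutation and binomial crossover step on a generic real-valued population; the bee-colony bookkeeping, the constraint-repair strategies and the hydro-unit objective are omitted, and the values of F, CR and the toy sphere objective are illustrative assumptions.

    # Minimal sketch of the DE/best/2/bin update used for the real-coded ELD
    # sub-problem: mutation around the best individual with two difference
    # vectors, followed by binomial crossover. Constraint repair and the
    # hydro-specific objective are omitted; F, CR and the toy objective are
    # illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)


    def de_best_2_bin(pop, fitness, F=0.5, CR=0.9):
        """Return a trial population generated by DE/best/2/bin."""
        n, dim = pop.shape
        best = pop[np.argmin(fitness)]
        trials = np.empty_like(pop)
        for i in range(n):
            r1, r2, r3, r4 = rng.choice([j for j in range(n) if j != i],
                                         size=4, replace=False)
            mutant = best + F * (pop[r1] - pop[r2]) + F * (pop[r3] - pop[r4])
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True      # guarantee at least one gene
            trials[i] = np.where(cross, mutant, pop[i])
        return trials


    # Toy usage: minimise a sphere function with a population of 20 vectors.
    pop = rng.uniform(-5, 5, size=(20, 4))
    for _ in range(50):
        fit = np.sum(pop ** 2, axis=1)
        trial = de_best_2_bin(pop, fit)
        trial_fit = np.sum(trial ** 2, axis=1)
        improved = trial_fit < fit
        pop[improved] = trial[improved]
    print("best objective:", np.sum(pop ** 2, axis=1).min())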

  17. Training directionally selective motion pathways can significantly improve reading efficiency

    Science.gov (United States)

    Lawton, Teri

    2004-06-01

    This study examined whether perceptual learning at early levels of visual processing would facilitate learning at higher levels of processing. This was examined by determining whether training the motion pathways by practicing left-right movement discrimination, as found previously, would improve the reading skills of inefficient readers significantly more than another computer game, a word discrimination game, or the reading program offered by the school. This controlled validation study found that practicing left-right movement discrimination 5-10 minutes twice a week (rapidly) for 15 weeks doubled reading fluency and significantly improved all reading skills by more than one grade level, whereas inefficient readers in the control groups barely improved on these reading skills. In contrast to previous studies of perceptual learning, these experiments show that perceptual learning of direction discrimination significantly improved reading skills determined at higher levels of cognitive processing, thereby generalizing to a new task. The deficits in reading performance and attentional focus experienced by readers who struggle are suggested to result from an information overload caused by timing deficits in the direction-selectivity network proposed by Russell De Valois et al. (2000), deficits that diminish following practice on direction discrimination. This study found that practicing direction discrimination rapidly transitions the inefficient 7-year-old reader into an efficient reader.

  18. Proposals for an effective application of the continuous improvement at the ININ according to the IAEA 50-C/SG-Q new code

    International Nuclear Information System (INIS)

    Cardenas A, B.M.; Olivares O, L.A.

    1997-01-01

    This work addresses the requirements for continuous improvement contained in the new IAEA code "Quality assurance for safety in nuclear power plants and other nuclear installations", code 50-C/SG-Q. Since this code was the basis for elaborating revision No. 5 of the Quality Assurance Plan at ININ, an analysis was carried out to propose an effective application of continuous improvement. The relevant points which must be taken into account in the continuous improvement process are: management responsibility, involvement of all personnel, process planning, education and training, elaboration of improvement projects, investigation of processes which can be improved, follow-up of the improvement process, and its evaluation. With the implementation of an effective continuous improvement system, better quality and more efficient safety will be obtained. (Author)

  19. Design of convolutional tornado code

    Science.gov (United States)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environment and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and reduce computational complexity by abrogating the multi-level structure. The simulation results show that cTN code can provide a better packet loss protection performance with lower computation complexity than tTN code.

  20. Development of fast reactor containment safety analysis code, CONTAIN-LMR. (3) Improvement of sodium-concrete reaction model

    International Nuclear Information System (INIS)

    Kawaguchi, Munemichi; Doi, Daisuke; Seino, Hiroshi; Miyahara, Shinya

    2015-01-01

    A computer code, CONTAIN-LMR, is an integrated analysis tool to predict the consequences of severe accidents in a liquid metal fast reactor. Because sodium-concrete reaction behavior is one of the most important phenomena in such an accident, a Sodium-Limestone Concrete Ablation Model (SLAM) was developed and installed into the original CONTAIN code at Sandia National Laboratories (SNL) in the U.S. The SLAM treats chemical reaction kinetics between the sodium and the concrete compositions mechanistically using a three-region model, containing a pool (sodium and reaction debris) region, a dry (boundary layer (B/L) and dehydrated concrete) region, and a wet (hydrated concrete) region; however, its application is limited to the reaction between sodium and limestone concrete. In order to apply SLAM to the reaction between sodium and siliceous concrete, which is the ordinary structural concrete in Japan, the chemical reaction kinetics model has been improved to consider the new chemical reactions between sodium and silicon dioxide. The improved model was validated by analyzing a series of sodium-concrete experiments conducted at the Japan Atomic Energy Agency (JAEA). Relatively good agreement between the calculation and experimental results is obtained, and the CONTAIN-LMR code has been validated with regard to sodium-concrete reaction phenomena. (author)

  1. Temporal Coding of Volumetric Imagery

    Science.gov (United States)

    Llull, Patrick Ryan

    of other information within that video; namely, focal and spectral information. The next part of the thesis demonstrates derivative works of CACTI: compressive extended depth of field and compressive spectral-temporal imaging. These works successfully show the technique's extension of temporal coding to improve sensing performance in these other dimensions. Geometrical optics-related tradeoffs, such as the classic challenges of wide-field-of-view and high-resolution photography, have motivated the development of multiscale camera arrays. The advent of such designs less than a decade ago heralds a new era of research- and engineering-related challenges. One significant challenge is that of managing the focal volume (x,y,z ) over wide fields of view and resolutions. The fourth chapter shows advances on focus and image quality assessment for a class of multiscale gigapixel cameras developed at Duke. Along the same line of work, we have explored methods for dynamic and adaptive addressing of focus via point spread function engineering. We demonstrate another form of temporal coding in the form of physical translation of the image plane from its nominal focal position. We demonstrate this technique's capability to generate arbitrary point spread functions.

  2. Low Complexity List Decoding for Polar Codes with Multiple CRC Codes

    Directory of Open Access Journals (Sweden)

    Jong-Hwan Kim

    2017-04-01

    Full Text Available Polar codes are the first family of error correcting codes that provably achieve the capacity of symmetric binary-input discrete memoryless channels with low complexity. Since the development of polar codes, there have been many studies to improve their finite-length performance. As a result, polar codes are now adopted as a channel code for the control channel of 5G new radio of the 3rd generation partnership project. However, the decoder implementation is one of the big practical problems, and low complexity decoding has been studied. This paper addresses a low complexity successive cancellation list decoding for polar codes utilizing multiple cyclic redundancy check (CRC) codes. While some research uses multiple CRC codes to reduce memory and time complexity, we consider the operational complexity of decoding and reduce it by optimizing CRC positions in combination with a modified decoding operation. As a result, the proposed scheme obtains not only complexity reduction from early stopping of decoding, but also additional reduction from the reduced number of decoding paths.
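
    The idea of distributing several CRCs over the information bits, so that each segment can be checked as soon as it has been decoded, can be sketched as follows. The polar encoding and the successive cancellation list decoder itself are omitted; the CRC-8 polynomial and the fixed segment length are illustrative assumptions rather than the optimized CRC positions of the paper.

    # Minimal sketch of distributing multiple CRCs over the information bits so
    # that a successive-cancellation list decoder can verify each segment as
    # soon as its bits are decoded and prune paths or stop early. The polar
    # encoding and the list decoder are omitted; CRC-8 and the segment length
    # are illustrative assumptions.
    import random

    CRC8_POLY = 0x07  # x^8 + x^2 + x + 1


    def crc8(bits):
        """Bitwise CRC-8 over a list of 0/1 values, returned as 8 bits."""
        reg = 0
        for b in bits:
            reg ^= b << 7
            reg = ((reg << 1) ^ CRC8_POLY) & 0xFF if reg & 0x80 else (reg << 1) & 0xFF
        return [(reg >> i) & 1 for i in range(7, -1, -1)]


    def attach_segment_crcs(info_bits, seg_len=16):
        """Split info bits into segments and append a CRC-8 to each segment."""
        out = []
        for i in range(0, len(info_bits), seg_len):
            seg = info_bits[i:i + seg_len]
            out.extend(seg + crc8(seg))
        return out


    def check_segment(decoded_bits, seg_index, seg_len=16):
        """Check one decoded segment; an SCL decoder would call this as soon as
        the segment is available and discard paths that fail."""
        start = seg_index * (seg_len + 8)
        seg = decoded_bits[start:start + seg_len]
        rx_crc = decoded_bits[start + seg_len:start + seg_len + 8]
        return crc8(seg) == rx_crc


    # Toy usage: protect 48 information bits with three segment CRCs.
    random.seed(1)
    info = [random.randint(0, 1) for _ in range(48)]
    coded = attach_segment_crcs(info)
    print(all(check_segment(coded, k) for k in range(3)))   # True if intact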

  3. A strategy of implementation of the improved constitutive equations for the advanced subchannel code

    International Nuclear Information System (INIS)

    Shirai, Hiroshi; Hotta, Akitoshi; Ninokata, Hisashi

    2004-01-01

    To develop the advanced subchannel analysis code, the dominant factors that influence the boiling transitional process must be taken into account in the mechanistic constitutive equations based on the flow geometries and the fluid properties. The dominant factors that influence the boiling transitional processes are (1) the gas-liquid re-distribution by cross flow, (2) the liquid film dryout, (3) the two-phase flow regime transition, (4) the droplet deposition, and (5) the spacer-droplet interaction. At first, we indicated the strategy for the development of the constitutive equations for the five dominant factors based on the experimental database by the latest measurement technique and the latest computational fluid dynamics method. Then, the problems of the present constitutive equations and the improvement plan of the constitutive equations were indicated. Finally, the layered structure for the two-phase/three-field subchannel code including the new constitutive equations was designed. (author)

  4. OSCAR-4 Code System Application to the SAFARI-1 Reactor

    International Nuclear Information System (INIS)

    Stander, Gerhardt; Prinsloo, Rian H.; Tomasevic, Djordje I.; Mueller, Erwin

    2008-01-01

    The OSCAR reactor calculation code system consists of a two-dimensional lattice code, the three-dimensional nodal core simulator code MGRAC and related service codes. The major difference between the new version of the OSCAR system, OSCAR-4, and its predecessor, OSCAR-3, is the new version of MGRAC which contains many new features and model enhancements. In this work some of the major improvements in the nodal diffusion solution method, history tracking, nuclide transmutation and cross section models are described. As part of the validation process of the OSCAR-4 code system (specifically the new MGRAC version), some of the new models are tested by comparing computational results to SAFARI-1 reactor plant data for a number of operational cycles and for varying applications. A specific application of the new features allows correct modeling of, amongst others, the movement of fuel-follower type control rods and dynamic in-core irradiation schedules. It is found that the effect of the improved control rod model, applied over multiple cycles of the SAFARI-1 reactor operation history, has a significant effect on in-cycle reactivity prediction and fuel depletion. (authors)

  5. Intercomparison of liquid metal fast reactor seismic analysis codes. V. 2: Verification and improvement of reactor core seismic analysis codes using core mock-up experiments. Proceedings of a research co-ordination meeting held in Vienna, 26-28 September 1994

    International Nuclear Information System (INIS)

    1995-10-01

    This report (Volume II) contains the papers summarizing the verification of and improvement to the codes on the basis of the French and Japanese data. Volume I: ''Validation of the Seismic Analysis Codes Using the Reactor Code Experiments'' (IAEA-TECDOC-798) included the Italian PEC reactor data. Refs, figs and tabs

  6. Intercomparison of liquid metal fast reactor seismic analysis codes. V. 2: Verification and improvement of reactor core seismic analysis codes using core mock-up experiments. Proceedings of a research co-ordination meeting held in Vienna, 26-28 September 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-10-01

    This report (Volume II) contains the papers summarizing the verification of and improvement to the codes on the basis of the French and Japanese data. Volume I: "Validation of the Seismic Analysis Codes Using the Reactor Code Experiments" (IAEA-TECDOC-798) included the Italian PEC reactor data. Refs, figs and tabs.

  7. Improving Eleventh Graders’ Reading Comprehension Through Text Coding and Double Entry Organizer Reading Strategies

    Directory of Open Access Journals (Sweden)

    Rocío Mahecha

    2011-07-01

    Full Text Available In this article we report on an innovation project developed with a group of eleventh graders at a public school in Bogotá. Its aim was to encourage students to improve reading comprehension of texts in English. It was conducted taking into account students' needs, interests and level of English. To do it, we implemented two reading strategies: text coding and double entry organizer. We observed the students' attitudes during two lesson plans, compared their level of comprehension before and after using the reading strategies and asked them to self-evaluate their performance. At the end, we could see their improvement, how they enjoyed doing the activities and became more confident.

  8. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan; Gao, Xin

    2014-01-01

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new representations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.
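
    A generic form of such a unified objective, written here as a hedged sketch rather than the authors' exact formulation, combines reconstruction, sparsity, graph/manifold smoothness of the codes, and a linear-classifier loss over the (partially known) labels:

    % Sketch of a unified semi-supervised sparse coding objective; the weights
    % lambda, alpha, beta and the graph affinities A_ij are assumptions.
    \min_{D,\,\{s_i\},\,\{y_i\},\,W,\,b}\;
      \sum_{i=1}^{n}\Bigl(\lVert x_i - D s_i\rVert_2^2 + \lambda\lVert s_i\rVert_1\Bigr)
      + \alpha\sum_{i,j} A_{ij}\,\lVert s_i - s_j\rVert_2^2
      + \beta\sum_{i=1}^{n}\lVert y_i - W s_i - b\rVert_2^2
      \quad\text{s.t.}\; y_i = \hat{y}_i \text{ for labeled samples } i,

    where D is the codebook, s_i are the sparse codes, A_ij encodes the manifold structure of the data, and (W, b) is the linear classifier that predicts the class labels y_i directly from the codes.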

  9. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-07-06

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new representations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.

  10. RELAP5/MOD2 code assessment

    International Nuclear Information System (INIS)

    Nithianandan, C.K.; Shah, N.H.; Schomaker, R.J.; Miller, F.R.

    1985-01-01

    Babcock and Wilcox (B and W) has been working with the code developers at EG and G and the US Nuclear Regulatory Commission in assessing the RELAP5/MOD2 computer code for the past year by simulating selected separate-effects tests. The purpose of this assessment has been to evaluate the code for use in MIST (Ref. 2) and OTIS integral system tests simulations and in the prediction of pressurized water reactor transients. B and W evaluated various versions of the code and made recommendations to improve code performance. As a result, the currently released version (cycle 36.1) has been improved considerably over earlier versions. However, further refinements to some of the constitutive models may still be needed to further improve the predictive capability of RELAP5/MOD2. The following versions of the code were evaluated. (1) RELAP/MOD2/Cycle 22 - first released version; (2) YELAP5/Cycle 32 - EG and G test version of RELAP5/MOD2/Cycle 32; (3) RELAP5/MOD2/Cycle 36 - frozen cycle for international code assessment; (4) updates to cycle 36 based on recommendations developed by B and W during the simulation of a Massachusetts Institute of Technology (MIT) pressurizer test; and (5) cycle 36.1 updates received from EG and G

  11. RELAP5/MOD2 code assessment

    Energy Technology Data Exchange (ETDEWEB)

    Nithianandan, C.K.; Shah, N.H.; Schomaker, R.J.; Miller, F.R.

    1985-11-01

    Babcock and Wilcox (B and W) has been working with the code developers at EG and G and the US Nuclear Regulatory Commission in assessing the RELAP5/MOD2 computer code for the past year by simulating selected separate-effects tests. The purpose of this assessment has been to evaluate the code for use in MIST (Ref. 2) and OTIS integral system tests simulations and in the prediction of pressurized water reactor transients. B and W evaluated various versions of the code and made recommendations to improve code performance. As a result, the currently released version (cycle 36.1) has been improved considerably over earlier versions. However, further refinements to some of the constitutive models may still be needed to further improve the predictive capability of RELAP5/MOD2. The following versions of the code were evaluated. (1) RELAP/MOD2/Cycle 22 - first released version; (2) YELAP5/Cycle 32 - EG and G test version of RELAP5/MOD2/Cycle 32; (3) RELAP5/MOD2/Cycle 36 - frozen cycle for international code assessment; (4) updates to cycle 36 based on recommendations developed by B and W during the simulation of a Massachusetts Institute of Technology (MIT) pressurizer test; and (5) cycle 36.1 updates received from EG and G.

  12. Random linear codes in steganography

    Directory of Open Access Journals (Sweden)

    Kamil Kaczyński

    2016-12-01

    Full Text Available Syndrome coding using linear codes is a technique that allows improvement of the steganographic algorithm's parameters. The use of random linear codes gives great flexibility in choosing the parameters of the linear code. In parallel, it offers easy generation of the parity-check matrix. In this paper, a modification of the LSB algorithm is presented. A random linear [8, 2] code was used as the base for the algorithm modification. The implementation of the proposed algorithm, along with a practical evaluation of the algorithm's parameters based on test images, was made. Keywords: steganography, random linear codes, RLC, LSB
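
    The embedding step of syndrome coding can be sketched directly: the hidden bits are carried in the syndrome of the cover block's LSB vector, and the embedder flips the fewest LSBs needed to reach the target syndrome. The dimensions below (8 cover LSBs carrying 6 hidden bits) are consistent with an [8, 2]-type code, but the systematic random parity-check matrix and the brute-force coset-leader search are illustrative simplifications, not the exact construction of the paper.

    # Minimal sketch of LSB steganography via syndrome coding with a random
    # binary linear code: the hidden message is the syndrome H @ x (mod 2) of
    # the block's LSB vector, and embedding flips a minimum-weight pattern to
    # reach the target syndrome (brute force, fine for small blocks).
    import itertools
    import numpy as np

    rng = np.random.default_rng(7)
    n, m = 8, 6                                   # 8 cover LSBs carry 6 hidden bits
    # Systematic parity-check matrix H = [I | R] of a random binary linear code;
    # the systematic form guarantees every 6-bit syndrome is reachable.
    H = np.hstack([np.eye(m, dtype=int), rng.integers(0, 2, size=(m, n - m))])


    def embed(lsb_block, msg_bits):
        """Flip as few LSBs as possible so that H @ stego_lsb == msg (mod 2)."""
        lsb = np.asarray(lsb_block) % 2
        target = (H @ lsb + np.asarray(msg_bits)) % 2     # syndrome to cancel
        for weight in range(n + 1):                       # search by increasing weight
            for idx in itertools.combinations(range(n), weight):
                e = np.zeros(n, dtype=int)
                e[list(idx)] = 1
                if np.array_equal((H @ e) % 2, target):
                    return (lsb + e) % 2


    def extract(lsb_block):
        return (H @ (np.asarray(lsb_block) % 2)) % 2


    cover = rng.integers(0, 2, size=n)            # LSBs of 8 cover pixels
    message = rng.integers(0, 2, size=m)
    stego = embed(cover, message)
    print("changed LSBs:", int(np.sum(cover != stego)))
    print("recovered ok:", np.array_equal(extract(stego), message))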

  13. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    Science.gov (United States)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.

  14. Assessment of subchannel code ASSERT-PV for flow-distribution predictions

    International Nuclear Information System (INIS)

    Nava-Dominguez, A.; Rao, Y.F.; Waddington, G.M.

    2014-01-01

    Highlights: • Assessment of the subchannel code ASSERT-PV 3.2 for the prediction of flow distribution. • Open literature and in-house experimental data to quantify ASSERT-PV predictions. • Model changes assessed against vertical and horizontal flow experiments. • Improvement of flow-distribution predictions under CANDU-relevant conditions. - Abstract: This paper reports an assessment of the recently released subchannel code ASSERT-PV 3.2 for the prediction of flow-distribution in fuel bundles, including subchannel void fraction, quality and mass fluxes. Experimental data from open literature and from in-house tests are used to assess the flow-distribution models in ASSERT-PV 3.2. The prediction statistics using the recommended model set of ASSERT-PV 3.2 are compared to those from previous code versions. Separate-effects sensitivity studies are performed to quantify the contribution of each flow-distribution model change or enhancement to the improvement in flow-distribution prediction. The assessment demonstrates significant improvement in the prediction of flow-distribution in horizontal fuel channels containing CANDU bundles

  15. Assessment of subchannel code ASSERT-PV for flow-distribution predictions

    Energy Technology Data Exchange (ETDEWEB)

    Nava-Dominguez, A., E-mail: navadoma@aecl.ca; Rao, Y.F., E-mail: raoy@aecl.ca; Waddington, G.M., E-mail: waddingg@aecl.ca

    2014-08-15

    Highlights: • Assessment of the subchannel code ASSERT-PV 3.2 for the prediction of flow distribution. • Open literature and in-house experimental data to quantify ASSERT-PV predictions. • Model changes assessed against vertical and horizontal flow experiments. • Improvement of flow-distribution predictions under CANDU-relevant conditions. - Abstract: This paper reports an assessment of the recently released subchannel code ASSERT-PV 3.2 for the prediction of flow-distribution in fuel bundles, including subchannel void fraction, quality and mass fluxes. Experimental data from open literature and from in-house tests are used to assess the flow-distribution models in ASSERT-PV 3.2. The prediction statistics using the recommended model set of ASSERT-PV 3.2 are compared to those from previous code versions. Separate-effects sensitivity studies are performed to quantify the contribution of each flow-distribution model change or enhancement to the improvement in flow-distribution prediction. The assessment demonstrates significant improvement in the prediction of flow-distribution in horizontal fuel channels containing CANDU bundles.

  16. Spatial attention improves the quality of population codes in human visual cortex.

    Science.gov (United States)

    Saproo, Sameer; Serences, John T

    2010-08-01

    Selective attention enables sensory input from behaviorally relevant stimuli to be processed in greater detail, so that these stimuli can more accurately influence thoughts, actions, and future goals. Attention has been shown to modulate the spiking activity of single feature-selective neurons that encode basic stimulus properties (color, orientation, etc.). However, the combined output from many such neurons is required to form stable representations of relevant objects and little empirical work has formally investigated the relationship between attentional modulations on population responses and improvements in encoding precision. Here, we used functional MRI and voxel-based feature tuning functions to show that spatial attention induces a multiplicative scaling in orientation-selective population response profiles in early visual cortex. In turn, this multiplicative scaling correlates with an improvement in encoding precision, as evidenced by a concurrent increase in the mutual information between population responses and the orientation of attended stimuli. These data therefore demonstrate how multiplicative scaling of neural responses provides at least one mechanism by which spatial attention may improve the encoding precision of population codes. Increased encoding precision in early visual areas may then enhance the speed and accuracy of perceptual decisions computed by higher-order neural mechanisms.
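
    As an illustration of the mechanism (and not of the authors' fMRI analysis), the following simulation applies a purely multiplicative gain to a bank of orientation-tuned channels with Poisson noise and shows that a simple template-matching decoder becomes more precise; the number of channels, tuning width, gain value and trial count are all assumptions.

    # Illustrative simulation of how a multiplicative attentional gain on
    # orientation tuning curves improves the precision of a population code:
    # responses are drawn with Poisson noise and decoded by template matching
    # against the noise-free tuning curves.
    import numpy as np

    rng = np.random.default_rng(3)
    prefs = np.arange(0, 180, 10)                 # 18 orientation channels (deg)


    def tuning(theta, gain=1.0, kappa=4.0):
        """Von Mises-like tuning curves over the 180-degree orientation space."""
        d = np.deg2rad(2 * (theta - prefs))       # orientation has 180-deg period
        return gain * 10.0 * np.exp(kappa * (np.cos(d) - 1))


    def mean_error(gain, trials=2000):
        grid = np.arange(0, 180, 1.0)
        templates = np.array([tuning(t, gain) for t in grid])
        errs = []
        for _ in range(trials):
            theta = rng.uniform(0, 180)
            resp = rng.poisson(tuning(theta, gain))      # noisy population response
            est = grid[np.argmax(templates @ resp)]      # template-matching decode
            err = abs(est - theta)
            errs.append(min(err, 180 - err))             # circular error
        return np.mean(errs)


    print("mean decoding error, gain 1.0:", round(mean_error(1.0), 2), "deg")
    print("mean decoding error, gain 2.0:", round(mean_error(2.0), 2), "deg")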

  17. Efficient coding schemes with power allocation using space-time-frequency spreading

    Institute of Scientific and Technical Information of China (English)

    Jiang Haining; Luo Hanwen; Tian Jifeng; Song Wentao; Liu Xingzhao

    2006-01-01

    An efficient space-time-frequency (STF) coding strategy for multi-input multi-output orthogonal frequency division multiplexing (MIMO-OFDM) systems is presented for high bit rate data transmission over frequency selective fading channels. The proposed scheme is a new approach to space-time-frequency coded OFDM (COFDM) that combines OFDM with space-time coding, linear precoding and adaptive power allocation to provide higher quality of transmission in terms of the bit error rate performance and power efficiency. In addition to exploiting the maximum diversity gain in frequency, time and space, the proposed scheme enjoys high coding advantages and low-complexity decoding. The significant performance improvement of our design is confirmed by corroborating numerical simulations.

  18. Ultrasound imaging using coded signals

    DEFF Research Database (Denmark)

    Misaridis, Athanasios

    Modulated (or coded) excitation signals can potentially improve the quality and increase the frame rate in medical ultrasound scanners. The aim of this dissertation is to investigate systematically the applicability of modulated signals in medical ultrasound imaging and to suggest appropriate methods for coded imaging, with the goal of making better anatomic and flow images and three-dimensional images. In the first stage, it investigates techniques for doing high-resolution coded imaging with improved signal-to-noise ratio compared to conventional imaging. Subsequently it investigates how coded excitation can be used for increasing the frame rate. The work includes both simulated results using Field II, and experimental results based on measurements on phantoms as well as clinical images. Initially a mathematical foundation of signal modulation is given. Pulse compression based...

  19. Research on the improvement of nuclear safety -The development of a severe accident analysis code-

    International Nuclear Information System (INIS)

    Kim, Heui Dong; Cho, Sung Won; Park, Jong Hwa; Hong, Sung Wan; Yoo, Dong Han; Hwang, Moon Kyoo; Noh, Kee Man; Song, Yong Man

    1995-07-01

    For prevention and mitigation of containment failure during a severe accident, the study focuses on severe accident phenomena, especially those occurring inside the cavity, and is intended to improve existing models and develop analytical tools for the assessment of severe accidents. A correlation equation for the flame velocity of premixed H2/air/steam gas has been suggested, and the combustion flame characteristics were analyzed using a developed computer code. For the analysis of the expansion phase of vapor explosion, a mechanical model has been developed. The development of a debris entrainment model in a reactor cavity with captured volume has been continued to review and examine the limitations and deficiencies of the existing models. A pre-test calculation was performed to support the severe accident experiment for the molten corium concrete interaction study, and the crust formation process and heat transfer characteristics of the crust have been investigated. A stress analysis code was developed using the finite element method for the reactor vessel lower head failure analysis. Through the international PHEBUS-FP program and participation in the software development, research on the core degradation process and fission product release and transportation is underway. The CONTAIN and MELCOR codes were continuously updated under the cooperation with the USNRC, and French computer codes such as ICARE2, ESCADRE, and SOPHAEROS were also installed on the SUN workstation. 204 figs, 61 tabs, 87 refs. (Author)

  20. Improvements of Physical Models in TRITGO code for Tritium Behavior Analysis in VHTR

    International Nuclear Information System (INIS)

    Yoo, Jun Soo; Tak, Nam Il; Lim, Hong Sik

    2010-01-01

    Since tritium is a radioactive material with a half-life of 12.32 years and is generated by ternary fission reactions in the fuel as well as by neutron absorption reactions of impurities in the Very High Temperature gas-cooled Reactor (VHTR) core, accurate prediction of tritium behavior and its concentration in the product hydrogen is critically important in terms of public safety for its construction. In this respect, the TRITGO code was developed by General Atomics (GA) for estimating the tritium production and distribution in high temperature gas-cooled reactors. However, some models in it are hard-wired to a specific reactor type or too simplified, which makes the analysis results less applicable. Thus, major improvements need to be considered for better predictions. In this study, several model improvements have been suggested and their effects evaluated based on analysis work against the PMR600 design concept.

  1. Fresh Prime Codes Evaluation for Synchronous PPM and OPPM Signaling for Optical CDMA Networks

    Science.gov (United States)

    Karbassian, M. Massoud; Ghafouri-Shiraz, H.

    2007-06-01

    In this paper, we have proposed a novel prime spreading sequence family hereby referred to as “Double-Padded Modified Prime Code (DPMPC)” for direct-detection synchronous optical code-division multiple-access (OCDMA) networks. The new code is applied to both pulse-position and overlapping pulse-position modulation CDMA networks, and their performances were evaluated and compared with existing prime codes family. In addition, we have analyzed the system throughput and also introduced a new interference cancellation technique which significantly improves the bit error probability of OCDMA networks.

  2. Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction.

    Science.gov (United States)

    Wu, Yueying; Liu, Pengyu; Gao, Yuan; Jia, Kebin

    2016-01-01

    High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to implement the otherness encoding for ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems.
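
    The hierarchical ROI/non-ROI quantization idea can be sketched independently of the encoder: blocks overlapping the diagnostic ROI receive a lower (finer) QP and background blocks a higher (coarser) one. The H.265/HEVC encoding itself, the transform-coefficient adjustment and the textural ROI detector are omitted, and the block size and QP offsets below are illustrative assumptions.

    # Minimal sketch of ROI-driven quantization: blocks of the frame that
    # overlap the diagnostic ROI mask are assigned a lower (finer) QP and
    # background blocks a higher (coarser) QP. The HEVC encoder, coefficient
    # adjustment and ROI detector are omitted.
    import numpy as np


    def qp_map(roi_mask, base_qp=32, roi_offset=-6, bg_offset=+4, block=64):
        """Return one QP per coding block for a binary ROI mask (H x W)."""
        h, w = roi_mask.shape
        rows, cols = -(-h // block), -(-w // block)       # ceiling division
        qps = np.empty((rows, cols), dtype=int)
        for r in range(rows):
            for c in range(cols):
                tile = roi_mask[r * block:(r + 1) * block, c * block:(c + 1) * block]
                qps[r, c] = base_qp + (roi_offset if tile.any() else bg_offset)
        return np.clip(qps, 0, 51)                        # valid 8-bit HEVC QP range


    # Toy usage: a 256x256 frame whose central 96x96 region is the ROI.
    mask = np.zeros((256, 256), dtype=bool)
    mask[80:176, 80:176] = True
    print(qp_map(mask))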

  3. Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction.

    Directory of Open Access Journals (Sweden)

    Yueying Wu

    Full Text Available High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to implement the otherness encoding for ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems.

  4. Adaptation of radiation shielding code to space environment

    International Nuclear Information System (INIS)

    Okuno, Koichi; Hara, Akihisa

    1992-01-01

    Recently, interest in space development has increased. Space development involves many problems, one of which is protection from cosmic rays. Cosmic rays are radiation of ultrahigh energy, and until now there has been no radiation shielding design code that can cope with them. Therefore, the high energy radiation shielding design code for accelerators was improved to handle the peculiarities of cosmic rays. Moreover, the radiation dose equivalent rate in a moon base incorporating countermeasures against cosmic rays was simulated using the improved code. As an important countermeasure for protection from radiation, covering with regolith is employed, and the effect of regolith was confirmed using the improved code. Galactic cosmic rays, solar flare particles, the radiation belt, the adaptation of the radiation shielding code HERMES to the space environment, the improvement of the three-dimensional hadron cascade code HETCKFA-2 and the electromagnetic cascade code EGS 4-KFA, and the cosmic ray simulation are reported. (K.I.)

  5. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

    Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low-weight codewords, which gives a further advantage in the error floor region.

  6. Optical code division multiple access secure communications systems with rapid reconfigurable polarization shift key user code

    Science.gov (United States)

    Gao, Kaiqiang; Wu, Chongqing; Sheng, Xinzhi; Shang, Chao; Liu, Lanlan; Wang, Jian

    2015-09-01

    An optical code division multiple access (OCDMA) secure communications system scheme with a rapidly reconfigurable polarization shift keying (Pol-SK) bipolar user code is proposed and demonstrated. Compared to fixed-code OCDMA, constantly changing the user code greatly improves the anti-eavesdropping performance. A Pol-SK OCDMA experiment with a 10 Gchip/s user code and a 1.25 Gb/s payload user data rate has been realized, which means this scheme has good tolerance and could be easily realized.

  7. Code meshing: Online bilingual tutoring in Higher Education

    Directory of Open Access Journals (Sweden)

    Batyi, Thelma Thokozile

    2016-12-01

    Full Text Available Students’ academic writing literacies are required to express their knowledge, as academic writing is the common mode of assessment in higher education. 28 isiXhosa-speaking first-year diploma students, who failed an academic literacies admission test evaluating the level of their academic writing literacies in the Business faculty, participated once a week over a period of eight months in a course including the practice of code meshing. In the June and November Tourism Communication tests, which also evaluated their academic writing literacies, there was a significant difference in the mean scores when compared to the admission test in the Business faculty. Their academic writing had also improved, according to their assignment marks. The researcher in this project provides evidence that code meshing as a bi/multilingual strategy could be used to improve academic writing literacies in students.

  8. ComboCoding: Combined intra-/inter-flow network coding for TCP over disruptive MANETs

    Directory of Open Access Journals (Sweden)

    Chien-Chia Chen

    2011-07-01

    Full Text Available TCP over wireless networks is challenging due to random losses and ACK interference. Although network coding schemes have been proposed to improve TCP robustness against extreme random losses, a critical problem still remains of DATA–ACK interference. To address this issue, we use inter-flow coding between DATA and ACK to reduce the number of transmissions among nodes. In addition, we also utilize a “pipeline” random linear coding scheme with adaptive redundancy to overcome high packet loss over unreliable links. The resulting coding scheme, ComboCoding, combines intra-flow and inter-flow coding to provide robust TCP transmission in disruptive wireless networks. The main contributions of our scheme are twofold: the efficient combination of random linear coding and XOR coding on bi-directional streams (DATA and ACK), and the novel redundancy control scheme that adapts to time-varying and space-varying link loss. The adaptive ComboCoding was tested on a variable hop string topology with unstable links and on a multipath MANET with dynamic topology. Simulation results show that TCP with ComboCoding delivers higher throughput than with other coding options in high loss and mobile scenarios, while introducing minimal overhead in normal operation.
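
    The inter-flow half of the scheme can be sketched with a few lines of Python: a relay XORs a DATA packet and an ACK packet travelling in opposite directions into a single coded transmission, and each endpoint recovers the packet it is missing by XORing with the one it already holds. The intra-flow random linear coding and the adaptive redundancy control are not shown, and the packet framing below is a simplification.

    # Minimal sketch of inter-flow coding between DATA and ACK: the relay
    # broadcasts one XOR-coded packet instead of two transmissions, and each
    # endpoint XORs it with the packet it already holds to recover the other.
    # Length/framing fields of a real protocol are omitted.

    def xor_packets(a: bytes, b: bytes) -> bytes:
        """XOR two packets, zero-padding the shorter one."""
        n = max(len(a), len(b))
        a, b = a.ljust(n, b"\x00"), b.ljust(n, b"\x00")
        return bytes(x ^ y for x, y in zip(a, b))


    # Relay holds a DATA packet heading one way and an ACK heading the other.
    data = b"DATA: payload bytes for the forward TCP flow"
    ack = b"ACK: 4321"
    coded = xor_packets(data, ack)

    # The DATA sender already holds `data`, so it can recover the ACK;
    # the ACK sender already holds `ack`, so it can recover the DATA.
    # (rstrip only removes the demo padding; a header would carry lengths.)
    recovered_ack = xor_packets(coded, data).rstrip(b"\x00")
    recovered_data = xor_packets(coded, ack).rstrip(b"\x00")
    print(recovered_ack == ack, recovered_data == data)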

  9. Hamming Code Based Watermarking Scheme for 3D Model Verification

    Directory of Open Access Journals (Sweden)

    Jen-Tse Wang

    2014-01-01

    Full Text Available Due to the explosive growth of the Internet and the maturing of 3D hardware techniques, protecting 3D objects is becoming a more and more important issue. In this paper, a public Hamming code based fragile watermarking technique is proposed for 3D object verification. An adaptive watermark is generated from each cover model by using the Hamming code technique. A simple least significant bit (LSB) substitution technique is employed for watermark embedding. In the extraction stage, the Hamming code based watermark can be verified by using Hamming code checking without embedding any verification information. Experimental results show that 100% of the vertices of the cover model can be watermarked, extracted, and verified. They also show that the proposed method can improve security and achieve low distortion of the stego object.
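
    The verification idea can be sketched with the classic Hamming(7,4) code: three parity bits computed from four data bits are embedded in the LSBs alongside them, so verification only has to recompute the syndrome, with no side information. The mapping from 3D vertex coordinates to LSB groups and the adaptive watermark generation are omitted here.

    # Minimal sketch of Hamming(7,4)-based fragile-watermark verification:
    # zero syndrome means the 7 embedded LSBs are consistent (unmodified);
    # a non-zero syndrome flags (and locates) a tampered bit.

    def hamming74_encode(d):
        """d = [d1, d2, d3, d4] -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4
        p2 = d1 ^ d3 ^ d4
        p3 = d2 ^ d3 ^ d4
        return [p1, p2, d1, p3, d2, d3, d4]


    def hamming74_syndrome(c):
        """Recompute the three parity checks; the value locates a single flip."""
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        return (s3 << 2) | (s2 << 1) | s1


    # Toy usage: embed, then verify an intact and a tampered LSB group.
    group = hamming74_encode([1, 0, 1, 1])
    print("intact   -> syndrome", hamming74_syndrome(group))         # 0
    tampered = group[:]
    tampered[4] ^= 1                                                 # flip one LSB
    print("tampered -> syndrome", hamming74_syndrome(tampered))      # non-zero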

  10. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-01-01

    A method of accounting for fluid-to-fluid shear in between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)

  11. Recent advances in multiview distributed video coding

    Science.gov (United States)

    Dufaux, Frederic; Ouaret, Mourad; Ebrahimi, Touradj

    2007-04-01

    We consider dense networks of surveillance cameras capturing overlapped images of the same scene from different viewing directions, such a scenario being referred to as multi-view. Data compression is paramount in such a system due to the large amount of captured data. In this paper, we propose a Multi-view Distributed Video Coding approach. It allows for low complexity / low power consumption at the encoder side, and the exploitation of inter-view correlation without communications among the cameras. We introduce a combination of temporal intra-view side information and homography inter-view side information. Simulation results show both the improvement of the side information, as well as a significant gain in terms of coding efficiency.

  12. Physician involvement enhances coding accuracy to ensure national standards: an initiative to improve awareness among new junior trainees.

    Science.gov (United States)

    Nallasivan, S; Gillott, T; Kamath, S; Blow, L; Goddard, V

    2011-06-01

    Record Keeping Standards is a development led by the Royal College of Physicians of London (RCP) Health Informatics Unit and funded by the National Health Service (NHS) Connecting for Health. A supplementary report produced by the RCP makes a number of recommendations based on a study held at an acute hospital trust. We audited the medical notes and coding to assess the accuracy, documentation by the junior doctors and also to correlate our findings with the RCP audit. Northern Lincolnshire & Goole Hospitals NHS Foundation Trust has 114,000 'finished consultant episodes' per year. A total of 100 consecutive medical (50) and rheumatology (50) discharges from Diana Princess of Wales Hospital from August-October 2009 were reviewed. The results showed an improvement in coding accuracy (10% errors), comparable to the RCP audit but with 5% documentation errors. Physician involvement needs enhancing to improve the effectiveness and to ensure clinical safety.

  13. Research on the improvement of nuclear safety -Development of computing code system for level 3 PSA

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Jong Tae; Kim, Dong Ha; Park, Won Seok; Hwang, Mi Jeong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    Among the various research areas of level 3 PSA, the effect of terrain on the transport of radioactive material was investigated. These results will give physical insight for the development of a new dispersion model. A wind tunnel experiment with a bell-shaped hill model was made in order to develop a new dispersion model, and an improved dispersion model was developed based on the concentration distribution data obtained from the wind tunnel experiment. This model will be added as an option to the atmospheric dispersion code. A stand-alone atmospheric code using the MS Visual Basic programming language, which runs in the Windows environment of a PC, was developed. A user can easily select a necessary data file and type input data by clicking menus, and can select calculation options such as building wake, plume rise, etc., if necessary. A user can also easily understand the meaning of the concentration distribution on the map around the plant site as well as the output files. The methodologies for the estimation of radiation exposure and for the calculation of risks were also established. These methodologies will be used for the development of modules for radiation exposure and risks, respectively. These modules will be developed independently and finally combined with the atmospheric dispersion code in order to develop a level 3 PSA code. 30 tabs., 56 figs., refs. (Author).

  14. Research on the improvement of nuclear safety -Development of computing code system for level 3 PSA

    International Nuclear Information System (INIS)

    Jeong, Jong Tae; Kim, Dong Ha; Park, Won Seok; Hwang, Mi Jeong

    1995-07-01

    Among the various research areas of level 3 PSA, the effect of terrain on the transport of radioactive material was investigated. These results will give physical insight for the development of a new dispersion model. A wind tunnel experiment with a bell-shaped hill model was made in order to develop a new dispersion model, and an improved dispersion model was developed based on the concentration distribution data obtained from the wind tunnel experiment. This model will be added as an option to the atmospheric dispersion code. A stand-alone atmospheric code using the MS Visual Basic programming language, which runs in the Windows environment of a PC, was developed. A user can easily select a necessary data file and type input data by clicking menus, and can select calculation options such as building wake, plume rise, etc., if necessary. A user can also easily understand the meaning of the concentration distribution on the map around the plant site as well as the output files. The methodologies for the estimation of radiation exposure and for the calculation of risks were also established. These methodologies will be used for the development of modules for radiation exposure and risks, respectively. These modules will be developed independently and finally combined with the atmospheric dispersion code in order to develop a level 3 PSA code. 30 tabs., 56 figs., refs. (Author)

  15. Efficient Work Team Scheduling: Using Psychological Models of Knowledge Retention to Improve Code Writing Efficiency

    Directory of Open Access Journals (Sweden)

    Michael J. Pelosi

    2014-12-01

    Full Text Available Development teams and programmers must retain critical information about their work during work intervals and gaps in order to improve future performance when work resumes. Despite time lapses, project managers want to maximize coding efficiency and effectiveness. By developing a mathematically justified, practically useful, and computationally tractable quantitative and cognitive model of learning and memory retention, this study establishes calculations designed to maximize scheduling payoff and optimize developer efficiency and effectiveness.
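
    The record does not give the model's equations, so the sketch below simply assumes an Ebbinghaus-style exponential forgetting curve R(t) = exp(-t/S): it estimates how much project context a developer retains after a work gap and the longest gap that keeps retention above a chosen threshold. The stability parameter and the threshold are illustrative assumptions, not values from the paper.

    # Sketch under the assumption of an exponential forgetting curve
    # R(t) = exp(-t / S): estimate retained project context after a work gap
    # and the longest gap that keeps retention above a target threshold.
    import math


    def retention(gap_days: float, stability_days: float = 14.0) -> float:
        """Fraction of context retained after a gap, exponential decay."""
        return math.exp(-gap_days / stability_days)


    def max_gap(threshold: float = 0.6, stability_days: float = 14.0) -> float:
        """Longest gap (days) for which retention stays above the threshold."""
        return -stability_days * math.log(threshold)


    for gap in (2, 7, 21):
        print(f"gap {gap:2d} days -> retention {retention(gap):.2f}")
    print(f"schedule follow-up work within {max_gap():.1f} days "
          f"to keep retention above 60%")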

  16. The missing evaluation codes from order domain theory

    DEFF Research Database (Denmark)

    Andersen, Henning Ejnar; Geil, Olav

    The Feng-Rao bound gives a lower bound on the minimum distance of codes defined by means of their parity check matrices. From the Feng-Rao bound it is clear how to improve a large family of codes by leaving out certain rows in their parity check matrices. In this paper we derive a simple lower bound on the generalized Hamming weight. We interpret our methods in the setting of order domain theory. In this way we fill in an obvious gap in the theory of order domains. The improved codes from the present paper are not in general equal to the Feng-Rao improved codes, but the constructions are very much related.

  17. Improvement of Monte Carlo code A3MCNP for large-scale shielding problems

    International Nuclear Information System (INIS)

    Miyake, Y.; Ohmura, M.; Hasegawa, T.; Ueki, K.; Sato, O.; Haghighat, A.; Sjoden, G.E.

    2004-01-01

    A3MCNP (Automatic Adjoint Accelerated MCNP) is a revised version of the MCNP Monte Carlo code that automatically prepares variance reduction parameters for the CADIS (Consistent Adjoint Driven Importance Sampling) methodology. Using a deterministic 'importance' (or adjoint) function, CADIS performs source and transport biasing within the weight-window technique. The current version of A3MCNP uses the 3-D Sn transport TORT code to determine a 3-D importance function distribution. Based on simulation of several real-life problems, it is demonstrated that A3MCNP provides precise calculation results with a remarkably short computation time by using proper and objective variance reduction parameters. However, since the first version of A3MCNP provided only a point source configuration option for large-scale shielding problems, such as spent-fuel transport casks, a large amount of memory may be necessary to store enough points to properly represent the source. Hence, we have developed an improved version of A3MCNP (referred to as A3MCNPV) which has a volumetric source configuration option. This paper describes the successful use of A3MCNPV for a concrete cask streaming problem and a PWR dosimetry problem. (author)

  18. Inhaler Reminders Significantly Improve Asthma Patients' Use of Controller Medications

    Science.gov (United States)

    Published online: July 22. Controller medications reduce the burden and risk of asthma, but many patients do not use them regularly. This poor adherence ...

  19. Protograph-Based Raptor-Like Codes

    Science.gov (United States)

    Divsalar, Dariush; Chen, Tsung-Yi; Wang, Jiadong; Wesel, Richard D.

    2014-01-01

    Theoretical analysis has long indicated that feedback improves the error exponent but not the capacity of point-to-point memoryless channels. The analytic and empirical results indicate that in the short blocklength regime, practical rate-compatible punctured convolutional (RCPC) codes achieve low latency with the use of noiseless feedback. In 3GPP, standard rate-compatible punctured turbo (RCPT) codes did not outperform the convolutional codes in the short blocklength regime. The reason is that convolutional codes with a small number of states can be decoded optimally using the Viterbi decoder. Despite the excellent performance of convolutional codes at very short blocklengths, the strength of convolutional codes does not scale with the blocklength for a fixed number of states in the trellis.

  20. An improved steam generator model for the SASSYS code

    International Nuclear Information System (INIS)

    Pizzica, P.A.

    1989-01-01

    A new steam generator model has been developed for the SASSYS computer code, which analyzes accident conditions in a liquid metal cooled fast reactor. It has been incorporated into the new SASSYS balance-of-plant model but it can also function on a stand-alone basis. The steam generator can be used in a once-through mode, or a variant of the model can be used as a separate evaporator and a superheater with recirculation loop. The new model provides for an exact steady-state solution as well as the transient calculation. There was a need for a faster and more flexible model than the old steam generator model. The new model provides for more detail with its multi-node treatment as opposed to the previous model's one node per region approach. Numerical instability problems which were the result of cell-centered spatial differencing, fully explicit time differencing, and the moving boundary treatment of the boiling crisis point in the boiling region have been reduced. This leads to an increase in speed as larger time steps can now be taken. The new model is an improvement in many respects. 2 refs., 3 figs

  1. Cross-band noise model refinement for transform domain Wyner–Ziv video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2012-01-01

    Transform domain Wyner–Ziv (TDWZ) video coding trails conventional video coding solutions, mainly due to the quality of the side information, inaccurate noise modeling and loss in the final coding step. The major goal of this paper is to enhance the accuracy of the noise modeling, which is one of the most important aspects influencing the coding performance of DVC. A TDWZ video decoder with a novel cross-band based adaptive noise model is proposed, and a noise residue refinement scheme is introduced to successively update the estimated noise residue for noise modeling after each bit-plane. Experimental results show that the proposed noise model and noise residue refinement scheme can improve the rate-distortion (RD) performance of TDWZ video coding significantly. The quality of the side information modeling is also evaluated by a measure of the ideal code length.

  2. The application of LDPC code in MIMO-OFDM system

    Science.gov (United States)

    Liu, Ruian; Zeng, Beibei; Chen, Tingting; Liu, Nan; Yin, Ninghao

    2018-03-01

    The combination of MIMO and OFDM technology has become one of the key technologies of fourth generation mobile communication, which can overcome the frequency selective fading of the wireless channel, increase the system capacity and improve the frequency utilization. Error correcting coding introduced into the system can further improve its performance. The LDPC (low density parity check) code is a kind of error correcting code which can improve system reliability and anti-interference ability, and its decoding is simple and easy to operate. This paper mainly discusses the application of LDPC codes in the MIMO-OFDM system.

  3. Coupling methods for parallel running RELAPSim codes in nuclear power plant simulation

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yankai; Lin, Meng, E-mail: linmeng@sjtu.edu.cn; Yang, Yanhua

    2016-02-15

    When the plant is modeled in detail for high precision, it is hard to achieve real-time calculation with a single RELAP5 in a large-scale simulation. To improve the speed and ensure the precision of the simulation at the same time, coupling methods for parallel running RELAPSim codes were proposed in this study. An explicit coupling method via coupling boundaries was realized based on a data-exchange and procedure-control environment. The synchronization frequency was chosen as a compromise to improve the precision of the simulation while guaranteeing real-time simulation. The coupling methods were assessed using both single-phase and two-phase flow models, and good agreement was obtained between the splitting–coupling models and the integrated model. The mitigation of SGTR was performed as an integral application of the coupling models. A large-scope NPP simulator was developed adopting six splitting–coupling models of RELAPSim and other simulation codes. The coupling models could improve the speed of simulation significantly and make real-time calculation possible. In this paper, the coupling of the models in the engineering simulator is taken as an example to expound the coupling methods, i.e., coupling between parallel running RELAPSim codes, and coupling between the RELAPSim code and other types of simulation codes. However, the coupling methods are also applicable to other simulators, for example, a simulator employing ATHLETE instead of RELAP5, or other logic code instead of SIMULINK. It is believed the coupling method is generally applicable to NPP simulators regardless of the specific codes chosen in this paper.

  4. Some new quasi-twisted ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2015-09-01

    Full Text Available Let an [n, k, d]_q code be a linear code of length n, dimension k and minimum Hamming distance d over GF(q). One of the basic and most important problems in coding theory is to construct codes with the best possible minimum distances. In this paper seven quasi-twisted ternary linear codes are constructed. These codes are new and improve the best known lower bounds on the minimum distance in [6].

  5. Current status of the transient integral fuel element performance code URANUS

    International Nuclear Information System (INIS)

    Preusser, T.; Lassmann, K.

    1983-01-01

    To investigate the behavior of fuel pins during normal and off-normal operation, the integral fuel rod code URANUS has been extended to include a transient version. The paper describes the current status of the program system including a presentation of newly developed models for hypothetical accident investigation. The main objective of current development work is to improve the modelling of fuel and clad material behavior during fast transients. URANUS allows detailed analysis of experiments until the onset of strong material transport phenomena. Transient fission gas analysis is carried out due to the coupling with a special version of the LANGZEIT-KURZZEIT-code (KfK). Fuel restructuring and grain growth kinetics models have been improved recently to better characterize pre-experimental steady-state operation; transient models are under development. Extensive verification of the new version has been carried out by comparison with analytical solutions, experimental evidence, and code-to-code evaluation studies. URANUS, with all these improvements, has been successfully applied to difficult fast breeder fuel rod analysis including TOP, LOF, TUCOP, local coolant blockage and specific carbide fuel experiments. Objective of further studies is the description of transient PCMI. It is expected that the results of these developments will contribute significantly to the understanding of fuel element structural behavior during severe transients. (orig.)

  6. Improvement of Severe Accident Analysis Computer Code and Development of Accident Management Guidance for Heavy Water Reactor

    International Nuclear Information System (INIS)

    Park, Soo Yong; Kim, Ko Ryu; Kim, Dong Ha; Kim, See Darl; Song, Yong Mann; Choi, Young; Jin, Young Ho

    2005-03-01

    The objective of the project is to develop a generic severe accident management guidance (SAMG) applicable to Korean PHWRs, and the objective of this 3-year continued phase is to construct the basis of the generic SAMG. Another objective is to improve a domestic computer code, ISAAC (Integrated Severe Accident Analysis code for CANDU), which still has many deficiencies that must be resolved before it can be applied to SAMG development. The scope and contents performed in this Phase-2 are as follows: The characteristics of major design and operation for the domestic Wolsong NPP are analyzed from the severe accident perspective. On this basis, preliminary strategies for SAM of PHWRs are selected. The information needed for SAM and the methods to obtain that information are analyzed. Both the individual strategies applicable for accident mitigation under PHWR severe accident conditions and the technical background for those strategies are developed. A new version, ISAAC 2.0, has been developed after analyzing and modifying the existing models of ISAAC 1.0. The general SAMG applicable to PHWRs confirms severe accident management techniques for emergencies, provides the base technique for utility companies to develop plant-specific SAMGs, and finally contributes to public safety enhancement as an NPP safety assurance step. The ISAAC code will inevitably be used for PSA, living PSA, severe accident analysis, SAM program development and operator training in PHWRs.

  7. Improvement of Severe Accident Analysis Computer Code and Development of Accident Management Guidance for Heavy Water Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Park, Soo Yong; Kim, Ko Ryu; Kim, Dong Ha; Kim, See Darl; Song, Yong Mann; Choi, Young; Jin, Young Ho

    2005-03-15

    The objective of the project is to develop a generic severe accident management guidance (SAMG) applicable to Korean PHWRs, and the objective of this 3-year continued phase is to construct the basis of the generic SAMG. Another objective is to improve a domestic computer code, ISAAC (Integrated Severe Accident Analysis code for CANDU), which still has many deficiencies that must be resolved before it can be applied to SAMG development. The scope and contents performed in this Phase-2 are as follows: The characteristics of major design and operation for the domestic Wolsong NPP are analyzed from the severe accident perspective. On this basis, preliminary strategies for SAM of PHWRs are selected. The information needed for SAM and the methods to obtain that information are analyzed. Both the individual strategies applicable for accident mitigation under PHWR severe accident conditions and the technical background for those strategies are developed. A new version, ISAAC 2.0, has been developed after analyzing and modifying the existing models of ISAAC 1.0. The general SAMG applicable to PHWRs confirms severe accident management techniques for emergencies, provides the base technique for utility companies to develop plant-specific SAMGs, and finally contributes to public safety enhancement as an NPP safety assurance step. The ISAAC code will inevitably be used for PSA, living PSA, severe accident analysis, SAM program development and operator training in PHWRs.

  8. Recent developments and improvements in the code system for the neutronic design of fast breeder reactors at CEA

    International Nuclear Information System (INIS)

    Giacometti; Estiot, J.C.; Palmiotti, G.; Grondein, C.; Le Cardinal, G.; Ravier, M.

    1982-09-01

    The new modular system has been developed using the ALOS software tools, which provide effectiveness and flexibility. The second step is the application of new algorithms to improve the solution of the diffusion equation. The current status of the realisation of the new code system is briefly described

  9. Performance of asynchronous fiber-optic code division multiple access system based on three-dimensional wavelength/time/space codes and its link analysis.

    Science.gov (United States)

    Singh, Jaswinder

    2010-03-10

    A novel family of three-dimensional (3-D) wavelength/time/space codes for asynchronous optical code-division-multiple-access (CDMA) systems with "zero" off-peak autocorrelation and "unity" cross correlation is reported. Antipodal signaling and differential detection are employed in the system. A maximum of [(W x T+1) x W] codes are generated for unity cross correlation, where W and T are the number of wavelengths and time chips used in the code and are prime. The conditions for violation of the cross-correlation constraint are discussed. The expressions for the number of generated codes are determined for various code dimensions. It is found that the maximum number of codes is generated for S systems. The codes have a code-set-size to code-size ratio greater than W/S. For instance, with a code size of 2065 (59 x 7 x 5), a total of 12,213 users can be supported, and 130 simultaneous users at a bit-error rate (BER) of 10^(-9). An arrayed-waveguide-grating-based reconfigurable encoder/decoder design for 2-D implementation of the 3-D codes is presented so that the need for multiple star couplers and fiber ribbons is eliminated. The hardware requirements of the coders used for various modulation/detection schemes are given. The effect of insertion loss in the coders is shown to be significantly reduced with loss compensation by using an amplifier after encoding. An optical CDMA system for four users is simulated and the results presented show the improvement in performance with the use of loss compensation.
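
    The quoted family size, at most (W x T + 1) x W codes for prime W and T, is easy to tabulate. The snippet below only evaluates that expression for a few prime pairs; it does not construct the codes, and the abstract's 12,213-user figure additionally involves the space dimension S and the detection scheme, which are not modeled here.

    ```python
    def max_codes(W, T):
        """Upper bound (W*T + 1) * W on the code-set size quoted for prime W and T."""
        return (W * T + 1) * W

    # Evaluate the bound for a few prime (W, T) pairs.
    for W, T in [(7, 5), (11, 7), (59, 7)]:
        print(f"W={W:2d}, T={T}: at most {max_codes(W, T)} codes")
    ```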

  10. Calculations of reactor-accident consequences, Version 2. CRAC2: computer code user's guide

    International Nuclear Information System (INIS)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    1983-02-01

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems

  11. Software coding for reliable data communication in a reactor safety system

    International Nuclear Information System (INIS)

    Maghsoodi, R.

    1978-01-01

    A software coding method is proposed to improve the communication reliability of a microprocessor-based fast-reactor safety system. This method, which replaces the conventional coding circuitry, applies a program to code the data communicated between the processors via their data memories. The system requirements are studied and suitable codes are suggested. The problems associated with hardware coders and the advantages of software coding methods are discussed. The product code, which achieves a faster coding time than the cyclic code, is chosen as the final code. The improvement of the communication reliability is then derived for a processor and its data memory. The result is used to calculate the reliability improvement of the processing channel as the basic unit for the safety system. (author)
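
    The abstract does not reproduce the chosen product code, so the sketch below shows the generic product-code idea with row and column parity over a binary data block, which is one simple way a software coder can locate and correct a single-bit error. The block size and layout are assumptions for illustration.

    ```python
    import numpy as np

    def encode_product(data_block):
        """Append row parity and column parity to a binary data block (simple product code)."""
        row_parity = data_block.sum(axis=1) % 2
        col_parity = data_block.sum(axis=0) % 2
        return data_block, row_parity, col_parity

    def correct_single_error(block, row_parity, col_parity):
        """Locate and fix a single bit error from the failing row/column parity checks."""
        bad_rows = np.where(block.sum(axis=1) % 2 != row_parity)[0]
        bad_cols = np.where(block.sum(axis=0) % 2 != col_parity)[0]
        if len(bad_rows) == 1 and len(bad_cols) == 1:
            block[bad_rows[0], bad_cols[0]] ^= 1   # flip the erroneous bit
        return block

    data = np.array([[1, 0, 1, 1],
                     [0, 1, 1, 0],
                     [1, 1, 0, 0]])
    block, rp, cp = encode_product(data.copy())
    block[1, 2] ^= 1                               # simulate a single-bit communication error
    print(correct_single_error(block, rp, cp))     # matches the original data block
    ```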

  12. The Art of Readable Code

    CERN Document Server

    Boswell, Dustin

    2011-01-01

    As programmers, we've all seen source code that's so ugly and buggy it makes our brain ache. Over the past five years, authors Dustin Boswell and Trevor Foucher have analyzed hundreds of examples of "bad code" (much of it their own) to determine why they're bad and how they could be improved. Their conclusion? You need to write code that minimizes the time it would take someone else to understand it-even if that someone else is you. This book focuses on basic principles and practical techniques you can apply every time you write code. Using easy-to-digest code examples from different languag

  13. An Improved Real-Coded Population-Based Extremal Optimization Method for Continuous Unconstrained Optimization Problems

    Directory of Open Access Journals (Sweden)

    Guo-Qiang Zeng

    2014-01-01

    Full Text Available As a novel evolutionary optimization method, extremal optimization (EO) has been successfully applied to a variety of combinatorial optimization problems. However, the applications of EO in continuous optimization problems are relatively rare. This paper proposes an improved real-coded population-based EO method (IRPEO) for continuous unconstrained optimization problems. The key operations of IRPEO include generation of real-coded random initial population, evaluation of individual and population fitness, selection of bad elements according to power-law probability distribution, generation of new population based on uniform random mutation, and updating the population by accepting the new population unconditionally. The experimental results on 10 benchmark test functions with the dimension N=30 have shown that IRPEO is competitive or even better than the recently reported various genetic algorithm (GA) versions with different mutation operations in terms of simplicity, effectiveness, and efficiency. Furthermore, the superiority of IRPEO to other evolutionary algorithms such as original population-based EO, particle swarm optimization (PSO), and the hybrid PSO-EO is also demonstrated by the experimental results on some benchmark functions.
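
    The key operations listed above (real-coded random initial population, fitness evaluation, power-law selection of bad elements, uniform random mutation, unconditional acceptance of the new population) map naturally onto a short loop. The sketch below is a simplified, hypothetical reading of that recipe on the sphere benchmark; it is not the authors' IRPEO implementation and the parameter values are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sphere(x):
        """Benchmark objective: sum of squares (smaller is better)."""
        return np.sum(x**2)

    def simplified_population_eo(dim=10, pop_size=20, iters=2000, tau=1.5, bounds=(-5.0, 5.0)):
        lo, hi = bounds
        pop = rng.uniform(lo, hi, size=(pop_size, dim))     # real-coded random initial population
        best = min(pop, key=sphere).copy()
        for _ in range(iters):
            new_pop = pop.copy()
            for i in range(pop_size):
                # Rank the components of individual i by how much they contribute to the cost.
                ranks = np.argsort(-pop[i]**2)              # worst ("bad") components first
                # Power-law-biased choice of a bad component, in the spirit of extremal optimization.
                k = int(dim * rng.random()**tau)
                bad_index = ranks[min(k, dim - 1)]
                # Uniform random mutation of the selected bad component.
                new_pop[i, bad_index] = rng.uniform(lo, hi)
            pop = new_pop                                   # accept the new population unconditionally
            candidate = min(pop, key=sphere)
            if sphere(candidate) < sphere(best):
                best = candidate.copy()
        return best, sphere(best)

    best, value = simplified_population_eo()
    print(f"best objective value found: {value:.4f}")
    ```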

  14. Feasibility study for improved steady-state initialization algorithms for the RELAP5 computer code

    International Nuclear Information System (INIS)

    Paulsen, M.P.; Peterson, C.E.; Katsma, K.R.

    1993-04-01

    A design for a new steady-state initialization method is presented that represents an improvement over the current method used in RELAP5. Current initialization methods for RELAP5 solve the transient fluidflow balance equations simulating a transient to achieve steady-state conditions. Because the transient solution is used, the initial conditions may change from the desired values requiring the use of controllers and long transient running times to obtain steady-state conditions for system problems. The new initialization method allows the user to fix thermal-hydraulic values in volumes and junctions where the conditions are best known and have the code compute the initial conditions in other areas of the system. The steady-state balance equations and solution methods are presented. The constitutive, component, and specialpurpose models are reviewed with respect to modifications required for the new steady-state initialization method. The requirements for user input are defined and the feasibility of the method is demonstrated with a testbed code by initializing some simple channel problems. The initialization of the sample problems using, the old and the new methods are compared

  15. Ultrasound strain imaging using Barker code

    Science.gov (United States)

    Peng, Hui; Tie, Juhong; Guo, Dequan

    2017-01-01

    Ultrasound strain imaging is showing promise as a new way of imaging soft tissue elasticity in order to help clinicians detect lesions or cancers in tissues. In this paper, the Barker code is applied to strain imaging to improve its quality. The Barker code, as a coded excitation signal, can be used to improve the echo signal-to-noise ratio (eSNR) in an ultrasound imaging system. For the Barker code of length 13, the sidelobe level of the matched filter output is -22 dB, which is unacceptable for ultrasound strain imaging, because a high sidelobe level will cause high decorrelation noise. Instead of using the conventional matched filter, we use the Wiener filter to decode the Barker-coded echo signal to suppress the range sidelobes. We also compare the performance of the Barker code and the conventional short pulse by simulation. The simulation results demonstrate that the performance of the Wiener filter is much better than that of the matched filter, and the Barker code achieves a higher elastographic signal-to-noise ratio (SNRe) than the short pulse in low eSNR or great depth conditions due to the increased eSNR it provides.
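
    The -22 dB figure quoted for the length-13 Barker code follows directly from its autocorrelation: peak 13, maximum sidelobe magnitude 1, and 20*log10(1/13) is about -22.3 dB. The short check below verifies this with a matched filter (correlation); it does not reproduce the paper's Wiener-filter decoding.

    ```python
    import numpy as np

    barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

    # Matched filtering is correlation with the code itself.
    matched_output = np.correlate(barker13, barker13, mode="full")

    peak = matched_output.max()                                                     # 13
    sidelobe = np.abs(np.delete(matched_output, np.argmax(matched_output))).max()   # 1
    print(f"peak sidelobe level: {20 * np.log10(sidelobe / peak):.1f} dB")          # about -22.3 dB
    ```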

  16. Modification and validation of the natural heat convection and subcooled void formation models in the code PARET

    International Nuclear Information System (INIS)

    Hainoun, A.; Alhabit, F.; Ghazi, N.

    2008-01-01

    Two new modifications have been included in the current PARET code that is widely applied in the dynamic and safety analysis of research reactors. A new model was implemented for the simulation of void formation in the subcooled boiling regime, the other modification dealt with the implementation of a new approach to improve the prediction of heat transfer coefficient under natural circulation condition. The modified code was successfully validated using adequate single effect tests covering the physical phenomena of interest for both natural circulation and subcooled void formation at low pressure and low heat flux. The validation results indicate significant improvement of the code compared to the default version. Additionally, to simplify the code application an interactive user interface was developed enabling pre and post-processing of the code predictions. (author)

  17. Supervised Transfer Sparse Coding

    KAUST Repository

    Al-Shedivat, Maruan

    2014-07-27

    A combination of the sparse coding and transfer learning techniques was shown to be accurate and robust in classification tasks where training and testing objects have a shared feature space but are sampled from different underlying distributions, i.e., belong to different domains. The key assumption in such case is that in spite of the domain disparity, samples from different domains share some common hidden factors. Previous methods often assumed that all the objects in the target domain are unlabeled, and thus the training set solely comprised objects from the source domain. However, in real world applications, the target domain often has some labeled objects, or one can always manually label a small number of them. In this paper, we explore such possibility and show how a small number of labeled data in the target domain can significantly leverage classification accuracy of the state-of-the-art transfer sparse coding methods. We further propose a unified framework named supervised transfer sparse coding (STSC) which simultaneously optimizes sparse representation, domain transfer and classification. Experimental results on three applications demonstrate that a little manual labeling and then learning the model in a supervised fashion can significantly improve classification accuracy.

  18. System code improvements for modelling passive safety systems and their validation

    Energy Technology Data Exchange (ETDEWEB)

    Buchholz, Sebastian; Cron, Daniel von der; Schaffrath, Andreas [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany)

    2016-11-15

    GRS has been developing the system code ATHLET over many years. Because ATHLET, among other codes, is widely used in nuclear licensing and supervisory procedures, it has to represent the current state of science and technology. New reactor concepts such as Generation III+ and IV reactors and SMR are using passive safety systems intensively. The simulation of passive safety systems with the GRS system code ATHLET is still a big challenge, because of non-defined operation points and self-setting operation conditions. Additionally, the driving forces of passive safety systems are smaller and uncertainties of parameters have a larger impact than for active systems. This paper addresses the code validation and qualification work of ATHLET on the example of slightly inclined horizontal heat exchangers, which are e. g. used as emergency condensers (e. g. in the KERENA and the CAREM) or as heat exchanger in the passive auxiliary feed water systems (PAFS) of the APR+.

  19. Leadership Class Configuration Interaction Code - Status and Opportunities

    Science.gov (United States)

    Vary, James

    2011-10-01

    With support from SciDAC-UNEDF (www.unedf.org) nuclear theorists have developed and are continuously improving a Leadership Class Configuration Interaction Code (LCCI) for forefront nuclear structure calculations. The aim of this project is to make state-of-the-art nuclear structure tools available to the entire community of researchers including graduate students. The project includes codes such as NuShellX, MFDn and BIGSTICK that run on a range of computers from laptops to leadership class supercomputers. Codes, scripts, test cases and documentation have been assembled, are under continuous development and are scheduled for release to the entire research community in November 2011. A covering script that accesses the appropriate code and supporting files is under development. In addition, a Data Base Management System (DBMS) that records key information from large production runs and archived results of those runs has been developed (http://nuclear.physics.iastate.edu/info/) and will be released. Following an outline of the project, the code structure, capabilities, the DBMS and current efforts, I will suggest a path forward that would benefit greatly from a significant partnership between researchers who use the codes, code developers and the National Nuclear Data efforts. This research is supported in part by DOE under grant DE-FG02-87ER40371 and grant DE-FC02-09ER41582 (SciDAC-UNEDF).

  20. Network coding at different layers in wireless networks

    CERN Document Server

    2016-01-01

    This book focuses on how to apply network coding at different layers in wireless networks – including MAC, routing, and TCP – with special focus on cognitive radio networks. It discusses how to select parameters in network coding (e.g., coding field, number of packets involved, and redundant information ratio) in order to be suitable for the varying wireless environments. The book explores how to deploy network coding in MAC to improve network performance and examines joint network coding with opportunistic routing to improve the success rate of routing. With regard to TCP and network coding, the text considers transport-layer protocols working with network coding to overcome the transmission error rate, particularly with how to use the ACK feedback of TCP to enhance the efficiency of network coding. The book pertains to researchers and postgraduate students, especially those whose interests are in opportunistic routing and TCP in cognitive radio networks.

  1. Coding For Compression Of Low-Entropy Data

    Science.gov (United States)

    Yeh, Pen-Shu

    1994-01-01

    Improved method of encoding digital data provides for efficient lossless compression of partially or even mostly redundant data from low-information-content source. Method of coding implemented in relatively simple, high-speed arithmetic and logic circuits. Also increases coding efficiency beyond that of established Huffman coding method in that average number of bits per code symbol can be less than 1, which is the lower bound for Huffman code.
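
    One way to see why the 1 bit/symbol floor of symbol-by-symbol Huffman coding matters for low-entropy sources is to compare it with the source entropy, which for a heavily biased binary source is far below 1 bit. The sketch below only computes that comparison; it does not implement the coding method described in the record.

    ```python
    import math

    def binary_entropy(p):
        """Entropy in bits/symbol of a binary source emitting '1' with probability p."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    for p in (0.5, 0.1, 0.01):
        h = binary_entropy(p)
        print(f"P(1)={p:<5}: entropy = {h:.3f} bits/symbol, "
              f"symbol-wise Huffman needs >= 1 bit/symbol")
    ```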

  2. Rotated Walsh-Hadamard Spreading with Robust Channel Estimation for a Coded MC-CDMA System

    Directory of Open Access Journals (Sweden)

    Raulefs Ronald

    2004-01-01

    Full Text Available We investigate rotated Walsh-Hadamard spreading matrices for a broadband MC-CDMA system with robust channel estimation in the synchronous downlink. The similarities between rotated spreading and signal space diversity are outlined. In a multiuser MC-CDMA system, possible performance improvements are based on the chosen detector, the channel code, and its Hamming distance. By applying rotated spreading in comparison to a standard Walsh-Hadamard spreading code, a higher throughput can be achieved. As combining the channel code and the spreading code forms a concatenated code, the overall minimum Hamming distance of the concatenated code increases. This asymptotically results in an improvement of the bit error rate for high signal-to-noise ratio. Higher convolutional channel code rates are mostly generated by puncturing good low-rate channel codes. The overall Hamming distance decreases significantly for the punctured channel codes. Higher channel code rates are favorable for MC-CDMA, as MC-CDMA utilizes diversity more efficiently compared to pure OFDMA. The application of rotated spreading in an MC-CDMA system allows exploiting diversity even further. We demonstrate that the rotated spreading gain is still present for a robust pilot-aided channel estimator. In a well-designed system, rotated spreading extends the performance by using a maximum likelihood detector with robust channel estimation at the receiver by about 1 dB.
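
    As a sketch of what rotated spreading means in general, the code below builds a Sylvester Walsh-Hadamard matrix, applies a unitary per-chip phase rotation, and checks that orthogonality between spreading sequences is preserved. The rotation angles are arbitrary assumptions; the specific rotation studied in the paper is not reproduced.

    ```python
    import numpy as np

    def walsh_hadamard(n):
        """Sylvester construction of an n x n Walsh-Hadamard matrix (n a power of two)."""
        H = np.array([[1.0]])
        while H.shape[0] < n:
            H = np.block([[H, H], [H, -H]])
        return H

    N = 8
    W = walsh_hadamard(N) / np.sqrt(N)            # orthonormal spreading matrix

    # Illustrative rotation: a diagonal unitary matrix of per-chip phase rotations.
    theta = 2 * np.pi * np.arange(N) / (4 * N)    # arbitrary angles, not from the paper
    R = np.diag(np.exp(1j * theta))
    W_rot = R @ W                                 # rotated spreading matrix

    # Orthogonality (and hence user separability) is preserved under the unitary rotation.
    print(np.allclose(W_rot.conj().T @ W_rot, np.eye(N)))   # True
    ```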

  3. Two-Point Codes for the Generalised GK curve

    DEFF Research Database (Denmark)

    Barelli, Élise; Beelen, Peter; Datta, Mrinmoy

    2017-01-01

    We improve previously known lower bounds for the minimum distance of certain two-point AG codes constructed using a Generalized Giulietti–Korchmaros curve (GGK). Castellanos and Tizziotti recently described such bounds for two-point codes coming from the Giulietti–Korchmaros curve (GK). Our results completely cover and in many cases improve on their results, using different techniques, while also supporting any GGK curve. Our method builds on the order bound for AG codes: to enable this, we study certain Weierstrass semigroups. This allows an efficient algorithm for computing our improved bounds.

  4. Coding in Stroke and Other Cerebrovascular Diseases.

    Science.gov (United States)

    Korb, Pearce J; Jones, William

    2017-02-01

    Accurate coding is critical for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of coding principles for patients with strokes and other cerebrovascular diseases and includes an illustrative case as a review of coding principles in a patient with acute stroke.

  5. Gene-Auto: Automatic Software Code Generation for Real-Time Embedded Systems

    Science.gov (United States)

    Rugina, A.-E.; Thomas, D.; Olive, X.; Veran, G.

    2008-08-01

    This paper gives an overview of the Gene-Auto ITEA European project, which aims at building a qualified C code generator from mathematical models under Matlab-Simulink and Scilab-Scicos. The project is driven by major European industry partners active in the real-time embedded systems domain. The Gene-Auto code generator will significantly improve the current development processes in such domains by shortening the time to market and by guaranteeing the quality of the generated code through the use of formal methods. The first version of the Gene-Auto code generator has already been released and has gone through a validation phase on real-life case studies defined by each project partner. The validation results are taken into account in the implementation of the second version of the code generator. The partners aim at introducing the Gene-Auto results into industrial development by 2010.

  6. Improved numerical grid generation techniques for the B2 edge plasma code

    International Nuclear Information System (INIS)

    Stotler, D.P.; Coster, D.P.

    1992-06-01

    Techniques used to generate grids for edge fluid codes such as B2 from numerically computed equilibria are discussed. Fully orthogonal, numerically derived grids closely resembling analytically prescribed meshes can be obtained. But, the details of the poloidal field can vary, yielding significantly different plasma parameters in the simulations. The magnitude of these differences is consistent with the predictions of an analytic model of the scrape-off layer. Both numerical and analytic grids are insensitive to changes in their defining parameters. Methods for implementing nonorthogonal boundaries in these meshes are also presented; they differ slightly from those required for fully orthogonal grids

  7. Improved Motion Estimation Using Early Zero-Block Detection

    Directory of Open Access Journals (Sweden)

    Y. Lin

    2008-07-01

    Full Text Available We incorporate the early zero-block detection technique into the UMHexagonS algorithm, which has already been adopted in H.264/AVC JM reference software, to speed up the motion estimation process. A nearly sufficient condition is derived for early zero-block detection. Although the conventional early zero-block detection method can achieve significant improvement in computation reduction, the PSNR loss, to whatever extent, is not negligible especially for high quantization parameter (QP) or low bit-rate coding. This paper modifies the UMHexagonS algorithm with the early zero-block detection technique to improve its coding performance. The experimental results reveal that the improved UMHexagonS algorithm greatly reduces computation while maintaining very high coding efficiency.
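
    The heart of early zero-block detection is a threshold test: if the residual of a block (e.g., its SAD after motion compensation) is below a quantization-dependent bound, all quantized transform coefficients can be declared zero and the transform/quantization step skipped. The sketch below shows only that control flow with a made-up threshold model; it does not reproduce the nearly sufficient condition derived in the paper.

    ```python
    import numpy as np

    def zero_block_threshold(qp):
        """Hypothetical monotone threshold model: a larger QP zeroes out larger residuals."""
        return 4.0 * (qp + 1)

    def encode_block(residual, qp):
        """Skip transform/quantization when the early zero-block test fires."""
        sad = np.abs(residual).sum()
        if sad < zero_block_threshold(qp):
            return "skipped (declared all-zero block)"
        return "full transform + quantization path"

    rng = np.random.default_rng(1)
    small_residual = rng.integers(-1, 2, size=(4, 4))    # well-predicted block
    large_residual = rng.integers(-40, 41, size=(4, 4))  # poorly predicted block
    print(encode_block(small_residual, qp=30))           # skipped
    print(encode_block(large_residual, qp=30))           # full path
    ```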

  8. On Analyzing LDPC Codes over Multiantenna MC-CDMA System

    Directory of Open Access Journals (Sweden)

    S. Suresh Kumar

    2014-01-01

    Full Text Available Multiantenna multicarrier code-division multiple access (MC-CDMA) technique has been attracting much attention for designing future broadband wireless systems. In addition, low-density parity-check (LDPC) code, a promising near-optimal error correction code, is also being widely considered in next generation communication systems. In this paper, we propose a simple method to construct a regular quasicyclic low-density parity-check (QC-LDPC) code to improve the transmission performance over the precoded MC-CDMA system with limited feedback. Simulation results show that the coding gain of the proposed QC-LDPC codes is larger than that of the Reed-Solomon codes, and the performance of the multiantenna MC-CDMA system can be greatly improved by these QC-LDPC codes when the data rate is high.
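
    A regular QC-LDPC parity-check matrix is typically assembled from circulant permutation matrices, i.e., cyclically shifted identity blocks. The sketch below builds such a matrix from a small, arbitrary shift table; the table is an assumption for illustration and is not the construction proposed in the paper.

    ```python
    import numpy as np

    def circulant_permutation(p, shift):
        """p x p identity matrix cyclically shifted by `shift` columns."""
        return np.roll(np.eye(p, dtype=int), shift, axis=1)

    def qc_ldpc_parity_check(exponents, p):
        """Assemble a QC-LDPC parity-check matrix from a table of circulant shifts."""
        rows = [np.hstack([circulant_permutation(p, s) for s in row]) for row in exponents]
        return np.vstack(rows)

    # Illustrative 2 x 4 exponent table with circulant size p = 5 (not from the paper).
    exponents = [[0, 1, 2, 3],
                 [0, 2, 4, 1]]
    H = qc_ldpc_parity_check(exponents, p=5)
    print(H.shape)                 # (10, 20): a regular (2, 4) QC-LDPC parity-check matrix
    print(H.sum(axis=0))           # every column has weight 2
    ```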

  9. Bedtime Blood Pressure Chronotherapy Significantly Improves Hypertension Management.

    Science.gov (United States)

    Hermida, Ramón C; Ayala, Diana E; Fernández, José R; Mojón, Artemio; Crespo, Juan J; Ríos, María T; Smolensky, Michael H

    2017-10-01

    Consistent evidence of numerous studies substantiates the asleep blood pressure (BP) mean derived from ambulatory BP monitoring (ABPM) is both an independent and a stronger predictor of cardiovascular disease (CVD) risk than are daytime clinic BP measurements or the ABPM-determined awake or 24-hour BP means. Hence, cost-effective adequate control of sleep-time BP is of marked clinical relevance. Ingestion time, according to circadian rhythms, of hypertension medications of 6 different classes and their combinations significantly improves BP control, particularly sleep-time BP, and reduces adverse effects. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Total sensitivity and uncertainty analysis for LWR pin-cells with improved UNICORN code

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • A new model is established for the total sensitivity and uncertainty analysis. • The NR approximation applied in S&U analysis can be avoided by the new model. • Sensitivity and uncertainty analysis is performed to PWR pin-cells by the new model. • The effects of the NR approximation for the PWR pin-cells are quantified. - Abstract: In this paper, improvements to the multigroup cross-section perturbation model have been proposed and applied in the self-developed UNICORN code, which is capable of performing the total sensitivity and total uncertainty analysis for the neutron-physics calculations by applying the direct numerical perturbation method and the statistical sampling method respectively. The narrow resonance (NR) approximation was applied in the multigroup cross-section perturbation model, implemented in UNICORN. As improvements to the NR approximation to refine the multigroup cross-section perturbation model, an ultrafine-group cross-section perturbation model has been established, in which the actual perturbations are applied to the ultrafine-group cross-section library and the reconstructions of the resonance cross sections are performed by solving the neutron slowing-down equation. The total sensitivity and total uncertainty analysis were then applied to the LWR pin-cells, using both the multigroup and the ultrafine-group cross-section perturbation models. The numerical results show that the NR approximation overestimates the relative sensitivity coefficients and the corresponding uncertainty results for the LWR pin-cells, and the effects of the NR approximation are significant for σ(n,γ) and σ(n,elas) of 238U. Therefore, the effects of the NR approximation applied in the total sensitivity and total uncertainty analysis for the neutron-physics calculations of LWR should be taken into account.
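
    In the direct numerical perturbation method mentioned above, a sensitivity coefficient is simply a normalized finite difference, S = (Δk/k)/(Δσ/σ). The toy sketch below applies that definition to a one-group infinite-medium multiplication factor; the model and cross-section values are illustrative assumptions, not UNICORN results.

    ```python
    def k_infinity(nu_sigma_f, sigma_a):
        """One-group infinite-medium multiplication factor (toy model)."""
        return nu_sigma_f / sigma_a

    def sensitivity(base_value, perturbed_value, rel_perturbation):
        """Direct numerical perturbation: (dk/k) / (dsigma/sigma)."""
        return ((perturbed_value - base_value) / base_value) / rel_perturbation

    nu_sigma_f, sigma_a = 0.0070, 0.0050    # arbitrary illustrative cross sections (1/cm)
    rel = 0.01                               # 1% perturbation of the absorption cross section

    k0 = k_infinity(nu_sigma_f, sigma_a)
    k1 = k_infinity(nu_sigma_f, sigma_a * (1 + rel))
    print(f"S(k_inf w.r.t. sigma_a) = {sensitivity(k0, k1, rel):+.3f}")   # close to -1
    ```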

  11. Cracking the code: the accuracy of coding shoulder procedures and the repercussions.

    Science.gov (United States)

    Clement, N D; Murray, I R; Nie, Y X; McBirnie, J M

    2013-05-01

    Coding of patients' diagnoses and surgical procedures is subject to error levels of up to 40%, with consequences for the distribution of resources and financial recompense. Our aim was to explore and address reasons behind coding errors of shoulder diagnoses and surgical procedures and to evaluate a potential solution. A retrospective review of 100 patients who had undergone surgery was carried out. Coding errors were identified and the reasons explored. A coding proforma was designed to address these errors and was prospectively evaluated for 100 patients. The financial implications were also considered. Retrospective analysis revealed that the correct primary diagnosis was assigned in 54 patients (54%), and only 7 (7%) patients had a correct procedure code assigned. Coders identified indistinct clinical notes and poor clarity of procedure codes as reasons for errors. The proforma was significantly more likely to assign the correct diagnosis (odds ratio 18.2) and the correct procedure code (odds ratio 310.0). High error levels for coding are due to misinterpretation of notes and ambiguity of procedure codes. This can be addressed by allowing surgeons to assign the diagnosis and procedure using a simplified list that is passed directly to the coding department.

  12. SCDAP/RELAP5/MOD3 code development

    International Nuclear Information System (INIS)

    Allison, C.M.; Siefken, J.L.; Coryell, E.W.

    1992-01-01

    The SCDAP/RELAP5/MOD3 computer code is designed to describe the overall reactor coolant system (RCS) thermal-hydraulic response, core damage progression, and fission product release and transport during severe accidents. The code is being developed at the Idaho National Engineering Laboratory (INEL) under the primary sponsorship of the Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission (NRC). Code development activities are currently focused on three main areas - (a) code usability, (b) early phase melt progression model improvements, and (c) advanced reactor thermal-hydraulic model extensions. This paper describes the first two activities. A companion paper describes the advanced reactor model improvements being performed under RELAP5/MOD3 funding

  13. National Emergency Preparedness and Response: Improving for Incidents of National Significance

    National Research Council Canada - National Science Library

    Clayton, Christopher M

    2006-01-01

    The national emergency management system has need of significant improvement in its contingency planning and early consolidation of effort and coordination between federal, state, and local agencies...

  14. Improvement of Electromagnetic Code for Phased Array Antenna Design

    National Research Council Canada - National Science Library

    Holter, Henrik

    2007-01-01

    .... The code which is named PBFDTD (Periodic Boundary FDTD) now handles magnetic materials (lossy and loss-free). Frequency domain surface currents and the electromagnetic field in the computational volume can be visualized...

  15. Advanced Design of Dumbbell-shaped Genetic Minimal Vectors Improves Non-coding and Coding RNA Expression.

    Science.gov (United States)

    Jiang, Xiaoou; Yu, Han; Teo, Cui Rong; Tan, Genim Siu Xian; Goh, Sok Chin; Patel, Parasvi; Chua, Yiqiang Kevin; Hameed, Nasirah Banu Sahul; Bertoletti, Antonio; Patzel, Volker

    2016-09-01

    Dumbbell-shaped DNA minimal vectors lacking nontherapeutic genes and bacterial sequences are considered a stable, safe alternative to viral, nonviral, and naked plasmid-based gene-transfer systems. We investigated novel molecular features of dumbbell vectors aiming to reduce vector size and to improve the expression of noncoding or coding RNA. We minimized small hairpin RNA (shRNA) or microRNA (miRNA) expressing dumbbell vectors in size down to 130 bp generating the smallest genetic expression vectors reported. This was achieved by using a minimal H1 promoter with integrated transcriptional terminator transcribing the RNA hairpin structure around the dumbbell loop. Such vectors were generated with high conversion yields using a novel protocol. Minimized shRNA-expressing dumbbells showed accelerated kinetics of delivery and transcription leading to enhanced gene silencing in human tissue culture cells. In primary human T cells, minimized miRNA-expressing dumbbells revealed higher stability and triggered stronger target gene suppression as compared with plasmids and miRNA mimics. Dumbbell-driven gene expression was enhanced up to 56- or 160-fold by implementation of an intron and the SV40 enhancer compared with control dumbbells or plasmids. Advanced dumbbell vectors may represent one option to close the gap between durable expression that is achievable with integrating viral vectors and short-term effects triggered by naked RNA.

  16. Code cases for implementing risk-based inservice testing in the ASME OM code

    International Nuclear Information System (INIS)

    Rowley, C.W.

    1996-01-01

    Historically inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly-conservative scope for IST components is to use the PRA and plant expert panels to create a two tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for the type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases are being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategy for type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices

  17. Code cases for implementing risk-based inservice testing in the ASME OM code

    Energy Technology Data Exchange (ETDEWEB)

    Rowley, C.W.

    1996-12-01

    Historically inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly-conservative scope for IST components is to use the PRA and plant expert panels to create a two tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for the type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases are being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategy for type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices.

  18. Space-Time Turbo Trellis Coded Modulation for Wireless Data Communications

    Directory of Open Access Journals (Sweden)

    Welly Firmanto

    2002-05-01

    Full Text Available This paper presents the design of space-time turbo trellis coded modulation (ST turbo TCM) for improving the bandwidth efficiency and the reliability of future wireless data networks. We present new recursive space-time trellis coded modulation (STTC) schemes which outperform the feedforward STTC proposed by Tarokh et al. (1998) and Baro et al. (2000) on slow and fast fading channels. A substantial improvement in performance can be obtained by constructing ST turbo TCM which consists of concatenated recursive STTC, decoded by an iterative decoding algorithm. The proposed recursive STTC are used as constituent codes in this scheme. They have been designed to satisfy the design criteria for STTC on slow and fast fading channels, derived for systems with the product of transmit and receive antennas larger than 3. The proposed ST turbo TCM significantly outperforms the best known STTC on both slow and fast fading channels. The capacity of this scheme on fast fading channels is less than 3 dB away from the theoretical capacity bound for multi-input multi-output (MIMO) channels.

  19. Impact of the Revised Malaysian Code on Corporate Governance on Audit Committee Attributes and Firm Performance

    OpenAIRE

    KALLAMU, Basiru Salisu

    2016-01-01

    Abstract. Using a sample of 37 finance companies listed under the finance segment of Bursa Malaysia, we examined the impact of the revision to Malaysian code on corporate governance on audit committee attributes and firm performance. Our result suggests that audit committee attributes significantly improved after the Code was revised. In addition, the coefficient for audit committee and risk committee interlock has a significant negative relationship with Tobin’s Q in the period before the re...

  20. Channel modeling, signal processing and coding for perpendicular magnetic recording

    Science.gov (United States)

    Wu, Zheng

    With the increasing areal density in magnetic recording systems, perpendicular recording has replaced longitudinal recording to overcome the superparamagnetic limit. Studies on perpendicular recording channels including aspects of channel modeling, signal processing and coding techniques are presented in this dissertation. To optimize a high density perpendicular magnetic recording system, one needs to know the tradeoffs between various components of the system including the read/write transducers, the magnetic medium, and the read channel. We extend the work by Chaichanavong on the parameter optimization for systems via design curves. Different signal processing and coding techniques are studied. Information-theoretic tools are utilized to determine the acceptable region for the channel parameters when optimal detection and linear coding techniques are used. Our results show that a considerable gain can be achieved by the optimal detection and coding techniques. The read-write process in perpendicular magnetic recording channels includes a number of nonlinear effects. Nonlinear transition shift (NLTS) is one of them. The signal distortion induced by NLTS can be reduced by write precompensation during data recording. We numerically evaluate the effect of NLTS on the read-back signal and examine the effectiveness of several write precompensation schemes in combating NLTS in a channel characterized by both transition jitter noise and additive white Gaussian electronics noise. We also present an analytical method to estimate the bit-error-rate and use it to help determine the optimal write precompensation values in multi-level precompensation schemes. We propose a mean-adjusted pattern-dependent noise predictive (PDNP) detection algorithm for use on the channel with NLTS. We show that this detector can offer significant improvements in bit-error-rate (BER) compared to conventional Viterbi and PDNP detectors. Moreover, the system performance can be further improved by

  1. Multispectral code excited linear prediction coding and its application in magnetic resonance images.

    Science.gov (United States)

    Hu, J H; Wang, Y; Cahill, P T

    1997-01-01

    This paper reports a multispectral code excited linear prediction (MCELP) method for the compression of multispectral images. Different linear prediction models and adaptation schemes have been compared. The method that uses a forward adaptive autoregressive (AR) model has been proven to achieve a good compromise between performance, complexity, and robustness. This approach is referred to as the MFCELP method. Given a set of multispectral images, the linear predictive coefficients are updated over nonoverlapping three-dimensional (3-D) macroblocks. Each macroblock is further divided into several 3-D micro-blocks, and the best excitation signal for each microblock is determined through an analysis-by-synthesis procedure. The MFCELP method has been applied to multispectral magnetic resonance (MR) images. To satisfy the high quality requirement for medical images, the error between the original image set and the synthesized one is further specified using a vector quantizer. This method has been applied to images from 26 clinical MR neuro studies (20 slices/study, three spectral bands/slice, 256x256 pixels/band, 12 b/pixel). The MFCELP method provides a significant visual improvement over the discrete cosine transform (DCT) based Joint Photographers Expert Group (JPEG) method, the wavelet transform based embedded zero-tree wavelet (EZW) coding method, and the vector tree (VT) coding method, as well as the multispectral segmented autoregressive moving average (MSARMA) method we developed previously.

  2. Development status of TUF code

    International Nuclear Information System (INIS)

    Liu, W.S.; Tahir, A.; Zaltsgendler

    1996-01-01

    An overview of the important development of the TUF code in 1995 is presented. The development in the following areas is presented: control of round-off error propagation, gas resolution and release models, and condensation induced water hammer. This development is mainly generated from station requests for operational support and code improvement. (author)

  3. Benchmark calculation of subchannel analysis codes

    International Nuclear Information System (INIS)

    1996-02-01

    In order to evaluate the analysis capabilities of various subchannel codes used in thermal-hydraulic design of light water reactors, benchmark calculations were performed. The selected benchmark problems and major findings obtained by the calculations were as follows: (1)As for single-phase flow mixing experiments between two channels, the calculated results of water temperature distribution along the flow direction were agreed with experimental results by tuning turbulent mixing coefficients properly. However, the effect of gap width observed in the experiments could not be predicted by the subchannel codes. (2)As for two-phase flow mixing experiments between two channels, in high water flow rate cases, the calculated distributions of air and water flows in each channel were well agreed with the experimental results. In low water flow cases, on the other hand, the air mixing rates were underestimated. (3)As for two-phase flow mixing experiments among multi-channels, the calculated mass velocities at channel exit under steady-state condition were agreed with experimental values within about 10%. However, the predictive errors of exit qualities were as high as 30%. (4)As for critical heat flux(CHF) experiments, two different results were obtained. A code indicated that the calculated CHF's using KfK or EPRI correlations were well agreed with the experimental results, while another code suggested that the CHF's were well predicted by using WSC-2 correlation or Weisman-Pei mechanistic model. (5)As for droplets entrainment and deposition experiments, it was indicated that the predictive capability was significantly increased by improving correlations. On the other hand, a remarkable discrepancy between codes was observed. That is, a code underestimated the droplet flow rate and overestimated the liquid film flow rate in high quality cases, while another code overestimated the droplet flow rate and underestimated the liquid film flow rate in low quality cases. (J.P.N.)

  4. Potential of coded excitation in medical ultrasound imaging

    DEFF Research Database (Denmark)

    Misaridis, Athanasios; Gammelmark, Kim; Jørgensen, C. H.

    2000-01-01

    Improvement in SNR and/or penetration depth can be achieved in medical ultrasound by using long coded waveforms, in a similar manner as in radars or sonars. However, the time-bandwidth product (TB) improvement, and thereby the SNR improvement, is considerably lower in medical ultrasound. Since such codes have a larger bandwidth than the transducer in a typical medical ultrasound system can drive, a more careful code design has proven essential. This paper presents an improved non-linear FM signal appropriate for ultrasonic applications. The new coded waveform exhibits distinct features that make it very attractive in the implementation of coded ultrasound systems. The range resolution that can be achieved is comparable to that of a conventional system, depending on the transducer's bandwidth, and can even be better for broad-band transducers. Simulation results are also presented for comparison.
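
    Coded excitation gains SNR roughly with the time-bandwidth product because a long transmitted waveform is compressed to a short pulse on receive. The snippet below generates a plain linear FM chirp and compresses it with a matched filter to show the idea; it is not the non-linear FM design of the paper, and the sampling and bandwidth figures are arbitrary.

    ```python
    import numpy as np

    fs = 40e6                         # sampling rate (arbitrary)
    duration = 20e-6                  # long coded transmit pulse
    f0, f1 = 2e6, 8e6                 # swept band (arbitrary, "transducer-like")

    t = np.arange(0, duration, 1 / fs)
    k = (f1 - f0) / duration
    chirp = np.exp(1j * 2 * np.pi * (f0 * t + 0.5 * k * t**2))   # complex linear FM waveform

    # Matched filtering (pulse compression): correlate the echo with the transmitted code.
    compressed = np.abs(np.correlate(chirp, chirp, mode="full"))
    compressed /= compressed.max()

    main_lobe_width = np.sum(compressed > 0.5) / fs              # crude -6 dB width estimate
    print(f"transmit length: {duration * 1e6:.1f} us, "
          f"compressed -6 dB width: {main_lobe_width * 1e6:.3f} us")
    ```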

  5. Exploring a QoS Driven Scheduling Approach for Peer-to-Peer Live Streaming Systems with Network Coding

    Science.gov (United States)

    Cui, Laizhong; Lu, Nan; Chen, Fu

    2014-01-01

    Most large-scale peer-to-peer (P2P) live streaming systems use a mesh to organize peers and leverage pull scheduling to transmit packets for providing robustness in dynamic environments. Pull scheduling brings large packet delay. Network coding makes push scheduling feasible in mesh P2P live streaming and improves the efficiency. However, it may also introduce some extra delays and coding computational overhead. To improve the packet delay, streaming quality, and coding overhead, we propose a QoS driven push scheduling approach in this paper. The main contributions of this paper are as follows: (i) we introduce a new network coding method to increase the content diversity and reduce the complexity of scheduling; (ii) we formulate the push scheduling as an optimization problem and transform it to a min-cost flow problem for solving it in polynomial time; (iii) we propose a push scheduling algorithm to reduce the coding overhead and perform extensive experiments to validate the effectiveness of our approach. Compared with previous approaches, the simulation results demonstrate that the packet delay, continuity index, and coding ratio of our system can be significantly improved, especially in dynamic environments. PMID:25114968
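
    The abstract's central computational step, recasting push scheduling as a min-cost flow problem solvable in polynomial time, can be illustrated with a generic assignment of coded segments to receiving peers. The toy graph below (built with networkx, with made-up capacities and per-link costs) only demonstrates the modeling pattern source -> segments -> peers -> sink; it is not the paper's actual formulation.

    ```python
    import networkx as nx

    # Toy instance: assign two coded segments to peers at minimum total cost.
    G = nx.DiGraph()
    segments = ["seg1", "seg2"]
    peers = ["peerA", "peerB", "peerC"]

    for seg in segments:
        G.add_edge("source", seg, capacity=2, weight=0)     # each segment can be pushed to 2 peers
    for peer in peers:
        G.add_edge(peer, "sink", capacity=1, weight=0)      # each peer receives at most 1 segment here

    # Edge weights model a per-link "cost" (e.g., expected delay); values are made up.
    costs = {("seg1", "peerA"): 1, ("seg1", "peerB"): 3, ("seg1", "peerC"): 2,
             ("seg2", "peerA"): 2, ("seg2", "peerB"): 1, ("seg2", "peerC"): 4}
    for (seg, peer), cost in costs.items():
        G.add_edge(seg, peer, capacity=1, weight=cost)

    flow = nx.max_flow_min_cost(G, "source", "sink")
    schedule = [(seg, peer) for seg in segments for peer in peers if flow[seg][peer] > 0]
    print(schedule)        # a minimum-cost push schedule for this toy instance
    ```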

  6. Addressing challenges in bar-code scanning of large-volume infusion bags.

    Science.gov (United States)

    Raman, Kirthana; Heelon, Mark; Kerr, Gary; Higgins, Thomas L

    2011-08-01

    A hospital pharmacy's efforts to identify and address challenges with bedside scanning of bar codes on large-volume parenteral (LVP) infusion bags are described. Bar-code-assisted medication administration (BCMA) has been shown to reduce medication errors and improve patient safety. After the pilot implementation of a BCMA system and point-of-care scanning procedures at a medical center's intensive care unit, it was noted that nurses' attempted bedside scans of certain LVP bags for product identification purposes often were not successful. An investigation and root-cause analysis, including observation of nurses' scanning technique by a multidisciplinary team, determined that the scanning failures stemmed from the placement of two bar-code imprints-one with the product identification code and another, larger imprint with the expiration date and lot number-adjacently on the LVP bags. The nursing staff was educated on a modified scanning technique, which resulted in significantly improved success rates in the scanning of the most commonly used LVP bags. Representatives of the LVP bag manufacturer met with hospital staff to discuss the problem and corrective measures. As part of a subsequent infusion bag redesign, the manufacturer discontinued the use of the bar-code imprint implicated in the scanning failures. Failures in scanning LVP bags were traced to problematic placement of bar-code imprints on the bags. Interdisciplinary collaboration, consultation with the bag manufacturer, and education of the nursing and pharmacy staff resulted in a reduction in scanning failures and the manufacturer's removal of one of the bar codes from its LVP bags.

  7. Effective coding with VHDL principles and best practice

    CERN Document Server

    Jasinski, Ricardo

    2016-01-01

    A guide to applying software design principles and coding practices to VHDL to improve the readability, maintainability, and quality of VHDL code. This book addresses an often-neglected aspect of the creation of VHDL designs. A VHDL description is also source code, and VHDL designers can use the best practices of software development to write high-quality code and to organize it in a design. This book presents this unique set of skills, teaching VHDL designers of all experience levels how to apply the best design principles and coding practices from the software world to the world of hardware. The concepts introduced here will help readers write code that is easier to understand and more likely to be correct, with improved readability, maintainability, and overall quality. After a brief review of VHDL, the book presents fundamental design principles for writing code, discussing such topics as design, quality, architecture, modularity, abstraction, and hierarchy. Building on these concepts, the book then int...

  8. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  9. Code-specific learning rules improve action selection by populations of spiking neurons.

    Science.gov (United States)

    Friedrich, Johannes; Urbanczik, Robert; Senn, Walter

    2014-08-01

    Population coding is widely regarded as a key mechanism for achieving reliable behavioral decisions. We previously introduced reinforcement learning for population-based decision making by spiking neurons. Here we generalize population reinforcement learning to spike-based plasticity rules that take account of the postsynaptic neural code. We consider spike/no-spike, spike count and spike latency codes. The multi-valued and continuous-valued features in the postsynaptic code allow for a generalization of binary decision making to multi-valued decision making and continuous-valued action selection. We show that code-specific learning rules speed up learning both for the discrete classification and the continuous regression tasks. The suggested learning rules also speed up with increasing population size as opposed to standard reinforcement learning rules. Continuous action selection is further shown to explain realistic learning speeds in the Morris water maze. Finally, we introduce the concept of action perturbation as opposed to the classical weight- or node-perturbation as an exploration mechanism underlying reinforcement learning. Exploration in the action space greatly increases the speed of learning as compared to exploration in the neuron or weight space.

  10. New nonbinary quantum codes with larger distance constructed from BCH codes over 𝔽q2

    Science.gov (United States)

    Xu, Gen; Li, Ruihu; Fu, Qiang; Ma, Yuena; Guo, Luobin

    2017-03-01

    This paper concentrates on construction of new nonbinary quantum error-correcting codes (QECCs) from three classes of narrow-sense imprimitive BCH codes over finite field 𝔽q2 (q ≥ 3 is an odd prime power). By a careful analysis on properties of cyclotomic cosets in defining set T of these BCH codes, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing BCH codes is determined to be much larger than the result given according to Aly et al. [S. A. Aly, A. Klappenecker and P. K. Sarvepalli, IEEE Trans. Inf. Theory 53, 1183 (2007)] for each different code length. Thus families of new nonbinary QECCs are constructed, and the newly obtained QECCs have larger distance than those in previous literature.
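
    The defining-set analysis mentioned above rests on q^2-cyclotomic cosets modulo the code length n. The helper below computes such cosets for a small example (q = 3, so q^2 = 9, with a hypothetical length n = 80); it is a generic utility, not the paper's construction.

    ```python
    def cyclotomic_cosets(q2, n):
        """Partition {0, 1, ..., n-1} into q2-cyclotomic cosets modulo n."""
        remaining, cosets = set(range(n)), []
        while remaining:
            s = min(remaining)
            coset, x = [], s
            while x not in coset:
                coset.append(x)
                x = (x * q2) % n
            cosets.append(sorted(coset))
            remaining -= set(coset)
        return cosets

    # Example: cosets modulo n = 80 under multiplication by q^2 = 9 (q = 3).
    for coset in cyclotomic_cosets(9, 80)[:6]:
        print(coset)
    ```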

  11. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corp., Tokyo (Japan)

    2012-07-15

    A light water reactor fuel analysis code, FEMAXI-7, has been developed for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of that description, gives detailed explanations of the operation method of the FEMAXI-7 code and its related codes, the methods of input/output, the methods of source code modification, the features of the subroutine modules, and the internal variables, in a specific manner, in order to help users perform fuel analyses with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  12. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2012-07-01

    A light water reactor fuel analysis code, FEMAXI-7, has been developed for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of that description, gives detailed explanations of the operation method of the FEMAXI-7 code and its related codes, the methods of input/output, the methods of source code modification, the features of the subroutine modules, and the internal variables, in a specific manner, in order to help users perform fuel analyses with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  13. Graphical user interface development for the MARS code

    International Nuclear Information System (INIS)

    Jeong, J.-J.; Hwang, M.; Lee, Y.J.; Kim, K.D.; Chung, B.D.

    2003-01-01

    KAERI has developed the best-estimate thermal-hydraulic system code MARS using the RELAP5/MOD3 and COBRA-TF codes. To exploit the excellent features of the two codes, we consolidated the two codes. Then, to improve the readability, maintainability, and portability of the consolidated code, all the subroutines were completely restructured by employing a modular data structure. At present, a major part of the MARS code development program is underway to improve the existing capabilities. The code couplings with three-dimensional neutron kinetics, containment analysis, and transient critical heat flux calculations have also been carried out. At the same time, graphical user interface (GUI) tools have been developed for user friendliness. This paper presents the main features of the MARS GUI. The primary objective of the GUI development was to provide a valuable aid for all levels of MARS users in their output interpretation and interactive controls. Especially, an interactive control function was designed to allow operator actions during simulation so that users can utilize the MARS code like conventional nuclear plant analyzers (NPAs). (author)

  14. Multiple Schemes for Mobile Payment Authentication Using QR Code and Visual Cryptography

    Directory of Open Access Journals (Sweden)

    Jianfeng Lu

    2017-01-01

    QR codes (quick response codes) are widely used, especially in the mobile payment field, due to their beneficial properties. However, there exists an inevitable risk in the transaction process: it is not easily perceived when an attacker tampers with or replaces the QR code that contains the merchant's beneficiary account. Thus, it is of great urgency to conduct authentication of QR codes. In this study, we propose a novel mechanism based on a visual cryptography scheme (VCS) and aesthetic QR codes, which contains three primary schemes for different concealment levels. The main steps of these schemes are as follows. Firstly, one original QR code is split into two shadows using VC multiple rules; secondly, the two shadows are embedded into the same background image, respectively, and the embedded results are fused with the same carrier QR code, respectively, using the XOR mechanism of Reed-Solomon (RS) coding and the QR code error correction mechanism. Finally, the two aesthetic QR codes can be stacked precisely and the original QR code is restored according to the defined VCS. Experiments corresponding to the three proposed schemes are conducted and demonstrate the feasibility and security of the mobile payment authentication, the significant improvement of the concealment of the shadows in the QR code, and the diversity of mobile payment authentication.
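
    The XOR stacking step that the schemes rely on can be illustrated with a minimal (2, 2) secret-sharing sketch: a binary matrix standing in for a QR module pattern is split into two random-looking shares whose bitwise XOR restores the original. This is only the generic XOR mechanism; the aesthetic-QR embedding and the specific VCS rules of the paper are not reproduced, and the names below are hypothetical.

```python
# Minimal sketch of XOR-based (2, 2) visual secret sharing: a binary matrix
# (e.g., a QR code module pattern) is split into two random-looking shares
# whose bitwise XOR restores the original. Illustrates only the XOR stacking
# step, not the aesthetic-QR embedding described in the paper.
import numpy as np

def split_into_shares(secret: np.ndarray, seed=None):
    rng = np.random.default_rng(seed)
    share1 = rng.integers(0, 2, size=secret.shape, dtype=np.uint8)  # random mask
    share2 = np.bitwise_xor(secret, share1)                         # masked secret
    return share1, share2

def reconstruct(share1: np.ndarray, share2: np.ndarray) -> np.ndarray:
    return np.bitwise_xor(share1, share2)

if __name__ == "__main__":
    secret = np.random.default_rng(0).integers(0, 2, size=(21, 21), dtype=np.uint8)
    s1, s2 = split_into_shares(secret, seed=1)
    assert np.array_equal(reconstruct(s1, s2), secret)
    print("secret recovered exactly from the two shares")
```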

  15. Method for improvement of gamma-transition cascade spectra amplitude resolution by computer processing of coincidence codes

    International Nuclear Information System (INIS)

    Sukhovoj, A.M.; Khitrov, V.A.

    1982-01-01

    A method for improving the amplitude resolution in the case of recording coinciding codes on magnetic tape is suggested. It is shown, on records taken with Ge(Li) detectors of cascades of gamma transitions from the ³⁵Cl(n, γ) reaction, that the full width at half maximum of the peak may decrease by a factor of 2.6 for quanta with energy close to the neutron binding energy. There is no loss of efficiency.

  16. Improving the identification of people with dementia in primary care: evaluation of the impact of primary care dementia coding guidance on identified prevalence.

    Science.gov (United States)

    Russell, Paul; Banerjee, Sube; Watt, Jen; Adleman, Rosalyn; Agoe, Belinda; Burnie, Nerida; Carefull, Alex; Chandan, Kiran; Constable, Dominie; Daniels, Mark; Davies, David; Deshmukh, Sid; Huddart, Martin; Jabin, Ashrafi; Jarrett, Penelope; King, Jenifer; Koch, Tamar; Kumar, Sanjoy; Lees, Stavroula; Mir, Sinan; Naidoo, Dominic; Nyame, Sylvia; Sasae, Ryuichiro; Sharma, Tushar; Thormod, Clare; Vedavanam, Krish; Wilton, Anja; Flaherty, Breda

    2013-12-23

    Improving dementia care is a policy priority nationally and internationally; there is a 'diagnosis gap', with less than half of the cases of dementia ever diagnosed. The English Health Department's Quality and Outcomes Framework (QOF) encourages primary care recognition and recording of dementia. The codes for dementia are complex, with the possibility of underidentification through miscoding. We developed guidance on coding of dementia; we report the impact of applying this to 'clean up' dementia coding and records at a practice level. The guidance had five elements: (1) identify Read Codes for dementia; (2) access the QOF dementia register; (3) generate lists of patients who may have dementia; (4) compare the search with QOF data; and (5) review cases. In each practice, one general practitioner conducted the exercise. The number on the dementia QOF register before and after the exercise was recorded, along with the hours taken to complete the exercise. The setting was London primary care; 23 (85%) of 27 practices participated, covering 79 312 patients (19 562 aged over 65). The outcome measures were the number on dementia QOF registers and the time taken. The number of people with dementia on QOF registers increased from 1007 to 1139 (χ² = 8.17, p = 0.004), raising identification rates by 8.8%. The exercise took 4.7 h per practice on average. These data demonstrate the potential of a simple primary care coding exercise, requiring no specific training, to increase the dementia identification rate. An improvement of 8.8% between 2011 and 2012 is equivalent to that of the fourth most improved primary care trust in the UK. In absolute terms, if this effect were mirrored across UK primary care, the number of cases with dementia identified would rise by over 70 000, from 364 329 to 434 488, raising the recognition rate from 46% to 54.8%. Implementing this exercise appears to be a simple and effective way to improve recognition rates in primary care.

  17. Secure Coding for Safety I and C Systems on Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Y. M.; Park, H. S.; Kim, T. H.

    2015-01-01

    This paper addresses secure coding technologies which can reduce software vulnerabilities, and provides secure coding application guidelines for nuclear safety I and C systems. The use of digital equipment may improve their reliability and reduce maintenance costs. However, the design characteristics of nuclear I and C systems are becoming more complex, and the possibility of cyber-attacks exploiting software vulnerabilities has increased. Software defects, bugs and logic flaws have consistently been the primary causes of the software vulnerabilities that can introduce security weaknesses. In this study, we describe methods for applying secure coding which can reduce software vulnerabilities. Lists of software defects, countermeasures for each defect, and coding rules can be applied as appropriate to the target system's conditions. We expect that the results of this study can help in developing secure coding guidelines and in significantly reducing or eliminating vulnerabilities in nuclear safety I and C software.

  18. Secure Coding for Safety I and C Systems on Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Y. M.; Park, H. S. [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Kim, T. H. [Formal Works Inc., Seoul (Korea, Republic of)

    2015-10-15

    This paper addresses secure coding technologies which can reduce software vulnerabilities, and provides secure coding application guidelines for nuclear safety I and C systems. The use of digital equipment may improve their reliability and reduce maintenance costs. However, the design characteristics of nuclear I and C systems are becoming more complex, and the possibility of cyber-attacks exploiting software vulnerabilities has increased. Software defects, bugs and logic flaws have consistently been the primary causes of the software vulnerabilities that can introduce security weaknesses. In this study, we describe methods for applying secure coding which can reduce software vulnerabilities. Lists of software defects, countermeasures for each defect, and coding rules can be applied as appropriate to the target system's conditions. We expect that the results of this study can help in developing secure coding guidelines and in significantly reducing or eliminating vulnerabilities in nuclear safety I and C software.

  19. Decoding Algorithms for Random Linear Network Codes

    DEFF Research Database (Denmark)

    Heide, Janus; Pedersen, Morten Videbæk; Fitzek, Frank

    2011-01-01

    We consider the problem of efficient decoding of a random linear code over a finite field. In particular we are interested in the case where the code is random, relatively sparse, and use the binary finite field as an example. The goal is to decode the data using fewer operations to potentially achieve a high coding throughput and reduce energy consumption. We use an on-the-fly version of the Gauss-Jordan algorithm as a baseline, and provide several simple improvements to reduce the number of operations needed to perform decoding. Our tests show that the improvements can reduce the number...
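
    For orientation, the following sketch shows the baseline the paper starts from: on-the-fly Gauss-Jordan elimination over GF(2), where each received coding vector is reduced against the rows already stored and kept only if it is innovative. It is an unoptimized illustration under assumed interfaces, not the reduced-operation variants the authors propose.

```python
# Sketch of the baseline decoder idea: on-the-fly Gauss-Jordan elimination over
# GF(2). Each received row is the coding vector of a coded packet; decoding is
# complete once the matrix reaches full rank. Payload handling is omitted, and
# this is only the unoptimized baseline, not the improved variants of the paper.
import numpy as np

class OnTheFlyGF2Decoder:
    def __init__(self, generation_size: int):
        self.n = generation_size
        self.rows = {}              # pivot column -> reduced row (uint8 array)

    def receive(self, row: np.ndarray) -> bool:
        """Insert one coding vector; return True if it was innovative."""
        row = row.copy() % 2
        for pivot, stored in self.rows.items():
            if row[pivot]:
                row ^= stored       # forward elimination (XOR = GF(2) addition)
        nz = np.flatnonzero(row)
        if nz.size == 0:
            return False            # linearly dependent, discard
        pivot = int(nz[0])
        for p, stored in self.rows.items():
            if stored[pivot]:
                self.rows[p] = stored ^ row   # backward elimination
        self.rows[pivot] = row
        return True

    def is_complete(self) -> bool:
        return len(self.rows) == self.n

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dec = OnTheFlyGF2Decoder(8)
    while not dec.is_complete():
        dec.receive(rng.integers(0, 2, size=8, dtype=np.uint8))
    print("full rank reached; generation decodable")
```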

  20. The effect of total noise on two-dimension OCDMA codes

    Science.gov (United States)

    Dulaimi, Layth A. Khalil Al; Badlishah Ahmed, R.; Yaakob, Naimah; Aljunid, Syed A.; Matem, Rima

    2017-11-01

    In this research, we evaluate the effect of total noise on the performance of two-dimensional (2-D) optical code-division multiple access (OCDMA) systems using the 2-D modified double weight (MDW) code under various link parameters, including the impact of multiple-access interference (MAI) and other noise sources on system performance. The 2-D MDW code is compared mathematically with other codes which use similar techniques. We analyzed and optimized the data rate and effective received power. The performance and optimization of the MDW code in an OCDMA system are reported: the bit error rate (BER) can be significantly improved when the desired 2-D MDW code parameters are selected, especially the cross-correlation properties. This reduces the MAI in the system and compensates for BER and phase-induced intensity noise (PIIN) in incoherent OCDMA. The analysis permits a thorough understanding of the impact of PIIN, shot and thermal noise on 2-D MDW OCDMA system performance. PIIN is the main noise factor in the OCDMA network.

  1. The effect of total noise on two-dimension OCDMA codes

    Directory of Open Access Journals (Sweden)

    Khalil Al Dulaimi Layth A.

    2017-01-01

    In this research, we evaluate the effect of total noise on the performance of two-dimensional (2-D) optical code-division multiple access (OCDMA) systems using the 2-D modified double weight (MDW) code under various link parameters, including the impact of multiple-access interference (MAI) and other noise sources on system performance. The 2-D MDW code is compared mathematically with other codes which use similar techniques. We analyzed and optimized the data rate and effective received power. The performance and optimization of the MDW code in an OCDMA system are reported: the bit error rate (BER) can be significantly improved when the desired 2-D MDW code parameters are selected, especially the cross-correlation properties. This reduces the MAI in the system and compensates for BER and phase-induced intensity noise (PIIN) in incoherent OCDMA. The analysis permits a thorough understanding of the impact of PIIN, shot and thermal noise on 2-D MDW OCDMA system performance. PIIN is the main noise factor in the OCDMA network.

  2. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    International Nuclear Information System (INIS)

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  3. An approach to improving the structure of error-handling code in the linux kernel

    DEFF Research Database (Denmark)

    Saha, Suman; Lawall, Julia; Muller, Gilles

    2011-01-01

    The C language does not provide any abstractions for exception handling or other forms of error handling, leaving programmers to devise their own conventions for detecting and handling errors. The Linux coding style guidelines suggest placing error handling code at the end of each function, where it can be reached by gotos. We present an automatic program transformation that transforms error-handling code into this style. We have applied our transformation to the Linux 2.6.34 kernel source code, on which it reorganizes the error handling code of over 1800 functions, in about 25 minutes.

  4. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corporation, Tokyo (Japan)

    2013-10-15

    A light water reactor fuel analysis code, FEMAXI-7, has been developed as an extended version of the former version FEMAXI-6, for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published in the form of another JAEA-Data/Code report. The present manual, which is the counterpart of that description document, gives detailed explanations of the files and operation method of the FEMAXI-7 code and its related codes, the methods of input/output, sample input/output, the methods of source code modification, the subroutine structure, and the internal variables, in a specific manner, in order to help users perform fuel analyses with FEMAXI-7. (author)

  5. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2013-10-01

    A light water reactor fuel analysis code, FEMAXI-7, has been developed as an extended version of the former version FEMAXI-6, for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published in the form of another JAEA-Data/Code report. The present manual, which is the counterpart of that description document, gives detailed explanations of the files and operation method of the FEMAXI-7 code and its related codes, the methods of input/output, sample input/output, the methods of source code modification, the subroutine structure, and the internal variables, in a specific manner, in order to help users perform fuel analyses with FEMAXI-7. (author)

  6. Clean Code - Why you should care

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Writing code is communication, not solely with the computer that executes it, but also with other developers and with oneself. A developer spends a lot of his working time reading and understanding code that was written by other developers or by himself in the past. The readability of the code is an important factor in the time needed to find a bug or add new functionality, which in turn has a big impact on productivity. Code that is difficult to understand, hard to maintain and refactor, and offers many spots for bugs to hide is not considered to be "clean code". But what can be considered "clean code", and what are the advantages of a strict application of its guidelines? In this presentation we will take a look at some typical "code smells" and proposed guidelines to improve your coding skills and write cleaner code that is less bug-prone and easier to maintain.

  7. In-facility transport code review

    International Nuclear Information System (INIS)

    Spore, J.W.; Boyack, B.E.; Bohl, W.R.

    1996-07-01

    The following computer codes were reviewed by the In-Facility Transport Working Group for application to the in-facility transport of radioactive aerosols, flammable gases, and/or toxic gases: (1) CONTAIN, (2) FIRAC, (3) GASFLOW, (4) KBERT, and (5) MELCOR. Based on the review criteria as described in this report and the versions of each code available at the time of the review, MELCOR is the best code for the analysis of in-facility transport when multidimensional effects are not significant. When multi-dimensional effects are significant, GASFLOW should be used

  8. Sensitivity analysis and benchmarking of the BLT low-level waste source term code

    International Nuclear Information System (INIS)

    Suen, C.J.; Sullivan, T.M.

    1993-07-01

    To evaluate the source term for low-level waste disposal, a comprehensive model had been developed and incorporated into a computer code called BLT (Breach-Leach-Transport). Since the release of the original version, many new features and improvements have also been added to the Leach model of the code. This report consists of two different studies based on the new version of the BLT code: (1) a series of verification/sensitivity tests; and (2) benchmarking of the BLT code using field data. Based on the results of the verification/sensitivity tests, the authors concluded that the new version represents a significant improvement and is capable of providing more realistic simulations of the leaching process. Benchmarking work was carried out to provide a reasonable level of confidence in the model predictions. In this study, the experimentally measured release curves for nitrate, technetium-99 and tritium from the saltstone lysimeters operated by Savannah River Laboratory were used. The model results are observed to be in general agreement with the experimental data, within the acceptable limits of uncertainty.

  9. Optimization of Particle-in-Cell Codes on RISC Processors

    Science.gov (United States)

    Decyk, Viktor K.; Karmesin, Steve Roy; Boer, Aeint de; Liewer, Paulette C.

    1996-01-01

    General strategies are developed to optimize particle-in-cell codes written in Fortran for RISC processors, which are commonly used on massively parallel computers. These strategies include data reorganization to improve cache utilization and code reorganization to improve the efficiency of arithmetic pipelines.
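
    One such data reorganization can be illustrated in a few lines: periodically sorting the particle arrays by cell index so that the deposition and gather loops touch grid memory contiguously. The NumPy sketch below is a stand-in written for this summary; the actual Fortran strategies and loop structures of the paper are not reproduced.

```python
# Sketch of one cache-oriented data reorganization used in particle-in-cell
# codes: sorting the particle arrays by cell index so that the charge
# deposition loop walks memory contiguously. Illustrative NumPy stand-in only.
import numpy as np

def sort_particles_by_cell(x, v, nx, length):
    """Return particle positions/velocities reordered by cell index."""
    cell = np.floor(x / length * nx).astype(np.int64)
    cell = np.clip(cell, 0, nx - 1)
    order = np.argsort(cell, kind="stable")   # particles in the same cell end up adjacent
    return x[order], v[order], cell[order]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 1.0, 100_000)
    v = rng.normal(size=100_000)
    xs, vs, cells = sort_particles_by_cell(x, v, nx=64, length=1.0)
    # After sorting, successive particles update neighbouring grid cells,
    # improving cache reuse during charge deposition and field gather.
    assert np.all(np.diff(cells) >= 0)
```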

  10. Polarization diversity scheme on spectral polarization coding optical code-division multiple-access network

    Science.gov (United States)

    Yen, Chih-Ta; Huang, Jen-Fa; Chang, Yao-Tang; Chen, Bo-Hau

    2010-12-01

    We present an experiment demonstrating the spectral-polarization coding optical code-division multiple-access system introduced with a nonideal state of polarization (SOP) matching conditions. In the proposed system, the encoding and double balanced-detection processes are implemented using a polarization-diversity scheme. Because of the quasiorthogonality of Hadamard codes combining with array waveguide grating routers and a polarization beam splitter, the proposed codec pair can encode-decode multiple code words of Hadamard code while retaining the ability for multiple-access interference cancellation. The experimental results demonstrate that when the system is maintained with an orthogonal SOP for each user, an effective reduction in the phase-induced intensity noise is obtained. The analytical SNR values are found to overstate the experimental results by around 2 dB when the received effective power is large. This is mainly limited by insertion losses of components and a nonflattened optical light source. Furthermore, the matching conditions can be improved by decreasing nonideal influences.
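
    The multiple-access interference cancellation afforded by quasi-orthogonal Hadamard codes with balanced (complementary) detection can be checked numerically. The sketch below is an idealized illustration assuming perfect chips and no PIIN, shot/thermal noise or SOP mismatch, so it shows only the code-level cancellation, not the experimental system.

```python
# Numerical sketch of why unipolar Hadamard codes with balanced (complementary)
# detection cancel multiple-access interference in spectral-amplitude-coded
# OCDMA. Idealized: noise and polarization effects are not modelled.
import numpy as np

def hadamard(n):
    """Sylvester Hadamard matrix of order n (n a power of two)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

N = 8
H = hadamard(N)
codes = (1 + H[1:]) // 2          # unipolar codewords, weight N/2 each

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=codes.shape[0])      # one data bit per user
received = bits @ codes                              # superposed spectral chips

for k in range(codes.shape[0]):
    direct = received @ codes[k]                     # decoder branch with c_k
    complement = received @ (1 - codes[k])           # branch with the complement
    decision = direct - complement                   # balanced detection
    # Interferers contribute N/4 to both branches and cancel; only user k remains.
    assert decision == bits[k] * (N // 2)
print("MAI cancelled for all users")
```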

  11. Reracking of fuel pools, experience with improved codes and design for reactor sites with high seismic loads

    International Nuclear Information System (INIS)

    Banck, J.; Wirtz, K.

    1998-01-01

    Reracking of existing pools to the maximum extent is desirable from the economical point of view. Although the load onto the storage rack structure and the fuel pool bottom will be increased, new improved codes, optimized structural qualification procedures and advanced design enable to demonstrate the structural integrity for all normal and accident conditions so that the design provides a safe compact storage of spent fuel under any condition.(author)

  12. Development of computer code in PNC, 3

    International Nuclear Information System (INIS)

    Ohtaki, Akira; Ohira, Hiroaki

    1990-01-01

    Super-COPD, a code which is integrated by calculation modules, has been developed in order to evaluate kinds of dynamics of LMFBR plant by improving COPD. The code involves all models and its advanced models of COPD in module structures. The code makes it possible to simulate the system dynamics of LMFBR plant of any configurations and components. (author)

  13. 75 FR 81587 - Coding of Design Marks in Registrations

    Science.gov (United States)

    2010-12-28

    ... DEPARTMENT OF COMMERCE Patent and Trademark Office [Docket No. PTO-T-2010-0090] Coding of Design... discontinue its secondary design coding, the practice of coding newly registered trademarks in its searchable... temporarily retain the paper collection of registrations with design coding, while improving the accuracy of...

  14. Improving a Power Line Communications Standard with LDPC Codes

    Directory of Open Access Journals (Sweden)

    Hsu Christine

    2007-01-01

    We investigate a power line communications (PLC) scheme that could be used to enhance the HomePlug 1.0 standard, specifically its ROBO mode, which provides modest throughput for the worst-case PLC channel. The scheme is based on using a low-density parity-check (LDPC) code in lieu of the concatenated Reed-Solomon and convolutional codes in ROBO mode. The PLC channel is modeled with multipath fading and Middleton's class A noise. Clipping is introduced to mitigate the effect of impulsive noise. A simple and effective method is devised to estimate the variance of the clipped noise for LDPC decoding. Simulation results show that the proposed scheme outperforms the HomePlug 1.0 ROBO mode and has lower computational complexity. The proposed scheme also dispenses with the repetition of information bits in ROBO mode to gain time diversity, resulting in a 4-fold increase in physical layer throughput.
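
    The variance-estimation idea can be illustrated with a rough Monte Carlo stand-in: generate impulsive noise, clip it at the receiver threshold, and use the empirical variance of the clipped samples in Gaussian log-likelihood ratios. The sketch below uses a Bernoulli-Gaussian mixture as a simple proxy for Middleton's class A model and does not reproduce the paper's actual estimator; all parameter values are assumptions.

```python
# Rough Monte Carlo stand-in for estimating the variance of clipped impulsive
# noise so that LDPC LLRs can assume a Gaussian channel. A Bernoulli-Gaussian
# mixture is used as a simple proxy for Middleton's class A model; the paper's
# own estimator is not reproduced and all parameters are illustrative.
import numpy as np

def clipped_noise_variance(sigma_bg, sigma_imp, p_imp, clip_level, n=1_000_000, seed=0):
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, sigma_bg, n)                 # background Gaussian noise
    impulses = rng.random(n) < p_imp                     # impulsive events
    noise[impulses] += rng.normal(0.0, sigma_imp, impulses.sum())
    clipped = np.clip(noise, -clip_level, clip_level)    # receiver clipping
    return clipped.var()

if __name__ == "__main__":
    var_hat = clipped_noise_variance(sigma_bg=0.2, sigma_imp=2.0,
                                     p_imp=0.01, clip_level=0.6)
    # For BPSK with unit-energy symbols, the LLR of a received sample r would
    # then be approximated as 2 * r / var_hat.
    print(f"estimated clipped-noise variance: {var_hat:.4f}")
```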

  15. NASA space radiation transport code development consortium

    International Nuclear Information System (INIS)

    Townsend, L. W.

    2005-01-01

    Recently, NASA established a consortium involving the Univ. of Tennessee (lead institution), the Univ. of Houston, Roanoke College and various government and national laboratories, to accelerate the development of a standard set of radiation transport computer codes for NASA human exploration applications. This effort involves further improvements of the Monte Carlo codes HETC and FLUKA and the deterministic code HZETRN, including developing nuclear reaction databases necessary to extend the Monte Carlo codes to carry out heavy ion transport, and extending HZETRN to three dimensions. The improved codes will be validated by comparing predictions with measured laboratory transport data, provided by an experimental measurements consortium, and measurements in the upper atmosphere on the balloon-borne Deep Space Test Bed (DSTB). In this paper, we present an overview of the consortium members and the current status and future plans of consortium efforts to meet the research goals and objectives of this extensive undertaking. (authors)

  16. An Improved MOEA/D for QoS Oriented Multimedia Multicasting with Network Coding

    Directory of Open Access Journals (Sweden)

    Zhaoyuan Wang

    2015-08-01

    Recent years have witnessed significant growth in multimedia applications. Among them, a stream of applications is real-time and requires one-to-many fast data transmission with stringent quality-of-service (QoS) requirements, for which multicast is an important supporting technology. In particular, with more and more mobile end users requesting real-time broadband multimedia applications, it is of vital importance to provide them with a satisfactory quality of experience. As network coding can offer higher bandwidth to users and accommodate more flows than traditional routing, this paper studies the multicast routing problem with network coding and formulates it as a multi-objective optimization problem. As delay and packet loss ratio (PLR) are two important performance indicators for QoS, we consider them as the two objectives for minimization. To address the problem above, we present a multi-objective evolutionary algorithm based on decomposition (MOEA/D), where an all-population updating rule is devised to address the problem of lacking feasible solutions in the search space. Experimental results demonstrate the effectiveness of the proposed algorithm, and it outperforms a number of state-of-the-art algorithms.
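
    The decomposition step at the heart of MOEA/D can be sketched independently of the routing details: each subproblem minimizes a weighted Tchebycheff scalarization of the objective vector (here delay and PLR) relative to an ideal point. The snippet below shows only this generic ingredient with made-up candidate values; the all-population updating rule and the network-coding-aware encoding of the paper are not reproduced.

```python
# Sketch of the decomposition used by MOEA/D: each subproblem minimizes a
# weighted Tchebycheff scalarization of the objectives (delay, packet loss
# ratio) relative to the ideal point z*. Candidate values are made up; this is
# the generic MOEA/D ingredient, not the paper's algorithm.
import numpy as np

def tchebycheff(f, weight, z_star):
    """Weighted Tchebycheff value of objective vector f for one subproblem."""
    return np.max(weight * np.abs(np.asarray(f) - np.asarray(z_star)))

# Evenly spread weight vectors for the two objectives (delay, PLR).
n_sub = 5
weights = np.array([[i / (n_sub - 1), 1 - i / (n_sub - 1)] for i in range(n_sub)])
z_star = np.array([10.0, 0.001])          # assumed ideal delay (ms) and PLR

candidates = [(14.0, 0.004), (11.0, 0.020), (25.0, 0.002)]   # hypothetical multicast trees
for w in weights:
    best = min(candidates, key=lambda f: tchebycheff(f, w, z_star))
    print(f"weight {w}: best candidate {best}")
```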

  17. Improving early detection of gastric cancer: a novel systematic alphanumeric-coded endoscopic approach.

    Science.gov (United States)

    Emura, Fabian; Gralnek, Ian; Baron, Todd H

    2013-01-01

    Despite extensive worldwide use of standard esophagogastroduodenoscopy (EGD) examinations, gastric cancer (GC) remains one of the most common forms of cancer and ranks as the most common malignant tumor in East Asia, Eastern Europe and parts of Latin America. Current limitations of the non-systematic examination used during standard EGD could be at least partially responsible for the low rate of early GC diagnosis in countries with a high prevalence of the disease. Originally proposed by Emura et al., systematic alphanumeric-coded endoscopy (SACE) is a novel method that facilitates complete examination of the upper GI tract based on sequential, systematic, overlapping photo-documentation using an endoluminal alphanumeric-coded nomenclature comprised of eight regions and 28 areas covering the entire upper GI surface. For precise localization of normal or abnormal areas, SACE incorporates a simple coordinate system based on the identification of certain natural axes, walls, curvatures and anatomical endoluminal landmarks. The effectiveness of SACE was recently demonstrated in a screening study that diagnosed early GC at a frequency of 0.30% (2/650) in healthy, average-risk volunteer subjects. Such a novel approach, if uniformly implemented worldwide, could significantly change the way we practice upper endoscopy in our lifetimes.

  18. PWR fuel physico chemistry. Improvements of the Sage code to compute thermochemical balance in PWR fuel

    International Nuclear Information System (INIS)

    Garcia, P.; Baron, D.; Piron, J.P.

    1993-02-01

    A physicochemical survey of high burnup fuel has been undertaken in the context of a three-party action (CEA Cadarache - EDF/DER - FRAMATOME). One of the tasks involved consists of adapting the SAGE code for the assessment of the thermochemical equilibria of fission products in solution in the fuel matrix. This paper describes the first stage of this task. Although further improvements are planned, the oxide oxygen potentials are already properly reproduced for the simulated burnups. (authors). 63 figs., 4 tabs., 41 refs

  19. Evaluation Codes from Order Domain Theory

    DEFF Research Database (Denmark)

    Andersen, Henning Ejnar; Geil, Hans Olav

    2008-01-01

    The celebrated Feng-Rao bound estimates the minimum distance of codes defined by means of their parity check matrices. From the Feng-Rao bound it is clear how to improve a large family of codes by leaving out certain rows in their parity check matrices. In this paper we derive a simple lower bound on the minimum distance of codes defined by means of their generator matrices. From our bound it is clear how to improve a large family of codes by adding certain rows to their generator matrices. The new bound is very much related to the Feng-Rao bound as well as to Shibuya and Sakaniwa's bound in [28]. Our bound is easily extended to deal with any generalized Hamming weights. We interpret our methods into the setting of order domain theory. In this way we fill in an obvious gap in the theory of order domains. [28] T. Shibuya and K. Sakaniwa, A Dual of Well-Behaving Type Designed Minimum Distance, IEICE...

  20. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.

  1. Amplitude Modulated Sinusoidal Signal Decomposition for Audio Coding

    DEFF Research Database (Denmark)

    Christensen, M. G.; Jacobson, A.; Andersen, S. V.

    2006-01-01

    In this paper, we present a decomposition for sinusoidal coding of audio, based on an amplitude modulation of sinusoids via a linear combination of arbitrary basis vectors. The proposed method, which incorporates a perceptual distortion measure, is based on a relaxation of a nonlinear least-squares minimization. Rate-distortion curves and listening tests show that, compared to a constant-amplitude sinusoidal coder, the proposed decomposition offers perceptually significant improvements in critical transient signals.
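
    The point that makes such a decomposition tractable is that, once the carrier frequency and phase of a sinusoid are fixed, modelling its amplitude as a linear combination of basis vectors turns envelope estimation into ordinary least squares. The sketch below illustrates this with a plain polynomial basis and a squared-error criterion; the perceptual distortion measure and the relaxation used in the paper are omitted, and all signal parameters are invented for the example.

```python
# Minimal sketch of the linear-in-parameters idea behind amplitude modulated
# sinusoidal coding: with carrier frequency and phase fixed, the envelope is a
# linear combination of basis vectors, so it can be fitted by least squares.
# Plain squared error is used; the paper's perceptual measure is omitted.
import numpy as np

rng = np.random.default_rng(0)
n = np.arange(1024)
t = n / n.size
omega = 2 * np.pi * 0.031
true_env = 0.8 + 0.4 * t - 0.3 * t**2            # slowly varying amplitude
x = true_env * np.cos(omega * n) + 0.01 * rng.normal(size=n.size)

# Envelope basis vectors (here: polynomials up to degree 3), one per column.
B = np.vander(t, 4, increasing=True)             # shape (1024, 4)
A = B * np.cos(omega * n)[:, None]               # linear model: x ≈ A @ c
c, *_ = np.linalg.lstsq(A, x, rcond=None)
env_hat = B @ c                                   # estimated amplitude envelope

print("max envelope error:", float(np.max(np.abs(env_hat - true_env))))
```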

  2. A user's manual for the three-dimensional Monte Carlo transport code SPARTAN

    International Nuclear Information System (INIS)

    Bending, R.C.; Heffer, P.J.H.

    1975-09-01

    SPARTAN is a general-purpose Monte Carlo particle transport code intended for neutron or gamma transport problems in reactor physics, health physics, shielding, and safety studies. The code uses a very general geometry system enabling a complex layout to be described and allows the user to obtain physics data from a number of different types of source library. Special tracking and scoring techniques are used to improve the quality of the results obtained. To enable users to run SPARTAN, brief descriptions of the facilities available in the code are given and full details of data input and job control language, as well as examples of complete calculations, are included. It is anticipated that changes may be made to SPARTAN from time to time, particularly in those parts of the code which deal with physics data processing. The load module is identified by a version number and implementation date, and updates of sections of this manual will be issued when significant changes are made to the code. (author)

  3. High performance mixed optical CDMA system using ZCC code and multiband OFDM

    Directory of Open Access Journals (Sweden)

    Nawawi N. M.

    2017-01-01

    In this paper, we propose a high-performance network design based on a mixed optical code division multiple access (CDMA) system using the zero cross correlation (ZCC) code and multiband orthogonal frequency division multiplexing (OFDM), called catenated OFDM. In addition, we investigate the related parameters, such as effective power, number of users, number of bands, code length and code weight. We then theoretically analyze the system performance comprehensively while considering up to five OFDM bands. The feasibility of the proposed system architecture is verified via numerical analysis. The results demonstrate that the developed modulation solution can significantly increase the total number of users, by up to 80% for five catenated bands compared to a traditional optical CDMA system, with a code length of 80 and transmission at 622 Mbps. It is also demonstrated that the BER performance strongly depends on the code weight, especially with fewer users. As the weight increases, the BER performance improves.

  4. High performance mixed optical CDMA system using ZCC code and multiband OFDM

    Science.gov (United States)

    Nawawi, N. M.; Anuar, M. S.; Junita, M. N.; Rashidi, C. B. M.

    2017-11-01

    In this paper, we propose a high-performance network design based on a mixed optical code division multiple access (CDMA) system using the zero cross correlation (ZCC) code and multiband orthogonal frequency division multiplexing (OFDM), called catenated OFDM. In addition, we investigate the related parameters, such as effective power, number of users, number of bands, code length and code weight. We then theoretically analyze the system performance comprehensively while considering up to five OFDM bands. The feasibility of the proposed system architecture is verified via numerical analysis. The results demonstrate that the developed modulation solution can significantly increase the total number of users, by up to 80% for five catenated bands compared to a traditional optical CDMA system, with a code length of 80 and transmission at 622 Mbps. It is also demonstrated that the BER performance strongly depends on the code weight, especially with fewer users. As the weight increases, the BER performance improves.

  5. Error Concealment using Neural Networks for Block-Based Image Coding

    Directory of Open Access Journals (Sweden)

    M. Mokos

    2006-06-01

    In this paper, a novel adaptive error concealment (EC) algorithm, which lowers the requirements for channel coding, is proposed. It conceals errors in block-based image coding systems by using neural networks. In the proposed algorithm, only intra-frame information is used for reconstruction of an image with isolated damaged blocks. The information from pixels surrounding a damaged block is used to recover the errors using the neural network models. Computer simulation results show that the visual quality and the MSE evaluation of a reconstructed image are significantly improved using the proposed EC algorithm. We also propose a simple non-neural approach for comparison.
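
    The intra-frame idea can be shown with a very small stand-in close in spirit to the non-neural baseline mentioned above: a damaged block is filled from the undamaged pixels on its four borders, weighted by distance. The neural-network predictor itself is not reproduced; the image, block position and weighting below are invented for the example.

```python
# Sketch of intra-frame error concealment (in the spirit of the simple
# non-neural comparison approach): a damaged block is filled from the
# undamaged pixels bordering it, weighted by inverse distance. The paper's
# neural-network predictor is not reproduced here.
import numpy as np

def conceal_block(img, top, left, size):
    """Return a copy of img with the block at (top, left) filled from its borders."""
    out = img.astype(float)
    for i in range(size):
        for j in range(size):
            d_top, d_bot = i + 1, size - i
            d_left, d_right = j + 1, size - j
            vals = np.array([out[top - 1, left + j], out[top + size, left + j],
                             out[top + i, left - 1], out[top + i, left + size]])
            w = 1.0 / np.array([d_top, d_bot, d_left, d_right])
            out[top + i, left + j] = np.sum(w * vals) / np.sum(w)
    return out

if __name__ == "__main__":
    img = np.outer(np.linspace(0, 255, 64), np.ones(64))   # smooth test image
    damaged = img.copy()
    damaged[24:32, 24:32] = 0                                # lost 8x8 block
    restored = conceal_block(damaged, 24, 24, 8)
    print("mean abs error in block:",
          float(np.mean(np.abs(restored[24:32, 24:32] - img[24:32, 24:32]))))
```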

  6. Current and anticipated uses of thermalhydraulic and neutronic codes at PSI

    Energy Technology Data Exchange (ETDEWEB)

    Aksan, S.N.; Zimmermann, M.A.; Yadigaroglu, G. [Paul Scherrer Institut, Villigen (Switzerland)

    1997-07-01

    The thermalhydraulic and/or neutronic codes in use at PSI mainly provide the capability to perform deterministic safety analysis for Swiss NPPs and also serve as analysis tools for experimental facilities for LWR and ALWR simulations. In relation to these applications, physical model development and improvements, and assessment of the codes are also essential components of the activities. In this paper, a brief overview is provided on the thermalhydraulic and/or neutronic codes used for safety analysis of LWRs, at PSI, and also of some experiences and applications with these codes. Based on these experiences, additional assessment needs are indicated, together with some model improvement needs. The future needs that could be used to specify both the development of a new code and also improvement of available codes are summarized.

  7. Current and anticipated uses of thermalhydraulic and neutronic codes at PSI

    International Nuclear Information System (INIS)

    Aksan, S.N.; Zimmermann, M.A.; Yadigaroglu, G.

    1997-01-01

    The thermalhydraulic and/or neutronic codes in use at PSI mainly provide the capability to perform deterministic safety analysis for Swiss NPPs and also serve as analysis tools for experimental facilities for LWR and ALWR simulations. In relation to these applications, physical model development and improvements, and assessment of the codes are also essential components of the activities. In this paper, a brief overview is provided on the thermalhydraulic and/or neutronic codes used for safety analysis of LWRs, at PSI, and also of some experiences and applications with these codes. Based on these experiences, additional assessment needs are indicated, together with some model improvement needs. The future needs that could be used to specify both the development of a new code and also improvement of available codes are summarized

  8. Improving performance of single-path code through a time-predictable memory hierarchy

    DEFF Research Database (Denmark)

    Cilku, Bekim; Puffitsch, Wolfgang; Prokesch, Daniel

    2017-01-01

    This paper presents a time-predictable memory hierarchy with a prefetcher that exploits the predictability of execution traces in single-path code to speed up code execution. The new memory hierarchy reduces both the cache-miss penalty time and the cache-miss rate on the instruction cache. The benefit of the approach is demonstrated through...

  9. Distributed source coding of video

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Van Luong, Huynh

    2015-01-01

    A foundation for distributed source coding was established in the classic papers of Slepian-Wolf (SW) [1] and Wyner-Ziv (WZ) [2]. This has provided a starting point for work on Distributed Video Coding (DVC), which exploits the source statistics at the decoder side, offering shifting processing steps, conventionally performed at the video encoder side, to the decoder side. Emerging applications such as wireless visual sensor networks and wireless video surveillance all require lightweight video encoding with high coding efficiency and error-resilience. The video data of DVC schemes differ from the assumptions of SW and WZ distributed coding, e.g. by being correlated in time and nonstationary. Improving the efficiency of DVC coding is challenging. This paper presents some selected techniques to address the DVC challenges. Focus is put on pin-pointing how the decoder steps are modified to provide...

  10. Coding aperture applied to X-ray imaging

    International Nuclear Information System (INIS)

    Brunol, J.; Sauneuf, R.; Gex, J.P.

    1980-05-01

    We present some X-ray images of grids and plasmas. These images were obtained by using a single circular slit (annular code) as coding aperture and a computer decoding process. The experimental resolution is better than 10μm and it is expected to be in the order of 2 or 3 μm with the same code and an improved decoding process
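
    The convolve-then-decode principle behind such annular-code imaging can be sketched numerically: the detector records the object convolved with the aperture pattern, and a computational step recovers the object. The toy NumPy example below uses plain FFT-based correlation (matched filtering) as the decoder; it is not the improved decoding process referred to in the report, and all sizes and radii are arbitrary.

```python
# Toy sketch of coded-aperture imaging with an annular (single circular slit)
# aperture: the detector records the object convolved with the aperture
# pattern, and a computational decoding step recovers the object. Plain
# FFT-based correlation is used as the decoder; sizes and radii are arbitrary.
import numpy as np

def annular_aperture(n, r_in, r_out):
    y, x = np.indices((n, n)) - n // 2
    r = np.hypot(x, y)
    return ((r >= r_in) & (r <= r_out)).astype(float)

n = 128
aperture = annular_aperture(n, r_in=20, r_out=22)
obj = np.zeros((n, n))
obj[60:68, 60:68] = 1.0                      # small bright source (e.g., a grid cell)

# Recorded image: circular convolution of the object with the aperture pattern.
A, O = np.fft.fft2(aperture), np.fft.fft2(obj)
recorded = np.real(np.fft.ifft2(A * O))

# Correlation decoding (matched filter with the same aperture pattern).
decoded = np.real(np.fft.ifft2(np.fft.fft2(recorded) * np.conj(A)))
print("peak of decoded image at:", np.unravel_index(np.argmax(decoded), decoded.shape))
```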

  11. Iterative optimization of quantum error correcting codes

    International Nuclear Information System (INIS)

    Reimpell, M.; Werner, R.F.

    2005-01-01

    We introduce a convergent iterative algorithm for finding the optimal coding and decoding operations for an arbitrary noisy quantum channel. This algorithm does not require any error syndrome to be corrected completely, and hence also finds codes outside the usual Knill-Laflamme definition of error correcting codes. The iteration is shown to improve the figure of merit 'channel fidelity' in every step

  12. Vectorization of DOT3.5 code

    International Nuclear Information System (INIS)

    Nonomiya, Iwao; Ishiguro, Misako; Tsutsui, Tsuneo

    1990-07-01

    In this report, we describe the vectorization of the two-dimensional Sn-method radiation transport code DOT3.5. The vectorized codes include not only the NEA original version developed at ORNL but also the versions improved by JAERI: the DOT3.5 FNS version for fusion neutronics analyses, the DOT3.5 FER version for fusion reactor design, and the ESPRIT module of the RADHEAT-V4 code system for radiation shielding and radiation transport analyses. In DOT3.5, input/output processing time amounts to a great part of the elapsed time when a large number of energy groups and/or a large number of spatial mesh points are used in the calculated problem. Therefore, an improvement has been made to speed up input/output processing in the DOT3.5 FNS version and the DOT-DD (double differential cross section) code. The total speedup ratio of the vectorized version to the original scalar one is 1.7∼1.9 for the DOT3.5 NEA version, 2.2∼2.3 for the DOT3.5 FNS version, 1.7 for the DOT3.5 FER version, and 3.1∼4.4 for RADHEAT-V4, respectively. The elapsed times for the improved DOT3.5 FNS version and DOT-DD are reduced to 50∼65% of those of the original versions by the input/output speedup. In this report, we describe a summary of the codes, the techniques used for vectorization and input/output speedup, verification of computed results, and the speedup effect. (author)

  13. [Improvement of Phi bodies stain and its clinical significance].

    Science.gov (United States)

    Gong, Xu-Bo; Lu, Xing-Guo; Yan, Li-Juan; Xiao, Xi-Bin; Wu, Dong; Xu, Gen-Bo; Zhang, Xiao-Hong; Zhao, Xiao-Ying

    2009-02-01

    The aim of this study was to improve the staining method for hydroperoxidase (HPO), to analyze the morphologic features of Phi bodies and to evaluate the clinical application of the method. 128 bone marrow or peripheral blood smears from patients with myeloid and lymphoid malignancies were stained with the improved HPO stain, and the Phi bodies and their detection rates in different leukemias were observed. 69 acute myeloid leukemia (AML) specimens were chosen randomly, and the positive rate and the number of Phi bodies were compared between the improved HPO stain and the POX stain based on the same substrate, 3,3'-diaminobenzidine. The results showed that the shape of bundle-like Phi bodies was variable, long or short, while the nubbly Phi bodies often appeared oval and smooth. Club-like Phi bodies were found in M(3). The detection rates of bundle-like Phi bodies in AML M(1)-M(5) were 42.9% (6/14), 83.3% (15/18), 92.0% (23/25), 52.3% (11/21), 33.3% (5/15) respectively, and those of nubbly Phi bodies were 28.6% (4/14), 66.7% (12/18), 11.1% (3/25), 33.3% (7/21), 20.0% (3/15) respectively. The detection rate of bundle-like Phi bodies in M(3) was significantly higher than that in the (M(1) + M(2)) or (M(4) + M(5)) groups. The detection rate of nubbly Phi bodies in the (M(1) + M(2)) group was higher than that in the M(3) group. In conclusion, with the improved staining method the HPO stain becomes simpler, the detection rate of Phi bodies is higher than with the previous method, the positive granules are more obvious, and the results are stable. This improved method plays an important role in differentiating AML from ALL, subtyping AML, and evaluating therapeutic results.

  14. Computationally Efficient Amplitude Modulated Sinusoidal Audio Coding using Frequency-Domain Linear Prediction

    DEFF Research Database (Denmark)

    Christensen, M. G.; Jensen, Søren Holdt

    2006-01-01

    A method for amplitude modulated sinusoidal audio coding is presented that has low complexity and low delay. This is based on a subband processing system, where, in each subband, the signal is modeled as an amplitude modulated sum of sinusoids. The envelopes are estimated using frequency......-domain linear prediction and the prediction coefficients are quantized. As a proof of concept, we evaluate different configurations in a subjective listening test, and this shows that the proposed method offers significant improvements in sinusoidal coding. Furthermore, the properties of the frequency...

  15. Adaptive format conversion for scalable video coding

    Science.gov (United States)

    Wan, Wade K.; Lim, Jae S.

    2001-12-01

    The enhancement layer in many scalable coding algorithms is composed of residual coding information. There is another type of information that can be transmitted instead of (or in addition to) residual coding. Since the encoder has access to the original sequence, it can utilize adaptive format conversion (AFC) to generate the enhancement layer and transmit the different format conversion methods as enhancement data. This paper investigates the use of adaptive format conversion information as enhancement data in scalable video coding. Experimental results are shown for a wide range of base layer qualities and enhancement bitrates to determine when AFC can improve video scalability. Since the parameters needed for AFC are small compared to residual coding, AFC can provide video scalability at low enhancement layer bitrates that are not possible with residual coding. In addition, AFC can also be used in addition to residual coding to improve video scalability at higher enhancement layer bitrates. Adaptive format conversion has not been studied in detail, but many scalable applications may benefit from it. An example of an application that AFC is well-suited for is the migration path for digital television where AFC can provide immediate video scalability as well as assist future migrations.

  16. Code-Mixing and Code Switchingin The Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    This study aimed to describe the forms of code switching and code mixing found in teaching and learning activities in the classroom, as well as the factors influencing those forms of code switching and code mixing. The research is a descriptive qualitative case study which took place at the Al Mawaddah Boarding School, Ponorogo. Based on the analysis, the forms of code mixing and code switching in learning activities at the Al Mawaddah Boarding School involve Javanese, Arabic, English and Indonesian, through the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The deciding factors for code mixing in the learning process include: identification of the role, the desire to explain and interpret, sourcing from the original language and its variations, and sourcing from a foreign language. The deciding factors for code switching in the learning process include: the speaker (O1), the addressee (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially at the Al Mawaddah Boarding School, with respect to the rules and characteristics of language variation in teaching and learning activities in the classroom. Furthermore, the results of this research will provide input to the ustadz/ustadzah and students in developing oral communication skills and the effectiveness of teaching and learning strategies in boarding schools.

  17. New Technique for Improving Performance of LDPC Codes in the Presence of Trapping Sets

    Directory of Open Access Journals (Sweden)

    Mohamed Adnan Landolsi

    2008-06-01

    Trapping sets are considered the primary factor for degrading the performance of low-density parity-check (LDPC) codes in the error-floor region. The effect of trapping sets on the performance of an LDPC code becomes worse as the code size decreases. One approach to tackle this problem is to minimize trapping sets during LDPC code design. However, while trapping sets can be reduced, their complete elimination is infeasible due to the presence of cycles in the underlying LDPC code bipartite graph. In this work, we introduce a new technique based on trapping sets neutralization to minimize the negative effect of trapping sets under belief propagation (BP) decoding. Simulation results for random, progressive edge growth (PEG) and MacKay LDPC codes demonstrate the effectiveness of the proposed technique. The hardware cost of the proposed technique is also shown to be minimal.

  18. Two-phase wall friction model for the trace computer code

    International Nuclear Information System (INIS)

    Wang Weidong

    2005-01-01

    The wall drag model in the TRAC/RELAP5 Advanced Computational Engine (TRACE) computer code has certain known deficiencies. For example, in an annular flow regime, the code predicts an unphysically high liquid velocity compared to the experimental data. To address those deficiencies, a new wall frictional drag package has been developed and implemented in the TRACE code to model the wall drag for a two-phase flow system code. The modeled flow regimes are (1) annular/mist, (2) bubbly/slug, and (3) bubbly/slug with wall nucleation. The new models use void fraction (instead of flow quality) as the correlating variable to minimize calculation oscillations. In addition, the models allow for transitions between the three regimes. The annular/mist regime is subdivided into three separate regimes for pure annular flow, annular flow with entrainment, and film breakdown. For adiabatic two-phase bubbly/slug flows, the vapor phase primarily exists outside of the boundary layer, and the wall shear uses the single-phase liquid velocity for the friction calculation. The vapor-phase wall friction drag is set to zero for bubbly/slug flows. For bubbly/slug flows with wall nucleation, the bubbles are present within the hydrodynamic boundary layer, and the two-phase wall friction drag is significantly higher, with a pronounced mass flux effect. An empirical correlation has been studied and applied to account for nucleate boiling. Verification and validation tests have been performed, and the test results showed a significant code improvement. (authors)

  19. New GOES satellite synchronized time code generation

    Science.gov (United States)

    Fossler, D. E.; Olson, R. K.

    1984-01-01

    The TRAK Systems' GOES Satellite Synchronized Time Code Generator is described. TRAK Systems has developed this timing instrument to supply improved accuracy over most existing GOES receiver clocks. A classical time code generator is integrated with a GOES receiver.

  20. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    Science.gov (United States)

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…
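
    The normalization-plus-overlap idea that such detectors build on can be illustrated in a few lines: identifiers are replaced by a placeholder before comparing token n-grams, so renaming variables alone no longer hides a copy. The sketch below is a toy written for this summary and is not the actual algorithm of Sherlock, JPlag or Moss.

```python
# Tiny illustration of the normalization-plus-overlap idea behind source-code
# plagiarism detectors: identifiers are replaced by a placeholder before
# comparing token n-grams, so renaming variables alone does not hide a copy.
# Not the actual algorithm of Sherlock, JPlag or Moss.
import re
import keyword

def normalized_ngrams(source: str, n: int = 4) -> set:
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|\S", source)
    tokens = ["ID" if t.isidentifier() and not keyword.iskeyword(t) else t
              for t in tokens]                       # keep keywords, mask identifiers
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def similarity(a: str, b: str) -> float:
    ga, gb = normalized_ngrams(a), normalized_ngrams(b)
    return len(ga & gb) / max(1, len(ga | gb))       # Jaccard overlap of n-grams

original = "def total(xs):\n    s = 0\n    for x in xs:\n        s += x\n    return s\n"
renamed  = "def summe(vals):\n    acc = 0\n    for v in vals:\n        acc += v\n    return acc\n"
print(f"similarity after renaming: {similarity(original, renamed):.2f}")
```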

  1. The RAGE radiation-hydrodynamic code

    International Nuclear Information System (INIS)

    Gittings, Michael; Clover, Michael; Betlach, Thomas; Byrne, Nelson; Ranta, Dale; Weaver, Robert; Coker, Robert; Dendy, Edward; Hueckstaedt, Robert; New, Kim; Oakes, W Rob; Stefan, Ryan

    2008-01-01

    We describe RAGE, the 'radiation adaptive grid Eulerian' radiation-hydrodynamics code, including its data structures, its parallelization strategy and performance, its hydrodynamic algorithm(s), its (gray) radiation diffusion algorithm, and some of the considerable amount of verification and validation efforts. The hydrodynamics is a basic Godunov solver, to which we have made significant improvements to increase the advection algorithm's robustness and to converge stiffnesses in the equation of state. Similarly, the radiation transport is a basic gray diffusion, but our treatment of the radiation-material coupling, wherein we converge nonlinearities in a novel manner to allow larger timesteps and more robust behavior, can be applied to any multi-group transport algorithm

  2. COMPBRN III: a computer code for modeling compartment fires

    International Nuclear Information System (INIS)

    Ho, V.; Siu, N.; Apostolakis, G.; Flanagan, G.F.

    1986-07-01

    The computer code COMPBRN III deterministically models the behavior of compartment fires. This code is an improvement of the original COMPBRN codes. It employs a different air entrainment model and numerical scheme to estimate properties of the ceiling hot gas layer model. Moreover, COMPBRN III incorporates a number of improvements in shape factor calculations and error checking, which distinguish it from the COMPBRN II code. This report presents the ceiling hot gas layer model employed by COMPBRN III as well as several other modifications. Information necessary to run COMPBRN III, including descriptions of required input and resulting output, are also presented. Simulation of experiments and a sample problem are included to demonstrate the usage of the code. 37 figs., 46 refs

  3. Nodal kinetics model upgrade in the Penn State coupled TRAC/NEM codes

    International Nuclear Information System (INIS)

    Beam, Tara M.; Ivanov, Kostadin N.; Baratta, Anthony J.; Finnemann, Herbert

    1999-01-01

    The Pennsylvania State University currently maintains and does development and verification work for its own versions of the coupled three-dimensional kinetics/thermal-hydraulics codes TRAC-PF1/NEM and TRAC-BF1/NEM. The subject of this paper is the nodal model enhancements in the above mentioned codes. Because of the numerous validation studies that have been performed on almost every aspect of these codes, this upgrade is done without a major code rewrite. The upgrade consists of four steps. The first two steps are designed to improve the accuracy of the kinetics model, which is based on the nodal expansion method. The polynomial expansion solution of the 1D transverse-integrated diffusion equation is replaced with a solution that uses a semi-analytic expansion. Further, the standard parabolic polynomial representation of the transverse leakage in the above 1D equations is replaced with an improved approximation. The last two steps of the upgrade address code efficiency by improving the solution of the time-dependent NEM equations and implementing a multi-grid solver. These four improvements are implemented into the standalone NEM kinetics code. Verification of this code was accomplished based on the original verification studies. The results show that the new methods improve the accuracy and efficiency of the code. The verification of the upgraded NEM model in the TRAC-PF1/NEM and TRAC-BF1/NEM coupled codes is underway

  4. Improvement of implicit finite element code performance in deep drawing simulations by dynamics contributions

    NARCIS (Netherlands)

    Meinders, Vincent T.; van den Boogaard, Antonius H.; Huetink, Han

    2003-01-01

    To intensify the use of implicit finite element codes for solving large scale problems, the computation time of these codes has to be decreased drastically. A method is developed which decreases the computational time of implicit codes by factors. The method is based on introducing inertia effects

  5. Resonance self-shielding methodology of new neutron transport code STREAM

    International Nuclear Information System (INIS)

    Choi, Sooyoung; Lee, Hyunsuk; Lee, Deokjung; Hong, Ser Gi

    2015-01-01

    This paper reports on the development and verification of three new resonance self-shielding methods. The verifications were performed using the new neutron transport code, STREAM. The new methodologies encompass the extension of energy range for resonance treatment, the development of optimum rational approximation, and the application of resonance treatment to isotopes in the cladding region. (1) The extended resonance energy range treatment has been developed to treat the resonances below 4 eV of three resonance isotopes and shows significant improvements in the accuracy of effective cross sections (XSs) in that energy range. (2) The optimum rational approximation can eliminate the geometric limitations of the conventional approach of equivalence theory and can also improve the accuracy of fuel escape probability. (3) The cladding resonance treatment method makes it possible to treat resonances in cladding material which have not been treated explicitly in the conventional methods. These three new methods have been implemented in the new lattice physics code STREAM and the improvement in the accuracy of effective XSs is demonstrated through detailed verification calculations. (author)
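
    For background, equivalence theory commonly uses rational approximations of the fuel escape probability; the sketch below evaluates the textbook one-term Wigner rational approximation for a cylindrical fuel lump. This is generic background, not the optimum rational approximation developed for STREAM.

        import math

        # One-term Wigner rational approximation to the neutron escape probability of a
        # convex fuel lump: P_esc ~ 1 / (1 + Sigma_t * l_bar), with mean chord l_bar = 4V/S.
        # Textbook illustration only; STREAM's "optimum rational approximation" replaces
        # this with a more accurate fit.

        def mean_chord_cylinder(radius):
            # For an infinitely long cylinder, 4V/S reduces to 2*radius.
            return 2.0 * radius

        def wigner_escape_probability(sigma_t, radius):
            x = sigma_t * mean_chord_cylinder(radius)   # optical thickness of the lump
            return 1.0 / (1.0 + x)

        # Example: 0.41 cm pellet radius, total cross section 0.5 /cm in the resonance.
        print(f"P_esc ~ {wigner_escape_probability(0.5, 0.41):.3f}")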

  6. Two dimension MDW OCDMA code cross-correlation for reduction of phase induced intensity noise

    Directory of Open Access Journals (Sweden)

    Sh. Ahmed Israa

    2017-01-01

    Full Text Available In this paper, we first review the 2-D MDW code cross-correlation equations and table, which are significantly improved by using the code correlation properties. These codes can be used in synchronous optical CDMA systems to cancel multiple-access interference and maximally suppress the phase induced intensity noise (PIIN). A low Psr is due to the reduction of the interference noise, which is achieved by the 2-D MDW code's PIIN suppression. A high data rate increases the BER, requires a high effective power and severely deteriorates the system performance. The 2-D W/T MDW code achieves excellent system performance, where PIIN is suppressed as low as possible at the optimum Psr with a high data bit rate. The 2-D MDW code shows better tolerance to PIIN than the others, with enhanced system performance. We prove by numerical analysis that the PIIN is maximally suppressed by the MDW code through the minimizing property of its cross-correlation, in comparison with the 2-D PDC and 2-D MQC OCDMA code schemes.

  7. Two dimension MDW OCDMA code cross-correlation for reduction of phase induced intensity noise

    Science.gov (United States)

    Ahmed, Israa Sh.; Aljunid, Syed A.; Nordin, Junita M.; Dulaimi, Layth A. Khalil Al; Matem, Rima

    2017-11-01

    In this paper, we first review the 2-D MDW code cross-correlation equations and table, which are significantly improved by using the code correlation properties. These codes can be used in synchronous optical CDMA systems to cancel multiple-access interference and maximally suppress the phase induced intensity noise (PIIN). A low Psr is due to the reduction of the interference noise, which is achieved by the 2-D MDW code's PIIN suppression. A high data rate increases the BER, requires a high effective power and severely deteriorates the system performance. The 2-D W/T MDW code achieves excellent system performance, where PIIN is suppressed as low as possible at the optimum Psr with a high data bit rate. The 2-D MDW code shows better tolerance to PIIN than the others, with enhanced system performance. We prove by numerical analysis that the PIIN is maximally suppressed by the MDW code through the minimizing property of its cross-correlation, in comparison with the 2-D PDC and 2-D MQC OCDMA code schemes.
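
    As a generic illustration of the cross-correlation property discussed above, the sketch below computes the in-phase cross-correlation of two 2-D (wavelength/time) binary code matrices; the matrices shown are arbitrary examples, not actual MDW codewords.

        import numpy as np

        # In-phase cross-correlation of two 2-D (wavelength/time) binary code matrices:
        # the number of positions where both codes carry a pulse. Keeping this value low
        # between different users is what limits multiple-access interference and PIIN.
        # The example matrices below are arbitrary, not actual MDW codewords.

        def cross_correlation(code_a: np.ndarray, code_b: np.ndarray) -> int:
            return int(np.sum(code_a * code_b))

        user1 = np.array([[1, 0, 1, 0],
                          [0, 1, 0, 0]])
        user2 = np.array([[0, 1, 0, 1],
                          [1, 0, 0, 0]])

        print("auto-correlation (code weight):", cross_correlation(user1, user1))
        print("cross-correlation             :", cross_correlation(user1, user2))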

  8. Improved production of membrane proteins in Escherichia coli by selective codon substitutions

    DEFF Research Database (Denmark)

    Nørholm, Morten H.H.; Toddo, Stephen; Virkki, Minttu T.I.

    2013-01-01

    Membrane proteins are extremely challenging to produce in sufficient quantities for biochemical and structural analysis and there is a growing demand for solutions to this problem. In this study we attempted to improve expression of two difficult-to-express coding sequences (araH and narK) for membrane transporters. For both coding sequences, synonymous codon substitutions in the region adjacent to the AUG start led to significant improvements in expression, whereas multi-parameter sequence optimization of codons throughout the coding sequence failed. We conclude that coding sequences can be re...

  9. Interface between computational fluid dynamics (CFD) and plant analysis computer codes

    International Nuclear Information System (INIS)

    Coffield, R.D.; Dunckhorst, F.F.; Tomlinson, E.T.; Welch, J.W.

    1993-01-01

    Computational fluid dynamics (CFD) can provide valuable input to the development of advanced plant analysis computer codes. The types of interfacing discussed in this paper will directly contribute to modeling and accuracy improvements throughout the plant system and should result in significant reduction of design conservatisms that have been applied to such analyses in the past

  10. Improvement of the MSG code for the MONJU evaporators. Additional function of reverse flow calculation on water/steam model and animation for post processing

    International Nuclear Information System (INIS)

    Toda, Shin-ichi; Yoshikawa, Shinji; Oketani, Kazuhiro

    2003-05-01

    The improved version of the MSG code (Multi-dimensional Thermal-hydraulic Analysis Code for Steam Generators) has been released. The original version was improved in order to calculate reverse flow on the water/steam side and to animate the post-processing data. To calculate reverse flow locally, the code was modified to set the pressure at each divided node point of the water/steam region in the helical-coil heat transfer tubes. The matrix solver was also improved so that the problem can be treated within a practical calculation time despite the increased number of pressure points. In this case pressure and enthalpy have to be calculated simultaneously; however, it was found that the block-Jacobian method makes the matrix diagonally dominant, so that it can be solved efficiently with a relaxation method. Calculations of a steady-state condition and of a transient of SG blowdown with manual trip operation confirmed the improved calculation capability of the MSG code. In addition, an animation function for the temperature contour on the sodium shell side has been added as a post-processing feature. Since the animation is very effective for understanding the thermal-hydraulic behavior on the sodium shell side of the SG, especially under transient conditions, the analysis and evaluation of the calculation results can be performed more quickly and effectively. (author)
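
    The solver strategy described above (a diagonally dominant system handled by a relaxation method) can be illustrated with a generic point-Jacobi iteration; this is a self-contained sketch, not the MSG code's actual solver.

        import numpy as np

        # Generic Jacobi relaxation for a diagonally dominant linear system A x = b.
        # Diagonal dominance (as produced by a block-Jacobian formulation) guarantees
        # convergence of this simple point-relaxation scheme. Stand-alone illustration.

        def jacobi_solve(A, b, tol=1e-10, max_iter=1000):
            x = np.zeros_like(b, dtype=float)
            D = np.diag(A)                     # diagonal entries
            R = A - np.diagflat(D)             # off-diagonal part
            for _ in range(max_iter):
                x_new = (b - R @ x) / D
                if np.linalg.norm(x_new - x, ord=np.inf) < tol:
                    return x_new
                x = x_new
            return x

        A = np.array([[10.0, -1.0, 2.0],
                      [-1.0, 11.0, -1.0],
                      [ 2.0, -1.0, 10.0]])
        b = np.array([6.0, 25.0, -11.0])
        print(jacobi_solve(A, b), "vs", np.linalg.solve(A, b))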

  11. Improvement of spallation reaction simulation codes NMTC/JAERI and NUCLEUS

    International Nuclear Information System (INIS)

    Nishida, T.; Takada, H.; Kanno, I.; Nakahara, Y.

    1990-01-01

    To make evaluations of theoretical models for nuclear spallation reactions, simulation codes are modified and a new mass formula is used to improve the accuracy of Monte Carlo calculations. The following conclusions are made from analyses of calculated distributions of nuclear spallation products. A difference is found between Cameron's old mass formula and Uno and Yamada's new mass formula, which is due to the difference in the method used to fit their shell energy terms to measured data for selected nuclei and in the data themselves. For nuclides with an atomic number larger than 70, mass excesses calculated by Cameron's mass formula are greater than those by Uno and Yamada's, whereas the reverse tendency is seen for nuclides with atomic numbers smaller than 70. Analysis shows that the distributions of produced nuclei have patterns that appear natural from a physical point of view when artificial restrictions are removed in counting the nuclide production events. The new mass formula can reproduce the experimental product yield distributions fairly well, especially on the neutron excess side. It is also found that the old mass formula gives lower estimates for the number of produced nuclei than the new one, especially in the nuclide region far from the beta stability line. (N.K.)

  12. Alternate symbol inversion for improved symbol synchronization in convolutionally coded systems

    Science.gov (United States)

    Simon, M. K.; Smith, J. G.

    1980-01-01

    Inverting alternate symbols of the encoder output of a convolutionally coded system provides sufficient density of symbol transitions to guarantee adequate symbol synchronizer performance, a guarantee otherwise lacking. Although alternate symbol inversion may increase or decrease the average transition density, depending on the data source model, it produces a maximum number of contiguous symbols without transition for a particular class of convolutional codes, independent of the data source model. Further, this maximum is sufficiently small to guarantee acceptable symbol synchronizer performance for typical applications. Subsequent inversion of alternate detected symbols permits proper decoding.
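
    The inversion operation itself is straightforward; the sketch below shows alternate symbol inversion applied to a generic binary encoder output stream and the matching re-inversion before decoding (illustration only).

        # Alternate symbol inversion for a generic stream of binary channel symbols (0/1).
        # Inverting every second encoder output symbol guarantees transitions for the
        # symbol synchronizer; inverting the same positions again at the receiver
        # restores the original stream before decoding. Illustration only.

        def invert_alternate(symbols, start=1):
            """Flip symbols at positions start, start+2, start+4, ..."""
            return [s ^ 1 if i % 2 == start % 2 else s for i, s in enumerate(symbols)]

        encoder_output = [1, 1, 1, 1, 0, 0, 0, 0, 1, 1]
        transmitted = invert_alternate(encoder_output)      # transitions guaranteed
        recovered = invert_alternate(transmitted)           # undo at the receiver

        print(transmitted)
        print(recovered == encoder_output)                  # True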

  13. Computer Security: is your code sane?

    CERN Multimedia

    Stefan Lueders, Computer Security Team

    2015-01-01

    How many of us write code? Software? Programs? Scripts? How many of us are properly trained in this and how well do we do it? Do we write functional, clean and correct code, without flaws, bugs and vulnerabilities*? In other words: are our codes sane?   Figuring out weaknesses is not that easy (see our quiz in an earlier Bulletin article). Therefore, in order to improve the sanity of your code, prevent common pit-falls, and avoid the bugs and vulnerabilities that can crash your code, or – worse – that can be misused and exploited by attackers, the CERN Computer Security team has reviewed its recommendations for checking the security compliance of your code. “Static Code Analysers” are stand-alone programs that can be run on top of your software stack, regardless of whether it uses Java, C/C++, Perl, PHP, Python, etc. These analysers identify weaknesses and inconsistencies including: employing undeclared variables; expressions resu...

  14. Performance Tuning of x86 OpenMP Codes with MAQAO

    Science.gov (United States)

    Barthou, Denis; Charif Rubial, Andres; Jalby, William; Koliai, Souad; Valensi, Cédric

    Failing to find the best optimization sequence for a given application code can lead to compiler-generated code with poor performance or inappropriate code. It is necessary to analyze performance from the generated assembly code in order to improve the compilation process. This paper presents a tool for the performance analysis of multithreaded codes (OpenMP programs are supported at the moment). MAQAO relies on static performance evaluation to identify compiler optimizations and assess the performance of loops. It exploits static binary rewriting for reading and instrumenting object files or executables. Static binary instrumentation allows the insertion of probes at the instruction level. Memory accesses can be captured to help tune the code, but such traces need to be compressed. MAQAO can analyze the results and provide hints for tuning the code. We show with some examples how this can help users improve their OpenMP applications.

  15. Development of realistic thermal hydraulic system analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, B. D.; Kim, K. D. [and others]

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, the GUI (Graphic User Interface) feature was developed to enhance the user's convenience. To develop the coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial, regulatory organizations and universities. MARS has been widely used for safety research on existing PWRs, advanced PWRs, CANDU and research reactors, the pre-test analysis of TH experiments, and others.

  16. Development of realistic thermal hydraulic system analysis code

    International Nuclear Information System (INIS)

    Lee, Won Jae; Chung, B. D; Kim, K. D.

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, the GUI (Graphic User Interface) feature was developed to enhance the user's convenience. To develop the coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial, regulatory organizations and universities. MARS has been widely used for safety research on existing PWRs, advanced PWRs, CANDU and research reactors, the pre-test analysis of TH experiments, and others

  17. Development of Regulatory Audit Core Safety Code : COREDAX

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Chae Yong; Jo, Jong Chull; Roh, Byung Hwan [Korea Institute of Nuclear Safety, Taejon (Korea, Republic of); Lee, Jae Jun; Cho, Nam Zin [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2005-07-01

    Korea Institute of Nuclear Safety (KINS) has developed a core neutronics simulator, the COREDAX code, for verifying core safety of the SMART-P reactor, with technical support from the Korea Advanced Institute of Science and Technology (KAIST). The COREDAX code would be used for regulatory audit calculations of 3-dimensional core neutronics. The COREDAX code solves the steady-state and time-dependent multi-group neutron diffusion equation in hexagonal geometry as well as rectangular geometry by the analytic function expansion nodal (AFEN) method. The AFEN method was developed at KAIST, and its excellent accuracy has been verified internationally. The COREDAX code was originally programmed based on the AFEN method. The accuracy of the AFEN method was excellent for hexagonal 2-dimensional problems, but improvement was needed for hexagonal-z 3-dimensional problems. Hence, several solution routines of the AFEN method were improved, resulting in the advanced AFEN method, on which the COREDAX code is based. The initial version of the COREDAX code completes a basic framework, performing eigenvalue calculations and kinetics calculations with thermal-hydraulic feedbacks, for audit calculations of steady-state core design and reactivity-induced accidents of the SMART-P reactor. This study describes the COREDAX code for hexagonal geometry.

  18. ACDOS2: an improved neutron-induced dose rate code

    International Nuclear Information System (INIS)

    Lagache, J.C.

    1981-06-01

    To calculate the expected dose rate from fusion reactors as a function of geometry, composition, and time after shutdown a computer code, ACDOS2, was written, which utilizes up-to-date libraries of cross-sections and radioisotope decay data. ACDOS2 is in ANSI FORTRAN IV, in order to make it readily adaptable elsewhere

  19. ACDOS2: an improved neutron-induced dose rate code

    Energy Technology Data Exchange (ETDEWEB)

    Lagache, J.C.

    1981-06-01

    To calculate the expected dose rate from fusion reactors as a function of geometry, composition, and time after shutdown a computer code, ACDOS2, was written, which utilizes up-to-date libraries of cross-sections and radioisotope decay data. ACDOS2 is in ANSI FORTRAN IV, in order to make it readily adaptable elsewhere.

  20. PHITS code improvements by Regulatory Standard and Research Department Secretariat of Nuclear Regulation Authority

    International Nuclear Information System (INIS)

    Goko, Shinji

    2017-01-01

    As for the safety analyses to be carried out when a nuclear power company applies for a facility or equipment installation permit, business license, design approval, etc., the Regulatory Standard and Research Department Secretariat of the Nuclear Regulation Authority continuously conducts safety research on the introduction of various technologies and their improvement in order to evaluate the adequacy of these safety analyses. In the field of shielding analysis of nuclear fuel transport materials, the group improved the PHITS code to make it applicable to this field, and has been promoting the improvement as a tool for regulations since FY2013. This paper introduces the history and progress of this safety research. PHITS 2.88, the latest version as of November 2016, is equipped with an automatic generation function for variance reduction parameters [T-WWG] and many other functions that are effective in practical application to nuclear power regulation. In addition, the group conducted a verification analysis against nuclear fuel packages, which showed good agreement with the analysis by MCNP, a code that is extensively used worldwide and has an abundant record of applications. It also shows relatively good agreement with measured values when differences between analysis and measurement are considered. (A.O.)

  1. [QR-Code based patient tracking: a cost-effective option to improve patient safety].

    Science.gov (United States)

    Fischer, M; Rybitskiy, D; Strauß, G; Dietz, A; Dressler, C R

    2013-03-01

    Hospitals are implementing risk management systems to avoid patient or surgery mix-ups. The trend is to use preoperative checklists. This work deals specifically with a type of patient identification that is realized by storing patient data on a patient-fixed medium. In 127 ENT surgeries, data relevant for patient identification were encoded in a 2D QR code. The code, as a separate document accompanying the patient chart or as a patient wristband, was decoded in the OR and the patient data were presented visibly for all persons. The decoding time, the agreement of the patient data, and the duration of the patient identification were compared with traditional patient identification by inspection of the patient chart. A total of 125 QR codes were read. The time for decoding the QR code was 5.6 s, the time for the screen view for patient identification was 7.9 s, and for a comparison group of 75 operations traditional patient identification took 27.3 s. Overall, there were 6 relevant information errors in the two parts of the experiment, corresponding to a rate of 0.6% across the 8 relevant data classes per encoded QR code. This work presents a cost-effective way to technically support patient identification based on electronic patient data, and it was shown that use in the clinical routine is possible. The disadvantage is potential misinformation from incorrect or missing information in the HIS, or due to changes of the data after the code was created. QR-code-based patient tracking is seen as a useful complement to the already widely used identification wristband. © Georg Thieme Verlag KG Stuttgart · New York.
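
    As a rough illustration of the underlying mechanism (not the system evaluated in this study), a 2D QR code carrying patient-identification fields can be generated with the third-party Python qrcode package, assuming it is installed; the field names are invented for the example.

        # Rough illustration only: encode a minimal patient-identification record into a
        # QR code image. Assumes the third-party "qrcode" package (with Pillow) is
        # installed; the field names below are made up for the example.
        import json
        import qrcode

        patient_record = {
            "patient_id": "0012345",
            "name": "DOE, JOHN",
            "date_of_birth": "1970-01-01",
            "procedure": "ENT surgery, left side",
        }

        img = qrcode.make(json.dumps(patient_record))
        img.save("patient_wristband_qr.png")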

  2. Computer-modeling codes to improve exploration nuclear-logging methods. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Wilson, R.D.; Price, R.K.; Kosanke, K.L.

    1983-03-01

    As part of the Department of Energy's National Uranium Resource Evaluation (NURE) project's Technology Development effort, a number of computer codes and accompanying data bases were assembled for use in modeling responses of nuclear borehole logging sondes. The logging methods include fission neutron, active and passive gamma-ray, and gamma-gamma. These CDC-compatible computer codes and data bases are available on magnetic tape from the DOE Technical Library at its Grand Junction Area Office. Some of the computer codes are standard radiation-transport programs that have been available to the radiation shielding community for several years. Other codes were specifically written to model the response of borehole radiation detectors or are specialized borehole modeling versions of existing Monte Carlo transport programs. Results from several radiation modeling studies are available as two large data bases (neutron and gamma-ray). These data bases are accompanied by appropriate processing programs that permit the user to model a wide range of borehole and formation-parameter combinations for fission-neutron, neutron-activation and gamma-gamma logs. The first part of this report consists of a brief abstract for each code or data base. The abstract gives the code name and title, short description, auxiliary requirements, typical running time (CDC 6600), and a list of references. The next section gives format specifications and/or a directory for the tapes. The final section of the report presents listings for programs used to convert data bases between machine floating-point and EBCDIC

  3. Codeword Structure Analysis for LDPC Convolutional Codes

    Directory of Open Access Journals (Sweden)

    Hua Zhou

    2015-12-01

    Full Text Available The codewords of a low-density parity-check (LDPC) convolutional code (LDPC-CC) are characterised as structured or non-structured. The number of structured codewords is dominated by the size of the polynomial syndrome former matrix H^T(D), while the number of non-structured ones depends on the particular monomials or polynomials in H^T(D). By evaluating the relationship of the codewords between the mother code and its super codes, the low-weight non-structured codewords in the super codes can be eliminated by appropriately choosing the monomials or polynomials in H^T(D), resulting in an improved distance spectrum of the mother code.

  4. Protograph LDPC Codes Over Burst Erasure Channels

    Science.gov (United States)

    Divsalar, Dariush; Dolinar, Sam; Jones, Christopher

    2006-01-01

    In this paper we design high rate protograph based LDPC codes suitable for binary erasure channels. To simplify the encoder and decoder implementation for high data rate transmission, the structure of the codes is based on protographs and circulants. These LDPC codes can improve data link and network layer protocols in support of communication networks. Two classes of codes were designed. One class is designed for large block sizes with an iterative decoding threshold that approaches the capacity of the binary erasure channel. The other class is designed for short block sizes based on maximizing the minimum stopping set size. For high code rates and short blocks the second class outperforms the first class.
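
    Over a binary erasure channel, LDPC decoding reduces to iteratively resolving parity checks that contain exactly one erased bit; the sketch below runs this generic peeling decoder on a toy parity-check matrix, not on one of the protograph codes designed in the paper.

        import numpy as np

        # Generic peeling decoder for an LDPC code over a binary erasure channel:
        # repeatedly find a parity check with exactly one erased participant and solve
        # for it. Toy parity-check matrix, not one of the protograph codes in the paper.

        H = np.array([[1, 1, 0, 1, 0, 0],
                      [0, 1, 1, 0, 1, 0],
                      [1, 0, 1, 0, 0, 1]])

        def peel(H, received):
            """received: list of 0/1 values, with None marking an erasure."""
            word = list(received)
            progress = True
            while progress:
                progress = False
                for row in H:
                    erased = [j for j in np.flatnonzero(row) if word[j] is None]
                    if len(erased) == 1:               # exactly one erasure in this check
                        known = sum(word[j] for j in np.flatnonzero(row)
                                    if word[j] is not None)
                        word[erased[0]] = known % 2    # even parity must hold
                        progress = True
            return word

        # Codeword [1, 1, 0, 0, 1, 1] satisfies H @ c = 0 (mod 2); erase two positions.
        received = [None, 1, 0, 0, None, 1]
        print(peel(H, received))                       # -> [1, 1, 0, 0, 1, 1]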

  5. The use of best estimate codes to improve the simulation in real time

    International Nuclear Information System (INIS)

    Rivero, N.; Esteban, J. A.; Lenhardt, G.

    2007-01-01

    Best estimate codes are assumed to be the technology solution providing the most realistic and accurate response. Best estimate technology provides a complementary solution to the conservative simulation technology usually applied to determine plant safety margins and perform safety-related studies. In the early 90's, within the MAS project, Tecnatom pioneered the initiative to implement best estimate codes in its training simulators. The result of this project was the implementation of the first six-equation thermal-hydraulic code worldwide (TRAC R T) running in a training environment. To meet real-time and other specific training requirements, important difficulties had to be overcome. Tecnatom has just adapted the Global Nuclear Fuel core design code PANAC 11, and is about to complete the adaptation of the General Electric TRACG04 thermal-hydraulic code. This technology offers a unique solution for nuclear plants aiming to provide the highest fidelity in simulation, enabling the simulator to be considered a multipurpose (engineering and training) simulation platform. In addition, a visual environment designed to optimize the model life cycle, covering both pre- and post-processing activities, is in its late development phase. (Author)

  6. Lossy/lossless coding of bi-level images

    DEFF Research Database (Denmark)

    Martins, Bo; Forchhammer, Søren

    1997-01-01

    Summary form only given. We present improvements to a general type of lossless, lossy, and refinement coding of bi-level images (Martins and Forchhammer, 1996). Loss is introduced by flipping pixels. The pixels are coded using arithmetic coding of conditional probabilities obtained using a template...... as is known from JBIG and proposed in JBIG-2 (Martins and Forchhammer). Our new state-of-the-art results are obtained using the more general free tree instead of a template. Also we introduce multiple refinement template coding. The lossy algorithm is analogous to the greedy `rate...

  7. Training and support to improve ICD coding quality: A controlled ...

    African Journals Online (AJOL)

    [1] In response to this challenge, the Western Cape Government: Health (WCGH) ... commit to costly, time-consuming accredited ICD coding courses, although ..... being blinded to the retrospective 'assignment' of patients to the intervention ...

  8. Joint Source-Channel Coding by Means of an Oversampled Filter Bank Code

    Directory of Open Access Journals (Sweden)

    Marinkovic Slavica

    2006-01-01

    Full Text Available Quantized frame expansions based on block transforms and oversampled filter banks (OFBs have been considered recently as joint source-channel codes (JSCCs for erasure and error-resilient signal transmission over noisy channels. In this paper, we consider a coding chain involving an OFB-based signal decomposition followed by scalar quantization and a variable-length code (VLC or a fixed-length code (FLC. This paper first examines the problem of channel error localization and correction in quantized OFB signal expansions. The error localization problem is treated as an M-ary hypothesis testing problem. The likelihood values are derived from the joint pdf of the syndrome vectors under various hypotheses of impulse noise positions, and in a number of consecutive windows of the received samples. The error amplitudes are then estimated by solving the syndrome equations in the least-square sense. The message signal is reconstructed from the corrected received signal by a pseudoinverse receiver. We then improve the error localization procedure by introducing a per-symbol reliability information in the hypothesis testing procedure of the OFB syndrome decoder. The per-symbol reliability information is produced by the soft-input soft-output (SISO VLC/FLC decoders. This leads to the design of an iterative algorithm for joint decoding of an FLC and an OFB code. The performance of the algorithms developed is evaluated in a wavelet-based image coding system.
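
    Once the impulse-noise positions have been hypothesized, estimating the error amplitudes by solving the syndrome equations in the least-squares sense is a standard linear-algebra step; the sketch below isolates that step on synthetic data, with a random stand-in for the OFB syndrome former.

        import numpy as np

        # Isolated illustration of the amplitude-estimation step: given hypothesized
        # error positions, solve the syndrome equations for the error amplitudes in the
        # least-squares sense. The "syndrome matrix" here is random synthetic data, not
        # an actual oversampled-filter-bank syndrome former.
        rng = np.random.default_rng(0)

        m, n = 4, 12                         # number of syndromes, frame length
        H = rng.standard_normal((m, n))      # stand-in syndrome former

        true_positions = [3, 9]              # hypothesized impulse-noise locations
        true_amplitudes = np.array([2.5, -1.2])

        error = np.zeros(n)
        error[true_positions] = true_amplitudes
        syndrome = H @ error                 # what the receiver actually observes

        # Least-squares estimate of the amplitudes at the hypothesized positions.
        A = H[:, true_positions]
        est, *_ = np.linalg.lstsq(A, syndrome, rcond=None)
        print(est)                           # close to [2.5, -1.2]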

  9. Description of the COMRADEX code

    International Nuclear Information System (INIS)

    Spangler, G.W.; Boling, M.; Rhoades, W.A.; Willis, C.A.

    1967-01-01

    The COMRADEX Code is discussed briefly and instructions are provided for the use of the code. The subject code was developed for calculating doses from hypothetical power reactor accidents. It permits the user to analyze four successive levels of containment with time-varying leak rates. Filtration, cleanup, fallout and plateout in each containment shell can also be analyzed. The doses calculated include the direct gamma dose from the containment building, the internal doses to as many as 14 organs including the thyroid, bone, lung, etc. from inhaling the contaminated air, and the external gamma doses from the cloud. While further improvements are needed, such as a provision for calculating doses from fallout, rainout and washout, the present code capabilities have a wide range of applicability for reactor accident analysis

  10. Results from the First Validation Phase of CAP code

    International Nuclear Information System (INIS)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2010-01-01

    The second stage of the Safety Analysis Code Development for Nuclear Power Plants project was launched in April 2010 and is scheduled to run through 2012, with a scope of work covering code validation through licensing preparation. As a part of this project, CAP (Containment Analysis Package) will follow the same procedures. CAP's validation works are organized hierarchically into four validation steps using: 1) fundamental phenomena; 2) principal phenomena (mixing and transport) and components in containment; 3) demonstration tests in small, medium and large facilities and International Standard Problems; 4) comparison with other containment codes such as GOTHIC or CONTEMPT. In addition, collecting the experimental data related to containment phenomena and then constructing the database is one of the major tasks of the second stage of this project. From the validation of fundamental phenomena, the current capabilities and the needed future improvements of the CAP code are expected to be revealed. For this purpose, simple but significant problems, which have exact analytical solutions, were selected and calculated for validation of fundamental phenomena. In this paper, some results of the validation problems for the selected fundamental phenomena are summarized and discussed briefly

  11. Fast decoding of codes from algebraic plane curves

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Jensen, Helge Elbrønd

    1992-01-01

    Improvement to an earlier decoding algorithm for codes from algebraic geometry is presented. For codes from an arbitrary regular plane curve the authors correct up to d*/2 − m²/8 + m/4 − 9/8 errors, where d* is the designed distance of the code and m is the degree of the curve. The complexity of finding...

  12. Analysis of PPM-CDMA and OPPM-CDMA communication systems with new optical code

    Science.gov (United States)

    Liu, F.; Ghafouri-Shiraz, H.

    2005-11-01

    A novel type of optical spreading sequences, named the 'new-Modified Prime Code (nMPC)', is proposed for use in synchronous direct-detection optical code-division multiple-access (CDMA) systems which employ both pulse position modulation (PPM) and overlapping pulse position modulation (OPPM) schemes. The upper bounds on the bit error rate (BER) for nMPC used in PPM-CDMA systems are derived and compared with the respective systems, using a modified prime code (MPC) and a padded modified prime code (PMPC). The nMPC is further applied to the OPPM-CDMA system and the system with a proposed interference cancellation scheme. Our results show that under the same conditions the PPM-CDMA system performances are more improved with the use of nMPC than with the two other traditional codes. Moreover, they show that the system performances are significantly enhanced by the proposed interference reduction methods, if the nMPC is used in the OPPM-CDMA systems.

  13. One dimensional neutron kinetics in the TRAC-BF1 code

    International Nuclear Information System (INIS)

    Weaver, W.L. III; Wagner, K.C.

    1987-01-01

    The TRAC-BWR code development program at the Idaho National Engineering Laboratory is developing a version of the TRAC code for the U.S. Nuclear Regulatory Commission (USNRC) to provide a best-estimate analysis capability for the simulation of postulated accidents in boiling water reactor (BWR) power systems and related experimental facilities. Recent development efforts in the TRAC-BWR program have focused on improving the computational efficiency through the incorporation of a hybrid Courant-limit-violating numerical solution scheme in the one-dimensional component models and on improving code accuracy through the development of a one-dimensional neutron kinetics model. Many other improvements have been incorporated into TRAC-BWR to improve code portability, accuracy, efficiency, and maintainability. This paper will describe the one-dimensional neutron kinetics model, the generation of the required input data for this model, and present results of the first calculations using the model

  14. Efficient coding and detection of ultra-long IDs for visible light positioning systems.

    Science.gov (United States)

    Zhang, Hualong; Yang, Chuanchuan

    2018-05-14

    Visible light positioning (VLP) is a promising technique to complement Global Navigation Satellite Systems (GNSS) such as the Global Positioning System (GPS) and the BeiDou Navigation Satellite System (BDS), with the advantages of low cost and high accuracy. The situation becomes even more crucial for indoor environments, where satellite signals are weak or even unavailable. For large-scale application of VLP, there would be a considerable number of light emitting diode (LED) IDs, which raises the need for long LED ID detection. In particular, to provision indoor localization globally, a convenient way is to program a unique ID into each LED during manufacture. This poses a big challenge for image sensors, such as the CMOS camera in everybody's hands, since the long ID spans multiple frames. In this paper, we investigate the detection of ultra-long IDs using rolling shutter cameras. By analyzing the pattern of data loss in each frame, we propose a novel coding technique to improve the efficiency of LED ID detection. We study the performance of the Reed-Solomon (RS) code in this system and design a new coding method that considers the trade-off between performance and decoding complexity. The coding technique decreases the number of frames needed in data processing, significantly reduces the detection time, and improves the accuracy of detection. Numerical and experimental results show that the detected LED ID can be much longer with the coding technique. Besides, our proposed coding method is proved to achieve a performance close to that of the RS code while the decoding complexity is much lower.
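
    To make the multi-frame idea concrete, the sketch below splits a long LED ID into per-frame chunks and appends a simple XOR parity chunk so that a single lost frame can be recovered; this is a deliberately simplified stand-in for the RS-based coding studied in the paper.

        # Deliberately simplified stand-in for the paper's RS-based scheme: split a long
        # LED ID into fixed-size per-frame chunks and append one XOR parity chunk, so a
        # single lost frame (rolling-shutter data loss) can be reconstructed.

        def split_with_parity(id_bytes: bytes, chunk_size: int):
            chunks = [bytearray(id_bytes[i:i + chunk_size].ljust(chunk_size, b"\x00"))
                      for i in range(0, len(id_bytes), chunk_size)]
            parity = bytearray(chunk_size)
            for chunk in chunks:
                for j, byte in enumerate(chunk):
                    parity[j] ^= byte
            return chunks + [parity]

        def recover_missing(frames, missing_index, chunk_size):
            recovered = bytearray(chunk_size)
            for i, frame in enumerate(frames):
                if i == missing_index or frame is None:
                    continue
                for j, byte in enumerate(frame):
                    recovered[j] ^= byte
            return recovered

        led_id = b"GLOBAL-LED-ID-000123456789"
        frames = split_with_parity(led_id, chunk_size=8)
        lost = 1                                   # pretend frame 1 was not captured
        received = [f if i != lost else None for i, f in enumerate(frames)]
        print(recover_missing(received, lost, 8) == frames[lost])   # True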

  15. The impact of time step definition on code convergence and robustness

    Science.gov (United States)

    Venkateswaran, S.; Weiss, J. M.; Merkle, C. L.

    1992-01-01

    We have implemented preconditioning for multi-species reacting flows in two independent codes, an implicit (ADI) code developed in-house and the RPLUS code (developed at LeRC). The RPLUS code was modified to work on a four-stage Runge-Kutta scheme. The performance of both the codes was tested, and it was shown that preconditioning can improve convergence by a factor of two to a hundred depending on the problem. Our efforts are currently focused on evaluating the effect of chemical sources and on assessing how preconditioning may be applied to improve convergence and robustness in the calculation of reacting flows.

  16. Development and validation of an improved version of the DART code

    International Nuclear Information System (INIS)

    Taboada, H; Moscarda, M.V.; Markiewicz, M.; Estevez, E.; Rest, J.

    2002-01-01

    ANL/USDOE and CNEA Argentina have been participating within a SisterLab Program in the area of Low Enriched Uranium Advanced Fuels since October 16, 1997 under the 'Implementation Arrangement for Technical Exchange and Cooperation in the Area of Peaceful Uses of Nuclear Energy'. An annex concerning DART code optimization has been operative since February 8, 1999. Previously, as a part of this annex we developed a visual version of DART named FASTDART for silicide and U-Mo fuels that was presented at the RERTR Meeting in Las Vegas, Nevada. This paper describes several major improvements in the FASTDART code: a thermal calculation subroutine, a fuel particle size distribution subroutine and several visual interfaces for thermal output plotting and particle size input. Using the power history, coolant regime data and fuel dimensions, the new thermal subroutine is able to calculate at each time step the maximum temperature along the z-longitudinal axis as a function of plate/rod morphology (corrosion oxide, cladding, meat, aluminide particle layer, each radial shell of a central fuel particle, and particle center). Calculated temperatures at each time step are coupled to the DART calculation kernel such that swelling processes, volume phase fractions and meat thermal conductivity are calculated synergistically. The new fuel particle size-distribution subroutine is essential in order to determine the evolution of the volume fraction of reaction product. This phase degrades the heat transport by a twofold mechanism: its appearance implies a diminution of aluminium phase and its thermal conductivity is lower than those of fuel and dispersant phase. The new version includes the capability of plotting thermal data output by means of the plate/rod temperature profile at a given irradiation step, and displaying the maximum temperature evolution of each layer. A comparison between the reaction layer thickness and matrix and fuel volume fractions of several RERTR-3 experiment

  17. Code Lavender: Cultivating Intentional Acts of Kindness in Response to Stressful Work Situations.

    Science.gov (United States)

    Davidson, Judy E; Graham, Patricia; Montross-Thomas, Lori; Norcross, William; Zerbi, Giovanna

    Providing healthcare can be stressful. Gone unchecked, clinicians may experience decreased compassion, and increased burnout or secondary traumatic stress. Code Lavender is designed to increase acts of kindness after stressful workplace events occur. To test the feasibility of providing Code Lavender. After stressful events in the workplace, staff will provide, receive, and recommend Code Lavender to others. The provision of Code Lavender will improve Professional Quality of Life Scale (ProQoL) scores, general job satisfaction, and feeling cared for in the workplace. Pilot program testing and evaluation. Staff and physicians on four hospital units were informed of the Code Lavender kit availability, which includes words of comfort, chocolate, lavender essential oil, and employee health referral information. Feasibility data and ProQoL scores were collected at baseline and three months. At baseline, 48% (n = 164) reported a stressful event at work in the last three months. Post-intervention, 51% reported experiencing a stressful workplace event, with 32% receiving a Code Lavender kit from their co-workers as a result (n = 83). Of those who received the Code Lavender intervention; 100% found it helpful, and 84% would recommend it to others. No significant changes were demonstrated before and after the intervention in ProQoL scores or job satisfaction, however the emotion of feeling cared-for improved. Results warrant continuation and further dissemination of Code Lavender. Investigators have received requests to expand the program implying positive reception of the intervention. Additional interventions are needed to overcome workplace stressors. A more intense peer support program is being tested. Copyright © 2017. Published by Elsevier Inc.

  18. Benchmark studies of BOUT++ code and TPSMBI code on neutral transport during SMBI

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y.H. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); University of Science and Technology of China, Hefei 230026 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Wang, Z.H., E-mail: zhwang@swip.ac.cn [Southwestern Institute of Physics, Chengdu 610041 (China); Guo, W., E-mail: wfguo@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Ren, Q.L. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Sun, A.P.; Xu, M.; Wang, A.K. [Southwestern Institute of Physics, Chengdu 610041 (China); Xiang, N. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China)

    2017-06-09

    SMBI (supersonic molecule beam injection) plays an important role in tokamak plasma fuelling, density control and ELM mitigation in magnetic confinement plasma physics, and has been widely used in many tokamaks. The trans-neut module of the BOUT++ code is the only large-scale parallel 3D fluid code used to simulate the SMBI fueling process, while the TPSMBI (transport of supersonic molecule beam injection) code is a recently developed 1D fluid code for SMBI. In order to find a method to increase the SMBI fueling efficiency in H-mode plasmas, especially for ITER, it is important to first verify the codes. A benchmark study between the trans-neut module of the BOUT++ code and the TPSMBI code on the radial transport dynamics of neutrals during SMBI has been successfully carried out for the first time, in both slab and cylindrical coordinates. The simulation results from the trans-neut module of the BOUT++ code and the TPSMBI code agree very well with each other. Different upwind schemes have been compared for handling the sharp gradient front region during the inward propagation of SMBI, to ensure code stability. The influence of the WENO3 (weighted essentially non-oscillatory) and the third-order upwind schemes on the benchmark results has also been discussed. - Highlights: • A 1D model of SMBI has been developed. • Benchmarks of the BOUT++ and TPSMBI codes have been completed for the first time. • The influence of the WENO3 and the third-order upwind schemes on the benchmark results has also been discussed.

  19. Enhancing the shopping experience through QR codes: the perspective of the Romanian users

    Directory of Open Access Journals (Sweden)

    Irina Albăstroiu

    2015-05-01

    Full Text Available The progress registered in the field of information and communication technologies has induced significant changes in the purchasing behaviour of consumers, who nowadays want simultaneous use of online and offline channels within the same shopping experience. The QR (Quick Response) code is among the emerging technologies that retailers can implement to adapt to these changes. Through reviewing the literature, the article first clarifies the characteristics and applicability of QR codes, especially in retail, and then presents the results of an exploratory research study conducted among Romanian users regarding their attitude towards and interest in QR codes. The objective of the research was to identify the degree of usage, the willingness of Romanian consumers to use the codes in the buying process, and their perception of the functionality and usefulness of the codes. This is a pioneering work in the Romanian literature, because there are no other studies that explore issues concerning the use of QR codes in the buying process. The results of the study indicated that respondents know the applicability of QR codes, have used codes for accessing information about products and for online purchases, and consider that QR codes contribute to improving the shopping experience.

  20. Building Energy Codes: Policy Overview and Good Practices

    Energy Technology Data Exchange (ETDEWEB)

    Cox, Sadie [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-02-19

    Globally, 32% of total final energy consumption is attributed to the building sector. To reduce energy consumption, energy codes set minimum energy efficiency standards for the building sector. With effective implementation, building energy codes can support energy cost savings and complementary benefits associated with electricity reliability, air quality improvement, greenhouse gas emission reduction, increased comfort, and economic and social development. This policy brief seeks to support building code policymakers and implementers in designing effective building code programs.

  1. System Based Code: Principal Concept

    International Nuclear Information System (INIS)

    Yasuhide Asada; Masanori Tashimo; Masahiro Ueta

    2002-01-01

    This paper introduces the concept of the 'System Based Code', which was initially proposed by the authors with the intention of giving the nuclear industry a leap of progress in system reliability, performance improvement, and cost reduction. The concept of the System Based Code is intended to provide a theoretical procedure to optimize the reliability of the system by administering every related engineering requirement throughout the life of the system, from design to decommissioning. (authors)

  2. Optimal super dense coding over memory channels

    OpenAIRE

    Shadman, Zahra; Kampermann, Hermann; Macchiavello, Chiara; Bruß, Dagmar

    2011-01-01

    We study the super dense coding capacity in the presence of quantum channels with correlated noise. We investigate both the cases of unitary and non-unitary encoding. Pauli channels for arbitrary dimensions are treated explicitly. The super dense coding capacity for some special channels and resource states is derived for unitary encoding. We also provide an example of a memory channel where non-unitary encoding leads to an improvement in the super dense coding capacity.
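
    The noiseless single-pair protocol underlying these capacity results can be reproduced numerically; the sketch below encodes two classical bits by a local Pauli operation on one half of a Bell pair and decodes them by projecting onto the Bell basis (textbook protocol; no channel noise or memory is modeled).

        import numpy as np

        # Textbook superdense coding for a single noiseless qubit pair: Alice encodes
        # two classical bits by applying I, X, Z or ZX to her half of a Bell state; Bob
        # identifies the resulting Bell state. No channel noise or memory modeled here.

        I = np.eye(2)
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Z = np.array([[1, 0], [0, -1]], dtype=complex)

        bell = (np.kron(np.array([1, 0]), np.array([1, 0])) +
                np.kron(np.array([0, 1]), np.array([0, 1]))) / np.sqrt(2)   # |Phi+>

        encodings = {(0, 0): I, (0, 1): X, (1, 0): Z, (1, 1): Z @ X}

        # The four resulting Bell states form Bob's measurement basis.
        bell_basis = {bits: np.kron(op, I) @ bell for bits, op in encodings.items()}

        for bits, op in encodings.items():
            state = np.kron(op, I) @ bell                  # Alice acts on her qubit only
            probs = {b: abs(np.vdot(v, state)) ** 2 for b, v in bell_basis.items()}
            decoded = max(probs, key=probs.get)
            print(bits, "->", decoded)                     # always decoded correctly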

  3. Maximization Network Throughput Based on Improved Genetic Algorithm and Network Coding for Optical Multicast Networks

    Science.gov (United States)

    Wei, Chengying; Xiong, Cuilian; Liu, Huanlin

    2017-12-01

    A maximal multicast stream algorithm based on network coding (NC) can improve the throughput of wavelength-division multiplexing (WDM) networks, which, however, remains far below the theoretical maximum network throughput. Moreover, the existing multicast stream algorithms do not provide the information distribution pattern and the routing at the same time. In this paper, an improved genetic algorithm is proposed to maximize the optical multicast throughput by NC and to determine the multicast stream distribution through a hybrid chromosome construction for multicast with a single source and multiple destinations. The proposed hybrid chromosomes are constructed from binary chromosomes and integer chromosomes, where the binary chromosomes represent the optical multicast routing and the integer chromosomes indicate the multicast stream distribution. A fitness function is designed to guarantee that each destination can receive the maximum number of decoded multicast streams. The simulation results show that the proposed method is far superior to typical maximal multicast stream algorithms based on NC in terms of network throughput in WDM networks.
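
    A minimal sketch of the hybrid-chromosome idea follows: a binary part switches candidate links on or off for the multicast routing and an integer part assigns stream counts per destination, with a placeholder fitness. The network model, encoding details and operators are simplified assumptions, not the algorithm of the paper.

        import random

        # Minimal sketch of a hybrid chromosome for NC-based multicast optimization:
        # a binary part (candidate links switched on/off for the routing) and an integer
        # part (coded streams assigned per destination). The fitness is a placeholder;
        # the network model and operators are simplified assumptions.
        random.seed(1)

        N_LINKS, N_DESTS, MAX_STREAMS = 10, 3, 4

        def random_chromosome():
            binary = [random.randint(0, 1) for _ in range(N_LINKS)]              # routing part
            integer = [random.randint(1, MAX_STREAMS) for _ in range(N_DESTS)]   # stream part
            return binary, integer

        def fitness(chrom):
            binary, integer = chrom
            # Placeholder: reward streams decodable at every destination, penalize links used.
            return min(integer) * N_DESTS - 0.1 * sum(binary)

        def crossover(a, b):
            cut_b = random.randrange(1, N_LINKS)
            cut_i = random.randrange(1, N_DESTS)
            return (a[0][:cut_b] + b[0][cut_b:], a[1][:cut_i] + b[1][cut_i:])

        population = [random_chromosome() for _ in range(20)]
        for _ in range(50):                                  # simple generational loop
            population.sort(key=fitness, reverse=True)
            parents = population[:10]
            population = parents + [crossover(random.choice(parents), random.choice(parents))
                                    for _ in range(10)]

        print(max(population, key=fitness))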

  4. Quantifying the improvement in sepsis diagnosis, documentation, and coding: the marginal causal effect of year of hospitalization on sepsis diagnosis.

    Science.gov (United States)

    Jafarzadeh, S Reza; Thomas, Benjamin S; Marschall, Jonas; Fraser, Victoria J; Gill, Jeff; Warren, David K

    2016-01-01

    To quantify the coinciding improvement in the clinical diagnosis of sepsis, its documentation in the electronic health records, and subsequent medical coding of sepsis for billing purposes in recent years. We examined 98,267 hospitalizations in 66,208 patients who met systemic inflammatory response syndrome criteria at a tertiary care center from 2008 to 2012. We used g-computation to estimate the causal effect of the year of hospitalization on receiving an International Classification of Diseases, Ninth Revision, Clinical Modification discharge diagnosis code for sepsis by estimating changes in the probability of getting diagnosed and coded for sepsis during the study period. When adjusted for demographics, Charlson-Deyo comorbidity index, blood culture frequency per hospitalization, and intensive care unit admission, the causal risk difference for receiving a discharge code for sepsis per 100 hospitalizations with systemic inflammatory response syndrome, had the hospitalization occurred in 2012, was estimated to be 3.9% (95% confidence interval [CI], 3.8%-4.0%), 3.4% (95% CI, 3.3%-3.5%), 2.2% (95% CI, 2.1%-2.3%), and 0.9% (95% CI, 0.8%-1.1%) from 2008 to 2011, respectively. Patients with similar characteristics and risk factors had a higher probability of being diagnosed, documented, and coded for sepsis in 2012 than in previous years, which contributed to an apparent increase in sepsis incidence. Copyright © 2016 Elsevier Inc. All rights reserved.
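
    The g-computation (standardization) step described above amounts to fitting an outcome model, setting the year of hospitalization to a fixed value for everyone, and averaging the predicted probabilities; the sketch below shows that mechanic on synthetic data with invented variable names.

        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        # G-computation (standardization) mechanic on synthetic data: fit an outcome
        # model for "coded for sepsis", predict for the whole cohort with the year of
        # hospitalization set to 2012 vs. an earlier year, and take the difference of
        # the averaged predicted probabilities. Variable names and data are made up.
        rng = np.random.default_rng(42)
        n = 5000
        df = pd.DataFrame({
            "year": rng.integers(2008, 2013, n),
            "age": rng.normal(60, 15, n),
            "icu": rng.integers(0, 2, n),
        })
        logit = -3 + 0.3 * (df["year"] - 2008) + 0.01 * df["age"] + 0.5 * df["icu"]
        df["sepsis_code"] = rng.random(n) < 1 / (1 + np.exp(-logit))

        model = LogisticRegression(max_iter=1000).fit(
            df[["year", "age", "icu"]], df["sepsis_code"])

        def standardized_risk(year):
            counterfactual = df[["year", "age", "icu"]].copy()
            counterfactual["year"] = year          # set year for everyone
            return model.predict_proba(counterfactual)[:, 1].mean()

        risk_diff = standardized_risk(2012) - standardized_risk(2008)
        print(f"risk difference per 100 hospitalizations: {100 * risk_diff:.1f}")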

  5. Sudan-decoding generalized geometric Goppa codes

    DEFF Research Database (Denmark)

    Heydtmann, Agnes Eileen

    2003-01-01

    Generalized geometric Goppa codes are vector spaces of n-tuples with entries from different extension fields of a ground field. They are derived from evaluating functions similar to conventional geometric Goppa codes, but allowing evaluation in places of arbitrary degree. A decoding scheme for these codes based on Sudan's improved algorithm is presented and its error-correcting capacity is analyzed. For the implementation of the algorithm it is necessary that the so-called increasing zero bases of certain spaces of functions are available. A method to obtain such bases is developed.

  6. Basic concept of common reactor physics code systems. Final report of working party on common reactor physics code systems (CCS)

    International Nuclear Information System (INIS)

    2004-03-01

    A working party on common reactor physics code systems was organized for two years (2001-2002) under the Research Committee on Reactor Physics of JAERI. This final report is a compilation of the activities of the working party on common reactor physics code systems during those two years. The objective of the working party is to clarify the basic concept of common reactor physics code systems in order to improve the convenience of reactor physics code systems for reactor physics researchers in Japan in their various fields of research and development activities. We held four meetings during the two years, investigated the status of reactor physics code systems and innovative software technologies, and discussed the basic concept of common reactor physics code systems. (author)

  7. The RAGE radiation-hydrodynamic code

    Energy Technology Data Exchange (ETDEWEB)

    Gittings, Michael; Clover, Michael; Betlach, Thomas; Byrne, Nelson; Ranta, Dale [Science Applications International Corp. MS A-1, 10260 Campus Point Drive, San Diego, CA 92121 (United States); Weaver, Robert; Coker, Robert; Dendy, Edward; Hueckstaedt, Robert; New, Kim; Oakes, W Rob [Los Alamos National Laboratory, MS T087, PO Box 1663, Los Alamos, NM 87545 (United States); Stefan, Ryan [TaylorMade-adidas Golf, 5545 Fermi Court, Carlsbad, CA 92008-7324 (United States)], E-mail: michael.r.clover@saic.com

    2008-10-01

    We describe RAGE, the 'radiation adaptive grid Eulerian' radiation-hydrodynamics code, including its data structures, its parallelization strategy and performance, its hydrodynamic algorithm(s), its (gray) radiation diffusion algorithm, and some of the considerable amount of verification and validation efforts. The hydrodynamics is a basic Godunov solver, to which we have made significant improvements to increase the advection algorithm's robustness and to converge stiffnesses in the equation of state. Similarly, the radiation transport is a basic gray diffusion, but our treatment of the radiation-material coupling, wherein we converge nonlinearities in a novel manner to allow larger timesteps and more robust behavior, can be applied to any multi-group transport algorithm.

  8. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  9. Diabetes Mellitus Coding Training for Family Practice Residents.

    Science.gov (United States)

    Urse, Geraldine N

    2015-07-01

    Although physicians regularly use numeric coding systems such as the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to describe patient encounters, coding errors are common. One of the most complicated diagnoses to code is diabetes mellitus. The ICD-9-CM currently has 39 separate codes for diabetes mellitus; this number will be expanded to more than 50 with the introduction of ICD-10-CM in October 2015. To assess the effect of a 1-hour focused presentation on ICD-9-CM codes on diabetes mellitus coding. A 1-hour focused lecture on the correct use of diabetes mellitus codes for patient visits was presented to family practice residents at Doctors Hospital Family Practice in Columbus, Ohio. To assess resident knowledge of the topic, a pretest and posttest were given to residents before and after the lecture, respectively. Medical records of all patients with diabetes mellitus who were cared for at the hospital 6 weeks before and 6 weeks after the lecture were reviewed and compared for the use of diabetes mellitus ICD-9 codes. Eighteen residents attended the lecture and completed the pretest and posttest. The mean (SD) percentage of correct answers was 72.8% (17.1%) for the pretest and 84.4% (14.6%) for the posttest, for an improvement of 11.6 percentage points (P≤.035). The percentage of total available codes used did not substantially change from before to after the lecture, but the use of the generic ICD-9-CM code for diabetes mellitus type II controlled (250.00) declined (58 of 176 [33%] to 102 of 393 [26%]) and the use of other codes increased, indicating a greater variety in codes used after the focused lecture. After a focused lecture on diabetes mellitus coding, resident coding knowledge improved. Review of medical record data did not reveal an overall change in the number of diabetic codes used after the lecture but did reveal a greater variety in the codes used.

  10. Prognostic significance of overexpressed long non-coding RNA TUG1 in patients with clear cell renal cell carcinoma.

    Science.gov (United States)

    Wang, P-Q; Wu, Y-X; Zhong, X-D; Liu, B; Qiao, G

    2017-01-01

    The study of long non-coding RNAs (lncRNAs) has gradually become one of the hot topics in the field of RNA biology. However, little is known about the pathological role of lncRNA TUG1 in clear cell renal cell carcinoma (ccRCC) patients. This study attempted to investigate the association of lncRNA TUG1 expression with progression and prognosis in ccRCC patients. Using qRT-PCR, the expression of TUG1 was measured in 203 ccRCC tissues and 45 adjacent non-cancerous tissues. Then, the relationships between TUG1 level and the clinicopathological factors of patients with ccRCC were analyzed. The prognostic significance was evaluated using Kaplan-Meier and Cox regression analyses. The relative level of TUG1 was significantly higher in ccRCC tissues than in the adjacent non-tumor tissues. TUG1 expression was associated significantly with histological grade, tumor stage, lymph node metastasis and distant metastasis, and higher TUG1 expression levels were associated with a shorter overall survival. Cox regression analysis showed that TUG1 expression was an independent prognostic marker of poor outcome. These findings suggested that TUG1 may act as a tumor promoter in ccRCC and could serve as a potential therapeutic target for this tumor.

  11. Algorithms and computer codes for atomic and molecular quantum scattering theory

    International Nuclear Information System (INIS)

    Thomas, L.

    1979-01-01

    This workshop has succeeded in bringing up 11 different coupled equation codes on the NRCC computer, testing them against a set of 24 different test problems and making them available to the user community. These codes span a wide variety of methodologies, and factors of up to 300 were observed in the spread of computer times on specific problems. A very effective method was devised for examining the performance of the individual codes in the different regions of the integration range. Many of the strengths and weaknesses of the codes have been identified. Based on these observations, a hybrid code has been developed which is significantly superior to any single code tested. Thus, not only have the original goals been fully met, the workshop has resulted directly in an advancement of the field. All of the computer programs except VIVS are available upon request from the NRCC. Since an improved version of VIVS is contained in the hybrid program, VIVAS, it was not made available for distribution. The individual program LOGD is, however, available. In addition, programs which compute the potential energy matrices of the test problems are also available. The software library names for Tests 1, 2 and 4 are HEH2, LICO, and EN2, respectively

  12. Quality assurance procedures for the CONTAIN severe reactor accident computer code

    International Nuclear Information System (INIS)

    Russell, N.A.; Washington, K.E.; Bergeron, K.D.; Murata, K.K.; Carroll, D.E.; Harris, C.L.

    1991-01-01

    The CONTAIN quality assurance program follows a strict set of procedures designed to ensure the integrity of the code, to avoid errors in the code, and to prolong the life of the code. The code itself is maintained under a code-configuration control system that provides a historical record of changes. All changes are incorporated using an update processor that allows separate identification of improvements made to each successive code version. Code modifications and improvements are formally reviewed and checked. An exhaustive, multilevel test program validates the theory and implementation of all code changes through assessment calculations that compare the code-predicted results to standard handbooks of idealized test cases. A document trail and archive establish the problems solved by the software, the verification and validation of the software, software changes and subsequent reverification and revalidation, and the tracking of software problems and actions taken to resolve those problems. This document describes in detail the CONTAIN quality assurance procedures. 4 refs., 21 figs., 4 tabs

  13. Coded aperture optimization using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Martineau, A.; Rocchisani, J.M.; Moretti, J.L.

    2010-01-01

    Coded apertures using Uniformly Redundant Arrays (URA) have been unsuccessfully evaluated for two-dimensional and three-dimensional imaging in Nuclear Medicine. The images reconstructed from coded projections contain artifacts and suffer from poor spatial resolution in the longitudinal direction. We introduce a Maximum-Likelihood Expectation-Maximization (MLEM) algorithm for three-dimensional coded aperture imaging which uses a projection matrix calculated by Monte Carlo simulations. The aim of the algorithm is to reduce artifacts and improve the three-dimensional spatial resolution in the reconstructed images. Firstly, we present the validation of GATE (Geant4 Application for Emission Tomography) for Monte Carlo simulations of a coded mask installed on a clinical gamma camera. The coded mask modelling was validated by comparison between experimental and simulated data in terms of energy spectra, sensitivity and spatial resolution. In the second part of the study, we use the validated model to calculate the projection matrix with Monte Carlo simulations. A three-dimensional thyroid phantom study was performed to compare the performance of the three-dimensional MLEM reconstruction with the conventional correlation method. The results indicate that the artifacts are reduced and three-dimensional spatial resolution is improved with the Monte Carlo-based MLEM reconstruction.
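
    The core of the reconstruction described above is the standard MLEM update; the sketch below shows that update with a generic (dense, toy) system matrix, whereas the study obtains the projection matrix for the coded aperture from Monte Carlo simulations.

```python
# Generic MLEM update sketch (the system matrix A here is a toy stand-in;
# in the study it would be the Monte Carlo-computed projection matrix).
import numpy as np

def mlem(A, y, n_iter=20):
    """A: (n_measurements, n_voxels) system matrix; y: measured projections."""
    x = np.ones(A.shape[1])                       # uniform starting image
    sensitivity = A.sum(axis=0)                   # per-voxel sum of a_ij
    for _ in range(n_iter):
        forward = A.dot(x)                        # expected projections
        ratio = y / np.maximum(forward, 1e-12)    # measured / expected
        x *= A.T.dot(ratio) / np.maximum(sensitivity, 1e-12)
    return x

rng = np.random.default_rng(0)
A = rng.random((64, 16))                          # toy system matrix
x_true = rng.random(16)
y = A.dot(x_true)                                 # noiseless projections for the example
x_est = mlem(A, y)
```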

  14. Fluid-structure-interaction analyses of reactor vessel using improved hybrid Lagrangian Eulerian code ALICE-II

    Energy Technology Data Exchange (ETDEWEB)

    Wang, C.Y.

    1993-06-01

    This paper describes fluid-structure-interaction and structure response analyses of a reactor vessel subjected to loadings associated with postulated accidents, using the hybrid Lagrangian-Eulerian code ALICE-II. This code has been improved recently to accommodate many features associated with innovative designs of reactor vessels. Calculational capabilities have been developed to treat water in the reactor cavity outside the vessel, internal shield structures and internal thin shells. The objective of the present analyses is to study the cover response and potential for missile generation in response to a fuel-coolant interaction in the core region. Three calculations were performed using the cover weight as a parameter. To study the effect of the cavity water, vessel response calculations for both wet- and dry-cavity designs are compared. Results indicate that for all cases studied and for the design parameters assumed, the calculated cover displacements are all smaller than the bolts' ultimate displacement and no missile generation of the closure head is predicted. Also, solutions reveal that the cavity water of the wet-cavity design plays an important role of restraining the downward displacement of the bottom head. Based on these studies, the analyses predict that the structure integrity is maintained throughout the postulated accident for the wet-cavity design.

  15. Fluid-structure-interaction analyses of reactor vessel using improved hybrid Lagrangian Eulerian code ALICE-II

    Energy Technology Data Exchange (ETDEWEB)

    Wang, C.Y.

    1993-01-01

    This paper describes fluid-structure-interaction and structure response analyses of a reactor vessel subjected to loadings associated with postulated accidents, using the hybrid Lagrangian-Eulerian code ALICE-II. This code has been improved recently to accommodate many features associated with innovative designs of reactor vessels. Calculational capabilities have been developed to treat water in the reactor cavity outside the vessel, internal shield structures and internal thin shells. The objective of the present analyses is to study the cover response and potential for missile generation in response to a fuel-coolant interaction in the core region. Three calculations were performed using the cover weight as a parameter. To study the effect of the cavity water, vessel response calculations for both wet- and dry-cavity designs are compared. Results indicate that for all cases studied and for the design parameters assumed, the calculated cover displacements are all smaller than the bolts' ultimate displacement and no missile generation of the closure head is predicted. Also, solutions reveal that the cavity water of the wet-cavity design plays an important role of restraining the downward displacement of the bottom head. Based on these studies, the analyses predict that the structure integrity is maintained throughout the postulated accident for the wet-cavity design.

  16. Simplified modeling and code usage in the PASC-3 code system by the introduction of a programming environment

    International Nuclear Information System (INIS)

    Pijlgroms, B.J.; Oppe, J.; Oudshoorn, H.L.; Slobben, J.

    1991-06-01

    A brief description is given of the PASC-3 (Petten-AMPX-SCALE) Reactor Physics code system and associated UNIPASC work environment. The PASC-3 code system is used for criticality and reactor calculations and consists of a selection from the Oak Ridge National Laboratory AMPX-SCALE-3 code collection complemented with a number of additional codes and nuclear data bases. The original codes have been adapted to run under the UNIX operating system. The recommended nuclear data base is a complete 219 group cross section library derived from JEF-1 of which some benchmark results are presented. By the addition of the UNIPASC work environment the usage of the code system is greatly simplified. Complex chains of programs can easily be coupled together to form a single job. In addition, the model parameters can be represented by variables instead of literal values which enhances the readability and may improve the integrity of the code inputs. (author). 8 refs.; 6 figs.; 1 tab

  17. Challenges to code status discussions for pediatric patients.

    Directory of Open Access Journals (Sweden)

    Katherine E Kruse

    Full Text Available In the context of serious or life-limiting illness, pediatric patients and their families are faced with difficult decisions surrounding appropriate resuscitation efforts in the event of a cardiopulmonary arrest. Code status orders are one way to inform end-of-life medical decision making. The objectives of this study are to evaluate the extent to which pediatric providers have knowledge of code status options and explore the association of provider role with (1) knowledge of code status options, (2) perception of timing of code status discussions, (3) perception of family receptivity to code status discussions, and (4) comfort carrying out code status discussions. Nurses, trainees (residents and fellows), and attending physicians from pediatric units where code status discussions typically occur completed a short survey questionnaire regarding their knowledge of code status options and perceptions surrounding code status discussions. Single center, quaternary care children's hospital. 203 nurses, 31 trainees, and 29 attending physicians in 4 high-acuity pediatric units responded to the survey (N = 263, 90% response rate). Based on an objective knowledge measure, providers demonstrate poor understanding of available code status options, with only 22% of providers able to enumerate more than two of four available code status options. In contrast, provider groups self-report high levels of familiarity with available code status options, with attending physicians reporting significantly higher levels than nurses and trainees (p = 0.0125). Nurses and attending physicians show significantly different perception of code status discussion timing, with majority of nurses (63.4%) perceiving discussions as occurring "too late" or "much too late" and majority of attending physicians (55.6%) perceiving the timing as "about right" (p<0.0001). Attending physicians report significantly higher comfort having code status discussions with families than do nurses or trainees

  18. Parity-Check Network Coding for Multiple Access Relay Channel in Wireless Sensor Cooperative Communications

    Directory of Open Access Journals (Sweden)

    Du Bing

    2010-01-01

    Full Text Available A recently developed theory suggests that network coding is a generalization of source coding and channel coding and thus yields a significant performance improvement in terms of throughput and spatial diversity. This paper proposes a cooperative design of a parity-check network coding scheme in the context of a two-source multiple access relay channel (MARC) model, a common compact model in hierarchical wireless sensor networks (WSNs). The scheme uses Low-Density Parity-Check (LDPC) as the surrogate to build up a layered structure which encapsulates the multiple constituent LDPC codes in the source and relay nodes. Specifically, the relay node decodes the messages from two sources, which are used to generate extra parity-check bits by a random network coding procedure to fill up the rate gap between Source-Relay and Source-Destination transmissions. Then, we derived the key algebraic relationships among multidimensional LDPC constituent codes as one of the constraints for code profile optimization. These extra check bits are sent to the destination to realize a cooperative diversity as well as to approach MARC decode-and-forward (DF) capacity.
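
    The essential network coding idea behind the relay's extra parity bits can be illustrated with a plain XOR example (this is only the underlying principle, not the paper's LDPC-based construction): the destination can recover either source message from the other message plus the relay's parity packet.

```python
# Minimal XOR network-coding illustration for a two-source relay channel.
import numpy as np

rng = np.random.default_rng(1)
m1 = rng.integers(0, 2, 16)     # bits from source 1
m2 = rng.integers(0, 2, 16)     # bits from source 2
parity = m1 ^ m2                # relay decodes both messages and forwards their XOR

# Destination received m2 and the relay parity but lost m1:
m1_recovered = m2 ^ parity
assert np.array_equal(m1_recovered, m1)
```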

  19. Improvement of genome assembly completeness and identification of novel full-length protein-coding genes by RNA-seq in the giant panda genome.

    Science.gov (United States)

    Chen, Meili; Hu, Yibo; Liu, Jingxing; Wu, Qi; Zhang, Chenglin; Yu, Jun; Xiao, Jingfa; Wei, Fuwen; Wu, Jiayan

    2015-12-11

    High-quality and complete gene models are the basis of whole genome analyses. The giant panda (Ailuropoda melanoleuca) genome was the first genome sequenced on the basis of solely short reads, but the genome annotation had lacked the support of transcriptomic evidence. In this study, we applied RNA-seq to globally improve the genome assembly completeness and to detect novel expressed transcripts in 12 tissues from giant pandas, by using a transcriptome reconstruction strategy that combined reference-based and de novo methods. Several aspects of genome assembly completeness in the transcribed regions were effectively improved by the de novo assembled transcripts, including genome scaffolding, the detection of small-size assembly errors, the extension of scaffold/contig boundaries, and gap closure. Through expression and homology validation, we detected three groups of novel full-length protein-coding genes. A total of 12.62% of the novel protein-coding genes were validated by proteomic data. GO annotation analysis showed that some of the novel protein-coding genes were involved in pigmentation, anatomical structure formation and reproduction, which might be related to the development and evolution of the black-white pelage, pseudo-thumb and delayed embryonic implantation of giant pandas. The updated genome annotation will help further giant panda studies from both structural and functional perspectives.

  20. Wavelength-Hopping Time-Spreading Optical CDMA With Bipolar Codes

    Science.gov (United States)

    Kwong, Wing C.; Yang, Guu-Chang; Chang, Cheng-Yuan

    2005-01-01

    Two-dimensional wavelength-hopping time-spreading coding schemes have been studied recently for supporting greater numbers of subscribers and simultaneous users than conventional one-dimensional approaches in optical code-division multiple-access (OCDMA) systems. To further improve both numbers without sacrificing performance, a new code design utilizing bipolar codes for both wavelength hopping and time spreading is studied and analyzed in this paper. A rapidly programmable, integratable hardware design for this new coding scheme, based on arrayed-waveguide gratings, is also discussed.

  1. Optical network security using unipolar Walsh code

    Science.gov (United States)

    Sikder, Somali; Sarkar, Madhumita; Ghosh, Shila

    2018-04-01

    Optical code-division multiple-access (OCDMA) is considered a good technique to provide optical layer security. Many research works have been published to enhance optical network security by using optical signal processing. The paper demonstrates the design of an AWG (arrayed waveguide grating) router-based optical network for spectral-amplitude-coding (SAC) OCDMA networks with Walsh codes, producing a reconfigurable network codec that changes signature codes to guard against eavesdropping. We propose a code reconfiguration scheme that improves network access confidentiality for an OCDMA system by changing the signature codes through cyclic rotations. Each OCDMA network user is assigned a unique signature code to transmit the information, and at the receiving end each receiver correlates its own signature pattern a(n) with the received pattern s(n). The signal arriving at the proper destination satisfies s(n)=a(n).
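
    A rough sketch of the ingredients mentioned above (unipolar Walsh signatures, correlation of the received pattern against the receiver's own signature, and cyclic rotation as the reconfiguration step) is given below under idealized, noise-free, single-user assumptions.

```python
# Sketch under simplifying assumptions: unipolar Walsh codes from a Sylvester
# Hadamard matrix, ideal signature matching, and cyclic-rotation reconfiguration.
import numpy as np

H = np.array([[1]])
for _ in range(3):                                  # build an 8 x 8 Hadamard matrix
    H = np.kron(H, np.array([[1, 1], [1, -1]]))
walsh_unipolar = ((H + 1) // 2).astype(int)         # map -1 -> 0: unipolar codes

signature = walsh_unipolar[3]                       # a(n), this receiver's signature
received = signature.copy()                         # ideal channel: s(n) = a(n)
correlation = int(np.dot(received, signature))      # peaks at the code weight
is_intended_destination = np.array_equal(received, signature)

# Reconfiguration against eavesdropping: cyclically rotate all signatures.
reconfigured_codes = np.roll(walsh_unipolar, shift=1, axis=1)
```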

  2. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and absence of bar codes at the primary-level packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems
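
    One small, standard piece of GS1 bar coding that lends itself to a concrete example (not taken from the article) is the mod-10 check digit carried by GTIN identifiers:

```python
# GS1 mod-10 check digit for GTIN identifiers (e.g. the last digit of a GTIN-13).
def gtin_check_digit(body: str) -> int:
    """body: all digits of the GTIN except the final check digit."""
    total = 0
    for i, ch in enumerate(reversed(body)):
        weight = 3 if i % 2 == 0 else 1      # weights 3,1,3,1,... from the right
        total += int(ch) * weight
    return (10 - total % 10) % 10

assert gtin_check_digit("629104150021") == 3  # full GTIN-13: 6291041500213
```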

  3. Variable weight Khazani-Syed code using hybrid fixed-dynamic technique for optical code division multiple access system

    Science.gov (United States)

    Anas, Siti Barirah Ahmad; Seyedzadeh, Saleh; Mokhtar, Makhfudzah; Sahbudin, Ratna Kalos Zakiah

    2016-10-01

    Future Internet consists of a wide spectrum of applications with different bit rates and quality of service (QoS) requirements. Prioritizing the services is essential to ensure that the delivery of information is at its best. Existing technologies have demonstrated how service differentiation techniques can be implemented in optical networks using data link and network layer operations. However, a physical layer approach can further improve system performance at a prescribed received signal quality by applying control at the bit level. This paper proposes a coding algorithm to support optical domain service differentiation using spectral amplitude coding techniques within an optical code division multiple access (OCDMA) scenario. A particular user or service has a varying weight applied to obtain the desired signal quality. The properties of the new code are compared with other OCDMA codes proposed for service differentiation. In addition, a mathematical model is developed for performance evaluation of the proposed code using two different detection techniques, namely direct decoding and complementary subtraction.

  4. Status of the ASTEC integral code

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Jacq, F.; Allelein, H.J.

    2000-01-01

    The ASTEC (Accident Source Term Evaluation Code) integrated code has been developed since 1997 in close collaboration between IPSN and GRS to predict an entire LWR severe accident sequence from the initiating event up to Fission Product (FP) release out of the containment. The applications of such a code are source term determination studies, scenario evaluations, accident management studies and Probabilistic Safety Assessment level 2 (PSA-2) studies. The version V0 of ASTEC is based on the RCS modules of the ESCADRE integrated code (IPSN) and on the upgraded RALOC and FIPLOC codes (GRS) for containment thermalhydraulics and aerosol behaviour. The latest version V0.2 includes the general feed-back from the overall validation performed in 1998 (25 separate-effect experiments, PHEBUS.FP FPT1 integrated experiment), some modelling improvements (i.e. silver-iodine reactions in the containment sump), and the implementation of the main safety systems for Severe Accident Management. Several reactor applications are under way on French and German PWRs, and on VVER-1000, all with a multi-compartment configuration of the containment. The total IPSN-GRS manpower involved in the ASTEC project is today about 20 person-years per year. The main evolution of the next version V1, foreseen for the end of 2001, concerns the integration of the front-end phase and the improvement of the in-vessel degradation late-phase modelling. (author)

  5. Interactive Video Coding and Transmission over Heterogeneous Wired-to-Wireless IP Networks Using an Edge Proxy

    Directory of Open Access Journals (Sweden)

    Modestino James W

    2004-01-01

    Full Text Available Digital video delivered over wired-to-wireless networks is expected to suffer quality degradation from both packet loss and bit errors in the payload. In this paper, the quality degradation due to packet loss and bit errors in the payload is quantitatively evaluated and its effects are assessed. We propose the use of a concatenated forward error correction (FEC) coding scheme employing Reed-Solomon (RS) codes and rate-compatible punctured convolutional (RCPC) codes to protect the video data from packet loss and bit errors, respectively. Furthermore, the performance of a joint source-channel coding (JSCC) approach employing this concatenated FEC coding scheme for video transmission is studied. Finally, we describe an improved end-to-end architecture using an edge proxy in a mobile support station to implement differential error protection for the corresponding channel impairments expected on the two networks. Results indicate that with an appropriate JSCC approach and the use of an edge proxy, FEC-based error-control techniques together with passive error-recovery techniques can significantly improve the effective video throughput and lead to acceptable video delivery quality over time-varying heterogeneous wired-to-wireless IP networks.

  6. Scaling gysela code beyond 32K-cores on bluegene/Q***

    Directory of Open Access Journals (Sweden)

    Bigot J.

    2013-12-01

    Full Text Available Gyrokinetic simulations lead to huge computational needs. Up to now, the semi-Lagrangian code Gysela performed large simulations using a few thousand cores (8k cores typically). Simulations with finer resolutions and with kinetic electrons are expected to increase those needs by a huge factor, providing a good example of applications requiring Exascale machines. This paper presents our work to improve Gysela in order to target an architecture that presents one possible way towards Exascale: the Blue Gene/Q. After analyzing the limitations of the code on this architecture, we have implemented three kinds of improvement: computational performance improvements, memory consumption improvements and disk I/O improvements. As a result, we show that the code now scales beyond 32k cores with much improved performance. This will make it possible to target the most powerful machines available and thus handle much larger physical cases.

  7. The Nudo, Rollo, Melon codes and nodal correlations

    International Nuclear Information System (INIS)

    Perlado, J.M.; Aragones, J.M.; Minguez, E.; Pena, J.

    1975-01-01

    Analysis of nodal calculations and checking of the results against reference reactor experimental data. Description of the Nudo code, which adapts experimental data to nodal calculations. The Rollo and Melon codes as improvements in the cycle-life calculation of albedos, mixing parameters and nodal correlations. (author)

  8. On fuzzy semantic similarity measure for DNA coding.

    Science.gov (United States)

    Ahmad, Muneer; Jung, Low Tang; Bhuiyan, Md Al-Amin

    2016-02-01

    A coding measure scheme numerically translates the DNA sequence to a time domain signal for protein coding regions identification. A number of coding measure schemes based on numerology, geometry, fixed mapping, statistical characteristics and chemical attributes of nucleotides have been proposed in recent decades. Such coding measure schemes lack the biologically meaningful aspects of nucleotide data and hence do not significantly discriminate coding regions from non-coding regions. This paper presents a novel fuzzy semantic similarity measure (FSSM) coding scheme centering on FSSM codons' clustering and genetic code context of nucleotides. Certain natural characteristics of nucleotides i.e. appearance as a unique combination of triplets, preserving special structure and occurrence, and ability to own and share density distributions in codons have been exploited in FSSM. The nucleotides' fuzzy behaviors, semantic similarities and defuzzification based on the center of gravity of nucleotides revealed a strong correlation between nucleotides in codons. The proposed FSSM coding scheme attains a significant enhancement in coding regions identification i.e. 36-133% as compared to other existing coding measure schemes tested over more than 250 benchmarked and randomly taken DNA datasets of different organisms. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Validation of physics and thermalhydraulic computer codes for advanced Candu reactor applications

    International Nuclear Information System (INIS)

    Wren, D.J.; Popov, N.; Snell, V.G.

    2004-01-01

    Atomic Energy of Canada Ltd. (AECL) is developing an Advanced Candu Reactor (ACR) that is an evolutionary advancement of the currently operating Candu 6 reactors. The ACR is being designed to produce electrical power for a capital cost and at a unit-energy cost significantly less than that of the current reactor designs. The ACR retains the modular Candu concept of horizontal fuel channels surrounded by a heavy water moderator. However, ACR uses slightly enriched uranium fuel compared to the natural uranium used in Candu 6. This achieves the twin goals of improved economics (via large reductions in the heavy water moderator volume and replacement of the heavy water coolant with light water coolant) and improved safety. AECL has developed and implemented a software quality assurance program to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. Since the basic design of the ACR is equivalent to that of the Candu 6, most of the key phenomena associated with the safety analyses of ACR are common, and the Candu industry standard tool-set of safety analysis codes can be applied to the analysis of the ACR. A systematic assessment of computer code applicability addressing the unique features of the ACR design was performed covering the important aspects of the computer code structure, models, constitutive correlations, and validation database. Arising from this assessment, limited additional requirements for code modifications and extensions to the validation databases have been identified. This paper provides an outline of the AECL software quality assurance program process for the validation of computer codes used to perform physics and thermal-hydraulics safety analyses of the ACR. It describes the additional validation work that has been identified for these codes and the planned, and ongoing, experimental programs to extend the code validation as required to address specific ACR design

  10. Models and applications of the UEDGE code

    International Nuclear Information System (INIS)

    Rensink, M.E.; Knoll, D.A.; Porter, G.D.; Rognlien, T.D.; Smith, G.R.; Wising, F.

    1996-09-01

    The transport of particles and energy from the core of a tokamak to nearby material surfaces is an important problem for understanding present experiments and for designing reactor-grade devices. A number of fluid transport codes have been developed to model the plasma in the edge and scrape-off layer (SOL) regions. This report will focus on recent model improvements and illustrative results from the UEDGE code. Some geometric and mesh considerations are introduced, followed by a general description of the plasma and neutral fluid models. A few comments on computational issues are given and then two important applications are illustrated concerning benchmarking and the ITER radiative divertor. Finally, we report on some recent work to improve the models in UEDGE by coupling to a Monte Carlo neutrals code and by utilizing an adaptive grid

  11. Threshold Multi Split-Row algorithm for decoding irregular LDPC codes

    Directory of Open Access Journals (Sweden)

    Chakir Aqil

    2017-12-01

    Full Text Available In this work, we propose a new threshold multi split-row algorithm in order to improve the multi split-row algorithm for decoding irregular LDPC codes. We give a complete description of our algorithm as well as its advantages for LDPC codes. Simulation results over an additive white Gaussian noise channel show an improvement in error performance of between 0.4 dB and 0.6 dB compared to the multi split-row algorithm.
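
    For context, the check-node update that Split-Row-type decoders approximate (by partitioning each check row and exchanging only limited sign/minimum information between the partitions) is the standard min-sum rule, sketched below; this is background only, not the proposed threshold algorithm itself.

```python
# Standard min-sum check-node update for one check node (background sketch).
import numpy as np

def min_sum_check_update(msgs):
    """msgs: variable-to-check LLR messages on one check row.
    Returns the extrinsic check-to-variable message for each bit."""
    msgs = np.asarray(msgs, dtype=float)
    signs = np.sign(msgs)
    sign_prod = np.prod(signs)
    mags = np.abs(msgs)
    out = np.empty_like(msgs)
    for k in range(len(msgs)):
        # product of the other messages' signs times the minimum of their magnitudes
        out[k] = sign_prod * signs[k] * np.delete(mags, k).min()
    return out

print(min_sum_check_update([1.2, -0.4, 2.5, -3.0]))   # -> [ 0.4 -1.2  0.4 -0.4]
```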

  12. Applicability of Coupled Thermalhydraulic Codes for Safety Analysis of Nuclear Reactors

    International Nuclear Information System (INIS)

    Gairola, A.; Bhowmik, P. K.; Shamim, J. A.; Suh, K. Y.

    2014-01-01

    To this end, computational codes like RELAP and TRACE are used to model the thermal-hydraulic response of a nuclear power plant during an accident. With careful modeling and significant user experience these system codes are able to simulate the behavior of the primary system and the containment to a reasonable extent. A decoupled simulation is comparatively simple but might not capture the reality and the physics involved in an accurate manner. Thus simulation using two different system codes is interesting, as the whole system is coupled through the pressure in the containment and the flow through the break. Using this methodology it might be possible to get new insight into the primary and containment behavior by precise simulation of the accident both in current reactors and in future Gen-III/III+ reactors. Coupled thermalhydraulic code methodology is still new and requires further investigation. Applicability of such methodology to GEN-II plants has met with limited success; however, a number of situations in which this methodology could be applied are still unexplored, which provides room for improvement and modification.

  13. Applicability of Coupled Thermalhydraulic Codes for Safety Analysis of Nuclear Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Gairola, A.; Bhowmik, P. K.; Shamim, J. A.; Suh, K. Y. [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    To this end, computational codes like RELAP and TRACE are used to model the thermal-hydraulic response of a nuclear power plant during an accident. With careful modeling and significant user experience these system codes are able to simulate the behavior of the primary system and the containment to a reasonable extent. A decoupled simulation is comparatively simple but might not capture the reality and the physics involved in an accurate manner. Thus simulation using two different system codes is interesting, as the whole system is coupled through the pressure in the containment and the flow through the break. Using this methodology it might be possible to get new insight into the primary and containment behavior by precise simulation of the accident both in current reactors and in future Gen-III/III+ reactors. Coupled thermalhydraulic code methodology is still new and requires further investigation. Applicability of such methodology to GEN-II plants has met with limited success; however, a number of situations in which this methodology could be applied are still unexplored, which provides room for improvement and modification.

  14. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes.

  15. Performance of the improved version of Monte Carlo Code A3MCNP for cask shielding design

    International Nuclear Information System (INIS)

    Hasegawa, T.; Ueki, K.; Sato, O.; Sjoden, G.E.; Miyake, Y.; Ohmura, M.; Haghighat, A.

    2004-01-01

    A3MCNP (Automatic Adjoint Accelerated MCNP) is a revised version of the MCNP Monte Carlo code that automatically prepares variance reduction parameters for the CADIS (Consistent Adjoint Driven Importance Sampling) methodology. Using a deterministic 'importance' (or adjoint) function, CADIS performs source and transport biasing within the weight-window technique. The current version of A3MCNP uses the 3-D Sn transport TORT code to determine a 3-D importance function distribution. Based on simulation of several real-life problems, it is demonstrated that A3MCNP provides precise calculation results with a remarkably short computation time by using the proper and objective variance reduction parameters. However, since the first version of A3MCNP provided only a point source configuration option for large-scale shielding problems, such as spent-fuel transport casks, a large amount of memory may be necessary to store enough points to properly represent the source. Hence, we have developed an improved version of A3MCNP (referred to as A3MCNPV) which has a volumetric source configuration option. This paper describes the successful use of A3MCNPV for cask neutron and gamma-ray shielding problems.

  16. On the decoding process in ternary error-correcting output codes.

    Science.gov (United States)

    Escalera, Sergio; Pujol, Oriol; Radeva, Petia

    2010-01-01

    A common way to model multiclass classification problems is to design a set of binary classifiers and to combine them. Error-Correcting Output Codes (ECOC) represent a successful framework to deal with these types of problems. Recent works in the ECOC framework showed significant performance improvements by means of new problem-dependent designs based on the ternary ECOC framework. The ternary framework contains a larger set of binary problems because of the use of a "do not care" symbol that allows us to ignore some classes by a given classifier. However, there are no proper studies that analyze the effect of the new symbol at the decoding step. In this paper, we present a taxonomy that embeds all binary and ternary ECOC decoding strategies into four groups. We show that the zero symbol introduces two kinds of biases that require redefinition of the decoding design. A new type of decoding measure is proposed, and two novel decoding strategies are defined. We evaluate the state-of-the-art coding and decoding strategies over a set of UCI Machine Learning Repository data sets and on a real traffic sign categorization problem. The experimental results show that, following the new decoding strategies, the performance of the ECOC design is significantly improved.
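
    One simple way to handle the zero ("do not care") symbol at the decoding step, given here only as an illustration and not as the decoding measures proposed in the paper, is to compare the binary classifier outputs with each class codeword over its non-zero positions only and normalise by the number of positions used:

```python
# Illustrative ternary ECOC decoding: distance computed over non-zero entries.
import numpy as np

def ecoc_decode(coding_matrix, outputs):
    """coding_matrix: (n_classes, n_dichotomies) with entries in {-1, 0, +1};
    outputs: binary classifier predictions in {-1, +1}."""
    distances = []
    for codeword in coding_matrix:
        mask = codeword != 0                                  # skip "do not care"
        mismatches = np.sum(codeword[mask] != outputs[mask])
        distances.append(mismatches / max(mask.sum(), 1))     # normalise per codeword
    return int(np.argmin(distances))

M = np.array([[+1, +1,  0],
              [-1,  0, +1],
              [ 0, -1, -1]])
print(ecoc_decode(M, np.array([+1, -1, -1])))                 # -> 2
```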

  17. SCANAIR: A transient fuel performance code

    International Nuclear Information System (INIS)

    Moal, Alain; Georgenthum, Vincent; Marchand, Olivier

    2014-01-01

    Highlights: • Since the early 1990s, the code SCANAIR has been developed at IRSN. • The software focuses on studying fast transients such as RIA in light water reactors. • The fuel rod modelling is based on a 1.5D approach. • Thermal and thermal-hydraulics, mechanical and gas behaviour resolutions are coupled. • The code is used for safety assessment and integral tests analysis. - Abstract: Since the early 1990s, the French “Institut de Radioprotection et de Sûreté Nucléaire” (IRSN) has developed the SCANAIR computer code with the view to analysing pressurised water reactor (PWR) safety. This software specifically focuses on studying fast transients such as reactivity-initiated accidents (RIA) caused by possible ejection of control rods. The code aims at improving the global understanding of the physical mechanisms governing the thermal-mechanical behaviour of a single rod. It is currently used to analyse integral tests performed in CABRI and NSRR experimental reactors. The resulting validated code is used to carry out studies required to evaluate margins in relation to criteria for different types of fuel rods used in nuclear power plants. Because phenomena occurring during fast power transients are complex, the simulation in SCANAIR is based on a close coupling between several modules aimed at modelling thermal, thermal-hydraulics, mechanical and gas behaviour. During the first stage of fast power transients, clad deformation is mainly governed by the pellet–clad mechanical interaction (PCMI). At the later stage, heat transfers from pellet to clad bring the cladding material to such high temperatures that the boiling crisis might occur. The significant over-pressurisation of the rod and the fact of maintaining the cladding material at elevated temperatures during a fairly long period can lead to ballooning and possible clad failure. A brief introduction describes the context, the historical background and recalls the main phenomena involved under

  18. SCANAIR: A transient fuel performance code

    Energy Technology Data Exchange (ETDEWEB)

    Moal, Alain, E-mail: alain.moal@irsn.fr; Georgenthum, Vincent; Marchand, Olivier

    2014-12-15

    Highlights: • Since the early 1990s, the code SCANAIR has been developed at IRSN. • The software focuses on studying fast transients such as RIA in light water reactors. • The fuel rod modelling is based on a 1.5D approach. • Thermal and thermal-hydraulics, mechanical and gas behaviour resolutions are coupled. • The code is used for safety assessment and integral tests analysis. - Abstract: Since the early 1990s, the French “Institut de Radioprotection et de Sûreté Nucléaire” (IRSN) has developed the SCANAIR computer code with the view to analysing pressurised water reactor (PWR) safety. This software specifically focuses on studying fast transients such as reactivity-initiated accidents (RIA) caused by possible ejection of control rods. The code aims at improving the global understanding of the physical mechanisms governing the thermal-mechanical behaviour of a single rod. It is currently used to analyse integral tests performed in CABRI and NSRR experimental reactors. The resulting validated code is used to carry out studies required to evaluate margins in relation to criteria for different types of fuel rods used in nuclear power plants. Because phenomena occurring during fast power transients are complex, the simulation in SCANAIR is based on a close coupling between several modules aimed at modelling thermal, thermal-hydraulics, mechanical and gas behaviour. During the first stage of fast power transients, clad deformation is mainly governed by the pellet–clad mechanical interaction (PCMI). At the later stage, heat transfers from pellet to clad bring the cladding material to such high temperatures that the boiling crisis might occur. The significant over-pressurisation of the rod and the fact of maintaining the cladding material at elevated temperatures during a fairly long period can lead to ballooning and possible clad failure. A brief introduction describes the context, the historical background and recalls the main phenomena involved under

  19. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed

    2018-04-08

    Convolutional Sparse Coding (CSC) is a well-established image representation model especially suited for image restoration tasks. In this work, we extend the applicability of this model by proposing a supervised approach to convolutional sparse coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements to be discriminative. Experimental results show that using supervised convolutional learning results in two key advantages. First, we learn more semantically relevant filters in the dictionary and second, we achieve improved image reconstruction on unseen data.

  20. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. The automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes that were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into other data-processing programs was possible. This program had the merits of simple operation, accurate and detailed coding, and easy adjustment for other programs. Therefore, this program can be used for automation of routine work in the department of radiology.
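
    The two-step lookup described above (organ code first, then the pathology file selected by the organ code's first digit) can be sketched as follows; the dictionary entries shown are invented placeholders, not actual ACR dictionary contents.

```python
# Hypothetical sketch of the organ-then-pathology lookup; all names below are
# illustrative placeholders, not real ACR dictionary entries.
organ_codes = {"131": "example organ description"}      # organ dictionary file
pathology_files = {                                     # one pathology file per leading organ digit
    "1": {"3661": "example pathology description"},
}

def build_acr_code(organ_code: str, pathology_code: str) -> str:
    assert organ_code in organ_codes
    assert pathology_code in pathology_files[organ_code[0]]
    return f"{organ_code}.{pathology_code}"

print(build_acr_code("131", "3661"))                    # -> '131.3661', as in the abstract
```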

  1. Optimized iterative decoding method for TPC coded CPM

    Science.gov (United States)

    Ma, Yanmin; Lai, Penghui; Wang, Shilian; Xie, Shunqin; Zhang, Wei

    2018-05-01

    The Turbo Product Code (TPC) coded Continuous Phase Modulation (CPM) system (TPC-CPM) has been widely used in aeronautical telemetry and satellite communication. This paper mainly investigates the improvement and optimization of the TPC-CPM system. We first add an interleaver and a deinterleaver to the TPC-CPM system, and then establish an iterative decoding scheme. However, the improved system has poor convergence. To overcome this issue, we use Extrinsic Information Transfer (EXIT) analysis to find the optimal factors for the system. The experiments show that our method is effective in improving the convergence performance.
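
    The interleaver added between the TPC encoder and the CPM modulator can be as simple as a rectangular block interleaver; the sketch below (an assumed, generic structure, not necessarily the one used in the paper) shows such an interleaver/deinterleaver pair.

```python
# Generic rectangular block interleaver / deinterleaver pair (assumed structure).
import numpy as np

def interleave(bits, rows, cols):
    return np.asarray(bits).reshape(rows, cols).T.ravel()   # write by rows, read by columns

def deinterleave(bits, rows, cols):
    return np.asarray(bits).reshape(cols, rows).T.ravel()   # inverse permutation

data = np.arange(12)
assert np.array_equal(deinterleave(interleave(data, 3, 4), 3, 4), data)
```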

  2. Analysis of visual coding variables on CRT generated displays

    International Nuclear Information System (INIS)

    Blackman, H.S.; Gilmore, W.E.

    1985-01-01

    Cathode-ray-tube-generated safety parameter display systems in nuclear power plant control rooms have been found to be more effective when color coding is employed. Research has indicated strong support for graphic coding techniques, particularly in redundant coding schemes. In addition, findings on pictographs, as applied in coding schemes, indicate the need for careful application and for further research toward the development of a standardized set of symbols.

  3. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal

    2010-09-01

    The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit an amount of required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique that is based on Huffman Coding. Our technique reduces the size of the generated decoding table, which takes a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new one called the Combined Compression Technique to improve the final compression ratio by taking advantage of both previous techniques. The Instruction Splitting Technique is instruction set architecture (ISA)-independent. It splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman Coding. The average compression ratios achieved using this technique are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent. It investigates the benefits of reencoding unused bits (we call them reencodable bits) in the instruction format for a specific application to improve the compression ratio. Reencoding those bits can reduce the size of decoding tables by up to 40%. Using this technique, we improve the final compression ratios in comparison to the first technique to 46% and 45% for ARM and MIPS, respectively (including all overhead incurred). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively. In our compression technique, we have conducted evaluations using a representative set of applications and we have applied each technique to two major embedded processor architectures
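
    For reference, the underlying Huffman step that the splitting and re-encoding techniques build on can be sketched in a few lines (a generic software version, not the article's hardware-supported scheme):

```python
# Generic Huffman code construction from symbol frequencies.
import heapq
from collections import Counter

def huffman_code(data):
    """Return a prefix-free code {symbol: bitstring} for the symbols in data."""
    freq = Counter(data)
    # Heap entries: (weight, tie_breaker, {symbol: partial codeword}).
    heap = [(count, i, {sym: ""}) for i, (sym, count) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)          # two lowest-weight subtrees
        w2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

stream = "add r1, r2, r3; sub r4, r5, r6"        # stand-in instruction stream
table = huffman_code(stream)
encoded = "".join(table[ch] for ch in stream)
```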

  4. Re-estimation of Motion and Reconstruction for Distributed Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Raket, Lars Lau; Forchhammer, Søren

    2014-01-01

    Transform domain Wyner-Ziv (TDWZ) video coding is an efficient approach to distributed video coding (DVC), which provides low complexity encoding by exploiting the source statistics at the decoder side. The DVC coding efficiency depends mainly on side information and noise modeling. This paper...... proposes a motion re-estimation technique based on optical flow to improve side information and noise residual frames by taking partially decoded information into account. To improve noise modeling, a noise residual motion re-estimation technique is proposed. Residual motion compensation with motion...

  5. Proceedings of the OECD/CSNI workshop on transient thermal-hydraulic and neutronic codes requirements

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, D.

    1997-07-01

    This is a report on the CSNI Workshop on Transient Thermal-Hydraulic and Neutronic Codes Requirements held at Annapolis, Maryland, USA November 5-8, 1996. This experts' meeting consisted of 140 participants from 21 countries; 65 invited papers were presented. The meeting was divided into five areas: (1) current and prospective plans of thermal hydraulic codes development; (2) current and anticipated uses of thermal-hydraulic codes; (3) advances in modeling of thermal-hydraulic phenomena and associated additional experimental needs; (4) numerical methods in multi-phase flows; and (5) programming language, code architectures and user interfaces. The workshop consensus identified the following important action items to be addressed by the international community in order to maintain and improve the calculational capability: (a) preserve current code expertise and institutional memory, (b) preserve the ability to use the existing investment in plant transient analysis codes, (c) maintain essential experimental capabilities, (d) develop advanced measurement capabilities to support future code validation work, (e) integrate existing analytical capabilities so as to improve performance and reduce operating costs, (f) exploit the proven advances in code architecture, numerics, graphical user interfaces, and modularization in order to improve code performance and scrutability, and (g) more effectively utilize user experience in modifying and improving the codes.

  6. Proceedings of the OECD/CSNI workshop on transient thermal-hydraulic and neutronic codes requirements

    International Nuclear Information System (INIS)

    Ebert, D.

    1997-07-01

    This is a report on the CSNI Workshop on Transient Thermal-Hydraulic and Neutronic Codes Requirements held at Annapolis, Maryland, USA November 5-8, 1996. This experts' meeting consisted of 140 participants from 21 countries; 65 invited papers were presented. The meeting was divided into five areas: (1) current and prospective plans of thermal hydraulic codes development; (2) current and anticipated uses of thermal-hydraulic codes; (3) advances in modeling of thermal-hydraulic phenomena and associated additional experimental needs; (4) numerical methods in multi-phase flows; and (5) programming language, code architectures and user interfaces. The workshop consensus identified the following important action items to be addressed by the international community in order to maintain and improve the calculational capability: (a) preserve current code expertise and institutional memory, (b) preserve the ability to use the existing investment in plant transient analysis codes, (c) maintain essential experimental capabilities, (d) develop advanced measurement capabilities to support future code validation work, (e) integrate existing analytical capabilities so as to improve performance and reduce operating costs, (f) exploit the proven advances in code architecture, numerics, graphical user interfaces, and modularization in order to improve code performance and scrutability, and (g) more effectively utilize user experience in modifying and improving the codes

  7. TOOKUIL: A case study in user interface development for safety code application

    International Nuclear Information System (INIS)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G.

    1997-01-01

    Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL

  8. TOOKUIL: A case study in user interface development for safety code application

    International Nuclear Information System (INIS)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G.; Peebles, R.C.; Smith, R.J.

    1996-11-01

    Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL

  9. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    Fuel performance codes approximate this complex behavior using an axisymmetric, axially-stacked, one-dimensional radial representation to save computation cost. However, the need for improved modeling of PCMI and, particularly, the importance of multidimensional capability for accurate fuel performance simulation have been identified as safety margins decrease. The finite element (FE) method, a reliable and proven solution in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in numerical simulation of FE-based fuel performance predominantly involves 2-D axisymmetric models and 3-D volumetric models. FRAPCON and FRAPTRAN include 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ) fully thermal-mechanically coupled steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and are typically used to investigate localized behavior. Since 2008, the Idaho National Laboratory (INL) has been developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on an investigation into the codes, requirements and directions of development for a new FE-based fuel performance code can be discussed. Based on a comparison of the models in FE-based fuel performance codes, the state of the art in these codes can be discussed. A new FE-based fuel performance code should include the typical pellet and cladding models that all codes share. In particular, specific pellet and cladding models, such as gaseous swelling and high burnup structure (HBS) models, should be developed to improve code accuracy as well as to consider AC conditions. To reduce computation cost, approximated gap and optimized contact models should also be developed

  10. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.

  11. Improvement on the KFOOD code for more realistic assessment of the annual food chain radiation dose due to operating nuclear facilities

    International Nuclear Information System (INIS)

    Choi, Yong Ho; Lee, Chang Woo; Kim, Jin Kyu; Lee, Myung Ho; Lee, Jeong Ho

    1993-01-01

    More realistic calculation models for evaluating man's annual intakes of radionuclides released from operating nuclear facilities were established. For the application of these models, the harvest years of the food and feed crops consumed in the year of dose assessment, and each year's average concentrations of a radionuclide in air and in water over the whole period of actual operation, had to be taken into account. KFOOD, an existing equilibrium food chain computer code for Korean dose assessment, was modified according to these models. Sample runs of the modified code, assuming a constant release during 10 years of operation, were made with three kinds of input data files enabling the dose assessment to be carried out with the improved method, the KFOOD method and another existing method, respectively, and the results were compared. Annual committed effective doses to a Korean adult from intakes of Mn-54, Co-60, Sr-90, I-131 and Cs-137 calculated with the improved method were about 11, 2, 5, 60 and 3% lower, respectively, than the corresponding KFOOD doses. For the intakes of all radionuclides except Sr-90 evaluated with the improved method, foliar uptake contributed much more than root uptake did; in the case of Sr-90, the opposite was found. (Author)
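
    The intake models described above feed a standard ingestion-dose bookkeeping of the form dose = sum over nuclides and foods of (concentration x annual consumption x dose coefficient). The Python sketch below illustrates only that bookkeeping; the nuclides are two of those named in the abstract, but every concentration, consumption rate and dose coefficient is a hypothetical placeholder rather than a value from KFOOD or the improved method.

        # Minimal sketch of an ingestion-pathway committed effective dose sum:
        #   dose [Sv/y] = sum over nuclides and foods of
        #                 C [Bq/kg] * U [kg/y] * DCF [Sv/Bq]
        # All numerical values are hypothetical placeholders for illustration.
        ANNUAL_CONSUMPTION = {"rice": 100.0, "leafy_vegetables": 25.0, "milk": 30.0}  # kg/y

        # Hypothetical concentrations in food (Bq/kg) per nuclide.
        CONCENTRATION = {
            "Cs-137": {"rice": 0.5, "leafy_vegetables": 2.0, "milk": 0.3},
            "Sr-90": {"rice": 0.1, "leafy_vegetables": 0.8, "milk": 0.2},
        }
        # Hypothetical ingestion dose coefficients (Sv/Bq).
        DOSE_COEFFICIENT = {"Cs-137": 1.3e-8, "Sr-90": 2.8e-8}

        def annual_ingestion_dose(concentration, consumption, dose_coefficient):
            """Return the committed effective dose (Sv) per nuclide from annual food intake."""
            dose = {}
            for nuclide, by_food in concentration.items():
                intake = sum(by_food[food] * consumption[food] for food in by_food)  # Bq/y
                dose[nuclide] = intake * dose_coefficient[nuclide]
            return dose

        for nuclide, sv in annual_ingestion_dose(CONCENTRATION, ANNUAL_CONSUMPTION, DOSE_COEFFICIENT).items():
            print(f"{nuclide}: {sv * 1e6:.2f} microSv/y")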

  12. An Optical Multicast Routing with Minimal Network Coding Operations in WDM Networks

    Directory of Open Access Journals (Sweden)

    Huanlin Liu

    2014-01-01

    Full Text Available Network coding can improve optical multicast routing performance in terms of network throughput, bandwidth utilization, and traffic load balance. However, network coding incurs high encoding operation costs in all-optical WDM networks because of the shortage of optical RAM. In this paper, a network coding operation measure is defined to evaluate the cost of network coding operations. An optical multicast routing algorithm based on a minimal number of network coding operations is proposed to improve the multicast capacity. Two heuristic criteria are designed to establish multicast routing with low network coding cost and high multicast capacity. The first selects, from the first K shortest paths, the path with the least probability of dropping the maximal multicast capacity. The second selects, among the multiple wavelength-disjoint paths from the source to each destination, the path with the lowest potential coding operations and the highest link-shared degree. Compared with other network-coding-based multicast routing algorithms, simulation results show that the proposed algorithm effectively reduces the number of network coding operations, improves the probability of reaching the maximal multicast capacity, and keeps the multicast routing link cost low for optical WDM networks.
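
    The first heuristic above starts from a set of K candidate shortest paths per destination. The sketch below shows only that candidate-generation step using networkx, with a trivial placeholder ranking (hop count); the paper's actual criteria, based on the probability of preserving the maximal multicast capacity and on potential coding operations and link-shared degree, would replace that ranking key.

        # Minimal sketch: generate the first K shortest paths per multicast destination
        # and pick one per destination with a placeholder ranking key.
        from itertools import islice
        import networkx as nx

        def k_shortest_paths(graph, source, target, k):
            """First k loopless shortest paths, ordered by hop count."""
            return list(islice(nx.shortest_simple_paths(graph, source, target), k))

        def choose_paths(graph, source, destinations, k=3):
            """Pick one candidate path per destination from its first k shortest paths.

            The ranking key used here (hop count) is a deliberate placeholder; the
            paper instead ranks candidates by the probability of preserving the
            maximal multicast capacity and by coding operations / link-shared degree.
            """
            chosen = {}
            for dest in destinations:
                candidates = k_shortest_paths(graph, source, dest, k)
                chosen[dest] = min(candidates, key=len)
            return chosen

        G = nx.grid_2d_graph(4, 4)  # toy mesh topology standing in for a WDM network
        for dest, path in choose_paths(G, source=(0, 0), destinations=[(3, 3), (0, 3), (3, 0)]).items():
            print(dest, path)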

  13. Multiple Access Interference Reduction Using Received Response Code Sequence for DS-CDMA UWB System

    Science.gov (United States)

    Toh, Keat Beng; Tachikawa, Shin'ichi

    This paper proposes a combination of a novel Received Response (RR) sequence at the transmitter and a Matched Filter-RAKE (MF-RAKE) combining scheme at the receiver for the Direct Sequence-Code Division Multiple Access Ultra Wideband (DS-CDMA UWB) multipath channel model. It also demonstrates the effectiveness of the RR sequence in Multiple Access Interference (MAI) reduction for the DS-CDMA UWB system. It suggests that using conventional binary code sequences such as the M sequence or the Gold sequence can generate extra MAI in the UWB system, making it difficult to collect the energy efficiently even when RAKE reception is applied at the receiver. The main purpose of the proposed system is to overcome the performance degradation of UWB transmission caused by MAI during multiple access in the DS-CDMA UWB system. The proposed system improves performance by improving RAKE reception with the RR sequence, which reduces the MAI effect significantly. Simulation results verify that significant improvement can be obtained by the proposed system in the UWB multipath channel models.
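
    MAI in a DS-CDMA system is governed by the cross-correlation between the spreading sequences of different users: whatever leaks through the matched filter from an interfering user is proportional to that cross-correlation. The sketch below illustrates this relationship with randomly generated binary chips; the sequences are ad hoc stand-ins for illustration and are not the RR sequences proposed in the paper.

        # Minimal sketch: matched-filter despreading with two users. The residual term
        # from user 2 in user 1's decision statistic is proportional to the
        # cross-correlation of the two spreading codes. Codes here are random +/-1
        # chips for illustration; they are not the RR sequences of the paper.
        import numpy as np

        rng = np.random.default_rng(0)
        N = 63                                   # spreading factor (chips per bit)
        code1 = rng.choice([-1.0, 1.0], size=N)  # desired user's spreading code
        code2 = rng.choice([-1.0, 1.0], size=N)  # interfering user's spreading code

        bit1, bit2 = +1.0, -1.0                  # one data bit per user
        received = bit1 * code1 + bit2 * code2   # chip-synchronous, noiseless toy model

        # Matched filter for user 1: correlate with code1 and normalize.
        decision = received @ code1 / N
        mai_term = bit2 * (code2 @ code1) / N    # interference leaking through cross-correlation

        print(f"decision statistic = {decision:+.3f}  (ideal {bit1:+.0f})")
        print(f"MAI contribution   = {mai_term:+.3f}  (cross-correlation of the codes)")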

  14. Engineering application of in-core fuel management optimization code with CSA algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhihong; Hu, Yongming [INET, Tsinghua university, Beijing 100084 (China)

    2009-06-15

    PWR in-core loading (reloading) pattern optimization is a complex combinatorial problem. An excellent fuel management optimization code can greatly improve the efficiency of core reloading design and bring economic and safety benefits. Today many optimization codes based on expert experience or on search algorithms (such as SA, GA, ANN and ACO) have been developed, but how to improve their search efficiency and engineering usability still needs further research. CSA (Characteristic Statistic Algorithm) is a highly efficient global optimization algorithm developed by our team. The performance of CSA has been demonstrated on many problems (such as Traveling Salesman Problems). The idea of CSA is to guide the search direction by the statistical distribution of characteristic values. This algorithm is well suited to fuel management optimization. An optimization code with CSA has been developed and used on many core models. The research in this paper aims to improve the engineering usability of the CSA code according to actual engineering requirements. Many new improvements have been completed in this code, such as: 1. Considering the asymmetry of burnup within an assembly, the rotation of each assembly is treated as a new optimization variable in this code. 2. The worth of the control rods must satisfy the given constraint, so the corresponding modifications have been added to the optimization code. 3. To deal with the combination of alternate cycles, multi-cycle optimization is considered in this code. 4. To confirm the accuracy of the optimization results, extensive verification of the physics calculation module in this code has been performed, and the parameters of the optimized schemes are checked with the SCIENCE code. The improved optimization code with CSA has been used for the Qinshan nuclear plant in China. The reloadings of cycles 7, 8 and 9 (12 months, no burnable poisons) and the 18-month equilibrium cycle reloading (with burnable poisons) were optimized. Finally, many optimized schemes were found by the CSA code.

  15. Code Cactus; Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flow distribution among parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flowrate conditions, constant or time-dependent; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code; its complement, FLID, is a one-channel, two-dimensional code. (authors)

  16. Evaluation of large girth LDPC codes for PMD compensation by turbo equalization.

    Science.gov (United States)

    Minkov, Lyubomir L; Djordjevic, Ivan B; Xu, Lei; Wang, Ting; Kueppers, Franko

    2008-08-18

    Large-girth quasi-cyclic LDPC codes have been experimentally evaluated for use in PMD compensation by turbo equalization for a 10 Gb/s NRZ optical transmission system, observing one sample per bit. The net effective coding gain improvement for the girth-10, rate 0.906 code of length 11936 over a maximum a posteriori probability (MAP) detector, for a differential group delay of 125 ps, is 6.25 dB at a BER of 10^-6. The girth-10 LDPC code of rate 0.8 outperforms the girth-10 code of rate 0.906 by 2.75 dB, and provides a net effective coding gain improvement of 9 dB at the same BER. It is experimentally determined that girth-10 LDPC codes of length around 15000 approach the channel capacity limit within 1.25 dB.

  17. Rehearsal significantly improves immediate and delayed recall on the Rey Auditory Verbal Learning Test.

    Science.gov (United States)

    Hessen, Erik

    2011-10-01

    A repeated observation during memory assessment with the Rey Auditory Verbal Learning Test (RAVLT) is that patients who spontaneously employ a memory rehearsal strategy by repeating the word list more than once achieve better scores than patients who only repeat the word list once. This observation led to concern about the ability of the standard test procedure of the RAVLT and similar tests to elicit the best possible recall scores. The purpose of the present study was to test the hypothesis that a rehearsal recall strategy of repeating the word list more than once would result in improved recall scores on the RAVLT. We report on differences in outcome between standard administration and experimental administration of the Immediate and Delayed Recall measures from the RAVLT in 50 patients. The experimental administration resulted in significantly improved scores for all the variables employed. Additionally, it was found that patients who failed effort screening showed significantly poorer improvement on Delayed Recall compared with those who passed the effort screening. The general clear improvement both in raw scores and T-scores demonstrates that recall performance can be significantly influenced by the strategy of the patient or by small variations in the instructions given by the examiner.

  18. Present state of the SOURCES computer code

    International Nuclear Information System (INIS)

    Shores, Erik F.

    2002-01-01

    In various stages of development for over two decades, the SOURCES computer code continues to calculate neutron production rates and spectra from four types of problems: homogeneous media, two-region interfaces, three-region interfaces and that of a monoenergetic alpha particle beam incident on a slab of target material. Graduate work at the University of Missouri - Rolla, in addition to user feedback from a tutorial course, provided the impetus for a variety of code improvements. Recently upgraded to version 4B, initial modifications to SOURCES focused on updates to the 'tape5' decay data library. Shortly thereafter, efforts focused on development of a graphical user interface for the code. This paper documents the Los Alamos SOURCES Tape1 Creator and Library Link (LASTCALL) and describes additional library modifications in more detail. Minor improvements and planned enhancements are discussed.

  19. Fast and Flexible Successive-Cancellation List Decoders for Polar Codes

    Science.gov (United States)

    Hashemi, Seyyed Ali; Condo, Carlo; Gross, Warren J.

    2017-11-01

    Polar codes have gained a significant amount of attention during the past few years and have been selected as a coding scheme for the next-generation mobile broadband standard. Among decoding schemes, successive-cancellation list (SCL) decoding provides a reasonable trade-off between error-correction performance and hardware implementation complexity when used to decode polar codes, at the cost of limited throughput. The simplified SCL (SSCL) and its extension SSCL-SPC increase the speed of decoding by removing redundant calculations when encountering particular information and frozen bit patterns (rate-one and single parity check codes), while keeping the error-correction performance unaltered. In this paper, we improve SSCL and SSCL-SPC by proving that the list size imposes a specific number of bit estimations required to decode rate-one and single parity check codes. Thus, the number of estimations can be limited while guaranteeing exactly the same error-correction performance as if all bits of the code were estimated. We call the new decoding algorithms Fast-SSCL and Fast-SSCL-SPC. Moreover, we show that the number of bit estimations in a practical application can be tuned to achieve a desirable speed, while keeping the error-correction performance almost unchanged. Hardware architectures implementing both algorithms are then described and implemented: it is shown that our design can achieve 1.86 Gb/s throughput, higher than the best state-of-the-art decoders.
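
    The key observation behind Fast-SSCL is that, for a rate-one node, a list of size L can only usefully disagree with the hard decisions in a limited number of the least reliable positions. The sketch below illustrates that idea schematically; it is not the paper's hardware decoding schedule, and the LLR values are arbitrary examples.

        # Minimal sketch of the Fast-SSCL idea for a rate-one (all-information) node:
        # with list size L, only min(L - 1, node_length) of the least reliable bits
        # need to be considered as flip candidates; the remaining bits are simply
        # hard decisions on the LLRs. Schematic illustration only.
        import numpy as np

        def rate1_candidates(llrs, list_size):
            """Return (hard decisions, indices of bits that may be flipped when forking paths)."""
            llrs = np.asarray(llrs, dtype=float)
            hard = (llrs < 0).astype(int)                  # 0 if LLR >= 0, else 1
            n_est = min(list_size - 1, llrs.size)          # bit estimations actually required
            flip_candidates = np.argsort(np.abs(llrs))[:n_est]
            return hard, flip_candidates

        llrs = [2.3, -0.4, 5.1, 0.2, -3.3, 1.0, -0.1, 4.2]
        hard, flips = rate1_candidates(llrs, list_size=4)
        print("hard decisions:", hard)
        print("least-reliable positions considered for flipping:", flips)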

  20. CONTAIN 2.0 code release and the transition to licensing

    International Nuclear Information System (INIS)

    Murata, K.K.; Griffith, R.O.; Bergeron, K.D.; Tills, J.

    1997-10-01

    CONTAIN is a reactor accident simulation code developed by Sandia National Laboratories under US Nuclear Regulatory Commission (USNRC) sponsorship to provide integrated analysis of containment phenomena, including those related to nuclear reactor containment loads and radiological source terms. The recently released CONTAIN 2.0 code version represents a significant advance in CONTAIN modeling capabilities over the last major code release (CONTAIN 1.12V). The new modeling capabilities are discussed here. The principal motivation for many of the recent model improvements has been to allow CONTAIN to model the special features in advanced light water reactor (ALWR) designs. The work done in this area is also summarized. In addition to the ALWR work, the USNRC is currently engaged in an effort to qualify CONTAIN for more general use in licensing, with the intent of supplementing or possibly replacing traditional licensing codes. To qualify the CONTAIN code for licensing applications, studies utilizing CONTAIN 2.0 are in progress. A number of results from this effort are presented in this paper to illustrate the code capabilities. In particular, CONTAIN calculations of the NUPEC M-8-1 and ISP-23 experiments and CVTR test No. 3 are presented to illustrate (1) the ability of CONTAIN to model non-uniform gas density and/or temperature distributions, and (2) the relationship between such gas distributions and containment loads. CONTAIN and CONTEMPT predictions for a large break loss of coolant accident scenario in the San Onofre plant are also compared.

  1. MELCOR code modeling for APR1400

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Young; Park, S. Y.; Kim, D. H.; Ahn, K. I.; Song, Y. M.; Kim, S. D.; Park, J. H

    2001-11-01

    The severe accident phenomena of nuclear power plants involve large uncertainties. For the retention of containment integrity and the improvement of nuclear reactor safety against severe accidents, it is essential to understand severe accident phenomena and to be able to assess the accident progression accurately using a computer code. Furthermore, it is important to attain a capability for developing techniques and assessment tools for advanced nuclear reactor designs as well as for severe accident prevention and mitigation. The objective of this report is to establish the technical bases for an application of the MELCOR code to the Korean Next Generation Reactor (APR1400) by modeling the plant and analyzing the plant steady state. This report presents the data and the input preparation for the MELCOR code as well as steady-state assessment results obtained with the MELCOR code.

  2. A new coupling of the 3D thermal-hydraulic code THYC and the thermo-mechanical code CYRANO3 for PWR calculations

    Energy Technology Data Exchange (ETDEWEB)

    Marguet, S.D. [Electricite de France (EDF), 92 - Clamart (France)

    1997-12-31

    Among all parameters, the fuel temperature has a significant influence on the reactivity of the core, because of the Doppler effect on cross-sections. Most neutronic codes use a straightforward method to calculate an average fuel temperature used in their specific feedback models. For instance, EDF's neutronic code COCCINELLE uses Rowland's formula, based on the temperatures of the center and the surface of the pellet. COCCINELLE is coupled to the 3D thermal-hydraulic code THYC, which calculates TDoppler with its standard thermal model. In order to improve the accuracy of such calculations, we have developed the coupling of our two latest codes in thermal-hydraulics (THYC) and thermo-mechanics (CYRANO3). THYC calculates two-phase flows in pipes or rod bundles and is used for transient calculations such as steam-line break and boron dilution accidents, DNB predictions, and steam generator and condenser studies. CYRANO3 calculates most of the phenomena that take place in the fuel, such as: 1) heat transfer induced by nuclear power; 2) thermal expansion of the fuel and the cladding; 3) release of gaseous fission products; 4) mechanical interaction between the pellet and the cladding. These two codes are now qualified in their own fields, and the coupling, using Parallel Virtual Machine (PVM) libraries customized in a home-made, easy-to-use package called CALCIUM, has been validated on 'low' configurations (no thermal expansion, constant thermal characteristics) and used on accidental transients such as rod ejection and loss of coolant accidents. (K.A.) 7 refs.
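
    For reference, the effective Doppler fuel temperature referred to above is usually formed as a weighted average of the pellet centerline and surface temperatures. The sketch below uses one commonly quoted form of the Rowlands weighting (4/9 on the centerline, 5/9 on the surface); both the weights and the example temperatures are assumptions for illustration and are not values taken from COCCINELLE, THYC or CYRANO3.

        # Minimal sketch: effective Doppler fuel temperature as a weighted average of
        # the pellet centerline and surface temperatures. The 4/9-5/9 split is one
        # commonly quoted form of the Rowlands weighting; it is an assumption here,
        # not a value extracted from the codes discussed above.
        def doppler_effective_temperature(t_center_k, t_surface_k,
                                          w_center=4.0 / 9.0, w_surface=5.0 / 9.0):
            """Return the effective fuel temperature (K) used in Doppler feedback."""
            return w_center * t_center_k + w_surface * t_surface_k

        # Hypothetical example: 1400 K centerline, 700 K surface.
        print(f"T_eff = {doppler_effective_temperature(1400.0, 700.0):.0f} K")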

  3. A new coupling of the 3D thermal-hydraulic code THYC and the thermo-mechanical code CYRANO3 for PWR calculations

    International Nuclear Information System (INIS)

    Marguet, S.D.

    1997-01-01

    Among all parameters, the fuel temperature has a significant influence on the reactivity of the core, because of the Doppler effect on cross-sections. Most neutronic codes use a straightforward method to calculate an average fuel temperature used in their specific feedback models. For instance, EDF's neutronic code COCCINELLE uses Rowland's formula, based on the temperatures of the center and the surface of the pellet. COCCINELLE is coupled to the 3D thermal-hydraulic code THYC, which calculates TDoppler with its standard thermal model. In order to improve the accuracy of such calculations, we have developed the coupling of our two latest codes in thermal-hydraulics (THYC) and thermo-mechanics (CYRANO3). THYC calculates two-phase flows in pipes or rod bundles and is used for transient calculations such as steam-line break and boron dilution accidents, DNB predictions, and steam generator and condenser studies. CYRANO3 calculates most of the phenomena that take place in the fuel, such as: 1) heat transfer induced by nuclear power; 2) thermal expansion of the fuel and the cladding; 3) release of gaseous fission products; 4) mechanical interaction between the pellet and the cladding. These two codes are now qualified in their own fields, and the coupling, using Parallel Virtual Machine (PVM) libraries customized in a home-made, easy-to-use package called CALCIUM, has been validated on 'low' configurations (no thermal expansion, constant thermal characteristics) and used on accidental transients such as rod ejection and loss of coolant accidents. (K.A.)

  4. General features of the neutronics design code EQUICYCLE

    International Nuclear Information System (INIS)

    Jirlow, K.

    1978-10-01

    The neutronics code EQUICYCLE has been developed and improved over a long period of time. It is especially adapted to survey-type design calculations of large fast power reactors, with particular emphasis on the nuclear parameters for a realistic equilibrium fuel cycle. Thus the code is used to evaluate the breeding performance, the power distributions and the uranium and plutonium mass balance for realistic refuelling schemes. In addition, reactivity coefficients can be calculated and the influence of burnup can be assessed. The code is two-dimensional and treats the reactor core in R-Z geometry. The basic ideas of the calculation scheme are the successive iterative improvement of cross-section sets and flux spectra and the use of the mid-cycle flux for burning the fuel according to a specified refuelling scheme. Normally, given peak burnups and maximum power densities are used as boundary conditions. The code is capable of handling the unconventional, so-called heterogeneous cores. (author)

  5. Improvement of air transport data and wall transmission/reflection data in the SKYSHINE code. 2. Calculation of gamma-ray wall transmission and reflection data

    Energy Technology Data Exchange (ETDEWEB)

    Hayashida, Yoshihisa [Toshiba Corp., Kawasaki, Kanagawa (Japan); Ishikawa, Satoshi; Harima, Yoshiko [CRC Research Institute Inc., Tokyo (Japan); Hayashi, Katsumi; Tayama, Ryuichi [Hitachi Engineering Co. Ltd., Ibaraki (Japan); Hirayama, Hideo [High Energy Accelerator Research Organization, Tsukuba, Ibaraki (Japan); Sakamoto, Yukio [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Nemoto, Makoto [Visible Information Center, Tokai, Ibaraki (Japan); Sato, Osamu [Mitsubishi Research Inst., Inc., Tokyo (Japan)

    2000-03-01

    Transmission and reflection data of concrete and steel for 6.2 MeV gamma rays in the SKYSHINE code have been generated using up-to-date data and methods with a view to improving the accuracy of the results. The transmission and reflection data depend on energy and angle. The invariant embedding method, which has the merits of producing no negative angular fluxes and of requiring little computer time, is suitable for and was adopted for the present purpose. Transmission data were calculated for concrete 12 to 160 cm thick and steel 4 to 39 cm thick based on the PHOTX library. Reflection data were calculated for semi-infinite slabs of concrete and steel. Consequently, smooth and consistent differential data over the whole angular and energy range were obtained, compared with the original data calculated by a discrete ordinates Sn code and a Monte Carlo code. In order to use these data in the SKYSHINE code, further verification is needed using various calculation methods or experimental data. (author)

  6. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    Science.gov (United States)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of a spectral amplitude coding-optical code division multiple access (SAC-OCDMA) system. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase induced intensity noise (PIIN). The performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytical and simulation analysis, by referring to the bit error rate (BER), signal to noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces the receiver complexity, and provides better performance as compared to the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.

  7. Economic aspects and models for building codes

    DEFF Research Database (Denmark)

    Bonke, Jens; Pedersen, Dan Ove; Johnsen, Kjeld

    It is the purpose of this bulletin to present an economic model for estimating the consequence of new or changed building codes. The object is to allow comparative analysis in order to improve the basis for decisions in this field. The model is applied in a case study.

  8. Code breaking in the pacific

    CERN Document Server

    Donovan, Peter

    2014-01-01

    Covers the historical context and the evolution of the technically complex Allied Signals Intelligence (Sigint) activity against Japan from 1920 to 1945; describes, explains and analyzes the code breaking techniques developed during the war in the Pacific; and exposes the blunders (in code construction and use) made by the Japanese Navy that led to significant US Naval victories.

  9. An audit of the nature and impact of clinical coding subjectivity variability and error in otolaryngology.

    Science.gov (United States)

    Nouraei, S A R; Hudovsky, A; Virk, J S; Chatrath, P; Sandhu, G S

    2013-12-01

    To audit the accuracy of clinical coding in otolaryngology, assess the effectiveness of previously implemented interventions, and determine ways in which it can be further improved. Prospective clinician-auditor multidisciplinary audit of clinical coding accuracy. Elective and emergency ENT admissions and day-case activity. Concordance between the initial coding and the clinician-auditor multidisciplinary team's (MDT) coding in respect of primary and secondary diagnoses and procedures, health resource groupings (HRGs) and tariffs. The audit of 3131 randomly selected otolaryngology patients between 2010 and 2012 resulted in 420 instances of change to the primary diagnosis (13%) and 417 changes to the primary procedure (13%). In 1420 cases (44%), there was at least one change to the initial coding, and 514 HRGs (16%) changed. There was an income variance of £343,169 or £109.46 per patient. The highest rates of HRG change were observed in head and neck surgery, in particular skull base surgery; laryngology, and within that tracheostomy; and emergency admissions, especially epistaxis management. A randomly selected sample of 235 patients from the audit was subjected to a second audit by a second clinician-auditor multidisciplinary team. There were 12 further HRG changes (5%), and at least one further coding change occurred in 57 patients (24%). These changes were significantly lower than those observed in the pre-audit sample, but were also significantly greater than zero. Asking surgeons to 'code in theatre' and applying these codes to activity without further quality assurance resulted in an HRG error rate of 45%. The full audit sample was regrouped under HRG 3.5 and was compared with a previous audit of 1250 patients performed between 2007 and 2008. This comparison showed a reduction in the baseline rate of health resource

  10. Distributed video coding with multiple side information

    DEFF Research Database (Denmark)

    Huang, Xin; Brites, C.; Ascenso, J.

    2009-01-01

    Distributed Video Coding (DVC) is a new video coding paradigm which mainly exploits the source statistics at the decoder based on the availability of some decoder side information. The quality of the side information has a major impact on the DVC rate-distortion (RD) performance, in the same way the quality of the predictions had a major impact in predictive video coding. In this paper, a DVC solution exploiting multiple side information is proposed; the multiple side information is generated by frame interpolation and frame extrapolation targeting to improve the side information of a single...

  11. Development of HTGR plant dynamics simulation code

    International Nuclear Information System (INIS)

    Ohashi, Kazutaka; Tazawa, Yujiro; Mitake, Susumu; Suzuki, Katsuo.

    1987-01-01

    Plant dynamics simulation analysis plays an important role in the design work of nuclear power plant especially in the plant safety analysis, control system analysis, and transient condition analysis. The authors have developed the plant dynamics simulation code named VESPER, which is applicable to the design work of High Temperature Engineering Test Reactor, and have been improving the code corresponding to the design changes made in the subsequent design works. This paper describes the outline of VESPER code and shows its sample calculation results selected from the recent design work. (author)

  12. An efficient adaptive arithmetic coding image compression technology

    International Nuclear Information System (INIS)

    Wang Xing-Yuan; Yun Jiao-Jiao; Zhang Yong-Lei

    2011-01-01

    This paper proposes an efficient lossless image compression scheme for still images based on an adaptive arithmetic coding compression algorithm. The algorithm increases the image coding compression rate and ensures the quality of the decoded image by combining an adaptive probability model with predictive coding. The use of adaptive models for each encoded image block dynamically estimates the probability of the relevant image block. The decoded image block can accurately recover the encoded image according to the code book information. We adopt an adaptive arithmetic coding algorithm for image compression that greatly improves the image compression rate. The results show that it is an effective compression technology.
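
    The adaptive element of such a coder is its probability model: symbol counts are updated as data is seen, so the code adapts to local statistics without transmitting a frequency table. The sketch below shows only such an adaptive frequency model and the ideal code length it implies, not a full arithmetic coder and not the specific scheme of the paper above.

        # Minimal sketch of the adaptive probability model behind an adaptive
        # arithmetic coder: per-symbol counts are updated after every symbol, so the
        # estimated probabilities track the data seen so far.
        from collections import Counter
        import math

        class AdaptiveModel:
            def __init__(self, alphabet):
                # Start every symbol at count 1 so no symbol ever has zero probability.
                self.counts = Counter({s: 1 for s in alphabet})
                self.total = len(alphabet)

            def probability(self, symbol):
                return self.counts[symbol] / self.total

            def update(self, symbol):
                self.counts[symbol] += 1
                self.total += 1

        def ideal_code_length(message, alphabet):
            """Bits an ideal arithmetic coder would spend under the adaptive model:
            each symbol costs -log2 of its probability *before* the model update."""
            model = AdaptiveModel(alphabet)
            bits = 0.0
            for symbol in message:
                bits += -math.log2(model.probability(symbol))
                model.update(symbol)
            return bits

        message = "aaaaabaaaacaaaab"
        print(f"{ideal_code_length(message, 'abc'):.1f} bits for {len(message)} symbols")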

  13. Evaluation Codes from an Affine Veriety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particularly nice examples of affine variety codes. Our study includes a reformulation of the usual methods to estimate the minimum distances of evaluation codes into the setting of affine variety codes. Finally we describe the connection to the theory of one-point geometric Goppa codes.

  14. Unified Health Gamification can significantly improve well-being in corporate environments.

    Science.gov (United States)

    Shahrestani, Arash; Van Gorp, Pieter; Le Blanc, Pascale; Greidanus, Fabrizio; de Groot, Kristel; Leermakers, Jelle

    2017-07-01

    There is a multitude of mHealth applications that aim to solve societal health problems by stimulating specific types of physical activity via gamification. However, physical health activities cover just one of the three World Health Organization (WHO) dimensions of health. This paper introduces the novel notion of Unified Health Gamification (UHG), which covers, besides physical health, also social and cognitive health and well-being. Instead of rewarding activities in the three WHO dimensions using different mHealth competitions, UHG combines the scores for such activities on unified leaderboards and lets people interact in social circles beyond personal interests. This approach is promising in corporate environments, since UHG can connect employees with an intrinsic motivation for physical health with those who have quite different interests. In order to evaluate this approach, we implemented an app prototype and evaluated it in two corporate pilot studies. In total, eighteen pilot users participated voluntarily for six weeks. Half of the participants were recruited from an occupational health setting and the other half from a treatment setting. Our results suggest that the UHG principles are worth further investigation: various positive health effects were found based on a validated survey. Mean mental health improved significantly at one pilot location, and at the level of individual pilot participants multiple other effects were found to be significant: among others, significant mental health improvements were found for 28% of the participants. Most participants intended to use the app beyond the pilot, especially if it were further developed.

  15. An evaluation of TRAC-PF1/MOD1 computer code performance during posttest simulations of Semiscale MOD-2C feedwater line break transients

    International Nuclear Information System (INIS)

    Hall, D.G.; Watkins, J.C.

    1987-01-01

    This report documents an evaluation of the TRAC-PF1/MOD1 reactor safety analysis computer code during computer simulations of feedwater line break transients. The experimental data base for the evaluation included the results of three bottom feedwater line break tests performed in the Semiscale Mod-2C test facility. The tests modeled 14.3% (S-FS-7), 50% (S-FS-11), and 100% (S-FS-6B) breaks. The test facility and the TRAC-PF1/MOD1 model used in the calculations are described. Evaluations of the accuracy of the calculations are presented in the form of comparisons of measured and calculated histories of selected parameters associated with the primary and secondary systems. In addition to evaluating the accuracy of the code calculations, the computational performance of the code during the simulations was assessed. A conclusion was reached that the code is capable of making feedwater line break transient calculations efficiently, but there is room for significant improvements in the simulations that were performed. Recommendations are made for follow-on investigations to determine how to improve future feedwater line break calculations and for code improvements to make the code easier to use.

  16. Improvements to Busquet's Non LTE algorithm in NRL's Hydro code

    Science.gov (United States)

    Klapisch, M.; Colombant, D.

    1996-11-01

    Implementation of the Non LTE model RADIOM (M. Busquet, Phys. Fluids B, 5, 4191 (1993)) in NRL's RAD2D hydro code in conservative form was reported previously (M. Klapisch et al., Bull. Am. Phys. Soc., 40, 1806 (1995)). While the results were satisfactory, the algorithm was slow and did not always converge. We describe here modifications that address these two shortcomings. The new method is quicker and more stable than the original, and it also gives information about the validity of the fitting. It turns out that the number and distribution of groups in the multigroup diffusion opacity tables - the basis for the computation of radiation effects on the ionization balance in RADIOM - have a large influence on the robustness of the algorithm. These modifications give insight into the algorithm and allow one to check that the obtained average charge state is the true average. In addition, code optimization has greatly reduced the computing time: the ratio of Non LTE to LTE computing times is now between 1.5 and 2.

  17. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran.

    Science.gov (United States)

    Mohajjel-Aghdam, Alireza; Hassankhani, Hadi; Zamanzadeh, Vahid; Khameneh, Saied; Moghaddam, Sara

    2013-09-01

    The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care more than routine care. Today, professional ethic codes are defined worldwide on the basis of the human and ethical issues that arise in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethic codes. The aim of this study was to assess knowledge of and performance with respect to nursing ethic codes from the nurses' and patients' perspectives. A descriptive study was conducted with 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics, independent t-test, ANOVA and the Pearson correlation coefficient, in SPSS 13. Most of the nurses were female, married and educated to BS degree level, and 86.4% of them were aware of the ethic codes; 91.9% of nurses and 41.8% of patients reported that nurses respect the ethic codes. Nurses' and patients' perspectives on the ethic codes differed significantly. A significant relationship was found between nurses' knowledge of the ethic codes and job satisfaction and complaints about ethical performance. According to the results, attention to teaching ethic codes in the nursing curriculum for students and continuous education for staff is proposed; on the other hand, recognizing failures of the health system, optimizing nursing care, attempting to inform patients about the nursing ethic codes, promoting patient rights and achieving patient satisfaction can minimize the differences between the two perspectives.

  18. Comparison of energy conservation building codes of Iran, Turkey, Germany, China, ISO 9164 and EN 832

    International Nuclear Information System (INIS)

    Fayaz, Rima; Kari, Behrouz M.

    2009-01-01

    To improve the energy efficiency of buildings through compliance with regulation in Iran, Code No. 19 was devised in 1991. The code lacks high-level aims and objectives addressing the characteristics of Iranian buildings. As a consequence, although the code has been revised, it is not completely implemented in practice and still remains inefficient. As with any energy coding system, this code has to identify the right balance between the different energy variables for the Iranian climate and way of life. In order to assist improvements to the high-level objectives of Code 19, this code is compared with ISO 9164, EN 832, the German regulation, TS 825 of Turkey and China's GB 50189 to understand how these have adapted international standards to national features. In order to test the appropriateness of Code 19, five case study buildings in Iran are assessed against Code 19 as well as the Turkish standard TS 825, and the results are compared. The results demonstrate that Code 19 is efficient in calculations of the building envelope, but it needs improvements in the areas of ventilation and gains from internal and solar sources. The paper concludes by offering suggestions for improving the code.

  19. A Review on Block Matching Motion Estimation and Automata Theory based Approaches for Fractal Coding

    Directory of Open Access Journals (Sweden)

    Shailesh Kamble

    2016-12-01

    Full Text Available Fractal compression is a lossy compression technique in the field of gray/color image and video compression. It gives a high compression ratio and better image quality with fast decoding time, but improvement of the encoding time remains a challenge. This review article presents an analysis of the most significant existing approaches in the field of fractal-based gray/color image and video compression, of different block matching motion estimation approaches for finding the motion vectors in a frame based on inter-frame coding and intra-frame coding (i.e. individual frame coding), and of automata theory based coding approaches for representing an image or a sequence of images. Though different review papers exist related to fractal coding, this paper is different in many ways. One can develop new shape patterns for motion estimation and modify the existing block matching motion estimation with automata coding to explore the fractal compression technique, with a specific focus on reducing the encoding time and achieving better image/video reconstruction quality. This paper is useful for beginners in the domain of video compression.
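
    Block matching motion estimation, one of the families of approaches surveyed above, slides a block from the current frame over a search window in the reference frame and keeps the displacement with the smallest matching error. The sketch below shows a plain full-search version with a sum-of-absolute-differences (SAD) criterion; the block size and search range are arbitrary illustrative choices.

        # Minimal sketch: full-search block matching with a sum-of-absolute-differences
        # (SAD) criterion. Block size and search range are illustrative choices only.
        import numpy as np

        def best_motion_vector(current, reference, top, left, block=8, search=4):
            """Return ((dy, dx), SAD) minimizing SAD for the block at (top, left) of `current`."""
            target = current[top:top + block, left:left + block].astype(np.int32)
            best, best_sad = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = top + dy, left + dx
                    if y < 0 or x < 0 or y + block > reference.shape[0] or x + block > reference.shape[1]:
                        continue  # candidate block falls outside the reference frame
                    candidate = reference[y:y + block, x:x + block].astype(np.int32)
                    sad = np.abs(target - candidate).sum()
                    if sad < best_sad:
                        best_sad, best = sad, (dy, dx)
            return best, best_sad

        # Toy example: the "current" frame is the reference shifted by (2, 1),
        # so the recovered motion vector for an interior block is (-2, -1).
        rng = np.random.default_rng(1)
        reference = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
        current = np.roll(reference, shift=(2, 1), axis=(0, 1))
        print(best_motion_vector(current, reference, top=16, left=16))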

  20. SIMMER-II code and its applications

    International Nuclear Information System (INIS)

    Smith, L.L.

    1979-01-01

    The significant features of SIMMER-II, a disrupted-core analysis code, are described. The code has the capabilities to begin space-time neutronics calculations from nonstationary reactor states, to track the intermixing of fuel of different enrichments, and to model the complicated heat- and mass-transfer processes that occur in the transition phase. Example calculations are presented for analyses of whole-core accidents and for analyses of experiments supporting the code models.

  1. Frameworks for improvement: clinical audit, the plan-do-study-act cycle and significant event audit.

    Science.gov (United States)

    Gillam, Steve; Siriwardena, A Niroshan

    2013-01-01

    This is the first in a series of articles about quality improvement tools and techniques. We explore common frameworks for improvement, including the model for improvement and its application to clinical audit, plan-do-study-act (PDSA) cycles and significant event analysis (SEA), examining the similarities and differences between these and providing examples of each.

  2. Linear-time general decoding algorithm for the surface code

    Science.gov (United States)

    Darmawan, Andrew S.; Poulin, David

    2018-05-01

    A quantum error correcting protocol can be substantially improved by taking into account features of the physical noise process. We present an efficient decoder for the surface code which can account for general noise features, including coherences and correlations. We demonstrate that the decoder significantly outperforms the conventional matching algorithm on a variety of noise models, including non-Pauli noise and spatially correlated noise. The algorithm is based on an approximate calculation of the logical channel using a tensor-network description of the noisy state.

  3. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach that focuses on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and the groupcast index coding problem is presented. The coding is proved to be the optimal solution from the linear perspective and can be easily utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  4. CoCo trial: Color-coded blood pressure Control, a randomized controlled study

    Directory of Open Access Journals (Sweden)

    Chmiel C

    2014-10-01

    Full Text Available Background: Inadequate blood pressure (BP) control is a frequent challenge in general practice. The objective of this study was to determine whether a color-coded BP booklet using a traffic light scheme (red, >180 mmHg systolic BP and/or >110 mmHg diastolic BP; yellow, >140–180 mmHg systolic BP or >90–110 mmHg diastolic BP; green, ≤140 mmHg systolic BP and ≤90 mmHg diastolic BP) improves BP control and adherence with home BP measurement. Methods: In this two-group, randomized controlled trial, general practitioners recruited adult patients with a BP >140 mmHg systolic and/or >90 mmHg diastolic. Patients in the control group received a standard BP booklet and the intervention group used a color-coded booklet for daily home BP measurement. The main outcomes were changes in BP, BP control (treatment goal <140/90 mmHg), and adherence with home BP measurement after 6 months. Results: One hundred and twenty-one of 137 included patients qualified for analysis. After 6 months, a significant decrease in systolic and diastolic BP was achieved in both groups, with no significant difference between the groups (16.1/7.9 mmHg in the intervention group versus 13.1/8.6 mmHg in the control group, P=0.3/0.7). BP control (treatment target <140/90 mmHg) was achieved significantly more often in the intervention group (43% versus 25%; P=0.037; number needed to treat of 5). Adherence with home BP measurement overall was high, with a trend in favor of the intervention group (98.6% versus 96.2%; P=0.1). Conclusion: Color-coded BP self-monitoring significantly improved BP control (number needed to treat of 5), meaning that every fifth patient utilizing color-coded self-monitoring achieved better BP control after 6 months, but no significant between-group difference was
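
    The traffic light scheme described above maps a home BP reading to a color from its systolic and diastolic values. A minimal sketch of that classification rule, using the thresholds quoted in the abstract, is given below; the function name and structure are illustrative, not taken from the study materials.

        # Minimal sketch of the color-coded (traffic light) classification, with the
        # thresholds quoted in the abstract:
        #   red:    systolic > 180 and/or diastolic > 110
        #   yellow: systolic 141-180 or diastolic 91-110
        #   green:  systolic <= 140 and diastolic <= 90
        def bp_color(systolic_mmhg: int, diastolic_mmhg: int) -> str:
            if systolic_mmhg > 180 or diastolic_mmhg > 110:
                return "red"
            if systolic_mmhg > 140 or diastolic_mmhg > 90:
                return "yellow"
            return "green"

        for reading in [(185, 95), (150, 88), (132, 84)]:
            print(reading, "->", bp_color(*reading))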

  5. Performance Analysis of Optical Code Division Multiplex System

    Science.gov (United States)

    Kaur, Sandeep; Bhatia, Kamaljit Singh

    2013-12-01

    This paper presents a Pseudo-Orthogonal Code generator for an Optical Code Division Multiple Access (OCDMA) system, which helps to reduce the need for bandwidth expansion and improve spectral efficiency. In this paper we investigate the performance of a multi-user OCDMA system to achieve data rates of more than 1 Tbit/s.

  6. C++ Coding Standards 101 Rules, Guidelines, and Best Practices

    CERN Document Server

    Sutter, Herb

    2005-01-01

    Consistent, high-quality coding standards improve software quality, reduce time-to-market, promote teamwork, eliminate time wasted on inconsequential matters, and simplify maintenance. Now, two of the world's most respected C++ experts distill the rich collective experience of the global C++ community into a set of coding standards that every developer and development team can understand and use as a basis for their own coding standards.

  7. Complete permutation Gray code implemented by finite state machine

    Directory of Open Access Journals (Sweden)

    Li Peng

    2014-09-01

    Full Text Available An enumerating method for complete permutation arrays is proposed. The list of n! permutations based on a Gray code defined over the finite symbol set Z(n) = {1, 2, …, n} is implemented by a finite state machine, named n-RPGCF. An RPGCF can be used to search for permutation codes and provides improved lower bounds on the maximum cardinality of a permutation code in some cases.
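
    In a permutation Gray code, consecutive permutations of {1, 2, ..., n} differ by a single adjacent transposition. The sketch below enumerates all n! permutations with the classical Steinhaus-Johnson-Trotter (plain changes) algorithm, a standard permutation Gray code shown here only as a reference point; it is not the n-RPGCF finite state machine construction of the paper.

        # Minimal sketch: Steinhaus-Johnson-Trotter ("plain changes") enumeration of all
        # n! permutations, where consecutive permutations differ by one adjacent swap.
        def sjt_permutations(n):
            perm = list(range(1, n + 1))
            direction = [-1] * n          # -1 means "looking left", +1 means "looking right"
            yield tuple(perm)
            while True:
                # Find the largest "mobile" element: one whose direction points at a
                # smaller neighbor.
                mobile_index = -1
                for i in range(n):
                    j = i + direction[i]
                    if 0 <= j < n and perm[j] < perm[i]:
                        if mobile_index < 0 or perm[i] > perm[mobile_index]:
                            mobile_index = i
                if mobile_index < 0:
                    return                # no mobile element: all permutations emitted
                value = perm[mobile_index]
                swap_with = mobile_index + direction[mobile_index]
                perm[mobile_index], perm[swap_with] = perm[swap_with], perm[mobile_index]
                direction[mobile_index], direction[swap_with] = direction[swap_with], direction[mobile_index]
                # Reverse the direction of every element larger than the one just moved.
                for i in range(n):
                    if perm[i] > value:
                        direction[i] *= -1
                yield tuple(perm)

        print(list(sjt_permutations(3)))  # 3! = 6 permutations, each one adjacent swap apart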

  8. TOOKUIL: A case study in user interface development for safety code application

    Energy Technology Data Exchange (ETDEWEB)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G. [and others]

    1997-07-01

    Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.

  9. MCOR - Monte Carlo depletion code for reference LWR calculations

    Energy Technology Data Exchange (ETDEWEB)

    Puente Espel, Federico, E-mail: fup104@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Tippayakul, Chanatip, E-mail: cut110@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Ivanov, Kostadin, E-mail: kni1@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Misu, Stefan, E-mail: Stefan.Misu@areva.com [AREVA, AREVA NP GmbH, Erlangen (Germany)

    2011-04-15

    Research highlights: > Introduction of a reference Monte Carlo based depletion code with extended capabilities. > Verification and validation results for MCOR. > Utilization of MCOR for benchmarking deterministic lattice physics (spectral) codes. - Abstract: The MCOR (MCnp-kORigen) code system is a Monte Carlo based depletion system for reference fuel assembly and core calculations. The MCOR code is designed as an interfacing code that provides depletion capability to the LANL Monte Carlo code by coupling two codes: MCNP5 with the AREVA NP depletion code, KORIGEN. The physical quality of both codes is unchanged. The MCOR code system has been maintained and continuously enhanced since it was initially developed and validated. The verification of the coupling was made by evaluating the MCOR code against similar sophisticated code systems like MONTEBURNS, OCTOPUS and TRIPOLI-PEPIN. After its validation, the MCOR code has been further improved with important features. The MCOR code presents several valuable capabilities such as: (a) a predictor-corrector depletion algorithm, (b) utilization of KORIGEN as the depletion module, (c) individual depletion calculation of each burnup zone (no burnup zone grouping is required, which is particularly important for the modeling of gadolinium rings), and (d) on-line burnup cross-section generation by the Monte Carlo calculation for 88 isotopes and usage of the KORIGEN libraries for PWR and BWR typical spectra for the remaining isotopes. Besides the just mentioned capabilities, the MCOR code newest enhancements focus on the possibility of executing the MCNP5 calculation in sequential or parallel mode, a user-friendly automatic re-start capability, a modification of the burnup step size evaluation, and a post-processor and test-matrix, just to name the most important. The article describes the capabilities of the MCOR code system; from its design and development to its latest improvements and further ameliorations. Additionally

  10. MCOR - Monte Carlo depletion code for reference LWR calculations

    International Nuclear Information System (INIS)

    Puente Espel, Federico; Tippayakul, Chanatip; Ivanov, Kostadin; Misu, Stefan

    2011-01-01

    Research highlights: → Introduction of a reference Monte Carlo based depletion code with extended capabilities. → Verification and validation results for MCOR. → Utilization of MCOR for benchmarking deterministic lattice physics (spectral) codes. - Abstract: The MCOR (MCnp-kORigen) code system is a Monte Carlo based depletion system for reference fuel assembly and core calculations. The MCOR code is designed as an interfacing code that provides depletion capability to the LANL Monte Carlo code by coupling two codes: MCNP5 with the AREVA NP depletion code, KORIGEN. The physical quality of both codes is unchanged. The MCOR code system has been maintained and continuously enhanced since it was initially developed and validated. The verification of the coupling was made by evaluating the MCOR code against similar sophisticated code systems like MONTEBURNS, OCTOPUS and TRIPOLI-PEPIN. After its validation, the MCOR code has been further improved with important features. The MCOR code presents several valuable capabilities such as: (a) a predictor-corrector depletion algorithm, (b) utilization of KORIGEN as the depletion module, (c) individual depletion calculation of each burnup zone (no burnup zone grouping is required, which is particularly important for the modeling of gadolinium rings), and (d) on-line burnup cross-section generation by the Monte Carlo calculation for 88 isotopes and usage of the KORIGEN libraries for PWR and BWR typical spectra for the remaining isotopes. Besides the just mentioned capabilities, the MCOR code newest enhancements focus on the possibility of executing the MCNP5 calculation in sequential or parallel mode, a user-friendly automatic re-start capability, a modification of the burnup step size evaluation, and a post-processor and test-matrix, just to name the most important. The article describes the capabilities of the MCOR code system; from its design and development to its latest improvements and further ameliorations

  11. MELCOR Accident Consequence Code System (MACCS)

    International Nuclear Information System (INIS)

    Rollstin, J.A.; Chanin, D.I.; Jow, H.N.

    1990-02-01

    This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previously used CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projections, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. Volume I, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems. Volume II, the Model Description, describes the underlying models that are implemented in the code, and Volume III, the Programmer's Reference Manual, describes the code's structure and database management

  12. MELCOR Accident Consequence Code System (MACCS)

    Energy Technology Data Exchange (ETDEWEB)

    Jow, H.N.; Sprung, J.L.; Ritchie, L.T. (Sandia National Labs., Albuquerque, NM (USA)); Rollstin, J.A. (GRAM, Inc., Albuquerque, NM (USA)); Chanin, D.I. (Technadyne Engineering Consultants, Inc., Albuquerque, NM (USA))

    1990-02-01

    This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previously used CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projection, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. Volume I, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems. Volume II, the Model Description, describes the underlying models that are implemented in the code, and Volume III, the Programmer's Reference Manual, describes the code's structure and database management. 59 refs., 14 figs., 15 tabs.

  13. MELCOR Accident Consequence Code System (MACCS)

    International Nuclear Information System (INIS)

    Jow, H.N.; Sprung, J.L.; Ritchie, L.T.; Rollstin, J.A.; Chanin, D.I.

    1990-02-01

    This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previously used CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projection, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. Volume I, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems. Volume II, the Model Description, describes the underlying models that are implemented in the code, and Volume III, the Programmer's Reference Manual, describes the code's structure and database management. 59 refs., 14 figs., 15 tabs

  14. Challenges on innovations of newly-developed safety analysis codes

    International Nuclear Information System (INIS)

    Yang, Yanhua; Zhang, Hao

    2016-01-01

    As safety analysis methods develop, safety analysis codes face new challenges. Three challenges are discussed in this paper: the mathematical models, the code design, and the user interface. Taking the self-reliant safety analysis code COSINE as an example, ways of meeting these requirements are suggested: developing multi-phase, multi-field and multi-dimensional models, adopting object-oriented code design, and improving modeling, calculation control and data post-processing in the user interface.
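
    As a rough illustration of the object-oriented code design mentioned above, the sketch below shows one way a multi-phase, multi-field model might be decomposed into classes. The class names and the placeholder residual are illustrative assumptions and do not reflect the actual structure of COSINE.

        from abc import ABC, abstractmethod

        class FieldEquation(ABC):
            """One conservation equation (mass, momentum or energy) for one phase/field."""

            @abstractmethod
            def residual(self, state, dt):
                """Return the discretized residual of this equation for the given state."""

        class LiquidMassEquation(FieldEquation):
            def residual(self, state, dt):
                # Placeholder discretization: a real code would add convective fluxes
                # and interphase mass-transfer source terms here.
                return (state["alpha_l"] * state["rho_l"]
                        - state["alpha_l_old"] * state["rho_l_old"]) / dt

        class MultiFieldModel:
            """Assembles the equations of all phases and fields into one nonlinear system,
            so adding a field or a dimension means adding classes rather than rewriting solvers."""

            def __init__(self, equations):
                self.equations = equations

            def assemble_residuals(self, state, dt):
                return [eq.residual(state, dt) for eq in self.equations]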

  15. Code system to compute radiation dose in human phantoms

    International Nuclear Information System (INIS)

    Ryman, J.C.; Cristy, M.; Eckerman, K.F.; Davis, J.L.; Tang, J.S.; Kerr, G.D.

    1986-01-01

    A Monte Carlo photon transport code and a code using Monte Carlo integration of a point kernel have been revised to incorporate human phantom models for an adult female, juveniles of various ages, and a pregnant female at the end of the first trimester of pregnancy, in addition to the adult male phantom used earlier. An analysis code has been developed for deriving recommended values of specific absorbed fractions of photon energy. The computer code system and the calculational method are described, emphasizing recent improvements in methods.
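
    The point-kernel part of such a code system can be pictured as a Monte Carlo integration between a source organ and a target organ. The kernel, buildup factor, organ shapes and attenuation coefficients below are simplified assumptions for illustration and are not the phantom models or data of the code system described above.

        import math
        import random

        def point_kernel(r, mu, mu_en_over_rho):
            """Point isotropic photon kernel: energy absorbed per unit target mass and per
            unit emitted energy, at distance r in an assumed homogeneous medium."""
            buildup = 1.0 + mu * r                      # simple linear buildup assumption
            return buildup * math.exp(-mu * r) * mu_en_over_rho / (4.0 * math.pi * r * r)

        def monte_carlo_saf(sample_source_point, sample_target_point,
                            mu, mu_en_over_rho, n_samples=100_000):
            """Estimate a specific absorbed fraction by Monte Carlo integration of the
            point kernel between a source organ and a target organ."""
            total = 0.0
            for _ in range(n_samples):
                src = sample_source_point()             # random point in the source organ
                tgt = sample_target_point()             # random point in the target organ
                r = math.dist(src, tgt)
                if r > 0.0:
                    total += point_kernel(r, mu, mu_en_over_rho)
            return total / n_samples

        # Example with two spherical "organs" sampled by rejection (radii and centres assumed).
        def sphere_sampler(center, radius):
            def sample():
                while True:
                    p = [random.uniform(-radius, radius) for _ in range(3)]
                    if sum(c * c for c in p) <= radius * radius:
                        return tuple(c + o for c, o in zip(p, center))
            return sample

        saf = monte_carlo_saf(sphere_sampler((0.0, 0.0, 0.0), 5.0),
                              sphere_sampler((0.0, 0.0, 12.0), 4.0),
                              mu=0.01, mu_en_over_rho=0.003, n_samples=20_000)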

  16. Challenges on innovations of newly-developed safety analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yanhua [Shanghai Jiao Tong Univ. (China). School of Nuclear Science and Engineering; Zhang, Hao [State Nuclear Power Software Development Center, Beijing (China). Beijing Future Science and Technology City

    2016-05-15

    With the development of safety analysis method, the safety analysis codes meet more challenges. Three challenges are presented in this paper, which are mathematic model, code design and user interface. Combined with the self-reliance safety analysis code named COSINE, the ways of meeting these requirements are suggested, that is to develop multi-phases, multi-fields and multi-dimension models, to adopt object-oriented code design ideal and to improve the way of modeling, calculation control and data post-processing in the user interface.

  17. The use of diagnostic coding in chiropractic practice

    DEFF Research Database (Denmark)

    Testern, Cecilie D; Hestbæk, Lise; French, Simon D

    2015-01-01

    BACKGROUND: Diagnostic coding has several potential benefits, including improving the feasibility of data collection for research and clinical audits and providing a common language to improve interdisciplinary collaboration. The primary aim of this study was to determine the views and perspectives......-2 PLUS) provided the 14 chiropractors with some experience in diagnostic coding, followed by an interview on the topic. The interviews were analysed thematically. The participating chiropractors and an independent coder applied ICPC-2 PLUS terms to the diagnoses of 10 patients. Then the level...... of agreement between the chiropractors and the coder was determined and Cohen's Kappa was used to determine the agreement beyond that expected by chance. RESULTS: From the interviews the three emerging themes were: 1) Advantages and disadvantages of using a clinical coding system in chiropractic practice, 2...

  18. Application of coupled codes for safety analysis and licensing issues

    International Nuclear Information System (INIS)

    Langenbuch, S.; Velkov, K.

    2006-01-01

    An overview is given of the development and advantages of coupled codes, which integrate 3D neutron kinetics into thermal-hydraulic system codes. The work performed within GRS by coupling the thermal-hydraulic system code ATHLET with the 3D neutronics code QUABOX/CUBBOX is described as an example. The application of the coupled codes as best-estimate simulation tools for safety analysis is discussed. Some examples from German licensing practice are given which demonstrate how the improved analytical methods of coupled codes have contributed to solving licensing issues related to an optimized and more economical use of fuel. (authors)
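
    The data exchange in such coupled codes can be pictured as an explicit, operator-split time-stepping loop in which the thermal-hydraulic solution supplies feedback fields (fuel temperature, coolant density, boron concentration) to the 3D neutronics solver, which returns the nodal power used to drive the next thermal-hydraulic step. The sketch below uses placeholder solver stubs and is not the ATHLET-QUABOX/CUBBOX interface.

        class _PointKineticsStub:
            """Trivial stand-in for a 3D neutron-kinetics code (returns a flat power of 1.0)."""
            def initial_power(self):
                return 1.0
            def advance(self, dt, feedback):
                # a real code would update cross sections from the feedback fields here
                return 1.0

        class _SystemThStub:
            """Trivial stand-in for a thermal-hydraulic system code."""
            def advance(self, dt, power):
                return {"fuel_temperature": 900.0 + 50.0 * power,
                        "coolant_density": 700.0,
                        "boron_concentration": 1200.0}

        def run_coupled_transient(th_solver, nk_solver, t_end, dt):
            """Explicit (operator-split) coupling: the two solvers exchange data once per step."""
            t = 0.0
            power = nk_solver.initial_power()                     # nodal power distribution
            while t < t_end:
                th_state = th_solver.advance(dt, power)           # T/H step driven by the power
                feedback = {                                      # feedback fields for cross sections
                    "fuel_temperature": th_state["fuel_temperature"],
                    "coolant_density": th_state["coolant_density"],
                    "boron_concentration": th_state["boron_concentration"],
                }
                power = nk_solver.advance(dt, feedback)           # kinetics step with updated feedback
                t += dt
            return power

        final_power = run_coupled_transient(_SystemThStub(), _PointKineticsStub(), t_end=10.0, dt=0.1)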

  19. A method for modeling co-occurrence propensity of clinical codes with application to ICD-10-PCS auto-coding.

    Science.gov (United States)

    Subotin, Michael; Davis, Anthony R

    2016-09-01

    Natural language processing methods for medical auto-coding, or automatic generation of medical billing codes from electronic health records, generally assign each code independently of the others. They may thus assign codes for closely related procedures or diagnoses to the same document, even when they do not tend to occur together in practice, simply because the right choice can be difficult to infer from the clinical narrative. We propose a method that injects awareness of the propensities for code co-occurrence into this process. First, a model is trained to estimate the conditional probability that one code is assigned by a human coder, given that another code is known to have been assigned to the same document. Then, at runtime, an iterative algorithm is used to apply this model to the output of an existing statistical auto-coder to modify the confidence scores of the codes. We tested this method in combination with a primary auto-coder for International Statistical Classification of Diseases-10 procedure codes, achieving a 12% relative improvement in F-score over the primary auto-coder baseline. The proposed method can be used, with appropriate features, in combination with any auto-coder that generates codes with different levels of confidence. The promising results obtained for International Statistical Classification of Diseases-10 procedure codes suggest that the proposed method may have wider applications in auto-coding. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
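
    The iterative rescoring step described above can be sketched as follows. The update rule shown (a log-odds adjustment weighted by the other codes' current confidences) is an assumed stand-in for the authors' algorithm, and the code identifiers, probabilities and function names are hypothetical.

        import math

        def rescore_with_cooccurrence(base_scores, cond_prob, prior_prob,
                                      iterations=3, weight=0.5):
            """Iteratively adjust auto-coder confidences using code co-occurrence propensity.

            base_scores : {code: P(code | document)} from the primary auto-coder
            cond_prob   : {(code_i, code_j): P(code_i assigned | code_j assigned)}
            prior_prob  : {code: P(code assigned)} marginal priors from training data
            """
            scores = dict(base_scores)
            for _ in range(iterations):
                updated = {}
                for code, p in base_scores.items():
                    log_adj = 0.0
                    for other, p_other in scores.items():
                        pair = (code, other)
                        if other == code or pair not in cond_prob:
                            continue
                        # "lift": how much more (or less) likely the code becomes given the other
                        lift = cond_prob[pair] / max(prior_prob.get(code, 1e-9), 1e-9)
                        log_adj += weight * p_other * math.log(max(lift, 1e-9))
                    odds = (p / max(1.0 - p, 1e-9)) * math.exp(log_adj)
                    updated[code] = odds / (1.0 + odds)
                scores = updated
            return scores

        # Illustrative use with two hypothetical procedure codes that rarely co-occur:
        # both confidences are pushed down relative to the primary auto-coder's output.
        adjusted = rescore_with_cooccurrence(
            base_scores={"PROC_A": 0.7, "PROC_B": 0.6},
            cond_prob={("PROC_A", "PROC_B"): 0.01, ("PROC_B", "PROC_A"): 0.01},
            prior_prob={"PROC_A": 0.05, "PROC_B": 0.04},
        )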

  20. Short-Block Protograph-Based LDPC Codes

    Science.gov (United States)

    Divsalar, Dariush; Dolinar, Samuel; Jones, Christopher

    2010-01-01

    Short-block low-density parity-check (LDPC) codes of a special type are intended to be especially well suited for potential applications that include transmission of command and control data, cellular telephony, data communications in wireless local area networks, and satellite data communications. [In general, LDPC codes belong to a class of error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels.] The codes of the present special type exhibit low error floors, low bit and frame error rates, and low latency (in comparison with related prior codes). These codes also achieve a low maximum rate of undetected errors over all signal-to-noise ratios, without requiring the use of cyclic redundancy checks, which would significantly increase the overhead for short blocks. These codes have protograph representations; this is advantageous in that, for reasons that exceed the scope of this article, the applicability of protograph representations makes it possible to design high-speed iterative decoders that utilize belief-propagation algorithms.
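
    Decoding of such LDPC codes is typically performed with belief propagation; the sketch below implements the common min-sum approximation on a small parity-check matrix. The toy matrix and LLR values are illustrative assumptions, and the protograph construction and lifting used for the codes described above are not shown.

        import numpy as np

        def min_sum_decode(H, llr, max_iters=50):
            """Min-sum belief-propagation decoding of a binary LDPC code.

            H   : (m, n) parity-check matrix of 0/1 entries (check degree >= 2 assumed)
            llr : length-n channel log-likelihood ratios (positive values favour bit 0)
            """
            m, n = H.shape
            msg_cv = np.tile(llr, (m, 1)) * H            # messages into each check node
            msg_vc = np.zeros((m, n))                    # messages back to variable nodes
            hard = (llr < 0).astype(int)
            for _ in range(max_iters):
                for i in range(m):                       # check-node update (min-sum)
                    idx = np.flatnonzero(H[i])
                    for j in idx:
                        others = idx[idx != j]
                        sign = np.prod(np.sign(msg_cv[i, others]))
                        msg_vc[i, j] = sign * np.min(np.abs(msg_cv[i, others]))
                total = llr + msg_vc.sum(axis=0)         # variable-node totals
                hard = (total < 0).astype(int)
                if not np.any(H @ hard % 2):             # all parity checks satisfied -> done
                    break
                for j in range(n):                       # variable-node (extrinsic) update
                    for i in np.flatnonzero(H[:, j]):
                        msg_cv[i, j] = total[j] - msg_vc[i, j]
            return hard

        # Toy example: a (7,4) Hamming-style parity-check matrix stands in for a lifted
        # protograph code; the LLRs describe the all-zero codeword with bit 2 received in error.
        H = np.array([[1, 1, 0, 1, 1, 0, 0],
                      [1, 0, 1, 1, 0, 1, 0],
                      [0, 1, 1, 1, 0, 0, 1]])
        print(min_sum_decode(H, llr=np.array([2.0, 2.0, -2.0, 2.0, 2.0, 2.0, 2.0])))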