WorldWideScience

Sample records for source code development

  1. Development of in-vessel source term analysis code, tracer

    International Nuclear Information System (INIS)

    Miyagi, K.; Miyahara, S.

    1996-01-01

    Analyses of radionuclide transport in fuel failure accidents (generally referred to as source terms) are considered especially important in severe accident evaluation. The TRACER code has been developed to realistically predict the time-dependent behavior of FPs and aerosols within the primary cooling system for a wide range of fuel failure events. This paper presents the model description, results of a validation study, the recent model advancement status of the code, and results of check-out calculations under reactor conditions. (author)

  2. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
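
The syndrome mechanism at the heart of such schemes can be illustrated with a toy example. Below is a minimal Slepian-Wolf-style coder built on the (7,4) Hamming code, assuming the source block and the decoder's side information differ in at most one bit; the paper's block-candidate model, doping bits, and sum-product decoder are deliberately left out of this sketch.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column j (1-indexed) is
# the binary representation of j, so a weight-1 error pattern's syndrome
# directly encodes the error position.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def sw_encode(x):
    """Send only the 3-bit syndrome of the 7-bit source block."""
    return H @ x % 2

def sw_decode(s, y):
    """Recover x from its syndrome s plus side information y,
    assuming x and y differ in at most one bit."""
    diff = (s + H @ y) % 2                    # syndrome of x XOR y
    pos = int("".join(map(str, diff)), 2)     # 0 means y already equals x
    x_hat = y.copy()
    if pos:
        x_hat[pos - 1] ^= 1
    return x_hat

x = np.array([1, 0, 1, 1, 0, 0, 1])
y = x.copy(); y[4] ^= 1                       # correlated side information
s = sw_encode(x)                              # 3 bits transmitted, not 7
assert np.array_equal(sw_decode(s, y), x)
```

The compression comes from sending 3 syndrome bits instead of 7 source bits; the side information supplies the rest.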

  3. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    Science.gov (United States)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third

  4. Development of Coupled Interface System between the FADAS Code and a Source-term Evaluation Code XSOR for CANDU Reactors

    International Nuclear Information System (INIS)

    Son, Han Seong; Song, Deok Yong; Kim, Ma Woong; Shin, Hyeong Ki; Lee, Sang Kyu; Kim, Hyun Koon

    2006-01-01

    An accident prevention system is essential to the industrial security of the nuclear industry. Thus, a more effective accident prevention system will help promote a safety culture as well as gain public acceptance for the nuclear power industry. The FADAS (Following Accident Dose Assessment System), which is part of the Computerized Advisory System for a Radiological Emergency (CARE) system at KINS, is used for prevention against nuclear accidents. To make the FADAS system more effective for CANDU reactors, it is necessary to develop various accident scenarios and a reliable database of source terms. This study introduces the construction of a coupled interface system between FADAS and a source-term evaluation code, aimed at improving the applicability of the CANDU Integrated Safety Analysis System (CISAS) for CANDU reactors.

  5. Distributed source coding of video

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Van Luong, Huynh

    2015-01-01

    A foundation for distributed source coding was established in the classic papers of Slepian-Wolf (SW) [1] and Wyner-Ziv (WZ) [2]. This has provided a starting point for work on Distributed Video Coding (DVC), which exploits the source statistics at the decoder side offering shifting processing...... steps, conventionally performed at the video encoder side, to the decoder side. Emerging applications such as wireless visual sensor networks and wireless video surveillance all require lightweight video encoding with high coding efficiency and error-resilience. The video data of DVC schemes differ from...... the assumptions of SW and WZ distributed coding, e.g. by being correlated in time and nonstationary. Improving the efficiency of DVC coding is challenging. This paper presents some selected techniques to address the DVC challenges. Focus is put on pin-pointing how the decoder steps are modified to provide...

  6. Developing open-source codes for electromagnetic geophysics using industry support

    Science.gov (United States)

    Key, K.

    2017-12-01

    Funding for open-source software development in academia often takes the form of grants and fellowships awarded by government bodies and foundations where there is no conflict-of-interest between the funding entity and the free dissemination of the open-source software products. Conversely, funding for open-source projects in the geophysics industry presents challenges to conventional business models where proprietary licensing offers value that is not present in open-source software. Such proprietary constraints make it easier to convince companies to fund academic software development under exclusive software distribution agreements. A major challenge for obtaining commercial funding for open-source projects is to offer a value proposition that overcomes the criticism that such funding is a give-away to the competition. This work draws upon a decade of experience developing open-source electromagnetic geophysics software for the oil, gas and minerals exploration industry, and examines various approaches that have been effective for sustaining industry sponsorship.

  7. Code Forking, Governance, and Sustainability in Open Source Software

    OpenAIRE

    Juho Lindman; Linus Nyman

    2013-01-01

    The right to fork open source code is at the core of open source licensing. All open source licenses grant the right to fork their code, that is to start a new development effort using an existing code as its base. Thus, code forking represents the single greatest tool available for guaranteeing sustainability in open source software. In addition to bolstering program sustainability, code forking directly affects the governance of open source initiatives. Forking, and even the mere possibilit...

  8. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.

  9. Present state of the SOURCES computer code

    International Nuclear Information System (INIS)

    Shores, Erik F.

    2002-01-01

    In various stages of development for over two decades, the SOURCES computer code continues to calculate neutron production rates and spectra from four types of problems: homogeneous media, two-region interfaces, three-region interfaces and that of a monoenergetic alpha particle beam incident on a slab of target material. Graduate work at the University of Missouri - Rolla, in addition to user feedback from a tutorial course, provided the impetus for a variety of code improvements. Recently upgraded to version 4B, initial modifications to SOURCES focused on updates to the 'tape5' decay data library. Shortly thereafter, efforts focused on development of a graphical user interface for the code. This paper documents the Los Alamos SOURCES Tape1 Creator and Library Link (LASTCALL) and describes additional library modifications in more detail. Minor improvements and planned enhancements are discussed.

  10. On the Combination of Multi-Layer Source Coding and Network Coding for Wireless Networks

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Fitzek, Frank; Pedersen, Morten Videbæk

    2013-01-01

    quality is developed. A linear coding structure designed to gracefully encapsulate layered source coding provides both low complexity of the utilised linear coding while enabling robust erasure correction in the form of fountain coding capabilities. The proposed linear coding structure advocates efficient...

  11. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its...... strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...... correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking...

  12. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Full Text Available Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.
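
The article's measurement method and metric are not reproduced in this abstract, but the general idea of quantifying modularity from a dependency graph can be sketched with a toy proxy: the fraction of dependency edges that stay within one module (all names below are invented for illustration).

```python
# Toy modularity proxy for a code base: the fraction of dependency edges
# that stay inside one module. The metric and toolkit in the article are
# different; module and class names here are made up.
deps = [("a.Parser", "a.Lexer"), ("a.Lexer", "a.Token"),
        ("b.Net", "b.Socket"), ("a.Parser", "b.Socket")]

def module(name):
    return name.split(".")[0]          # module = first path component

def modularity(edges):
    internal = sum(module(u) == module(v) for u, v in edges)
    return internal / len(edges)

print(modularity(deps))                # 3 of 4 edges are intra-module: 0.75
```

Tracking such a ratio across releases is one simple way to watch modularity evolve, in the spirit of the Tomcat analysis described above.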

  13. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate...... (LDPCA) codes in a DSC scheme with feed-back. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  14. Joint source-channel coding using variable length codes

    NARCIS (Netherlands)

    Balakirsky, V.B.

    2001-01-01

    We address the problem of joint source-channel coding when variable-length codes are used for information transmission over a discrete memoryless channel. Data transmitted over the channel are interpreted as pairs (m_k, t_k), where m_k is a message generated by the source and t_k is a time instant

  15. Research on Primary Shielding Calculation Source Generation Codes

    Science.gov (United States)

    Zheng, Zheng; Mei, Qiliang; Li, Hui; Shangguan, Danhua; Zhang, Guangchun

    2017-09-01

    Primary Shielding Calculation (PSC) plays an important role in reactor shielding design and analysis. In order to facilitate PSC, a source generation code is developed to generate cumulative distribution functions (CDFs) for the source particle sample code of the J Monte Carlo Transport (JMCT) code, and a source particle sample code is developed to sample source particle directions, types, coordinates, energies and weights from the CDFs. A source generation code is also developed to transform three-dimensional (3D) power distributions in xyz geometry to source distributions in r-θ-z geometry for the J Discrete Ordinate Transport (JSNT) code. Validation on the PSC models of the Qinshan No.1 nuclear power plant (NPP), CAP1400 and CAP1700 reactors is performed. Numerical results show that the theoretical model and the codes are both correct.
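
The CDF-plus-sampling pipeline described here follows the standard inverse-transform pattern, which can be sketched as follows (cell names and power values are illustrative, not JMCT data):

```python
import bisect, random

# Build a discrete CDF from relative cell powers and sample source cells
# by inverse-transform lookup.
power = {"cell1": 10.0, "cell2": 30.0, "cell3": 60.0}
cells = list(power)
total = sum(power.values())
cdf, acc = [], 0.0
for c in cells:
    acc += power[c] / total
    cdf.append(acc)                    # cdf == [0.1, 0.4, 1.0]

def sample_cell(u):
    """Map a uniform number u in [0, 1) to a source cell."""
    return cells[bisect.bisect_right(cdf, u)]

random.seed(0)
counts = {c: 0 for c in cells}
for _ in range(100_000):
    counts[sample_cell(random.random())] += 1
# counts approach the 10/30/60 power split as the sample size grows
```

The same lookup generalizes to sampling direction, type, energy and weight, each from its own CDF.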

  16. The Visual Code Navigator : An Interactive Toolset for Source Code Investigation

    NARCIS (Netherlands)

    Lommerse, Gerard; Nossin, Freek; Voinea, Lucian; Telea, Alexandru

    2005-01-01

    We present the Visual Code Navigator, a set of three interrelated visual tools that we developed for exploring large source code software projects from three different perspectives, or views: The syntactic view shows the syntactic constructs in the source code. The symbol view shows the objects a

  17. Transmission imaging with a coded source

    International Nuclear Information System (INIS)

    Stoner, W.W.; Sage, J.P.; Braun, M.; Wilson, D.T.; Barrett, H.H.

    1976-01-01

    The conventional approach to transmission imaging is to use a rotating anode x-ray tube, which provides the small, brilliant x-ray source needed to cast sharp images of acceptable intensity. Stationary anode sources, although inherently less brilliant, are more compatible with the use of large-area anodes, and so they can be made more powerful than rotating anode sources. Spatial modulation of the source distribution provides a way to introduce detailed structure in the transmission images cast by large-area sources, and this permits the recovery of high-resolution images in spite of the source diameter. The spatial modulation is deliberately chosen to optimize recovery of image structure; the modulation pattern is therefore called a ''code.'' A variety of codes may be used; the essential mathematical property is that the code possess a sharply peaked autocorrelation function, because this property permits the decoding of the raw image cast by the coded source. Random point arrays, non-redundant point arrays, and the Fresnel zone pattern are examples of suitable codes. This paper is restricted to the case of the Fresnel zone pattern code, which has the unique additional property of generating raw images analogous to Fresnel holograms. Because the spatial frequencies of these raw images are extremely coarse compared with actual holograms, a photoreduction step onto a holographic plate is necessary before the decoded image may be displayed with the aid of coherent illumination
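
The role of the sharply peaked autocorrelation can be demonstrated in one dimension with a pseudorandom ±1 code, used here as a stand-in for the paper's Fresnel zone pattern: convolving a point object with the code smears it across the detector, and correlating the raw image with the known code collapses the smear back to a point.

```python
import numpy as np

rng = np.random.default_rng(1)
code = rng.choice([-1.0, 1.0], size=64)   # pseudorandom 1-D "coded source"

obj = np.zeros(32)
obj[9] = 1.0                              # a point object at position 9
raw = np.convolve(obj, code)              # smeared image cast by the code

# Decoding = correlation with the known code; the peaked autocorrelation
# concentrates the energy back at the object's position.
decoded = np.correlate(raw, code, mode="valid")
assert decoded.argmax() == 9              # position recovered
assert decoded.max() == 64.0              # full-overlap correlation peak
```

Off-peak values are partial-overlap sums of random ±1 terms, so they stay far below the full-overlap peak of 64: that gap is the "sharply peaked autocorrelation" the text requires.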

  18. Image authentication using distributed source coding.

    Science.gov (United States)

    Lin, Yao-Chung; Varodayan, David; Girod, Bernd

    2012-01-01

    We present a novel approach using distributed source coding for image authentication. The key idea is to provide a Slepian-Wolf encoded quantized image projection as authentication data. This version can be correctly decoded with the help of an authentic image as side information. Distributed source coding provides the desired robustness against legitimate variations while detecting illegitimate modification. The decoder incorporating expectation maximization algorithms can authenticate images which have undergone contrast, brightness, and affine warping adjustments. Our authentication system also offers tampering localization by using the sum-product algorithm.

  19. The Astrophysics Source Code Library by the numbers

    Science.gov (United States)

    Allen, Alice; Teuben, Peter; Berriman, G. Bruce; DuPrie, Kimberly; Mink, Jessica; Nemiroff, Robert; Ryan, PW; Schmidt, Judy; Shamir, Lior; Shortridge, Keith; Wallin, John; Warmels, Rein

    2018-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) was founded in 1999 by Robert Nemiroff and John Wallin. ASCL editors seek both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and add entries for the found codes to the library. Software authors can submit their codes to the ASCL as well. This ensures a comprehensive listing covering a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL is indexed by both NASA’s Astrophysics Data System (ADS) and Web of Science, making software used in research more discoverable. This presentation covers the growth in the ASCL’s number of entries, the number of citations to its entries, and in which journals those citations appear. It also discusses what changes have been made to the ASCL recently, and what its plans are for the future.

  20. Code Forking, Governance, and Sustainability in Open Source Software

    Directory of Open Access Journals (Sweden)

    Juho Lindman

    2013-01-01

    Full Text Available The right to fork open source code is at the core of open source licensing. All open source licenses grant the right to fork their code, that is to start a new development effort using an existing code as its base. Thus, code forking represents the single greatest tool available for guaranteeing sustainability in open source software. In addition to bolstering program sustainability, code forking directly affects the governance of open source initiatives. Forking, and even the mere possibility of forking code, affects the governance and sustainability of open source initiatives on three distinct levels: software, community, and ecosystem. On the software level, the right to fork makes planned obsolescence, versioning, vendor lock-in, end-of-support issues, and similar initiatives all but impossible to implement. On the community level, forking impacts both sustainability and governance through the power it grants the community to safeguard against unfavourable actions by corporations or project leaders. On the business-ecosystem level forking can serve as a catalyst for innovation while simultaneously promoting better quality software through natural selection. Thus, forking helps keep open source initiatives relevant and presents opportunities for the development and commercialization of current and abandoned programs.

  1. Development status of TUF code

    International Nuclear Information System (INIS)

    Liu, W.S.; Tahir, A.; Zaltsgendler

    1996-01-01

    An overview of the important developments of the TUF code in 1995 is presented, covering the following areas: control of round-off error propagation, gas resolution and release models, and condensation-induced water hammer. This development was driven mainly by station requests for operational support and code improvement. (author)

  2. Studying the co-evolution of production and test code in open source and industrial developer test processes through repository mining

    NARCIS (Netherlands)

    Zaidman, A.; Van Rompaey, B.; Van Deursen, A.; Demeyer, S.

    2010-01-01

    Many software production processes advocate rigorous development testing alongside functional code writing, which implies that both test code and production code should co-evolve. To gain insight in the nature of this co-evolution, this paper proposes three views (realized by a tool called TeMo)

  3. Accident consequence assessment code development

    International Nuclear Information System (INIS)

    Homma, T.; Togawa, O.

    1991-01-01

    This paper describes the new computer code system, OSCAAR developed for off-site consequence assessment of a potential nuclear accident. OSCAAR consists of several modules which have modeling capabilities in atmospheric transport, foodchain transport, dosimetry, emergency response and radiological health effects. The major modules of the consequence assessment code are described, highlighting the validation and verification of the models. (author)

  4. Transmission from theory to practice: Experiences using open-source code development and a virtual short course to increase the adoption of new theoretical approaches

    Science.gov (United States)

    Harman, C. J.

    2015-12-01

    Even amongst the academic community, new theoretical tools can remain underutilized due to the investment of time and resources required to understand and implement them. This surely limits the frequency that new theory is rigorously tested against data by scientists outside the group that developed it, and limits the impact that new tools could have on the advancement of science. Reducing the barriers to adoption through online education and open-source code can bridge the gap between theory and data, forging new collaborations, and advancing science. A pilot venture aimed at increasing the adoption of a new theory of time-variable transit time distributions was begun in July 2015 as a collaboration between Johns Hopkins University and The Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI). There were four main components to the venture: a public online seminar covering the theory, an open source code repository, a virtual short course designed to help participants apply the theory to their data, and an online forum to maintain discussion and build a community of users. 18 participants were selected for the non-public components based on their responses in an application, and were asked to fill out a course evaluation at the end of the short course, and again several months later. These evaluations, along with participation in the forum and on-going contact with the organizer suggest strengths and weaknesses in this combination of components to assist participants in adopting new tools.

  5. LiveCode mobile development

    CERN Document Server

    Lavieri, Edward D

    2013-01-01

    A practical guide written in a tutorial style, "LiveCode Mobile Development Hotshot" walks you step by step through 10 individual projects. Every project is divided into subtasks to make learning more organized and easy to follow, with explanations, diagrams, screenshots, and downloadable material. This book is great for anyone who wants to develop mobile applications using LiveCode. You should be familiar with LiveCode and have access to a smartphone. You are not expected to know how to create graphics or audio clips.

  6. Development of Level-2 PSA Technology: A Development of the Database of the Parametric Source Term for Kori Unit 1 Using the MAAP4 Code

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Chang Soon; Mun, Ju Hyun; Yun, Jeong Ick; Cho, Young Hoo; Kim, Chong Uk [Seoul National University, Seoul (Korea, Republic of)

    1997-07-15

    To quantify the severe accident source term with the parametric model method, the uncertainty of the parameters should be analyzed. Generally, to analyze the uncertainties, the cumulative distribution functions (CDFs) of the parameters are derived. This report introduces a method for deriving the CDFs of the basic parameters FCOR, FVES and FDCH. The source term calculation tool is MAAP version 4.0. The MAAP code contains model parameters that account for uncertain physical and/or chemical phenomena. In general, these parameters take not a single point value but a range of values. Considering this, the input values of the model parameters influencing each basic parameter are sampled using LHS, and the calculation results are presented in cumulative distribution form. As a case study, the CDFs of FCOR, FVES and FDCH for Kori Unit 1 are derived. The target scenarios for the calculation are those whose initiating events are a large LOCA, a small LOCA and a transient, respectively. The distributions of this study are found to be consistent with those of NUREG-1150 and are shown to be adequate for assessing the uncertainties in the severe accident source term of Kori Unit 1. 15 refs., 27 tabs., 4 figs. (author)
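
The LHS step mentioned above is generic Latin hypercube sampling. A minimal sketch on the unit hypercube (not the study's actual MAAP parameter ranges) looks like this:

```python
import random

def lhs(n_samples, n_params, rng=None):
    """Basic Latin hypercube sample on [0,1)^n_params: each parameter's
    range is cut into n_samples strata, one point is drawn per stratum,
    and the strata are randomly paired across parameters."""
    rng = rng or random.Random(42)
    cols = []
    for _ in range(n_params):
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))            # n_samples points of n_params each

pts = lhs(10, 3)
# every parameter gets exactly one point in each tenth of its range
for j in range(3):
    assert sorted(int(p[j] * 10) for p in pts) == list(range(10))
```

Each unit-interval coordinate would then be mapped through the model parameter's own range or distribution before being fed to a MAAP run.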

  7. Coarse mesh code development

    Energy Technology Data Exchange (ETDEWEB)

    Lieberoth, J.

    1975-06-15

    The numerical solution of the neutron diffusion equation plays a very important role in the analysis of nuclear reactors. A wide variety of numerical procedures has been proposed, of which the most frequently used methods are fundamentally based on the finite-difference approximation, where the partial derivatives are approximated by finite differences. For the complex geometries typical of practical reactor problems, the computational accuracy of the finite-difference method is seriously affected by the size of the mesh width relative to the neutron diffusion length and by the heterogeneity of the medium. Thus, a very large number of mesh points is generally required to obtain a reasonably accurate approximate solution of the multi-dimensional diffusion equation. Since the computation time is approximately proportional to the number of mesh points, a detailed multi-dimensional analysis based on the conventional finite-difference method is still expensive even with modern large-scale computers. Accordingly, there is a strong incentive to develop alternatives that can reduce the number of mesh points and still retain accuracy. One promising alternative is the finite element method, which consists of expanding the neutron flux in piecewise polynomials. One of the advantages of this procedure is its flexibility in selecting the locations of the mesh points and the degree of the expansion polynomial. The small number of mesh points of the coarse grid makes it possible to store the results of several of the last outer iterations and to calculate well-extrapolated values from them by convenient formalisms. This holds especially if only one energy distribution of fission neutrons is assumed for all fission processes in the reactor, because then the whole information of an outer iteration is contained in a field of fission rates whose size is the number of mesh points of the coarse grid.
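
For concreteness, the finite-difference scheme discussed above can be sketched for the simplest case: a one-group, one-dimensional diffusion problem -D u'' + Σa u = S with zero flux at both boundaries, discretized on a uniform mesh and solved with the Thomas algorithm. All coefficients are illustrative, not reactor data.

```python
# Finite-difference solve of -D u'' + Sa * u = S on (0, L), u = 0 at the
# ends, via the Thomas algorithm for the resulting tridiagonal system.
D, Sa, S, L, n = 1.0, 0.5, 1.0, 40.0, 400
h = L / (n + 1)
sub = [-D / h**2] * n                  # sub- and super-diagonal entries
diag = [2 * D / h**2 + Sa] * n         # main diagonal
rhs = [S] * n

# forward elimination
for i in range(1, n):
    m = sub[i] / diag[i - 1]
    diag[i] -= m * sub[i - 1]
    rhs[i] -= m * rhs[i - 1]
# back substitution
u = [0.0] * n
u[-1] = rhs[-1] / diag[-1]
for i in range(n - 2, -1, -1):
    u[i] = (rhs[i] - sub[i] * u[i + 1]) / diag[i]

# far from the boundaries the flux approaches the infinite-medium value S/Sa
assert abs(max(u) - S / Sa) < 1e-3
```

The mesh-count problem the text describes is visible even here: halving h doubles the system size in 1-D, and cubes it in 3-D.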

  8. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    Science.gov (United States)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
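
The mechanism can be made concrete with a toy case: for a very sparse binary source emitting at most one '1' per 7-bit block, the 3-bit syndrome of the (7,4) Hamming code identifies each block exactly, compressing 7 bits to 3 without distortion. This is far from the paper's universal scheme, but it shows the source-sequence-as-error-pattern idea.

```python
# Toy syndrome-source-coder: 7-bit blocks with at most one '1' are
# treated as error patterns of the (7,4) Hamming code, so the 3-bit
# syndrome is the compressed data and decoding is coset-leader lookup.
H_cols = [0b001, 0b010, 0b011, 0b100, 0b101, 0b110, 0b111]

def compress(block):
    s = 0
    for bit, col in zip(block, H_cols):
        if bit:
            s ^= col
    return s                            # 3-bit syndrome

def decompress(s):
    block = [0] * 7
    if s:
        block[H_cols.index(s)] = 1      # the weight-<=1 coset leader
    return block

for i in range(7):
    block = [0] * 7
    block[i] = 1
    assert decompress(compress(block)) == block
assert decompress(compress([0] * 7)) == [0] * 7
```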

  9. Java Source Code Analysis for API Migration to Embedded Systems

    Energy Technology Data Exchange (ETDEWEB)

    Winter, Victor [Univ. of Nebraska, Omaha, NE (United States); McCoy, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guerrero, Jonathan [Univ. of Nebraska, Omaha, NE (United States); Reinke, Carl Werner [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perry, James Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java's APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  10. Coupled geochemical and solute transport code development

    International Nuclear Information System (INIS)

    Morrey, J.R.; Hostetler, C.J.

    1985-01-01

    A number of coupled geochemical-hydrologic codes have been reported in the literature. Some of these codes have directly coupled the source-sink term to the solute transport equation. The current consensus seems to be that directly coupling hydrologic transport and chemical models through a series of interdependent differential equations is not feasible for multicomponent problems with complex geochemical processes (e.g., precipitation/dissolution reactions). A two-step process appears to be the required method of coupling codes for problems where a large suite of chemical reactions must be monitored. The two-step structure requires that the source-sink term in the transport equation is supplied by a geochemical code rather than by an analytical expression. We have developed a one-dimensional two-step coupled model designed to calculate relatively complex geochemical equilibria (CTM1D). Our geochemical module implements a Newton-Raphson algorithm to solve heterogeneous geochemical equilibria, involving up to 40 chemical components and 400 aqueous species. The geochemical module was designed to be efficient and compact. A revised version of the MINTEQ code is used as the parent geochemical code.
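
The two-step structure can be sketched in a few lines: each time step first moves solute (transport), then lets a chemistry routine adjust concentrations cell by cell, supplying the source-sink effect. The chemistry below is a stand-in solubility cap, not the Newton-Raphson MINTEQ-style solver; all numbers are illustrative.

```python
# Two-step coupling sketch: per time step, (1) transport moves solute,
# (2) a chemistry step supplies the source-sink term cell by cell.
n, steps = 50, 40
c = [0.0] * n                          # aqueous concentration per cell
solid = [0.0] * n                      # precipitated mass per cell
c_sat = 1.0                            # stand-in solubility limit
inlet = 3.0

for _ in range(steps):
    c = [inlet] + c[:-1]               # step 1: upwind advection, one cell/step
    for i in range(n):                 # step 2: chemistry source-sink term
        if c[i] > c_sat:
            solid[i] += c[i] - c_sat   # precipitate the excess
            c[i] = c_sat

assert max(c) <= c_sat                 # aqueous phase capped at solubility
assert solid[0] > 0                    # precipitate accumulates at the inlet
```

In a real two-step code, the inner loop would call the geochemical module's equilibrium solve for each cell instead of the one-line cap.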

  11. Development of the DTNTES code

    International Nuclear Information System (INIS)

    Ortega Prieto, P.; Morales Dorado, M.D.; Alonso Santos, A.

    1987-01-01

    The DTNTES code has been developed in the Department of Nuclear Technology of the Polytechnical University in Madrid as a part of the Research Program on Quantitative Risk Analysis. DTNTES code calculates several time-dependent probabilistic characteristics of basic events, minimal cut sets and the top event of a fault tree. The code assumes that basic events are statistically independent, and they have failure and repair distributions. It computes the minimal cut upper bound approach for the top event unavailability, and the time-dependent unreliability of the top event by means of different methods, selected by the user. These methods are: expected number of system failures, failure rate, Barlow-Proschan bound, steady-state upper bound, and T* method. (author)

  12. Development of TIME2 code

    International Nuclear Information System (INIS)

    1986-02-01

    The paper reviews the progress on the development of a computer model TIME2, for modelling the long term evolution of shallow burial site environments for low- and intermediate-level radioactive waste disposal. The subject is discussed under the five topic headings: 1) background studies, including geomorphology, climate, human-induced effects, and seismicity, 2) development of the TIME2 code, 3) verification and testing, 4) documentation, and, 5) role of TIME2 in radiological risk assessment. (U.K.)

  13. The development of code benchmarks

    International Nuclear Information System (INIS)

    Glass, R.E.

    1986-01-01

    Sandia National Laboratories has undertaken a code benchmarking effort to define a series of cask-like problems having both numerical solutions and experimental data. The development of the benchmarks includes: (1) model problem definition, (2) code intercomparison, and (3) experimental verification. The first two steps are complete and a series of experiments are planned. The experiments will examine the elastic/plastic behavior of cylinders for both the end and side impacts resulting from a nine meter drop. The cylinders will be made from stainless steel and aluminum to give a range of plastic deformations. This paper presents the results of analyses simulating the model's behavior using materials properties for stainless steel and aluminum

  14. Source Code Stylometry Improvements in Python

    Science.gov (United States)

    2017-12-14

    Just as a person can be identified via their handwriting, or an author identified by their style of prose, programmers can be identified by their code. Provided a labelled training set of code samples, with features such as those drawn from the abstract syntax tree, the techniques used in stylometry can identify the author of a piece of code (Caliskan-Islam et al. 2015).

  15. Source Code Vulnerabilities in IoT Software Systems

    Directory of Open Access Journals (Sweden)

    Saleh Mohamed Alnaeli

    2017-08-01

    Full Text Available An empirical study that examines the usage of known vulnerable statements in software systems developed in C/C++ and used for IoT is presented. The study is conducted on 18 open source systems comprising millions of lines of code and containing thousands of files. Static analysis methods are applied to each system to determine the number of unsafe commands (e.g., strcpy, strcmp, and strlen) that are well known among research communities to cause potential risks and security concerns, thereby decreasing a system’s robustness and quality. These unsafe statements are banned by many companies (e.g., Microsoft). The use of these commands should be avoided from the start when writing code and should be removed from legacy code over time, as recommended by new C/C++ language standards. Each system is analyzed and the distribution of the known unsafe commands is presented. Historical trends in the usage of the unsafe commands of 7 of the systems are presented to show how the studied systems evolved over time with respect to the vulnerable code. The results show that the most prevalent unsafe command used for most systems is memcpy, followed by strlen. These results can be used to help train software developers on secure coding practices so that they can write higher quality software systems.
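    A minimal version of the static scan described above can be sketched as a regular-expression pass over C source text. This is a generic illustration, not the study's tooling; production analyzers parse the code, whereas this sketch will also match calls inside comments and string literals:

```python
import re
from collections import Counter

# Function names taken from the abstract's examples of known unsafe C calls.
UNSAFE = ("strcpy", "strcat", "sprintf", "gets", "memcpy", "strcmp", "strlen")
CALL_RE = re.compile(r"\b(%s)\s*\(" % "|".join(UNSAFE))

def count_unsafe_calls(c_source: str) -> Counter:
    """Count occurrences of known risky C library calls in a source string."""
    return Counter(m.group(1) for m in CALL_RE.finditer(c_source))
```

Running such a counter per file and aggregating over releases yields exactly the kind of distribution and historical trend the study reports.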

  16. Joint Source-Channel Coding by Means of an Oversampled Filter Bank Code

    Directory of Open Access Journals (Sweden)

    Marinkovic Slavica

    2006-01-01

    Full Text Available Quantized frame expansions based on block transforms and oversampled filter banks (OFBs) have been considered recently as joint source-channel codes (JSCCs) for erasure and error-resilient signal transmission over noisy channels. In this paper, we consider a coding chain involving an OFB-based signal decomposition followed by scalar quantization and a variable-length code (VLC) or a fixed-length code (FLC). This paper first examines the problem of channel error localization and correction in quantized OFB signal expansions. The error localization problem is treated as an M-ary hypothesis testing problem. The likelihood values are derived from the joint pdf of the syndrome vectors under various hypotheses of impulse noise positions, and in a number of consecutive windows of the received samples. The error amplitudes are then estimated by solving the syndrome equations in the least-squares sense. The message signal is reconstructed from the corrected received signal by a pseudoinverse receiver. We then improve the error localization procedure by introducing per-symbol reliability information in the hypothesis testing procedure of the OFB syndrome decoder. The per-symbol reliability information is produced by the soft-input soft-output (SISO) VLC/FLC decoders. This leads to the design of an iterative algorithm for joint decoding of an FLC and an OFB code. The performance of the algorithms developed is evaluated in a wavelet-based image coding system.

  17. Bit rates in audio source coding

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.

    1992-01-01

    The goal is to introduce and solve the audio coding optimization problem. Psychoacoustic results such as masking and excitation pattern models are combined with results from rate distortion theory to formulate the audio coding optimization problem. The solution of the audio optimization problem is a

  18. Source SDK development essentials

    CERN Document Server

    Bernier, Brett

    2014-01-01

    The Source Authoring Tools are the pieces of software used to create custom content for games made with Valve's Source engine. Creating mods and maps for your games without any programming knowledge can be time consuming. These tools allow you to create your own maps and levels without the need for any coding knowledge. All the tools that you need to start creating your own levels are built-in and ready to go! This book will teach you how to use the Authoring Tools provided with Source games and will guide you in creating your first maps and mods (modifications) using Source. You will learn ho

  19. Rate-adaptive BCH coding for Slepian-Wolf coding of highly correlated sources

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Salmistraro, Matteo; Larsen, Knud J.

    2012-01-01

    This paper considers using BCH codes for distributed source coding using feedback. The focus is on coding using short block lengths for a binary source, X, having a high correlation between each symbol to be coded and a side information, Y, such that the marginal probability of each symbol, Xi in X, given Y is highly skewed. In the analysis, noiseless feedback and noiseless communication are assumed. A rate-adaptive BCH code is presented and applied to distributed source coding. Simulation results for a fixed error probability show that rate-adaptive BCH achieves better performance than LDPCA (Low-Density Parity-Check Accumulate) codes for high correlation between source symbols and the side information.
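    The syndrome principle behind this kind of distributed source coding can be illustrated with a fixed-rate code. The sketch below uses a (7,4) Hamming code rather than the paper's rate-adaptive BCH construction, and assumes the side information Y differs from the source block X in at most one bit:

```python
# Parity-check matrix of the (7,4) Hamming code: column j is the binary
# expansion of j + 1, so the syndrome of a single-bit error encodes its index.
H = [[(j + 1) >> i & 1 for j in range(7)] for i in range(3)]

def syndrome(bits):
    return [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]

def encode(x):
    """Encoder: transmit only the 3-bit syndrome of the 7-bit source block."""
    return syndrome(x)

def decode(s, y):
    """Decoder: recover x from the received syndrome and the side information y,
    assuming x and y differ in at most one position."""
    diff = [(a + b) % 2 for a, b in zip(syndrome(y), s)]
    idx = sum(bit << i for i, bit in enumerate(diff))  # 0 means y already equals x
    x_hat = list(y)
    if idx:
        x_hat[idx - 1] ^= 1
    return x_hat
```

Rate adaptivity, as in the paper, amounts to letting the decoder request additional syndrome bits over the feedback channel when decoding fails.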

  20. Development and validation of sodium fire codes

    International Nuclear Information System (INIS)

    Morii, Tadashi; Himeno, Yoshiaki; Miyake, Osamu

    1989-01-01

    Development, verification, and validation of the spray fire code, SPRAY-3M, the pool fire codes, SOFIRE-M2 and SPM, the aerosol behavior code, ABC-INTG, and the simultaneous spray and pool fires code, ASSCOPS, are presented. In addition, the state-of-the-art of development of the multi-dimensional natural convection code, SOLFAS, for the analysis of heat-mass transfer during a fire, is presented. (author)

  1. Reactor safety computer code development at INEL

    International Nuclear Information System (INIS)

    Johnsen, G.W.

    1985-01-01

    This report provides a brief overview of the computer code development programs being conducted at EG and G Idaho, Inc. on behalf of US Nuclear Regulatory Commission and the Department of Energy, Idaho Operations Office. Included are descriptions of the codes being developed, their development status as of the date of this report, and resident code development expertise

  2. Data processing with microcode designed with source coding

    Science.gov (United States)

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.

  3. Repairing business process models as retrieved from source code

    NARCIS (Netherlands)

    Fernández-Ropero, M.; Reijers, H.A.; Pérez-Castillo, R.; Piattini, M.; Nurcan, S.; Proper, H.A.; Soffer, P.; Krogstie, J.; Schmidt, R.; Halpin, T.; Bider, I.

    2013-01-01

    The static analysis of source code has become a feasible solution to obtain underlying business process models from existing information systems. Due to the fact that not all information can be automatically derived from source code (e.g., consider manual activities), such business process models

  4. Development of MCNP interface code in HFETR

    International Nuclear Information System (INIS)

    Qiu Liqing; Fu Rong; Deng Caiyu

    2007-01-01

    In order to describe the HFETR core with the MCNP method, the interface code MCNPIP between HFETR and the MCNP code has been developed. This paper introduces the core DXSY and the flowchart of the MCNPIP code, the handling of the compositions of fuel elements, and the requirements on hardware and software. Finally, the MCNPIP code is validated through practical application. (authors)

  5. Development of a tritium dispersion code

    International Nuclear Information System (INIS)

    Bell, R.P.; Davis, M.W.; Joseph, S.; Wong, K.Y.

    1985-01-01

    This paper describes the development and verification of a computer code designed to calculate the radiation dose to man following acute or chronic atmospheric releases of tritium gas and oxide from a point source. The Ontario Hydro Tritium Dispersion Code calculates tritium concentrations in air, soil, and vegetation, and doses to man resulting from inhalation/immersion and ingestion of food, milk, meat and water. The deposition of HT to soil, conversion of HT to HTO by soil enzymes, and resuspension of HTO to air have been incorporated into the terrestrial compartment model and are unique features of the code. Sensitivity analysis has identified the HT deposition velocity and the equivalent water depth of the vegetation compartment as two parameters which have a strong influence on dose calculations. Tritium concentrations in vegetation and soil calculated by the code were in reasonable agreement with experimental results. The radiological significance of including the mechanisms of HT-to-HTO conversion and resuspension of HTO to air is illustrated.
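    The HT deposition / HT-to-HTO conversion / resuspension chain described above can be sketched as a few coupled rate equations stepped forward in time. This toy two-compartment model is not the Ontario Hydro code; every rate constant below is an invented placeholder:

```python
def simulate_ht_cycle(c_air_ht, v_dep=4e-4, k_conv=0.3, k_resusp=0.05,
                      depth=0.1, dt=0.01, t_end=24.0):
    """Toy sketch of the HT -> HTO pathway with forward-Euler time stepping.

    HT deposits from air to a soil layer (deposition velocity v_dep, m/h),
    soil enzymes oxidize HT to HTO (rate k_conv, 1/h), and HTO is re-emitted
    to air (rate k_resusp, 1/h). Units and values are illustrative only.
    """
    soil_ht = soil_hto = air_hto = 0.0
    t = 0.0
    while t < t_end:
        dep = v_dep * c_air_ht / depth   # HT flux into the soil layer
        conv = k_conv * soil_ht          # HT -> HTO conversion in soil
        resusp = k_resusp * soil_hto     # HTO resuspended back to air
        soil_ht += (dep - conv) * dt
        soil_hto += (conv - resusp) * dt
        air_hto += resusp * depth * dt
        t += dt
    return soil_ht, soil_hto, air_hto
```

A sensitivity analysis like the paper's amounts to re-running such a model while perturbing v_dep and the other parameters one at a time.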

  6. COMPASS: A source term code for investigating capillary barrier performance

    International Nuclear Information System (INIS)

    Zhou, Wei; Apted, J.J.

    1996-01-01

    A computer code, COMPASS, based on a compartment-model approach has been developed to calculate the near-field source term of a high-level-waste repository under unsaturated conditions. COMPASS is applied to evaluate the expected performance of Richard's (capillary) barriers as backfills to divert infiltrating groundwater at Yucca Mountain. A comparison of the release rates of four typical nuclides with and without the Richard's barrier shows that the barrier significantly decreases the peak release rates from the Engineered Barrier System (EBS) into the host rock.

  7. Monte Carlo code development in Los Alamos

    International Nuclear Information System (INIS)

    Carter, L.L.; Cashwell, E.D.; Everett, C.J.; Forest, C.A.; Schrandt, R.G.; Taylor, W.M.; Thompson, W.L.; Turner, G.D.

    1974-01-01

    The present status of Monte Carlo code development at Los Alamos Scientific Laboratory is discussed. A brief summary is given of several of the most important neutron, photon, and electron transport codes. 17 references. (U.S.)

  8. Recent activities in accelerator code development

    International Nuclear Information System (INIS)

    Copper, R.K.; Ryne, R.D.

    1992-01-01

    In this paper we will review recent activities in the area of code development as it affects the accelerator community. We will first discuss the changing computing environment. We will review how the computing environment has changed in the last 10 years, with emphasis on computing power, operating systems, computer languages, graphics standards, and massively parallel processing. Then we will discuss recent code development activities in the areas of electromagnetics codes and beam dynamics codes

  9. Recent developments in the Los Alamos radiation transport code system

    International Nuclear Information System (INIS)

    Forster, R.A.; Parsons, K.

    1997-01-01

    A brief progress report on updates to the Los Alamos Radiation Transport Code System (LARTCS) for solving criticality and fixed-source problems is provided. LARTCS integrates the Diffusion Accelerated Neutral Transport (DANT) discrete ordinates codes with the Monte Carlo N-Particle (MCNP) code. The LARTCS code is being developed with a graphical user interface for problem setup and analysis. Progress in the DANT system for criticality applications includes a two-dimensional module which can be linked to a mesh-generation code, and a faster iteration scheme. Updates to MCNP Version 4A allow statistical checks of calculated Monte Carlo results.

  10. Iterative List Decoding of Concatenated Source-Channel Codes

    Directory of Open Access Journals (Sweden)

    Hedayat Ahmadreza

    2005-01-01

    Full Text Available Whenever variable-length entropy codes are used in the presence of a noisy channel, any channel errors will propagate and cause significant harm. Despite using channel codes, some residual errors always remain, whose effect will get magnified by error propagation. Mitigating this undesirable effect is of great practical interest. One approach is to use the residual redundancy of variable-length codes for joint source-channel decoding. In this paper, we improve the performance of residual-redundancy source-channel decoding via an iterative list decoder made possible by a nonbinary outer CRC code. We show that the list decoding of VLCs is beneficial for entropy codes that contain redundancy. Such codes are used in state-of-the-art video coders, for example. The proposed list decoder improves the overall performance significantly in AWGN and fully interleaved Rayleigh fading channels.

  11. Status of SPACE Safety Analysis Code Development

    International Nuclear Information System (INIS)

    Lee, Dong Hyuk; Yang, Chang Keun; Kim, Se Yun; Ha, Sang Jun

    2009-01-01

    In 2006, the Korean nuclear industry started developing a thermal-hydraulic analysis code for the safety analysis of PWRs (Pressurized Water Reactors). The new code is named SPACE (Safety and Performance Analysis Code for Nuclear Power Plant). The SPACE code can solve the two-fluid, three-field governing equations in one-dimensional or three-dimensional geometry. The SPACE code has many component models required for modeling a PWR, such as the reactor coolant pump, safety injection tank, etc. The programming language used in the new code is C++, for a new generation of engineers who are more comfortable with C/C++ than the older FORTRAN language. This paper describes the general characteristics of the SPACE code and the current status of SPACE code development.

  12. Development and application of the waste code

    International Nuclear Information System (INIS)

    Morison, I.W.

    1984-01-01

    This paper discusses the objectives and general approach underlying the Australian Code of Practice on the Management of Radioactive Wastes arising from the Mining and Milling of Radioactive Ores 1982. Background to the development of the Code is provided and the guidelines which supplement the Code are considered

  13. Development of computer code in PNC, 3

    International Nuclear Information System (INIS)

    Ohtaki, Akira; Ohira, Hiroaki

    1990-01-01

    Super-COPD, a code built up from calculation modules, has been developed by improving COPD in order to evaluate various kinds of LMFBR plant dynamics. The code incorporates all the models of COPD, together with their advanced versions, in a modular structure. The code makes it possible to simulate the system dynamics of LMFBR plants of any configuration and components. (author)

  14. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    Natarajan Meghanathan

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  15. Development of authentication code for multi-access optical code division multiplexing based quantum key distribution

    Science.gov (United States)

    Taiwo, Ambali; Alnassar, Ghusoon; Bakar, M. H. Abu; Khir, M. F. Abdul; Mahdi, Mohd Adzir; Mokhtar, M.

    2018-05-01

    A one-weight authentication code for multi-user quantum key distribution (QKD) is proposed. The code is developed for an Optical Code Division Multiplexing (OCDMA) based QKD network. A unique address assigned to each individual user, coupled with the degrading probability of predicting the source of the qubit transmitted in the channel, offers an excellent security mechanism against any form of channel attack on an OCDMA-based QKD network. Flexibility in design, as well as ease of modifying the number of users, are equally exceptional qualities presented by the code, in contrast to the Optical Orthogonal Code (OOC) earlier implemented for the same purpose. The code was successfully applied to eight simultaneous users at an effective key rate of 32 bps over a 27 km transmission distance.

  16. Distributed Remote Vector Gaussian Source Coding with Covariance Distortion Constraints

    DEFF Research Database (Denmark)

    Zahedi, Adel; Østergaard, Jan; Jensen, Søren Holdt

    2014-01-01

    In this paper, we consider a distributed remote source coding problem, where a sequence of observations of source vectors is available at the encoder. The problem is to specify the optimal rate for encoding the observations subject to a covariance matrix distortion constraint and in the presence...

  17. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    Following the relevant technical standards (e.g., IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost and effort, a tool should be used which is developed independently of the development of the code generator. For this purpose, ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  18. Blahut-Arimoto algorithm and code design for action-dependent source coding problems

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Simeone, Osvaldo; Popovski, Petar

    2013-01-01

    The source coding problem with action-dependent side information at the decoder has recently been introduced to model data acquisition in resource-constrained systems. In this paper, an efficient Blahut-Arimoto-type algorithm for the numerical computation of the rate-distortion-cost function for this problem is proposed. Moreover, a simplified two-stage code structure based on multiplexing is put forth, whereby the first stage encodes the actions and the second stage is composed of an array of classical Wyner-Ziv codes, one for each action. Leveraging this structure, specific coding/decoding strategies are designed based on LDGM codes and message passing. Through numerical examples, the proposed code design is shown to achieve performance close to the rate-distortion-cost function.
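    The classic Blahut-Arimoto recursion that the paper builds on can be sketched for the plain rate-distortion function; the action-dependent version adds cost terms and an action stage on top of this basic alternating update:

```python
import math

def blahut_arimoto_rd(p_x, dist, beta, n_iter=200):
    """Blahut-Arimoto iteration for the rate-distortion function of a
    discrete source.

    p_x is the source pmf, dist[i][j] the distortion of reproducing symbol i
    by j, and beta the Lagrange multiplier trading rate against distortion.
    Returns the (rate, distortion) point on the R(D) curve for this beta.
    """
    n, m = len(p_x), len(dist[0])
    q = [1.0 / m] * m  # reproduction marginal, updated each iteration
    for _ in range(n_iter):
        # Optimal test channel: p(j|i) proportional to q(j) * exp(-beta * d(i, j))
        cond = []
        for i in range(n):
            w = [q[j] * math.exp(-beta * dist[i][j]) for j in range(m)]
            z = sum(w)
            cond.append([wj / z for wj in w])
        q = [sum(p_x[i] * cond[i][j] for i in range(n)) for j in range(m)]
    rate = sum(p_x[i] * cond[i][j] * math.log2(cond[i][j] / q[j])
               for i in range(n) for j in range(m) if cond[i][j] > 0)
    avg_dist = sum(p_x[i] * cond[i][j] * dist[i][j]
                   for i in range(n) for j in range(m))
    return rate, avg_dist
```

For a uniform binary source with Hamming distortion, the computed points satisfy the known closed form R(D) = 1 - h(D), which makes the sketch easy to sanity-check.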

  19. Distributed coding of multiview sparse sources with joint recovery

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Deligiannis, Nikos; Forchhammer, Søren

    2016-01-01

    In support of applications involving multiview sources in distributed object recognition using lightweight cameras, we propose a new method for the distributed coding of sparse sources as visual descriptor histograms extracted from multiview images. The problem is challenging due to the computati... transform (SIFT) descriptors extracted from multiview images shows that our method leads to bit-rate saving of up to 43% compared to the state-of-the-art distributed compressed sensing method with independent encoding of the sources.

  20. Preliminary investigation study of code of developed country for developing Korean fuel cycle code

    International Nuclear Information System (INIS)

    Jeong, Chang Joon; Ko, Won Il; Lee, Ho Hee; Cho, Dong Keun; Park, Chang Je

    2012-01-01

    In order to develop a Korean fuel cycle code, analyses have been performed with the fuel cycle codes used in advanced countries, and recommendations were proposed for future development. The fuel cycle codes examined are as follows: VISTA, which was developed by the IAEA; DANESS, which was developed by ANL and LISTO; and VISION, developed by INL for the Advanced Fuel Cycle Initiative (AFCI) system analysis. The recommendations cover software, program scheme, material flow model, isotope decay model, environmental impact analysis model, and economics analysis model. These items will be used for the development of the Korean nuclear fuel cycle code in the future.

  1. Source Code Verification for Embedded Systems using Prolog

    Directory of Open Access Journals (Sweden)

    Frank Flederer

    2017-01-01

    Full Text Available System-relevant embedded software needs to be reliable and, therefore, well tested, especially for aerospace systems. A common technique to verify programs is the analysis of their abstract syntax tree (AST). Tree structures can be elegantly analyzed with the logic programming language Prolog. Moreover, Prolog offers further advantages for a thorough analysis: on the one hand, it natively provides versatile options to efficiently process tree or graph data structures. On the other hand, Prolog's non-determinism and backtracking ease testing of different variations of the program flow without much effort. A rule-based approach with Prolog allows the verification goals to be characterized in a concise and declarative way. In this paper, we describe our approach to verify the source code of a flash file system with the help of Prolog. The flash file system is written in C++ and has been developed particularly for use in satellites. We transform a given abstract syntax tree of C++ source code into Prolog facts and derive the call graph and the execution sequence (tree), which are then further tested against verification goals. The different program flow branchings due to control structures are derived by backtracking as subtrees of the full execution sequence. Finally, these subtrees are verified in Prolog. We illustrate our approach with a case study, where we search for incorrect applications of semaphores in embedded software using the real-time operating system RODOS. We rely on computation tree logic (CTL) and have designed an embedded domain-specific language (DSL) in Prolog to express the verification goals.
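    The translation of an AST into Prolog facts can be sketched with Python's `ast` module standing in for the paper's C++ front end. The node/2 and edge/2 predicate names are invented for illustration, not the paper's actual fact schema:

```python
import ast

def ast_to_facts(source):
    """Emit Prolog-style facts describing the AST of a piece of source code.

    Each AST node becomes a node(Id, Kind) fact and each parent-child
    relation an edge(Parent, Child) fact, which a Prolog rule base could
    then query, e.g. to reconstruct call graphs or execution-sequence trees.
    """
    tree = ast.parse(source)
    facts, ids = [], {}
    for i, node in enumerate(ast.walk(tree)):
        ids[id(node)] = i
        facts.append(f"node({i}, {type(node).__name__.lower()}).")
    for node in ast.walk(tree):
        for child in ast.iter_child_nodes(node):
            facts.append(f"edge({ids[id(node)]}, {ids[id(child)]}).")
    return facts
```

Loading the emitted facts into a Prolog system then lets declarative rules (e.g., CTL-style reachability checks) run over the program structure.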

  2. Code of conduct on the safety and security of radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    The objective of this Code is to achieve and maintain a high level of safety and security of radioactive sources through the development, harmonization and enforcement of national policies, laws and regulations, and through the fostering of international co-operation. In particular, this Code addresses the establishment of an adequate system of regulatory control from the production of radioactive sources to their final disposal, and a system for the restoration of such control if it has been lost.

  3. Code of conduct on the safety and security of radioactive sources

    International Nuclear Information System (INIS)

    2001-03-01

    The objective of this Code is to achieve and maintain a high level of safety and security of radioactive sources through the development, harmonization and enforcement of national policies, laws and regulations, and through the fostering of international co-operation. In particular, this Code addresses the establishment of an adequate system of regulatory control from the production of radioactive sources to their final disposal, and a system for the restoration of such control if it has been lost

  4. NASA space radiation transport code development consortium

    International Nuclear Information System (INIS)

    Townsend, L. W.

    2005-01-01

    Recently, NASA established a consortium involving the Univ. of Tennessee (lead institution), the Univ. of Houston, Roanoke College and various government and national laboratories, to accelerate the development of a standard set of radiation transport computer codes for NASA human exploration applications. This effort involves further improvements of the Monte Carlo codes HETC and FLUKA and the deterministic code HZETRN, including developing nuclear reaction databases necessary to extend the Monte Carlo codes to carry out heavy ion transport, and extending HZETRN to three dimensions. The improved codes will be validated by comparing predictions with measured laboratory transport data, provided by an experimental measurements consortium, and measurements in the upper atmosphere on the balloon-borne Deep Space Test Bed (DSTB). In this paper, we present an overview of the consortium members and the current status and future plans of consortium efforts to meet the research goals and objectives of this extensive undertaking. (authors)

  5. Source Coding for Wireless Distributed Microphones in Reverberant Environments

    DEFF Research Database (Denmark)

    Zahedi, Adel

    2016-01-01

    Modern multimedia systems are more and more shifting toward distributed and networked structures. This includes audio systems, where networks of wireless distributed microphones are replacing the traditional microphone arrays. This allows for flexibility of placement and high spatial diversity. However, it comes with the price of several challenges, including the limited power and bandwidth resources for wireless transmission of audio recordings. In such a setup, we study the problem of source coding for the compression of the audio recordings before the transmission in order to reduce the power consumption and/or transmission bandwidth by reduction in the transmission rates. Source coding for wireless microphones in reverberant environments has several special characteristics which make it more challenging in comparison with regular audio coding. The signals which are acquired by the microphones

  6. Plagiarism Detection Algorithm for Source Code in Computer Science Education

    Science.gov (United States)

    Liu, Xin; Xu, Chan; Ouyang, Boyu

    2015-01-01

    Nowadays, computer programming is getting more necessary in program design courses in college education. However, the trick of plagiarizing plus a little modification exists in some students' homework. It is not easy for teachers to judge whether source code has been plagiarized or not. Traditional detection algorithms cannot fit this…
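    A common family of source-code plagiarism detectors compares fingerprints of normalized token k-grams, so that renaming variables does not hide a copy. The sketch below is a generic illustration of that idea, not the algorithm proposed in this article:

```python
import re

def fingerprints(code, k=5):
    """Hash the set of k-grams of a normalized token stream.

    Real systems (e.g., MOSS-style winnowing) additionally select a subset
    of the hashes; this sketch keeps them all.
    """
    tokens = re.findall(r"[A-Za-z_]\w*|\S", code.lower())
    # Crude normalization: every identifier collapses to a placeholder,
    # so renamed variables still produce the same token stream.
    tokens = ["id" if t[0].isalpha() or t[0] == "_" else t for t in tokens]
    return {hash(tuple(tokens[i:i + k])) for i in range(len(tokens) - k + 1)}

def similarity(code_a, code_b, k=5):
    """Jaccard similarity of the two fingerprint sets, in [0, 1]."""
    fa, fb = fingerprints(code_a, k), fingerprints(code_b, k)
    return len(fa & fb) / len(fa | fb) if fa | fb else 1.0
```

Two submissions that differ only in identifier names score 1.0 under this measure, while structurally unrelated code scores near 0.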

  7. Automating RPM Creation from a Source Code Repository

    Science.gov (United States)

    2012-02-01

    Fragments of an RPM .spec file survive in this record: the section markers %pre, %prep, %setup, and %build; a build recipe (./autogen.sh ; ./configure --with-db=/apps/db --with-libpq=/apps/postgres ; make); and install steps (rm -rf $RPM_BUILD_ROOT ; umask 0077 ; mkdir -p $RPM_BUILD_ROOT/usr/local/bin). Together they illustrate automating RPM creation from a source code repository.

  8. Source Coding in Networks with Covariance Distortion Constraints

    DEFF Research Database (Denmark)

    Zahedi, Adel; Østergaard, Jan; Jensen, Søren Holdt

    2016-01-01

    results to a joint source coding and denoising problem. We consider a network with a centralized topology and a given weighted sum-rate constraint, where the received signals at the center are to be fused to maximize the output SNR while enforcing no linear distortion. We show that one can design...

  9. Comparing the co-evolution of production and test code in open source and industrial developer test processes through repository mining

    NARCIS (Netherlands)

    Van Rompaey, B.; Zaidman, A.E.; Van Deursen, A.; Demeyer, S.

    2008-01-01

    This paper represents an extension to our previous work: Mining software repositories to study coevolution of production & test code. Proceedings of the International Conference on Software Testing, Verification, and Validation (ICST), IEEE Computer Society, 2008; doi:10.1109/ICST.2008.47

  10. CATHARE code development and assessment methodologies

    International Nuclear Information System (INIS)

    Micaelli, J.C.; Barre, F.; Bestion, D.

    1995-01-01

    The CATHARE thermal-hydraulic code has been developed jointly by Commissariat a l'Energie Atomique (CEA), Electricite de France (EdF), and Framatome for safety analysis. Since the beginning of the project (September 1979), development and assessment activities have followed a methodology supported by two series of experimental tests: separate effects tests and integral effects tests. The purpose of this paper is to describe this methodology, the code assessment status, and the evolution to take into account two new components of this program: the modeling of three-dimensional phenomena and the requirements of code uncertainty evaluation

  11. Particle beam source development

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    Electron beam research directed toward providing improved in-diode pinched beam sources and establishing the efficiency and feasibility for superposition of many beams progressed in three major areas. Focusing stability has been improved from large effective aspect ratio (radius/gap of emitting surface) diodes. Substantial progress toward establishing the feasibility of combining beams guided along ionized current-carrying channels has been made. Two beams have been transported and overlayed on a target. Theoretical and experimental measurements on channel formation have resulted in specifications for the capacitor bank channel initiation system for a 12-beam combination experiment on Proto II. An additional area of beam research has been the development of a small pulsed X-ray source to yield high quality flash X-radiography of pellets. A source yielding approximately 100-μm resolution of objects has been demonstrated and work continues to improve the convenience and reliability of this source. The effort to extend the capability of higher power conventional pulse power generators to accelerate ions (rather than electrons), and assess the feasibility of this technology variation for target experiments and reactors has progressed. Progress toward development of a multistage accelerator for ions with pulse power technology centered on development of a new laboratory facility and design and procurement of hardware for a five-stage test apparatus for the Pulslac concept

  12. Computer code development plan for SMART design

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H.

    1999-03-01

    In accordance with the localization plan for nuclear reactor design pursued since the mid-1980s, various computer codes have been transferred to the Korean nuclear industry through technology transfer programs with the major worldwide pressurized water reactor suppliers or through international code development programs. These computer codes have been utilized successfully in reactor and reload core design work, and the related design technologies have been accumulated satisfactorily. However, activities to develop native codes to replace some important computer codes whose usage is restricted by the original owners of the technology have been carried out rather poorly. Thus, the first priority is to secure native expertise in the computer code package and analysis methodology in order to establish the capability required for the independent design of our own reactor model. Moreover, unlike large-capacity loop-type commercial reactors, the SMART (System-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has distinctive design characteristics such as a self-controlled gas pressurizer, helical steam generators, and a passive residual heat removal system. Given these characteristics, part of the design can be performed with the computer codes used for loop-type commercial reactor design; however, most of those codes are not directly applicable to the design of an integral reactor such as SMART, and they should be modified to deal with its distinctive design features. In addition to these modification efforts, various new codes should be developed in several design areas. Furthermore, the reliability of the modified or newly developed codes should be verified through benchmarking or tests for the target design.
Thus, it is necessary to proceed the design according to the

  13. Computer code development plan for SMART design

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H

    1999-03-01

    In accordance with the localization plan for nuclear reactor design pursued since the mid-1980s, various computer codes have been transferred to the Korean nuclear industry through technology transfer programs with the major worldwide pressurized water reactor suppliers or through international code development programs. These computer codes have been utilized successfully in reactor and reload core design work, and the related design technologies have been accumulated satisfactorily. However, activities to develop native codes to replace some important computer codes whose usage is restricted by the original owners of the technology have been carried out rather poorly. Thus, the first priority is to secure native expertise in the computer code package and analysis methodology in order to establish the capability required for the independent design of our own reactor model. Moreover, unlike large-capacity loop-type commercial reactors, the SMART (System-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has distinctive design characteristics such as a self-controlled gas pressurizer, helical steam generators, and a passive residual heat removal system. Given these characteristics, part of the design can be performed with the computer codes used for loop-type commercial reactor design; however, most of those codes are not directly applicable to the design of an integral reactor such as SMART, and they should be modified to deal with its distinctive design features. In addition to these modification efforts, various new codes should be developed in several design areas. Furthermore, the reliability of the modified or newly developed codes should be verified through benchmarking or tests for the target design.
Thus, it is necessary to proceed the design according to the

  14. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

    VHTR, one of the Generation IV reactor concepts, has a relatively high operating temperature and is often proposed as a heat source for many industrial processes, including hydrogen production. Tritium is therefore a crucial safety issue in this reactor system, and it is vital to trace tritium behavior in the VHTR and the potential permeation rate into the coupled industrial process; understanding tritium behavior, and developing a tool that enables this, is essential. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, has been developed using the chemical process code gPROMS. The code has several distinctive features, including a non-diluted assumption, flexible application, and the adoption of a distributed permeation model. Owing to these features, BOTANIC can analyze systems over a wide range of tritium levels and achieves higher accuracy through its capacity to solve distributed models. BOTANIC was successfully developed and verified against analytical solutions and benchmark codes, namely the Tritium Permeation Analysis Code (TPAC) and COMSOL; the results showed very good agreement with the analytical solutions and the calculation results of TPAC and COMSOL. Future work will focus on total-system verification.

  15. Coded aperture imaging of alpha source spatial distribution

    International Nuclear Information System (INIS)

    Talebitaher, Alireza; Shutler, Paul M.E.; Springham, Stuart V.; Rawat, Rajdeep S.; Lee, Paul

    2012-01-01

    The Coded Aperture Imaging (CAI) technique has been applied with CR-39 nuclear track detectors to image alpha particle source spatial distributions. The experimental setup comprised: a 226Ra source of alpha particles, a laser-machined CAI mask, and CR-39 detectors, arranged inside a vacuum enclosure. Three different alpha particle source shapes were synthesized by using a linear translator to move the 226Ra source within the vacuum enclosure. The coded mask pattern used is based on a Singer Cyclic Difference Set, with 400 pixels and 57 open square holes (representing ρ = 1/7 = 14.3% open fraction). After etching of the CR-39 detectors, the area, circularity, mean optical density and positions of all candidate tracks were measured by an automated scanning system. Appropriate criteria were used to select alpha particle tracks, and a decoding algorithm applied to the (x, y) data produced the decoded image of the source. Signal to Noise Ratio (SNR) values obtained for alpha particle CAI images were found to be substantially better than those for corresponding pinhole images, although the CAI-SNR values were below the predictions of theoretical formulae. Monte Carlo simulations of CAI and pinhole imaging were performed in order to validate the theoretical SNR formulae and also our CAI decoding algorithm. There was found to be good agreement between the theoretical formulae and SNR values obtained from simulations. Possible reasons for the lower SNR obtained for the experimental CAI study are discussed.
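
The decoding step described in this record, correlating the recorded track positions with a pattern derived from the mask, can be sketched on a toy one-dimensional cyclic difference set. The (7, 3, 1) set, the source distribution, and all names below are illustrative only, not the paper's actual 400-pixel Singer set or data:

```python
import numpy as np

v, k, lam = 7, 3, 1                 # parameters of a (7, 3, 1) cyclic difference set
mask = np.zeros(v)
mask[[1, 2, 4]] = 1                 # open holes of the coded mask

source = np.array([0.0, 5.0, 0.0, 2.0, 0.0, 0.0, 0.0])   # toy emitter distribution

# the detector records the circular correlation of the source with the mask
detector = np.array([sum(source[i] * mask[(i + j) % v] for i in range(v))
                     for j in range(v)])

# decoding: correlate with the mask again; the difference-set property gives
# raw = (k - lam) * source + lam * total, where total = detector.sum() / k
raw = np.array([sum(detector[j] * mask[(j + m) % v] for j in range(v))
                for m in range(v)])
decoded = (raw - lam * detector.sum() / k) / (k - lam)
```

The flat background term `lam * total` is why the autocorrelation of a difference-set mask is a delta function on a pedestal; subtracting it recovers the source exactly in the noise-free case.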

  16. NSLS source development laboratory

    International Nuclear Information System (INIS)

    Ben-Zvi, I.; Blum, E.; Johnson, E.D.

    1995-01-01

    The National Synchrotron Light Source (NSLS) has initiated an ambitious project to develop fourth generation radiation sources. To achieve this goal, the Source Development Laboratory (SDL) builds on the experience gained at the NSLS, and at the highly successful BNL Accelerator Test Facility. The SDL accelerator system will consist of a high brightness short pulse linac, a station for coherent synchrotron and transition radiation experiments, a short bunch storage ring, and an ultra-violet free electron laser utilizing the NISUS wiggler. The electrons will be provided by a laser photocathode gun feeding a 210 MeV S-band electron linac, with magnetic bunch compression at 80 MeV. Electron bunches as short as 100 μm with 1 nC charge will be used for pump-probe experiments utilizing coherent transition radiation. Beam will also be injected into a compact storage ring which will be a source of millimeter wave coherent synchrotron radiation. The linac will also serve as the driver for an FEL designed to allow the study of various aspects of single pass amplifiers. The first FEL configuration will be as a self-amplified spontaneous emission (SASE) FEL at 900 nm. Seeded beam and sub-harmonic seeded beam operations will push the output wavelength below 200 nm. Chirped pulse amplification (CPA) operation will also be possible, and a planned energy upgrade (by powering a fifth linac section) to 310 MeV will extend the wavelength range of the FEL to below 100 nm

  17. Distributed Source Coding Techniques for Lossless Compression of Hyperspectral Images

    Directory of Open Access Journals (Sweden)

    Barni Mauro

    2007-01-01

    This paper deals with the application of distributed source coding (DSC) theory to remote sensing image compression. Although DSC exhibits significant potential in many application fields, until now the results obtained on real signals fall short of the theoretical bounds, and often impose additional system-level constraints. The objective of this paper is to assess the potential of DSC for lossless image compression carried out onboard a remote platform. We first provide a brief overview of DSC of correlated information sources. We then focus on onboard lossless image compression, and apply DSC techniques in order to reduce the complexity of the onboard encoder, at the expense of the decoder's, by exploiting the correlation of different bands of a hyperspectral dataset. Specifically, we propose two different compression schemes, one based on powerful binary error-correcting codes employed as source codes, and one based on simpler multilevel coset codes. The performance of both schemes is evaluated on a few AVIRIS scenes, and is compared with other state-of-the-art 2D and 3D coders. Both schemes turn out to achieve competitive compression performance, and one of them also has reduced complexity. Based on these results, we highlight the main issues that are still to be solved to further improve the performance of DSC-based remote sensing systems.
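
The idea of "binary error-correcting codes employed as source codes" can be illustrated with a toy Slepian-Wolf scheme: the encoder sends only the 3-bit syndrome of a 7-bit block with respect to a (7,4) Hamming code, and the decoder recovers the block from correlated side information differing in at most one bit. This is a generic sketch of syndrome-based DSC, not the authors' actual coder:

```python
import numpy as np

# parity-check matrix of the (7,4) Hamming code; column j holds the binary digits of j+1
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def syndrome(bits):
    """3-bit syndrome of a 7-bit block; this is all the encoder transmits."""
    return H.dot(bits) % 2

def dsc_decode(syn_x, y):
    """Recover x from its syndrome and side information y (Hamming distance <= 1)."""
    err_syn = (syn_x + syndrome(y)) % 2                   # syndrome of the error x XOR y
    x_hat = y.copy()
    pos = err_syn[0] + 2 * err_syn[1] + 4 * err_syn[2]    # 1-based error position, 0 = none
    if pos:
        x_hat[pos - 1] ^= 1
    return x_hat

x = np.array([1, 0, 1, 1, 0, 0, 1])    # source block: only syndrome(x) is sent (7 -> 3 bits)
y = np.array([1, 0, 1, 0, 0, 0, 1])    # decoder side information, one bit flipped
```

The compression comes from transmitting the syndrome instead of the block; the correlated band at the decoder plays the role of `y`.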

  18. Open Source Software Development

    Science.gov (United States)

    2011-01-01

    Walt Scacchi, Institute for Software Research, University of California, Irvine, Irvine, CA 92697-3455 USA. Only fragments of the abstract survive in this record: one notes that it may be appropriate to refer to FOSS or FLOSS (L for Libre, where the alternative term "libre software" has popularity in some parts of the world), and one cites "Applying Social Network Analysis to Community-Driven Libre Software Projects", Intern. J. Info. Tech. and Web Engineering, 2006, 1(3), 27-28.

  19. Development of ADINA-J-integral code

    International Nuclear Information System (INIS)

    Kurihara, Ryoichi

    1988-07-01

    A general-purpose finite element program, ADINA (Automatic Dynamic Incremental Nonlinear Analysis), developed by Bathe et al., was revised to be able to calculate the J-integral. This report introduces the numerical method used to add this capability to the code, and the evaluation of the revised ADINA-J code using a few examples of J-estimation models, i.e. a compact tension specimen, a center-cracked panel subjected to dynamic load, and a thick shell cylinder having an inner axial crack subjected to thermal load. The evaluation verified the function of the revised code. (author)

  20. Development of HTGR plant dynamics simulation code

    International Nuclear Information System (INIS)

    Ohashi, Kazutaka; Tazawa, Yujiro; Mitake, Susumu; Suzuki, Katsuo.

    1987-01-01

    Plant dynamics simulation analysis plays an important role in the design work of a nuclear power plant, especially in plant safety analysis, control system analysis, and transient condition analysis. The authors have developed the plant dynamics simulation code named VESPER, which is applicable to the design work of the High Temperature Engineering Test Reactor, and have been improving the code in response to the design changes made in subsequent design work. This paper describes the outline of the VESPER code and shows sample calculation results selected from recent design work. (author)

  1. Discrete fracture network code development

    Energy Technology Data Exchange (ETDEWEB)

    Dershowitz, W.; Doe, T.; Shuttle, D.; Eiben, T.; Fox, A.; Emsley, S.; Ahlstrom, E. [Golder Associates Inc., Redmond, Washington (United States)

    1999-02-01

    This report presents the results of fracture flow model development and application performed by Golder Associates Inc. during fiscal year 1998. The primary objective of the Golder Associates work scope was to provide theoretical and modelling support to the JNC performance assessment effort in fiscal year 2000. In addition, Golder Associates provided technical support to JNC for the Aespoe project. Major efforts for performance assessment support included extensive flow and transport simulations, analysis of pathway simplification, research on excavation damage zone effects, software verification and cross-verification, and analysis of confidence bounds on Monte Carlo simulations. In addition, a Fickian diffusion algorithm was implemented for Laplace Transform Galerkin solute transport. Support for the Aespoe project included predictive modelling of sorbing tracer transport in the TRUE-1 rock block, analysis of 1 km geochemical transport pathways for Task 5, and data analysis and experimental design for the TRUE Block Scale experiment. Technical information about Golder Associates support to JNC is provided in the appendices to this report. (author)
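
One of the tasks mentioned, confidence bounds on Monte Carlo simulations, reduces in its simplest form to a normal-approximation interval on the mean of the simulation outputs. A minimal sketch under that assumption (not Golder Associates' actual procedure; the Gaussian sample data are a stand-in for real simulation runs):

```python
import math
import random

def mc_confidence_bounds(samples, z=1.96):
    """Normal-approximation 95% confidence interval on the mean of Monte Carlo outputs."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)   # unbiased sample variance
    half = z * math.sqrt(var / n)                           # half-width of the interval
    return mean - half, mean + half

random.seed(0)
runs = [random.gauss(5.0, 1.0) for _ in range(10000)]       # stand-in simulation outputs
lo, hi = mc_confidence_bounds(runs)
```

The interval narrows as 1/sqrt(n), which is why confidence-bound analysis drives the number of Monte Carlo realizations required.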

  2. Optics code development at Los Alamos

    International Nuclear Information System (INIS)

    Mottershead, C.T.; Lysenko, W.P.

    1988-01-01

    This paper is an overview of part of the beam optics code development effort in the Accelerator Technology Division at Los Alamos National Laboratory. The aim of this effort is to improve our capability to design advanced beam optics systems. The work reported is being carried out by a collaboration of permanent staff members, visiting consultants, and student research assistants. The main components of the effort are: building a new framework of common supporting utilities and software tools to facilitate further development; research and development on basic computational techniques in classical mechanics and electrodynamics; and evaluation and comparison of existing beam optics codes, and support for their continuing development.

  3. Optics code development at Los Alamos

    International Nuclear Information System (INIS)

    Mottershead, C.T.; Lysenko, W.P.

    1988-01-01

    This paper is an overview of part of the beam optics code development effort in the Accelerator Technology Division at Los Alamos National Laboratory. The aim of this effort is to improve our capability to design advanced beam optics systems. The work reported is being carried out by a collaboration of permanent staff members, visiting consultants, and student research assistants. The main components of the effort are: building a new framework of common supporting utilities and software tools to facilitate further development; research and development on basic computational techniques in classical mechanics and electrodynamics; and evaluation and comparison of existing beam optics codes, and support for their continuing development. 17 refs

  4. Test of Effective Solid Angle code for the efficiency calculation of volume source

    Energy Technology Data Exchange (ETDEWEB)

    Kang, M. Y.; Kim, J. H.; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of); Sun, G. M. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    It is hard to determine a full-energy (FE) absorption peak efficiency curve for an arbitrary volume source by experiment. For this reason, simulation and semi-empirical methods have been preferred so far, and many works have progressed in various ways. Moens et al. introduced the concept of the effective solid angle, which accounts for the attenuation of γ-rays in the source, the media, and the detector; this concept underlies the semi-empirical method. An Effective Solid Angle code (ESA code) has been developed over several years by the Applied Nuclear Physics Group at Seoul National University. The ESA code converts an experimental FE efficiency curve determined using a standard point source into one for a volume source. To test the performance of the ESA code, we measured point standard γ-ray sources and voluminous certified reference material (CRM) sources, and compared them with the efficiency curves obtained in this study. The 200-1500 keV energy region is fitted well. NIST X-ray mass attenuation coefficient data are currently used to account for the effect of linear attenuation only. In the future we will use interaction cross-section data obtained from the XCOM code to check each contributing factor, such as the photoelectric effect, incoherent scattering, and coherent scattering. In order to minimize the calculation time and simplify the code, optimization of the algorithm is needed.
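
The core idea of the effective solid angle method, weighting the geometric solid angle by an attenuation factor and averaging over the source volume, can be sketched numerically. The cylindrical geometry, the vertical-path attenuation approximation, and all parameter values below are illustrative simplifications, not the ESA code's actual model:

```python
import math

def solid_angle(h, R):
    """Solid angle subtended by a disc detector of radius R at an on-axis point at height h."""
    return 2.0 * math.pi * (1.0 - h / math.hypot(h, R))

def effective_solid_angle_cylinder(h0, H, R, mu, n=1000):
    """Attenuation-weighted solid angle averaged over a cylindrical source of height H
    whose bottom face sits at height h0 above the detector; each emitting layer at
    depth z is weighted by exp(-mu*z) for self-absorption along an assumed vertical path."""
    total = 0.0
    for i in range(n):
        z = (i + 0.5) * H / n                              # depth of the layer in the source
        total += math.exp(-mu * z) * solid_angle(h0 + z, R)
    return total / n

# transfer factor: volume-source efficiency ~ point-source efficiency * ratio
ratio = effective_solid_angle_cylinder(h0=1.0, H=2.0, R=2.5, mu=0.15) / solid_angle(1.0, 2.5)
```

The ratio of effective solid angles is what carries the point-source efficiency curve over to the volume source; self-attenuation and the larger average distance both push it below one here.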

  5. The Astrophysics Source Code Library: Supporting software publication and citation

    Science.gov (United States)

    Allen, Alice; Teuben, Peter

    2018-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net), established in 1999, is a free online registry for source codes used in research that has appeared in, or been submitted to, peer-reviewed publications. The ASCL is indexed by the SAO/NASA Astrophysics Data System (ADS) and Web of Science and is citable by using the unique ascl ID assigned to each code. In addition to registering codes, the ASCL can house archive files for download and assign them DOIs. The ASCL advocates for software citation on par with article citation, participates in multidisciplinary events such as Force11, OpenCon, and the annual Workshop on Sustainable Software for Science, works with journal publishers, and organizes Special Sessions and Birds of a Feather meetings at national and international conferences such as Astronomical Data Analysis Software and Systems (ADASS), European Week of Astronomy and Space Science, and AAS meetings. In this presentation, I will discuss some of the challenges of gathering credit for publishing software and ideas and efforts from other disciplines that may be useful to astronomy.

  6. Verification test calculations for the Source Term Code Package

    International Nuclear Information System (INIS)

    Denning, R.S.; Wooton, R.O.; Alexander, C.A.; Curtis, L.A.; Cybulskis, P.; Gieseke, J.A.; Jordan, H.; Lee, K.W.; Nicolosi, S.L.

    1986-07-01

    The purpose of this report is to demonstrate the reasonableness of the Source Term Code Package (STCP) results. Hand calculations have been performed spanning a wide variety of phenomena within the context of a single accident sequence, a loss of all ac power with late containment failure, in the Peach Bottom (BWR) plant, and compared with STCP results. The report identifies some of the limitations of the hand calculation effort. The processes involved in a core meltdown accident are complex and coupled. Hand calculations by their nature must deal with gross simplifications of these processes. Their greatest strength is as an indicator that a computer code contains an error, for example that it doesn't satisfy basic conservation laws, rather than in showing the analysis accurately represents reality. Hand calculations are an important element of verification but they do not satisfy the need for code validation. The code validation program for the STCP is a separate effort. In general the hand calculation results show that models used in the STCP codes (e.g., MARCH, TRAP-MELT, VANESA) obey basic conservation laws and produce reasonable results. The degree of agreement and significance of the comparisons differ among the models evaluated. 20 figs., 26 tabs

  7. Development of 2-d cfd code

    International Nuclear Information System (INIS)

    Mirza, S.A.

    1999-01-01

    In the present study, a two-dimensional computer code has been developed in FORTRAN using CFD techniques, which are basically numerical schemes. This computer code solves the Navier-Stokes equations and the continuity equation to find the velocity and pressure fields within a given domain. The analysis has been carried out for the flow developed within a square cavity driven by its upper wall, a configuration that has become a benchmark for testing and comparing newly developed numerical schemes. Before handling this task, different one-dimensional cases were studied with the CFD technique and their FORTRAN programs written: Couette flow, and Poiseuille flow with and without using a symmetric boundary condition. A comparison between the CFD results and analytical results has also been made. For the cavity flow, results from the developed code have been obtained for different Reynolds numbers and are presented in the form of velocity vectors. The results of the developed code have been compared with those obtained from the shareware version of a commercially available code for a Reynolds number of 10.0. The quantitative and qualitative disagreement of the results at some grid points of the calculation domain is discussed, and future recommendations in this regard are made. (author)
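
The one-dimensional Couette case mentioned above has the linear analytical profile u(y) = U y/h, which makes it a convenient check for a finite-difference solver. A minimal sketch in Python rather than the author's FORTRAN (the grid size and iteration count are arbitrary choices):

```python
import numpy as np

def couette_fd(U=1.0, n=21, iters=5000):
    """Jacobi iteration for d2u/dy2 = 0 with u(0) = 0, u(h) = U (plane Couette flow)."""
    u = np.zeros(n)
    u[-1] = U                                 # no-slip at the moving upper wall
    for _ in range(iters):
        u[1:-1] = 0.5 * (u[:-2] + u[2:])      # interior update for the Laplace equation
    return u

exact = np.linspace(0.0, 1.0, 21)             # analytical linear profile u = U*y/h
```

Comparing the converged numerical profile against `exact` is the same kind of code-versus-analytical check the abstract describes for the 1-D cases.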

  8. Health Code Number (HCN) Development Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Petrocchi, Rocky; Craig, Douglas K.; Bond, Jayne-Anne; Trott, Donna M.; Yu, Xiao-Ying

    2013-09-01

    This report provides a detailed description of health code numbers (HCNs) and the procedure by which each HCN is assigned. It contains many guidelines and rationales for HCNs. HCNs are used in the chemical mixture methodology (CMM), a method recommended by the Department of Energy (DOE) for assessing health effects resulting from exposures to airborne aerosols in an emergency. The procedure is a useful tool for proficient HCN code developers. Intensive training and quality assurance with qualified HCN developers are required before an individual can apply the procedure to develop HCNs for DOE.

  9. Tangent: Automatic Differentiation Using Source Code Transformation in Python

    OpenAIRE

    van Merriënboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan

    2017-01-01

    Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a new library that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and generates new Python functions which calculate a derivative. This approach to automatic differentiation is different from existing packages popular in machine learning, such as TensorFlow and Autograd. Advantages ar...
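
The source-code-transformation idea, generating new source code for the derivative rather than recording a tape at runtime, can be illustrated with a tiny symbolic differentiator built on Python's `ast` module. This toy handles only +, -, * and a single variable, and is in no way Tangent's actual implementation or API:

```python
import ast

def derivative_source(expr, var="x"):
    """Emit Python source for d(expr)/d(var); expr may use +, -, *, numeric
    constants, and the single variable `var`."""
    def d(node):
        if isinstance(node, ast.Constant):
            return "0"
        if isinstance(node, ast.Name):
            return "1" if node.id == var else "0"
        if isinstance(node, ast.BinOp):
            L, R = ast.unparse(node.left), ast.unparse(node.right)
            dL, dR = d(node.left), d(node.right)
            if isinstance(node.op, ast.Add):
                return f"({dL}) + ({dR})"
            if isinstance(node.op, ast.Sub):
                return f"({dL}) - ({dR})"
            if isinstance(node.op, ast.Mult):      # product rule
                return f"({dL})*({R}) + ({L})*({dR})"
        raise NotImplementedError(ast.dump(node))
    return d(ast.parse(expr, mode="eval").body)

dfdx_src = derivative_source("x*x + 3*x")   # the output is itself Python source code
dfdx = eval("lambda x: " + dfdx_src)
```

The key property shared with SCT systems is that the result of differentiation is ordinary source code, which can be inspected, edited, and compiled like any hand-written function.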

  10. Development of Coolant Radioactivity Interpretation Code

    International Nuclear Information System (INIS)

    Kim, Kiyoung; Jung, Youngsuk; Kim, Kyounghyun; Kim, Jangwook

    2013-01-01

    In Korea, coolant radioactivity analysis has been performed using the computer codes of foreign companies, such as CADE (Westinghouse) and IODYNE and CESIUM (ABB-CE). However, these computer codes are too conservative and involve considerable errors, and since they are DOS-based programs, their operability is not satisfactory. It is therefore necessary to develop an enhanced analysis algorithm, applying an analytical method that reflects the changed operational environments of domestic nuclear power plants, together with fuel failure evaluation software designed for user convenience. We have developed a nuclear fuel failure evaluation code able to estimate the number of failed fuel rods and the burn-up of failed fuel during a nuclear power plant operating cycle. The Coolant Radioactivity Interpretation Code (CRIC) for LWRs has been developed as the output of the project 'Development of Fuel Reliability Enhanced Technique' organized by the Korea Institute of Energy Technology Evaluation and Planning (KETEP). CRIC is Windows-based software able to evaluate the number of failed fuel rods and the burn-up of the failed fuel region by analyzing the coolant radioactivity of an LWR in operation. CRIC is based on the fission product release model commonly known as the 'three-region model' (pellet region, gap region, and coolant region), and we are verifying the CRIC results against cases of domestic fuel failures. CRIC users are able to estimate the number of failed fuel rods and the burn-up and region of failed fuel, taking into account the enrichment and power distribution of the fuel region, by using operating cycle data, coolant activity data, the fuel loading pattern, and the Cs-134/Cs-137 ratio as a function of burn-up and U-235 enrichment provided in the code. With the development of CRIC, a unique fuel failure evaluation code of our own has been secured, and it is expected to have the following significant meaning. This is that the code reflecting a proprietary technique for quantitatively

  11. Software requirements specification document for the AREST code development

    International Nuclear Information System (INIS)

    Engel, D.W.; McGrail, B.P.; Whitney, P.D.; Gray, W.J.; Williford, R.E.; White, M.D.; Eslinger, P.W.; Altenhofen, M.K.

    1993-11-01

    The Analysis of the Repository Source Term (AREST) computer code was selected in 1992 by the U.S. Department of Energy. The AREST code will be used to analyze the performance of an underground high level nuclear waste repository. The AREST code is being modified by the Pacific Northwest Laboratory (PNL) in order to evaluate the engineered barrier and waste package designs, model regulatory compliance, analyze sensitivities, and support total systems performance assessment modeling. The current version of the AREST code was developed to be a very useful tool for analyzing model uncertainties and sensitivities to input parameters. The code has also been used successfully in supplying source-terms that were used in a total systems performance assessment. The current version, however, has been found to be inadequate for the comparison and selection of a design for the waste package. This is due to the assumptions and simplifications made in the selection of the process and system models. Thus, the new version of the AREST code will be designed to focus on the details of the individual processes and implementation of more realistic models. This document describes the requirements of the new models that will be implemented. Included in this document is a section describing the near-field environmental conditions for this waste package modeling, description of the new process models that will be implemented, and a description of the computer requirements for the new version of the AREST code

  12. Code of conduct on the safety and security of radioactive sources

    International Nuclear Information System (INIS)

    2004-01-01

    The objectives of the Code of Conduct are, through the development, harmonization and implementation of national policies, laws and regulations, and through the fostering of international co-operation, to: (i) achieve and maintain a high level of safety and security of radioactive sources; (ii) prevent unauthorized access or damage to, and loss, theft or unauthorized transfer of, radioactive sources, so as to reduce the likelihood of accidental harmful exposure to such sources or the malicious use of such sources to cause harm to individuals, society or the environment; and (iii) mitigate or minimize the radiological consequences of any accident or malicious act involving a radioactive source. These objectives should be achieved through the establishment of an adequate system of regulatory control of radioactive sources, applicable from the stage of initial production to their final disposal, and a system for the restoration of such control if it has been lost. This Code relies on existing international standards relating to nuclear, radiation, radioactive waste and transport safety and to the control of radioactive sources. It is intended to complement existing international standards in these areas. The Code of Conduct serves as guidance in general issues, legislation and regulations, regulatory bodies as well as import and export of radioactive sources. A list of radioactive sources covered by the code is provided which includes activities corresponding to thresholds of categories

  13. Code of conduct on the safety and security of radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-01-01

    The objectives of the Code of Conduct are, through the development, harmonization and implementation of national policies, laws and regulations, and through the fostering of international co-operation, to: (i) achieve and maintain a high level of safety and security of radioactive sources; (ii) prevent unauthorized access or damage to, and loss, theft or unauthorized transfer of, radioactive sources, so as to reduce the likelihood of accidental harmful exposure to such sources or the malicious use of such sources to cause harm to individuals, society or the environment; and (iii) mitigate or minimize the radiological consequences of any accident or malicious act involving a radioactive source. These objectives should be achieved through the establishment of an adequate system of regulatory control of radioactive sources, applicable from the stage of initial production to their final disposal, and a system for the restoration of such control if it has been lost. This Code relies on existing international standards relating to nuclear, radiation, radioactive waste and transport safety and to the control of radioactive sources. It is intended to complement existing international standards in these areas. The Code of Conduct serves as guidance in general issues, legislation and regulations, regulatory bodies as well as import and export of radioactive sources. A list of radioactive sources covered by the code is provided which includes activities corresponding to thresholds of categories.

  14. Monte Carlo calculation for the development of a BNCT neutron source (1eV-10KeV) using MCNP code.

    Science.gov (United States)

    El Moussaoui, F; El Bardouni, T; Azahra, M; Kamili, A; Boukhal, H

    2008-09-01

    Different materials have been studied in order to produce an epithermal neutron beam between 1 eV and 10 keV, which is extensively used to irradiate patients with brain tumors such as GBM. For this purpose, we have studied three different neutron moderators (H(2)O, D(2)O and BeO) and their combinations, four reflectors (Al(2)O(3), C, Bi, and Pb) and two filters (Cd and Bi). The calculations showed that the best assembly configuration corresponds to the combination of the three moderators H(2)O, BeO and D(2)O together with an Al(2)O(3) reflector and the two filters Cd+Bi; it optimizes the epithermal fraction of the neutron spectrum to 72% and minimizes the thermal fraction to 4%, and thus it can be used to treat deep brain tumors. The calculations have been performed by means of the Monte Carlo N-Particle code (MCNP 5C). Our results strongly encourage further study of irradiation of the head with epithermal neutron fields.

  15. IllinoisGRMHD: an open-source, user-friendly GRMHD code for dynamical spacetimes

    International Nuclear Information System (INIS)

    Etienne, Zachariah B; Paschalidis, Vasileios; Haas, Roland; Mösta, Philipp; Shapiro, Stuart L

    2015-01-01

    In the extreme violence of merger and mass accretion, compact objects like black holes and neutron stars are thought to launch some of the most luminous outbursts of electromagnetic and gravitational wave energy in the Universe. Modeling these systems realistically is a central problem in theoretical astrophysics, but has proven extremely challenging, requiring the development of numerical relativity codes that solve Einstein's equations for the spacetime, coupled to the equations of general relativistic (ideal) magnetohydrodynamics (GRMHD) for the magnetized fluids. Over the past decade, the Illinois numerical relativity (ILNR) group's dynamical spacetime GRMHD code has proven itself as a robust and reliable tool for theoretical modeling of such GRMHD phenomena. However, the code was written ‘by experts and for experts’ of the code, with a steep learning curve that would severely hinder community adoption if it were open-sourced. Here we present IllinoisGRMHD, which is an open-source, highly extensible rewrite of the original closed-source GRMHD code of the ILNR group. Reducing the learning curve was the primary focus of this rewrite, with the goal of facilitating community involvement in the code's use and development, as well as the minimization of human effort in generating new science. IllinoisGRMHD also saves computer time, generating roundoff-precision identical output to the original code on adaptive-mesh grids, but nearly twice as fast at scales of hundreds to thousands of cores. (paper)

  16. Development of code PRETOR for stellarator simulation

    International Nuclear Information System (INIS)

    Dies, J.; Fontanet, J.; Fontdecaba, J.M.; Castejon, F.; Alejandre, C.

    1998-01-01

    The Departament de Fisica i Enginyeria Nuclear (DFEN) of the UPC has experience in the development of the transport code PRETOR. This code has been validated against shots from DIII-D, JET and TFTR, and it has also been used to simulate ITER operational scenarios such as fast burn termination. Recently, the association EURATOM-CIEMAT started operation of the TJ-II stellarator. Because of the need to validate the results of other transport codes applied to stellarators, and because all of them make some approximation, such as averaging magnitudes over each magnetic surface, it was considered worthwhile to adapt the PRETOR code to devices without axial symmetry, like stellarators, in order to meet the specific needs of the study of TJ-II. Several modifications are required in PRETOR; the main ones concern the models of magnetic equilibrium, geometry, and energy and particle transport. To handle the complex magnetic equilibrium geometry, the powerful numerical code VMEC has been used. This code gives the magnetic surface shape as a Fourier series in terms of the harmonics (m,n), and most of the geometric magnitudes are also obtained from the VMEC results file. The energy and particle transport models will be replaced by phenomenological models that are better adapted to stellarator simulation. With the proposed models, the aim is to reproduce experimental data available from present stellarators, paying special attention to the TJ-II of the association EURATOM-CIEMAT. (Author)
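The VMEC-style Fourier representation of a flux surface mentioned in the record above can be sketched in a few lines. This is only an illustration of the parameterization, R = Σ R_mn cos(mθ − nφ), Z = Σ Z_mn sin(mθ − nφ); the coefficient values below are purely hypothetical, not taken from VMEC or TJ-II data.

```python
import math

def boundary_point(coeffs, theta, phi):
    """Evaluate a VMEC-style Fourier representation of a flux surface:
    R = sum R_mn cos(m*theta - n*phi), Z = sum Z_mn sin(m*theta - n*phi)."""
    R = sum(c["R"] * math.cos(c["m"] * theta - c["n"] * phi) for c in coeffs)
    Z = sum(c["Z"] * math.sin(c["m"] * theta - c["n"] * phi) for c in coeffs)
    return R, Z

# Hypothetical harmonic content (m, n, R_mn, Z_mn)
coeffs = [
    {"m": 0, "n": 0, "R": 5.5, "Z": 0.0},  # major-radius term
    {"m": 1, "n": 0, "R": 0.5, "Z": 0.6},  # elliptical cross-section
    {"m": 1, "n": 1, "R": 0.1, "Z": 0.1},  # helical shaping
]
R, Z = boundary_point(coeffs, 0.0, 0.0)
```

Geometric quantities such as surface areas and volumes then follow by integrating over the (θ, φ) grid.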

  17. Asymmetric Joint Source-Channel Coding for Correlated Sources with Blind HMM Estimation at the Receiver

    Directory of Open Access Journals (Sweden)

    Ser Javier Del

    2005-01-01

    Full Text Available We consider the case of two correlated sources, and . The correlation between them has memory, and it is modelled by a hidden Markov chain. The paper studies the problem of reliable communication of the information sent by the source over an additive white Gaussian noise (AWGN channel when the output of the other source is available as side information at the receiver. We assume that the receiver has no a priori knowledge of the correlation statistics between the sources. In particular, we propose the use of a turbo code for joint source-channel coding of the source . The joint decoder uses an iterative scheme where the unknown parameters of the correlation model are estimated jointly within the decoding process. It is shown that reliable communication is possible at signal-to-noise ratios close to the theoretical limits set by the combination of Shannon and Slepian-Wolf theorems.
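The Slepian-Wolf limit targeted in the record above can be made concrete for the simplest memoryless correlation model. This is a sketch only: the binary-symmetric model and the crossover probability are illustrative assumptions, not parameters from the paper (whose correlation model has memory).

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Memoryless binary-symmetric correlation model: X = Y xor E with
# P(E = 1) = p.  With Y available as side information at the decoder,
# the Slepian-Wolf theorem says X can be compressed down to
# H(X|Y) = H(E) = h2(p) bits per symbol.
p = 0.11
rate_limit = h2(p)
```

A practical syndrome-based code (such as the turbo scheme in the paper) is judged by how closely its rate approaches this conditional-entropy limit.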

  18. Towards Holography via Quantum Source-Channel Codes

    Science.gov (United States)

    Pastawski, Fernando; Eisert, Jens; Wilming, Henrik

    2017-07-01

    While originally motivated by quantum computation, quantum error correction (QEC) is currently providing valuable insights into many-body quantum physics, such as topological phases of matter. Furthermore, mounting evidence originating from holography research (AdS/CFT) indicates that QEC should also be pertinent for conformal field theories. With this motivation in mind, we introduce quantum source-channel codes, which combine features of lossy compression and approximate quantum error correction, both of which are predicted in holography. Through a recent construction for approximate recovery maps, we derive guarantees on its erasure decoding performance from calculations of an entropic quantity called conditional mutual information. As an example, we consider Gibbs states of the transverse field Ising model at criticality and provide evidence that they exhibit nontrivial protection from local erasure. This gives rise to the first concrete interpretation of a bona fide conformal field theory as a quantum error correcting code. We argue that quantum source-channel codes are of independent interest beyond holography.

  19. Developing an Australian code of construction ethics

    Directory of Open Access Journals (Sweden)

    Sean Francis McCarthy

    2012-05-01

    Full Text Available This article looks at the increasing need to consider the role of ethics in construction. The industry, historically, has been challenged by allegations of a serious shortfall in ethical standards. Only limited attempts have been made to date in Australia to address that concern. Any ethical analysis should consider the definition of ethics and its historical development. This paper considers major historical developments in ethical thinking as well as contemporary thinking on ethics for professional sub-sets. A code could be developed specific to construction. Current methods of addressing ethics in construction and in other industries are also reviewed. This paper argues that developing a code of ethics, supported by other measures, is the way forward. The author’s aim is to promote further discussion and the drafting of a code. This paper includes a summary of other ethical codes that may provide a starting point. The time for reform is upon us, and there is an urgent need for an independent body to take the lead, for fear of floundering and having only found ‘another debating topic’ (Uff 2006).

  20. Multiple application coded switch development report

    International Nuclear Information System (INIS)

    Bernal, E.L.; Kestly, J.D.

    1979-03-01

    The development of the Multiple Application Coded Switch (MACS) and its related controller is documented; the functional and electrical characteristics are described, the interface requirements are defined, and a troubleshooting guide is provided. The system was designed for the Safe Secure Trailer System used for the secure transportation of nuclear material.

  1. Health physics source document for codes of practice

    International Nuclear Information System (INIS)

    Pearson, G.W.; Meggitt, G.C.

    1989-05-01

    Personnel preparing codes of practice often require basic Health Physics information or advice relating to radiological protection problems, and this document is written primarily to supply such information. Certain technical terms used in the text are explained in the extensive glossary. Due to the pace of change in the field of radiological protection it is difficult to produce an up-to-date document. This document was compiled during 1988, however, and therefore contains the principal changes brought about by the introduction of the Ionising Radiations Regulations (1985). The paper covers the nature of ionising radiation, its biological effects and the principles of control. It is hoped that the document will provide a useful source of information for codes of practice and wider areas, and stimulate readers to study radiological protection issues in greater depth. (author)

  2. Running the source term code package in Elebra MX-850

    International Nuclear Information System (INIS)

    Guimaraes, A.C.F.; Goes, A.G.A.

    1988-01-01

    The Source Term Code Package (STCP) is one of the main tools applied in calculations of the behavior of fission products from nuclear power plants. It is a set of computer codes to assist the calculation of the radioactive materials escaping from the metallic containment of power reactors to the environment during a severe reactor accident. The original version of STCP runs on SDC computer systems, but since it is written in FORTRAN 77 it is possible to run it on other systems such as IBM, Burroughs, Elebra, etc. The Elebra MX-8500 version of STCP contains five codes: MARCH 3, TRAPMELT, TCCA, VANESA and NAVA. The example presented in this report considers a small LOCA accident in a PWR-type reactor. (M.I.)

  3. Microdosimetry computation code of internal sources - MICRODOSE 1

    International Nuclear Information System (INIS)

    Li Weibo; Zheng Wenzhong; Ye Changqing

    1995-01-01

    This paper describes a microdosimetry computation code, MICRODOSE 1, based on the following methods: (1) the method of calculating f1(z) for charged particles in unit-density tissues; (2) the method of calculating f(z) for a point source; (3) the method of applying Fourier transform theory to the calculation of the compound Poisson process; (4) the method of using the fast Fourier transform technique to determine f(z). Some computed examples based on the code MICRODOSE 1 are given, including alpha particles emitted from 239Pu in alveolar lung tissue and from the radon progeny RaA and RaC in the human respiratory tract. (author). 13 refs., 6 figs

  4. Combined Source-Channel Coding of Images under Power and Bandwidth Constraints

    Directory of Open Access Journals (Sweden)

    Fossorier Marc

    2007-01-01

    Full Text Available This paper proposes a framework for combined source-channel coding for a power and bandwidth constrained noisy channel. The framework is applied to progressive image transmission using constant envelope M-ary phase shift key (M-PSK) signaling over an additive white Gaussian noise channel. First, the framework is developed for uncoded M-PSK signaling (with M=2^k). Then, it is extended to include coded M-PSK modulation using trellis coded modulation (TCM). An adaptive TCM system is also presented. Simulation results show that, depending on the constellation size, coded M-PSK signaling performs 3.1 to 5.2 dB better than uncoded M-PSK signaling. Finally, the performance of our combined source-channel coding scheme is investigated from the channel capacity point of view. Our framework is further extended to include powerful channel codes like turbo and low-density parity-check (LDPC) codes. With these powerful codes, our proposed scheme performs about one dB away from the capacity-achieving SNR value of the QPSK channel.

  5. Combined Source-Channel Coding of Images under Power and Bandwidth Constraints

    Directory of Open Access Journals (Sweden)

    Marc Fossorier

    2007-01-01

    Full Text Available This paper proposes a framework for combined source-channel coding for a power and bandwidth constrained noisy channel. The framework is applied to progressive image transmission using constant envelope M-ary phase shift key (M-PSK) signaling over an additive white Gaussian noise channel. First, the framework is developed for uncoded M-PSK signaling (with M=2^k). Then, it is extended to include coded M-PSK modulation using trellis coded modulation (TCM). An adaptive TCM system is also presented. Simulation results show that, depending on the constellation size, coded M-PSK signaling performs 3.1 to 5.2 dB better than uncoded M-PSK signaling. Finally, the performance of our combined source-channel coding scheme is investigated from the channel capacity point of view. Our framework is further extended to include powerful channel codes like turbo and low-density parity-check (LDPC) codes. With these powerful codes, our proposed scheme performs about one dB away from the capacity-achieving SNR value of the QPSK channel.
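The uncoded M-PSK baseline described above can be approximated with the standard nearest-neighbor bound. This is a sketch of the textbook formula, not the paper's simulation setup; the SNR values are illustrative.

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def mpsk_ser(es_n0_db, M):
    """Nearest-neighbor approximation to the symbol-error rate of
    uncoded M-PSK (M >= 4) on an AWGN channel:
    SER ~= 2 * Q(sqrt(2 * Es/N0) * sin(pi/M))."""
    es_n0 = 10.0 ** (es_n0_db / 10.0)
    return 2.0 * q_func(math.sqrt(2.0 * es_n0) * math.sin(math.pi / M))
```

The formula makes the bandwidth/power trade-off visible: at a fixed Es/N0, growing the constellation (larger M) shrinks sin(pi/M) and raises the error rate, which is the gap that the TCM and LDPC/turbo coding in the paper works to close.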

  6. Development of a coupled code system based on system transient code, RETRAN, and 3-D neutronics code, MASTER

    International Nuclear Information System (INIS)

    Kim, K. D.; Jung, J. J.; Lee, S. W.; Cho, B. O.; Ji, S. K.; Kim, Y. H.; Seong, C. K.

    2002-01-01

    A coupled code system, RETRAN/MASTER, has been developed for best-estimate simulations of the interactions between reactor core neutron kinetics and plant thermal-hydraulics by incorporating a 3-D reactor core kinetics analysis code, MASTER, into the system transient code RETRAN. The soundness of the consolidated code system is confirmed by simulating the OECD/NEA MSLB benchmark problem, which was developed to verify the performance of coupled kinetics and system transient codes

  7. Development of chemical equilibrium analysis code 'CHEEQ'

    International Nuclear Information System (INIS)

    Nagai, Shuichiro

    2006-08-01

    The 'CHEEQ' code, which calculates the partial pressures and masses of a system consisting of ideal gases and pure condensed-phase compounds, was developed. Its characteristics are as follows. All chemical equilibrium equations are described by the formation reactions from the monoatomic gases in order to simplify the code structure and input preparation. The chemical equilibrium conditions Σνiμi = 0 for gaseous compounds and precipitated condensed-phase compounds, and Σνiμi > 0 for non-precipitated condensed-phase compounds, are applied, where νi and μi are the stoichiometric coefficient and chemical potential of component i. A virtual solid model was introduced to perform calculations under constant-partial-pressure conditions. 'CHEEQ' consists of the following three parts: (1) the analysis code, zc132.f; (2) the thermodynamic database, zmdb01; and (3) the input data file, zindb. The 'CHEEQ' code can calculate systems consisting of up to 20 elements, 100 condensed-phase compounds and 200 gaseous compounds. The thermodynamic database zmdb01 contains about 1000 elements and compounds, of which about 200 are actinide elements and their compounds. This report describes the basic equations, the outline of the solution procedure, and instructions for preparing the input data and evaluating the calculation results. (author)
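For an ideal-gas formation reaction, the equilibrium condition Σνiμi = 0 used above reduces to the familiar relation ln K = −ΔG°/(RT). A minimal sketch of that reduction (the reaction data are hypothetical, not from the zmdb01 database):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def equilibrium_constant(delta_g0, temperature):
    """Equilibrium constant from the condition sum(nu_i * mu_i) = 0,
    which for ideal gases gives ln K = -dG0 / (R * T)."""
    return math.exp(-delta_g0 / (R * temperature))

# Hypothetical formation reaction with dG0 = -50 kJ/mol at 1000 K
K = equilibrium_constant(-50e3, 1000.0)
```

K then ties the partial pressures of products and reactants together; a solver like CHEEQ iterates such relations simultaneously for all species, together with mass-balance constraints.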

  8. Cooperation of experts' opinion, experiment and computer code development

    International Nuclear Information System (INIS)

    Wolfert, K.; Hicken, E.

    The connection between code development, code assessment and confidence in the analysis of transients is discussed, and in this context the major sources of errors in the codes and in their application are shown. Standard problem results emphasize that, in order to have confidence in licensing statements, the codes must be physically realistic and the code user must be qualified and experienced. We discuss why there is disagreement between the licensing authority and the vendor concerning assessment of the fulfillment of safety goal requirements; the answer lies in the different confidence levels of the assessment of transient analysis. It is expected that the disagreement will decrease as the confidence level increases. Strong efforts will be made to increase this confidence level through improvements in the codes, experiments and the related organizational structures. Because of the low probability of loss-of-coolant accidents in the nuclear industry, assessment must rely on analytical techniques and experimental investigations. (orig./HP) [de

  9. Development of Evaluation Code for MUF Uncertainty

    International Nuclear Information System (INIS)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan

    2015-01-01

    Material Unaccounted For (MUF) is the material balance evaluated from the measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one would expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, there are many measurements using different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of all those measurements. Evaluating the MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system, for pyroprocessing, which is being developed at the Korea Atomic Energy Research Institute (KAERI) to reduce the radioactive waste from spent fuel. An evaluation code for analyzing MUF uncertainty has been developed and verified using a sample problem from an IAEA reference. With this code, which is built on a graphical user interface for user friendliness, the MUF uncertainty can be calculated simply and quickly. It is also expected that the code will make sensitivity analyses of the MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop new systems for facilities under development
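The material-balance quantity described above, and the simplest propagation of measurement errors into its uncertainty, can be sketched in a few lines. This is an illustration under the assumption of independent errors at each key measurement point; the inventory figures are hypothetical, and real MUF evaluations must also treat correlated (systematic) error components, which this sketch omits.

```python
import math

def muf(begin_inv, receipts, ending_inv, shipments):
    """Material Unaccounted For: book inventory minus physical inventory
    over a balance period."""
    return (begin_inv + receipts) - (ending_inv + shipments)

def muf_sigma(kmp_sigmas):
    """1-sigma uncertainty of MUF assuming independent measurement
    errors at each key measurement point (quadrature propagation)."""
    return math.sqrt(sum(s * s for s in kmp_sigmas))

# Hypothetical balance period (kg of nuclear material)
m = muf(begin_inv=100.0, receipts=25.0, ending_inv=104.5, shipments=20.0)
s = muf_sigma([0.3, 0.2, 0.3, 0.1])
```

A non-zero MUF is then judged against its uncertainty, e.g. by whether |MUF| exceeds a multiple of sigma_MUF.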

  10. Development of Evaluation Code for MUF Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Material Unaccounted For (MUF) is the material balance evaluated from the measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one would expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, there are many measurements using different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of all those measurements. Evaluating the MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system, for pyroprocessing, which is being developed at the Korea Atomic Energy Research Institute (KAERI) to reduce the radioactive waste from spent fuel. An evaluation code for analyzing MUF uncertainty has been developed and verified using a sample problem from an IAEA reference. With this code, which is built on a graphical user interface for user friendliness, the MUF uncertainty can be calculated simply and quickly. It is also expected that the code will make sensitivity analyses of the MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop new systems for facilities under development.

  11. Code of conduct on the safety and security of radioactive sources

    International Nuclear Information System (INIS)

    Anon.

    2001-01-01

    The objective of the code of conduct is to achieve and maintain a high level of safety and security of radioactive sources through the development, harmonization and enforcement of national policies, laws and regulations, and through the fostering of international co-operation. In particular, this code addresses the establishment of an adequate system of regulatory control from the production of radioactive sources to their final disposal, and a system for the restoration of such control if it has been lost. (N.C.)

  12. H- source developments

    International Nuclear Information System (INIS)

    Allison, P.W.

    1978-01-01

    The design and operation of a Penning-discharge, cold-cathode, surface-plasma H- ion source are described. A high current density, about 2 A/cm2, is extracted from the source by applying about 20 kV across the 2 to 2.5 mm gap

  13. Revised IAEA Code of Conduct on the Safety and Security of Radioactive Sources

    International Nuclear Information System (INIS)

    Wheatley, J. S.

    2004-01-01

    The revised Code of Conduct on the Safety and Security of Radioactive Sources is aimed primarily at Governments, with the objective of achieving and maintaining a high level of safety and security of radioactive sources through the development, harmonization and enforcement of national policies, laws and regulations; and through the fostering of international co-operation. It focuses on sealed radioactive sources and provides guidance on legislation, regulations and the regulatory body, and import/export controls. Nuclear materials (except for sources containing 239Pu), as defined in the Convention on the Physical Protection of Nuclear Materials, are not covered by the revised Code, nor are radioactive sources within military or defence programmes. An earlier version of the Code was published by IAEA in 2001. At that time, agreement was not reached on a number of issues, notably those relating to the creation of comprehensive national registries for radioactive sources, obligations of States exporting radioactive sources, and the possibility of unilateral declarations of support. The need to further consider these and other issues was highlighted by the events of 11th September 2001. Since then, the IAEA's Secretariat has been working closely with Member States and relevant International Organizations to achieve consensus. The text of the revised Code was finalized at a meeting of technical and legal experts in August 2003, and it was submitted to IAEA's Board of Governors for approval in September 2003, with a recommendation that the IAEA General Conference adopt it and encourage its wide implementation. The IAEA General Conference, in September 2003, endorsed the revised Code and urged States to work towards following the guidance contained within it. This paper summarizes the history behind the revised Code, its content and the outcome of the discussions within the IAEA Board of Governors and General Conference. (Author) 8 refs

  14. Development of steam explosion simulation code JASMINE

    Energy Technology Data Exchange (ETDEWEB)

    Moriyama, Kiyofumi; Yamano, Norihiro; Maruyama, Yu; Kudo, Tamotsu; Sugimoto, Jun [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Nagano, Katsuhiro; Araki, Kazuhiro

    1995-11-01

    A steam explosion is considered a phenomenon that can threaten the integrity of the containment vessel of a nuclear power plant under severe accident conditions. A numerical calculation code, JASMINE (JAeri Simulator for Multiphase INteraction and Explosion), intended to simulate the whole process of steam explosions, has been developed. The premixing model is based on MISTRAL, a multiphase flow simulation code by Fuji Research Institute Co. In the JASMINE code, the constitutive equations and the flow regime map are modified for the simulation of premixing-related phenomena. The numerical solution method of the original code is retained: the basic equations are discretized semi-implicitly; the BCGSTAB method is used as the matrix solver to improve stability and convergence; and a TVD scheme is applied to capture steep phase distributions accurately. Test calculations have been performed for conditions corresponding to the experiments of Gilbertson et al. and Angelini et al., in which the mixing of solid particles and water was observed under isothermal conditions and with boiling, respectively. (author).

  15. Development of steam explosion simulation code JASMINE

    International Nuclear Information System (INIS)

    Moriyama, Kiyofumi; Yamano, Norihiro; Maruyama, Yu; Kudo, Tamotsu; Sugimoto, Jun; Nagano, Katsuhiro; Araki, Kazuhiro.

    1995-11-01

    A steam explosion is considered a phenomenon that can threaten the integrity of the containment vessel of a nuclear power plant under severe accident conditions. A numerical calculation code, JASMINE (JAeri Simulator for Multiphase INteraction and Explosion), intended to simulate the whole process of steam explosions, has been developed. The premixing model is based on MISTRAL, a multiphase flow simulation code by Fuji Research Institute Co. In the JASMINE code, the constitutive equations and the flow regime map are modified for the simulation of premixing-related phenomena. The numerical solution method of the original code is retained: the basic equations are discretized semi-implicitly; the BCGSTAB method is used as the matrix solver to improve stability and convergence; and a TVD scheme is applied to capture steep phase distributions accurately. Test calculations have been performed for conditions corresponding to the experiments of Gilbertson et al. and Angelini et al., in which the mixing of solid particles and water was observed under isothermal conditions and with boiling, respectively. (author)

  16. GEOS Code Development Road Map - May, 2013

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Scott [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Settgast, Randolph [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fu, Pengcheng [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Antoun, Tarabay [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ryerson, F. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-05-03

    GEOS is a massively parallel computational framework designed to enable HPC-based simulations of subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS will enable coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. The overall architecture of the framework includes consistent data structures and will allow incorporation of additional physical and materials models as demanded by future applications. Along with predicting the initiation, propagation and reactivation of fractures, GEOS will also generate a seismic source term that can be linked with seismic wave propagation codes to generate synthetic microseismicity at surface and downhole arrays. Similarly, the output from GEOS can be linked with existing fluid/thermal transport codes. GEOS can also be linked with existing, non-intrusive uncertainty quantification schemes to constrain uncertainty in its predictions and sensitivity to the various parameters describing the reservoir and stimulation operations. We anticipate that an implicit-explicit 3D version of GEOS, including a preliminary seismic source model, will be available for parametric testing and validation against experimental and field data by Oct. 1, 2013.

  17. Towards Product Lining Model-Driven Development Code Generators

    OpenAIRE

    Roth, Alexander; Rumpe, Bernhard

    2015-01-01

    A code generator systematically transforms compact models to detailed code. Today, code generation is regarded as an integral part of model-driven development (MDD). Despite its relevance, the development of code generators is an inherently complex task and common methodologies and architectures are lacking. Additionally, reuse and extension of existing code generators exist only for individual parts. A systematic development and reuse based on a code generator product line is still in its infancy...

  18. From system requirements to source code: transitions in UML and RUP

    Directory of Open Access Journals (Sweden)

    Stanisław Wrycza

    2011-06-01

    Full Text Available There are many manuals explaining language specification among UML-related books. Only some of the books mentioned concentrate on the practical aspects of using the UML language effectively with CASE tools and RUP. The current paper presents the transitions from system requirements specification to structural source code that are useful while developing an information system.

  19. Open source development

    DEFF Research Database (Denmark)

    Ulhøi, John Parm

    2004-01-01

    This paper addresses innovations based on open source or non-proprietary knowledge. Viewed through the lens of private property theory, such agency appears to be a true anomaly. However, by a further turn of the theoretical kaleidoscope, we will show that there may be perfectly justifiable reasons...... for not regarding open source innovations as anomalies. The paper is based on three sectorial and generic cases of open source innovation, which is an offspring of contemporary theory made possible by combining elements of the model of private agency with those of the model of collective agency. In closing...

  20. Development of a new EMP code at LANL

    Science.gov (United States)

    Colman, J. J.; Roussel-Dupré, R. A.; Symbalisty, E. M.; Triplett, L. A.; Travis, B. J.

    2006-05-01

    A new code for modeling the generation of an electromagnetic pulse (EMP) by a nuclear explosion in the atmosphere is being developed. The source of the EMP is the Compton current produced by the prompt radiation (γ-rays, X-rays, and neutrons) of the detonation. As a first step in building a multi-dimensional EMP code we have written three kinetic codes, Plume, Swarm, and Rad. Plume models the transport of energetic electrons in air. The Plume code solves the relativistic Fokker-Planck equation over a specified energy range that can include ~ 3 keV to 50 MeV and computes the resulting electron distribution function at each cell in a two-dimensional spatial grid. The energetic electrons are allowed to transport, scatter, and experience Coulombic drag. Swarm models the transport of lower energy electrons in air, spanning 0.005 eV to 30 keV. The Swarm code performs a full 2-D solution to the Boltzmann equation for electrons in the presence of an applied electric field. Over this energy range the relevant processes to be tracked are elastic scattering, three-body attachment, two-body attachment, rotational excitation, vibrational excitation, electronic excitation, and ionization. All of these occur due to collisions between the electrons and neutral bodies in air. The Rad code solves the full radiation transfer equation in the energy range of 1 keV to 100 MeV. It includes effects of photo-absorption, Compton scattering, and pair production. All of these codes employ a spherical coordinate system in momentum space and a cylindrical coordinate system in configuration space. The "z" axis of the momentum and configuration spaces is assumed to be parallel and we are currently also assuming complete spatial symmetry around the "z" axis. Benchmarking for each of these codes will be discussed, as well as the way forward towards an integrated modern EMP code.

  1. Documentation for grants equal to tax model: Volume 3, Source code

    International Nuclear Information System (INIS)

    Boryczka, M.K.

    1986-01-01

    The GETT model is capable of forecasting the amount of tax liability associated with all property owned and all activities undertaken by the US Department of Energy (DOE) in site characterization and repository development. The GETT program is a user-friendly, menu-driven model developed using dBASE III™, a relational data base management system. The data base for GETT consists primarily of eight separate dBASE III™ files corresponding to each of the eight taxes (real property, personal property, corporate income, franchise, sales, use, severance, and excise) levied by State and local jurisdictions on business property and activity. Additional smaller files help to control model inputs and reporting options. Volume 3 of the GETT model documentation is the source code. The code is arranged primarily by the eight tax types. Other code files include those for JURISDICTION, SIMULATION, VALIDATION, TAXES, CHANGES, REPORTS, GILOT, and GETT. The code has been verified through hand calculations.

  2. HELIAS module development for systems codes

    Energy Technology Data Exchange (ETDEWEB)

    Warmer, F., E-mail: Felix.Warmer@ipp.mpg.de; Beidler, C.D.; Dinklage, A.; Egorov, K.; Feng, Y.; Geiger, J.; Schauer, F.; Turkin, Y.; Wolf, R.; Xanthopoulos, P.

    2015-02-15

    In order to study and design next-step fusion devices such as DEMO, comprehensive systems codes are commonly employed. In this work HELIAS-specific models are proposed which are designed to be compatible with systems codes. The subsequently developed models include: a geometry model based on Fourier coefficients which can represent the complex 3-D plasma shape, a basic island divertor model which assumes diffusive cross-field transport and high radiation at the X-point, and a coil model which combines scaling aspects based on the HELIAS 5-B reactor design in combination with analytic inductance and field calculations. In addition, stellarator-specific plasma transport is discussed. A strategy is proposed which employs a predictive confinement time scaling derived from 1-D neoclassical and 3-D turbulence simulations. This paper reports on the progress of the development of the stellarator-specific models while an implementation and verification study within an existing systems code will be presented in a separate work. This approach is investigated to ultimately allow one to conduct stellarator system studies, develop design points of HELIAS burning plasma devices, and to facilitate a direct comparison between tokamak and stellarator DEMO and power plant designs.

  3. Optimization of Coding of AR Sources for Transmission Across Channels with Loss

    DEFF Research Database (Denmark)

    Arildsen, Thomas

    Source coding concerns the representation of information in a source signal using as few bits as possible. In the case of lossy source coding, it is the encoding of a source signal using the fewest possible bits at a given distortion or, equivalently, at the lowest possible distortion given a specified bit rate. ... Channel coding is usually applied in combination with source coding to ensure reliable transmission of the (source coded) information at the maximal rate across a channel given the properties of this channel. In this thesis, we consider the coding of auto-regressive (AR) sources, which are sources that can ... compared to the case where the encoder is unaware of channel loss. We finally provide an extensive overview of cross-layer communication issues which are important to consider due to the fact that the proposed algorithm interacts with the source coding and exploits channel-related information typically ...
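The predictive-coding idea behind coding an AR source can be sketched with a toy DPCM loop: the encoder predicts each sample from the quantized past using the AR coefficient (assumed known at both ends) and transmits only the uniformly quantized prediction residual. All numbers below are illustrative, not from the thesis:

```python
import numpy as np

def dpcm(x, a, step):
    """DPCM-code an AR(1) signal with known coefficient a: quantize the
    prediction residual e[n] = x[n] - a * xq[n-1] with a uniform
    quantizer, reconstructing inside the loop so the encoder and
    decoder predictors stay synchronized."""
    idx = np.zeros(len(x), dtype=int)   # transmitted quantizer indices
    xq = np.zeros(len(x))               # decoder reconstruction
    prev = 0.0
    for n in range(len(x)):
        pred = a * prev
        idx[n] = int(np.round((x[n] - pred) / step))
        xq[n] = pred + idx[n] * step
        prev = xq[n]
    return idx, xq

# Synthesize an AR(1) source and code it.
rng = np.random.default_rng(0)
a, n = 0.9, 500
x = np.zeros(n)
for i in range(1, n):
    x[i] = a * x[i - 1] + rng.standard_normal()
indices, recon = dpcm(x, a, step=0.5)
```

Because the residual has much smaller variance than the signal itself, it needs fewer bits at the same quantizer step, which is the basic gain predictive source coding extracts from an AR source.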

  4. On transform coding tools under development for VP10

    Science.gov (United States)

    Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao

    2016-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools have already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.
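As a concrete illustration of why transform choice matters for residue coding, the sketch below builds an orthonormal type-II DCT (one standard transform family in such codecs; the 8-point size and ramp-shaped residue are illustrative, not VP10 specifics) and shows its energy compaction:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal type-II DCT: row k is c_k * cos(pi*(2j+1)*k/(2n))."""
    k = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    m = np.cos(np.pi * (2 * j + 1) * k / (2.0 * n))
    m[0] *= np.sqrt(1.0 / n)
    m[1:] *= np.sqrt(2.0 / n)
    return m

n = 8
D = dct_matrix(n)
residue = np.linspace(-1.0, 1.0, n)   # smooth, ramp-like prediction residue
coeffs = D @ residue
```

For this smooth residue nearly all the energy lands in the first AC coefficient. A residue with different structure (a sharp edge, say) spreads its energy differently, which is why offering the codec a flexible set of transforms pays off.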

  5. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    Science.gov (United States)

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  6. Development of Nuclear Energy Security Code

    International Nuclear Information System (INIS)

    Shimamura, Takehisa; Suzuki, Atsuyuki; Okubo, Hiroo; Kikuchi, Masahiro.

    1990-01-01

    In establishing the nuclear fuel cycle in Japan, a country with a vulnerable energy structure, the effectiveness of energy security should be taken into account as well as the economics based on the balance of supply and demand of nuclear fuels. NMCC has developed the 'Nuclear Energy Security Code', which can evaluate the effectiveness of energy security. The evaluation method adopted in this code is the 'Import Premium' proposed in 'World Oil', EMF Report 6. The viewpoints of the evaluation are as follows: 1. How much uranium fuel can be saved by using plutonium fuel? 2. How much of a sudden rise in fuel cost can be absorbed by establishing the plutonium cycle before an energy crisis? (author)

  7. ICARE/CATHARE and ASTEC code development trends

    International Nuclear Information System (INIS)

    Chatelard, P.; Dorsselaere, J.-P. van

    2000-01-01

    Regarding the computer code development for simulation of LWR severe accidents, IPSN developed a two-tier approach based on detailed codes such as ICARE/CATHARE and simplified models to be assembled in the ASTEC integral code. The ICARE/CATHARE code results from the coupling between the ICARE2 code modelling the core degradation phenomena and the thermalhydraulics code CATHARE2. It allows calculation of PWR and VVER severe accident sequences over the whole RCS. The modelling of the early degradation phase can be considered as rather complete in the ICARE/CATHARE V1 mod1 version (to be released by mid-2000) whereas some models are still missing for the late phase. The main future developments (ICARE/CATHARE V2) will concern the multi-dimensional thermalhydraulics, the quenching of partially damaged cores (mechanical and chemical effects), the debris bed two-phase thermalhydraulics (including reflooding) and the corium behaviour in the lower head. The main other physical improvements should concern the behaviour of boron carbide control rods, the processes governing the core loss of geometry (transition phase) and the oxidation of relocated melts. The ASTEC (Accident Source Term Evaluation Code) integral code, jointly developed by IPSN and GRS, aims to predict an entire LWR (PWR, VVER and BWR) severe accident sequence from the initiating event through to FP release out of the containment, for source term, PSA level 2, or accident management studies. The version ASTEC V0.3, to be released by mid-2000, can now be considered robust and fast-running enough (between 2 and 12 hours for a one-day accident) and makes it possible to perform, with a multi-compartment containment configuration, any accident scenario study accounting for the main safety systems and operator procedures (spray, recombiner, etc.). The next version ASTEC V1, to be released at the beginning of 2002, will include front-end simulation and improved modelling of in-vessel core degradation.
A large validation activity will

  8. Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base

    Science.gov (United States)

    Savage, B.; Snoke, J. A.

    2017-12-01

    The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size from 16 to 32 in the 1980s, and 32 to 64 in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems, including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, variants of OS X (PowerPC and x86 processors), and Windows (Cygwin). Traces of these systems are still visible in the source code and associated comments. A major concern while improving and maintaining a routinely used, legacy code is a fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Lab). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. However, there have been thousands of downloads a year of the package, either source code or binaries for specific systems. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC, including use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion.
For the past thirty-plus years

  9. A Comparison of Source Code Plagiarism Detection Engines

    Science.gov (United States)

    Lancaster, Thomas; Culwin, Fintan

    2004-06-01

    Automated techniques for finding plagiarism in student source code submissions have been in use for over 20 years and there are many available engines and services. This paper reviews the literature on the major modern detection engines, providing a comparison of them based upon the metrics and techniques they deploy. Generally the most common and effective techniques are seen to involve tokenising student submissions then searching pairs of submissions for long common substrings, an example of what is defined to be a paired structural metric. Computing academics are recommended to use one of the two Web-based detection engines, MOSS and JPlag. It is shown that whilst detection is well established there are still places where further research would be useful, particularly where visual support of the investigation process is possible.
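The paired structural metric described above, tokenise then search for long common substrings, can be sketched in a few lines. The tokenizer and keyword list here are deliberately crude stand-ins for what real engines such as MOSS or JPlag do:

```python
import re

KEYWORDS = {"if", "else", "for", "while", "return", "def"}  # toy list

def tokenize(source):
    """Crude lexer: map identifiers and numbers to single token classes
    so that renaming variables does not defeat the comparison."""
    tokens = []
    for tok in re.findall(r"[A-Za-z_]\w*|\d+|\S", source):
        if tok.isdigit():
            tokens.append("NUM")
        elif re.match(r"[A-Za-z_]", tok) and tok not in KEYWORDS:
            tokens.append("ID")
        else:
            tokens.append(tok)
    return tokens

def longest_common_run(a, b):
    """Longest common substring of two token lists, via the classic
    O(len(a) * len(b)) dynamic programme."""
    best = 0
    prev = [0] * (len(b) + 1)
    for x in a:
        cur = [0] * (len(b) + 1)
        for j, y in enumerate(b, 1):
            if x == y:
                cur[j] = prev[j - 1] + 1
                best = max(best, cur[j])
        prev = cur
    return best

# Two "submissions" that differ only by renamed identifiers.
sub_a = "def add(a, b):\n    return a + b"
sub_b = "def total(x, y):\n    return x + y"
```

Because both submissions tokenize to the identical stream, the longest common run covers the whole program, exactly the signal a structural metric is designed to surface.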

  10. Development of covariance capabilities in EMPIRE code

    Energy Technology Data Exchange (ETDEWEB)

    Herman,M.; Pigni, M.T.; Oblozinsky, P.; Mughabghab, S.F.; Mattoon, C.M.; Capote, R.; Cho, Young-Sik; Trkov, A.

    2008-06-24

    The nuclear reaction code EMPIRE has been extended to provide evaluation capabilities for neutron cross section covariances in the thermal, resolved resonance, unresolved resonance and fast neutron regions. The Atlas of Neutron Resonances by Mughabghab is used as a primary source of information on uncertainties at low energies. Care is taken to ensure consistency among the resonance parameter uncertainties and those for thermal cross sections. The resulting resonance parameter covariances are formatted in the ENDF-6 File 32. In the fast neutron range our methodology is based on model calculations with the code EMPIRE combined with experimental data through several available approaches. The model-based covariances can be obtained using deterministic (Kalman) or stochastic (Monte Carlo) propagation of model parameter uncertainties. We show that these two procedures yield comparable results. The Kalman filter and/or the generalized least square fitting procedures are employed to incorporate experimental information. We compare the two approaches by analyzing results for the major reaction channels on 89Y. We also discuss a long-standing issue of unreasonably low uncertainties and link it to the rigidity of the model.
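The stochastic (Monte Carlo) propagation route can be sketched generically: sample the model parameters, rerun the model, and form the energy-energy covariance from the ensemble. The two-parameter exponential "model" and its uncertainties below are invented for illustration and are not EMPIRE physics:

```python
import numpy as np

def toy_cross_section(e, scale, slope):
    """Stand-in for a nuclear-model run: a smooth sigma(E) curve."""
    return scale * np.exp(-slope * e)

rng = np.random.default_rng(1)
energies = np.linspace(0.1, 5.0, 6)      # coarse energy grid (illustrative)
n_samples = 5000

# Assumed relative uncertainties on the two model parameters.
scales = rng.normal(1.0, 0.05, n_samples)
slopes = rng.normal(0.30, 0.03, n_samples)

runs = np.array([toy_cross_section(energies, s, b)
                 for s, b in zip(scales, slopes)])
cov = np.cov(runs, rowvar=False)         # energy-energy covariance
std = np.sqrt(np.diag(cov))
corr = cov / np.outer(std, std)
```

The shared parameters induce strong correlations between nearby energy points, which is exactly the cross-energy structure an ENDF-6 covariance file records.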

  11. Software development an open source approach

    CERN Document Server

    Tucker, Allen; de Silva, Chamindra

    2011-01-01

    Overview and Motivation; Software; Free and Open Source Software (FOSS); Two Case Studies; Working with a Project Team; Key FOSS Activities; Client-Oriented vs. Community-Oriented Projects; Working on a Client-Oriented Project; Joining a Community-Oriented Project; Using Project Tools; Collaboration Tools; Code Management Tools; Run-Time System Constraints; Software Architecture; Architectural Patterns; Layers, Cohesion, and Coupling; Security; Concurrency, Race Conditions, and Deadlocks; Working with Code; Bad Smells and Metrics; Refactoring; Testing; Debugging; Extending the Software for a New Project; Developing the D

  12. Development of Probabilistic Internal Dosimetry Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Noh, Siwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kwon, Tae-Eun [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of); Lee, Jai-Ki [Korean Association for Radiation Protection, Seoul (Korea, Republic of)

    2017-02-15

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was constructed. Based on the developed system, we developed a probabilistic internal-dose-assessment code using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various
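The Monte Carlo half of such a system can be sketched end-to-end for a single hypothetical bioassay: sample each uncertain component, propagate to dose, and read off the reported percentiles. Every distribution and numerical value below is an invented placeholder, not data from the code:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000

# One hypothetical urine-bioassay result and its uncertain components.
measured_bq = 50.0
meas = measured_bq * rng.lognormal(0.0, 0.10, n)      # counting uncertainty
irf = rng.lognormal(np.log(2.0e-3), 0.30, n)          # intake retention fraction
dose_coeff = rng.lognormal(np.log(1.0e-6), 0.20, n)   # committed dose, Sv/Bq intake

intake_bq = meas / irf                  # sampled intake activity
dose_sv = intake_bq * dose_coeff        # sampled committed dose
pct = np.percentile(dose_sv, [2.5, 5.0, 50.0, 95.0, 97.5])
```

A sensitivity analysis of the kind the abstract describes would then vary one component at a time (or compute rank correlations between each input sample and `dose_sv`) to find the dominant uncertainty contributor.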

  13. Development of disruption thermal analysis code DREAM

    Energy Technology Data Exchange (ETDEWEB)

    Yamazaki, Seiichiro; Kobayahsi, Takeshi [Kawasaki Heavy Industries Ltd., Kobe (Japan); Seki, Masahiro

    1989-07-01

    When a plasma disruption takes place in a tokamak-type fusion reactor, plasma-facing components such as the first wall and divertor/limiter are subjected to an intense heat load of short duration. At the surface of the wall, the temperature rises rapidly, and melting and evaporation occur. This causes reduction of wall thickness and crack initiation/propagation. As the lifetime of the components is significantly affected by these processes, transient analysis taking account of phase changes and radiation heat loss is required in the design of these components. This paper describes the computer code DREAM, developed to perform disruption thermal analysis taking phase changes and radiation into account. (author).
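The kind of transient surface-heating calculation DREAM performs can be caricatured with an explicit 1-D conduction solve under a disruption-scale heat flux with a grey-body radiation loss. Phase changes, which DREAM does model, are omitted here, and the material numbers are roughly graphite-like placeholders, not design data:

```python
import numpy as np

k, rho, cp = 100.0, 1800.0, 700.0        # W/m/K, kg/m3, J/kg/K (placeholders)
alpha = k / (rho * cp)
nx, depth = 50, 0.01                     # 50 cells across 1 cm of wall
dx = depth / nx
dt = 0.2 * dx * dx / alpha               # well inside explicit stability limit
sigma_sb = 5.67e-8                       # Stefan-Boltzmann constant
q_surface = 1.0e7                        # 10 MW/m2 disruption heat load
emissivity = 0.8

T = np.full(nx, 300.0)                   # initial temperature, K
steps = int(1.0e-3 / dt)                 # 1 ms heat pulse
for _ in range(steps):
    Tn = T.copy()
    # interior cells: explicit diffusion
    Tn[1:-1] += alpha * dt / dx ** 2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    # surface cell: absorbed flux minus grey-body radiation, plus conduction
    q_net = q_surface - emissivity * sigma_sb * T[0] ** 4
    Tn[0] += dt / (rho * cp * dx) * (q_net + k * (T[1] - T[0]) / dx)
    T = Tn
```

The surface cell heats sharply while the back face, a cell-and-a-half of thermal penetration away, barely moves; DREAM layers melting and evaporation front tracking on top of exactly this kind of transient profile.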

  14. Development of Probabilistic Internal Dosimetry Computer Code

    International Nuclear Information System (INIS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-01-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was constructed. Based on the developed system, we developed a probabilistic internal-dose-assessment code using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations.
In cases

  15. Development of disruption thermal analysis code DREAM

    International Nuclear Information System (INIS)

    Yamazaki, Seiichiro; Kobayahsi, Takeshi; Seki, Masahiro.

    1989-01-01

    When a plasma disruption takes place in a tokamak-type fusion reactor, plasma-facing components such as the first wall and divertor/limiter are subjected to an intense heat load of short duration. At the surface of the wall, the temperature rises rapidly, and melting and evaporation occur. This causes reduction of wall thickness and crack initiation/propagation. As the lifetime of the components is significantly affected by these processes, transient analysis taking account of phase changes and radiation heat loss is required in the design of these components. This paper describes the computer code DREAM, developed to perform disruption thermal analysis taking phase changes and radiation into account. (author)

  16. Code development for nuclear reactor simulation

    International Nuclear Information System (INIS)

    Chauliac, C.; Verwaerde, D.; Pavageau, O.

    2006-01-01

    Full text of publication follows: For several years, CEA, EDF and FANP have developed several numerical codes which are currently used for nuclear industry applications and will remain in use for the coming years. Complementary to this set of codes, and in order to better meet present and future needs, a new system is being developed through a joint venture between CEA, EDF and FANP, with a ten-year prospect and strong intermediate milestones. The focus is put on a multi-scale and multi-physics approach enabling phenomena from the microscopic to the macroscopic scale to be taken into account, and describing the interactions between various physical fields such as neutronics (DESCARTES), thermal-hydraulics (NEPTUNE) and fuel behaviour (PLEIADES). This approach is based on a more rational design of the software and uses a common integration platform providing pre-processing, supervision of computation and post-processing. This paper will describe the overall system under development and present the first results obtained. (authors)

  17. Coded moderator approach for fast neutron source detection and localization at standoff

    Energy Technology Data Exchange (ETDEWEB)

    Littell, Jennifer [Department of Nuclear Engineering, University of Tennessee, 305 Pasqua Engineering Building, Knoxville, TN 37996 (United States); Lukosi, Eric, E-mail: elukosi@utk.edu [Department of Nuclear Engineering, University of Tennessee, 305 Pasqua Engineering Building, Knoxville, TN 37996 (United States); Institute for Nuclear Security, University of Tennessee, 1640 Cumberland Avenue, Knoxville, TN 37996 (United States); Hayward, Jason; Milburn, Robert; Rowan, Allen [Department of Nuclear Engineering, University of Tennessee, 305 Pasqua Engineering Building, Knoxville, TN 37996 (United States)

    2015-06-01

    Considering the need for directional sensing at standoff for some security applications and scenarios where a neutron source may be shielded by high-Z material that nearly eliminates the source gamma flux, this work focuses on investigating the feasibility of using thermal-neutron-sensitive boron straw detectors for fast neutron source detection and localization. We utilized MCNPX simulations to demonstrate that, by surrounding the boron straw detectors with an HDPE coded moderator, a source-detector orientation-specific response enables potential 1D source localization in a high neutron detection efficiency design. An initial test algorithm has been developed in order to confirm the viability of this detector system's localization capabilities, which resulted in the identification of a 1 MeV neutron source with a strength equivalent to 8 kg WGPu at 50 m standoff within ±11°.
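The orientation-specific response idea can be sketched as template matching: precompute the straw-array response for each candidate source direction, then pick the direction whose template best matches the measured counts. The cosine response model, 12-straw geometry, and noise level below are invented for illustration, not taken from the MCNPX study:

```python
import numpy as np

DET_ANGLES = np.arange(0.0, 360.0, 30.0)   # 12 straw orientations (deg)

def template(source_deg):
    """Hypothetical coded-moderator response: each straw's relative count
    rate peaks when the source direction lines up with its channel."""
    d = np.deg2rad(DET_ANGLES - source_deg)
    return 1.0 + 0.8 * np.cos(d)

LIBRARY = {a: template(a) for a in range(0, 360, 5)}   # 5-degree grid

def locate(measured):
    """Return the library angle whose template best matches (least squares)."""
    return min(LIBRARY, key=lambda a: np.sum((LIBRARY[a] - measured) ** 2))

rng = np.random.default_rng(7)
truth = 135
counts = template(truth) + rng.normal(0.0, 0.02, DET_ANGLES.size)
estimate = locate(counts)
```

In the real system the template library would come from the MCNPX-simulated coded-moderator responses rather than an analytic cosine, but the matching step is the same.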

  18. Development of DUST: A computer code that calculates release rates from a LLW disposal unit

    International Nuclear Information System (INIS)

    Sullivan, T.M.

    1992-01-01

    Performance assessment of a Low-Level Waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the disposal unit source term). The major physical processes that influence the source term are water flow, container degradation, waste form leaching, and radionuclide transport. A computer code, DUST (Disposal Unit Source Term), has been developed which incorporates these processes in a unified manner. The DUST code improves upon existing codes as it has the capability to model multiple container failure times, multiple waste form release properties, and radionuclide-specific transport properties. Verification studies performed on the code are discussed.
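The unified treatment of container failure and waste-form leaching can be sketched with a toy release-rate model: each container begins first-order leaching of its inventory at its own failure time while the nuclide decays. All rates, times, and inventories are illustrative only, not DUST's actual models:

```python
import numpy as np

def release_rate(t, fail_times, leach_rate, inventory, decay_const):
    """Facility release rate at times t: each container starts first-order
    leaching of its inventory once it fails, while the nuclide decays."""
    t = np.asarray(t, dtype=float)
    rate = np.zeros_like(t)
    for tf in fail_times:
        active = t >= tf
        dt_leach = t[active] - tf
        rate[active] += (leach_rate * inventory
                         * np.exp(-leach_rate * dt_leach)
                         * np.exp(-decay_const * t[active]))
    return rate

years = np.linspace(0.0, 300.0, 301)
fails = [10.0, 50.0, 120.0]          # three container failure times (years)
rr = release_rate(years, fails, leach_rate=0.02, inventory=1.0,
                  decay_const=np.log(2.0) / 30.0)
```

Supporting several failure times and release laws per waste form, as here in miniature, is exactly the flexibility the abstract credits to DUST over earlier codes.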

  19. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    Science.gov (United States)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open source code to model the performance of wave energy converters (WECs) in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation and, as a result, are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable power-take-off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be

  20. Python-Assisted MODFLOW Application and Code Development

    Science.gov (United States)

    Langevin, C.

    2013-12-01

    The U.S. Geological Survey (USGS) has a long history of developing and maintaining free, open-source software for hydrological investigations. The MODFLOW program is one of the most popular hydrologic simulation programs released by the USGS, and it is considered to be the most widely used groundwater flow simulation code. MODFLOW was written using a modular design and a procedural FORTRAN style, which resulted in code that could be understood, modified, and enhanced by many hydrologists. The code is fast, and because it uses standard FORTRAN it can be run on most operating systems. Most MODFLOW users rely on proprietary graphical user interfaces for constructing models and viewing model results. Some recent efforts, however, have focused on construction of MODFLOW models using open-source Python scripts. Customizable Python packages, such as FloPy (https://code.google.com/p/flopy), can be used to generate input files, read simulation results, and visualize results in two and three dimensions. Automating this sequence of steps leads to models that can be reproduced directly from original data and rediscretized in space and time. Python is also being used in the development and testing of new MODFLOW functionality. New packages and numerical formulations can be quickly prototyped and tested first with Python programs before implementation in MODFLOW. This is made possible by the flexible object-oriented design capabilities available in Python, the ability to call FORTRAN code from Python, and the ease with which linear systems of equations can be solved using SciPy, for example. Once new features are added to MODFLOW, Python can then be used to automate comprehensive regression testing and ensure reliability and accuracy of new versions prior to release.
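The reproducible-model-construction workflow described here, with Python generating input arrays directly from original data, can be caricatured in pure Python: rediscretize a fine property grid by block averaging and write it out as fixed-width text. The file layout below is only illustrative of the idea; the actual MODFLOW array formats are what FloPy handles:

```python
import os
import tempfile
import numpy as np

def write_array(path, arr, fmt="%12.4e", per_line=10):
    """Write a 2-D model array as fixed-width text in the general style
    of MODFLOW array input (illustrative layout only; FloPy writes the
    real package formats)."""
    with open(path, "w") as f:
        for row in arr:
            for i in range(0, row.size, per_line):
                f.write("".join(fmt % v for v in row[i:i + per_line]) + "\n")

# Reproducible rediscretization: block-average a fine hydraulic-
# conductivity field onto a coarser 3 x 25 grid.
rng = np.random.default_rng(0)
fine = rng.lognormal(mean=0.0, sigma=0.5, size=(6, 50))
coarse = fine.reshape(3, 2, 25, 2).mean(axis=(1, 3))

path = os.path.join(tempfile.gettempdir(), "hk_layer1.txt")
write_array(path, coarse)
```

Because the whole chain, from random seed to written file, is a script, the model can be regenerated or rediscretized at will, which is the reproducibility argument the abstract makes for FloPy-style workflows.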

  1. FDA Developments: Food Code 2013 and Proposed Trans Fat Determination

    NARCIS (Netherlands)

    Grossman, M.R.

    2014-01-01

    Since 1993, the US Food and Drug Administration has published a Food Code, now updated every four years. In November 2013, the

  2. TRAC code development status and plans

    International Nuclear Information System (INIS)

    Spore, J.W.; Liles, D.R.; Nelson, R.A.

    1986-01-01

    This report summarizes the characteristics and current status of the TRAC-PF1/MOD1 computer code. Recent error corrections and user-convenience features are described, and several user enhancements are identified. Current plans for the release of the TRAC-PF1/MOD2 computer code and some preliminary MOD2 results are presented. This new version of the TRAC code implements stability-enhancing two-step numerics into the 3-D vessel, using partial vectorization to obtain a code that has run 400% faster than the MOD1 code

  3. Theoretical Atomic Physics code development II: ACE: Another collisional excitation code

    International Nuclear Information System (INIS)

    Clark, R.E.H.; Abdallah, J. Jr.; Csanak, G.; Mann, J.B.; Cowan, R.D.

    1988-12-01

    A new computer code for calculating collisional excitation data (collision strengths or cross sections) using a variety of models is described. The code uses data generated by the Cowan Atomic Structure code or CATS for the atomic structure. Collisional data are placed on a random access file and can be displayed in a variety of formats using the Theoretical Atomic Physics Code or TAPS. All of these codes are part of the Theoretical Atomic Physics code development effort at Los Alamos. 15 refs., 10 figs., 1 tab

  4. The PARTRAC code: Status and recent developments

    Science.gov (United States)

    Friedland, Werner; Kundrat, Pavel

    Biophysical modeling is of particular value for predictions of radiation effects due to manned space missions. PARTRAC is an established tool for Monte Carlo-based simulations of radiation track structures, damage induction in cellular DNA and its repair [1]. Dedicated modules describe interactions of ionizing particles with the traversed medium, the production and reactions of reactive species, and score DNA damage determined by overlapping track structures with multi-scale chromatin models. The DNA repair module describes the repair of DNA double-strand breaks (DSB) via the non-homologous end-joining pathway; the code explicitly simulates the spatial mobility of individual DNA ends in parallel with their processing by major repair enzymes [2]. To simulate the yields and kinetics of radiation-induced chromosome aberrations, the repair module has been extended by tracking the information on the chromosome origin of ligated fragments as well as the presence of centromeres [3]. PARTRAC calculations have been benchmarked against experimental data on various biological endpoints induced by photon and ion irradiation. The calculated DNA fragment distributions after photon and ion irradiation reproduce corresponding experimental data and their dose- and LET-dependence. However, in particular for high-LET radiation many short DNA fragments are predicted below the detection limits of the measurements, so that the experiments significantly underestimate DSB yields by high-LET radiation [4]. The DNA repair module correctly describes the LET-dependent repair kinetics after 60Co gamma-rays and different N-ion radiation qualities [2]. First calculations on the induction of chromosome aberrations have overestimated the absolute yields of dicentrics, but correctly reproduced their relative dose-dependence and the difference between gamma- and alpha-particle irradiation [3].
Recent developments of the PARTRAC code include a model of hetero- vs euchromatin structures to enable

  5. Modelling RF sources using 2-D PIC codes

    Energy Technology Data Exchange (ETDEWEB)

    Eppley, K.R.

    1993-03-01

    In recent years, many types of RF sources have been successfully modelled using 2-D PIC codes. Both cross field devices (magnetrons, cross field amplifiers, etc.) and pencil beam devices (klystrons, gyrotrons, TWTs, lasertrons, etc.) have been simulated. All these devices involve the interaction of an electron beam with an RF circuit. For many applications, the RF structure may be approximated by an equivalent circuit, which appears in the simulation as a boundary condition on the electric field (the "port approximation"). The drive term for the circuit is calculated from the energy transfer between beam and field in the drift space. For some applications it may be necessary to model the actual geometry of the structure, although this is more expensive. One problem not entirely solved is how to accurately model in 2-D the coupling to an external waveguide. Frequently this is approximated by a radial transmission line, but this sometimes yields incorrect results. We also discuss issues in modelling the cathode and injecting the beam into the PIC simulation.
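
    The port approximation described above reduces the RF structure to an equivalent circuit driven by the beam. As a rough illustration of the idea only (not the algorithm of any of the codes cited here), the sketch below drives a single cavity mode, modelled as a damped resonator, with a resonant beam-current harmonic; the function name, parameters, and the semi-implicit integrator are assumptions of this sketch.

```python
import math

def cavity_response(omega0, Q, R, I0, periods=160, steps_per_period=100):
    """Semi-implicit Euler integration of the equivalent-circuit mode
        V'' + (omega0 / Q) V' + omega0**2 V = omega0 * (R / Q) * dI/dt
    driven at resonance by a beam-current harmonic I = I0 sin(omega0 t).
    Returns the voltage amplitude reached (theory: R * I0 at resonance)."""
    dt = 2.0 * math.pi / (omega0 * steps_per_period)
    V, U = 0.0, 0.0                      # mode voltage and its derivative
    n_steps = periods * steps_per_period
    peak = 0.0
    for k in range(n_steps):
        dIdt = I0 * omega0 * math.cos(omega0 * k * dt)
        U += dt * (omega0 * (R / Q) * dIdt - (omega0 / Q) * U - omega0**2 * V)
        V += dt * U
        if k >= n_steps - 2 * steps_per_period:  # sample the final two periods
            peak = max(peak, abs(V))
    return peak
```

    Driving long enough for the transient to decay (several filling times, 2Q/omega0), the amplitude saturates near R*I0, the expected steady-state response at resonance.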

  8. ER@CEBAF: Modeling code developments

    Energy Technology Data Exchange (ETDEWEB)

    Meot, F. [Brookhaven National Lab. (BNL), Upton, NY (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Roblin, Y. [Brookhaven National Lab. (BNL), Upton, NY (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-04-13

    A proposal for a multiple-pass, high-energy, energy-recovery experiment using CEBAF is under preparation in the frame of a JLab-BNL collaboration. In view of beam dynamics investigations regarding this project, in addition to the existing model in use in Elegant, a version of CEBAF has been developed in the stepwise ray-tracing code Zgoubi. Beyond the ER experiment, it is also planned to use the latter for the study of polarization transport in the presence of synchrotron radiation, down to the Hall D line, where a 12 GeV polarized beam can be delivered. This note briefly reports on the preliminary steps, and preliminary outcomes, based on an Elegant-to-Zgoubi translation.

  9. Schroedinger’s Code: A Preliminary Study on Research Source Code Availability and Link Persistence in Astrophysics

    Science.gov (United States)

    Allen, Alice; Teuben, Peter J.; Ryan, P. Wesley

    2018-05-01

    We examined software usage in a sample set of astrophysics research articles published in 2015 and searched for the source codes for the software mentioned in these research papers. We categorized the software to indicate whether the source code is available for download and whether there are restrictions to accessing it, and, if the source code is not available, whether some other form of the software, such as a binary, is. We also extracted hyperlinks from one journal's 2015 research articles, as links in articles can serve as an acknowledgment of software use and lead to the data used in the research, and tested them to determine which of these URLs are still accessible. For our sample of 715 software instances in the 166 articles we examined, we were able to categorize 418 records according to whether source code was available and found that 285 unique codes were used, 58% of which offered the source code for download. Of the 2558 hyperlinks extracted from 1669 research articles, at best 90% of them were available over our testing period.
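
    The categorization and link-testing workflow described above can be sketched in a few lines. Everything here is illustrative: the record fields, the function names, and the use of a plain HTTP request as the "accessibility" test are assumptions of this sketch, not the authors' methodology.

```python
import urllib.request
from dataclasses import dataclass

@dataclass(frozen=True)
class SoftwareRecord:
    name: str
    source_available: bool   # source code offered for download

def source_code_share(records):
    """Percent of unique codes whose source is downloadable,
    collapsing repeat mentions of the same code across articles."""
    unique = {r.name: r for r in records}
    if not unique:
        return 0.0
    with_source = sum(1 for r in unique.values() if r.source_available)
    return 100.0 * with_source / len(unique)

def link_is_live(url, timeout=10):
    """Crude link-persistence probe: does the URL still answer?
    (Requires network access; any failure counts as a dead link.)"""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False
```

    Deduplicating by name before computing the share mirrors the distinction the abstract draws between software instances (715) and unique codes (285).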

  10. Experimental benchmark of the NINJA code for application to the Linac4 H- ion source plasma

    Science.gov (United States)

    Briefi, S.; Mattei, S.; Rauner, D.; Lettry, J.; Tran, M. Q.; Fantz, U.

    2017-10-01

    For a dedicated performance optimization of negative hydrogen ion sources applied at particle accelerators, a detailed assessment of the plasma processes is required. Due to the compact design of these sources, diagnostic access is typically limited to optical emission spectroscopy yielding only line-of-sight integrated results. In order to allow for a spatially resolved investigation, the electromagnetic particle-in-cell Monte Carlo collision code NINJA has been developed for the Linac4 ion source at CERN. This code considers the RF field generated by the ICP coil as well as the external static magnetic fields and calculates self-consistently the resulting discharge properties. NINJA is benchmarked at the diagnostically well accessible lab experiment CHARLIE (Concept studies for Helicon Assisted RF Low pressure Ion sourcEs) at varying RF power and gas pressure. A good general agreement is observed between experiment and simulation although the simulated electron density trends for varying pressure and power as well as the absolute electron temperature values deviate slightly from the measured ones. This can be explained by the assumption of strong inductive coupling in NINJA, whereas the CHARLIE discharges show the characteristics of loosely coupled plasmas. For the Linac4 plasma, this assumption is valid. Accordingly, both the absolute values of the accessible plasma parameters and their trends for varying RF power agree well in measurement and simulation. At varying RF power, the H- current extracted from the Linac4 source peaks at 40 kW. For volume operation, this is perfectly reflected by assessing the processes in front of the extraction aperture based on the simulation results where the highest H- density is obtained for the same power level. In surface operation, the production of negative hydrogen ions at the converter surface can only be considered by specialized beam formation codes, which require plasma parameters as input. 
It has been demonstrated that

  11. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infra-structure and

  12. Status of development and verification of the CTFD code FLUBOX

    International Nuclear Information System (INIS)

    Graf, U.; Paradimitriou, P.

    2004-01-01

    The Computational Two-Fluid Dynamics (CTFD) code FLUBOX is developed at GRS for the multidimensional simulation of two-phase flows. FLUBOX will also be used as a multidimensional module for the German system code ATHLET. The benchmark test cases of the European ASTAR project were used to verify the ability of the code FLUBOX to calculate typical two-phase flow phenomena and conditions: void and pressure wave propagation, phase transitions, countercurrent flows, sharp interface movements, compressible (vapour) and nearly incompressible (water) conditions, thermal and mechanical non-equilibrium, and stiff source terms due to mass and heat transfer between the phases. Realistic simulations of two-phase flow require, besides the pure conservation equations, additional transport equations for the interfacial area, turbulent energy and dissipation. A transport equation for the interfacial area density covering the whole two-phase flow range is in development. First validation calculations are presented in the paper. Turbulent shear stress for two-phase flows will be modelled by the development of transport equations for the turbulent kinetic energy and the turbulent dissipation rate. The development of the transport equations is mainly based on first principles for bubbles or drops and is largely free from empiricism. (author)

  13. Using National Drug Codes and drug knowledge bases to organize prescription records from multiple sources.

    Science.gov (United States)

    Simonaitis, Linas; McDonald, Clement J

    2009-10-01

    The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the inpatient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
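
    The two measurements reported above, volume-weighted coverage of product codes and the skew of the code distribution, are simple to state precisely. The sketch below is a hypothetical rendering of both; the function names and data shapes are assumptions of this sketch, not the study's code.

```python
from collections import Counter

def coverage_by_volume(prescriptions, dkb_codes):
    """Percent of total prescription volume whose product code appears
    in a drug knowledge base mapping table."""
    total = len(prescriptions)
    if total == 0:
        return 0.0
    covered = sum(1 for code in prescriptions if code in dkb_codes)
    return 100.0 * covered / total

def codes_for_volume_share(prescriptions, share=0.5):
    """Percent of distinct codes that, taken from most to least
    prescribed, account for `share` of the message volume -- the skew
    statistic quoted in the abstract."""
    counts = Counter(prescriptions)
    if not counts:
        return 0.0
    total = sum(counts.values())
    running = needed = 0
    for _, c in counts.most_common():
        running += c
        needed += 1
        if running >= share * total:
            break
    return 100.0 * needed / len(counts)
```

    Counting per prescription rather than per distinct code is what makes the coverage "volume-weighted": a single heavily prescribed covered code can dominate the statistic.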

  14. The source development lab linac at BNL

    International Nuclear Information System (INIS)

    Graves, W.S.; Johnson, E.D.

    1996-12-01

    A 210 MeV SLAC-type electron linac is currently under construction at BNL as part of the Source Development Laboratory. A 1.6 cell RF photoinjector is employed as the high brightness electron source, which is excited by a frequency-tripled Titanium:Sapphire laser. This linac will be used for several source development projects including a short bunch storage ring and a series of FEL experiments based on the 10 m long NISUS undulator. The FEL will be operated as either a SASE or seeded beam device using the Ti:Sapphire laser. For the seeded beam experiments, direct amplification, harmonic generation, and chirped pulse amplification modes will be studied, spanning an output wavelength range from 900 nm down to 100 nm. This paper presents the project's design parameters and results of recent modeling using the PARMELA and MAD simulation codes.

  15. Hot Hydrogen Heat Source Development

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this project is to develop a  hot hydrogen heat source that would produce  a high temperature hydrogen flow which would be comparable to that produced...

  16. Source-term model for the SYVAC3-NSURE performance assessment code

    International Nuclear Information System (INIS)

    Rowat, J.H.; Rattan, D.S.; Dolinar, G.M.

    1996-11-01

    Radionuclide contaminants in wastes emplaced in disposal facilities will not remain in those facilities indefinitely. Engineered barriers will eventually degrade, allowing radioactivity to escape from the vault. The radionuclide release rate from a low-level radioactive waste (LLRW) disposal facility, the source term, is a key component in the performance assessment of the disposal system. This report describes the source-term model that has been implemented in Ver. 1.03 of the SYVAC3-NSURE (Systems Variability Analysis Code generation 3-Near Surface Repository) code. NSURE is a performance assessment code that evaluates the impact of near-surface disposal of LLRW through the groundwater pathway. The source-term model described here was developed for the Intrusion Resistant Underground Structure (IRUS) disposal facility, which is a vault that is to be located in the unsaturated overburden at AECL's Chalk River Laboratories. The processes included in the vault model are roof and waste package performance, and diffusion, advection and sorption of radionuclides in the vault backfill. The model presented here was developed for the IRUS vault; however, it is applicable to other near-surface disposal facilities. (author). 40 refs., 6 figs
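
    The vault processes listed above (diffusion, advection and sorption, plus radioactive decay) are commonly written as a retarded advection-dispersion equation, R dc/dt = D d2c/dx2 - v dc/dx - R*lam*c. The explicit finite-difference sketch below is a generic stand-in for this class of transport model; the discretization, boundary conditions and parameter names are assumptions of this sketch, not the NSURE implementation.

```python
import numpy as np

def vault_release(c0, v, D, R, lam, L=1.0, nx=50, t_end=5.0):
    """Explicit upwind finite-difference solution of
        R dc/dt = D d2c/dx2 - v dc/dx - R * lam * c
    on x in [0, L], with a fixed concentration c0 at the inlet and a
    free-outflow condition at the outlet.  Returns c(x) at t_end."""
    dx = L / nx
    dt = 0.4 * min(dx * dx / (2.0 * D), dx / v)   # explicit stability limit
    c = np.zeros(nx)
    c[0] = c0
    t = 0.0
    while t < t_end:
        diff = D * (np.roll(c, -1) - 2.0 * c + np.roll(c, 1)) / dx**2
        adv = -v * (c - np.roll(c, 1)) / dx        # upwind advection
        c = c + dt * ((diff + adv) / R - lam * c)
        c[0] = c0          # inlet boundary (overwrites wrap-around values)
        c[-1] = c[-2]      # outflow boundary
        t += dt
    return c
```

    The retardation factor R captures sorption: a larger R slows the contaminant front, and the decay constant lam attenuates it in transit, which is why engineered-barrier performance delays, rather than merely reduces, the source term.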

  17. Recent negative ion source developments

    International Nuclear Information System (INIS)

    Alton, G.D.

    1978-01-01

    This report describes recent results obtained from studies associated with the development of negative ion sources which utilize sputtering in a diffuse cesium plasma as a means of ion beam generation. Data are presented which relate negative ion yield to important operational parameters, such as cesium oven temperature and sputter probe voltage, for each of the following sources: (1) a source based in principle on the University of Aarhus design and (2) an axial geometry source. The important design aspects of the sources are given, along with a list of the negative ion intensities observed to date. A qualitative description and interpretation of the negative ion generation mechanism in sources which utilize sputtering in the presence of cesium is also given.

  18. Optimal source coding, removable noise elimination, and natural coordinate system construction for general vector sources using replicator neural networks

    Science.gov (United States)

    Hecht-Nielsen, Robert

    1997-04-01

    A new universal one-chart smooth manifold model for vector information sources is introduced. Natural coordinates (a particular type of chart) for such data manifolds are then defined. Uniformly quantized natural coordinates form an optimal vector quantization code for a general vector source. Replicator neural networks (a specialized type of multilayer perceptron with three hidden layers) are then introduced. As properly configured examples of replicator networks approach minimum mean squared error (e.g., via training and architecture adjustment using randomly chosen vectors from the source), these networks automatically develop a mapping which, in the limit, produces natural coordinates for arbitrary source vectors. The new concept of removable noise (a noise model applicable to a wide variety of real-world noise processes) is then discussed. Replicator neural networks, when configured to approach minimum mean squared reconstruction error (e.g., via training and architecture adjustment on randomly chosen examples from a vector source, each with randomly chosen additive removable noise contamination), in the limit eliminate removable noise and produce natural coordinates for the data vector portions of the noise-corrupted source vectors. Considerations regarding the selection of the dimension of a data manifold source model and the training/configuration of replicator neural networks are discussed.
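
    A replicator network in the sense above is, in modern terms, an autoencoder: a bottleneck multilayer perceptron trained to reproduce its input, whose middle-layer activations furnish coordinates on the data manifold. The numpy sketch below trains a tiny 2-8-1-8-2 replicator (three hidden layers, one-dimensional bottleneck) on points from a one-dimensional manifold in the plane; the layer sizes, activations and training schedule are illustrative assumptions, not Hecht-Nielsen's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(sizes):
    """Weight matrices and biases for an MLP with the given layer sizes."""
    return [(rng.normal(0.0, 0.5, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    """Return all layer activations; tanh hidden layers, linear output."""
    acts = [x]
    for i, (W, b) in enumerate(params):
        z = acts[-1] @ W + b
        acts.append(z if i == len(params) - 1 else np.tanh(z))
    return acts

def mse(params, x):
    """Mean squared reconstruction error of the replicator on x."""
    return float(np.mean((forward(params, x)[-1] - x) ** 2))

def train(params, x, lr=0.05, steps=3000):
    """Plain full-batch gradient descent on reconstruction error."""
    for _ in range(steps):
        acts = forward(params, x)
        grad = 2.0 * (acts[-1] - x) / x.shape[0]   # dL/d(output)
        for i in reversed(range(len(params))):
            W, b = params[i]
            gW, gb = acts[i].T @ grad, grad.sum(axis=0)
            grad = grad @ W.T
            if i > 0:                # back through tanh of previous layer
                grad *= 1.0 - acts[i] ** 2
            params[i] = (W - lr * gW, b - lr * gb)
    return params

# Points on a 1-D manifold (a parabolic arc) embedded in the plane.
t = np.linspace(-1.0, 1.0, 64)
data = np.column_stack([t, 0.5 * t ** 2])
net = init_params([2, 8, 1, 8, 2])   # scalar bottleneck = natural coordinate
```

    As training drives the reconstruction error down, the scalar bottleneck activation is forced to parameterize the arc, which is the "natural coordinate" behaviour the abstract describes in the minimum-error limit.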

  19. Neutron spallation source and the Dubna cascade code

    CERN Document Server

    Kumar, V; Goel, U; Barashenkov, V S

    2003-01-01

    Neutron multiplicity per incident proton, n/p, in collisions of a high energy proton beam with voluminous Pb and W targets has been estimated with the Dubna cascade code and compared with the available experimental data for the purpose of benchmarking the code. Contributions of various atomic and nuclear processes to heat production and the isotopic yield of secondary nuclei are also estimated to assess the heat and radioactivity conditions of the targets. Results obtained from the code show excellent agreement with the experimental data at beam energies E < 1.2 GeV and differ by up to 25% at higher energies. (author)

  20. Beyond the Business Model: Incentives for Organizations to Publish Software Source Code

    Science.gov (United States)

    Lindman, Juho; Juutilainen, Juha-Pekka; Rossi, Matti

    The software stack opened under Open Source Software (OSS) licenses is growing rapidly. Commercial actors have released considerable amounts of previously proprietary source code. These actions beg the question: why do companies choose a strategy based on giving away software assets? Research on the outbound OSS approach has tried to answer this question with the concept of the “OSS business model”. When studying the reasons for code release, we have observed that the business model concept is too generic to capture the many incentives organizations have. Conversely, in this paper we investigate empirically what the companies’ incentives are by means of an exploratory case study of three organizations in different stages of their code release. Our results indicate that the companies aim to promote standardization, obtain development resources, gain cost savings, improve the quality of software, increase the trustworthiness of software, or steer OSS communities. We conclude that future research on outbound OSS could benefit from focusing on the heterogeneous incentives for code release rather than on revenue models.

  1. Lysimeter data as input to performance assessment source term codes

    International Nuclear Information System (INIS)

    McConnell, J.W. Jr.; Rogers, R.D.; Sullivan, T.

    1992-01-01

    The Field Lysimeter Investigation: Low-Level Waste Data Base Development Program is obtaining information on the performance of radioactive waste in a disposal environment. Waste forms fabricated using ion-exchange resins from EPICOR-II prefilters employed in the cleanup of the Three Mile Island (TMI) Nuclear Power Station are being tested to develop a low-level waste data base and to obtain information on the survivability of waste forms in a disposal environment. In this paper, radionuclide releases from waste forms in the first seven years of sampling are presented and discussed. The application of lysimeter data in performance assessment source term models is presented. Initial results from the use of the data in two models are discussed

  2. CACTI: free, open-source software for the sequential coding of behavioral interactions.

    Science.gov (United States)

    Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.

  3. Stars with shell energy sources. Part 1. Special evolutionary code

    International Nuclear Information System (INIS)

    Rozyczka, M.

    1977-01-01

    A new version of the Henyey-type stellar evolution code is described and tested. It is shown, as a by-product of the tests, that the thermal time scale of the core of a red giant approaching the helium flash is of the order of the evolutionary time scale. The code itself appears to be a very efficient tool for investigations of the helium flash, carbon flash and the evolution of a white dwarf accreting mass. (author)

  4. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    Directory of Open Access Journals (Sweden)

    CARVALHO, J. S. C.

    2008-12-01

    Full Text Available During the development of software, one of the most visible risks and perhaps the biggest implementation obstacle relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which will give rise to a development tool for the automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.
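
    As a toy illustration of metamodel-driven generation (the metamodel shape and the emitted template below are invented for this sketch, not the paper's proposal), one can render a class skeleton directly from a small model description:

```python
def generate_class(meta):
    """Emit Python source for a class described by a tiny metamodel:
    {"name": ..., "attributes": [...]}.  A toy stand-in for the kind of
    metamodel-driven source-code generation the paper describes."""
    name, attrs = meta["name"], meta["attributes"]
    params = "".join(f", {a}" for a in attrs)
    lines = [f"class {name}:", f"    def __init__(self{params}):"]
    # Assign each attribute, or emit a bare body when there are none.
    lines += [f"        self.{a} = {a}" for a in attrs] or ["        pass"]
    return "\n".join(lines) + "\n"
```

    Generating such boilerplate from the model is exactly the kind of repetitive coding whose elimination the paper argues shortens delivery time.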

  5. The European source term code ESTER - basic ideas and tools for coupling of ATHLET and ESTER

    International Nuclear Information System (INIS)

    Schmidt, F.; Schuch, A.; Hinkelmann, M.

    1993-04-01

    The French software house CISI and the IKE of the University of Stuttgart developed, during 1990 and 1991 and in the frame of the Shared Cost Action Reactor Safety, the informatic structure of the European Source TERm Evaluation System (ESTER). Through this work, tools became available which allow both code development and code application in the area of severe core accident research to be unified on a European basis. The behaviour of reactor cores is determined by thermal-hydraulic conditions. For the development of ESTER it was therefore important to investigate how to integrate thermal-hydraulic code systems with ESTER applications. This report describes the basic ideas of ESTER and improvements of the ESTER tools in view of a possible coupling of the thermal-hydraulic code system ATHLET and ESTER. As a result of the work performed during this project, the ESTER tools became the most modern informatic tools presently available in the area of severe accident research. A sample application is given which demonstrates the use of the new tools. (orig.) [de

  6. Development of a definitive internal dosimetry code

    International Nuclear Information System (INIS)

    Miller, G.; Inkret, W.C.; Schillaci, M.E.

    1996-01-01

    Internal dosimetry may be divided into two main problems: (1) the forward (scientific) problem of determining biokinetic models that describe how radionuclides are taken into the body, distributed in body tissues, and excreted, and (2) the inverse (mathematical) problem: given the measured amounts in excreta and assuming a biokinetic model, to determine the times and amounts of intakes into the body. The inverse problem of internal dosimetry is, in fact, a generic problem studied in other fields (e.g., image reconstruction, spectral deconvolution, and model parameter fitting). We have developed a code for plutonium internal dosimetry using the maximum entropy method, a method for solving underdetermined inverse problems with a positivity constraint. Within the framework of Bayesian statistics, we believe the definitive approach is to examine the Bayesian posterior probability of an intake scenario {X_i}, where X_i denotes the intake amount that occurs on the ith day. For plutonium, for a worker with a long employment history, this is a very high dimensional probability space, since there may be on the order of 10,000 days on which intakes may have occurred. Within this high dimensional space, we calculate the mean intake scenario <X_i>, where <...> denotes the expectation value over the posterior probability distribution. Similarly, we calculate uncertainties and other relevant quantities, such as chi-squared, as expectation values over the posterior distribution. Thanks to a recent breakthrough in describing the mathematical structure of the intake process (a Poisson sum representation of intakes), we have developed the initial version of a Bayesian expectation-value algorithm for internal dosimetry reconstructions
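
    The expectation-value idea, reporting the posterior mean <X> rather than a single best fit, can be illustrated in miniature for one intake. The measurement model, grid prior, lognormal error, and all parameter names below are assumptions of this sketch, far simpler than the code's Poisson-sum treatment of a 10,000-day intake history.

```python
import numpy as np

def posterior_mean_intake(measurement, retention, sigma, intake_grid):
    """Posterior expectation <X> of one intake amount X, given a single
    bioassay measurement m ~ X * retention * lognormal(0, sigma), with a
    flat prior over intake_grid (all grid values must be > 0)."""
    x = np.asarray(intake_grid, dtype=float)
    predicted = x * retention               # expected bioassay for each X
    log_err = np.log(measurement) - np.log(predicted)
    likelihood = np.exp(-0.5 * (log_err / sigma) ** 2) / predicted
    posterior = likelihood / likelihood.sum()   # normalize over the grid
    return float((x * posterior).sum())         # <X> over the posterior
```

    For a measurement of 1.0 with a retention fraction of 0.1, the posterior mass centers near an intake of 10, and the reported value is the mean of that whole distribution, not its peak.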

  7. A Source Term Calculation for the APR1400 NSSS Auxiliary System Components Using the Modified SHIELD Code

    International Nuclear Information System (INIS)

    Park, Hong Sik; Kim, Min; Park, Seong Chan; Seo, Jong Tae; Kim, Eun Kee

    2005-01-01

    The SHIELD code has been used to calculate the source terms of the NSSS Auxiliary System (comprising the CVCS, SIS, and SCS) components of the OPR1000. Because the code had been developed based upon the SYSTEM80 design, and the APR1400 NSSS Auxiliary System design is considerably changed from that of SYSTEM80 or OPR1000, the SHIELD code cannot be used directly for APR1400 radiation design; hand calculations are needed for the changed portions of the design, using the results of the SHIELD code calculation. In this study, the SHIELD code is modified to incorporate the APR1400 design changes, and the source term calculation is performed for the APR1400 NSSS Auxiliary System components.

  8. The development of fluid codes for the laser compression of plasma

    International Nuclear Information System (INIS)

    Nicholas, D.J.

    1982-08-01

    Notes are given on the construction and use of simulation codes in plasma physics, requiring only a limited background knowledge in numerical analysis and finite-difference techniques. The development of a 1-D Eulerian code to source form is followed as an example. (U.K.)

  9. Development of codes for physical calculations of WWER

    International Nuclear Information System (INIS)

    Novikov, A.N.

    2000-01-01

    A package of codes for physical calculations of WWER reactors, used at the RRC 'Kurchatov Institute', is discussed, including the purpose of these codes, the approximations used, the degree of data verification, possibilities for automation of calculations and presentation of results, and trends in the further development of the codes. (Authors)

  10. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    Science.gov (United States)

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…
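
    The renaming trick mentioned above is exactly what token-level normalization defeats: replacing every identifier with the same placeholder makes two renamed copies identical again. The sketch below does this for Python source; the overlap score is a crude position-by-position comparison, a stand-in for the winnowing and greedy-string-tiling algorithms that tools like Moss and JPlag actually use.

```python
import io
import keyword
import tokenize

def normalized_tokens(source):
    """Token stream with identifiers collapsed to 'ID' and comments and
    layout tokens dropped, so renaming variables changes nothing."""
    out = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        if tok.type == tokenize.NAME and not keyword.iskeyword(tok.string):
            out.append("ID")
        elif tok.type in (tokenize.COMMENT, tokenize.NL,
                          tokenize.INDENT, tokenize.DEDENT):
            continue
        else:
            out.append(tok.string)
    return out

def similarity(a, b):
    """Fraction of positions at which the normalized streams agree."""
    ta, tb = normalized_tokens(a), normalized_tokens(b)
    matches = sum(x == y for x, y in zip(ta, tb))
    return matches / max(len(ta), len(tb))
```

    Two submissions that differ only in variable names score 1.0 under this normalization, while a genuine change to the logic lowers the score.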

  11. OSSMETER D3.2 – Report on Source Code Activity Metrics

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and initial prototypes of the tools that are needed for source code activity analysis. It builds upon the Deliverable 3.1 where infra-structure and a domain analysis have been

  12. SWAAM code development, verification and application to steam generator design

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes developed by Argonne National Laboratory to analyze the effects of sodium/water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and to predict the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The theoretical foundations and numerical treatments on which the codes are based are discussed, followed by a description of code capabilities and limitations, verification of the codes by comparison with experiment, and applications to steam generator and IHTS design. (author). 25 refs, 14 figs

  13. Developing HYDMN code to include the transient of MNSR

    International Nuclear Information System (INIS)

    Al-Barhoum, M.

    2000-11-01

    A description of the programs added to the HYDMN code (a code for the thermal-hydraulic steady state of the MNSR) to include the transient behaviour of the same MNSR is presented. The code asks for the initial conditions: the power (in kW) and the cold initial core inlet temperature (in degrees centigrade). A time-dependent study of the coolant inlet and outlet temperatures, the coolant speed, and the pool and tank temperatures is carried out for the MNSR in general and for the Syrian MNSR in particular. The study solves the differential equations taken from reference (1) using numerical methods found in reference (3). In this way the code becomes independent of any external information source. (Author)

  14. Code Development and Analysis Program: developmental checkout of the BEACON/MOD2A code

    International Nuclear Information System (INIS)

    Ramsthaler, J.A.; Lime, J.F.; Sahota, M.S.

    1978-12-01

    A best-estimate transient containment code, BEACON, is being developed by EG and G Idaho, Inc. for the Nuclear Regulatory Commission's reactor safety research program. This is an advanced, two-dimensional fluid flow code designed to predict temperatures and pressures in a dry PWR containment during a hypothetical loss-of-coolant accident. The most recent version of the code, MOD2A, is presently in the final stages of production prior to being released to the National Energy Software Center. As part of the final code checkout, seven sample problems were selected to be run with BEACON/MOD2A

  15. Open Genetic Code: on open source in the life sciences

    OpenAIRE

    Deibel, Eric

    2014-01-01

The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach to genetic engineering. The first ...

  16. Theoretical atomic physics code development III TAPS: A display code for atomic physics data

    International Nuclear Information System (INIS)

    Clark, R.E.H.; Abdallah, J. Jr.; Kramer, S.P.

    1988-12-01

    A large amount of theoretical atomic physics data is becoming available through use of the computer codes CATS and ACE developed at Los Alamos National Laboratory. A new code, TAPS, has been written to access this data, perform averages over terms and configurations, and display information in graphical or text form. 7 refs., 13 figs., 1 tab

  17. MIDAS/PK code development using point kinetics model

    International Nuclear Information System (INIS)

    Song, Y. M.; Park, S. H.

    1999-01-01

In this study, the MIDAS/PK code has been developed for analyzing ATWS (Anticipated Transients Without Scram) events, which can be severe accident initiators. MIDAS is an integrated computer code based on the MELCOR code, developed by the Korea Atomic Energy Research Institute to support severe accident risk reduction strategies. Meanwhile, the Chexal-Layman correlation in the current MELCOR, which was developed under BWR conditions, appears to be inappropriate for a PWR. To provide ATWS analysis capability to the MIDAS code, a point kinetics module, PKINETIC, was first developed as a stand-alone code whose reference model was selected from current accident analysis codes. In the next step, the MIDAS/PK code was developed by coupling PKINETIC with the MIDAS code, inter-connecting several thermal-hydraulic parameters between the two codes. Since the major concern in ATWS analysis is the primary peak pressure during the first few minutes of the accident, the peak pressures from the PKINETIC module and from MIDAS/PK were compared with RETRAN calculations, showing good agreement between them. The MIDAS/PK code is considered valuable for deterministically analyzing the plant response during ATWS, especially for the early domestic Westinghouse plants which rely on operator procedures instead of an AMSAC (ATWS Mitigating System Actuation Circuitry) against ATWS. This ATWS analysis capability is also important from the viewpoint of accident management and mitigation
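The core of any point kinetics module like PKINETIC is the integration of the point kinetics equations for neutron population and delayed-neutron precursors. A minimal one-delayed-group sketch (illustrative parameter values; PKINETIC's actual model, group structure and feedbacks are not reproduced here):

```python
# One-delayed-group point kinetics, integrated with explicit Euler.
# beta: delayed fraction, lam: precursor decay constant (1/s),
# Lambda: prompt generation time (s) -- all values are illustrative.

def point_kinetics(rho, t_end, dt=1e-4, beta=0.0065, lam=0.08, Lambda=2e-5):
    """Return relative power n(t_end) for a constant reactivity step rho."""
    n = 1.0
    c = beta / (Lambda * lam)     # precursor concentration at equilibrium
    t = 0.0
    while t < t_end:
        dn = ((rho - beta) / Lambda) * n + lam * c
        dc = (beta / Lambda) * n - lam * c
        n += dn * dt
        c += dc * dt
        t += dt
    return n

# Zero reactivity leaves the power unchanged; a small positive step
# (+0.1 $) gives the classic prompt jump of about beta/(beta - rho).
```

Coupling to a systems code then amounts to exchanging the reactivity feedback (from thermal-hydraulic state) and the computed power each time step.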

  18. Sensitivity analysis and benchmarking of the BLT low-level waste source term code

    International Nuclear Information System (INIS)

    Suen, C.J.; Sullivan, T.M.

    1993-07-01

To evaluate the source term for low-level waste disposal, a comprehensive model has been developed and incorporated into a computer code called BLT (Breach-Leach-Transport). Since the release of the original version, many new features and improvements have been added to the Leach model of the code. This report consists of two different studies based on the new version of the BLT code: (1) a series of verification/sensitivity tests; and (2) benchmarking of the BLT code using field data. Based on the results of the verification/sensitivity tests, the authors concluded that the new version represents a significant improvement and is capable of providing more realistic simulations of the leaching process. Benchmarking work was carried out to provide a reasonable level of confidence in the model predictions. In this study, the experimentally measured release curves for nitrate, technetium-99 and tritium from the saltstone lysimeters operated by Savannah River Laboratory were used. The model results are in general agreement with the experimental data, within acceptable limits of uncertainty

  19. Development of health effect assessment software using MACCS2 code

    International Nuclear Information System (INIS)

    Hwang, Seok-Won; Park, Jong-Woon; Kang, Kyung Min; Jae, Moosung

    2008-01-01

The extended regulatory interest in severe accident management and enhanced safety regulatory requirements raise the need for more accurate analysis of the effects on public health by users with diverse disciplines. This motivated the development of the web-based radiation health effect assessment software RASUM, built on the MACCS2 code and HTML to provide diverse users (regulators, operators, and the public) with easy understanding, modeling, calculating, analyzing, documenting and reporting of radiation health effects under hypothetical severe accidents. The engine of the web-based RASUM uses MACCS2, developed by the NRC, as its base code and is composed of five modules: a development module, a PSA training module, an output module, an input data module (source term, population distribution, meteorological data, etc.), and a MACCS2 run module. For verification and demonstration of RASUM, an offsite consequence analysis using the RASUM frame is performed for early fatality risk, organ dose, and whole-body dose for two selected scenarios. Moreover, CCDF results from RASUM for KSNP and CANDU type reactors are presented and compared. (author)

  20. Source Code Analysis Laboratory (SCALe) for Energy Delivery Systems

    Science.gov (United States)

    2010-12-01

Testing and calibration laboratories that comply with ISO/IEC 17025 demonstrate technical competence for the type of tests and calibrations SCALe undertakes [ISO/IEC 2005]. Conformance testing of a software system indicates whether the SCALe analysis found violations of a CERT secure coding standard. Successful conformance testing does not guarantee that conforming systems are more secure than non-conforming systems, and no study has yet been performed to prove this. SCALe performs conformity assessment in accordance with ISO/IEC 17000: "a demonstration that specified requirements relating to a product, process, system, person or body are fulfilled."

  1. Open Genetic Code : On open source in the life sciences

    NARCIS (Netherlands)

    Deibel, E.

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life

  2. Ion source and injector development

    International Nuclear Information System (INIS)

    Curtis, C.D.

    1976-01-01

This is a survey of low energy accelerators which inject into proton linacs. Laboratories covered include Argonne, Brookhaven, CERN, Chalk River, Fermi, ITEP, KEK, Rutherford, and Saclay. This paper emphasizes complete injector systems, comparing significant hardware features and beam performance data, including recent additions. There is increased activity now in the acceleration of polarized protons, H+ and H-, and of unpolarized H-. New source development and programs for these ion beams are outlined at the end of the report. Heavy-ion sources are not included

  3. Open Genetic Code: on open source in the life sciences.

    Science.gov (United States)

    Deibel, Eric

    2014-01-01

The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach to genetic engineering. The first section discusses the greater flexibility with regard to patenting and its relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life, understood as an attempt to comprehensively remove the restrictions from the usage of DNA in any of its formats.

  4. SOURCE IST 2.0: development and beta testing

    International Nuclear Information System (INIS)

    Barber, D.H.; Iglesias, F.C.; Hoang, Y.; Dickson, L.W.; Dickson, R.S.; Richards, M.J.; Gibb, R.A.

    1999-01-01

SOURCE IST 2.0 is the Industry Standard fission product release code being developed by Ontario Power Generation, New Brunswick Power, Hydro-Quebec, and Atomic Energy of Canada Ltd. This paper reports on recent progress in requirement specification, code development, and module verification and validation activities. The theoretical basis for each model in the code is described in a module Software Theory Manual. The development of SOURCE IST 2.0 has required code design decisions about how to implement the software requirements. Development and module testing of the β1 release of SOURCE IST 2.0 (released in July 1999) have led to some interesting insights into fission product release modelling. The beta testing process has allowed code developers and analysts to refine the software requirements for the code. The need to verify physical reference data has guided some decisions on the code and data structure design. Examples of these design decisions are provided. Module testing and verification and validation activities are discussed. These activities include code-targeted testing, stress testing, code inspection, comparison of code with requirements, and comparison of code results with independent algebraic, numerical, or semi-algebraic calculations. The list of isotopes to be modelled by SOURCE IST 2.0 provides an example of a subset of a reference data set. Isotopes are present on the list for a variety of reasons: personnel or public dose, equipment dose (for environmental qualification), fission rate and actinide modelling, or stable (or long-lived) targets for activation processes. To accommodate controlled changes to the isotope list, the isotope list and associated nuclear data are contained in a reference data file. The questions of multiple computing platforms and of Year 2000 compliance have been addressed by programming rules for the code. By developing and testing modules on most of the different platforms on which the code is intended

  5. Development and Application of a Code for Internal Exposure (CINEX) based on the CINDY code

    International Nuclear Information System (INIS)

    Kravchik, T.; Duchan, N.; Sarah, R.; Gabay, Y.; Kol, R.

    2004-01-01

Internal exposure to radioactive materials at the NRCN is evaluated using the CINDY (Code for Internal Dosimetry) package. The code was developed by the Pacific Northwest Laboratory to assist the interpretation of bioassay data, provide bioassay projections, and evaluate committed and calendar-year doses from intake or bioassay measurement data. It provides capabilities to calculate organ dose and effective dose equivalents using the International Commission on Radiological Protection (ICRP) 30 approach. The CINDY code operates under the DOS operating system, so its operation requires a relatively long procedure with much manual typing that can lead to human error. A new code, CINEX (Code for Internal Exposure), has been developed at the NRCN as an Excel application, leading to a significant reduction in calculation time (on the order of 5-10 times) and in the risk of human error. The code uses a database of tables, constructed with CINDY, containing the bioassay values predicted by the ICRP 30 model after an intake of a unit activity of each isotope. Using the database, the code then calculates the appropriate intake and consequently the committed effective dose and organ dose. Calculations with the CINEX code were compared to similar calculations with the CINDY code. The discrepancies were less than 5%, which is the rounding error of the CINDY code. A table is attached which compares parameters calculated with the CINEX and CINDY codes (for a class Y uranium). CINEX is now used at the NRCN to calculate occupational intakes and doses to workers handling radioactive materials
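The scaling described here is straightforward: divide the measured bioassay value by the tabulated prediction per unit intake, then multiply the inferred intake by a dose coefficient. A minimal sketch with invented placeholder numbers (not CINDY/CINEX data):

```python
# Invented example: predicted urine activity (Bq per Bq of intake) versus
# days after intake, of the kind CINDY's ICRP-30 models tabulate. The
# numbers and the dose coefficient below are placeholders, not real data.
predicted_per_unit_intake = {1: 1.2e-3, 7: 4.0e-4, 30: 9.0e-5}

def committed_dose(measured_bq, days_after_intake, dose_coeff_sv_per_bq):
    """Scale the unit-intake prediction to the measurement, then apply a
    dose coefficient to convert the inferred intake to committed dose (Sv)."""
    m_t = predicted_per_unit_intake[days_after_intake]
    intake = measured_bq / m_t
    return intake * dose_coeff_sv_per_bq

dose = committed_dose(measured_bq=0.8, days_after_intake=7,
                      dose_coeff_sv_per_bq=5.7e-7)
# intake = 0.8 / 4.0e-4 = 2000 Bq; dose = 2000 * 5.7e-7 Sv
```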

  6. Computer-assisted Particle-in-Cell code development

    International Nuclear Information System (INIS)

    Kawata, S.; Boonmee, C.; Teramoto, T.; Drska, L.; Limpouch, J.; Liska, R.; Sinor, M.

    1997-12-01

This report presents a new approach to electromagnetic Particle-in-Cell (PIC) code development by computer: in general, PIC codes have a common structure, consisting of a particle pusher, a field solver, charge and current density collection, and field interpolation. Because of this common structure, the main part of a PIC code can be generated mechanically by a computer. In this report we use the packages FIDE and GENTRAN of the REDUCE computer algebra system for the discretization of the field equations and the particle equation, and for automatic generation of Fortran code. The proposed approach is successfully applied to the development of a 1.5-dimensional PIC code. Using the generated PIC code, the Weibel instability in a plasma is simulated. The obtained growth rate agrees well with the theoretical value. (author)
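The common structure exploited here (charge deposit, field solve, interpolation, particle push) can be shown in a deliberately minimal 1D electrostatic sketch. This is illustrative only: the paper's code is an electromagnetic 1.5D Fortran code generated via REDUCE, and the grid size, particle load and normalization below are arbitrary:

```python
import math

NG, NP, L, DT = 32, 1000, 2 * math.pi, 0.05
dx = L / NG
x = [L * i / NP for i in range(NP)]          # cold, uniform particle load
v = [0.0] * NP

def deposit(xs):
    """Collect charge density on the grid (nearest-grid-point weighting)."""
    rho = [0.0] * NG
    for xp in xs:
        rho[int(xp / dx) % NG] += NG / NP    # normalized so the mean is 1
    return [r - 1.0 for r in rho]            # subtract neutralizing background

def solve_field(rho):
    """Integrate dE/dx = rho on the periodic grid; remove the mean field."""
    E, s = [0.0] * NG, 0.0
    for i in range(NG):
        s += rho[i] * dx
        E[i] = s
    mean = sum(E) / NG
    return [e - mean for e in E]

def push(xs, vs, E):
    """Interpolate E back to the particles (NGP) and advance them."""
    for p in range(NP):
        vs[p] -= E[int(xs[p] / dx) % NG] * DT   # electrons: dv/dt = -E
        xs[p] = (xs[p] + vs[p] * DT) % L

for _ in range(10):      # a quiet uniform plasma should stay (nearly) quiet
    push(x, v, solve_field(deposit(x)))
```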

  7. Integrated code development for studying laser driven plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Takabe, Hideaki; Nagatomo, Hideo; Sunahara, Atsusi; Ohnishi, Naofumi; Naruo, Syuji; Mima, Kunioki [Osaka Univ., Suita (Japan). Inst. of Laser Engineering

    1998-03-01

The present status of and plan for developing an integrated implosion code are briefly explained, focusing on motivation, numerical scheme, and issues to be developed further. The highly nonlinear stage of the Rayleigh-Taylor instability of the ablation front under laser irradiation has been simulated for comparison with model experiments. Improvements in the transport and rezoning/remapping algorithms in the ILESTA code are described. (author)

  8. Challenges on innovations of newly-developed safety analysis codes

    International Nuclear Information System (INIS)

    Yang, Yanhua; Zhang, Hao

    2016-01-01

With the development of safety analysis methods, safety analysis codes face new challenges. Three such challenges are presented in this paper: the mathematical models, the code design, and the user interface. Taking the self-reliant safety analysis code COSINE as an example, ways of meeting these requirements are suggested: developing multi-phase, multi-field and multi-dimensional models, adopting object-oriented code design, and improving the modeling, calculation control and data post-processing in the user interface.

  9. Challenges on innovations of newly-developed safety analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yanhua [Shanghai Jiao Tong Univ. (China). School of Nuclear Science and Engineering; Zhang, Hao [State Nuclear Power Software Development Center, Beijing (China). Beijing Future Science and Technology City

    2016-05-15

With the development of safety analysis methods, safety analysis codes face new challenges. Three such challenges are presented in this paper: the mathematical models, the code design, and the user interface. Taking the self-reliant safety analysis code COSINE as an example, ways of meeting these requirements are suggested: developing multi-phase, multi-field and multi-dimensional models, adopting object-oriented code design, and improving the modeling, calculation control and data post-processing in the user interface.

  10. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    Energy Technology Data Exchange (ETDEWEB)

    Santos-Villalobos, Hector J [ORNL; Gregor, Jens [University of Tennessee, Knoxville (UTK); Bingham, Philip R [ORNL

    2014-01-01

At present, neutron sources cannot be fabricated small and powerful enough to achieve high resolution radiography while maintaining adequate flux. One solution is to employ computational imaging techniques such as a Magnified Coded Source Imaging (CSI) system. A coded mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps at around 50 μm. To overcome this challenge, the coded mask and object are magnified by making the distance from the coded mask to the object much smaller than the distance from the object to the detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of its modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model into our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.
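The least-squares formulation can be illustrated in 1D: the measurement is a mask-encoded projection of the object, weighted here by a nonuniform stand-in "source model". The mask pattern, flux profile and sizes below are invented, and the solver is plain ridge-regularized least squares rather than the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16                                   # object pixels (illustrative size)
mask = rng.integers(0, 2, size=n)        # binary coded-mask pattern
# Each detector pixel sees the object through a shifted copy of the mask,
# scaled by a modeled, nonuniform source flux (a stand-in source model).
flux = 1.0 + 0.2 * np.sin(2 * np.pi * np.arange(n) / n)
A = np.array([[mask[(i - j) % n] * flux[j] for j in range(n)]
              for i in range(n)], dtype=float)

x_true = np.zeros(n)
x_true[5], x_true[9] = 1.0, 0.5          # two point-like features
y = A @ x_true                           # noiseless synthetic measurement

# Regularized least squares: minimize ||A x - y||^2 + lam * ||x||^2
lam = 1e-6
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
```

The point of modeling the flux in the system matrix is that a mismatched (e.g. uniform) A would bias the reconstruction even on noiseless data.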

  11. Development of cold neutron source

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Chang Oong; Cho, M. S.; Park, K. N. and others

    1999-05-01

The purpose of this study is to develop the CNS facility in Hanaro to extend the scope of neutron utilization and to carry out work impossible with thermal neutrons. According to the project schedule, the establishment of the CNS concept and the basic design are performed in phase 1, and the elementary technologies for the basic design will be developed in phase 2. Finally, in phase 3, the design of the CNS will be completed, its fabrication and installation will be finished, and the development plan for spectrometers will be decided, establishing the foothold for basic research. This study aims to produce design data and utilize them in the future basic and detailed design, including the estimation and measurement of the heat load, code development for the design of the in-pile assembly and the heat removal system, measurement of the shape of the CN hole, the performance test of the thermosiphon, and the concept of the general layout of the whole system. (author)

  12. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks.

    Science.gov (United States)

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-07-09

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption.

  13. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks †

    Science.gov (United States)

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-01-01

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption. PMID:27409616
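For background on the kind of per-packet authentication being adapted here (this is a generic hash-chain sketch, not the authors' three schemes): the sender signs only an anchor hash, and each packet carries the hash of the next, so a receiver can verify packets in order with one hash computation each. The payloads and framing below are illustrative:

```python
import hashlib

def build_chain(payloads):
    """Link packets back-to-front: each packet carries the hash of the next."""
    nxt = b""                          # the final packet carries no link
    wire = []
    for payload in reversed(payloads):
        pkt = payload + nxt
        nxt = hashlib.sha256(pkt).digest()
        wire.append(pkt)
    wire.reverse()
    return nxt, wire                   # nxt is the anchor hash (to be signed)

def verify_stream(anchor, wire):
    """Verify packets in arrival order against the trusted anchor hash."""
    expect = anchor
    for pkt in wire:
        if hashlib.sha256(pkt).digest() != expect:
            return False
        # Illustrative framing: the trailing 32 bytes are the next-hash link
        expect = pkt[-32:] if len(pkt) > 32 else b""
    return True

anchor, wire = build_chain([b"img-block-0", b"img-block-1", b"img-block-2"])
```

The weakness the paper targets is visible here: the links are bound to specific packet boundaries, so a hop that re-sizes packets invalidates them.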

  14. Development of a coupling code for PWR reactor cavity radiation streaming calculation

    International Nuclear Information System (INIS)

    Zheng, Z.; Wu, H.; Cao, L.; Zheng, Y.; Zhang, H.; Wang, M.

    2012-01-01

PWR reactor cavity radiation streaming is important for the safety of personnel and equipment, so calculations have to be performed to evaluate the neutron flux distribution around the reactor. For this calculation, deterministic codes have difficulty with fine geometrical modeling and need huge computer resources, while Monte Carlo codes require very long sampling times to obtain results with acceptable precision. Therefore, a coupling method has been developed to eliminate these two problems. In this study, we develop a coupling code named DORT2MCNP to link the Sn code DORT and the Monte Carlo code MCNP. DORT2MCNP is used to produce a combined surface source containing the top, bottom and side surfaces simultaneously. Because the SDEF card is unsuitable for the combined surface source, we modified the SOURCE subroutine of MCNP and recompiled MCNP for this application. Numerical results demonstrate the correctness of the coupling code DORT2MCNP and show reasonable agreement between the coupling method and the other two codes (DORT and MCNP). (authors)
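Conceptually, sampling from a combined surface source means first choosing a surface in proportion to its integrated source strength, then sampling a position and direction on it. A minimal sketch of the surface-selection step only, with invented strengths (not DORT output or the actual SOURCE subroutine logic):

```python
import random

random.seed(0)
# Invented relative source strengths for the three parts of a combined
# (top/bottom/side) surface source.
surfaces = {"top": 0.2, "bottom": 0.1, "side": 0.7}

def sample_surface():
    """Pick a starting surface with probability proportional to its strength."""
    r = random.random() * sum(surfaces.values())
    acc = 0.0
    for name, weight in surfaces.items():
        acc += weight
        if r <= acc:
            return name
    return name          # guard against floating-point round-off

counts = {name: 0 for name in surfaces}
for _ in range(10000):
    counts[sample_surface()] += 1
# 'side' dominates, reflecting its larger share of the combined source
```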

  15. Development of AGNES, a kinetics code for fissile solutions, 1

    International Nuclear Information System (INIS)

    Nakajima, Ken; Ohnishi, Nobuaki

    1986-01-01

A kinetics code for fissile solutions, AGNES (Accidentally Generated Nuclear Excursion Simulation code), has been developed. This code calculates the radiolytic gas void effect as a reactivity feedback. The physical and computational models of the radiolytic gas void are summarized and the usage of AGNES is described. In addition, some benchmark calculations were performed; the results show good agreement with experiments. (author)

  16. Building guide : how to build Xyce from source code.

    Energy Technology Data Exchange (ETDEWEB)

    Keiter, Eric Richard; Russo, Thomas V.; Schiek, Richard Louis; Sholander, Peter E.; Thornquist, Heidi K.; Mei, Ting; Verley, Jason C.

    2013-08-01

While Xyce uses the Autoconf and Automake system to configure builds, it is often necessary to perform more than the customary "./configure" builds many open source users have come to expect. This document describes the steps needed to get Xyce built on a number of common platforms.

  17. Subchannel analysis code development for CANDU fuel channel

    International Nuclear Information System (INIS)

    Park, J. H.; Suk, H. C.; Jun, J. S.; Oh, D. J.; Hwang, D. H.; Yoo, Y. J.

    1998-07-01

Since there are several subchannel codes, such as COBRA and TORC, for PWR fuel channels but none for a CANDU fuel channel in our country, a subchannel analysis code for a CANDU fuel channel was developed for the prediction of flow conditions in the subchannels and for accurate assessment of the thermal margin, the effect of appendages, and the effect of radial/axial power profiles of fuel bundles on flow conditions and CHF. In order to develop the code, subchannel analysis methodology and its applicability to a CANDU fuel channel were reviewed. Several thermalhydraulic and numerical models for subchannel analysis of a CANDU fuel channel were developed. Experimental data for the CANDU fuel channel were collected, analyzed and used for validation of the subchannel analysis code developed in this work. (author). 11 refs., 3 tabs., 50 figs

  18. Low complexity source and channel coding for mm-wave hybrid fiber-wireless links

    DEFF Research Database (Denmark)

    Lebedev, Alexander; Vegas Olmos, Juan José; Pang, Xiaodan

    2014-01-01

    We report on the performance of channel and source coding applied for an experimentally realized hybrid fiber-wireless W-band link. Error control coding performance is presented for a wireless propagation distance of 3 m and 20 km fiber transmission. We report on peak signal-to-noise ratio perfor...

  19. New Source Term Model for the RESRAD-OFFSITE Code Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States); Gnanapragasam, Emmanuel [Argonne National Lab. (ANL), Argonne, IL (United States); Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Shih-Yew [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-06-01

    This report documents the new source term model developed and implemented in Version 3 of the RESRAD-OFFSITE code. This new source term model includes: (1) "first order release with transport" option, in which the release of the radionuclide is proportional to the inventory in the primary contamination and the user-specified leach rate is the proportionality constant, (2) "equilibrium desorption release" option, in which the user specifies the distribution coefficient which quantifies the partitioning of the radionuclide between the solid and aqueous phases, and (3) "uniform release" option, in which the radionuclides are released from a constant fraction of the initially contaminated material during each time interval and the user specifies the duration over which the radionuclides are released.
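The three release options can be sketched as simple functions of time. The following is a minimal illustration with invented parameter values; RESRAD-OFFSITE's actual implementation, units and defaults are not reproduced here:

```python
import math

def first_order_release(inventory0, leach_rate, t):
    """Option 1: release rate proportional to the remaining inventory,
    with the user-specified leach rate as the proportionality constant."""
    return leach_rate * inventory0 * math.exp(-leach_rate * t)

def uniform_release(inventory0, duration, t):
    """Option 3: a constant fraction of the initial inventory is released
    per unit time over the user-specified duration."""
    return inventory0 / duration if 0.0 <= t <= duration else 0.0

def desorbed_fraction(kd, bulk_density, porosity):
    """Option 2 ingredient: fraction of activity partitioned into pore water
    for a given distribution coefficient Kd (solid/water partitioning)."""
    return 1.0 / (1.0 + kd * bulk_density / porosity)
```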

  20. Coded aperture detector for high precision gamma-ray burst source locations

    International Nuclear Information System (INIS)

    Helmken, H.; Gorenstein, P.

    1977-01-01

Coded aperture collimators in conjunction with position-sensitive detectors are very useful in the study of transient phenomena because they combine a broad field of view, high sensitivity, and the ability to locate sources precisely. Since the preceding conference, a series of computer simulations of various detector designs has been carried out with the aid of a CDC 6400. Particular emphasis was placed on the development of a unit consisting of a one-dimensional random or periodic collimator in conjunction with a two-dimensional position-sensitive xenon proportional counter. A configuration involving four of these units has been incorporated into the preliminary design study of the Transient Explorer (ATREX) satellite and is applicable to any SAS or HEAO type satellite mission. Results of this study, including detector response, fields of view, and source location precision, will be presented
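The reason a random collimator locates sources precisely can be shown in one dimension: the detector records the mask pattern shifted by the source direction, and correlating against the known pattern recovers that shift. A toy sketch (mask pattern, sizes and the noiseless shadowgram are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
mask = rng.integers(0, 2, size=n).astype(float)  # open/closed slit pattern

source_pos = 17                      # direction of a single (burst) source
shadow = np.roll(mask, source_pos)   # detector records a shifted mask pattern

# Correlate the shadowgram against every trial shift of the known mask;
# the correlation peaks at the true source position.
corr = np.array([np.dot(shadow, np.roll(mask, s)) for s in range(n)])
located = int(np.argmax(corr))
```

With detector noise the peak broadens, which is why the simulations above quantify source location precision for each design.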

  1. Development of thermal hydraulic evaluation code for CANDU reactors

    International Nuclear Information System (INIS)

    Kim, Man Woong; Yu, Seon Oh; Choi, Yong Seog; Shin, Chull; Hwang, Soo Hyun

    2004-02-01

To enhance the safety of operating CANDU reactors, the establishment of a safety analysis code system for CANDU reactors is in progress. For the development of the thermal-hydraulic analysis code for the CANDU system, studies on the improvement of evaluation models inside the RELAP/CANDU code and the development of a safety assessment methodology for GSIs (Generic Safety Issues) are in progress as part of the establishment of the CANDU safety assessment system. To develop the 3-D thermal-hydraulic analysis code for the moderator system, CFD models for analyzing the CANDU-6 moderator circulation were developed. One model uses a structured grid system with a porous-media approach for the 380 Calandria tubes in the core region. The other uses an unstructured grid system on the real geometry of the 380 Calandria tubes, so that the detailed fluid flow between the Calandria tubes can be observed. For the development of the thermal-hydraulic analysis code for containment, a study on the applicability of the CONTAIN 2.0 code to a CANDU containment was conducted and a simulation of the thermal-hydraulic phenomena during the accident was performed. In addition, a model comparison of the ESFs (Engineered Safety Features) in the CONTAIN 2.0 and PRESCON codes has also been conducted

  2. Development of a nuclear power plant system analysis code

    International Nuclear Information System (INIS)

    Sim, Suk K.; Jeong, J. J.; Ha, K. S.; Moon, S. K.; Park, J. W.; Yang, S. K.; Song, C. H.; Chun, S. Y.; Kim, H. C.; Chung, B. D.; Lee, W. J.; Kwon, T. S.

    1997-07-01

During the period of this study, the TASS 1.0 code was prepared for the non-LOCA licensing and reload safety analyses of the Westinghouse and Korean Standard Nuclear Power Plant (KSNPP) type reactors operating in Korea. TASS-NPA was also developed for real-time simulation of Kori-3/4 transients using on-line graphical interactions. The TASS 2.0 code was further developed for timely application to the design certification of the KNGR. The COBRA/RELAP5 code, a multi-dimensional best-estimate system code, was developed by integrating a realistic three-dimensional reactor vessel model with the RELAP5/MOD3.2 code, a one-dimensional system code. Also, a 3D turbulent two-phase flow analysis code, FEMOTH-TF, was developed using finite element techniques to analyze local thermal-hydraulic phenomena in support of the detailed design analysis for the development of advanced reactors. (author). 84 refs., 27 tabs., 83 figs

  3. Development of Parallel Code for the Alaska Tsunami Forecast Model

    Science.gov (United States)

    Bahng, B.; Knight, W. R.; Whitmore, P.

    2014-12-01

    The Alaska Tsunami Forecast Model (ATFM) is a numerical model used to forecast propagation and inundation of tsunamis generated by earthquakes and other means in both the Pacific and Atlantic Oceans. At the U.S. National Tsunami Warning Center (NTWC), the model is mainly used in a pre-computed fashion. That is, results for hundreds of hypothetical events are computed before alerts, and are accessed and calibrated with observations during tsunamis to immediately produce forecasts. ATFM uses the non-linear, depth-averaged, shallow-water equations of motion with multiply nested grids in two-way communications between domains of each parent-child pair as waves get closer to coastal waters. Even with the pre-computation the task becomes non-trivial as sub-grid resolution gets finer. Currently, the finest resolution Digital Elevation Models (DEM) used by ATFM are 1/3 arc-seconds. With a serial code, large or multiple areas of very high resolution can produce run-times that are unrealistic even in a pre-computed approach. One way to increase the model performance is code parallelization used in conjunction with a multi-processor computing environment. NTWC developers have undertaken an ATFM code-parallelization effort to streamline the creation of the pre-computed database of results with the long term aim of tsunami forecasts from source to high resolution shoreline grids in real time. Parallelization will also permit timely regeneration of the forecast model database with new DEMs; and, will make possible future inclusion of new physics such as the non-hydrostatic treatment of tsunami propagation. The purpose of our presentation is to elaborate on the parallelization approach and to show the compute speed increase on various multi-processor systems.

  4. COSINE software development based on code generation technology

    International Nuclear Information System (INIS)

    Ren Hao; Mo Wentao; Liu Shuo; Zhao Guang

    2013-01-01

    The code generation technology can significantly improve the quality and productivity of software development and reduce software development risk. At present, code generators are usually based on UML model-driven technology, which cannot satisfy the development demands of nuclear power calculation software. In this paper, the features of scientific computing programs were analyzed and a FORTRAN code generator (FCG) based on C# was developed. FCG can automatically generate FORTRAN module-variable definitions according to input metadata. FCG can also generate memory-allocation interfaces for dynamic variables as well as data-access interfaces. FCG was applied to the development of the core and system integrated engine for design and analysis (COSINE) software. The results show that FCG can greatly improve the development efficiency of nuclear power calculation software and reduce the defect rate of software development. (authors)
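    The metadata-to-FORTRAN idea can be sketched in a few lines (a hypothetical illustration of the approach, not the actual C# FCG: the metadata format and type tags here are invented for the example). Each entry gives a variable name, a type tag, and an array rank; scalars get a plain declaration and arrays get the `allocatable` attribute:

```python
FTYPES = {"int": "integer", "real": "real(8)", "char": "character(len=80)"}

def gen_declarations(metadata):
    """Generate FORTRAN module-variable declarations from metadata
    tuples of (name, type tag, rank); rank 0 means a scalar."""
    lines = []
    for name, ftype, rank in metadata:
        base = FTYPES[ftype]
        if rank == 0:
            lines.append(f"{base} :: {name}")
        else:
            dims = ",".join(":" for _ in range(rank))
            lines.append(f"{base}, allocatable :: {name}({dims})")
    return "\n".join(lines)

meta = [("n_rod", "int", 0), ("t_cool", "real", 2)]
print(gen_declarations(meta))
# integer :: n_rod
# real(8), allocatable :: t_cool(:,:)
```

    A real generator would also emit the matching allocation and accessor routines from the same metadata, which is what keeps the hand-written solver code free of bookkeeping.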

  5. Recent developments in the CONTAIN-LMR code

    International Nuclear Information System (INIS)

    Murata, K.K.

    1990-01-01

    Through an international collaborative effort, a special version of the CONTAIN code is being developed for integrated mechanistic analysis of the conditions in liquid metal reactor (LMR) containments during severe accidents. The capabilities of the most recent code version, CONTAIN LMR/1B-Mod.1, are discussed. These include new models for the treatment of two condensables, sodium condensation on aerosols, chemical reactions, hygroscopic aerosols, and concrete outgassing. This code version also incorporates all of the previously released LMR model enhancements. The results of an integral demonstration calculation of a severe core-melt accident scenario are given to illustrate the features of this code version. 11 refs., 7 figs., 1 tab

  6. Distributed Remote Vector Gaussian Source Coding for Wireless Acoustic Sensor Networks

    DEFF Research Database (Denmark)

    Zahedi, Adel; Østergaard, Jan; Jensen, Søren Holdt

    2014-01-01

    In this paper, we consider the problem of remote vector Gaussian source coding for a wireless acoustic sensor network. Each node receives messages from multiple nodes in the network and decodes these messages using its own measurement of the sound field as side information. The node’s measurement...... and the estimates of the source resulting from decoding the received messages are then jointly encoded and transmitted to a neighboring node in the network. We show that for this distributed source coding scenario, one can encode a so-called conditional sufficient statistic of the sources instead of jointly...

  7. Recent developments in KTF. Code optimization and improved numerics

    International Nuclear Information System (INIS)

    Jimenez, Javier; Avramova, Maria; Sanchez, Victor Hugo; Ivanov, Kostadin

    2012-01-01

    The rapid increase of computer power in the last decade has facilitated the development of high-fidelity simulations in nuclear engineering, allowing a more realistic and accurate optimization as well as safety assessment of reactor cores and power plants compared to the legacy codes. Thermal-hydraulic subchannel codes together with time-dependent neutron transport codes are the options of choice for an accurate prediction of local safety parameters. Moreover, fast-running codes with the best physical models are needed for high-fidelity coupled thermal-hydraulic / neutron-kinetic solutions. Hence, at KIT, different subchannel codes such as SUBCHANFLOW and KTF are being improved, validated and coupled with different neutron kinetics solutions. KTF is a subchannel code developed for best-estimate analysis of both Pressurized Water Reactors (PWR) and Boiling Water Reactors (BWR). It is based on the Pennsylvania State University (PSU) version of COBRA-TF (Coolant Boiling in Rod Arrays Two Fluids), named CTF. In this paper, the investigations devoted to the enhancement of the code's numerics and informatics structure are presented and discussed. The gain in code speed-up is demonstrated with examples, and finally an outlook of further activities concentrating on code improvements is given. (orig.)

  8. Recent developments in KTF. Code optimization and improved numerics

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Javier; Avramova, Maria; Sanchez, Victor Hugo; Ivanov, Kostadin [Karlsruhe Institute of Technology (KIT) (Germany). Inst. for Neutron Physics and Reactor Technology (INR)

    2012-11-01

    The rapid increase of computer power in the last decade has facilitated the development of high-fidelity simulations in nuclear engineering, allowing a more realistic and accurate optimization as well as safety assessment of reactor cores and power plants compared to the legacy codes. Thermal-hydraulic subchannel codes together with time-dependent neutron transport codes are the options of choice for an accurate prediction of local safety parameters. Moreover, fast-running codes with the best physical models are needed for high-fidelity coupled thermal-hydraulic / neutron-kinetic solutions. Hence, at KIT, different subchannel codes such as SUBCHANFLOW and KTF are being improved, validated and coupled with different neutron kinetics solutions. KTF is a subchannel code developed for best-estimate analysis of both Pressurized Water Reactors (PWR) and Boiling Water Reactors (BWR). It is based on the Pennsylvania State University (PSU) version of COBRA-TF (Coolant Boiling in Rod Arrays Two Fluids), named CTF. In this paper, the investigations devoted to the enhancement of the code's numerics and informatics structure are presented and discussed. The gain in code speed-up is demonstrated with examples, and finally an outlook of further activities concentrating on code improvements is given. (orig.)

  9. Developments of fuel performance analysis codes in KEPCO NF

    International Nuclear Information System (INIS)

    Han, H. T.; Choi, J. M.; Jung, C. D.; Yoo, J. S.

    2012-01-01

    KEPCO NF has developed a fuel performance analysis and design code named ROPER, and the utility codes XGCOL and XDNB, in order to perform fuel rod design evaluations for Korean nuclear power plants. The ROPER code is intended to cover the full range of fuel performance evaluation. The XGCOL code is for clad flattening evaluation and the XDNB code is for extensive DNB propagation evaluation. In addition to these, KEPCO NF is now developing a 3-dimensional fuel performance analysis code for the next generation, named OPER3D, using a 3-dimensional FEM within a joint project with CANDU Energy, in order to analyze PCMI behavior and fuel performance under load-following operation. Of these, the ROPER code is now under licensing review by the Korean regulatory body, and the other two are almost in the final stage of development; once development is finished, licensing activities are to be performed. These activities are intended to secure competitiveness, originality, and vendor-free ownership of fuel performance codes at KEPCO NF

  10. Twelve gordian knots when developing an organizational code of ethics

    NARCIS (Netherlands)

    Kaptein, Muel; Wempe, Johan

    1998-01-01

    Following the example of the many organizations in the United States which have a code of ethics, an increasing interest on the part of companies, trade organizations, (semi-)governmental organizations and professions in the Netherlands to develop codes of ethics can be witnessed. We have been able

  11. Graphical user interface development for the MARS code

    International Nuclear Information System (INIS)

    Jeong, J.-J.; Hwang, M.; Lee, Y.J.; Kim, K.D.; Chung, B.D.

    2003-01-01

    KAERI has developed the best-estimate thermal-hydraulic system code MARS using the RELAP5/MOD3 and COBRA-TF codes. To exploit the excellent features of the two codes, they were consolidated into one. Then, to improve the readability, maintainability, and portability of the consolidated code, all the subroutines were completely restructured by employing a modular data structure. At present, a major part of the MARS code development program is underway to improve the existing capabilities. Code couplings with three-dimensional neutron kinetics, containment analysis, and transient critical heat flux calculations have also been carried out. At the same time, graphical user interface (GUI) tools have been developed for user friendliness. This paper presents the main features of the MARS GUI. The primary objective of the GUI development was to provide a valuable aid for all levels of MARS users in output interpretation and interactive control. In particular, an interactive control function was designed to allow operator actions during simulation so that users can utilize the MARS code like conventional nuclear plant analyzers (NPAs). (author)

  12. The European source-term evaluation code ASTEC: status and applications, including CANDU plant applications

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Giordano, P.; Kissane, M.P.; Montanelli, T.; Schwinges, B.; Ganju, S.; Dickson, L.

    2004-01-01

    Research on light-water reactor severe accidents (SA) is still required in a limited number of areas in order to confirm accident-management plans. Thus, 49 European organizations have linked their SA research in a durable way through SARNET (Severe Accident Research and management NETwork), part of the European 6th Framework Programme. One goal of SARNET is to consolidate the integral code ASTEC (Accident Source Term Evaluation Code, developed by IRSN and GRS) as the European reference tool for safety studies; SARNET efforts include extending the application scope to reactor types other than PWR (including VVER) such as BWR and CANDU. ASTEC is used in IRSN's Probabilistic Safety Analysis level 2 of 900 MWe French PWRs. An earlier version of ASTEC's SOPHAEROS module, including improvements by AECL, is being validated as the Canadian Industry Standard Toolset code for FP-transport analysis in the CANDU Heat Transport System. Work with ASTEC has also been performed by Bhabha Atomic Research Centre, Mumbai, on IPHWR containment thermal hydraulics. (author)

  13. Use of source term code package in the ELEBRA MX-850 system

    International Nuclear Information System (INIS)

    Guimaraes, A.C.F.; Goes, A.G.A.

    1988-12-01

    The implementation of the source term code package on the ELEBRA MX-850 system is presented. The source term is formed when radioactive materials generated in the nuclear fuel leak toward the containment and the environment external to the reactor containment. The version implemented on the ELEBRA system is composed of five codes: MARCH 3, TRAPMELT 3, THCCA, VANESA and NAUA. The original example case was used; it consists of a small-break LOCA in a PWR-type reactor. A sensitivity study for the TRAPMELT 3 code was carried out, modifying the 'TIME STEP' to estimate the CPU processing time for executing the original example case. (M.C.K.) [pt]

  14. Eu-NORSEWInD - Assessment of Viability of Open Source CFD Code for the Wind Industry

    DEFF Research Database (Denmark)

    Stickland, Matt; Scanlon, Tom; Fabre, Sylvie

    2009-01-01

    Part of the overall NORSEWInD project is the use of LiDAR remote sensing (RS) systems mounted on offshore platforms to measure wind velocity profiles at a number of locations offshore. The data acquired from the offshore RS measurements will be fed into a large and novel wind speed dataset suitab...... between the results of simulations created by the commercial code FLUENT and the open source code OpenFOAM. An assessment of the ease with which the open source code can be used is also included....

  15. An Efficient SF-ISF Approach for the Slepian-Wolf Source Coding Problem

    Directory of Open Access Journals (Sweden)

    Tu Zhenyu

    2005-01-01

    Full Text Available A simple but powerful scheme exploiting the binning concept for asymmetric lossless distributed source coding is proposed. The novelty in the proposed scheme is the introduction of a syndrome former (SF) in the source encoder and an inverse syndrome former (ISF) in the source decoder to efficiently exploit an existing linear channel code without the need to modify the code structure or the decoding strategy. For most channel codes, the construction of SF-ISF pairs is a light task. For parallel and serial concatenated codes, and particularly parallel and serial turbo codes where this appears less obvious, an efficient way of constructing linear-complexity SF-ISF pairs is demonstrated. It is shown that the proposed SF-ISF approach is simple, provenly optimal, and generally applicable to any linear channel code. Simulation using conventional and asymmetric turbo codes demonstrates a compression rate that is only 0.06 bit/symbol from the theoretical limit, which is among the best results reported so far.
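    The SF-ISF binning idea can be made concrete with a toy stand-in for the turbo codes used in the paper: the (7,4) Hamming code. The encoder transmits only the 3-bit syndrome of its 7-bit source word (the bin index), and the decoder recovers the word from the syndrome plus side information differing in at most one position, compressing 7 bits into 3:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column j is the
# binary representation of j (least significant bit in the top row).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def syndrome(x):
    # Syndrome former: 7 source bits -> 3 transmitted bits (the bin index).
    return H @ x % 2

def sw_decode(s, y):
    # Side information y differs from the source x in at most one bit.
    # The error pattern e = x XOR y satisfies H e = s XOR H y, which for
    # the Hamming code encodes the position of the flipped bit.
    se = (s + syndrome(y)) % 2
    pos = int(se[0] + 2 * se[1] + 4 * se[2])
    x = y.copy()
    if pos:
        x[pos - 1] ^= 1
    return x

x = np.array([1, 0, 1, 1, 0, 0, 1])
y = x.copy(); y[4] ^= 1              # correlated side information
assert np.array_equal(sw_decode(syndrome(x), y), x)
```

    The channel code is used unmodified, exactly as in the SF-ISF construction; only the syndrome computation is added at the encoder.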

  16. Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.

    Science.gov (United States)

    Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile

    2016-01-01

    This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code, the evaluation starts by extracting the words that make up its text and continues with building full-text search queries from the combinations of these words. The queries are then run against all the ICD-10 codes until the code in question is returned as the match with the highest relative score. This method identifies the minimum number of words that must be provided in order for the search engines to choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute in the user's browser, and two popular open-source relational database management systems.
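    The evaluation procedure, growing word combinations drawn from a code's text until the engine returns that code as the top match, can be sketched against a toy in-memory ranker (a hypothetical stand-in for the engines actually tested; the three-entry ICD-10 sample is illustrative):

```python
from itertools import combinations

def top_match(query_words, corpus):
    # Toy ranker: score each code by how many query words its text
    # contains (a real engine would use TF-IDF or BM25 relevance).
    def score(text):
        words = text.lower().split()
        return sum(w in words for w in query_words)
    return max(corpus, key=lambda c: score(corpus[c]))

def min_words_to_match(code, corpus):
    # Smallest combination of the code's own words for which the
    # engine returns that code as the best match.
    words = corpus[code].lower().split()
    for r in range(1, len(words) + 1):
        for combo in combinations(words, r):
            if top_match(combo, corpus) == code:
                return combo
    return tuple(words)

icd = {
    "J45": "asthma",
    "J45.0": "predominantly allergic asthma",
    "J45.1": "nonallergic asthma",
}
print(min_words_to_match("J45.1", icd))   # → ('nonallergic',)
```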

  17. Development and assessment of ASTEC code for severe accident simulation

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Pignet, S.; Seropian, C.; Montanelli, T.; Giordano, P.; Jacq, F.; Schwinges, B.

    2005-01-01

    Full text of publication follows: The ASTEC integral code, jointly developed by IRSN and GRS for several years for evaluation of the source term during a severe accident (SA) in a Light Water Reactor, will play a central role in the SARNET network of excellence of the 6th Framework Programme (FwP) of the European Commission, which started in spring 2004. It should become the reference European SA integral code in the next years. The version V1.1, released in June 2004, models most of the main physical phenomena (except steam explosion) near or at the state of the art. In order to allow the study of a great number of scenarios, a compromise must be found between precision of results and calculation time: one day of accident time usually takes less than one day of real time to be simulated on a PC. Important efforts are being made on validation, covering more than 30 reference experiments, often International Standard Problems from OECD (CORA, LOFT, PACTEL, BETA, VANAM, ACE-RTF, Phebus.FPT1...). The code is also used for the detailed interpretation of all the integral Phebus.FP experiments. Eighteen European partners performed a first independent evaluation of the code capabilities in 2000-03 within the frame of the EVITA 5th FwP project, on the one hand by comparison to experiments and on the other hand by benchmarking against the MAAP4 and MELCOR integral codes in plant applications to PWR and VVER. Their main conclusions were the need to improve code robustness (especially in the two new modules CESAR and DIVA, which simulate circuit thermal hydraulics and core degradation, respectively) and post-processing tools. Some improvements on these two aspects have already been achieved in the latest version V1.1. A new module, MEDICIS, devoted to Molten Core Concrete Interaction (MCCI), is implemented in this version, with a tight coupling to the containment thermal-hydraulics module CPA.
The paper presents a detailed analysis of a TMLB sequence on a French 900 MWe PWR, from

  18. Development of 'SKYSHINE-CG' code. A line-beam method code equipped with combinatorial geometry routine

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Takahiro; Ochiai, Katsuharu [Plant and System Planning Department, Toshiba Corporation, Yokohama, Kanagawa (Japan); Uematsu, Mikio; Hayashida, Yoshihisa [Department of Nuclear Engineering, Toshiba Engineering Corporation, Yokohama, Kanagawa (Japan)

    2000-03-01

    A boiling water reactor (BWR) plant has a single loop coolant system, in which main steam generated in the reactor core proceeds directly into turbines. Consequently, radioactive {sup 16}N (6.2 MeV photon emitter) contained in the steam contributes to gamma-ray skyshine dose in the vicinity of the BWR plant. The skyshine dose analysis is generally performed with the line-beam method code SKYSHINE, in which calculational geometry consists of a rectangular turbine building and a set of isotropic point sources corresponding to an actual distribution of {sup 16}N sources. For the purpose of upgrading calculational accuracy, the SKYSHINE-CG code has been developed by incorporating the combinatorial geometry (CG) routine into the SKYSHINE code, so that shielding effect of in-building equipment can be properly considered using a three-dimensional model composed of boxes, cylinders, spheres, etc. Skyshine dose rate around a 500 MWe BWR plant was calculated with both SKYSHINE and SKYSHINE-CG codes, and the calculated results were compared with measured data obtained with a NaI(Tl) scintillation detector. The C/E values for SKYSHINE-CG calculation were scattered around 4.0, whereas the ones for SKYSHINE calculation were as large as 6.0. Calculational error was found to be reduced by adopting three-dimensional model based on the combinatorial geometry method. (author)

  19. SCATTER: Source and Transport of Emplaced Radionuclides: Code documentation

    International Nuclear Information System (INIS)

    Longsine, D.E.

    1987-03-01

    SCATTER simulates several processes leading to the release of radionuclides to the site subsystem and then simulates transport of the released radionuclides via the groundwater to the biosphere. The processes accounted for in quantifying release rates to a groundwater migration path include radioactive decay and production, leaching, solubilities, and the mixing of particles with incoming uncontaminated fluid. Several decay chains of arbitrary length can be considered simultaneously. The release rates then serve as source rates to a numerical technique which solves convective-dispersive transport for each decay chain. The decay chains are allowed to have branches, and each member can have a different radioactive factor
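    The radioactive decay and production terms for a chain form the coupled system dN_i/dt = -λ_i N_i + λ_{i-1} N_{i-1}. A minimal sketch (not SCATTER's actual solver) integrates a two-member chain numerically and checks it against the closed-form Bateman solution:

```python
import math

def bateman_two(n1_0, lam1, lam2, t):
    # Closed-form Bateman solution for a parent/daughter pair, N2(0) = 0.
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

def euler_chain(n1_0, lam1, lam2, t, steps=100000):
    # Explicit Euler integration of dN1/dt = -lam1*N1 and
    # dN2/dt = lam1*N1 - lam2*N2 (both right sides use the old values).
    dt = t / steps
    n1, n2 = n1_0, 0.0
    for _ in range(steps):
        n1, n2 = n1 + dt * (-lam1 * n1), n2 + dt * (lam1 * n1 - lam2 * n2)
    return n1, n2

exact = bateman_two(1.0, 0.5, 0.1, 3.0)
approx = euler_chain(1.0, 0.5, 0.1, 3.0)
assert all(abs(a - b) < 1e-3 for a, b in zip(exact, approx))
```

    A production code would use the analytic chain solution or a stiff solver rather than explicit Euler, but the balance of decay and production terms is the same.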

  20. Development of the integrated system reliability analysis code MODULE

    International Nuclear Information System (INIS)

    Han, S.H.; Yoo, K.J.; Kim, T.W.

    1987-01-01

    The major components of a system reliability analysis are the determination of cut sets, importance measures, and uncertainty analysis. Various computer codes have been used for these purposes: for example, SETS and FTAP to determine cut sets; Importance for importance calculations; and Sample, CONINT, and MOCUP for uncertainty analysis. Problems arise when these codes are run in sequence, since their inputs and outputs are not linked, which can result in errors when preparing the input for each code. The MODULE code was developed to carry out the above calculations simultaneously without linking inputs and outputs to other codes. MODULE can also prepare input for SETS in the case of a large fault tree that cannot be handled by MODULE itself. The flow diagram of the MODULE code is shown. To verify the MODULE code, two examples were selected, and the results and computation times were compared with those of SETS, FTAP, CONINT, and MOCUP on both a Cyber 170-875 and an IBM PC/AT. The two examples are fault trees of the auxiliary feedwater system (AFWS) of Korea Nuclear Units (KNU)-1 and -2, which have 54 gates and 115 events, and 39 gates and 92 events, respectively. The MODULE code has the advantage that it can calculate the cut sets, importances, and uncertainties in a single run with little increase in computing time over other codes, and that it can be used on personal computers
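    Cut set determination, the first of the three calculations MODULE combines, can be sketched with a tiny fault-tree expander (an illustrative algorithm, not the one MODULE implements): gates are expanded recursively and non-minimal sets are pruned:

```python
def cut_sets(gate, tree):
    """Minimal cut sets of a coherent fault tree. `tree` maps a gate
    name to ("AND" | "OR", [children]); names absent from `tree`
    are basic events."""
    if gate not in tree:
        return [frozenset([gate])]
    op, children = tree[gate]
    child_sets = [cut_sets(c, tree) for c in children]
    if op == "OR":
        sets = {s for cs in child_sets for s in cs}
    else:  # AND: combine one cut set from each child
        sets = {frozenset()}
        for cs in child_sets:
            sets = {a | b for a in sets for b in cs}
    # Discard any set that strictly contains another cut set.
    return [s for s in sets if not any(t < s for t in sets)]

# TOP fails if pump P fails, or if both valve V and power supply B fail.
tree = {"TOP": ("OR", ["P", "G1"]), "G1": ("AND", ["V", "B"])}
print(sorted(sorted(s) for s in cut_sets("TOP", tree)))
# → [['B', 'V'], ['P']]
```

    This brute-force expansion is exponential in the worst case, which is why large trees are handed off to a dedicated code such as SETS.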

  1. Development of realistic thermal hydraulic system analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, B. D; Kim, K. D. [and others

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphical User Interface) was developed to enhance the user's convenience. To develop a coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial and regulatory organizations and universities. MARS has been widely used for safety research on existing PWRs, advanced PWRs, CANDU and research reactors, for pre-test analysis of thermal-hydraulic experiments, and for other applications.

  2. Development of realistic thermal hydraulic system analysis code

    International Nuclear Information System (INIS)

    Lee, Won Jae; Chung, B. D; Kim, K. D.

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphical User Interface) was developed to enhance the user's convenience. To develop a coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial and regulatory organizations and universities. MARS has been widely used for safety research on existing PWRs, advanced PWRs, CANDU and research reactors, for pre-test analysis of thermal-hydraulic experiments, and for other applications

  3. Foundational development of an advanced nuclear reactor integrated safety code

    International Nuclear Information System (INIS)

    Clarno, Kevin; Lorber, Alfred Abraham; Pryor, Richard J.; Spotz, William F.; Schmidt, Rodney Cannon; Belcourt, Kenneth; Hooper, Russell Warren; Humphries, Larry LaRon

    2010-01-01

    This report describes the activities and results of a Sandia LDRD project whose objective was to develop and demonstrate foundational aspects of a next-generation nuclear reactor safety code that leverages advanced computational technology. The project scope was directed towards the systems-level modeling and simulation of an advanced, sodium cooled fast reactor, but the approach developed has a more general applicability. The major accomplishments of the LDRD are centered around the following two activities. (1) The development and testing of LIME, a Lightweight Integrating Multi-physics Environment for coupling codes that is designed to enable both 'legacy' and 'new' physics codes to be combined and strongly coupled using advanced nonlinear solution methods. (2) The development and initial demonstration of BRISC, a prototype next-generation nuclear reactor integrated safety code. BRISC leverages LIME to tightly couple the physics models in several different codes (written in a variety of languages) into one integrated package for simulating accident scenarios in a liquid sodium cooled 'burner' nuclear reactor. Other activities and accomplishments of the LDRD include (a) further development, application and demonstration of the 'non-linear elimination' strategy to enable physics codes that do not provide residuals to be incorporated into LIME, (b) significant extensions of the RIO CFD code capabilities, (c) complex 3D solid modeling and meshing of major fast reactor components and regions, and (d) an approach for multi-physics coupling across non-conformal mesh interfaces.

  4. Foundational development of an advanced nuclear reactor integrated safety code.

    Energy Technology Data Exchange (ETDEWEB)

    Clarno, Kevin (Oak Ridge National Laboratory, Oak Ridge, TN); Lorber, Alfred Abraham; Pryor, Richard J.; Spotz, William F.; Schmidt, Rodney Cannon; Belcourt, Kenneth (Ktech Corporation, Albuquerque, NM); Hooper, Russell Warren; Humphries, Larry LaRon

    2010-02-01

    This report describes the activities and results of a Sandia LDRD project whose objective was to develop and demonstrate foundational aspects of a next-generation nuclear reactor safety code that leverages advanced computational technology. The project scope was directed towards the systems-level modeling and simulation of an advanced, sodium cooled fast reactor, but the approach developed has a more general applicability. The major accomplishments of the LDRD are centered around the following two activities. (1) The development and testing of LIME, a Lightweight Integrating Multi-physics Environment for coupling codes that is designed to enable both 'legacy' and 'new' physics codes to be combined and strongly coupled using advanced nonlinear solution methods. (2) The development and initial demonstration of BRISC, a prototype next-generation nuclear reactor integrated safety code. BRISC leverages LIME to tightly couple the physics models in several different codes (written in a variety of languages) into one integrated package for simulating accident scenarios in a liquid sodium cooled 'burner' nuclear reactor. Other activities and accomplishments of the LDRD include (a) further development, application and demonstration of the 'non-linear elimination' strategy to enable physics codes that do not provide residuals to be incorporated into LIME, (b) significant extensions of the RIO CFD code capabilities, (c) complex 3D solid modeling and meshing of major fast reactor components and regions, and (d) an approach for multi-physics coupling across non-conformal mesh interfaces.

  5. Developments of HTGR thermofluid dynamic analysis codes and HTGR plant dynamic simulation code

    International Nuclear Information System (INIS)

    Tanaka, Mitsuhiro; Izaki, Makoto; Koike, Hiroyuki; Tokumitsu, Masashi

    1983-01-01

    In nuclear power plants, including high-temperature gas-cooled reactor plants, design is mostly performed on the basis of numerical simulations carried out with analysis codes, once the plant characteristics have been grasped. At Kawasaki Heavy Industries Ltd., building on the system engineering experience accumulated with gas-cooled reactors over several years, the preparation and systematization of analysis codes have been advanced, with the aim of providing a full line-up of codes for heat-transfer flow and control characteristics, mainly for HTGR plants. In this report, a part of the results is described. An example is reported of applying the two-dimensional compressible flow analysis codes SOLA-VOF and SALE-2D, which were developed by Los Alamos National Laboratory in the USA and modified for use at Kawasaki, to an HTGR system. Kawasaki has also developed the control-characteristics analysis code DYSCO, which allows easy changes of system configuration and offers high versatility. The outline, fundamental equations, fundamental algorithms and application examples of SOLA-VOF and SALE-2D, the present status of the system characteristic simulation codes, and the outline of DYSCO are described. (Kako, I.)

  6. Development of the code for filter calculation

    International Nuclear Information System (INIS)

    Gritzay, O.O.; Vakulenko, M.M.

    2012-01-01

    This paper describes a calculation method commonly used in the Neutron Physics Department to develop a new neutron filter or to improve an existing one. This calculation is the first step of the traditional filter development procedure. It allows easy selection of the qualitative and quantitative composition of a composite filter in order to obtain a filtered neutron beam with given parameters
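    The quantitative side of such a selection reduces to computing the transmitted fraction of the beam through a candidate layer stack, T = exp(-Σ_i n_i σ_i d_i). A minimal sketch of that bookkeeping (the density and cross-section numbers below are purely illustrative, not evaluated data):

```python
import math

BARN = 1e-24  # cm^2

def transmission(layers):
    """Transmitted fraction of a beam through a stack of layers, each
    given as (atom density [atoms/cm^3], cross section [barn],
    thickness [cm])."""
    total = sum(n * sigma * BARN * d for n, sigma, d in layers)
    return math.exp(-total)

# Illustrative two-layer candidate filter.
stack = [(8.5e22, 0.5, 10.0),
         (4.5e22, 2.0, 2.0)]
t = transmission(stack)
assert 0.0 < t < 1.0
```

    Selecting the filter composition then amounts to scanning candidate stacks for the one whose energy-dependent transmission matches the desired beam parameters.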

  7. An efficient chaotic source coding scheme with variable-length blocks

    International Nuclear Information System (INIS)

    Lin Qiu-Zhen; Wong Kwok-Wo; Chen Jian-Yong

    2011-01-01

    An efficient chaotic source coding scheme operating on variable-length blocks is proposed. With the source message represented by a trajectory in the state space of a chaotic system, data compression is achieved when the dynamical system is adapted to the probability distribution of the source symbols. For infinite-precision computation, the theoretical compression performance of this chaotic coding approach attains that of optimal entropy coding. In finite-precision implementation, it can be realized by encoding variable-length blocks using a piecewise linear chaotic map within the precision of register length. In the decoding process, the bit shift in the register can track the synchronization of the initial value and the corresponding block. Therefore, all the variable-length blocks are decoded correctly. Simulation results show that the proposed scheme performs well with high efficiency and minor compression loss when compared with traditional entropy coding. (general)
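    The link between a piecewise linear chaotic map and optimal entropy coding can be seen in a toy binary version (a sketch of the principle only, not the paper's variable-length block scheme): encoding narrows an interval according to the symbol probabilities, exactly as in arithmetic coding, and decoding iterates the corresponding piecewise linear map on a point of the final interval, reading off one symbol per branch visited:

```python
def encode(bits, p0):
    # Narrow [lo, hi) for each bit; any point of the final interval
    # represents the whole message.
    lo, hi = 0.0, 1.0
    for b in bits:
        mid = lo + (hi - lo) * p0
        lo, hi = (lo, mid) if b == 0 else (mid, hi)
    return (lo + hi) / 2

def decode(x, p0, n):
    # Iterate the piecewise linear map; the branch visited at each
    # step recovers one source bit.
    bits = []
    for _ in range(n):
        if x < p0:
            bits.append(0)
            x = x / p0
        else:
            bits.append(1)
            x = (x - p0) / (1.0 - p0)
    return bits

msg = [0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0]
assert decode(encode(msg, 0.4), 0.4, len(msg)) == msg
```

    With finite precision the usable block length is limited by the register width, which is why the paper encodes variable-length blocks within the precision of the register.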

  8. Authorship attribution of source code by using back propagation neural network based on particle swarm optimization.

    Science.gov (United States)

    Yang, Xinyu; Xu, Guoai; Li, Qi; Guo, Yanhui; Zhang, Miao

    2017-01-01

    Authorship attribution is the task of identifying the most likely author of a given sample among a set of known candidate authors. It can be applied not only to discover the original author of plain text, such as novels, blogs, emails and posts, but also to identify source code programmers. Authorship attribution of source code is required in diverse applications, ranging from malicious code tracking to settling authorship disputes or detecting software plagiarism. This paper proposes a new method to identify the programmer of Java source code samples with higher accuracy. To this end, it first introduces a back propagation (BP) neural network based on particle swarm optimization (PSO) into authorship attribution of source code. It begins by computing a set of defined feature metrics, including lexical and layout metrics and structure and syntax metrics, 19 dimensions in total. These metrics are then input to the neural network for supervised learning, the weights of which are produced by the hybrid PSO-BP algorithm. The effectiveness of the proposed method is evaluated on a collected dataset with 3,022 Java files belonging to 40 authors. Experiment results show that the proposed method achieves 91.060% accuracy, and a comparison with previous work on authorship attribution of source code for the Java language illustrates that the proposed method outperforms the others overall, also with an acceptable overhead.
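    The kind of lexical and layout metrics fed to the network can be illustrated with a small extractor (the four metrics below are hypothetical examples in the spirit of the paper's 19-dimension set, not its exact definitions):

```python
def layout_metrics(source):
    """A few lexical/layout features of a Java source string:
    average line length, comment-line ratio, brace-on-own-line
    ratio, and tab-indentation ratio."""
    lines = source.splitlines() or [""]
    stripped = [ln.strip() for ln in lines]
    comments = sum(ln.startswith("//") for ln in stripped)
    own_brace = sum(ln in ("{", "}") for ln in stripped)
    tabs = sum(ln.startswith("\t") for ln in lines)
    avg_len = sum(len(ln) for ln in lines) / len(lines)
    return [avg_len,
            comments / len(lines),
            own_brace / len(lines),
            tabs / len(lines)]

java = "// demo\npublic class A\n{\n\tint x;\n}\n"
vec = layout_metrics(java)
assert len(vec) == 4
```

    Such feature vectors, suitably normalized, would form the input layer of the BP network whose weights are then searched by PSO.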

  9. Development of the next generation reactor analysis code system, MARBLE

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hazama, Taira; Nagaya, Yasunobu; Chiba, Go; Kugo, Teruhiko; Ishikawa, Makoto; Tatsumi, Masahiro; Hirai, Yasushi; Hyoudou, Hideaki; Numata, Kazuyuki; Iwai, Takehiko; Jin, Tomoyuki

    2011-03-01

A next generation reactor analysis code system, MARBLE, has been developed. MARBLE is a successor of the fast reactor neutronics analysis code systems JOINT-FR and SAGEP-FR (the conventional systems), which were developed for the so-called JUPITER standard analysis methods. MARBLE has analysis capability equivalent to the conventional systems because it can utilize their sub-codes without any change. On the other hand, burnup analysis functionality for power reactors is improved over the conventional systems by introducing models for fuel exchange treatment, control rod operation, and so on. In addition, MARBLE has newly developed solvers and new features such as burnup calculation by the Krylov subspace method and nuclear design accuracy evaluation by the extended bias factor method. In the development of MARBLE, object-oriented technology was adopted from the viewpoint of software quality: flexibility, extensibility, easier verification through modularization, and support for co-development. A software structure called the two-layer system, consisting of a scripting language and a system development language, was applied. As a result, MARBLE is not an independent analysis code system which simply receives input and returns output, but an assembly of components for building analysis code systems (i.e., a framework). Furthermore, MARBLE provides some pre-built analysis code systems, such as the fast reactor neutronics analysis code system SCHEME, which corresponds to the conventional code, and the fast reactor burnup analysis code system ORPHEUS. (author)

  10. Fast space-varying convolution using matrix source coding with applications to camera stray light reduction.

    Science.gov (United States)

    Wei, Jianing; Bouman, Charles A; Allebach, Jan P

    2014-05-01

    Many imaging applications require the implementation of space-varying convolution for accurate restoration and reconstruction of images. Here, we use the term space-varying convolution to refer to linear operators whose impulse response has slow spatial variation. In addition, these space-varying convolution operators are often dense, so direct implementation of the convolution operator is typically computationally impractical. One such example is the problem of stray light reduction in digital cameras, which requires the implementation of a dense space-varying deconvolution operator. However, other inverse problems, such as iterative tomographic reconstruction, can also depend on the implementation of dense space-varying convolution. While space-invariant convolution can be efficiently implemented with the fast Fourier transform, this approach does not work for space-varying operators. So direct convolution is often the only option for implementing space-varying convolution. In this paper, we develop a general approach to the efficient implementation of space-varying convolution, and demonstrate its use in the application of stray light reduction. Our approach, which we call matrix source coding, is based on lossy source coding of the dense space-varying convolution matrix. Importantly, by coding the transformation matrix, we not only reduce the memory required to store it; we also dramatically reduce the computation required to implement matrix-vector products. Our algorithm is able to reduce computation by approximately factoring the dense space-varying convolution operator into a product of sparse transforms. Experimental results show that our method can dramatically reduce the computation required for stray light reduction while maintaining high accuracy.

  11. Development of Regulatory Audit Core Safety Code : COREDAX

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Chae Yong; Jo, Jong Chull; Roh, Byung Hwan [Korea Institute of Nuclear Safety, Taejon (Korea, Republic of); Lee, Jae Jun; Cho, Nam Zin [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2005-07-01

Korea Institute of Nuclear Safety (KINS) has developed a core neutronics simulator, the COREDAX code, for verifying core safety of the SMART-P reactor, with technical support from Korea Advanced Institute of Science and Technology (KAIST). The COREDAX code would be used for regulatory audit calculations of 3-dimensional core neutronics. It solves the steady-state and time-dependent multi-group neutron diffusion equation in hexagonal geometry as well as rectangular geometry by the analytic function expansion nodal (AFEN) method. The AFEN method was developed at KAIST, and its excellent accuracy has been verified internationally. The COREDAX code was originally programmed based on the AFEN method. Its accuracy was excellent for hexagonal 2-dimensional problems, but improvement was needed for hexagonal-z 3-dimensional problems. Hence, several solution routines of the AFEN method were improved, resulting in the advanced AFEN method, on which the COREDAX code is now based. The initial version of the COREDAX code completes a basic framework, performing eigenvalue calculations and kinetics calculations with thermal-hydraulic feedback, for audit calculations of steady-state core design and reactivity-induced accidents of the SMART-P reactor. This study describes the COREDAX code for hexagonal geometry.

  12. Development of probabilistic fracture mechanics code PASCAL and user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Shibata, Katsuyuki; Onizawa, Kunio [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Li, Yinsheng; Kato, Daisuke [Fuji Research Institute Corporation, Tokyo (Japan)

    2001-03-01

As a part of the aging and structural integrity research for LWR components, a new PFM (Probabilistic Fracture Mechanics) code, PASCAL (PFM Analysis of Structural Components in Aging LWR), has been developed since FY1996. This code evaluates the failure probability of an aged reactor pressure vessel subjected to transient loading such as PTS (Pressurized Thermal Shock). The development of the code has aimed to improve the accuracy and reliability of analysis by introducing new analysis methodologies and algorithms that reflect recent developments in fracture mechanics methodologies and computer performance. The code has new functions such as optimized sampling and a cell-dividing procedure for stratified Monte Carlo simulation, the elastic-plastic fracture criterion of the R6 method, crack extension analysis models for semi-elliptical cracks, and evaluation of the effect of thermal annealing. In addition, an input data generator for temperature and stress distribution time histories was prepared in the code. Functions and performance of the code have been confirmed by verification analyses and case studies on the influence parameters. The present phase of the development will be completed in FY2000. This report provides the user's manual and the theoretical background of the code. (author)
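The stratified Monte Carlo idea at the heart of such a PFM code can be illustrated with a toy limit state; none of the models or distributions below are PASCAL's, they are hypothetical stand-ins:

```python
# Toy PFM sketch (not PASCAL's models): estimate P[K_I > K_Ic] for a
# hypothetical crack by Monte Carlo, stratifying over the crack-depth
# variable so every depth range is sampled evenly.
import math
import random

random.seed(1)

def k_applied(a, stress):
    """Crude stress intensity factor K_I = stress * sqrt(pi * a)."""
    return stress * math.sqrt(math.pi * a)

def sample_failure(a_lo, a_hi):
    a = random.uniform(a_lo, a_hi)        # crack depth [m], uniform overall
    stress = random.gauss(200e6, 30e6)    # applied stress [Pa]
    k_ic = random.gauss(60e6, 6e6)        # fracture toughness [Pa*sqrt(m)]
    return 1.0 if k_applied(a, stress) > k_ic else 0.0

# Equal-width (here equal-probability) crack-depth strata; average the
# per-stratum failure fractions to get the overall failure probability.
strata, n_per = 10, 2000
edges = [0.005 + i * (0.03 - 0.005) / strata for i in range(strata + 1)]
pf = sum(
    sum(sample_failure(edges[i], edges[i + 1]) for _ in range(n_per)) / n_per
    for i in range(strata)
) / strata
print(pf)
```

Stratification guarantees the rare deep-crack region that dominates the failure probability is not under-sampled, which is the motivation for the cell-dividing procedure the abstract mentions.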

  13. Development of the CRIPTE Code for Electromagnetic Coupling

    National Research Council Canada - National Science Library

    Parmantier, Jean-Philippe

    2005-01-01

    .... This code was originally developed as part of an experiment performed under the joint US-France international data exchange program on the atmospheric electricity/aircraft interactions, DEA-AF-79-7336...

  14. Development of code SFINEL (Spent fuel integrity evaluator)

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong Soo; Min, Chin Young; Ohk, Young Kil; Yang, Yong Sik; Kim, Dong Ju; Kim, Nam Ku [Hanyang University, Seoul (Korea)

    1999-01-01

SFINEL, an integrated computer program for predicting spent fuel rod integrity based on burnup history and the major degradation mechanisms, has been developed through this project. This code can simulate the power history of a fuel rod during reactor operation and estimate the degree of deterioration of spent fuel cladding using recently developed models of the degradation mechanisms. SFINEL has been thoroughly benchmarked against the collected in-pile data and operating experience: cladding deformation and rupture, cladding oxidation, rod internal pressure, creep, and then the comprehensive overall degradation process. (author). 75 refs., 51 figs., 5 tabs.

  15. Neutronics of the IFMIF neutron source: development and analysis

    International Nuclear Information System (INIS)

    Wilson, P.P.H.

    1999-01-01

The accurate analysis of this system required the development of a code system and methodology capable of modelling the various physical processes. A generic code system for the neutronics analysis of neutron sources has been created by loosely integrating existing components with new developments: the data processing code NJOY, the Monte Carlo neutron transport code MCNP, and the activation code ALARA were supplemented by a damage data processing program, damChar, and integrated with a number of flexible and extensible modules for the Perl scripting language. Specific advances were required to apply this code system to IFMIF. Based on the ENDF-6 data format requirements of this system, new data evaluations have been implemented for neutron transport and activation. Extensive analysis of the Li(d, xn) reaction has led to a new MCNP source function module, McDeLi, based on physical reaction models and capable of accurate and flexible modelling of the IFMIF neutron source term. In-depth analyses of the neutron flux spectra and spatial distribution throughout the high flux test region permitted a basic validation of the tools and data. The understanding of the features of the neutron flux provided a foundation for the analyses of the other neutron responses. (orig./DGE)

  16. Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey

    Directory of Open Access Journals (Sweden)

    Pierre Siohan

    2005-05-01

Full Text Available Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS) provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD) strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM) capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC) and variable-length source codes (VLC) widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper makes a survey of recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources as well as the main approaches to source-controlled channel decoding. This survey terminates by performance illustrations with real image and video decoding systems.

  17. Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey

    Science.gov (United States)

    Guillemot, Christine; Siohan, Pierre

    2005-12-01

    Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS) provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD) strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM) capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC) and variable-length source codes (VLC) widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper makes a survey of recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources as well as the main approaches to source-controlled channel decoding. This survey terminates by performance illustrations with real image and video decoding systems.
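The random-segmentation problem the survey highlights can be sketched as a Viterbi search over code-tree states driven by soft bit posteriors, so that bitstream segmentation and symbol estimation happen jointly; the codebook and channel values below are toy assumptions:

```python
# Minimal soft-input VLC decoding sketch: a Viterbi search whose state is
# the position inside the current codeword of a prefix-free VLC, so the
# segmentation of the bitstream into symbols is estimated jointly with the
# symbols themselves. Codebook and posteriors are toy choices.
import math

code = {"a": "0", "b": "10", "c": "11"}       # prefix-free VLC
# p1[i] = posterior probability that transmitted bit i is '1'
# (e.g. from channel soft outputs); here a noisy version of "0 10 11 0".
p1 = [0.2, 0.9, 0.3, 0.8, 0.7, 0.1]

def decode(p1):
    inv = {v: k for k, v in code.items()}
    prefixes = {""} | {v[:-1] for v in code.values() if len(v) > 1}
    # best[prefix] = (log-prob, symbols decoded so far) after each bit
    best = {"": (0.0, [])}
    for p in p1:
        nxt = {}
        for pre, (lp, syms) in best.items():
            for bit, pb in (("0", 1 - p), ("1", p)):
                cand = pre + bit
                lp2 = lp + math.log(max(pb, 1e-12))
                if cand in inv:                    # codeword completed
                    key, val = "", (lp2, syms + [inv[cand]])
                elif cand in prefixes:             # still inside a codeword
                    key, val = cand, (lp2, syms)
                else:
                    continue
                if key not in nxt or lp2 > nxt[key][0]:
                    nxt[key] = val                 # Viterbi pruning per state
        best = nxt
    return best[""][1]                             # must end on a boundary

print(decode(p1))   # -> ['a', 'b', 'c', 'a']
```

Full JSCD systems replace this hard Viterbi pass with BCJR-style soft estimators and iterate against a channel decoder (the turbo principle), but the state space over the code tree is the same.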

  18. CVExplorer: identifying candidate developers by mining and exploring their open source contributions.

    CSIR Research Space (South Africa)

    Greene, GJ

    2016-09-01

    Full Text Available Open source code contributions contain a large amount of technical skill information about developers, which can help to identify suitable candidates for a particular development job and therefore impact the success of a development team. We develop...

  19. Fine-Grained Energy Modeling for the Source Code of a Mobile Application

    DEFF Research Database (Denmark)

    Li, Xueliang; Gallagher, John Patrick

    2016-01-01

    The goal of an energy model for source code is to lay a foundation for the application of energy-aware programming techniques. State of the art solutions are based on source-line energy information. In this paper, we present an approach to constructing a fine-grained energy model which is able...

  20. Development and application of methods to characterize code uncertainty

    International Nuclear Information System (INIS)

    Wilson, G.E.; Burtt, J.D.; Case, G.S.; Einerson, J.J.; Hanson, R.G.

    1985-01-01

    The United States Nuclear Regulatory Commission sponsors both international and domestic studies to assess its safety analysis codes. The Commission staff intends to use the results of these studies to quantify the uncertainty of the codes with a statistically based analysis method. Development of the methodology is underway. The Idaho National Engineering Laboratory contributions to the early development effort, and testing of two candidate methods are the subjects of this paper

  1. Comparison of DT neutron production codes MCUNED, ENEA-JSI source subroutine and DDT

    Energy Technology Data Exchange (ETDEWEB)

    Čufar, Aljaž, E-mail: aljaz.cufar@ijs.si [Reactor Physics Department, Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Lengar, Igor; Kodeli, Ivan [Reactor Physics Department, Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Milocco, Alberto [Culham Centre for Fusion Energy, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); Sauvan, Patrick [Departamento de Ingeniería Energética, E.T.S. Ingenieros Industriales, UNED, C/Juan del Rosal 12, 28040 Madrid (Spain); Conroy, Sean [VR Association, Uppsala University, Department of Physics and Astronomy, PO Box 516, SE-75120 Uppsala (Sweden); Snoj, Luka [Reactor Physics Department, Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia)

    2016-11-01

Highlights: • The results of three codes capable of simulating accelerator-based DT neutron generators were compared on a simple model in which only a thin target made of a mixture of titanium and tritium is present. Two typical deuteron beam energies, 100 keV and 250 keV, were used in the comparison. • Comparisons of the angular dependence of the total neutron flux and spectrum, as well as the spectrum of all neutrons emitted from the target, show general agreement of the results but also some noticeable differences. • A comparison of figures of merit of the calculations showed that the computational time necessary to achieve the same statistical uncertainty can vary by more than a factor of 30 depending on which code is used to simulate the DT neutron generator. - Abstract: As the DT fusion reaction produces neutrons with energies significantly higher than in fission reactors, special fusion-relevant benchmark experiments are often performed using DT neutron generators. However, commonly used Monte Carlo particle transport codes such as MCNP or TRIPOLI cannot be directly used to analyze these experiments, since they do not have the capability to model the production of DT neutrons. Three of the available approaches to model the DT neutron generator source are the MCUNED code, the ENEA-JSI DT source subroutine, and the DDT code. The MCUNED code is an extension of the well-established and validated MCNPX Monte Carlo code. The ENEA-JSI source subroutine was originally prepared for the modelling of the FNG experiments using different versions of the MCNP code (MCNP4, MCNP5, MCNPX) and was later extended to allow the modelling of both DT and DD neutron sources. The DDT code prepares the DT source definition file (SDEF card in MCNP) which can then be used in different versions of the MCNP code. In the paper the methods for the simulation of the DT neutron production used in the codes are briefly described and compared for the case of a
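The angular dependence these source models must capture follows from two-body reaction kinematics; a minimal nonrelativistic sketch (target at rest, standard kinematics formula, not any of the three codes):

```python
# Two-body kinematics of T(d,n)alpha, nonrelativistic, target at rest:
# neutron energy vs laboratory emission angle for a given deuteron energy.
# This reproduces the forward/backward energy spread that makes dedicated
# DT source models necessary.
import math

Q = 17.589                                 # reaction Q-value [MeV]
m_d, m_n, m_a = 2.0141, 1.0087, 4.0026     # masses in amu (only ratios matter)

def neutron_energy(E_d, theta_deg):
    """Neutron energy [MeV] for deuteron energy E_d [MeV] at lab angle."""
    th = math.radians(theta_deg)
    a = math.sqrt(m_d * m_n * E_d) * math.cos(th) / (m_n + m_a)
    b = (m_a * Q + (m_a - m_d) * E_d) / (m_n + m_a)
    return (a + math.sqrt(a * a + b)) ** 2

for th in (0, 90, 180):
    print(th, round(neutron_energy(0.1, th), 2))   # 100 keV deuterons
```

At 100 keV deuteron energy this gives roughly 14.8 MeV neutrons forward and 13.4 MeV backward, converging to about 14.05 MeV (the mass-ratio fraction of Q) as the beam energy goes to zero.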

  2. Theoretical Atomic Physics code development IV: LINES, A code for computing atomic line spectra

    International Nuclear Information System (INIS)

    Abdallah, J. Jr.; Clark, R.E.H.

    1988-12-01

    A new computer program, LINES, has been developed for simulating atomic line emission and absorption spectra using the accurate fine structure energy levels and transition strengths calculated by the (CATS) Cowan Atomic Structure code. Population distributions for the ion stages are obtained in LINES by using the Local Thermodynamic Equilibrium (LTE) model. LINES is also useful for displaying the pertinent atomic data generated by CATS. This report describes the use of LINES. Both CATS and LINES are part of the Theoretical Atomic PhysicS (TAPS) code development effort at Los Alamos. 11 refs., 9 figs., 1 tab

  3. Code-first development with Entity Framework

    CERN Document Server

    Barskiy, Sergey

    2015-01-01

    This book is intended for software developers with some prior experience with the Microsoft .NET framework who want to learn how to use Entity Framework. This book will get you up and running quickly, providing many examples that illustrate all the key concepts of Entity Framework.

  4. The Kepler Science Data Processing Pipeline Source Code Road Map

    Science.gov (United States)

Wohler, Bill; Jenkins, Jon M.; Twicken, Joseph D.; Bryson, Stephen T.; Clarke, Bruce Donald; Middour, Christopher K.; Quintana, Elisa Victoria; Sanderfer, Jesse Thomas; Uddin, Akm Kamal; Sabale, Anima

    2016-01-01

We give an overview of the operational concepts and architecture of the Kepler Science Processing Pipeline. Designed, developed, operated, and maintained by the Kepler Science Operations Center (SOC) at NASA Ames Research Center, the Science Processing Pipeline is a central element of the Kepler Ground Data System. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center which hosts the computers required to perform data analysis. The SOC's charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Processing Pipeline, including the software algorithms. We present the high-performance, parallel computing software modules of the pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization.

  5. Aeroelastic code development activities in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Wright, A.D. [National Renewable Energy Lab., Golden, Colorado (United States)

    1996-09-01

    Designing wind turbines to be fatigue resistant and to have long lifetimes at minimal cost is a major goal of the federal wind program and the wind industry in the United States. To achieve this goal, we must be able to predict critical loads for a wide variety of different wind turbines operating under extreme conditions. The codes used for wind turbine dynamic analysis must be able to analyze a wide range of different wind turbine configurations as well as rapidly predict the loads due to turbulent wind inflow with a minimal set of degrees of freedom. Code development activities in the US have taken a two-pronged approach in order to satisfy both of these criteria: (1) development of a multi-purpose code which can be used to analyze a wide variety of wind turbine configurations without having to develop new equations of motion with each configuration change, and (2) development of specialized codes with minimal sets of specific degrees of freedom for analysis of two- and three-bladed horizontal axis wind turbines and calculation of machine loads due to turbulent inflow. In the first method we have adapted a commercial multi-body dynamics simulation package for wind turbine analysis. In the second approach we are developing specialized codes with limited degrees of freedom, usually specified in the modal domain. This paper will summarize progress to date in the development, validation, and application of these codes. (au) 13 refs.

  6. Structural reliability methods: Code development status

    Science.gov (United States)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
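The fast-probability-integration idea behind NESSUS/FPI reduces, for the special case of a linear limit state with independent normal variables, to a closed form; a minimal illustration with hypothetical numbers (the real code handles nonlinear limit states from finite element sensitivities):

```python
# First-order reliability illustration: for limit state g = R - S with
# independent normal strength R and load effect S, the failure probability
# follows directly from the reliability index beta. Values are hypothetical.
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

mu_R, sd_R = 500.0, 40.0    # strength, e.g. MPa
mu_S, sd_S = 350.0, 50.0    # load effect, e.g. MPa

beta = (mu_R - mu_S) / math.hypot(sd_R, sd_S)   # reliability index
pf = norm_cdf(-beta)                            # failure probability
print(beta, pf)    # beta ≈ 2.34, pf ≈ 9.6e-3
```

The appeal of this route over brute-force sampling is that a probability of 10^-3 costs one formula evaluation here, versus millions of Monte Carlo samples for comparable accuracy.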

  7. Development of computer code in PNC, 8

    International Nuclear Information System (INIS)

    Ohhira, Mitsuru

    1990-01-01

Private buildings with base isolation systems are now at the practical stage. Accordingly, under the Construction and Maintenance Management Office, we are conducting a study on applying base isolation systems to nuclear fuel facilities. In the course of this study, we have developed the Dynamic Analysis Program-Base Isolation System (DAP-BS), which runs on a 32-bit personal computer. Using this program, we can analyze a 3-dimensional structure and evaluate the various properties of base isolation parts divided into a maximum of 16 blocks. From the results of several simulation analyses, we judged that DAP-BS has good reliability and marketability, and we have therefore put it on the market. (author)

  8. Development and validation of a nodal code for core calculation

    International Nuclear Information System (INIS)

    Nowakowski, Pedro Mariano

    2004-01-01

The code RHENO solves the multigroup three-dimensional diffusion equation using a nodal method of polynomial expansion. A comparative study has been made between this code and current international nodal diffusion codes, showing that RHENO is up to date. RHENO has been integrated into a calculation line and extended to perform burnup calculations. Two methods for pin power reconstruction were developed: modulation and embedded. The modulation method has been implemented in a program, while implementation of the embedded method will be concluded shortly. The validation carried out (which includes experimental data from an MPR) shows very good results and calculation efficiency.
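The eigenvalue problem such a core code solves can be illustrated in a much reduced setting: one energy group, one dimension, and finite differences standing in for the nodal polynomial expansion (cross-sections below are toy values):

```python
# Reduced sketch of the diffusion eigenvalue problem: one group, 1-D slab,
# finite differences instead of a 3-D nodal expansion. Power iteration
# yields k-eff and the flux shape for a bare homogeneous slab.
import numpy as np

D, sig_a, nu_sig_f, L = 1.0, 0.08, 0.1, 100.0   # toy data, cm units
n = 100
h = L / (n + 1)

# Loss operator: -D d^2/dx^2 + sig_a with zero-flux boundaries.
main = np.full(n, 2 * D / h**2 + sig_a)
off = np.full(n - 1, -D / h**2)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

phi, k = np.ones(n), 1.0
for _ in range(500):                    # power (fission source) iteration
    src = nu_sig_f * phi / k            # fission source scaled by 1/k
    phi_new = np.linalg.solve(A, src)   # one "flux solve" per iteration
    k *= phi_new.sum() / phi.sum()      # update the eigenvalue estimate
    phi = phi_new

print(k)   # close to the analytic k = nu_sig_f / (sig_a + D*(pi/L)^2)
```

Nodal methods like the polynomial expansion replace the fine mesh with a few large nodes carrying higher-order flux shapes, but the outer eigenvalue iteration has the same structure as this sketch.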

  9. Development and validation of a fuel performance analysis code

    International Nuclear Information System (INIS)

    Majalee, Aaditya V.; Chaturvedi, S.

    2015-01-01

CAD has been developing a computer code, FRAVIZ, for calculation of the steady-state thermomechanical behaviour of nuclear reactor fuel rods. It contains four major modules, viz., a thermal module, a fission gas release module, a material properties module, and a mechanical module. These four modules are coupled, and feedback from each module is fed back to the others to obtain a self-consistent evolution in time. The code has been checked against two FUMEX benchmarks. Modelling fuel performance in the Advanced Heavy Water Reactor would require additional inputs related to the fuel and some modifications to the code. (author)

  10. Development and validation of sodium fire analysis code ASSCOPS

    International Nuclear Information System (INIS)

    Ohno, Shuji

    2001-01-01

Version 2.1 of the ASSCOPS sodium fire analysis code was developed to evaluate the thermal consequences of a sodium leak and consequent fire in LMFBRs. This report describes the computational models and the validation studies performed with the code. ASSCOPS calculates sodium droplet and pool fires and the consequent heat and mass transfer behavior. Analyses of sodium pool and spray fire experiments confirmed that the code and the parameters used in the validation studies give valid results for the thermal consequences of sodium leaks and fires. (author)

  11. Development of safety analysis codes for light water reactor

    International Nuclear Information System (INIS)

    Akimoto, Masayuki

    1985-01-01

An overview is presented of the major codes currently used for the prediction of thermohydraulic transients in nuclear power plants. The overview centers on the two-phase fluid dynamics of the coolant system and the assessment of the codes. Some two-phase phenomena, such as phase separation, are still not predicted with engineering accuracy. The MINCS-PIPE code is briefly introduced; it is used to assess constitutive relations and to aid the development of experimental correlations for models ranging from 1V1T to 2V2T. (author)

  12. SCDAP/RELAP5/MOD3 code development

    International Nuclear Information System (INIS)

    Allison, C.M.; Siefken, J.L.; Coryell, E.W.

    1992-01-01

The SCDAP/RELAP5/MOD3 computer code is designed to describe the overall reactor coolant system (RCS) thermal-hydraulic response, core damage progression, and fission product release and transport during severe accidents. The code is being developed at the Idaho National Engineering Laboratory (INEL) under the primary sponsorship of the Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission (NRC). Code development activities are currently focused on three main areas: (a) code usability, (b) early-phase melt progression model improvements, and (c) advanced reactor thermal-hydraulic model extensions. This paper describes the first two activities; a companion paper describes the advanced reactor model improvements being performed under RELAP5/MOD3 funding.

  13. Development of LWR fuel performance code FEMAXI-6

    International Nuclear Information System (INIS)

    Suzuki, Motoe

    2006-01-01

The LWR fuel performance code FEMAXI-6 (Finite Element Method in AXIs-symmetric system) is a representative fuel analysis code in Japan. Its development history, background, design ideas, model features, and future directions are described, covering the characteristic behavior of LWR fuel and of analysis codes, the nature of the models, the development history of FEMAXI, the use of the FEMAXI code, the fuel models, and the special features of the FEMAXI models. As examples of analysis, PCMI (Pellet-Clad Mechanical Interaction), fission gas release, gap bonding, and fission gas bubble swelling are reported. Also shown are the thermal and dynamic analysis system of FEMAXI-6, the function blocks in one time step of FEMAXI-6, an analysis of PCMI in a power increase test by FEMAXI-III, an analysis of fission gas release in the Halden reactor by FEMAXI-V, a comparison of fuel center temperatures in the Halden reactor, and an analysis of the diameter change of a high-burnup BWR fuel rod. (S.Y.)
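A back-of-envelope version of the thermal problem such a code solves, stated only as an orientation for the scale involved: for a cylindrical pellet with uniform heat generation and constant conductivity, the centerline-to-surface temperature rise depends only on the linear heat rate, not on the pellet radius (values below are illustrative, not FEMAXI data):

```python
# Simple analytic check behind fuel thermal analysis: for uniform heat
# generation and constant conductivity, delta-T(center - surface) = q'/(4*pi*k),
# independent of pellet radius. Real codes like FEMAXI solve this with
# temperature-dependent conductivity, gap conductance, and burnup effects.
import math

def centerline_rise(q_lin, k):
    """q_lin: linear heat rate [W/m]; k: conductivity [W/m-K]; returns K."""
    return q_lin / (4.0 * math.pi * k)

q_lin = 25e3      # 25 kW/m, a typical LWR linear heat rate
k_uo2 = 3.0       # W/m-K, rough UO2 conductivity at operating temperature
print(centerline_rise(q_lin, k_uo2))    # ≈ 663 K rise
```

The several-hundred-kelvin rise this gives is why fuel centerline temperature, and hence fission gas release and swelling, is so sensitive to power level.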

  14. SCDAP/RELAP5 code development and assessment

    International Nuclear Information System (INIS)

    Allison, C.M.; Hohorst, J.K.

    1996-01-01

    The SCDAP/RELAP5 computer code is designed to describe the overall reactor coolant system thermal-hydraulic response, core damage progression, and fission product release during severe accidents. The code is being developed at the Idaho National Engineering Laboratory under the primary sponsorship of the Office of Nuclear Regulatory Research of the U.S. Nuclear Regulatory Commission. The current version of the code is SCDAP/RELAP5/MOD3.1e. Although MOD3.1e contains a number of significant improvements since the initial version of MOD3.1 was released, new models to treat the behavior of the fuel and cladding during reflood have had the most dramatic impact on the code's calculations. This paper provides a brief description of the new reflood models, presents highlights of the assessment of the current version of MOD3.1, and discusses future SCDAP/RELAP5/MOD3.2 model development activities

  15. Computer codes developed in FRG to analyse hypothetical meltdown accidents

    International Nuclear Information System (INIS)

    Hassmann, K.; Hosemann, J.P.; Koerber, H.; Reineke, H.

    1978-01-01

    It is the purpose of this paper to give the status of all significant computer codes developed in the core melt-down project which is incorporated in the light water reactor safety research program of the Federal Ministry of Research and Technology. For standard pressurized water reactors, results of some computer codes will be presented, describing the course and the duration of the hypothetical core meltdown accident. (author)

  16. Joint source/channel coding of scalable video over noisy channels

    Energy Technology Data Exchange (ETDEWEB)

    Cheung, G.; Zakhor, A. [Department of Electrical Engineering and Computer Sciences University of California Berkeley, California94720 (United States)

    1997-01-01

We propose an optimal bit allocation strategy for a joint source/channel video codec over a noisy channel when the channel state is assumed to be known. Our approach is to partition source and channel coding bits in such a way that the expected distortion is minimized. The particular source coding algorithm we use is rate scalable and is based on 3D subband coding with multi-rate quantization. We show that using this strategy, transmission of video over very noisy channels still renders acceptable visual quality, and outperforms schemes that use equal error protection only. The flexibility of the algorithm also permits the bit allocation to be selected optimally when the channel state is in the form of a probability distribution instead of a deterministic state. © 1997 American Institute of Physics.
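The allocation principle can be sketched with toy distortion and loss models; both models below are illustrative stand-ins, not the paper's 3D subband codec or its channel code:

```python
# Toy version of the allocation problem: split a fixed bit budget between
# source coding and channel protection so expected distortion is minimized.
# Distortion follows the classic 2^{-2R} decay; the loss model is made up.
sigma2 = 1.0        # source variance (distortion when the block is lost)
R_total = 10        # total bits per block

def src_dist(r_s):
    """Source distortion vs. source bits (rate-distortion decay)."""
    return sigma2 * 2.0 ** (-2 * r_s)

def loss_prob(r_c):
    """Decoding-failure probability vs. parity bits (illustrative: halves
    per added parity bit)."""
    return 0.5 ** r_c

# Exhaustive search over the partition: expected distortion mixes the
# decoded-block distortion with total loss of the block.
best = min(
    ((1 - loss_prob(R_total - r_s)) * src_dist(r_s)
     + loss_prob(R_total - r_s) * sigma2, r_s)
    for r_s in range(R_total + 1)
)
exp_dist, r_s_opt = best
print(r_s_opt, R_total - r_s_opt, exp_dist)
```

The optimum sits strictly between the extremes: spending every bit on the source leaves the block unprotected, while spending every bit on parity leaves nothing worth protecting, which is exactly why unequal error protection beats the equal-protection baseline in the paper.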

  17. Remodularizing Java Programs for Improved Locality of Feature Implementations in Source Code

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    Explicit traceability between features and source code is known to help programmers understand and modify programs during maintenance tasks. However, the complex relations between features and their implementations are not evident from the source code of object-oriented Java programs... Consequently, the implementations of individual features are difficult to locate, comprehend, and modify in isolation. In this paper, we present a novel remodularization approach that improves the representation of features in the source code of Java programs. Both forward and reverse restructurings... are supported through on-demand bidirectional restructuring between feature-oriented and object-oriented decompositions. The approach includes a feature location phase based on tracing program execution, a feature representation phase that reallocates classes into a new package structure based on single...

  18. The Development of the World Anti-Doping Code.

    Science.gov (United States)

    Young, Richard

    2017-01-01

    This chapter addresses both the development and substance of the World Anti-Doping Code, which came into effect in 2003, as well as the subsequent Code amendments, which came into effect in 2009 and 2015. Through an extensive process of stakeholder input and collaboration, the World Anti-Doping Code has transformed the hodgepodge of inconsistent and competing pre-2003 anti-doping rules into a harmonized and effective approach to anti-doping. The Code, as amended, is now widely recognized worldwide as the gold standard in anti-doping. The World Anti-Doping Code originally went into effect on January 1, 2004. The first amendments to the Code went into effect on January 1, 2009, and the second amendments on January 1, 2015. The Code and the related international standards are the product of a long and collaborative process designed to make the fight against doping more effective through the adoption and implementation of worldwide harmonized rules and best practices. © 2017 S. Karger AG, Basel.

  19. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role to coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...
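As a minimal concrete instance of the setting this abstract generalizes, here is scalar network coding on the classic butterfly network: the bottleneck node forwards a GF(2) linear combination (XOR) of its inputs, and each sink still recovers both source packets. Vector network coding replaces these scalar coefficients with L x L matrices:

```python
def butterfly(a, b):
    """Source packets a and b (ints used as bit vectors) traverse the
    butterfly network; the bottleneck link carries their XOR, and each
    sink decodes both packets from its direct packet plus the coded one."""
    coded = a ^ b           # GF(2) linear combination on the bottleneck link
    sink1 = (a, coded ^ a)  # sink 1 receives a directly, recovers b
    sink2 = (coded ^ b, b)  # sink 2 receives b directly, recovers a
    return sink1, sink2

# Both sinks obtain both source packets despite the shared bottleneck,
# which plain routing cannot achieve on this network.
s1, s2 = butterfly(0b1010, 0b0110)
```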

  20. Open-source tool for automatic import of coded surveying data to multiple vector layers in GIS environment

    Directory of Open Access Journals (Sweden)

    Eva Stopková

    2016-12-01

    Full Text Available This paper deals with a tool that enables import of coded data in a single text file to more than one vector layer (including attribute tables), together with automatic drawing of line and polygon objects and with optional conversion to CAD. The Python script v.in.survey is available as an add-on for the open-source software GRASS GIS (GRASS Development Team). The paper describes a case study based on surveying at the archaeological mission at Tell el-Retaba (Egypt). Advantages of the tool (e.g. significant optimization of surveying work) and its limits (demands on keeping conventions for the points' name coding) are discussed here as well. Possibilities of future development are suggested (e.g. generalization of points' name coding or more complex attribute table creation).

  1. Development of the versatile reactor analysis code system, MARBLE2

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Jin, Tomoyuki; Hazama, Taira; Hirai, Yasushi

    2015-07-01

    The second version of the versatile reactor analysis code system, MARBLE2, has been developed. A lot of new functions have been added in MARBLE2 by using the base technology developed in the first version (MARBLE1). Introducing the remaining functions of the conventional code system (JOINT-FR and SAGEP-FR), MARBLE2 enables one to execute almost all analysis functions of the conventional code system with the unified user interfaces of its subsystem, SCHEME. In particular, the sensitivity analysis functionality is available in MARBLE2. On the other hand, new built-in solvers have been developed, and existing ones have been upgraded. Furthermore, some other analysis codes and libraries developed in JAEA have been consolidated and prepared in SCHEME. In addition, several analysis codes developed in the other institutes have been additionally introduced as plug-in solvers. Consequently, gamma-ray transport calculation and heating evaluation become available. As for another subsystem, ORPHEUS, various functionality updates and speed-up techniques have been applied based on user experience of MARBLE1 to enhance its usability. (author)

  2. Development of Visual CINDER Code with Visual C#.NET

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Oyeon [Institute for Modeling and Simulation Convergence, Daegu (Korea, Republic of)

    2016-10-15

    The CINDER code (CINDER'90 or CINDER2008), integrated with the Monte Carlo code MCNPX, is widely used to calculate the inventory of nuclides in irradiated materials. The MCNPX code adds decay processes to a particle transport scheme that traditionally covered only prompt processes. The integration scheme serves not only the reactor community (MCNPX burnup) but also the accelerator community (residual production information). The big benefit of providing these options lies in the easy cross comparison of transmutation codes, since the calculations are based on exactly the same material, neutron flux, and isotope production/destruction inputs. However, the setup is frustratingly cumbersome to use, and the multiple human interventions it requires increase the possibility of errors. The number of significant digits in the input data varies in steps, which may cause large errors for highly nonlinear problems. Thus, it is worthwhile to find a new way to wrap all the codes and procedures in one consistent package that offers ease of use. The development of the Visual CINDER code is underway with the Visual C#.NET framework. It provides several benefits for transmutation simulation with the CINDER code. A few interesting and useful properties of the Visual C#.NET framework are introduced. We also show that the wrapper can make simulations accurate for highly nonlinear transmutation problems and open the possibility of directly combining the radiation transport code MCNPX with the CINDER code. Direct combination of CINDER with MCNPX in a wrapper will provide more functionality for radiation shielding and protection studies.

  3. Development of Visual CINDER Code with Visual C#.NET

    International Nuclear Information System (INIS)

    Kim, Oyeon

    2016-01-01

    The CINDER code (CINDER'90 or CINDER2008), integrated with the Monte Carlo code MCNPX, is widely used to calculate the inventory of nuclides in irradiated materials. The MCNPX code adds decay processes to a particle transport scheme that traditionally covered only prompt processes. The integration scheme serves not only the reactor community (MCNPX burnup) but also the accelerator community (residual production information). The big benefit of providing these options lies in the easy cross comparison of transmutation codes, since the calculations are based on exactly the same material, neutron flux, and isotope production/destruction inputs. However, the setup is frustratingly cumbersome to use, and the multiple human interventions it requires increase the possibility of errors. The number of significant digits in the input data varies in steps, which may cause large errors for highly nonlinear problems. Thus, it is worthwhile to find a new way to wrap all the codes and procedures in one consistent package that offers ease of use. The development of the Visual CINDER code is underway with the Visual C#.NET framework. It provides several benefits for transmutation simulation with the CINDER code. A few interesting and useful properties of the Visual C#.NET framework are introduced. We also show that the wrapper can make simulations accurate for highly nonlinear transmutation problems and open the possibility of directly combining the radiation transport code MCNPX with the CINDER code. Direct combination of CINDER with MCNPX in a wrapper will provide more functionality for radiation shielding and protection studies.

  4. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    International Nuclear Information System (INIS)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C
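The straight-line Gaussian plume model the abstract names as ANEMOS's basis can be sketched as follows. The power-law dispersion coefficients below are a generic illustrative fit (roughly neutral stability), not ANEMOS's parameterization, and deposition, sector averaging, plume rise, and daughter in-growth are omitted:

```python
import math

def sigma_y(x):
    """Horizontal dispersion coefficient in m (x in m), illustrative fit."""
    return 0.08 * x / math.sqrt(1 + 0.0001 * x)

def sigma_z(x):
    """Vertical dispersion coefficient in m (x in m), illustrative fit."""
    return 0.06 * x / math.sqrt(1 + 0.0015 * x)

def concentration(q, u, x, y, h):
    """Ground-level air concentration (e.g. Bq/m^3) at downwind distance x
    and crosswind offset y, for release rate q (Bq/s), wind speed u (m/s),
    and effective release height h (m). Includes ground reflection."""
    sy, sz = sigma_y(x), sigma_z(x)
    return (q / (math.pi * u * sy * sz)
            * math.exp(-y**2 / (2 * sy**2))
            * math.exp(-h**2 / (2 * sz**2)))

# Concentration is largest on the plume axis and falls off crosswind.
c_axis = concentration(1.0e9, 3.0, 1000.0, 0.0, 50.0)
c_off = concentration(1.0e9, 3.0, 1000.0, 200.0, 50.0)
```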

  5. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C.

  6. Multi-dimensional Code Development for Safety Analysis of LMR

    International Nuclear Information System (INIS)

    Ha, K. S.; Jeong, H. Y.; Kwon, Y. M.; Lee, Y. B.

    2006-08-01

    A liquid metal reactor loaded with metallic fuel has an inherent safety mechanism owing to several negative reactivity feedbacks. Although this feature was demonstrated through experiments in the EBR-II, no computer program had analyzed it exactly until now because of the complexity of the reactivity feedback mechanism. A detailed multi-dimensional program was developed through the International Nuclear Energy Research Initiative (INERI) from 2003 to 2005. This report covers the numerical coupling of the multi-dimensional program with the SSC-K code, which is used for the safety analysis of liquid metal reactors at KAERI. The coupled code has been validated by comparing its analysis results with those of ANL's SAS-SASSYS code for the UTOP, ULOF, and ULOHS transients applied in the safety analysis of KALIMER-150.

  7. Development of 1D Liner Compression Code for IDL

    Science.gov (United States)

    Shimazu, Akihisa; Slough, John; Pancotti, Anthony

    2015-11-01

    A 1D liner compression code has been developed to model liner implosion dynamics in the Inductively Driven Liner Experiment (IDL), where an FRC plasmoid is compressed via inductively driven metal liners. The driver circuit, magnetic field, joule heating, and liner dynamics calculations are performed in sequence at each time step to couple these effects in the code. To obtain more realistic magnetic field results for a given drive coil geometry, 2D and 3D effects are incorporated into the 1D field calculation through a correction-factor table-lookup approach. A commercial low-frequency electromagnetic field solver, ANSYS Maxwell 3D, is used to solve the magnetic field profile for the static liner condition at various liner radii in order to derive correction factors for the 1D field calculation in the code. The liner dynamics results from the code are verified to be in good agreement with the results from a commercial explicit dynamics solver, ANSYS Explicit Dynamics, and with a previous liner experiment. The developed code is used to optimize the capacitor bank and driver coil design for better energy transfer and coupling. FRC gain calculations are also performed using the liner compression data from the code for the conceptual design of a reactor-sized system for fusion energy gains.
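The per-time-step coupling described here can be caricatured in a few lines: a prescribed drive field exerts magnetic pressure B^2/(2*mu0) on a thin liner whose radial motion is advanced explicitly. The circuit solve, joule heating, and the 2D/3D correction factors of the actual code are omitted, and all numbers are illustrative, not IDL parameters:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def implode(r0=0.1, mass_per_len=0.5, b_peak=20.0, t_rise=20e-6,
            dt=1e-8, t_end=40e-6):
    """Return (time, radius) history of a thin liner driven inward by
    magnetic pressure from a prescribed half-sine drive field."""
    r, v, t = r0, 0.0, 0.0
    history = []
    while t < t_end and r > 0.01 * r0:
        b = b_peak * math.sin(math.pi * t / (2 * t_rise))  # drive field, T
        pressure = b * b / (2 * MU0)                       # magnetic pressure, Pa
        force_per_len = -pressure * 2 * math.pi * r        # inward force, N/m
        v += force_per_len / mass_per_len * dt             # advance velocity
        r += v * dt                                        # advance radius
        t += dt
        history.append((t, r))
    return history

# The liner collapses inward once the field ramps up.
hist = implode()
```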

  8. Development of REFLA/TRAC code for engineering work station

    International Nuclear Information System (INIS)

    Ohnuki, Akira; Akimoto, Hajime; Murao, Yoshio

    1994-03-01

    The REFLA/TRAC code is a best-estimate code which is expected to check reactor safety analysis codes for light water reactors (LWRs) and to perform accident analyses for LWRs and also for an advanced LWR. Therefore, a high predictive capability is required, and the assessment of each physical model becomes important because the models govern the predictive capability. For the assessment of the three-dimensional models in the REFLA/TRAC code, a conventional large computer has been used, and it is difficult to perform the assessment efficiently because the turnaround time for the calculation and the analysis is long. A version of the REFLA/TRAC code which can run on an engineering work station (EWS) was therefore developed. The calculational speed of a current EWS is of the same order as that of large computers, and the EWS has excellent functions for multidimensional graphical drawing. Besides, plotting processors for X-Y drawing and for two-dimensional graphical drawing were developed in order to perform efficient analyses of three-dimensional calculations. In the future, the assessment of three-dimensional models is expected to become more efficient by introducing an EWS with higher calculational speed and improved graphical drawing. In this report, each outline for the following three programs is described: (1) EWS version of the REFLA/TRAC code, (2) plot processor for X-Y drawing and (3) plot processor for two-dimensional graphical drawing. (author)

  9. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  10. Development of the point-depletion code DEPTH

    International Nuclear Information System (INIS)

    She, Ding; Wang, Kan; Yu, Ganglin

    2013-01-01

    Highlights: ► The DEPTH code has been developed for large-scale depletion systems. ► DEPTH uses a data library which is convenient to couple with MC codes. ► TTA and matrix exponential methods are implemented and compared. ► DEPTH is able to calculate integral quantities based on the matrix inverse. ► Code-to-code comparisons prove the accuracy and efficiency of DEPTH. -- Abstract: Burnup analysis is an important aspect of reactor physics, generally done by coupling transport calculations with point-depletion calculations. DEPTH is a newly developed point-depletion code for handling large depletion systems and detailed depletion chains. For better coupling with Monte Carlo transport codes, DEPTH uses data libraries based on the combination of ORIGEN-2 and ORIGEN-S and allows users to assign problem-dependent libraries for each depletion step. DEPTH implements various algorithms for treating stiff depletion systems, including the Transmutation Trajectory Analysis (TTA), the Chebyshev Rational Approximation Method (CRAM), the Quadrature-based Rational Approximation Method (QRAM) and the Laguerre Polynomial Approximation Method (LPAM). Three different modes are supported by DEPTH to execute decay, constant-flux and constant-power calculations. In addition to obtaining the instantaneous quantities of radioactivity, decay heats and reaction rates, DEPTH is able to calculate integral quantities with a time-integrated solver. Through calculations compared with ORIGEN-2, the validity of DEPTH in point-depletion calculations is demonstrated. The accuracy and efficiency of the depletion algorithms are also discussed. In addition, an actual pin-cell burnup case is calculated to illustrate the DEPTH code's performance in coupling with the RMC Monte Carlo code.
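The point-depletion problem DEPTH addresses reduces to advancing dN/dt = A·N over a step, where A holds decay constants and flux-weighted cross sections. A minimal sketch for a two-nuclide decay chain, using a plain truncated-Taylor matrix exponential (real depletion codes prefer stiffness-safe schemes such as CRAM or TTA), cross-checked against the analytic Bateman solution:

```python
import math

def expm(a, terms=40):
    """exp(a) for a small dense matrix (list of lists), by Taylor series.
    Adequate for this well-conditioned toy; not for stiff actinide chains."""
    n = len(a)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = [[sum(term[i][m] * a[m][j] for m in range(n)) / k
                 for j in range(n)] for i in range(n)]               # a^k / k!
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

lam1, lam2, t = 0.1, 0.02, 5.0     # decay constants (1/s) and step (s)
A = [[-lam1, 0.0],
     [ lam1, -lam2]]               # parent decays and feeds the daughter
At = [[x * t for x in row] for row in A]
E = expm(At)
n0 = [1.0, 0.0]                    # initial inventory: pure parent
n = [sum(E[i][j] * n0[j] for j in range(2)) for i in range(2)]

# Analytic Bateman solution for the same chain, for cross-checking
n1 = math.exp(-lam1 * t)
n2 = lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
```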

  11. Development of a national code of practice for structural masonry ...

    African Journals Online (AJOL)

    The problems and constraints faced by most developing countries, particularly Ghana, in developing codes of practice for structural masonry are highlighted. The steps that must be undertaken through the coordinated efforts of the National Standards Boards, Research Institutions, Universities and Professional Bodies in the ...

  12. Development of FBR integrity system code. Basic concept

    International Nuclear Information System (INIS)

    Asayama, Tai

    2001-05-01

    For fast breeder reactors to be commercialized, they must be more reliable, safer, and, at the same time, economically competitive with future light water reactors. Innovation of the elevated temperature structural design standard is necessary to achieve this goal. The most powerful way is to enlarge the scope of the structural integrity code to cover items other than the design evaluation addressed in existing codes. Items that must be newly covered are the prerequisites of design, fabrication, examination, operation and maintenance, etc. This allows designers to choose the most economical combination of design variations to achieve the specific reliability needed for a particular component. By designing components according to this concept, a cost-minimum design of a whole plant can be realized. By determining the reliability that must be achieved for a component with risk technologies, further economic improvement can be expected by avoiding excessive quality. Recognizing the necessity for codes based on the new concept, the development of the 'FBR integrity system code' began in 2000. Research and development will last 10 years. For this development, the basic logistics and system, as well as the technologies that materialize the concept, are necessary. Original logistics and a system must be developed, because no existing research is available inside or outside Japan. This report presents the results of the work done in the first year regarding the basic idea, methodology, and structure of the code. (author)

  13. Development of throughflow calculation code for axial flow compressors

    International Nuclear Information System (INIS)

    Kim, Ji Hwan; Kim, Hyeun Min; No, Hee Cheon

    2005-01-01

    The power conversion systems of the current HTGRs are based on the closed Brayton cycle, and the major concern is the thermodynamic performance of the axial-flow helium gas turbines. In particular, the helium compressor has some unique design challenges compared to air-breathing compressors, such as high hub-to-tip ratios throughout the machine and a large number of stages, due to the physical properties of helium and the thermodynamic cycle. Therefore, it is necessary to develop a design and analysis code for helium compressors that can accurately estimate the design-point and off-design performance. The KAIST nuclear system laboratory has developed a compressor design and analysis code by means of throughflow calculation and several loss models. This paper presents the outline of the development of the throughflow calculation code and its verification results.

  14. Transparent ICD and DRG coding using information technology: linking and associating information sources with the eXtensible Markup Language.

    Science.gov (United States)

    Hoelzer, Simon; Schweiger, Ralf K; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or "semantically associated" parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach.
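The document-oriented approach can be illustrated with a toy XML fragment of a hierarchical classification, parsed and queried with standard tools. The element and attribute names are invented for this sketch; they follow neither the authors' actual ICD-10 representation nor a standard schema such as ClaML:

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment: hierarchy and coding semantics live in the markup.
xml_fragment = """
<classification name="ICD-10-example">
  <chapter code="X">
    <block code="J00-J06" title="Acute upper respiratory infections">
      <category code="J00" title="Acute nasopharyngitis [common cold]"/>
      <category code="J06" title="Acute upper respiratory infections of multiple sites">
        <subcategory code="J06.9" title="Acute upper respiratory infection, unspecified"/>
      </category>
    </block>
  </chapter>
</classification>
"""

root = ET.fromstring(xml_fragment)

def lookup(code):
    """Walk the explicit hierarchy to resolve a code, as coding software
    or a DRG grouper might."""
    for elem in root.iter():
        if elem.get("code") == code:
            return elem.get("title")
    return None

title = lookup("J06.9")  # resolves through chapter -> block -> category
```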

  15. Development and assessment of the COBRA/RELAP5 code

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Jae Jun; Ha, Kwi Seok; Sim, Seok Ku

    1997-04-01

    The COBRA/RELAP5 code, a merged version of the COBRA-TF and RELAP5/MOD3.2 codes, has been developed to combine the realistic three-dimensional reactor vessel model of COBRA-TF with RELAP5/MOD3, thus producing an advanced system analysis code with a multidimensional thermal-hydraulic module. This report provides the integration scheme of the two codes and the results of developmental assessments. These include single channel tests, a manometric flow oscillation problem, THTF Test 105, and the LOFT L2-3 large-break loss-of-coolant experiment. From the single channel tests, the integration scheme and its implementation were proven to be valid. Other simulation results showed good agreement with the experiments. The computational speed was also satisfactory. It is thus confirmed that COBRA/RELAP5 can be a promising tool for the analysis of complicated, multidimensional two-phase flow transients. Areas for further improvement in the code integration are also identified. This report also serves as a user's manual for the COBRA/RELAP5 code. (author). 6 tabs., 20 figs., 20 refs.

  16. WASTK: A Weighted Abstract Syntax Tree Kernel Method for Source Code Plagiarism Detection

    Directory of Open Access Journals (Sweden)

    Deqiang Fu

    2017-01-01

    Full Text Available In this paper, we introduce a source code plagiarism detection method, named WASTK (Weighted Abstract Syntax Tree Kernel), for computer science education. Different from other plagiarism detection methods, WASTK takes aspects other than the similarity between programs into account. WASTK first transforms the source code of a program into an abstract syntax tree and then obtains the similarity by calculating the tree kernel of the two abstract syntax trees. To avoid misjudgment caused by trivial code snippets or frameworks given by instructors, an idea similar to TF-IDF (Term Frequency-Inverse Document Frequency) in the field of information retrieval is applied. Each node in an abstract syntax tree is assigned a weight by TF-IDF. WASTK is evaluated on different datasets and, as a result, performs much better than other popular methods like Sim and JPlag.
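A deliberately reduced sketch of the idea: parse programs to ASTs, weight node kinds by TF-IDF so instructor-supplied boilerplate counts for less, and compare. The real WASTK computes a tree kernel over weighted subtrees; the bag-of-node-types cosine below only illustrates the weighting step, not the kernel itself:

```python
import ast
import math
from collections import Counter

def node_counts(source):
    """Count AST node kinds, ignoring identifier names (so renaming
    variables does not hide structural similarity)."""
    return Counter(type(n).__name__ for n in ast.walk(ast.parse(source)))

def tfidf_vectors(corpus):
    """TF-IDF weight each program's node-kind counts against the corpus:
    node kinds appearing in every submission get weight zero."""
    counts = [node_counts(src) for src in corpus]
    n_docs = len(corpus)
    df = Counter()
    for c in counts:
        df.update(set(c))
    return [{k: (v / sum(c.values())) * math.log((1 + n_docs) / (1 + df[k]))
             for k, v in c.items()} for c in counts]

def cosine(u, v):
    dot = sum(u[k] * v.get(k, 0.0) for k in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

progs = [
    "def f(xs):\n    return sum(x * x for x in xs)",
    "def g(ys):\n    return sum(y * y for y in ys)",  # f renamed: a 'plagiarism'
    "class Stack:\n    def __init__(self):\n        self.items = []",
]
vecs = tfidf_vectors(progs)
sim_plagiarized = cosine(vecs[0], vecs[1])  # renaming does not fool it
sim_unrelated = cosine(vecs[0], vecs[2])
```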

  17. Rascal: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    P. Klint (Paul); T. van der Storm (Tijs); J.J. Vinju (Jurgen); A. Walenstein; S. Schuppe

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This

  18. RASCAL : a domain specific language for source code analysis and manipulationa

    NARCIS (Netherlands)

    Klint, P.; Storm, van der T.; Vinju, J.J.

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This impedance

  19. Development of particle and heavy ion transport code system

    International Nuclear Information System (INIS)

    Niita, Koji

    2004-01-01

    The particle and heavy ion transport code system (PHITS) is a three-dimensional, general-purpose Monte Carlo simulation code describing the transport and reactions of particles and heavy ions in materials. It was developed on the basis of NMTC/JAM for the design and safety assessment of J-PARC. An overview of PHITS, its physical processes and models, and the development history of the code are described. As application examples, the evaluation of neutron optics, cancer treatment with heavy-particle beams, and cosmic radiation are presented. The JAM and JQMD models are used as the physical models. Neutron motion in a sextupole magnetic field and a gravitational field, PHITS simulations of the trajectory of a 12C beam and of secondary neutron tracks in a small model of the cancer treatment device at HIMAC, and the neutron flux in the Space Shuttle are explained. (S.Y.)

  20. BLT [Breach, Leach, and Transport]: A source term computer code for low-level waste shallow land burial

    International Nuclear Information System (INIS)

    Suen, C.J.; Sullivan, T.M.

    1990-01-01

    This paper discusses the development of a source term model for low-level waste shallow land burial facilities and separates the problem into four individual compartments. These are water flow, corrosion and subsequent breaching of containers, leaching of the waste forms, and solute transport. For the first and the last compartments, we adopted the existing codes, FEMWATER and FEMWASTE, respectively. We wrote two new modules for the other two compartments in the form of two separate Fortran subroutines -- BREACH and LEACH. They were incorporated into a modified version of the transport code FEMWASTE. The resultant code, which contains all three modules of container breaching, waste form leaching, and solute transport, was renamed BLT (for Breach, Leach, and Transport). This paper summarizes the overall program structure and logistics, and presents two examples from the results of verification and sensitivity tests. 6 refs., 7 figs., 1 tab
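The four-compartment structure can be caricatured with a toy source-term balance in which containers breach at a corrosion-driven rate, breached waste forms leach, and the leachate is handed to transport. The rates and the forward-Euler scheme are invented for illustration; BLT itself couples FEMWATER/FEMWASTE finite-element solutions with its BREACH and LEACH modules:

```python
def source_term(years, breach_rate=0.05, leach_rate=0.02, dt=0.1):
    """Toy coupled source-term model. Returns (fraction of containers
    still intact, inventory remaining in waste forms, activity released
    to the transport compartment), all normalized to 1."""
    intact, inventory, released = 1.0, 1.0, 0.0
    for _ in range(int(years / dt)):
        d_breach = breach_rate * intact * dt        # containers failing
        intact -= d_breach
        exposed = 1.0 - intact                      # breached fraction
        d_leach = leach_rate * exposed * inventory * dt
        inventory -= d_leach                        # waste form depletes...
        released += d_leach                         # ...and feeds transport
    return intact, inventory, released

# After 50 years some containers have breached and leaching has begun.
intact, inventory, released = source_term(50.0)
```

The leached activity exactly balances the inventory loss, which is the kind of compartment-to-compartment bookkeeping the BREACH/LEACH/transport chain performs.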

  1. Recent developments in seismic analysis in the code Aster

    International Nuclear Information System (INIS)

    Guihot, P.; Devesa, G.; Dumond, A.; Panet, M.; Waeckel, F.

    1996-01-01

    Progress made these last few years in seismic qualification and design methods allows the physical phenomena actually at play to be better taken into account, while cutting down the conservatism associated with some simplified design methods. Following these changes in methods and developing the most advantageous among them thus contributes to the assessment of seismic margins and the preparation of new design tools for future series. In this paper, the main developments and improvements in methods made these last two years in Code Aster to improve seismic calculation methods and seismic margin assessment are presented. The first development relates to making the MISS3D soil-structure interaction code available through an interface with Code Aster. The second relates to the possibility of performing modal-basis time calculations on multi-supported structures while considering local nonlinearities such as impact, friction, or squeeze-film fluid forces. Recent developments in random dynamics and post-processing devoted to earthquake design are then mentioned. Three applications of these developments are then put forward. The first application relates to a test case for soil-structure interaction design using the MISS3D-Aster coupling. The second is a test case for a multi-supported structure. The last application, closer to manufacturing concerns, refers to the seismic qualification of Main Live Steam stop valves. First results of the independent validation of the Code Aster seismic design functionalities, which ensure and improve the quality of the software, are also recalled. (authors)

  2. Methods for the development of large computer codes under LTSS

    International Nuclear Information System (INIS)

    Sicilian, J.M.

    1977-06-01

    TRAC is a large computer code being developed by Group Q-6 for the analysis of the transient thermal hydraulic behavior of light-water nuclear reactors. A system designed to assist the development of TRAC is described. The system consists of a central HYDRA dataset, R6LIB, containing files used in the development of TRAC, and a file maintenance program, HORSE, which facilitates the use of this dataset

  3. Development of a domestically-made system code

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    According to lessons learned from the Fukushima-Daiichi NPP accidents, a new safety standard based on state-of-the-art findings has been established by the Japanese Nuclear Regulation Authority (NRA) and will soon come into force in Japan. To ensure a precise response to this movement from a technological point of view, safety regulation requires a new system code with much smaller uncertainty and reinforced simulation capability, applicable even to beyond-DBAs (BDBAs), and capable of close coupling to a newly developing severe accident code. Accordingly, development of a new domestically-made system code that incorporates 3-dimensional, 3-or-more-fluid thermal-hydraulics in tandem with 3-dimensional neutronics was started in 2012. In 2012, two branches of development activity, the 'main body' and the advanced features, were pursued in parallel for development efficiency. The main body has been started from scratch, and the following activities have been performed: 1) development and determination of key principles and methodologies to realize a flexible, extensible and robust platform, 2) determination of the requirements definition, 3) start of basic program design and coding, and 4) start of development of a prototypical GUI-based pre/post processor. As for the advanced features, the following activities have been performed: 1) development of Phenomena Identification and Ranking Tables (PIRTs) and a model capability matrix covering normal operations to BDBAs in order to address the requirements definition for advanced modeling, 2) development of a detailed action plan for modification of field equations, numerical schemes and solvers, and 3) start of program development of field equations with an interfacial area concentration transport equation, a robust solver for condensation-induced water hammer phenomena, and a versatile Newton-Raphson solver. (author)

  4. Time-dependent anisotropic external sources in transient 3-D transport code TORT-TD

    International Nuclear Information System (INIS)

    Seubert, A.; Pautz, A.; Becker, M.; Dagan, R.

    2009-01-01

    This paper describes the implementation of a time-dependent distributed external source in TORT-TD by explicitly considering the external source in the "fixed-source" term of the implicitly time-discretised 3-D discrete ordinates transport equation. Anisotropy of the external source is represented by a spherical harmonics series expansion similar to the angular fluxes. The YALINA-Thermal subcritical assembly serves as a test case. The configuration with 280 fuel rods has been analysed with TORT-TD using cross sections in 18 energy groups and P1 scattering order generated by the KAPROS code system. Good agreement is achieved concerning the multiplication factor. The response of the system to an artificial time-dependent source consisting of two square-wave pulses demonstrates the time-dependent external source capability of TORT-TD. The result is physically plausible as judged from validation calculations. (orig.)

  5. Development Of A Navier-Stokes Computer Code

    Science.gov (United States)

    Yoon, Seokkwan; Kwak, Dochan

    1993-01-01

    Report discusses aspects of development of CENS3D computer code, solving three-dimensional Navier-Stokes equations of compressible, viscous, unsteady flow. Implements implicit finite-difference or finite-volume numerical-integration scheme, called "lower-upper symmetric-Gauss-Seidel" (LU-SGS), offering potential for very low computer time per iteration and for fast convergence.

  6. Application of software engineering to development of reactor safety codes

    International Nuclear Information System (INIS)

    Wilburn, N.P.; Niccoli, L.G.

    1981-01-01

    Software engineering, a systematic methodology by which a large-scale software development project is partitioned into manageable pieces, has been applied to the development of LMFBR safety codes. The techniques have been applied extensively in the business and aerospace communities and have provided an answer to the drastically increasing cost of developing and maintaining software. The five phases of software engineering (Survey, Analysis, Design, Implementation, and Testing) were applied in turn to the development of these codes, along with walkthroughs (peer review) at each stage. The application of these techniques has resulted in superior software which is well documented, thoroughly tested, and easier to use, modify and maintain. The development projects have resulted in lower overall cost. (orig.)

  7. Uncertainties in source term calculations generated by the ORIGEN2 computer code for Hanford Production Reactors

    International Nuclear Information System (INIS)

    Heeb, C.M.

    1991-03-01

    The ORIGEN2 computer code is the primary calculational tool for computing isotopic source terms for the Hanford Environmental Dose Reconstruction (HEDR) Project. The ORIGEN2 code computes the amounts of radionuclides that are created or remain in spent nuclear fuel after neutron irradiation and radioactive decay have occurred as a result of nuclear reactor operation. ORIGEN2 was chosen as the primary code for these calculations because it is widely used and accepted by the nuclear industry, both in the United States and in the rest of the world. Its comprehensive library of over 1,600 nuclides includes any possible isotope of interest to the HEDR Project. It is important to evaluate the uncertainties expected from use of ORIGEN2 in the HEDR Project because these uncertainties may have a pivotal impact on the final accuracy and credibility of the results of the project. There are three primary sources of uncertainty in an ORIGEN2 calculation: basic nuclear data uncertainty in neutron cross sections, radioactive decay constants, energy per fission, and fission product yields; calculational uncertainty due to input data; and code uncertainties (i.e., numerical approximations and neutron spectrum-averaged cross-section values from the code library). 15 refs., 5 figs., 5 tabs
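
    If the three uncertainty sources listed above are treated as independent, their relative contributions combine in quadrature; a minimal sketch (the numerical values are hypothetical, not taken from the HEDR study):

```python
import math

def combined_relative_uncertainty(nuclear_data, input_data, code):
    """Combine independent relative (1-sigma) uncertainties in quadrature.

    The three arguments stand for the three uncertainty sources the abstract
    lists for an ORIGEN2 source-term calculation: basic nuclear data,
    calculational input data, and code/library approximations.
    """
    return math.sqrt(nuclear_data**2 + input_data**2 + code**2)

# Hypothetical illustrative values (not from the HEDR study):
total = combined_relative_uncertainty(0.05, 0.10, 0.03)
print(f"combined relative uncertainty: {total:.3f}")
```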

  8. Development of a code of practice for deep geothermal wells

    International Nuclear Information System (INIS)

    Leaver, J.D.; Bolton, R.S.; Dench, N.D.; Fooks, L.

    1990-01-01

    Recent and on-going changes to the structure of the New Zealand geothermal industry have shifted responsibility for the development of geothermal resources from central government to private enterprise. The need for a code of practice for deep geothermal wells was identified by the Geothermal Inspectorate of the Ministry of Commerce to maintain adequate standards of health and safety and to assist with industry deregulation. This paper reports that the Code contains details of the methods, procedures, formulae and design data necessary to attain those standards, and includes information with which drilling engineers having experience only in the oil industry could not be expected to be familiar.

  9. Code of practice for the use of sealed radioactive sources in borehole logging (1998)

    International Nuclear Information System (INIS)

    1989-12-01

    The purpose of this code is to establish working practices, procedures and protective measures which will aid in keeping doses arising from the use of borehole logging equipment containing sealed radioactive sources as low as reasonably achievable, and to ensure that the dose-equivalent limits specified in the National Health and Medical Research Council's radiation protection standards are not exceeded. This code applies to all situations and practices where a sealed radioactive source or sources are used in wireline logging for investigating the physical properties of the geological sequence, or any fluids contained in the geological sequence, or the properties of the borehole itself, whether casing, mudcake or borehole fluids. The radiation protection standards specify dose-equivalent limits for two categories: radiation workers and members of the public. 3 refs., tabs., ills

  10. Theoretical atomic physics code development I: CATS: Cowan Atomic Structure Code

    International Nuclear Information System (INIS)

    Abdallah, J. Jr.; Clark, R.E.H.; Cowan, R.D.

    1988-12-01

    An adaptation of R.D. Cowan's Atomic Structure program, CATS, has been developed as part of the Theoretical Atomic Physics (TAPS) code development effort at Los Alamos. CATS has been designed to be easy to run and to produce data files that can interface with other programs easily. The CATS produced data files currently include wave functions, energy levels, oscillator strengths, plane-wave-Born electron-ion collision strengths, photoionization cross sections, and a variety of other quantities. This paper describes the use of CATS. 10 refs

  11. Recent developments for the HEADTAIL code: updating and benchmarks

    CERN Document Server

    Quatraro, D; Salvant, B

    2010-01-01

    The HEADTAIL code models the evolution of a single bunch interacting with a localized impedance source or an electron cloud, optionally including space charge. The newest version of HEADTAIL relies on a more detailed optical model of the machine taken from MAD-X and is more flexible in handling and distributing the interaction and observation points along the simulated machine. In addition, the option of the interaction with the wake field of specific accelerator components has been added, such that the user can choose to load dipolar and quadrupolar components of the wake from the impedance database ZBASE. The case of a single LHC-type bunch interacting with the realistic distribution of the kicker wake fields inside the SPS has been successfully compared with a single integrated beta-weighted kick per turn. The current version of the code also contains a new module for the longitudinal dynamics to calculate the evolution of a bunch inside an accelerating bucket.

  12. High power microwave source development

    Science.gov (United States)

    Benford, James N.; Miller, Gabriel; Potter, Seth; Ashby, Steve; Smith, Richard R.

    1995-05-01

    The requirements of this project have been to: (1) improve and expand the sources available in the facility for testing purposes and (2) perform specific tasks under direction of the Defense Nuclear Agency concerning the applications of high power microwaves (HPM). In this project the HPM application was power beaming. The requirements of this program were met in the following way: (1) We demonstrated that a compact linear induction accelerator can drive HPM sources at repetition rates in excess of 100 Hz at peak microwave powers of a gigawatt. This was done for the relativistic magnetron. Since the conclusion of this contract, such specifications have also been demonstrated for the relativistic klystron under Ballistic Missile Defense Organization funding. (2) We demonstrated an L-band relativistic magnetron. This device has been used both on our single-pulse machines, CAMEL and CAMEL X, and on the repetitive system CLIA. (3) We demonstrated that phase locking of sources together in large numbers is a feasible technology and showed the generation of multigigawatt S-band radiation in an array of relativistic magnetrons.

  13. BBU code development for high-power microwave generators

    International Nuclear Information System (INIS)

    Houck, T.L.; Westenskow, G.A.; Yu, S.S.

    1992-01-01

    We are developing a two-dimensional, time-dependent computer code for the simulation of transverse instabilities in support of relativistic klystron two-beam accelerator research at LLNL. The code addresses transient effects as well as both cumulative and regenerative beam breakup modes. Although designed specifically for the transport of high current (kA) beams through traveling-wave structures, it is applicable to devices consisting of multiple combinations of standing-wave, traveling-wave, and induction accelerator structures. In this paper we compare code simulations to analytical solutions for the case where there is no rf coupling between cavities, to theoretical scaling parameters for coupled cavity structures, and to experimental data involving beam breakup in the two traveling-wave output structures of our microwave generator. (Author) 4 figs., tab., 5 refs

  14. Development of dynamic simulation code for fuel cycle fusion reactor

    Energy Technology Data Exchange (ETDEWEB)

    Aoki, Isao; Seki, Yasushi [Department of Fusion Engineering Research, Naka Fusion Research Establishment, Japan Atomic Energy Research Institute, Naka, Ibaraki (Japan); Sasaki, Makoto; Shintani, Kiyonori; Kim, Yeong-Chan

    1999-02-01

    A dynamic simulation code for the fuel cycle of a fusion experimental reactor has been developed. The code follows the fuel inventory change with time in the plasma chamber and the fuel cycle system during 2-day pulse operation cycles. The time dependence of the fuel inventory distribution is evaluated considering fuel burn and exhaust in the plasma chamber, and the purification and supply functions. For each subsystem of the plasma chamber and the fuel cycle system, a fuel inventory equation is written based on the equation of state, considering fuel burn and the functions of exhaust, purification, and supply. The processing constants of the subsystems for steady states were taken from the values in the ITER Conceptual Design Activity (CDA) report. Using this code, the time dependence of the fuel supply and inventory depending on the burn state and subsystem processing functions is shown. (author)
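
    The per-subsystem inventory bookkeeping described above can be sketched as a pair of coupled balance equations stepped forward in time; every rate constant below is an illustrative assumption, not a value from the ITER CDA report:

```python
def step(plasma, loop, dt, supply_rate, burn_frac, exhaust_rate, process_rate):
    """Advance the two fuel inventories (g) by one explicit-Euler step of dt seconds.

    plasma : fuel inventory in the plasma chamber
    loop   : fuel inventory held up in the purification/supply loop
    """
    burn = burn_frac * supply_rate       # fraction of supplied fuel burnt
    exhaust = exhaust_rate * plasma      # unburnt fuel pumped to the loop
    recycle = process_rate * loop        # purified fuel returned to storage
    d_plasma = supply_rate - burn - exhaust
    d_loop = exhaust - recycle
    return plasma + dt * d_plasma, loop + dt * d_loop

# One hour of a burn pulse with 1 s steps (all parameters assumed):
plasma, loop = 0.0, 50.0
for _ in range(3600):
    plasma, loop = step(plasma, loop, 1.0, 0.02, 0.05, 1e-3, 5e-4)
print(f"plasma inventory: {plasma:.1f} g, loop inventory: {loop:.1f} g")
```

    With these assumed constants the chamber inventory relaxes toward a steady value set by the supply, burn and exhaust rates, which is the qualitative behaviour the code is built to track.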

  15. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    Given that the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartment containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis of Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of 3 compartments including the reactor containment and RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses when considering the inherent conservatism in the CONTEMPT-LT code.

  16. Development and improvement of safety analysis code for geological disposal

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    In order to confirm the long-term safety of geological disposal, a probabilistic safety assessment code and other analysis codes, which can evaluate the probability of each event and its influence on the engineered and natural barriers, were introduced. We confirmed the basic functions of those codes and studied the relation between those functions and the FEP/PID which should be taken into consideration in safety assessment. We are planning to develop a 'Nuclide Migration Assessment System' for the purpose of improving the efficiency of assessment work, preventing human error in analysis, and assuring the quality of the analysis environment and analysis work for safety assessment. As the first step, we defined the system requirements and decided the system composition and the functions to be implemented based on those requirements. (author)

  17. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    Given that the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartment containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis of Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of 3 compartments including the reactor containment and RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses when considering the inherent conservatism in the CONTEMPT-LT code

  18. Development of simplified decommissioning cost estimation code for nuclear facilities

    International Nuclear Information System (INIS)

    Tachibana, Mitsuo; Shiraishi, Kunio; Ishigami, Tsutomu

    2010-01-01

    The simplified decommissioning cost estimation code for nuclear facilities (DECOST code) was developed in consideration of the features and structures of nuclear facilities and the similarity of dismantling methods. The DECOST code can calculate 8 evaluation items of decommissioning cost. Actual dismantling work in the Japan Atomic Energy Agency (JAEA) was analysed to evaluate the unit conversion factors used to calculate the manpower of dismantling activities; consequently, the unit conversion factors of general components could be classified into three kinds. The weights of the components and structures of a facility are necessary for the calculation of manpower, so methods for evaluating these weights were studied. It was found that the weight of components in a facility is proportional to the weight of the structures of the facility, and that the weight of the structures is proportional to the total floor area of the facility. The decommissioning costs of 7 nuclear facilities in the JAEA were calculated using the DECOST code. To verify the calculated results, the calculated manpower was compared with the manpower recorded in actual dismantling; the two were almost equal. The outline of the DECOST code, the evaluation results for the unit conversion factors, and the evaluation method for the weights of components and structures of a facility are described in this report. (author)
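
    The proportionality chain reported above (component weight proportional to structure weight, structure weight proportional to total floor area, manpower equal to a unit conversion factor times weight) can be sketched as follows; all coefficients are placeholders, not DECOST's evaluated factors:

```python
# Placeholder coefficients for the DECOST-style proportionality chain
# (assumed values, chosen only to make the arithmetic concrete):
FLOOR_TO_STRUCTURE = 1.5       # t of building structure per m^2 of floor
STRUCTURE_TO_COMPONENT = 0.2   # t of components per t of structure
UNIT_FACTOR = {"general": 0.05, "contaminated": 0.25}  # person-days per t

def dismantling_manpower(floor_area_m2, kind="general"):
    """Estimate dismantling manpower (person-days) from total floor area."""
    structure_t = FLOOR_TO_STRUCTURE * floor_area_m2
    component_t = STRUCTURE_TO_COMPONENT * structure_t
    return UNIT_FACTOR[kind] * (structure_t + component_t)

print(dismantling_manpower(1000.0))  # person-days for a 1000 m^2 facility
```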

  19. Development of a subchannel analysis code MATRA (Ver. α)

    International Nuclear Information System (INIS)

    Yoo, Y. J.; Hwang, D. H.

    1998-04-01

    A subchannel analysis code MATRA-α, an interim version of MATRA, has been developed to run on an IBM PC or HP workstation based on the existing CDC CYBER mainframe version of COBRA-IV-I. The MATRA code is a thermal-hydraulic analysis code based on the subchannel approach for calculating the enthalpy and flow distribution in fuel assemblies and reactor cores for both steady-state and transient conditions. MATRA-α has been provided with an improved structure, and with various functions and models to give a more convenient user environment and to increase the code accuracy. Among them, the pressure drop model has been improved to apply to non-square-lattice rod arrays, and the lateral transport models between adjacent subchannels have been improved to increase the accuracy in predicting two-phase flow phenomena. Also included in this report are detailed instructions for input data preparation and for auxiliary pre-processors to serve as a guide to those who want to use MATRA-α. In addition, we compared the predictions of MATRA-α with experimental data on the flow and enthalpy distribution in three sample rod-bundle cases to evaluate the performance of MATRA-α. All the results revealed that the predictions of MATRA-α were better than those of COBRA-IV-I. (author). 16 refs., 1 tab., 13 figs

  20. Health effects estimation code development for accident consequence analysis

    International Nuclear Information System (INIS)

    Togawa, O.; Homma, T.

    1992-01-01

    As part of a computer code system for nuclear reactor accident consequence analysis, two computer codes have been developed for estimating health effects expected to occur following an accident. Health effects models used in the codes are based on the models of NUREG/CR-4214 and are revised for the Japanese population on the basis of the data from the reassessment of the radiation dosimetry and information derived from epidemiological studies on atomic bomb survivors of Hiroshima and Nagasaki. The health effects models include early and continuing effects, late somatic effects and genetic effects. The values of some model parameters are revised for early mortality. The models are modified for predicting late somatic effects such as leukemia and various kinds of cancers. The models for genetic effects are the same as those of NUREG. In order to test the performance of one of these codes, it is applied to the U.S. and Japanese populations. This paper provides descriptions of health effects models used in the two codes and gives comparisons of the mortality risks from each type of cancer for the two populations. (author)

  1. Development of parallel Fokker-Planck code ALLAp

    International Nuclear Information System (INIS)

    Batishcheva, A.A.; Sigmar, D.J.; Koniges, A.E.

    1996-01-01

    We report on our ongoing development of the 3D Fokker-Planck code ALLA for a highly collisional scrape-off-layer (SOL) plasma. A SOL with strong gradients of density and temperature in the spatial dimension is modeled. Our method is based on a 3-D adaptive grid (in space, magnitude of the velocity, and cosine of the pitch angle) and a second order conservative scheme. Note that the grid size is typically 100 x 257 x 65 nodes. It was shown in our previous work that only these capabilities make it possible to benchmark a 3D code against a spatially-dependent self-similar solution of a kinetic equation with the Landau collision term. In the present work we show results of a more precise benchmarking against the exact solutions of the kinetic equation using a new parallel code ALLAp with an improved method of parallelization and a modified boundary condition at the plasma edge. We also report first results from the code parallelization using Message Passing Interface for a Massively Parallel CRI T3D platform. We evaluate the ALLAp code performance versus the number of T3D processors used and compare its efficiency against a Work/Data Sharing parallelization scheme and a workstation version

  2. Identification of Sparse Audio Tampering Using Distributed Source Coding and Compressive Sensing Techniques

    Directory of Open Access Journals (Sweden)

    Valenzise G

    2009-01-01

    In the past few years, a large number of techniques have been proposed to identify whether a multimedia content has been illegally tampered with or not. Nevertheless, very few efforts have been devoted to identifying which kind of attack has been carried out, especially because of the large amount of data required for this task. We propose a novel hashing scheme which exploits the paradigms of compressive sensing and distributed source coding to generate a compact hash signature, and we apply it to the case of audio content protection. The audio content provider produces a small hash signature by computing a limited number of random projections of a perceptual, time-frequency representation of the original audio stream; the audio hash is given by the syndrome bits of an LDPC code applied to the projections. At the content user side, the hash is decoded using distributed source coding tools. If the tampering is sparsifiable or compressible in some orthonormal basis or redundant dictionary, it is possible to identify the time-frequency position of the attack, with a hash size as small as 200 bits/second; the bit saving obtained by introducing distributed source coding ranges from 20% to 70%.
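
    The random-projection stage of such a hash can be sketched as below; the LDPC syndrome coding and the perceptual time-frequency front end described in the abstract are omitted, and all dimensions and signal values are illustrative:

```python
import random

def random_projection_hash(features, n_bits, seed=0):
    """Compact binary hash: signs of n_bits seeded random projections of a
    feature vector (standing in for a time-frequency representation).
    Sketches only the random-projection stage of the scheme; the LDPC
    syndrome coding described in the abstract is not implemented."""
    rng = random.Random(seed)  # shared seed so both sides use the same projections
    bits = []
    for _ in range(n_bits):
        proj = sum(rng.gauss(0.0, 1.0) * x for x in features)
        bits.append(1 if proj >= 0.0 else 0)
    return bits

def hamming(a, b):
    """Number of differing hash bits; small for perceptually similar content."""
    return sum(x != y for x, y in zip(a, b))

original = [0.5, -1.2, 0.3, 2.0, -0.7]
tampered = [0.5, -1.2, 5.0, 2.0, -0.7]   # sparse modification of one coefficient
h0 = random_projection_hash(original, 64)
h1 = random_projection_hash(tampered, 64)
print("hash distance:", hamming(h0, h1))
```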

  3. Development of fast and accurate Monte Carlo code MVP

    International Nuclear Information System (INIS)

    Mori, Takamasa

    2001-01-01

    The development work on the fast and accurate Monte Carlo code MVP started at JAERI in the late 1980s. From the beginning, the code was designed to utilize vector supercomputers, and it achieved computation speeds higher by a factor of 10 or more compared with conventional codes. In 1994, the first version of MVP was released together with cross section libraries based on JENDL-3.1 and JENDL-3.2. In 1996, a minor revision was made, adding several functions such as the treatment of ENDF-B6 file 6 data, time-dependent problems, and so on. Since 1996, several works have been carried out for the next version of MVP. The main ones are (1) the development of the continuous energy Monte Carlo burn-up calculation code MVP-BURN, (2) the development of a system to generate cross section libraries at arbitrary temperature, and (3) the study of error estimations and their biases in Monte Carlo eigenvalue calculations. This paper summarizes the main features of MVP, the results of recent studies and future plans for MVP. (author)

  4. The history and development of nonlinear stellar pulsation codes

    International Nuclear Information System (INIS)

    Davis, C.G.

    1987-01-01

    This review is limited to the history and development of nonlinear stellar pulsation codes and methods. The narrative includes examples of practical interest in the application of these numerical methods to problems in stellar pulsation such as Cepheid mass discrepancy, the delineation of the RR Lyrae instability strip, and the question of the development of double-mode pulsation as observed in Cepheids, RR Lyrae and other variable stars. 15 refs

  5. Development of the Multi-Phase/Multi-Dimensional Code BUBBLEX

    International Nuclear Information System (INIS)

    Lee, Sang Yong; Kim, Shin Whan; Kim, Eun Kee

    2005-01-01

    A test version of the two-fluid program has been developed by extending the PISO algorithm. Unlike conventional industry two-fluid codes such as RELAP5 and TRAC, this scheme does not need to build a pressure matrix. Instead, it adopts an iterative procedure to implement the implicitness of the pressure. In this paper, a brief introduction to the numerical scheme is presented, followed by its application to bubble column simulation. Some concluding remarks follow.

  6. Verification of WIMS-ANL to be used as supporting code for WIMS-CANDU development

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Dai Hai; Kim, Won Young; Park, Joo Hwan

    2007-08-15

    The lattice code WIMS-ANL has been tested in order to assess its qualification for use as a supporting code to aid the WIMS-CANDU development. A series of calculations has been performed to determine lattice physics parameters such as multiplication factors, isotopic number densities and coolant void reactivity. The WIMS-ANL results are compared with the predictions of WIMS-AECL/D4/D5 and PPV (POWDERPUFS-V), and the comparisons indicate that WIMS-ANL can be used not only as a supporting code to aid the WIMS-CANDU development, but also as a starting point for developing a detailed model that could delineate realistic situations as they might occur during a LOCA, such as an asymmetric flux distribution across the lattice cell.

  7. SCRIC: a code dedicated to the detailed emission and absorption of heterogeneous NLTE plasmas; application to xenon EUV sources

    International Nuclear Information System (INIS)

    Gaufridy de Dortan, F. de

    2006-01-01

    Nearly all spectral opacity codes for LTE and NLTE plasmas rely on approximate configuration modelling, or even supra-configuration modelling, for mid-Z plasmas. But in some cases configuration interaction (both relativistic and non-relativistic) induces dramatic changes in spectral shapes. We propose here a new detailed emissivity code with configuration mixing to allow for a realistic description of complex mid-Z plasmas. A collisional-radiative calculation, based on precise HULLAC energies and cross sections, determines the populations. Detailed emissivities and opacities are then calculated and the radiative transfer equation is solved for wide inhomogeneous plasmas. This code is able to cope rapidly with very large amounts of atomic data. It is therefore possible to use complex hydrodynamic files even on personal computers in a very limited time. We used this code for comparison with xenon EUV sources within the framework of nano-lithography developments. It appears that configuration mixing strongly shifts satellite lines and must be included in the description of these sources to enhance their efficiency. (author)

  8. SOURCES-3A: A code for calculating (α, n), spontaneous fission, and delayed neutron sources and spectra

    International Nuclear Information System (INIS)

    Perry, R.T.; Wilson, W.B.; Charlton, W.S.

    1998-04-01

    In many systems, it is imperative to have accurate knowledge of all significant sources of neutrons due to the decay of radionuclides. These sources can include neutrons resulting from the spontaneous fission of actinides, the interaction of actinide decay α-particles in (α,n) reactions with low- or medium-Z nuclides, and/or delayed neutrons from the fission products of actinides. Numerous systems exist in which these neutron sources could be important. These include, but are not limited to, clean and spent nuclear fuel (UO2, ThO2, MOX, etc.), enrichment plant operations (UF6, PuF4, etc.), waste tank studies, waste products in borosilicate glass or glass-ceramic mixtures, and weapons-grade plutonium in storage containers. SOURCES-3A is a computer code that determines neutron production rates and spectra from (α,n) reactions, spontaneous fission, and delayed neutron emission due to the decay of radionuclides in homogeneous media (i.e., a mixture of α-emitting source material and low-Z target material) and in interface problems (i.e., a slab of α-emitting source material in contact with a slab of low-Z target material). The code is also capable of calculating the neutron production rates due to (α,n) reactions induced by a monoenergetic beam of α-particles incident on a slab of target material. Spontaneous fission spectra are calculated with evaluated half-life, spontaneous fission branching, and Watt spectrum parameters for 43 actinides. The (α,n) spectra are calculated using an assumed isotropic angular distribution in the center-of-mass system with a library of 89 nuclide decay α-particle spectra, 24 sets of measured and/or evaluated (α,n) cross sections and product nuclide level branching fractions, and functional α-particle stopping cross sections for Z < 106. The delayed neutron spectra are taken from an evaluated library of 105 precursors. The code outputs the magnitude and spectra of the resultant neutron source. It also provides an
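
    A fission spectrum of the Watt form p(E) ∝ exp(-E/a)·sinh(√(bE)) can be sampled with the standard Everett-Cashwell rejection scheme; the a, b values below are typical evaluated thermal-fission parameters used only for illustration (SOURCES-3A carries its own evaluated Watt parameters for each of the 43 spontaneously fissioning actinides):

```python
import math
import random

def sample_watt(a=0.988, b=2.249, rng=random.random):
    """Sample a neutron energy (MeV) from a Watt spectrum
    p(E) ~ exp(-E/a) * sinh(sqrt(b*E)) via Everett-Cashwell rejection.
    a (MeV) and b (1/MeV) default to typical U-235 thermal-fission
    values, used here purely for illustration."""
    K = 1.0 + a * b / 8.0
    L = a * (K + math.sqrt(K * K - 1.0))
    M = L / a - 1.0
    while True:
        x = -math.log(rng())          # exponential deviate
        y = -math.log(rng())
        if (y - M * (x + 1.0)) ** 2 <= b * L * x:
            return L * x              # accepted energy in MeV

random.seed(1)
mean = sum(sample_watt() for _ in range(20000)) / 20000
print(f"sample mean energy: {mean:.2f} MeV")  # analytic mean is 3a/2 + a*a*b/4
```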

  9. Application of software to development of reactor-safety codes

    International Nuclear Information System (INIS)

    Wilburn, N.P.; Niccoli, L.G.

    1980-09-01

    Over the past two-and-a-half decades, the application of new techniques has reduced hardware cost for digital computer systems and increased computational speed by several orders of magnitude. A corresponding cost reduction in business and scientific software development has not occurred. The same situation is seen for software developed to model the thermohydraulic behavior of nuclear systems under hypothetical accident situations. In all cases this is particularly evident when costs over the total software life cycle are considered. A solution to this dilemma for reactor safety code systems has been demonstrated by applying the software engineering techniques developed over the last few years in the aerospace and business communities. These techniques have been applied recently, with a great deal of success, in four major projects at the Hanford Engineering Development Laboratory (HEDL): 1) a rewrite of a major safety code (MELT); 2) development of a new code system (CONACS) for description of the response of LMFBR containment to hypothetical accidents; and 3) development of two new modules for reactor safety analysis

  10. Time-dependent anisotropic distributed source capability in the transient 3-D transport code TORT-TD

    International Nuclear Information System (INIS)

    Seubert, A.; Pautz, A.; Becker, M.; Dagan, R.

    2009-01-01

    The transient 3-D discrete ordinates transport code TORT-TD has been extended to account for time-dependent anisotropic distributed external sources. The extension aims at the simulation of the pulsed neutron source in the YALINA-Thermal subcritical assembly. Since feedback effects are not relevant in this zero-power configuration, this offers a unique opportunity to validate the time-dependent neutron kinetics of TORT-TD with experimental data. The extensions made in TORT-TD to incorporate a time-dependent anisotropic external source are described. The steady state of the YALINA-Thermal assembly and its response to an artificial square-wave source pulse sequence have been analysed with TORT-TD using pin-wise homogenised cross sections in 18 prompt energy groups with P1 scattering order and 8 delayed neutron groups. The results demonstrate the applicability of TORT-TD to subcritical problems with a time-dependent external source. (authors)

  11. Imaging x-ray sources at a finite distance in coded-mask instruments

    International Nuclear Information System (INIS)

    Donnarumma, Immacolata; Pacciani, Luigi; Lapshov, Igor; Evangelista, Yuri

    2008-01-01

    We present a method for the correction of beam divergence in finite distance sources imaging through coded-mask instruments. We discuss the defocusing artifacts induced by the finite distance showing two different approaches to remove such spurious effects. We applied our method to one-dimensional (1D) coded-mask systems, although it is also applicable in two-dimensional systems. We provide a detailed mathematical description of the adopted method and of the systematics introduced in the reconstructed image (e.g., the fraction of source flux collected in the reconstructed peak counts). The accuracy of this method was tested by simulating pointlike and extended sources at a finite distance with the instrumental setup of the SuperAGILE experiment, the 1D coded-mask x-ray imager onboard the AGILE (Astro-rivelatore Gamma a Immagini Leggero) mission. We obtained reconstructed images of good quality and high source location accuracy. Finally we show the results obtained by applying this method to real data collected during the calibration campaign of SuperAGILE. Our method was demonstrated to be a powerful tool to investigate the imaging response of the experiment, particularly the absorption due to the materials intercepting the line of sight of the instrument and the conversion between detector pixel and sky direction.
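The cross-correlation step behind such coded-mask reconstruction can be sketched for an idealized far-field 1D case: the detector records the sky shifted through the mask pattern, and correlating with a balanced (+1 open / −1 closed) array recovers the source position. The random mask, array size, and source position below are toy assumptions, not the SuperAGILE pattern.

```python
import numpy as np

# Idealized far-field 1D coded-mask imaging on a cyclic aperture.
rng = np.random.default_rng(1)
n = 128
mask = rng.integers(0, 2, n)        # random open (1) / closed (0) mask elements
sky = np.zeros(n)
sky[30] = 100.0                     # a single point source at sky pixel 30

# Shadowgram: for a far-field point source the mask pattern is simply
# shifted by the source position and scaled by its flux.
det = 100.0 * np.roll(mask, 30)

g = 2 * mask - 1                    # balanced decoding array
img = np.array([det @ np.roll(g, t) for t in range(n)])  # cross-correlation
```

The reconstructed image peaks at the source position with height flux × (number of open elements); sidelobes average to zero for a random mask.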

  12. Hybrid digital-analog coding with bandwidth expansion for correlated Gaussian sources under Rayleigh fading

    Science.gov (United States)

    Yahampath, Pradeepa

    2017-12-01

    Consider communicating a correlated Gaussian source over a Rayleigh fading channel with no knowledge of the channel signal-to-noise ratio (CSNR) at the transmitter. In this case, a digital system cannot be optimal for a range of CSNRs. Analog transmission however is optimal at all CSNRs, if the source and channel are memoryless and bandwidth matched. This paper presents new hybrid digital-analog (HDA) systems for sources with memory and channels with bandwidth expansion, which outperform both digital-only and analog-only systems over a wide range of CSNRs. The digital part is either a predictive quantizer or a transform code, used to achieve a coding gain. The analog part uses linear encoding to transmit the quantization error, which improves the performance under CSNR variations. The hybrid encoder is optimized to achieve the minimum AMMSE (average minimum mean square error) over the CSNR distribution. To this end, analytical expressions are derived for the AMMSE of asymptotically optimal systems. It is shown that the outage CSNR of the channel code and the analog-digital power allocation must be jointly optimized to achieve the minimum AMMSE. In the case of HDA predictive quantization, a simple algorithm is presented to solve the optimization problem. Experimental results are presented for both Gauss-Markov sources and speech signals.
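The digital-plus-analog-residual principle can be illustrated with a toy scalar version: a memoryless unit-variance Gaussian source, a 2-bit quantizer whose index is assumed received error-free (i.e., the channel is above the code's outage CSNR), and the quantization error sent by linear analog transmission. This sketch ignores the power and channel uses spent on the digital bits, so it shows the mechanism only, not the paper's optimized system.

```python
import numpy as np

rng = np.random.default_rng(0)
n, csnr = 200_000, 10.0              # samples, channel SNR (linear)
x = rng.normal(size=n)               # unit-variance Gaussian source

# Digital part: a 2-bit quantizer, index assumed decoded error-free.
levels = np.array([-1.5, -0.5, 0.5, 1.5])
q = levels[np.argmin(np.abs(x[:, None] - levels), axis=1)]

# Analog part: scale the quantization error to unit power and send it raw.
e = x - q
g = 1.0 / np.sqrt(e.var())
y = g * e + rng.normal(scale=np.sqrt(1.0 / csnr), size=n)

a = np.cov(e, y)[0, 1] / y.var()     # linear MMSE coefficient for the residual
mse_hda = np.mean((x - (q + a * y)) ** 2)

# Reference: purely analog transmission of x itself.
y0 = x + rng.normal(scale=np.sqrt(1.0 / csnr), size=n)
mse_analog = np.mean((x - y0 * np.cov(x, y0)[0, 1] / y0.var()) ** 2)
```

Because the analog residual has much smaller variance than the source, the same channel noise costs far less distortion, which is the coding gain the hybrid scheme exploits.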

  13. Development and assessment of best estimate integrated safety analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)

    2007-03-15

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published.

  14. Development and assessment of best estimate integrated safety analysis code

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu

    2007-03-01

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published.

  15. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L × L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...

  16. Usability in open source software development

    DEFF Research Database (Denmark)

    Andreasen, M. S.; Nielsen, H. V.; Schrøder, S. O.

    2006-01-01

    Open Source Software (OSS) development has gained significant importance in the production of soft-ware products. Open Source Software developers have produced systems with a functionality that is competitive with similar proprietary software developed by commercial software organizations. Yet OSS...

  17. Development of a code for the isotopic analysis of Uranium

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Kang, M. Y.; Kim, Jinhyeong; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of)

    2013-05-15

    To strengthen the national nuclear nonproliferation regime by the establishment of a nuclear forensic system, techniques for nuclear material analysis and the categorization of important domestic nuclear materials are being developed. MGAU and FRAM are commercial software for the isotopic analysis of Uranium using γ-spectroscopy, but the diversity of detection geometries and some effects - self-attenuation, coincidence summing, etc. - call for an analysis tool under continual improvement and modification. Hence, the development of another code for HPGe γ- and x-ray spectrum analysis has been started in this study. The analysis of the 87-101 keV region of the Uranium spectrum is attempted, based on isotopic responses similar to those developed in MGAU. The code for the isotopic analysis of Uranium starts from a fitting.

  18. A plug-in to Eclipse for VHDL source codes: functionalities

    Science.gov (United States)

    Niton, B.; Poźniak, K. T.; Romaniuk, R. S.

    The paper presents an original application, written by the authors, which supports the writing and editing of source code in the VHDL language. It is a step towards fully automatic, augmented code writing for photonic and electronic systems, including systems based on FPGA and/or DSP processors. An implementation based on VEditor, a free-license program, is described; the work presented in this paper thus supplements and extends this free license. The introduction briefly characterizes the tools available on the market for aiding the design of electronic systems in VHDL. Particular attention is paid to plug-ins for the Eclipse environment and the Emacs program. Detailed properties of the plug-in are presented, such as the programming extension concept and the results of the activities of the formatter, re-factorizer, code hider, and other new additions to the VEditor program.

  19. Current status of ion source development

    International Nuclear Information System (INIS)

    Ishikawa, Junzo

    2001-01-01

    In this report, the current status of ion source development will be discussed. In September 2001, the 9th International Conference on Ion Sources (ICIS01) was held in Oakland, U.S.A. Referring to the talks presented at ICIS01, recent topics in the ion source research fields will be described. (author)

  20. Study of the source term of radiation of the CDTN GE-PET trace 8 cyclotron with the MCNPX code

    Energy Technology Data Exchange (ETDEWEB)

    Benavente C, J. A.; Lacerda, M. A. S.; Fonseca, T. C. F.; Da Silva, T. A. [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Av. Pte. Antonio Carlos 6627, 31270-901 Belo Horizonte, Minas Gerais (Brazil); Vega C, H. R., E-mail: jhonnybenavente@gmail.com [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas, Zac. (Mexico)

    2015-10-15

    Full text: The knowledge of the neutron spectra in a PET cyclotron is important for the optimization of radiation protection of the workers and individuals of the public. The main objective of this work is to study the source term of radiation of the GE-PET trace 8 cyclotron of the Development Center of Nuclear Technology (CDTN/CNEN) using computer simulation by the Monte Carlo method. The MCNPX version 2.7 code was used to calculate the flux of neutrons produced from the interaction of the primary proton beam with the target body and other cyclotron components during ¹⁸F production. The estimate of the source term and the corresponding radiation field was performed for the bombardment of a H₂¹⁸O target with protons with a current of 75 μA and an energy of 16.5 MeV. The values of the simulated fluxes were compared with those reported by the accelerator manufacturer (GE Healthcare). Results showed that the fluxes estimated with the MCNPX code were about 70% lower than those reported by the manufacturer. The mean energies of the neutrons also differed from those reported by GE Healthcare. It is recommended to investigate other cross section data and the use of the physical models of the code itself for a complete characterization of the source term of radiation. (Author)

  1. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in the source term estimations by a large computer code, such as MELCOR and MAAP, is an essential process of the current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM) based on input determined from a statistical design and the Latin hypercube sampling (LHS) technique for the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on the distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended to be used as a principal tool for an overall uncertainty analysis in source term quantifications, while using the LHS in the calculations of standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by RSM. Verification of the response surface model for its sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed which utilizes the metric distance obtained from the cdfs. The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first two cases the distributions are known analytically, while in the third case the distribution is unknown. The first case uses symmetric analytical distributions; the second consists of two asymmetric distributions of non-zero skewness.
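Of the sampling techniques compared above, Latin hypercube sampling is easy to sketch: each input's range is split into n equal-probability strata, one draw is taken per stratum, and the strata are permuted independently per dimension. The sketch below is generic, not tied to the MAAP3.0B study.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Draw n points in [0,1)^d with exactly one point per 1/n stratum per axis."""
    # one uniform draw inside each of the n equal strata, per dimension
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        # permute the strata independently per dimension to decouple the axes
        u[:, j] = u[rng.permutation(n), j]
    return u

samples = latin_hypercube(10, 3, np.random.default_rng(42))
```

Each unit-cube coordinate is then mapped through the inverse CDF of the corresponding input's distribution before running the code at the sampled points.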

  2. Development and verifications of fast reactor fuel design code ''Ceptar''

    International Nuclear Information System (INIS)

    Ozawa, T.; Nakazawa, H.; Abe, T.

    2001-01-01

    The annular fuel is very beneficial for fast reactors, because it can accommodate both high power and high burn-up. Concerning the irradiation behavior of the annular fuel, most annular pellets irradiated up to high burn-up showed shrinkage of the central hole due to deformation and restructuring of the pellets. It is necessary to predict precisely the shrinkage of the central hole during irradiation, because it has a great influence on power-to-melt. In this paper, an outline of the CEPTAR code (Calculation code to Evaluate fuel pin stability for annular fuel design), developed to meet this need, is presented. In this code, the radial profile of the fuel density is computed using a void migration model, and the law of conservation of mass defines the inner diameter. For the mechanical analysis, the fuel and cladding deformation caused by thermal expansion, swelling, and creep is computed by a stress-strain analysis using the plane-strain approximation. In addition, CEPTAR can also take into account the effect of the Joint-Oxide-Gain (JOG) which is observed in the fuel-cladding gap of high burn-up fuel. JOG acts to decrease the fuel swelling and to improve the gap conductance owing to the deposition of solid fission products. Based on post-irradiation data on PFR annular fuel, we developed an empirical model for JOG. For code verification, the thermal and mechanical data obtained from various irradiation tests and post-irradiation examinations were compared with the predictions of this code. In this study, the INTA (instrumented test assembly) test in JOYO, PTM (power-to-melt) tests in JOYO, EBR-II, FFTF and the MTR at the Harwell laboratory, and post-irradiation examinations on a number of PFR fuels were used as verification data. (author)

  3. Development of the criticality accident analysis code, AGNES

    International Nuclear Information System (INIS)

    Nakajima, Ken

    1989-01-01

    In design work for facilities which handle nuclear fuel, the evaluation of criticality accidents cannot be avoided even if their probability is negligibly small. In particular, in systems using solution fuel such as uranyl nitrate, the solution can easily take on a dangerous configuration, and all past criticality accidents have occurred with solutions; therefore, the evaluation of criticality accidents is the most important item of the safety analysis. When a criticality accident occurs in a solution fuel system, oscillations of the power output and pressure pulses are observed due to the generation and movement of radiolysis gas voids. In order to evaluate the consequences of criticality accidents, these power oscillations and pressure pulses must be calculated accurately. For this purpose, the dynamics code AGNES (Accidentally Generated Nuclear Excursion Simulation code) was developed. AGNES is a reactor dynamics code with two independent void models, a modified energy model and a pressure model. As a benchmark of the AGNES code, the results of the analysis of the CRAC experiments are reported. (K.I.)
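The excursion dynamics such a code resolves sit on top of standard point kinetics. As a hedged illustration, a one-delayed-group point-kinetics model with a step reactivity insertion can be integrated directly; the parameters are generic toy values, and AGNES's radiolysis-gas void feedback models are not represented here.

```python
import numpy as np  # imported for consistency with the other sketches

# One-delayed-group point kinetics:
#   dn/dt = ((rho - beta) / Lambda) * n + lam * c
#   dc/dt = (beta / Lambda) * n - lam * c
beta, lam, Lambda = 0.0065, 0.08, 1e-4   # delayed fraction, precursor decay (1/s), generation time (s)
rho = 0.5 * beta                         # step reactivity insertion, sub-prompt-critical

n = 1.0                                  # initial power (relative units)
c = beta * n / (Lambda * lam)            # precursor equilibrium at t = 0
dt = 1e-4
history = []
for _ in range(50_000):                  # 5 s of transient, explicit Euler
    dn = ((rho - beta) / Lambda * n + lam * c) * dt
    dc = (beta / Lambda * n - lam * c) * dt
    n, c = n + dn, c + dc
    history.append(n)
```

For this insertion the power first prompt-jumps to roughly β/(β−ρ) = 2 and then rises slowly on the delayed-neutron timescale; void feedback, which AGNES models, is what turns such a rise into the observed oscillations.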

  4. SCDAP/RELAP5/MOD3 code development and assessment

    International Nuclear Information System (INIS)

    Allison, C.M.; Heath, C.H.; Siefken, L.J.; Hohorst, J.K.

    1991-01-01

    The SCDAP/RELAP5/MOD3 computer code is designed to describe the overall reactor coolant system (RCS) thermal-hydraulic response, core damage progression, and fission product release and transport during severe accidents. The code is being developed at the Idaho National Engineering Laboratory (INEL) under the primary sponsorship of the Office of Nuclear Regulatory Research of the Nuclear Regulatory Commission (NRC). SCDAP/RELAP5/MOD3, created in January 1991, is the result of merging RELAP5/MOD3 with SCDAP and TRAP-MELT models from SCDAP/RELAP5/MOD2.5. The RELAP5 models calculate the overall RCS thermal-hydraulics, control system interactions, reactor kinetics, and the transport of noncondensible gases, fission products, and aerosols. The SCDAP models calculate the damage progression in the core structures; the formation, heatup, and melting of debris; and the creep rupture failure of the lower head and other RCS structures. The TRAP-MELT models calculate the deposition of fission products upon aerosols or structural surfaces; the formation, growth, or deposition of aerosols; and the evaporation of species from surfaces. The systematic assessment of modeling uncertainties in the SCDAP/RELAP5 code is currently underway. This assessment includes (a) the evaluation of code-to-data comparisons using stand-alone SCDAP and SCDAP/RELAP5/MOD3, (b) the estimation of modeling and experimental uncertainties, and (c) the determination of the influence of those uncertainties on predicted severe accident behavior.

  5. Recent Developments in the Code RITRACKS (Relativistic Ion Tracks)

    Science.gov (United States)

    Plante, Ianik; Ponomarev, Artem L.; Blattnig, Steve R.

    2018-01-01

    The code RITRACKS (Relativistic Ion Tracks) was developed to simulate detailed stochastic radiation track structures of ions of different types and energies. Many new capabilities were added to the code during the recent years. Several options were added to specify the times at which the tracks appear in the irradiated volume, allowing the simulation of dose-rate effects. The code has been used to simulate energy deposition in several targets: spherical, ellipsoidal and cylindrical. More recently, density changes as well as a spherical shell were implemented for spherical targets, in order to simulate energy deposition in walled tissue equivalent proportional counters. RITRACKS is used as a part of the new program BDSTracks (Biological Damage by Stochastic Tracks) to simulate several types of chromosome aberrations in various irradiation conditions. The simulation of damage to various DNA structures (linear and chromatin fiber) by direct and indirect effects has been improved and is ongoing. Many improvements were also made to the graphical user interface (GUI), including the addition of several labels allowing changes of units. A new GUI has been added to display the electron ejection vectors. The parallel calculation capabilities, notably the pre- and post-simulation processing on Windows and Linux machines, have been reviewed to make them more portable between different systems. The calculation part is currently maintained in an Atlassian Stash® repository for code tracking and possibly future collaboration.

  6. Survey of source code metrics for evaluating testability of object oriented systems

    OpenAIRE

    Shaheen , Muhammad Rabee; Du Bousquet , Lydie

    2010-01-01

    Software testing is costly in terms of time and funds. Testability is a software characteristic that aims at producing systems easy to test. Several metrics have been proposed to identify the testability weaknesses. But it is sometimes difficult to be convinced that those metrics are really related with testability. This article is a critical survey of the source-code based metrics proposed in the literature for object-oriented software testability. It underlines the necessity to provide test...

  7. Development of source term PIRT of Fukushima Daiichi NPPs accident

    International Nuclear Information System (INIS)

    Suehiro, S.; Okamoto, K.

    2017-01-01

    The severe accident evaluation committee of AESJ (Atomic Energy Society of Japan) developed the thermal hydraulic PIRT (Phenomena Identification and Ranking Table) and the source term PIRT based on findings during the Fukushima Daiichi NPPs accident. These PIRTs aim to clarify the debris distribution and the current conditions in the NPPs with high accuracy, and to identify the phenomena of higher priority for improving the analytical technology used to predict severe accident phenomena with codes. The source term PIRT was divided into 3 phases in the time domain and 9 categories in the spatial domain. A total of 68 phenomena were extracted, and their importance from the viewpoint of the source term was ranked through brainstorming and discussion. This paper describes the developed source term PIRT list and summarizes the highly ranked phenomena in each phase. (author)

  8. NEACRP comparison of source term codes for the radiation protection assessment of transportation packages

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Locke, H.F.; Avery, A.F.

    1994-01-01

    The results for Problems 5 and 6 of the NEACRP code comparison as submitted by six participating countries are presented in summary. These problems concentrate on the prediction of the neutron and gamma-ray sources arising in fuel after a specified irradiation, the fuel being uranium oxide for problem 5 and a mixture of uranium and plutonium oxides for problem 6. In both problems the predicted neutron sources are in good agreement for all participants. For gamma rays, however, there are differences, largely due to the omission of bremsstrahlung in some calculations

  9. Development of the biosphere code BIOMOD: final report

    International Nuclear Information System (INIS)

    Kane, P.

    1983-05-01

    Final report to DoE on the development of the biosphere code BIOMOD. The work carried out under the contract is itemised. Reference is made to the six documents issued along with the final report. These consist of two technical notes issued as interim consultative documents, a user's guide and a programmer's guide to BIOMOD, a database description, program test document and a technical note entitled ''BIOMOD - preliminary findings''. (author)

  10. Multi-rate control over AWGN channels via analog joint source-channel coding

    KAUST Repository

    Khina, Anatoly; Pettersson, Gustav M.; Kostina, Victoria; Hassibi, Babak

    2017-01-01

    We consider the problem of controlling an unstable plant over an additive white Gaussian noise (AWGN) channel with a transmit power constraint, where the signaling rate of communication is larger than the sampling rate (for generating observations and applying control inputs) of the underlying plant. Such a situation is quite common since sampling is done at a rate that captures the dynamics of the plant and which is often much lower than the rate that can be communicated. This setting offers the opportunity of improving the system performance by employing multiple channel uses to convey a single message (output plant observation or control input). Common ways of doing so are through either repeating the message, or by quantizing it to a number of bits and then transmitting a channel coded version of the bits whose length is commensurate with the number of channel uses per sampled message. We argue that such “separated source and channel coding” can be suboptimal and propose to perform joint source-channel coding. Since the block length is short we obviate the need to go to the digital domain altogether and instead consider analog joint source-channel coding. For the case where the communication signaling rate is twice the sampling rate, we employ the Archimedean bi-spiral-based Shannon-Kotel'nikov analog maps to show significant improvement in stability margins and linear-quadratic Gaussian (LQG) costs over simple schemes that employ repetition.

  11. Multi-rate control over AWGN channels via analog joint source-channel coding

    KAUST Repository

    Khina, Anatoly

    2017-01-05

    We consider the problem of controlling an unstable plant over an additive white Gaussian noise (AWGN) channel with a transmit power constraint, where the signaling rate of communication is larger than the sampling rate (for generating observations and applying control inputs) of the underlying plant. Such a situation is quite common since sampling is done at a rate that captures the dynamics of the plant and which is often much lower than the rate that can be communicated. This setting offers the opportunity of improving the system performance by employing multiple channel uses to convey a single message (output plant observation or control input). Common ways of doing so are through either repeating the message, or by quantizing it to a number of bits and then transmitting a channel coded version of the bits whose length is commensurate with the number of channel uses per sampled message. We argue that such “separated source and channel coding” can be suboptimal and propose to perform joint source-channel coding. Since the block length is short we obviate the need to go to the digital domain altogether and instead consider analog joint source-channel coding. For the case where the communication signaling rate is twice the sampling rate, we employ the Archimedean bi-spiral-based Shannon-Kotel'nikov analog maps to show significant improvement in stability margins and linear-quadratic Gaussian (LQG) costs over simple schemes that employ repetition.
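The Archimedean double-spiral map can be sketched in its simplest form: the source sample selects an angle along a spiral in the two-dimensional channel space, and decoding searches for the nearest point on the curve (a stand-in for ML decoding under AWGN). The arm spacing and stretch constants below are arbitrary toy choices, not the optimized values from the paper.

```python
import numpy as np

DELTA = 0.5   # spacing between spiral arms (toy value)
ALPHA = 6.0   # source-to-angle stretch factor (toy value)

def spiral_encode(x):
    """Map a scalar source sample to a 2D channel symbol on a double spiral."""
    theta = ALPHA * abs(x)
    r = (DELTA / np.pi) * theta
    if x >= 0:
        return np.array([r * np.cos(theta), r * np.sin(theta)])
    # negative samples go on the mirrored arm
    return np.array([-r * np.cos(theta), r * np.sin(theta)])

def spiral_decode(z, grid=np.linspace(-3, 3, 6001)):
    """Nearest-point search along the curve (brute-force ML under AWGN)."""
    pts = np.array([spiral_encode(x) for x in grid])
    return grid[np.argmin(np.sum((pts - z) ** 2, axis=1))]
```

The 1:2 bandwidth expansion comes from mapping one source sample to two channel uses; small channel noise moves the received point along the arm (graceful degradation), while noise comparable to the arm spacing causes the threshold jumps that the optimized designs trade off.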

  12. D-DSC: Decoding Delay-based Distributed Source Coding for Internet of Sensing Things.

    Science.gov (United States)

    Aktas, Metin; Kuscu, Murat; Dinc, Ergin; Akan, Ozgur B

    2018-01-01

    Spatial correlation between densely deployed sensor nodes in a wireless sensor network (WSN) can be exploited to reduce the power consumption through a proper source coding mechanism such as distributed source coding (DSC). In this paper, we propose the Decoding Delay-based Distributed Source Coding (D-DSC) to improve the energy efficiency of the classical DSC by employing the decoding delay concept, which enables the use of the maximum correlated portion of sensor samples during the event estimation. In D-DSC, the network is partitioned into clusters, where the clusterheads communicate their uncompressed samples carrying the side information, and the cluster members send their compressed samples. The sink performs joint decoding of the compressed and uncompressed samples and then reconstructs the event signal using the decoded sensor readings. Based on the observed degree of the correlation among sensor samples, the sink dynamically updates and broadcasts the varying compression rates back to the sensor nodes. Simulation results for the performance evaluation reveal that D-DSC can achieve reliable and energy-efficient event communication and estimation for practical signal detection/estimation applications with a massive number of sensors towards the realization of the Internet of Sensing Things (IoST).
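The syndrome trick at the heart of DSC can be sketched with a tiny binary example: using the parity-check matrix of the (7,4) Hamming code, a sensor sends 3 syndrome bits in place of 7 sample bits, and the sink corrects its side information wherever it disagrees in at most one position. This toy code stands in for the rate-adaptive codes a real D-DSC deployment would use.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column j holds the binary
# representation of j+1, so a single-bit error's syndrome spells its position.
H = np.array([[int(bit) for bit in format(j, "03b")] for j in range(1, 8)]).T

def dsc_encode(x):
    """Slepian-Wolf style encoder: send only the 3-bit syndrome of x."""
    return H @ x % 2

def dsc_decode(s, y):
    """Recover x from its syndrome and side information y (differs in <= 1 bit)."""
    d = (H @ y + s) % 2                           # syndrome of x XOR y
    xhat = y.copy()
    if d.any():
        pos = int("".join(map(str, d)), 2) - 1    # position of the disagreement
        xhat[pos] ^= 1
    return xhat

x = np.array([1, 0, 1, 1, 0, 0, 1])   # a sensor's 7-bit sample block
y = x.copy(); y[4] ^= 1               # sink's side information, one bit off
recovered = dsc_decode(dsc_encode(x), y)
```

The compression (3 bits instead of 7) is possible only because the sink's correlated observation does the rest of the work, which is exactly the spatial-correlation saving D-DSC exploits.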

  13. Scientific codes developed and used at GRS. Nuclear simulation chain

    Energy Technology Data Exchange (ETDEWEB)

    Schaffrath, Andreas; Sonnenkalb, Martin; Sievers, Juergen; Luther, Wolfgang; Velkov, Kiril [Gesellschaft fuer Anlagen und Reaktorsicherheit (GRS) gGmbH, Garching/Muenchen (Germany). Forschungszentrum

    2016-05-15

    Over 60 technical experts of the reactor safety research division of the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH are developing and validating reliable methods and computer codes - summarized under the term nuclear simulation chain - for the safety-related assessment of all types of nuclear power plants (NPP) and other nuclear facilities, considering the current state of science and technology. This nuclear simulation chain has to be able to simulate and assess all relevant physical processes and phenomena for all operating states and (severe) accidents. In the present contribution, the nuclear simulation chain developed and applied by GRS is presented, together with selected examples of its application. The latter impressively demonstrate the breadth of its scope and its performance. The GRS codes can be passed on request to other (national as well as international) organizations. This contributes to a worldwide increase in nuclear safety standards. The code transfer is especially important for developing and emerging countries lacking the financial means and/or the necessary know-how for this purpose. At the end of this contribution, the respective course of action is described.

  14. New developments with H-sources

    International Nuclear Information System (INIS)

    Sherman, Joseph D.; Rouleau, G.

    2002-01-01

    Existing spallation neutron source upgrades, planned spallation neutron sources, and high-energy accelerators for particle physics place demanding requirements on H- sources. These requirements ask for increased beam currents and duty factor (df) while generally maintaining state-of-the-art H- source emittance. A variety of H- sources are being developed to address these challenges. These include volume sources with and without the addition of cesium for enhanced H- production, increased-df cesiated H- Penning and magnetron sources, and cesiated surface converter H- sources. Surface films of tantalum metal for enhanced volume H- production are also being studied. Innovative plasma production techniques to address the longer df requirement without sacrificing H- source reliability and lifetime will be reviewed. The physical bases, the goals, and perceived challenges will be discussed.

  15. Basic Pilot Code Development for Two-Fluid, Three-Field Model

    International Nuclear Information System (INIS)

    Jeong, Jae Jun; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Hwang, M.; Ha, K. S.; Kang, D. H.

    2006-03-01

    A basic pilot code for a one-dimensional, transient, two-fluid, three-field model has been developed. Using 9 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below: - It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and transitions of the flow conditions. A mist flow was not simulated, but it seems that the basic pilot code can simulate mist flow conditions. - The pilot code was programmed so that the source terms of the governing equations and numerical solution schemes can be easily tested. - The mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. - It was confirmed that the inlet pressure and velocity boundary conditions work properly. - It was confirmed that, for single- and two-phase flows, the velocity and temperature of the non-existing phase are calculated as intended. - During the simulation of a two-phase flow, the calculation reaches a quasi-steady state with small-amplitude oscillations. The oscillations seem to be induced by some numerical causes. The research items for the improvement of the basic pilot code are listed in the last section of this report.

  16. Basic Pilot Code Development for Two-Fluid, Three-Field Model

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Jae Jun; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Hwang, M.; Ha, K. S.; Kang, D. H

    2006-03-15

    A basic pilot code for a one-dimensional, transient, two-fluid, three-field model has been developed. Using 9 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below: - It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and transitions of the flow conditions. A mist flow was not simulated, but it seems that the basic pilot code can simulate mist flow conditions. - The pilot code was programmed so that the source terms of the governing equations and numerical solution schemes can be easily tested. - The mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. - It was confirmed that the inlet pressure and velocity boundary conditions work properly. - It was confirmed that, for single- and two-phase flows, the velocity and temperature of the non-existing phase are calculated as intended. - During the simulation of a two-phase flow, the calculation reaches a quasi-steady state with small-amplitude oscillations. The oscillations seem to be induced by some numerical causes. The research items for the improvement of the basic pilot code are listed in the last section of this report.
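The mass-conservation verification mentioned in the records above can be illustrated with a generic sketch (this is not the pilot code's actual discretization): advect a single-phase density profile with a first-order upwind finite-volume scheme on a periodic domain and confirm that total mass is preserved to machine precision.

```python
import numpy as np

# First-order upwind advection on a periodic 1D grid; illustrative only.
n, cfl, steps = 200, 0.5, 400
x = np.linspace(0.0, 1.0, n, endpoint=False)
rho = 1.0 + np.exp(-((x - 0.5) ** 2) / 0.005)   # initial density profile
mass0 = rho.sum()

for _ in range(steps):
    flux = cfl * rho                             # upwind flux (velocity > 0)
    rho = rho - (flux - np.roll(flux, 1))        # conservative update

print(f"relative mass error: {abs(rho.sum() - mass0) / mass0:.2e}")
```

Because the update is written in flux form, the fluxes telescope over the periodic domain and the total mass change is zero up to floating-point roundoff, which is exactly the property such a verification test checks.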

  17. Development status of the lattice physics code in COSINE project

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y.; Yu, H.; Li, S.; Liu, Z.; Yan, Y. [State Nuclear Power Software Development Center, SNPTC, National Energy Key Laboratory of Nuclear Power Software NEKLS, North Third Ring Road, Beijing 100029 (China)

    2013-07-01

    LATC is an essential part of the COSINE code package, which stands for Core and System Integrated Engine for design and analysis. LATC performs 2D multi-group assembly transport calculations and generates the few-group constants and cross-section data required by CORE, the core simulator code. LATC is designed to have the capability of modeling AP1000-series assemblies. The development is a continuous improvement process: currently, LATC uses well-proven technology to achieve the key functions, and more advanced methods and modules will be implemented in the next stage. At present, WIMS and WIMS improved-format libraries can be read by the LATC code. For resonance calculation, equivalence theory with rational approximations is utilized. For transport calculation, two options are available: one choice is the collision probability method for cell homogenization with the discrete ordinates method for assembly homogenization; the other is the method of characteristics for assembly homogenization directly. For depletion calculation, an improved linear rate 'constant power' depletion method has been developed. (authors)

  18. Development status of the lattice physics code in COSINE project

    International Nuclear Information System (INIS)

    Chen, Y.; Yu, H.; Li, S.; Liu, Z.; Yan, Y.

    2013-01-01

    LATC is an essential part of the COSINE code package, which stands for Core and System Integrated Engine for design and analysis. LATC performs 2D multi-group assembly transport calculations and generates the few-group constants and cross-section data required by CORE, the core simulator code. LATC is designed to have the capability of modeling AP1000-series assemblies. The development is a continuous improvement process: currently, LATC uses well-proven technology to achieve the key functions, and more advanced methods and modules will be implemented in the next stage. At present, WIMS and WIMS improved-format libraries can be read by the LATC code. For resonance calculation, equivalence theory with rational approximations is utilized. For transport calculation, two options are available: one choice is the collision probability method for cell homogenization with the discrete ordinates method for assembly homogenization; the other is the method of characteristics for assembly homogenization directly. For depletion calculation, an improved linear rate 'constant power' depletion method has been developed. (authors)
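The constant-power depletion step described above solves a linear system of nuclide balance equations over a burnup interval at a fixed flux. A toy sketch of that idea, using an invented two-nuclide chain (the cross section, flux, and step length are illustrative, not LATC library data):

```python
import numpy as np
from scipy.linalg import expm

# Toy depletion step: dN/dt = A N with flux held constant over the step.
phi = 3.0e14          # one-group flux, n/cm^2/s (illustrative)
sigma_a = 600.0e-24   # absorption cross section, cm^2 (illustrative)
A = np.array([[-sigma_a * phi, 0.0],
              [ sigma_a * phi, 0.0]])   # parent depletes into a lumped product
N0 = np.array([1.0e21, 0.0])            # initial number densities, atoms/cm^3
dt = 30 * 86400                         # 30-day burnup step, s

N = expm(A * dt) @ N0                   # matrix-exponential solution
print(N)
```

The matrix exponential conserves the total number of atoms in this closed chain; real lattice codes use more elaborate chains and update the flux between steps to hold power constant.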

  19. Development and preliminary validation of flux map processing code MAPLE

    International Nuclear Information System (INIS)

    Li Wenhuai; Zhang Xiangju; Dang Zhen; Chen Ming'an; Lu Haoliang; Li Jinggang; Wu Yuanbao

    2013-01-01

    The independently developed flux map processing code MAPLE was developed by China General Nuclear Power Corporation (CGN). The weight coefficient method (WCM), polynomial expansion method (PEM) and thin plate spline (TPS) method were applied to fit the deviation between measured and predicted detector signals in the two-dimensional radial plane, and to interpolate or extrapolate the deviation at non-instrumented locations. Comparison of results in the test cases shows that the TPS method captures the information of curved fitting lines better than the other methods. The measured flux map data of the Lingao Nuclear Power Plant were processed using MAPLE as validation test cases, combined with the SMART code. Validation results show that the calculation results of MAPLE are reasonable and satisfactory. (authors)
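The TPS step described above can be sketched with SciPy's radial basis function interpolator, which supports a thin plate spline kernel. The detector coordinates and deviation values below are invented for illustration; MAPLE's actual fitting procedure is not public.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Deviations (measured - predicted detector signal) at instrumented
# radial positions; coordinates and values are made up for illustration.
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
dev = np.array([0.01, -0.02, 0.015, 0.0, -0.005])

tps = RBFInterpolator(xy, dev, kernel="thin_plate_spline")

# Interpolate/extrapolate the deviation at non-instrumented locations:
query = np.array([[0.25, 0.75], [0.9, 0.1]])
print(tps(query))
```

With zero smoothing (the default), the thin plate spline passes exactly through the instrumented points while minimizing a bending-energy measure, which is why it captures curved trends better than low-order polynomial fits.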

  20. Development of Educational SharePoint portal for coding students

    OpenAIRE

    Colomer Castelló, Gerard

    2016-01-01

    The project will explain what SharePoint is, why it is used and who uses it. It will also present some alternatives, compare them to SharePoint and discuss the pros and cons. The main objective of this project will be to develop an educational SharePoint portal. Using this portal, coding students will be able to share their solutions to different exercises, vote for the best solutions and comment on them. This portal will be developed using only software and servers obtained legally without any cost....

  1. An Expert System for the Development of Efficient Parallel Code

    Science.gov (United States)

    Jost, Gabriele; Chun, Robert; Jin, Hao-Qiang; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    We have built the prototype of an expert system to assist the user in the development of efficient parallel code. The system was integrated into the parallel programming environment that is currently being developed at NASA Ames. The expert system interfaces to tools for automatic parallelization and performance analysis. It uses static program structure information and performance data in order to automatically determine causes of poor performance and to make suggestions for improvements. In this paper we give an overview of our programming environment, describe the prototype implementation of our expert system, and demonstrate its usefulness with several case studies.

  2. Development of the SCHAMBETA code for scoping analysis of HCDA

    Energy Technology Data Exchange (ETDEWEB)

    Suk, Soo Dong; Hahn, D. H

    2000-06-01

    A computer code, SCHAMBETA (Scoping Code for HCDA Analysis using Modified Bethe-Tait Method), has been developed to investigate the core disassembly process following a meltdown accident in the framework of a modified Bethe-Tait method, as part of the scoping analysis work to demonstrate the inherent safety of conceptual designs of the Korea Advanced Liquid Metal Reactor (KALIMER), a 150 MWe pool-type sodium-cooled prototype fast reactor that uses U-Pu-Zr metallic fuel. The methodologies adopted in the code are particularly useful for performing various parametric studies to better understand the core disassembly process of liquid metal fast reactors, as well as for estimating upper-limit values of the energy release resulting from a power excursion. In the SCHAMBETA code, the core kinetics and hydraulic behavior of the KALIMER is followed over the period of the super-prompt-critical power excursion induced by the ramp reactivity insertion, starting at the time that the sodium-voided core reaches the melting temperature of the metallic fuel. For this purpose, equations of state relating pressure and energy density are derived for the saturated vapor as well as the solid/liquid phases of metallic uranium fuel, and implemented into the formulations of the disassembly reactivity. Mathematical formulations are then developed, in the framework of the modified Bethe-Tait method, in a form suitable for utilizing the improved equations of state as well as for considering Doppler effects, for scoping analysis of super-prompt-critical power excursions driven by a specified rate of reactivity insertion.
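The kind of super-prompt-critical excursion this record describes can be sketched with point kinetics above prompt critical: a ramp reactivity insertion drives an exponential power rise until an energy-dependent (Doppler-like) feedback shuts it down. All coefficients below are illustrative round numbers, not KALIMER design values, and delayed neutrons are neglected as is usual in Bethe-Tait scoping.

```python
import numpy as np
from scipy.integrate import solve_ivp

Lam = 1.0e-6    # prompt neutron generation time, s (illustrative)
ramp = 0.01     # insertion rate above prompt critical, dk/k per s (illustrative)
alpha = 1.0e-4  # shutdown feedback per unit accumulated energy (illustrative)

def rhs(t, y):
    power, energy = y
    rho_net = ramp * t - alpha * energy        # net reactivity above prompt critical
    return [rho_net / Lam * power, power]      # prompt kinetics; dE/dt = P

sol = solve_ivp(rhs, (0.0, 0.08), [1.0, 0.0], max_step=1e-4, rtol=1e-8)
power = sol.y[0]
print(f"peak power ratio ~ {power.max():.3g}, "
      f"terminal power ratio ~ {power[-1]:.3g}")
```

The excursion self-terminates when the accumulated-energy feedback overtakes the ramp, which is the mechanism a Bethe-Tait analysis uses to bound the energy release.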

  3. Report on FY15 alloy 617 code rules development

    Energy Technology Data Exchange (ETDEWEB)

    Sham, Sam [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jetter, Robert I [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hollinger, Greg [Becht Engineering Co., Inc., Liberty Corner, NJ (United States); Pease, Derrick [Becht Engineering Co., Inc., Liberty Corner, NJ (United States); Carter, Peter [Stress Engineering Services, Inc., Houston, TX (United States); Pu, Chao [Univ. of Tennessee, Knoxville, TN (United States); Wang, Yanli [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-09-01

    Due to its strength at very high temperatures, up to 950°C (1742°F), Alloy 617 is the reference construction material for structural components that operate at or near the outlet temperature of the very high temperature gas-cooled reactors. However, the current rules in the ASME Section III, Division 5 Subsection HB, Subpart B for the evaluation of strain limits and creep-fatigue damage using simplified methods based on elastic analysis have been deemed inappropriate for Alloy 617 at temperatures above 650°C (1200°F) (Corum and Brass, Proceedings of ASME 1991 Pressure Vessels and Piping Conference, PVP-Vol. 215, p.147, ASME, NY, 1991). The rationale for this exclusion is that at higher temperatures it is not feasible to decouple plasticity and creep, which is the basis for the current simplified rules. This temperature, 650°C (1200°F), is well below the temperature range of interest for this material for the high temperature gas-cooled reactors and the very high temperature gas-cooled reactors. The only current alternative is, thus, a full inelastic analysis requiring sophisticated material models that have not yet been formulated and verified. To address these issues, proposed code rules have been developed which are based on the use of elastic-perfectly plastic (EPP) analysis methods applicable to very high temperatures. The proposed rules for strain limits and creep-fatigue evaluation were initially documented in the technical literature (Carter, Jetter and Sham, Proceedings of ASME 2012 Pressure Vessels and Piping Conference, papers PVP 2012 28082 and PVP 2012 28083, ASME, NY, 2012), and have been recently revised to incorporate comments and simplify their application. Background documents have been developed for these two code cases to support the ASME Code committee approval process. These background documents for the EPP strain limits and creep-fatigue code cases are documented in this report.

  4. Application of the source term code package to obtain a specific source term for the Laguna Verde Nuclear Power Plant

    International Nuclear Information System (INIS)

    Souto, F.J.

    1991-06-01

    The main objective of the project was to use the Source Term Code Package (STCP) to obtain a specific source term for those accident sequences deemed dominant as a result of probabilistic safety analyses (PSA) for the Laguna Verde Nuclear Power Plant (CNLV). The following programme has been carried out to meet this objective: (a) implementation of the STCP, (b) acquisition of specific data for CNLV to execute the STCP, and (c) calculations of specific source terms for accident sequences at CNLV. The STCP has been implemented and validated on CDC 170/815 and CDC 180/860 mainframes as well as on a Micro VAX 3800 system. In order to get a plant-specific source term, data on the CNLV including initial core inventory, burn-up, primary containment structures, and materials used for the calculations have been obtained. Because STCP does not explicitly model containment failure, dry well failure in the form of a catastrophic rupture has been assumed. One of the most significant sequences from the point of view of possible off-site risk is the loss of off-site power with failure of the diesel generators and simultaneous loss of high pressure core spray and reactor core isolation cooling systems. The probability for that event is approximately 4.5 x 10^-6. This sequence has been analysed in detail and the release fractions of radioisotope groups are given in the full report. 18 refs, 4 figs, 3 tabs

  5. GRHydro: a new open-source general-relativistic magnetohydrodynamics code for the Einstein toolkit

    International Nuclear Information System (INIS)

    Mösta, Philipp; Haas, Roland; Ott, Christian D; Reisswig, Christian; Mundim, Bruno C; Faber, Joshua A; Noble, Scott C; Bode, Tanja; Löffler, Frank; Schnetter, Erik

    2014-01-01

    We present the new general-relativistic magnetohydrodynamics (GRMHD) capabilities of the Einstein toolkit, an open-source community-driven numerical relativity and computational relativistic astrophysics code. The GRMHD extension of the toolkit builds upon previous releases and implements the evolution of relativistic magnetized fluids in the ideal MHD limit in fully dynamical spacetimes using the same shock-capturing techniques previously applied to hydrodynamical evolution. In order to maintain the divergence-free character of the magnetic field, the code implements both constrained transport and hyperbolic divergence cleaning schemes. We present test results for a number of MHD tests in Minkowski and curved spacetimes. Minkowski tests include aligned and oblique planar shocks, cylindrical explosions, magnetic rotors, Alfvén waves and advected loops, as well as a set of tests designed to study the response of the divergence cleaning scheme to numerically generated monopoles. We study the code’s performance in curved spacetimes with spherical accretion onto a black hole on a fixed background spacetime and in fully dynamical spacetimes by evolutions of a magnetized polytropic neutron star and of the collapse of a magnetized stellar core. Our results agree well with exact solutions where these are available and we demonstrate convergence. All code and input files used to generate the results are available on http://einsteintoolkit.org. This makes our work fully reproducible and provides new users with an introduction to applications of the code. (paper)
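The monopole tests mentioned above check how well a scheme maintains div B = 0. A minimal numerical version of that check, on analytic test fields rather than Einstein Toolkit data, computes the divergence with central differences:

```python
import numpy as np

n = 32
x = np.linspace(-1.0, 1.0, n)
dx = x[1] - x[0]
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")

def divergence(Bx, By, Bz):
    """Central-difference divergence on a uniform Cartesian grid."""
    return (np.gradient(Bx, dx, axis=0)
            + np.gradient(By, dx, axis=1)
            + np.gradient(Bz, dx, axis=2))

div_clean = divergence(-Y, X, np.zeros_like(X))  # rotational field, div B = 0
div_bad = divergence(X, Y, Z)                    # "monopole" field, div B = 3

print(np.abs(div_clean).max(), div_bad.mean())
```

A production GRMHD code keeps the discrete divergence at roundoff via constrained transport, or damps and advects it away via hyperbolic divergence cleaning; this sketch only shows the diagnostic those schemes are judged by.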

  6. ASTEC V2. Overview of code development and application at GRS

    International Nuclear Information System (INIS)

    Reinke, N.; Nowack, H.; Sonnenkalb, M.

    2011-01-01

    The integral code ASTEC (Accident Source Term Evaluation Code) commonly developed since 1996 by the French IRSN and the German GRS is a fast running programme, which allows the calculation of entire sequences of severe accidents (SA) in light water reactors from the initiating event up to the release of fission products into the environment, thereby covering all important in-vessel and containment phenomena. Thus, the main ASTEC application fields are intended to be accident sequence studies, uncertainty and sensitivity studies, probabilistic safety analysis level 2 as well as support to experiments. The modular structure of ASTEC allows running each module independently and separately, e.g. for separate effects analyses as well as a combination of multiple modules for coupled effects testing and integral analyses. Subject of this paper is an overview of the new V2 series of the ASTEC code system and presentation of exemplary results for the application to severe accidents sequences at PWRs. (orig.)

  7. Four energy group neutron flux distribution in the Syrian miniature neutron source reactor using the WIMSD4 and CITATION code

    International Nuclear Information System (INIS)

    Khattab, K.; Omar, H.; Ghazi, N.

    2009-01-01

    A 3-D (R, θ, Z) neutronic model for the Miniature Neutron Source Reactor (MNSR) was developed earlier to conduct the reactor neutronic analysis. The group constants for all the reactor components were generated using the WIMSD4 code. The reactor excess reactivity and the four-group neutron flux distributions were calculated using the CITATION code. This model is used in this paper to calculate the point-wise four-energy-group neutron flux distributions in the MNSR versus the radius, angle and reactor axial directions. Good agreement is noticed between the measured and the calculated thermal neutron flux in the inner and the outer irradiation sites, with relative differences less than 7% and 5%, respectively. (author)

  8. Development of GUI systems for the MIDAS code

    International Nuclear Information System (INIS)

    Kim, K.R.; Park, S.H.; Kim, D.H.

    2004-01-01

    MIDAS is being developed at KAERI as an integrated severe accident analysis code, based on MELCOR, with modifications of existing models and additions of new models. MIDAS was restructured to avoid the pointer-based variable referencing style of MELCOR and to enhance memory efficiency using the dynamic allocation of Fortran 90. This paper describes recent activities in developing GUI environments for the MIDAS code at KAERI. Up to now, we have developed four PC-based subsystems: IEDIT, IPLOT, SATS and HyperKAMG. IEDIT is an input management system that can read MELCOR input files and display their information in window panels. Users can modify each item in the panel, and the input file will be modified according to those changes. IPLOT is a simple plotting system that can draw trend graphs of MIDAS plot variables. SATS is a severe accident training simulator that can display nuclear plant behavior graphically. Moreover, SATS provides several controllable pumps and valves that appear in severe accidents. With SATS and the online severe accident guidance HyperKAMG properly combined, severe accident mitigation scenarios can be presented graphically without any change of MELCOR inputs. These GUI systems have been developed as part of the severe accident management program package MIDAS. (author)

  9. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    needed for repository modeling are severely lacking. In addition, most existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. Accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements. Based on the gap analysis results, we have made the following recommendations for code selection and code development for the NEAMS waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.

  10. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    needed for repository modeling are severely lacking. In addition, most existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. Accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements. Based on the gap analysis results, we have made the following recommendations for code selection and code development for the NEAMS waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.
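The PA workflow described above, propagating parameter uncertainty through many model runs, can be illustrated with a toy Monte Carlo study. The response function and parameter distributions below are invented for illustration; a real PA would drive the actual simulation codes through a framework such as DAKOTA.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 2000  # hundreds to thousands of model evaluations

# Hypothetical uncertain inputs (distributions are illustrative):
k_sorb = rng.lognormal(mean=0.0, sigma=0.5, size=n_runs)  # sorption coefficient
thick = rng.uniform(1.0, 10.0, size=n_runs)               # barrier thickness, m

def peak_dose(k, d):
    """Hypothetical surrogate response: dose falls with sorption and thickness."""
    return 1.0e-3 * np.exp(-0.3 * d) / (1.0 + k)

doses = peak_dose(k_sorb, thick)
p50, p95 = np.percentile(doses, [50, 95])
print(f"median peak dose {p50:.2e}, 95th percentile {p95:.2e}")
```

The spread between the median and the upper percentile is the kind of uncertainty statement a PA must produce, which is why code speed and robustness over thousands of runs matter.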

  11. High current ion source development at Frankfurt

    Energy Technology Data Exchange (ETDEWEB)

    Volk, K.; Klein, H.; Lakatos, A.; Maaser, A.; Weber, M. [Frankfurt Univ. (Germany). Inst. fuer Angewandte Physik

    1995-11-01

    The development of high current positive and negative ion sources is an essential issue for the next generation of high current linear accelerators. In particular, the design of the European Spallation Source facility (ESS) and the International Fusion Material Irradiation Test Facility (IFMIF) have increased the significance of high brightness hydrogen and deuterium sources. As an example, for the ESS facility, two H{sup -}-sources each delivering a 70 mA H{sup -}-beam in 1.45 ms pulses at a repetition rate of 50 Hz are necessary. A low emittance is another important prerequisite. The source must operate, while meeting the performance requirements, with constancy and reliability over an acceptable period of time. The present paper summarizes the progress achieved in the development of ion sources producing intense, singly charged, positive and negative ion beams. (author) 16 figs., 7 refs.

  12. High current ion source development at Frankfurt

    International Nuclear Information System (INIS)

    Volk, K.; Klein, H.; Lakatos, A.; Maaser, A.; Weber, M.

    1995-01-01

    The development of high current positive and negative ion sources is an essential issue for the next generation of high current linear accelerators. In particular, the design of the European Spallation Source facility (ESS) and the International Fusion Material Irradiation Test Facility (IFMIF) have increased the significance of high brightness hydrogen and deuterium sources. As an example, for the ESS facility, two H- sources each delivering a 70 mA H- beam in 1.45 ms pulses at a repetition rate of 50 Hz are necessary. A low emittance is another important prerequisite. The source must operate, while meeting the performance requirements, with constancy and reliability over an acceptable period of time. The present paper summarizes the progress achieved in the development of ion sources producing intense, singly charged, positive and negative ion beams. (author) 16 figs., 7 refs

  13. The role of the PIRT process in identifying code improvements and executing code development

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1997-01-01

    In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a low probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost-effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications.

  14. The role of the PIRT process in identifying code improvements and executing code development

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G.E. [Idaho National Engineering Lab., Idaho Falls, ID (United States); Boyack, B.E. [Los Alamos National Lab., NM (United States)

    1997-07-01

    In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a "low" probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost-effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications.

  15. H- radio frequency source development at the Spallation Neutron Source.

    Science.gov (United States)

    Welton, R F; Dudnikov, V G; Gawne, K R; Han, B X; Murray, S N; Pennisi, T R; Roseberry, R T; Santana, M; Stockli, M P; Turvey, M W

    2012-02-01

The Spallation Neutron Source (SNS) now routinely operates nearly 1 MW of beam power on target with a highly persistent ∼38 mA peak current in the linac and an availability of ∼90%. H(-) beam pulses (∼1 ms, 60 Hz) are produced by a Cs-enhanced, multicusp ion source closely coupled with an electrostatic low energy beam transport (LEBT), which focuses the 65 kV beam into a radio frequency quadrupole accelerator. The source plasma is generated by RF excitation (2 MHz, ∼60 kW) of a copper antenna that has been encased with a thickness of ∼0.7 mm of porcelain enamel and immersed in the plasma chamber. The ion source and LEBT normally have a combined availability of ∼99%. Recent increases in duty factor and RF power have made antenna failures a leading cause of downtime. This report first identifies the physical mechanism of antenna failure from a statistical inspection of ∼75 antennas which ran at the SNS, scanning electron microscopy studies of antenna surfaces, cross-sectional cuts, and analysis of calorimetric heating measurements. Failure mitigation efforts are then described, which include modifying the antenna geometry and our acceptance/installation criteria. Progress and status of the development of the SNS external antenna source, a long-term solution to the internal antenna problem, are then discussed. Currently, this source is capable of delivering beam currents to the SNS comparable to those of the baseline source, and an earlier version briefly demonstrated unanalyzed currents up to ∼100 mA (1 ms, 60 Hz) on the test stand. In particular, this paper discusses plasma ignition (dc and RF plasma guns), antenna reliability, magnet overheating, and insufficient beam persistence.

  16. Development of a neutronic analysis code using data from Monju

    International Nuclear Information System (INIS)

    Rooijen, W.F.G. van; Yamano, N.; Shimazu, Y.

    2015-01-01

In recent years three major sets of modern evaluated nuclear data have become available: JENDL-4.0, JEFF-3.1.2 and ENDF/B-VII.1. The authors were involved with a research project to establish an analysis method for a future commercial-scale LMFBR. This project focused on JENDL-4.0 and conventional Japanese codes. As a cross check, we decided to also apply the fast reactor code ERANOS. This necessitated producing nuclear data (cross sections, etc.) for the ERANOS code system, as discussed in this paper. We developed a nuclear data processing system to produce cross sections, probability tables, delayed neutron data, and covariance data from the evaluated nuclear data files for ERANOS. A benchmark calculation on the MZA/MZB benchmark showed very satisfying results. Subsequently, we analyzed the prototype LMFBR Monju with ERANOS and our own sets of nuclear data. The results are very satisfactory. The results from ERANOS indicate that the target accuracies for nuclear data have not been met, although the three sets of evaluated nuclear data all performed very well in our analysis. In the future, the covariance on nuclear data should be reduced to meet the target accuracies on criticality and feedback coefficients. (author)

  17. Development of INCTAC code for analyzing criticality accident phenomena

    International Nuclear Information System (INIS)

    Mitake, Susumu; Hayashi, Yamato; Sakurai, Shungo

    2003-01-01

Aiming at understanding the nuclear transients and thermal-hydraulic phenomena of the criticality accident, a code named INCTAC has been newly developed at the Institute of Nuclear Safety. The code is applicable to the analysis of criticality accident transients in aqueous homogeneous fuel solution systems. The neutronic transient model is composed of equations for the kinetics and for the spatial distributions, which are deduced from the time-dependent multi-group transport equations under the quasi-steady-state assumption. The thermal-hydraulic transient model is composed of a complete set of mass, momentum and energy equations together with two-phase flow assumptions. Validation tests of INCTAC were performed using data obtained at TRACY, a transient criticality experiment facility of JAERI. The results calculated with INCTAC showed very good agreement with the experimental data, except for a slight discrepancy in the time at which the peak reactor power was attained. This discrepancy was resolved by using an adequate model for the movement and transfer of the void in the fuel solution, generated mostly by radiolysis. With a simulation model for the transport of radioactive materials through ventilation systems to the environment, INCTAC will be used as an overall safety evaluation code for the criticality accident. (author)

  18. Trust in Co-sourced Software Development

    DEFF Research Database (Denmark)

    Schlichter, Bjarne Rerup; Persson, John Stouby

    2014-01-01

Software development projects are increasingly geographically distributed through offshoring. Co-sourcing is a highly integrative and cohesive approach to software development offshoring that has been seen to succeed. However, research on how dynamic aspects of trust are shaped in co-sourcing activities is limited. The paper suggests how certain work practices among developers and managers can be explained using a dynamic trust lens based on Abstract Systems, especially dis- and re-embedding mechanisms.

  19. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

This Master's thesis was performed at CERN, specifically in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms, based on the CODESYS development tool, into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs are to be controlled by the Siemens SCADA system, WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and a software tool for automatic generation of the PLC code based on this library, called UAB. The integration aimed to provide a solution shared by both PLC platforms and was based on the PLCopen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and testing the behavior of the library code.

  20. Development Of The Computer Code For Comparative Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Purwadi, Mohammad Dhandhang

    2001-01-01

Qualitative and quantitative chemical analysis with Neutron Activation Analysis (NAA) is an important use of a nuclear research reactor, and its application and development should be accelerated and promoted to increase the utilization of the reactor. The application of the comparative NAA technique at the GA Siwabessy Multi Purpose Reactor (RSG-GAS) requires special (not yet commercially available) software for analyzing the spectra of multiple elements in a single analysis. Previously, the analysis was carried out using a single-spectrum analyzer and comparing each result manually, which significantly degrades the quality of the analysis. To solve this problem, a computer code was designed and developed for comparative NAA. Spectrum analysis in the code is carried out using a non-linear fitting method. Before being analyzed, the spectrum is passed through a numerical filter that improves the signal-to-noise ratio for the deconvolution operation. The software was developed using the G language and named PASAN-K. The developed software was benchmarked against the IAEA spectrum and operated well, with less than 10% deviation.
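The comparative method the abstract describes reduces to a ratio of decay-corrected peak areas between the sample and a co-irradiated standard of known composition. A minimal sketch of that arithmetic (the function and all values are illustrative, not taken from PASAN-K):

```python
import math

def comparative_naa(a_sample, a_standard, m_sample, m_standard,
                    w_standard, half_life, t_sample, t_standard):
    """Estimate an element's mass fraction by the comparative NAA method.

    a_*        : measured gamma peak areas (counts)
    m_*        : sample / standard masses (g)
    w_standard : known mass fraction of the element in the standard
    half_life  : half-life of the activation product (s)
    t_*        : decay times between end of irradiation and counting (s)
    """
    lam = math.log(2) / half_life
    # Decay-correct both peak areas back to the end of irradiation.
    a_s0 = a_sample * math.exp(lam * t_sample)
    a_r0 = a_standard * math.exp(lam * t_standard)
    # With identical irradiation and counting geometry, fluxes and
    # efficiencies cancel in the ratio of specific activities.
    return w_standard * (a_s0 / m_sample) / (a_r0 / m_standard)

# Example: equal decay times, sample peak twice the standard's per gram.
w = comparative_naa(2000.0, 1000.0, 1.0, 1.0, 0.05, 3600.0, 600.0, 600.0)
print(w)  # 0.10
```

With equal decay times the exponentials cancel and only the peak-area and mass ratios remain, which is why the comparative method avoids knowing the neutron flux or detector efficiency explicitly.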

  1. ON CODE REFACTORING OF THE DIALOG SUBSYSTEM OF CDSS PLATFORM FOR THE OPEN-SOURCE MIS OPENMRS

    Directory of Open Access Journals (Sweden)

    A. V. Semenets

    2016-08-01

The developer tools and software API of the open-source MIS OpenMRS are reviewed. The results of code refactoring of the dialog subsystem of the CDSS platform, implemented as a module for the open-source MIS OpenMRS, are presented. The information model of the CDSS dialog subsystem's database was updated in accordance with MIS OpenMRS requirements. The Model-View-Controller (MVC) based architecture of the CDSS dialog subsystem was re-implemented in the Java programming language using the Spring and Hibernate frameworks. The MIS OpenMRS Encounter portlet form for CDSS dialog subsystem integration was developed as an extension. The administrative module of the CDSS platform was recreated. The data exchange formats and methods for interaction between the OpenMRS CDSS dialog subsystem module and the DecisionTree GAE service were re-implemented using AJAX technology via the jQuery library.

  2. Open source software development : some historical perspectives

    NARCIS (Netherlands)

    Nuvolari, A.

    2005-01-01

In this paper we suggest that historical studies of technology can help us to account for some perplexing (at least for traditional economic reasoning) features of open source software development. From a historical perspective, open source software seems to be a particular case of what Robert C.

  3. Open source software development : some historical perspectives

    NARCIS (Netherlands)

    Nuvolari, A.

    2003-01-01

In this paper we suggest that historical studies of technology can help us to account for some perplexing (at least for traditional economic reasoning) features of open source software development. Viewed in historical perspective, open source software seems to be a particular case of what

  4. Optimal power allocation and joint source-channel coding for wireless DS-CDMA visual sensor networks

    Science.gov (United States)

    Pandremmenou, Katerina; Kondi, Lisimachos P.; Parsopoulos, Konstantinos E.

    2011-01-01

    In this paper, we propose a scheme for the optimal allocation of power, source coding rate, and channel coding rate for each of the nodes of a wireless Direct Sequence Code Division Multiple Access (DS-CDMA) visual sensor network. The optimization is quality-driven, i.e. the received quality of the video that is transmitted by the nodes is optimized. The scheme takes into account the fact that the sensor nodes may be imaging scenes with varying levels of motion. Nodes that image low-motion scenes will require a lower source coding rate, so they will be able to allocate a greater portion of the total available bit rate to channel coding. Stronger channel coding will mean that such nodes will be able to transmit at lower power. This will both increase battery life and reduce interference to other nodes. Two optimization criteria are considered. One that minimizes the average video distortion of the nodes and one that minimizes the maximum distortion among the nodes. The transmission powers are allowed to take continuous values, whereas the source and channel coding rates can assume only discrete values. Thus, the resulting optimization problem lies in the field of mixed-integer optimization tasks and is solved using Particle Swarm Optimization. Our experimental results show the importance of considering the characteristics of the video sequences when determining the transmission power, source coding rate and channel coding rate for the nodes of the visual sensor network.
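The mixed-integer problem described above (continuous transmit powers, discrete source-coding rates) can be attacked with a plain Particle Swarm Optimization in which the discrete dimensions are rounded at evaluation time. The sketch below uses a toy distortion model and made-up constants, not the paper's actual video-quality objective:

```python
import random

random.seed(0)

RATES = [0.5, 1.0, 2.0]      # allowed source-coding rates (discrete choices)
MOTION = [0.2, 1.0, 0.6]     # per-node scene-motion levels (toy values)
P_MIN, P_MAX = 0.1, 1.0      # transmit-power bounds (continuous)
N = len(MOTION)

def distortion(powers, rate_idx):
    # Toy quality model: high motion needs more source rate; low power
    # raises channel errors; other nodes' power adds CDMA interference.
    total = 0.0
    for i in range(N):
        rate = RATES[rate_idx[i]]
        interference = sum(powers) - powers[i]
        total += MOTION[i] / rate + 0.05 * (1 + interference) / powers[i]
    return total / N

def decode(x):
    # First N dimensions: clamp to power bounds; last N: round to rate index.
    powers = [min(P_MAX, max(P_MIN, x[i])) for i in range(N)]
    rate_idx = [min(len(RATES) - 1, max(0, round(x[N + i]))) for i in range(N)]
    return powers, rate_idx

def pso(iters=200, swarm=20):
    dim = 2 * N
    pos = [[random.uniform(0, 2) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_f = [distortion(*decode(p)) for p in pos]
    g = min(range(swarm), key=lambda k: pbest_f[k])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for k in range(swarm):
            for d in range(dim):
                vel[k][d] = (0.7 * vel[k][d]
                             + 1.5 * random.random() * (pbest[k][d] - pos[k][d])
                             + 1.5 * random.random() * (gbest[d] - pos[k][d]))
                pos[k][d] += vel[k][d]
            f = distortion(*decode(pos[k]))
            if f < pbest_f[k]:
                pbest[k], pbest_f[k] = pos[k][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[k][:], f
    return decode(gbest), gbest_f

(powers, rates), d = pso()
print(powers, [RATES[i] for i in rates], round(d, 3))
```

Rounding continuous particle positions onto the discrete rate grid is one common relaxation for mixed-integer PSO; the paper's own encoding may differ.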

  5. Integrating HCI Specialists into Open Source Software Development Projects

    Science.gov (United States)

    Hedberg, Henrik; Iivari, Netta

Typical open source software (OSS) development projects are organized around technically talented developers, whose communication is based on technical aspects and source code. Decision-making power is gained through proven competence and activity in the project, and non-technical end-user opinions are too often neglected. In addition, human-computer interaction (HCI) specialists have encountered difficulties in trying to participate in OSS projects, because there seems to be no clear authority and responsibility assigned to them. In this paper, based on the HCI and OSS literature, we introduce an extended OSS development project organization model that adds a new level of communication and roles for attending to the human aspects of software. The proposed model makes the presence of HCI specialists visible in projects, and promotes interaction between developers and HCI specialists in the course of a project.

  6. Personalized reminiscence therapy M-health application for patients living with dementia: Innovating using open source code repository.

    Science.gov (United States)

    Zhang, Melvyn W B; Ho, Roger C M

    2017-01-01

Dementia is known to be an illness which brings marked disability among elderly individuals. At times, patients living with dementia also experience non-cognitive symptoms, including hallucinations, delusional beliefs, emotional lability, sexualized behaviours and aggression. According to the National Institute of Clinical Excellence (NICE) guidelines, non-pharmacological techniques are typically the first-line option prior to consideration of adjuvant pharmacological options. Reminiscence and music therapy are thus viable options. Lazar et al. [3] previously performed a systematic review of the use of technology to deliver reminiscence-based therapy to individuals living with dementia and highlighted that technology does have benefits in the delivery of reminiscence therapy. To date, however, there has been a paucity of M-health innovations in this area, and most current innovations are not personalized for each person living with dementia. Prior research has highlighted the utility of open-source repositories in bioinformatics studies. The authors explain how they made use of an open-source code repository in the development of a personalized M-health reminiscence-therapy innovation for patients living with dementia. The availability of open-source code repositories has changed the way healthcare professionals and developers develop smartphone applications today. Conventionally, a long iterative process is needed in the development of a native application, mainly because of the need for native programming and coding, especially if the application needs interactive features or features that can be personalized. Such repositories enable rapid and cost-effective development of applications, and developers are also able to innovate further, as less time is spent in the iterative process.

  7. Chronos sickness: digital reality in Duncan Jones’s Source Code

    Directory of Open Access Journals (Sweden)

    Marcia Tiemy Morita Kawamoto

    2017-01-01

http://dx.doi.org/10.5007/2175-8026.2017v70n1p249 The advent of digital technologies has unquestionably affected cinema. The indexical relation to, and realistic effect of, the photographed world much praised by André Bazin and Roland Barthes is just one of the affected aspects. This article discusses cinema in light of the new digital possibilities, reflecting on Steven Shaviro's consideration of "how a nonindexical realism might be possible" (63) and how in fact a new kind of reality, a digital one, might emerge in the science fiction film Source Code (2013) by Duncan Jones.

  8. A Stigmergy Approach for Open Source Software Developer Community Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Beaver, Justin M [ORNL; Potok, Thomas E [ORNL; Pullum, Laura L [ORNL; Treadwell, Jim N [ORNL

    2009-01-01

The stigmergy collaboration approach provides a hypothesized explanation of how online groups work together. In this research, we present a stigmergy approach for building an agent-based simulation of open source software (OSS) developer community collaboration. We use groups of actors who collaborate on OSS projects as our frame of reference and investigate how the choices actors make in contributing their work to the projects determine the global status of the whole OSS project. In our simulation, forum posts and project code serve as the digital pheromone, and a modified Pierre-Paul Grasse pheromone model is used for computing the developer agents' behavior-selection probability.
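The Grasse-style pheromone mechanism the abstract refers to can be illustrated in a few lines: agents pick projects with probability proportional to a power of the accumulated trail, each contribution reinforces the chosen trail, and evaporation slowly forgets it. All constants here are illustrative, not taken from the ORNL simulation:

```python
import random

random.seed(1)

ALPHA, DEPOSIT, EVAPORATION = 2.0, 1.0, 0.05

def choice_probabilities(pheromone):
    # Grasse-style response: attraction grows nonlinearly with the trail.
    weights = [p ** ALPHA for p in pheromone]
    total = sum(weights)
    return [w / total for w in weights]

def simulate(n_projects=3, n_agents=50, steps=100):
    pheromone = [1.0] * n_projects       # forum posts + code act as the trail
    for _ in range(steps):
        probs = choice_probabilities(pheromone)
        for _ in range(n_agents):
            # Roulette-wheel selection over the projects.
            r, acc, pick = random.random(), 0.0, n_projects - 1
            for i, p in enumerate(probs):
                acc += p
                if r < acc:
                    pick = i
                    break
            pheromone[pick] += DEPOSIT   # each contribution reinforces
        pheromone = [(1 - EVAPORATION) * p for p in pheromone]
    return pheromone

trail = simulate()
print(max(trail) / sum(trail))
```

Because ALPHA > 1, small random head starts are amplified and activity tends to concentrate on a few projects, which is the positive-feedback effect stigmergy models are used to study.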

  9. Co-sourcing in software development offshoring

    DEFF Research Database (Denmark)

    Schlichter, Bjarne Rerup; Persson, John Stouby

    2013-01-01

Software development projects are increasingly geographically distributed through offshoring, which introduces complex risks that can lead to project failure. Co-sourcing is a highly integrative and cohesive approach to software development offshoring that has been seen to succeed. However, research on how co-sourcing shapes the perception and alleviation of common offshoring risks is limited. We present a case study of how a certified CMMI-level 5 Danish software supplier approaches these risks in offshore co-sourcing. The paper explains how common offshoring risks are perceived and alleviated when adopting the co-sourcing approach.

  10. SMILEI: A collaborative, open-source, multi-purpose PIC code for the next generation of super-computers

    Science.gov (United States)

    Grech, Mickael; Derouillat, J.; Beck, A.; Chiaramello, M.; Grassi, A.; Niel, F.; Perez, F.; Vinci, T.; Fle, M.; Aunai, N.; Dargent, J.; Plotnikov, I.; Bouchard, G.; Savoini, P.; Riconda, C.

    2016-10-01

Over the last decades, Particle-In-Cell (PIC) codes have been central tools for plasma simulations. Today, new trends in High-Performance Computing (HPC) are emerging, dramatically changing HPC-relevant software design and leaving some - if not most - legacy codes far below the level of performance expected on the new and future massively-parallel supercomputers. SMILEI is a new open-source PIC code co-developed by both plasma physicists and HPC specialists, and applied to a wide range of physics-related studies: from laser-plasma interaction to astrophysical plasmas. It benefits from an innovative parallelization strategy that relies on a super-domain decomposition allowing for enhanced cache use and efficient dynamic load balancing. Beyond these HPC-related developments, SMILEI also benefits from additional physics modules allowing it to deal with binary collisions, field and collisional ionization, and radiation back-reaction. This poster presents the SMILEI project, its HPC capabilities, and illustrates some of the physics problems tackled with SMILEI.

  11. Development and validation of computer codes for analysis of PHWR containment behaviour

    International Nuclear Information System (INIS)

    Markandeya, S.G.; Haware, S.K.; Ghosh, A.K.; Venkat Raj, V.

    1997-01-01

    In order to ensure that the design intent of the containment of Indian Pressurised Heavy Water Reactors (IPHWRs) is met, both analytical and experimental studies are being pursued at BARC. As a part of analytical studies, computer codes for predicting the behaviour of containment under various accident scenarios are developed/adapted. These include codes for predicting 1) pressure, temperature transients in the containment following either Loss of Coolant Accident (LOCA) or Main Steam Line Break (MSLB), 2) hydrogen behaviour in respect of its distribution, combustion and the performance of proposed mitigation systems, and 3) behaviour of fission product aerosols in the piping circuits of the primary heat transport system and in the containment. All these codes have undergone thorough validation using data obtained from in-house test facilities or from international sources. Participation in the International Standard Problem (ISP) exercises has also helped in validation of the codes. The present paper briefly describes some of these codes and the various exercises performed for their validation. (author)

  12. Development of a safety analysis code for molten salt reactors

    International Nuclear Information System (INIS)

    Zhang Dalin; Qiu Suizheng; Su Guanghui

    2009-01-01

The molten salt reactor (MSR), well suited to fulfilling the criteria defined by the Generation IV International Forum (GIF), is presently being revisited around the world because of several attractive features of renewed current relevance. MSRs are characterized by the use of fluid fuel, so their technologies are fundamentally different from those used in conventional solid-fuel reactors. In this work, attention is focused on the safety characteristics of MSRs: a point kinetics model accounting for the flow effects of the fuel salt is established and solved by a newly developed microcomputer code coupled with a simplified in-core heat transfer model. The established models and developed code are applied to analyze the safety characteristics of the molten salt actinide recycler and transmuter system (MOSART) by simulating three types of basic transient conditions: unprotected loss of flow, unprotected overcooling accident, and unprotected transient overpower. Reasonable results are obtained for the MOSART, showing that the MOSART conceptual design is inherently stable. The present study provides valuable information for the research and design of new-generation MSRs.
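The flow effect on point kinetics mentioned above is that delayed-neutron precursors are swept out of the core by the circulating fuel and re-enter after a loop transit, decayed en route. A one-delayed-group Euler sketch with illustrative constants (not MOSART data) shows the qualitative behavior:

```python
import math

# One-group point kinetics for a circulating-fuel core: precursors leave
# the core (residence time TAU_C) and re-enter after a loop transit TAU_L,
# decayed along the way. All constants are illustrative, not MOSART data.
BETA, LAMBDA_GEN, LAM = 0.0065, 1e-5, 0.08   # beta, Lambda (s), decay (1/s)
TAU_C, TAU_L, DT = 4.0, 8.0, 1e-4            # core/loop times (s), step (s)

def simulate(rho, t_end):
    n, c = 1.0, BETA / (LAMBDA_GEN * LAM)    # static steady state as start
    delay_steps = int(TAU_L / DT)
    history = [c] * delay_steps              # precursors in transit
    for k in range(int(t_end / DT)):
        c_back = history[k % delay_steps]    # re-entering, decayed in loop
        dn = ((rho - BETA) / LAMBDA_GEN * n + LAM * c) * DT
        dc = (BETA / LAMBDA_GEN * n - LAM * c
              - c / TAU_C + c_back * math.exp(-LAM * TAU_L) / TAU_C) * DT
        history[k % delay_steps] = c         # leaves core now, returns later
        n, c = n + dn, c + dc
    return n

print(simulate(0.0, 0.5))  # fuel circulation alone drains power below 1.0
```

The precursor loss and delayed, decayed re-entry act like a negative reactivity insertion, which is why circulating-fuel reactors have a reduced effective delayed-neutron fraction compared with static-fuel point kinetics.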

  13. Synchrotron light sources in developing countries

    Science.gov (United States)

    Mtingwa, Sekazi K.; Winick, Herman

    2018-03-01

    We discuss the role that synchrotron light sources, such as SESAME, could play in improving the socioeconomic conditions in developing countries. After providing a brief description of a synchrotron light source, we discuss the important role that they played in the development of several economically emerging countries. Then we describe the state of synchrotron science in South Africa and that country’s leadership role in founding the African Light Source initiative. Next, we highlight a new initiative called Lightsources for Africa, the Americas & Middle East Project, which is a global initiative led by the International Union of Pure and Applied Physics and the International Union of Crystallography, with initial funding provided by the International Council for Science. Finally, we comment on a new technology called the multibend achromat that has launched a new paradigm for the design of synchrotron light sources that should be attractive for construction in developing countries.

  14. Qualifying codes under software quality assurance: Two examples as guidelines for codes that are existing or under development

    Energy Technology Data Exchange (ETDEWEB)

    Mangold, D.

    1993-05-01

Software quality assurance is an area of concern for DOE, EPA, and other agencies due to the poor quality of software and its documentation they have received in the past. This report briefly summarizes the software development concepts and terminology increasingly employed by these agencies and provides a workable approach to scientific programming under the new requirements. Following this is a practical description of how to qualify a simulation code, based on a software QA plan that has been reviewed and officially accepted by DOE/OCRWM. Two codes have recently been baselined and qualified, so that they can be officially used for QA Level 1 work under the DOE/OCRWM QA requirements. One of them was baselined and qualified within one week. The first of the codes was the multi-phase multi-component flow code TOUGH version 1, an already existing code, and the other was a geochemistry transport code STATEQ that was under development. The way to accomplish qualification for both types of codes is summarized in an easy-to-follow step-by-step fashion to illustrate how to baseline and qualify such codes through a relatively painless procedure.

  15. Qualifying codes under software quality assurance: Two examples as guidelines for codes that are existing or under development

    International Nuclear Information System (INIS)

    Mangold, D.

    1993-05-01

Software quality assurance is an area of concern for DOE, EPA, and other agencies due to the poor quality of software and its documentation they have received in the past. This report briefly summarizes the software development concepts and terminology increasingly employed by these agencies and provides a workable approach to scientific programming under the new requirements. Following this is a practical description of how to qualify a simulation code, based on a software QA plan that has been reviewed and officially accepted by DOE/OCRWM. Two codes have recently been baselined and qualified, so that they can be officially used for QA Level 1 work under the DOE/OCRWM QA requirements. One of them was baselined and qualified within one week. The first of the codes was the multi-phase multi-component flow code TOUGH version 1, an already existing code, and the other was a geochemistry transport code STATEQ that was under development. The way to accomplish qualification for both types of codes is summarized in an easy-to-follow step-by-step fashion to illustrate how to baseline and qualify such codes through a relatively painless procedure.

  16. Domain-Specific Acceleration and Auto-Parallelization of Legacy Scientific Code in FORTRAN 77 using Source-to-Source Compilation

    OpenAIRE

    Vanderbauwhede, Wim; Davidson, Gavin

    2017-01-01

    Massively parallel accelerators such as GPGPUs, manycores and FPGAs represent a powerful and affordable tool for scientists who look to speed up simulations of complex systems. However, porting code to such devices requires a detailed understanding of heterogeneous programming tools and effective strategies for parallelization. In this paper we present a source to source compilation approach with whole-program analysis to automatically transform single-threaded FORTRAN 77 legacy code into Ope...

  17. A statistical–mechanical view on source coding: physical compression and data compression

    International Nuclear Information System (INIS)

    Merhav, Neri

    2011-01-01

    We draw a certain analogy between the classical information-theoretic problem of lossy data compression (source coding) of memoryless information sources and the statistical–mechanical behavior of a certain model of a chain of connected particles (e.g. a polymer) that is subjected to a contracting force. The free energy difference pertaining to such a contraction turns out to be proportional to the rate-distortion function in the analogous data compression model, and the contracting force is proportional to the derivative of this function. Beyond the fact that this analogy may be interesting in its own right, it may provide a physical perspective on the behavior of optimum schemes for lossy data compression (and perhaps also an information-theoretic perspective on certain physical system models). Moreover, it triggers the derivation of lossy compression performance for systems with memory, using analysis tools and insights from statistical mechanics
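For a concrete instance of the rate-distortion function in this analogy, the simplest memoryless case can be written down explicitly (a standard textbook result, not derived in the paper's chain model):

```latex
% Rate-distortion function of a Bernoulli(p) source under Hamming distortion:
\[
  R(D) \;=\; h(p) \;-\; h(D), \qquad 0 \le D \le \min(p,\,1-p),
\]
% where h is the binary entropy function
\[
  h(x) \;=\; -x\log x \;-\; (1-x)\log(1-x).
\]
% In the analogy above, the free-energy difference of the contracting chain
% is proportional to R(D), and the contracting force is proportional to its
% derivative R'(D) = -h'(D).
```

Since R(D) is convex and decreasing in D, the "force" analogue is monotone, mirroring how a polymer's restoring force grows as it is contracted further.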

  18. Windows Developer Power Tools Turbocharge Windows development with more than 170 free and open source tools

    CERN Document Server

    Avery, James

    2007-01-01

    Software developers need to work harder and harder to bring value to their development process in order to build high quality applications and remain competitive. Developers can accomplish this by improving their productivity, quickly solving problems, and writing better code. A wealth of open source and free software tools are available for developers who want to improve the way they create, build, deploy, and use software. Tools, components, and frameworks exist to help developers at every point in the development process. Windows Developer Power Tools offers an encyclopedic guide to m

  19. Developing improved MD codes for understanding processive cellulases

    International Nuclear Information System (INIS)

Crowley, M F; Nimlos, M R; Himmel, M E; Uberbacher, E C; Brooks III, C L; Walker, R C

    2008-01-01

    The mechanism of action of cellulose-degrading enzymes is illuminated through a multidisciplinary collaboration that uses molecular dynamics (MD) simulations and expands the capabilities of MD codes to allow simulations of enzymes and substrates on petascale computational facilities. There is a class of glycoside hydrolase enzymes called cellulases that are thought to decrystallize and processively depolymerize cellulose using biochemical processes that are largely not understood. Understanding the mechanisms involved and improving the efficiency of this hydrolysis process through computational models and protein engineering presents a compelling grand challenge. A detailed understanding of cellulose structure, dynamics and enzyme function at the molecular level is required to direct protein engineers to the right modifications or to understand if natural thermodynamic or kinetic limits are in play. Much can be learned about processivity by conducting carefully designed molecular dynamics (MD) simulations of the binding and catalytic domains of cellulases with various substrate configurations, solvation models and thermodynamic protocols. Most of these numerical experiments, however, will require significant modification of existing code and algorithms in order to efficiently use current (terascale) and future (petascale) hardware to the degree of parallelism necessary to simulate a system of the size proposed here. This work will develop MD codes that can efficiently use terascale and petascale systems, not just for simple classical MD simulations, but also for more advanced methods, including umbrella sampling with complex restraints and reaction coordinates, transition path sampling, steered molecular dynamics, and quantum mechanical/molecular mechanical simulations of systems the size of cellulose degrading enzymes acting on cellulose

  20. Development of EASYQAD version β: A Visualization Code System for QAD-CGGP-A Gamma and Neutron Shielding Calculation Code

    International Nuclear Information System (INIS)

    Kim, Jae Cheon; Lee, Hwan Soo; Ha, Pham Nhu Viet; Kim, Soon Young; Shin, Chang Ho; Kim, Jong Kyung

    2007-01-01

EASYQAD had previously been developed using the MATLAB GUI (Graphical User Interface) at Hanyang University in order to perform gamma and neutron shielding calculations conveniently, and had been completed as version α of the radiation shielding analysis code. In this study, EASYQAD was upgraded to version β with many additional functions and more user-friendly graphical interfaces. So that general users can run it on Windows XP without any MATLAB installation, this version was developed as a standalone code system.

  1. The FELIX program of experiments and code development

    International Nuclear Information System (INIS)

    Turner, L.R.

    1983-01-01

An experimental program and test bed called FELIX (Fusion Electromagnetic Induction Experiment), under construction at Argonne National Laboratory, is described. The facility includes: (a) a sizable constant field, analogous to a tokamak toroidal field or the confining field of a mirror reactor; (b) a pulsed field with a sizable rate of change, analogous to a pulsed poloidal field or to the changing field of a plasma disruption, perpendicular to the constant field; and (c) a volume large enough to ensure that large, complex test pieces can be tested, and that the forces, torques, currents, and field distortions which develop are large enough to be measured accurately. The development of the necessary computer codes and the experimental program are examined. (U.K.)

  2. Electron Storage Ring Development for ICS Sources

    Energy Technology Data Exchange (ETDEWEB)

    Loewen, Roderick [Lyncean Technologies, Inc., Palo Alto, CA (United States)

    2015-09-30

There is an increasing world-wide interest in compact light sources based on Inverse Compton Scattering. Development of these types of light sources includes leveraging the investment in accelerator technology first developed at DOE National Laboratories. Although these types of light sources cannot replace the larger user-supported synchrotron facilities, they offer attractive alternatives for many x-ray science applications. Fundamental research at the SLAC National Laboratory in the 1990s led to the idea of using laser-electron storage rings as a mechanism to generate x-rays with many properties of the larger synchrotron light facilities. This research led to a commercial spin-off of the technology. The SBIR project goal is to understand and improve the performance of the electron storage ring system of the commercially available Compact Light Source (CLS). The knowledge gained from studying a low-energy electron storage ring may also benefit other Inverse Compton Scattering (ICS) source development. Better electron storage ring performance is one of the key technologies necessary to extend the utility and breadth of applications of the CLS or related ICS sources. This grant includes a subcontract with SLAC for technical personnel and resources for modeling, feedback development, and related accelerator physics studies.
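The physics behind such compact sources can be sketched with the standard head-on Inverse Compton relation E_x ≈ 4γ²E_laser for the backscattered photon energy. The electron energy and laser wavelength below are illustrative assumptions, not specifications of the Compact Light Source.

```python
# Back-of-envelope Inverse Compton Scattering photon energy estimate.
# Beam and laser parameters are illustrative, not CLS specifications.

def ics_photon_energy_ev(electron_energy_mev, laser_wavelength_nm):
    gamma = electron_energy_mev / 0.511          # electron rest energy 0.511 MeV
    e_laser_ev = 1239.84 / laser_wavelength_nm   # laser photon energy in eV
    return 4.0 * gamma * gamma * e_laser_ev      # head-on backscatter peak energy

# A ~30 MeV storage ring colliding with a 1064 nm laser gives hard x-rays:
e_x = ics_photon_energy_ev(30.0, 1064.0)        # roughly 16 keV
```

This quadratic scaling in γ is why a low-energy (tens of MeV) ring, rather than a GeV-scale synchrotron, suffices to reach the hard-x-ray regime.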

  3. Development of nuclear battery using isotope sources

    International Nuclear Information System (INIS)

    Chang, Won Jun

    2004-02-01

Until now, the development of useful micro electromechanical systems has been held back because previous batteries (solar, chemical, etc.) did not satisfy the requirements related to power supply. At this point in time, the nuclear battery using isotope sources is emerging as the solution to this problem. A nuclear battery can provide superior output power and lifetime, so a new type of micro power source (nuclear battery) for micro electromechanical systems has been designed and analyzed. In this work, I designed three parts: the isotope source, the conversion device, and the shielding. I chose suitable sources and designed a semiconductor around the chosen isotope sources. Power is generated by radiation exciting electrons in the semiconductor depletion region, and the efficiency of the nuclear battery depends upon the pn-junction. In this study several conceptual nuclear batteries using radioactive materials are described with pn-junctions. For safety, I designed the shielding to protect the environment by reducing the kinetic energy of the beta particles.
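The achievable output of such a battery scales with source activity, mean beta energy, and the overall conversion efficiency set by the pn-junction. A rough sketch of that estimate, with illustrative isotope parameters and efficiency (not values from this thesis):

```python
# Rough betavoltaic output estimate: activity x mean beta energy x efficiency.
# Isotope values and the 5% conversion efficiency are illustrative assumptions.

EV_TO_J = 1.602e-19  # joules per electronvolt

def betavoltaic_power_w(activity_bq, mean_beta_energy_kev, efficiency):
    """Electrical power (W) from a beta source feeding a pn-junction converter."""
    return activity_bq * mean_beta_energy_kev * 1e3 * EV_TO_J * efficiency

# 1 GBq of a Ni-63-like emitter (mean beta ~17 keV) at 5% conversion:
p = betavoltaic_power_w(1e9, 17.0, 0.05)   # on the order of 0.1 microwatt
```

The sub-microwatt result illustrates why such batteries target long-lifetime, low-power MEMS loads rather than conventional electronics.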

  4. Development and application of the BOA code in Spain

    International Nuclear Information System (INIS)

    Tortuero Lopez, C.; Doncel Gutierrez, N.; Culebras, F.

    2012-01-01

The BOA code makes it possible to quantify the level of risk of Axial Offset Anomaly and of increased crud deposition on the basis of the specific conditions of each case. For this reason, the code is parameterized according to the individual characteristics of each plant. This paper summarizes the results obtained in applying the code, as well as its future perspectives.

  5. Time development of cascades by the binary collision approximation code

    International Nuclear Information System (INIS)

    Fukumura, A.; Ishino, S.; Sekimura, N.

    1991-01-01

To link molecular dynamics calculations to binary collision approximation codes for exploring high-energy cascade damage, the time between consecutive collisions is introduced into the binary collision code MARLOWE. Calculated results for gold with the modified code show the formation of sub-cascades and their spatial and temporal overlap, which can affect the formation of defect clusters. (orig.)
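The modification amounts to accumulating a free-flight time, segment length divided by recoil speed, between successive binary collisions. A minimal sketch of that bookkeeping (the path length, recoil energy, and use of a gold recoil are illustrative assumptions):

```python
import math

# Free-flight time between two binary collisions: t = d / v, with
# v = sqrt(2E/m) for a non-relativistic recoil. Gold (197 amu) assumed
# purely for illustration, matching the material in the abstract.

def flight_time_fs(path_length_nm, energy_ev, mass_amu=197.0):
    amu_kg = 1.66054e-27
    ev_j = 1.602e-19
    v = math.sqrt(2.0 * energy_ev * ev_j / (mass_amu * amu_kg))  # m/s
    return path_length_nm * 1e-9 / v * 1e15                      # femtoseconds

# A 1 keV gold recoil crossing ~0.3 nm between collisions:
t = flight_time_fs(0.3, 1000.0)   # roughly 10 fs
```

Summing such segments along each cascade branch is what gives the modified code the temporal overlap information discussed above.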

  6. PRIMUS: a computer code for the preparation of radionuclide ingrowth matrices from user-specified sources

    International Nuclear Information System (INIS)

    Hermann, O.W.; Baes, C.F. III; Miller, C.W.; Begovich, C.L.; Sjoreen, A.L.

    1984-10-01

The computer program PRIMUS reads a library of radionuclide branching fractions and half-lives and constructs a decay-chain data library and a problem-specific decay-chain data file. PRIMUS reads the decay data compiled for 496 nuclides from the Evaluated Nuclear Structure Data File (ENSDF). The ease of adding radionuclides to the input library allows the CRRIS system to further expand its comprehensive data base. The decay-chain library produced is input to the ANEMOS code. PRIMUS also produces a data set reduced to only the decay chains required in a particular problem, for input to the SUMIT, TERRA, MLSOIL, and ANDROS codes. Air concentrations and deposition rates are computed from the PRIMUS decay-chain data file. Source term data may be entered directly to PRIMUS to be read by MLSOIL, TERRA, and ANDROS. The decay-chain data prepared by PRIMUS are needed for a matrix-operator method that computes time-dependent decay products either from an initial concentration or from a constant input source. This document describes the input requirements and the output obtained. Sections are also included on methods, applications, subroutines, and sample cases. A short appendix indicates a method of utilizing PRIMUS and the associated decay subroutines from TERRA or ANDROS for applications to other decay problems. 18 references
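The matrix-operator idea referred to above propagates a vector of nuclide concentrations with the exponential of the decay matrix. A toy two-member chain sketch (the chain and half-lives are illustrative, not PRIMUS/ENSDF library data):

```python
import math

# Parent -> daughter chain with decay constants l1, l2. The decay matrix is
# M = [[-l1, 0], [l1, -l2]], and N(t) = expm(M*t) @ N(0). The exponential is
# evaluated here by a plain Taylor series, adequate for small |M*t|.

def decay_chain_2(n0_parent, l1, l2, t, terms=40):
    m = [[-l1, 0.0], [l1, -l2]]
    result = [[1.0, 0.0], [0.0, 1.0]]   # accumulates sum_k (M t)^k / k!
    power = [[1.0, 0.0], [0.0, 1.0]]    # current term (M t)^k / k!
    for k in range(1, terms):
        power = [[sum(power[i][r] * m[r][j] * t / k for r in range(2))
                  for j in range(2)] for i in range(2)]
        result = [[result[i][j] + power[i][j] for j in range(2)]
                  for i in range(2)]
    # Initial vector is [n0_parent, 0], so only the first column matters.
    return result[0][0] * n0_parent, result[1][0] * n0_parent

l1 = math.log(2) / 10.0   # parent half-life 10 (arbitrary time units)
l2 = math.log(2) / 2.0    # daughter half-life 2
parent, daughter = decay_chain_2(1.0, l1, l2, 10.0)  # parent -> 0.5 after one half-life
```

The same operator, applied repeatedly per time step, handles arbitrary chain lengths; production codes add a particular-solution term for the constant-input-source case.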

  7. RMG An Open Source Electronic Structure Code for Multi-Petaflops Calculations

    Science.gov (United States)

    Briggs, Emil; Lu, Wenchang; Hodak, Miroslav; Bernholc, Jerzy

RMG (Real-space Multigrid) is an open source, density functional theory code for quantum simulations of materials. It solves the Kohn-Sham equations on real-space grids, which allows for natural parallelization via domain decomposition. Either subspace or Davidson diagonalization, coupled with multigrid methods, is used to accelerate convergence. RMG is a cross-platform open source package which has been used in the study of a wide range of systems, including semiconductors, biomolecules, and nanoscale electronic devices. It can optionally use GPU accelerators to improve performance on systems where they are available. The recently released versions (>2.0) support multiple GPUs per compute node, have improved performance and scalability, and offer enhanced accuracy and support for additional hardware platforms. New versions of the code are regularly released at http://www.rmgdft.org. The releases include binaries for Linux, Windows and Macintosh systems, automated builds for clusters using cmake, as well as versions adapted to the major supercomputing installations and platforms. Several recent, large-scale applications of RMG will be discussed.

  8. ESE a 2D compressible multiphase flow code developed for MFCI analysis - code validation

    International Nuclear Information System (INIS)

    Leskovar, M.; Mavko, B.

    1998-01-01

ESE (Evaluation of Steam Explosions) is a general second-order accurate two-dimensional compressible multiphase flow computer code. It has been developed to model the interaction of molten core debris with water during the first premixing stage of a steam explosion. A steam explosion is a physical event which may occur during a severe reactor accident following core meltdown, when the molten fuel comes into contact with the coolant water. Since the exchanges of mass, momentum and energy are regime dependent, different exchange laws have been incorporated in ESE for the major flow regimes. With ESE a number of premixing experiments performed at Oxford University and at the QUEOS facility at Forschungszentrum Karlsruhe have been simulated. In these premixing experiments, different jets of spheres were injected into a water pool. The ESE validation plan was carefully chosen, starting from very simple, well-defined problems and gradually working up to more complicated ones. The results of the ESE simulations, which were compared to experimental data and also to first-order accurate calculations, are presented in the form of graphs. Most of the ESE results agree qualitatively as well as quantitatively reasonably well with the experimental data, and in general better than the results obtained with the first-order accurate calculation. (author)

  9. Intrinsic Motivation in Open Source Software Development

    DEFF Research Database (Denmark)

    Bitzer, J.; W., Schrettl,; Schröder, Philipp

    2004-01-01

This paper sheds light on the puzzling evidence that even though open source software (OSS) is a public good, it is developed for free by highly qualified, young and motivated individuals, and evolves at a rapid pace. We show that once OSS development is understood as the private provision...

  10. The FLUKA code for space applications Recent developments

    CERN Document Server

    Andersen, V; Battistoni, G; Campanella, M; Carboni, M; Cerutti, F; Empl, A; Fassò, A; Ferrari, A; Gadioli, E; Garzelli, M V; Lee, K; Ottolenghi, A; Pelliccioni, M; Pinsky, L S; Ranft, J; Roesler, S; Sala, P R; Wilson, T L

    2004-01-01

The FLUKA Monte Carlo transport code is widely used for fundamental research, radioprotection and dosimetry, hybrid nuclear energy system and cosmic ray calculations. The validity of its physical models has been benchmarked against a variety of experimental data over a wide range of energies, ranging from accelerator data to cosmic ray showers in the earth atmosphere. The code is presently undergoing several developments in order to better fit the needs of space applications. The generation of particle spectra according to up-to-date cosmic ray data as well as the effect of the solar and geomagnetic modulation have been implemented and already successfully applied to a variety of problems. The implementation of suitable models for heavy ion nuclear interactions has reached an operational stage. At medium/high energy FLUKA is using the DPMJET model. The major task of incorporating heavy ion interactions from a few GeV/n down to the threshold for inelastic collisions is also progressing and promising results h...

  11. Present status of transport code development based on Monte Carlo method

    International Nuclear Information System (INIS)

    Nakagawa, Masayuki

    1985-01-01

The present status of development of Monte Carlo codes is briefly reviewed. The main items are the following: application fields; methods used in Monte Carlo codes (geometry specification, nuclear data, estimators and variance reduction techniques) and unfinished work; typical Monte Carlo codes; and the merits of continuous-energy Monte Carlo codes. (author)
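Of the methods listed, variance reduction is the easiest to illustrate: a rare-event estimate converges far faster when sampling is biased toward the region of interest and reweighted by the likelihood ratio. A minimal importance-sampling sketch (the attenuation-style problem and all parameters are illustrative, not taken from any particular transport code):

```python
import math
import random

# Estimate the rare probability P(X > a) for X ~ Exp(1) (e.g. a particle
# penetrating 'a' mean free paths). The analog estimator almost never scores;
# importance sampling draws from a flatter exponential and reweights.

def naive_estimate(a, n, rng):
    hits = sum(1 for _ in range(n) if rng.expovariate(1.0) > a)
    return hits / n

def importance_estimate(a, n, rng, bias=0.2):
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(bias)                 # sample from rate-0.2 exponential
        if x > a:
            # likelihood ratio f(x)/g(x) = e^{-x} / (bias * e^{-bias x})
            total += math.exp(-x) / (bias * math.exp(-bias * x))
    return total / n

rng = random.Random(1)
exact = math.exp(-8.0)                            # about 3.4e-4
est = importance_estimate(8.0, 20000, rng)        # close to exact
```

With 20,000 histories the analog estimator typically scores only a handful of times, while the biased estimator reaches a few percent relative error.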

  12. Development of a best estimate auditing code for CANDU thermal hydraulic safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, B.D.; Lee, W.J.; Lim, H.S. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-04-01

The main purpose of this study is to develop a thermal hydraulic auditing code for the CANDU reactor by modifying the models of the existing PWR auditing tool, RELAP5/MOD3. This scope is the second step of the whole project and focuses on the implementation of CANDU models based on the previous study. The FORTRAN 90 language has been used for the development of the RELAP5/MOD3/CANDU PC version. For the convenience of previous workstation users, a FORTRAN 77 version has also been coded and merged into the original RELAP5 source file. The model implementation has been verified through simple verification calculations using the CANDU version. 6 refs., 15 figs., 7 tabs. (Author)

  13. Recent negative ion source developments at ORNL

    International Nuclear Information System (INIS)

    Alton, G.D.

    1979-01-01

According to specifications written for the 25 MV ORNL tandem accelerator, the ion source used during acceptance testing must be capable of producing a negative ion beam of intensity greater than or equal to 7.5 μA within a phase space of less than or equal to 1 π cm-mrad (MeV)^(1/2). The specifications were written prior to the development of an ion source with such capabilities, but fortunately Andersen and Tykesson introduced a source in 1975 which could easily meet the specified requirements. The remarkable beam intensity and quality of this source have motivated the development of other sources which utilize sputtering in the presence of a diffuse cesium plasma, some of which are described in these proceedings. This report describes results of studies associated with the development of a modified Aarhus-geometry source and an axial-geometry source, both of which utilize sputtering in the presence of a diffuse cesium plasma for the production of negative ion beams.

  14. Development of a dose assessment computer code for the NPP severe accident

    International Nuclear Information System (INIS)

    Cheong, Jae Hak

    1993-02-01

A real-time emergency dose assessment computer code called KEDA (KAIST NPP Emergency Dose Assessment) has been developed for NPP severe accidents. A new mathematical model which can calculate cloud shine has been developed and implemented in the code. KEDA considers specific Korean conditions (complex topography, Oriental thyroid metabolism, continuous washout, etc.) and provides dose-monitoring and automatic decision-making functions. To verify the code results, KEDA has been compared with an NRC officially certified code, RASCAL, for eight hypothetical accident scenarios. Through the comparison, KEDA has been proved to provide reasonable results. Qualitative sensitivity analysis has also been performed for six potentially important input parameters, and the trends of the dose versus down-wind distance curves have been analyzed by comparison with the physical phenomena occurring in the real atmosphere. The source term and meteorological conditions turned out to be the most important input parameters. KEDA has also been applied to simulate the Kori site, and a hypothetical accident with semi-real meteorological data has been simulated and analyzed.

  15. Further development of the computer code ATHLET-CD

    International Nuclear Information System (INIS)

    Weber, Sebastian; Austregesilo, Henrique; Bals, Christine; Band, Sebastian; Hollands, Thorsten; Koellein, Carsten; Lovasz, Liviusz; Pandazis, Peter; Schubert, Johann-Dietrich; Sonnenkalb, Martin

    2016-10-01

In the framework of the reactor safety research program sponsored by the German Federal Ministry for Economic Affairs and Energy (BMWi), the computer code system ATHLET/ATHLET-CD has been further developed as an analysis tool for the simulation of accidents in nuclear power plants with pressurized and boiling water reactors, as well as for the evaluation of accident management procedures. The main objective was to provide a mechanistic analysis tool for best-estimate calculations of transients, accidents, and severe accidents with core degradation in light water reactors. With the continued development, the capability of the code system has been largely improved, allowing best-estimate calculations of design-basis and beyond-design-basis accidents and the simulation of advanced core degradation with enhanced model scope in a reasonable calculation time. ATHLET comprises, inter alia, a 6-equation model, models for the simulation of non-condensable gases and tracking of boron concentration, as well as additional component and process models for complete system simulation. Among numerous model improvements, the code application has been extended to supercritical pressures. The mechanistic description of the dynamic development of flow regimes on the basis of a transport equation for the interfacial area has been further developed. This ATHLET version is completely integrated in ATHLET-CD. ATHLET-CD further comprises dedicated models for the simulation of fuel and control assembly degradation for both pressurized and boiling water reactors, debris beds with melting in the core region, as well as fission product and aerosol release and transport in the cooling system, including the decay of nuclide inventories and chemical reactions in the gas phase. The continued development also concerned the modelling of absorber material release, of melting, melt relocation and freezing, and of the interaction with the wall of the reactor pressure vessel. The following models were newly

  16. Methodology, status and plans for development and assessment of Cathare code

    Energy Technology Data Exchange (ETDEWEB)

    Bestion, D.; Barre, F.; Faydide, B. [CEA - Grenoble (France)

    1997-07-01

This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for developing and assessing the code. Analytical experiments with separate effect tests and component tests are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of each Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future developments of the code are presented. They concern the optimization of code performance through parallel computing - the code will be used for real-time full-scope plant simulators - the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the modeling for the 3-D model.

  17. Code of practice for the control and safe handling of radioactive sources used for therapeutic purposes (1988)

    International Nuclear Information System (INIS)

    1988-01-01

This Code is intended as a guide to safe practices in the use of sealed and unsealed radioactive sources and in the management of patients being treated with them. It covers the procedures for the handling, preparation and use of radioactive sources, precautions to be taken for patients undergoing treatment, storage and transport of radioactive sources within a hospital or clinic, and routine testing of sealed sources.

  18. Recent UCN source developments at Los Alamos

    International Nuclear Information System (INIS)

    Seestrom, S.J.; Anaya, J.M.; Bowles, T.J.

    1998-01-01

The most intense sources of ultra cold neutrons (UCN) have been built at reactors, where the high average thermal neutron flux can overcome the low UCN production rate to achieve usable densities of UCN. At spallation neutron sources the average flux available is much lower than at a reactor, though the peak flux can be comparable or higher. The authors have built a UCN source that attempts to take advantage of the high peak flux available at the short-pulse spallation neutron source at the Los Alamos Neutron Science Center (LANSCE) to generate a useful number of UCN. In the source, UCN are produced by Doppler-shifted Bragg scattering that converts 400-m/s neutrons down into the UCN regime. This source was initially tested in 1996, and various improvements were made based on the results of the 1996 running. These improvements were implemented and tested in 1997. In sections 2 and 3 they discuss the improvements that have been made and the resulting source performance. Recently an even more interesting concept was put forward by Serebrov et al. This involves combining a solid deuterium UCN source, previously studied by Serebrov et al., with a pulsed spallation source to achieve world-record UCN densities. They have initiated a program of calculations and measurements aimed at verifying the solid deuterium UCN source concept. The approach has been to develop an analytical capability, combine it with Monte Carlo calculations of neutron production, and perform benchmark experiments to verify the validity of the calculations. Based on the calculations and measurements, they plan to test a modified version of the Serebrov UCN factory. They estimate that they could produce over 1,000 UCN/cc in a 15 liter volume, using 1 microamp of 800 MeV protons for two seconds every 500 seconds. They will discuss the results of UCN production measurements in section 4.

  19. Development of a multispectral autoradiography using a coded aperture

    Science.gov (United States)

    Noto, Daisuke; Takeda, Tohoru; Wu, Jin; Lwin, Thet T.; Yu, Quanwen; Zeniya, Tsutomu; Yuasa, Tetsuya; Hiranaka, Yukio; Itai, Yuji; Akatsuka, Takao

    2000-11-01

Autoradiography is a useful imaging technique for understanding biological functions using tracers that include radioisotopes (RIs). However, it is not easy to describe the distributions of different kinds of tracers simultaneously by conventional autoradiography using X-ray film or an imaging plate. Each tracer describes a corresponding biological function; therefore, if we can simultaneously estimate the distributions of different kinds of tracer materials, multispectral autoradiography becomes a quite powerful tool for better understanding the physiological mechanisms of organs. We are therefore developing a system using a solid state detector (SSD) with high energy resolution. Here, we introduce an imaging technique with a coded aperture to acquire spatial and spectral information more efficiently. In this paper, the imaging principle is described, and its validity and fundamental properties are discussed through both simulations and phantom experiments with RIs such as 201Tl, 99mTc, 67Ga, and 123I.
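The coded-aperture principle behind such a system can be illustrated in one dimension: the source distribution is cyclically convolved with a pseudo-noise mask, and the mask's delta-like periodic autocorrelation lets a simple correlation step recover the scene. A toy sketch (the 7-element m-sequence mask and the test scene are illustrative choices, not the apparatus of this paper):

```python
# Toy 1D coded-aperture imaging: encode by cyclic convolution with a binary
# mask, decode by correlating with G = 2*mask - 1. For an m-sequence mask the
# periodic cross-correlation of mask and G is a scaled delta, so the
# noise-free reconstruction is exact.

MASK = [1, 1, 1, 0, 1, 0, 0]            # open (1) / opaque (0) aperture elements
G = [2 * m - 1 for m in MASK]           # decoding array: +1 / -1
N = len(MASK)

def encode(scene):
    """Detector counts: cyclic convolution of the scene with the mask."""
    return [sum(scene[j] * MASK[(k - j) % N] for j in range(N)) for k in range(N)]

def decode(detector):
    """Correlate with G and normalize by the number of open elements."""
    ones = sum(MASK)
    return [sum(detector[k] * G[(k - j) % N] for k in range(N)) / ones
            for j in range(N)]

scene = [0, 0, 5, 0, 2, 0, 0]           # two point sources
recovered = decode(encode(scene))       # matches the scene exactly (noise-free)
```

The multiplexing advantage over a single pinhole is that every detector element sees most of the source, which matters for the weak activities typical of autoradiography.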

  20. Recent development of three-dimensional piping code SHAPS

    International Nuclear Information System (INIS)

    Wang, C.Y.; Zeuch, W.R.

    1985-01-01

    This paper describes the recent development of the three-dimensional, structural, and hydrodynamic analysis piping code SHAPS. Several new features have been incorporated into the program, including (1) an elbow hydrodynamic model for analyzing the effect of global motion on the pressure-wave propagation, (2) a component hydrodynamic model for treating fluid motion in the vicinity of rigid obstacles and baffle plates, (3) the addition of the implicit time integration scheme in the structural-dynamic analysis, (4) the option of an implicit-implicit fluid-structural linking scheme, and (5) provisions for two constitutive equations for materials under various loading conditions. Sample problems are given to illustrate these features. Their results are discussed in detail. 7 refs., 8 figs

  1. Analyses to support development of risk-informed separation distances for hydrogen codes and standards.

    Energy Technology Data Exchange (ETDEWEB)

    LaChance, Jeffrey L.; Houf, William G. (Sandia National Laboratories, Livermore, CA); Fluer, Inc., Paso Robels, CA; Fluer, Larry (Fluer, Inc., Paso Robels, CA); Middleton, Bobby

    2009-03-01

The development of a set of safety codes and standards for hydrogen facilities is necessary to ensure they are designed and operated safely. To help ensure that a hydrogen facility meets an acceptable level of risk, code and standard development organizations are utilizing risk-informed concepts in developing hydrogen codes and standards.

  2. The History of Cartographic Sources Development

    Directory of Open Access Journals (Sweden)

    L. Volkotrub

    2016-07-01

Cartographic sources are a variety of descriptive sources. They include historical and geographical maps and circuit maps. Map images are a special kind of model of real phenomena that conveys their quantitative and qualitative characteristics, structure, interconnections and dynamics in graphic form. The prototypes of maps appeared as a way of transmitting information about the world; people began to use this way of communication long before the appearance of writing. The quality of mapped images matched the evolution of techniques and methods of mapping and publishing. The general development of cartographic sources is determined primarily by three factors: the development of science and technology, the needs of society for different cartographic works, and the political and economic situation of the country. Given this, the map is a self-sufficient phenomenon, and its source-expert study is based on understanding the invariance of its perception. Modern theoretical concepts show the invariance of maps. Specifically, a map is viewed in the following aspects: (1) it is one of the universal models of the land and of existing natural and social processes; (2) it is one of the tools of research and forecasting; (3) it is a specific language formation; (4) it is a method of transferring information. As a source, a map may contain important information about the physical geography, geology, hydrology, political-administrative division, population, flora and fauna of a particular area in a particular period. Mostly, cartographic sources are complex, because they contain a lot of cognitive and historical information.

  3. Development of multi-physics code systems based on the reactor dynamics code DYN3D

    Energy Technology Data Exchange (ETDEWEB)

    Kliem, Soeren; Gommlich, Andre; Grahn, Alexander; Rohde, Ulrich [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany); Schuetze, Jochen [ANSYS Germany GmbH, Darmstadt (Germany); Frank, Thomas [ANSYS Germany GmbH, Otterfing (Germany); Gomez Torres, Armando M.; Sanchez Espinoza, Victor Hugo [Karlsruher Institut fuer Technologie (KIT), Eggenstein-Leopoldshafen (Germany)

    2011-07-15

The reactor dynamics code DYN3D has been coupled with the CFD code ANSYS CFX and the 3D thermal hydraulic core model FLICA4. In the coupling with ANSYS CFX, DYN3D calculates the neutron kinetics and the fuel behavior, including the heat transfer to the coolant. The physical data interface between the codes is the volumetric heat release rate into the coolant. In the coupling with FLICA4, only the neutron kinetics module of DYN3D is used; fluid dynamics and related transport phenomena in the reactor's coolant, as well as fuel behavior, are calculated by FLICA4. The correctness of the coupling of DYN3D with both thermal hydraulic codes was verified by the calculation of different test problems. These test problems were set up in such a way that comparison with the DYN3D stand-alone code was possible. This included steady-state and transient calculations of a mini-core consisting of nine real-size PWR fuel assemblies with ANSYS CFX/DYN3D, as well as mini-core and full-core steady-state calculations using FLICA4/DYN3D. (orig.)

  4. Development of multi-physics code systems based on the reactor dynamics code DYN3D

    International Nuclear Information System (INIS)

    Kliem, Soeren; Gommlich, Andre; Grahn, Alexander; Rohde, Ulrich; Schuetze, Jochen; Frank, Thomas; Gomez Torres, Armando M.; Sanchez Espinoza, Victor Hugo

    2011-01-01

The reactor dynamics code DYN3D has been coupled with the CFD code ANSYS CFX and the 3D thermal hydraulic core model FLICA4. In the coupling with ANSYS CFX, DYN3D calculates the neutron kinetics and the fuel behavior, including the heat transfer to the coolant. The physical data interface between the codes is the volumetric heat release rate into the coolant. In the coupling with FLICA4, only the neutron kinetics module of DYN3D is used; fluid dynamics and related transport phenomena in the reactor's coolant, as well as fuel behavior, are calculated by FLICA4. The correctness of the coupling of DYN3D with both thermal hydraulic codes was verified by the calculation of different test problems. These test problems were set up in such a way that comparison with the DYN3D stand-alone code was possible. This included steady-state and transient calculations of a mini-core consisting of nine real-size PWR fuel assemblies with ANSYS CFX/DYN3D, as well as mini-core and full-core steady-state calculations using FLICA4/DYN3D. (orig.)

  5. Detecting Source Code Plagiarism on .NET Programming Languages using Low-level Representation and Adaptive Local Alignment

    Directory of Open Access Journals (Sweden)

    Oscar Karnalim

    2017-01-01

Even though there are various source code plagiarism detection approaches, only a few works focus on low-level representation for deducing similarity; most consider only the lexical token sequence extracted from source code. In our view, a low-level representation is more beneficial than lexical tokens since its form is more compact than the source code itself: it considers only semantics-preserving instructions and ignores many source code delimiter tokens. This paper proposes a source code plagiarism detection approach that relies on low-level representation. As a case study, we focus our work on .NET programming languages with the Common Intermediate Language as the low-level representation. In addition, we incorporate Adaptive Local Alignment for detecting similarity. According to Lim et al., this algorithm outperforms the state-of-the-art code similarity algorithm (i.e., Greedy String Tiling) in terms of effectiveness. According to our evaluation, which involves various plagiarism attacks, our approach is more effective and efficient than the standard lexical-token approach.
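The alignment step can be illustrated with a plain Smith-Waterman-style local alignment over opcode-like token streams. The scoring values and the CIL-like tokens below are illustrative; this is the classic algorithm, not the adaptive variant used in the cited work:

```python
# Local alignment (Smith-Waterman style) over token sequences: the highest
# scoring cell of the DP table measures the best shared contiguous-ish region,
# tolerating small edits and gaps. Scoring parameters are illustrative.

def local_alignment_score(a, b, match=2, mismatch=-1, gap=-1):
    rows, cols = len(a) + 1, len(b) + 1
    h = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            h[i][j] = max(0,                      # local alignment may restart
                          h[i - 1][j - 1] + s,    # match / mismatch
                          h[i - 1][j] + gap,      # gap in b
                          h[i][j - 1] + gap)      # gap in a
            best = max(best, h[i][j])
    return best

# Two CIL-like opcode streams sharing a copied region despite small edits:
p1 = ["ldarg.0", "ldarg.1", "add", "stloc.0", "ldloc.0", "ret"]
p2 = ["nop", "ldarg.0", "ldarg.1", "add", "stloc.0", "ret"]
score = local_alignment_score(p1, p2)   # 4 matches, 1 gap, 1 more match -> 9
```

Normalizing such a score by sequence length gives a similarity measure that is robust to the token insertions and reorderings typical of plagiarism attacks.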

  6. Development of general-purpose particle and heavy ion transport monte carlo code

    International Nuclear Information System (INIS)

    Iwase, Hiroshi; Nakamura, Takashi; Niita, Koji

    2002-01-01

The high-energy particle transport code NMTC/JAM, which has been developed at JAERI, was improved for high-energy heavy ion transport calculations by incorporating the JQMD code, the SPAR code and the Shen formula. The new NMTC/JAM, named PHITS (Particle and Heavy-Ion Transport code System), is the first general-purpose heavy ion transport Monte Carlo code covering incident energies from several MeV/nucleon to several GeV/nucleon. (author)

  7. Development of a circadian light source

    Science.gov (United States)

    Nicol, David B.; Ferguson, Ian T.

    2002-11-01

Solid state lighting presents a new paradigm for lighting - controllability. Certain characteristics of the lighting environment can be manipulated because of the possibility of using multiple LEDs of different emission wavelengths as the illumination source. This will provide a new, versatile, general illumination source due to the ability to vary the spectral power distribution. New effects beyond the visual may be achieved that are not possible with conventional light sources. Illumination has long been the primary function of lighting, but as the lighting industry has matured, the psychological aspects of lighting have been considered by designers; for example, choosing a particular lighting distribution or color variation in retail applications. The next step in the evolution of lighting is to consider the physiological effects of lighting that cause biological changes in a person within the environment. This work presents the development of a source that may have important bearing on this area of lighting. A circadian light source has been developed to provide an illumination source that works by modulating its correlated color temperature to mimic the changes in natural daylight through the day. In addition, this source can cause or control physiological effects for a person illuminated by it. The importance of this is seen in the human circadian rhythm's peak response to blue light at ~460 nm, which corresponds to the primary spectral difference at increasing color temperature. The device works by adding blue light to a broadband source or mixing polychromatic light to mimic the variation of color temperature observed along the Planckian locus on the CIE diagram. This device can have several applications, including: a tool for researchers in this area, a general illumination lighting technology, and a light therapy device.

  8. Development of the code package KASKAD for calculations of WWERs

    International Nuclear Information System (INIS)

    Bolobov, P.A.; Lazarenko, A.P.; Tomilov, M.Ju.

    2008-01-01

The new version of the software package for neutronic calculations of WWER cores, KASKAD 2007, consists of several calculation and service modules integrated in a common framework. The package is based on the previous version, extended with new functions and new calculation modules: the BIPR-2007 code, a new code that performs three-dimensional 2-group neutron diffusion calculations of the power distribution, is based on the BIPR-8KN model, provides all the capabilities of the BIPR-7A code, and uses the same input data; the PERMAK-2007 code is a pin-by-pin, few-group, multilayer, 3-D neutron diffusion code; and a graphical user interface supports input data preparation for the TVS-M code. The report also includes calculation results obtained with the modified version of the KASKAD 2007 package. (Authors)

  9. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H

    2006-09-15

In this study, a semi-implicit pilot code is developed for one-dimensional channel flow with three fields: gas, continuous liquid, and entrained liquid. All three fields are allowed to have their own velocities; the temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer, as well as momentum transfer. The fluid/structure interaction generally includes both heat and momentum transfer; assuming an adiabatic system, only momentum transfer is considered in this study, leaving wall heat transfer for a future study. The basic pilot code has been verified using 10 conceptual problems. The results of the verification are summarized below: It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and transitions between them. The pilot code was programmed so that the source terms of the governing equations and the numerical solution schemes can be easily tested. Mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly, and that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect code stability; a further study would be required to enhance the code's capability in this regard.
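The three-field idea of fields sharing one channel while carrying their own velocities can be illustrated with a far simpler scheme than the semi-implicit one described above. A sketch using an explicit donor-cell (first-order upwind) step on a periodic 1-D grid; the grid size, time step, velocities, and fractions are arbitrary illustrative values:

```python
import numpy as np

def upwind_advect(alpha, u, dx, dt):
    """One explicit donor-cell (first-order upwind) step for a phasic
    fraction advected at its own velocity u > 0, periodic boundaries."""
    flux = u * alpha                          # donor-cell flux for u > 0
    return alpha - dt / dx * (flux - np.roll(flux, 1))

# Three fields (gas, continuous liquid, entrained liquid), each advected
# at its own velocity on a 1-D channel (CFL = u*dt/dx <= 0.2 here).
nx, dx, dt = 50, 0.1, 0.01
fields = {"gas": np.full(nx, 0.2),
          "liquid": np.full(nx, 0.7),
          "entrained": np.full(nx, 0.1)}
velocities = {"gas": 2.0, "liquid": 1.0, "entrained": 1.5}
for name in fields:
    fields[name] = upwind_advect(fields[name], velocities[name], dx, dt)
total = sum(fields.values())   # uniform fields remain uniform; sum stays 1
```

The real code couples the fields through a shared pressure and interphase transfer terms in a semi-implicit update; this sketch shows only the independent-velocity transport.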

  10. Development of multi-group spectral code TVS-M

    International Nuclear Information System (INIS)

    Lazarenko, A. P.; Pryanichnikov, A. V.; Kalugin, M. A.; Gurevich, M. I.

    2011-01-01

This paper is dedicated to the latest version of the TVS-M code, TVS-M 2007, which allows the neutron flux distribution inside fuel assemblies to be calculated without using the diffusion approximation. The new spatial calculation module PERST introduced in the TVS-M code is based on the first collision probability method and allows the scattering anisotropy to be accounted for. This paper presents some preliminary results calculated with the new version of the TVS-M code. (Authors)

  11. Development of an Auto-Validation Program for MARS Code Assessments

    International Nuclear Information System (INIS)

    Lee, Young Jin; Chung, Bub Dong

    2006-01-01

MARS (Multi-dimensional Analysis of Reactor Safety) is a best-estimate thermal-hydraulic system analysis code developed at KAERI. It is important for a thermal-hydraulic computer code to be assessed against theoretical and experimental data to verify and validate the performance and the integrity of the structure, models, and correlations of the code. Code assessment for a complex thermal-hydraulics code such as MARS can be tedious and time-consuming and requires a large amount of human intervention to transfer data into graphic form. Code developers produce many versions of a code during development, and each version needs to be verified for integrity. Thus, for MARS code developers, it is desirable to have an automatic way of carrying out the code assessment calculations. In the present work, an Auto-Validation program that carries out these code assessment efforts has been developed. The program uses a user-supplied configuration file (with a '.vv' extension) which contains commands to read the input file, execute the user-selected MARS program, and generate result graphs. The program is useful when the same set of code assessments is repeated with different versions of the code. The program is written in the Delphi programming language and runs under the Microsoft Windows environment.
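A driver of this kind reduces to parsing a small command file and dispatching each step. The '.vv' grammar below ('input', 'exec', 'plot' verbs with one argument each) is invented for illustration; the abstract does not document the real format:

```python
def parse_vv(text):
    """Parse a hypothetical '.vv' assessment script: one 'verb argument'
    command per line, with '#' comments and blank lines ignored.
    (This grammar is an assumption made for illustration.)"""
    steps = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        verb, _, arg = line.partition(" ")
        steps.append((verb, arg))
    return steps

script = """
# one assessment case: read input, run MARS, plot a result
input edwards_pipe.i
exec mars.exe
plot pressure
"""
steps = parse_vv(script)
```

A full driver would then loop over `steps`, invoking the selected code version for each `exec` step and producing comparison graphs for each `plot` step.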

  12. Code development and analyses within the area of transmutation and safety

    International Nuclear Information System (INIS)

    Maschek, W.

    2002-01-01

Substantial code development is under way to meet the various demands arising from the development of dedicated reactors for transmutation and incineration. It covers both safety codes and general codes needed for assessing scenarios and transmutation strategies. Analyses concentrate on various ADS systems with solid fuels and liquid molten salt fuels, including the ADS Demo Plant (5th FP EU) and transmuters with advanced fuels.

  13. Development status of Severe Accident Analysis Code SAMPSON

    International Nuclear Information System (INIS)

    Iwashita, Tsuyoshi; Ujita, Hiroshi

    2000-01-01

The four-year Phase 1 of the IMPACT ('Integrated Modular Plant Analysis and Computing Technology') project has been completed. The verification of the Severe Accident Analysis Code SAMPSON prototype developed in Phase 1 was conducted in two steps. First, each analysis module was run independently and the analysis results were compared against separate-effect test data, with good agreement. The test data were: CORA-13 (FZK) for the Core Heat-up Module; VI-3 of the HI/VI tests (ORNL) for the FP Release from Fuel Module; KROTOS-37 (JRC-ISPRA) for the Molten Core Relocation Module; the Water Spread Test (UCSB) for the Debris Spreading Model and Benard's Melting Test for the Natural Convection Model in the Debris Cooling Module; the Hydrogen Burning Test (NUPEC) for the Ex-Vessel Thermal Hydraulics Module; PREMIX PM10 (FZK) for the Steam Explosion Module; and SWISS-2 (SNL) for the Debris-Concrete Interaction Module. Second, using the Simulation Supervisory System, up to 11 analysis modules were executed concurrently in a parallel environment (currently, NUPEC uses an IBM SP2 with 72 processing elements) to demonstrate the code's capability and integrity. The target plant was Surry, a typical PWR, and the initiating event was a 10-inch cold-leg failure. The analysis covered two cases: an in-vessel retention analysis for when gap cooling is effective (in-vessel scenario test), and an analysis of phenomena extending ex-vessel due to reactor pressure vessel failure when gap cooling is not sufficient (ex-vessel scenario test). The system verification test confirmed that the full scope of the scenarios can be analyzed and that the phenomena occurring in them can be simulated qualitatively reasonably, given the physical models used. The Ministry of International Trade and Industry, Japan sponsors this work. (author)

  14. Living Up to the Code's Exhortations? Social Workers' Political Knowledge Sources, Expectations, and Behaviors.

    Science.gov (United States)

    Felderhoff, Brandi Jean; Hoefer, Richard; Watson, Larry Dan

    2016-01-01

The National Association of Social Workers' (NASW's) Code of Ethics urges social workers to engage in political action. However, little recent research has examined whether social workers support this admonition or the extent to which they actually engage in politics. The authors gathered data from a survey of social workers in Austin, Texas, to address three questions. First, because keeping informed about government and political news is an important basis for action, the authors asked what sources of knowledge social workers use. Second, they asked what the respondents believe are appropriate political behaviors for other social workers and for NASW. Third, they asked for self-reports of respondents' own political behaviors. Results indicate that social workers use the Internet and traditional media services to stay informed; expect other social workers and NASW to be active; and are, overall, more active than the general public in many types of political activities. Comparisons between expectations for others and respondents' own behaviors yield complex results. Social workers should strive for higher levels of adherence to the code's urgings on political activity. Implications for future work are discussed.

  15. RIES - Rijnland Internet Election System: A Cursory Study of Published Source Code

    Science.gov (United States)

    Gonggrijp, Rop; Hengeveld, Willem-Jan; Hotting, Eelco; Schmidt, Sebastian; Weidemann, Frederik

The Rijnland Internet Election System (RIES) is a system designed for voting in public elections over the internet. A rather cursory scan of the RIES source code revealed a significant lack of security awareness among the programmers which, among other things, appears to have left RIES vulnerable to near-trivial attacks. Had independent studies not found these problems, RIES would have been used in the 2008 Water Board elections, possibly handling a million votes or more. While RIES was studied more extensively for cryptographic shortcomings, our work shows that more down-to-earth secure design practices can be at least as important, and that these aspects need to be examined much sooner than right before an election.

  16. Low-Complexity Compression Algorithm for Hyperspectral Images Based on Distributed Source Coding

    Directory of Open Access Journals (Sweden)

    Yongjian Nian

    2013-01-01

A low-complexity compression algorithm for hyperspectral images based on distributed source coding (DSC) is proposed in this paper. The proposed distributed compression algorithm can realize both lossless and lossy compression, implemented by performing a scalar quantization strategy on the original hyperspectral images followed by distributed lossless compression. A multilinear regression model is introduced for the distributed lossless compression in order to improve the quality of the side information. The optimal quantization step is determined subject to the constraint of correct DSC decoding, which allows the proposed algorithm to achieve near-lossless compression. Moreover, an effective rate-distortion algorithm is introduced to achieve a low bit rate. Experimental results show that the compression performance of the proposed algorithm is competitive with that of state-of-the-art compression algorithms for hyperspectral images.
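The role of the multilinear regression model is to build high-quality side information from neighboring bands, so the distributed stage only has to code a small residual. A sketch of that step alone, on synthetic correlated bands; the actual syndrome-based DSC coding and quantization-step selection are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "hyperspectral" data: spectrally adjacent bands are highly correlated
base = rng.normal(0.0, 1.0, (32, 32))
bands = np.stack([base * (1.0 + 0.05 * k) + rng.normal(0.0, 0.02, (32, 32))
                  for k in range(4)])

# Side information for band 3: multilinear regression on bands 1 and 2
X = np.column_stack([bands[1].ravel(), bands[2].ravel(), np.ones(32 * 32)])
coef, *_ = np.linalg.lstsq(X, bands[3].ravel(), rcond=None)
side_info = X @ coef

# The residual left for the distributed stage is far smaller than the raw band
residual = bands[3].ravel() - side_info
ratio = residual.std() / bands[3].std()
```

A small residual standard deviation relative to the raw band is what lets the decoder succeed with few syndrome bits.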

  17. Communal Resources in Open Source Software Development

    Science.gov (United States)

    Spaeth, Sebastian; Haefliger, Stefan; von Krogh, Georg; Renzl, Birgit

    2008-01-01

Introduction: Virtual communities play an important role in innovation. The paper focuses on the particular form of collective action in virtual communities underlying Open Source software development projects. Method: Building on resource mobilization theory and private-collective innovation, we propose a theory of collective action in…

  18. Development and validation of corium oxidation model for the VAPEX code

    International Nuclear Information System (INIS)

    Blinkov, V.N.; Melikhov, V.I.; Davydov, M.V.; Melikhov, O.I.; Borovkova, E.M.

    2011-01-01

In light water reactor core melt accidents, the molten fuel (corium) can be brought into contact with coolant water in the course of melt relocation in-vessel and ex-vessel, as well as during the accident mitigation action of water addition. The mechanical energy release from such an interaction is of interest in evaluating the structural integrity of the reactor vessel and of the containment. Usually, the source of the energy release is considered to be the rapid transfer of heat from the molten fuel to the water ('vapor explosion'). When the fuel contains a chemically reactive metal component, there can be an additional source of energy release: the heat release and hydrogen production due to the metal-water chemical reaction. At the Electrogorsk Research and Engineering Center, the computer code VAPEX (VAPor EXplosion) has been developed for analysis of molten fuel-coolant interaction. A multifield approach is used to model the dynamics of the following phases: water, steam, melt jet, melt droplets, and debris. The VAPEX code was successfully validated against FARO experimental data. Hydrogen generation was observed in the FARO tests even though the corium did not contain a metal component; because the reason for this hydrogen generation was not clear, a simplified empirical model of hydrogen generation was implemented in the VAPEX code to account for the contribution of hydrogen to the pressure increase. This paper describes a new, more detailed model of hydrogen generation due to the metal-water chemical reaction and the results of its validation against the ZREX experiments. (orig.)
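For a zirconium-bearing melt, the metal-water reaction in question is Zr + 2 H2O → ZrO2 + 2 H2. A sketch of the stoichiometric hydrogen yield and heat release per mass of zirconium fully oxidized, using an approximate literature value (~6.45 MJ per kg Zr) for the reaction enthalpy; this is arithmetic for orientation, not the VAPEX model itself:

```python
M_ZR = 91.224e-3    # molar mass of Zr, kg/mol
M_H2 = 2.016e-3     # molar mass of H2, kg/mol
DH_ZR = 6.45e6      # approx. heat of Zr-steam reaction, J per kg Zr reacted

def zr_steam_reaction(m_zr_reacted):
    """Zr + 2 H2O -> ZrO2 + 2 H2: hydrogen mass (kg) and heat (J)
    released when m_zr_reacted kg of zirconium is fully oxidized."""
    mol_zr = m_zr_reacted / M_ZR
    m_h2 = 2.0 * mol_zr * M_H2      # two moles of H2 per mole of Zr
    heat = DH_ZR * m_zr_reacted
    return m_h2, heat

m_h2, q = zr_steam_reaction(1.0)    # per 1 kg of Zr
```

Roughly 44 g of hydrogen and about 6.5 MJ are released per kilogram of zirconium oxidized, which is why even partial metal oxidation matters for the pressure load.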

  19. Co-sourcing in software development offshoring

    DEFF Research Database (Denmark)

    Schlichter, Bjarne Rerup; Persson, John Stouby

    2013-01-01

Software development projects are increasingly geographically distributed through offshoring, which introduces complex risks that can lead to project failure. Co-sourcing is a highly integrative and cohesive approach, seen as successful, to software development offshoring. However, research of how co-sour......-taking by high attention to the closely interrelated structure and technology components in terms of CMMI and the actors' cohesion and integration in terms of Scrum.

  20. Optimization and Validation of the Developed Uranium Isotopic Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Kang, M. Y.; Kim, Jinhyeong; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

γ-ray spectroscopy is a representative non-destructive assay for nuclear material, and is less time-consuming and less expensive than destructive analysis methods. The destructive technique is more precise, but correction algorithms can improve the performance of γ-ray spectroscopy. For this reason, an analysis code for uranium isotopic analysis has been developed by the Applied Nuclear Physics Group at Seoul National University. Overlapped γ- and x-ray peaks in the 89-101 keV X{sub α}-region are fitted with Gaussian and Lorentzian distribution peak functions plus tail and background functions. In this study, optimizations of the full-energy peak efficiency calibration and of the fitting parameters for the peak tail and background are performed and validated with 24-hour acquisitions of CRM uranium samples. The analysis performance is improved for HEU samples, but further optimization of the fitting parameters is required for LEU sample analysis. In the future, optimization of the fitting parameters with various types of uranium samples will be performed, and {sup 234}U isotopic analysis algorithms and correction algorithms (coincidence effect, self-attenuation effect) will be developed.
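A much-simplified stand-in for the peak analysis step is a moment analysis of a background-subtracted peak; the actual code fits Gaussian and Lorentzian shapes with tail and background functions, which this sketch does not attempt. The peak position, width, area, and flat background below are synthetic illustrative values:

```python
import numpy as np

def gaussian(x, area, mu, sigma):
    """Gaussian peak with the given net area (counts)."""
    return area / (sigma * np.sqrt(2.0 * np.pi)) * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Synthetic peak in the ~89-101 keV region on a flat background (counts/channel)
x = np.linspace(95.0, 102.0, 281)
spectrum = gaussian(x, area=5000.0, mu=98.4, sigma=0.35) + 20.0

# Background estimated from the peak-free edges, then moment analysis
bkg = 0.5 * (spectrum[:20].mean() + spectrum[-20:].mean())
net = spectrum - bkg
centroid = (x * net).sum() / net.sum()
sigma_est = np.sqrt(((x - centroid) ** 2 * net).sum() / net.sum())
```

Moment analysis recovers the centroid and width of an isolated peak; overlapped γ/x-ray multiplets like the X-α region require the full shape fit.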

  1. Development of nuclear decay data library JDDL, and nuclear generation and decay calculation code COMRAD

    International Nuclear Information System (INIS)

    Naito, Yoshitaka; Ihara, Hitoshi; Katakura, Jun-ichi; Hara, Toshiharu.

    1986-08-01

For the safety evaluation of nuclear fuel facilities, a nuclear decay data library named JDDL and a computer code COMRAD have been developed to calculate the isotopic composition of each nuclide, the radiation source intensity, γ-ray and neutron energy spectra, and the decay heat of spent fuel. JDDL has been produced mainly from the evaluated nuclear data file ENSDF in order to use up-to-date nuclear data. To supplement the file for short-lived nuclides, the JNDC data set evaluated by the Japan Nuclear Data Committee was also used. With these data, calculations are possible from short to long periods after irradiation. (author)
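The generation-and-decay calculation underneath such a code reduces, for a single parent/daughter pair, to the classic Bateman solution; the decay heat then follows as activity times mean energy per decay. A sketch with arbitrary half-lives and per-decay energies (all numbers illustrative, not JDDL data):

```python
import math

def bateman_pair(n1_0, lam1, lam2, t):
    """Number densities of a parent/daughter pair after time t, starting
    from a pure parent (classic two-member Bateman solution, lam1 != lam2)."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Decay heat = sum over nuclides of (activity x mean energy per decay)
lam1 = math.log(2) / 100.0      # parent decay constant, s^-1 (100 s half-life)
lam2 = math.log(2) / 10.0       # daughter decay constant, s^-1 (10 s half-life)
q1, q2 = 1.2e-13, 0.8e-13       # mean energy per decay, J (illustrative)
n1, n2 = bateman_pair(1.0e20, lam1, lam2, t=50.0)
decay_heat = lam1 * n1 * q1 + lam2 * n2 * q2   # W
```

A production code evaluates this over full decay chains for thousands of nuclides, which is where an evaluated library such as JDDL enters.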

  2. MARE2DEM: a 2-D inversion code for controlled-source electromagnetic and magnetotelluric data

    Science.gov (United States)

    Key, Kerry

    2016-10-01

    This work presents MARE2DEM, a freely available code for 2-D anisotropic inversion of magnetotelluric (MT) data and frequency-domain controlled-source electromagnetic (CSEM) data from onshore and offshore surveys. MARE2DEM parametrizes the inverse model using a grid of arbitrarily shaped polygons, where unstructured triangular or quadrilateral grids are typically used due to their ease of construction. Unstructured grids provide significantly more geometric flexibility and parameter efficiency than the structured rectangular grids commonly used by most other inversion codes. Transmitter and receiver components located on topographic slopes can be tilted parallel to the boundary so that the simulated electromagnetic fields accurately reproduce the real survey geometry. The forward solution is implemented with a goal-oriented adaptive finite-element method that automatically generates and refines unstructured triangular element grids that conform to the inversion parameter grid, ensuring accurate responses as the model conductivity changes. This dual-grid approach is significantly more efficient than the conventional use of a single grid for both the forward and inverse meshes since the more detailed finite-element meshes required for accurate responses do not increase the memory requirements of the inverse problem. Forward solutions are computed in parallel with a highly efficient scaling by partitioning the data into smaller independent modeling tasks consisting of subsets of the input frequencies, transmitters and receivers. Non-linear inversion is carried out with a new Occam inversion approach that requires fewer forward calls. Dense matrix operations are optimized for memory and parallel scalability using the ScaLAPACK parallel library. Free parameters can be bounded using a new non-linear transformation that leaves the transformed parameters nearly the same as the original parameters within the bounds, thereby reducing non-linear smoothing effects. Data

  3. Development and application of a system analysis code for liquid fueled molten salt reactors based on RELAP5 code

    International Nuclear Information System (INIS)

    Shi, Chengbin; Cheng, Maosong; Liu, Guimin

    2016-01-01

Highlights: • New point kinetics and thermo-hydraulics models as well as a numerical method are added into the RELAP5 code to make it suitable for liquid fueled molten salt reactors. • The extended RELAP5 code is verified against the experimental benchmarks of the MSRE. • Different transient scenarios of the MSBR are simulated to evaluate performance during the transients. - Abstract: The molten salt reactor (MSR) is one of the six advanced reactor concepts declared by the Generation IV International Forum (GIF), characterized by attractive attributes such as inherent safety, economic efficiency, natural resource protection, sustainable development, and nuclear non-proliferation. It is important to perform system safety analyses for MSR nuclear power plants. In this paper, in order to develop a system analysis code suitable for liquid fueled molten salt reactors, the point kinetics and thermo-hydraulic models as well as the numerical method in the thermal-hydraulic transient code Reactor Excursion and Leak Analysis Program (RELAP5), developed at the Idaho National Engineering Laboratory (INEL) for the U.S. Nuclear Regulatory Commission (NRC), are extended and verified against Molten Salt Reactor Experiment (MSRE) experimental benchmarks. Four transient scenarios of the Molten Salt Breeder Reactor (MSBR), including a load demand change, a primary flow transient, a secondary flow transient, and a reactivity transient, are then modeled and simulated with the extended RELAP5 code to evaluate the performance of the reactor during these anticipated transient events. The results indicate that the extended RELAP5 code is effective and well suited to liquid fueled molten salt reactors, and that the MSBR has strong inherent safety characteristics because of its large negative reactivity coefficient. In the future, the extended RELAP5 code will be used to perform transient safety analysis for a liquid fueled thorium molten salt reactor named TMSR-LF developed by the Center

  5. Alkali deuteride negative ion source development plan

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

A three-phase program for the development of neutral beam systems is described. In the first phase, concluded in May 1977, the laser-initiated source was characterized. In phase two, scheduled for completion in September 1978, negative ion confinement and extraction are investigated using laser energy deposition as a baseline method to produce D - ; in addition, other energy deposition schemes are studied in order to define a baseline energetic beam source system. The third phase is devoted to producing an integrated baseline system and scaling it up in current and energy to meet magnetic confinement system requirements.

  6. Trends in EFL Technology and Educational Coding: A Case Study of an Evaluation Application Developed on LiveCode

    Science.gov (United States)

    Uehara, Suwako; Noriega, Edgar Josafat Martinez

    2016-01-01

The availability of user-friendly coding software is increasing, yet teachers might hesitate to use this technology to develop applications for educational needs. This paper discusses studies related to technology for educational uses and introduces an evaluation application under development. Through questionnaires answered by student users and open-ended discussion by…

  7. In-vessel source term analysis code TRACER version 2.3. User's manual

    International Nuclear Information System (INIS)

    Toyohara, Daisuke; Ohno, Shuji; Hamada, Hirotsugu; Miyahara, Shinya

    2005-01-01

A computer code TRACER (Transport Phenomena of Radionuclides for Accident Consequence Evaluation of Reactor) version 2.3 has been developed to evaluate the species and quantities of fission products (FPs) released into cover gas during a fuel pin failure accident in an LMFBR. TRACER version 2.3 includes the new or modified models shown below: a) the Booth model, a new model for FP release from fuel; b) a modified model for FP transfer from fuel to bubbles or sodium coolant; c) a modified model for bubble dynamics in the coolant. The computational models, input data, and output data of TRACER version 2.3 are described in this user's manual. (author)

  8. Design development of bellows for the DNB beam source

    International Nuclear Information System (INIS)

    Singh, Dhananjay Kumar; Venkata Nagaraju, M.; Joshi, Jaydeep; Patel, Hitesh; Yadav, Ashish; Pillai, Suraj; Singh, Mahendrajit; Bandyopadhyay, Mainak; Chakraborty, A.K.; Sharma, Dheeraj

    2017-01-01

Establishing a procedure and mechanism for the alignment of ion beams in Neutral Beam (NB) sources for ITER-like systems is complex due to large traversal distances (∼21 m) and the restricted use of flexible elements in the system. For the beam source of the DNB, the movement requirements for beam alignment are a combination of tilting (±9 mrad), rotation (±9 mrad), and translation (±25 mm). The present work describes the design development of a system composed of three single-ply 'Gimbal' type bellows, placed in series, in L-shaped hydraulic lines (sizes DN50, DN20 and DN15). The paper details the generation of the initial requirements, the transformation of movements at bellows locations, the selection of bellows and combinations of bellows, the minimization of induced movements by optimizing the bellows locations, the estimation of movements with CAESAR II, and design compliance with respect to the EJMA code.

  9. The FEL development at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Arnold, N. D.; Benson, C.; Berg, S.; Berg, W.; Biedron, S. G.; Chae, Y. C.; Crosbie, E. A.; Decker, G.; Dejus, R. J.; Den Hartog, P.; Deriy, B.; Dortwegt, R.; Edrmann, M.; Freund, H. P.; Friedsam, H.; Galayda, J. N.; Gluskin, E.; Goeppner, G. A.; Grelick, A.; Huang, Z.; Jones, J.; Kang, Y.; Kim, K.-J.; Kim, S.; Kinoshita, K.; Lewellen, J. W.; Lill, R.; Lumpkin, A. H.; Makarov, O.; Markovich, G. M.; Milton, S. V.; Moog, E. R.; Nassiri, A.; Ogurtsov, V.; Pasky, S.; Power, J.; Tieman, B.; Trakhtenberg, E.; Travish, G.; Vasserman, I.; Walters, D. R.; Wang, J.; Xu, S.; Yang, B.

    1999-01-01

Construction of a single-pass free-electron laser (FEL) based on the self-amplified spontaneous emission (SASE) mode of operation is nearing completion at the Advanced Photon Source (APS), with initial experiments imminent. The APS SASE FEL is a proof-of-principle fourth-generation light source. As of January 1999 the undulator hall, end-station building, necessary transfer lines, electron and optical diagnostics, injectors, and initial undulators have been constructed and, with the exception of the undulators, installed. All preliminary code development and simulations have also been completed. The undulator hall is now ready to accept first beam for characterization of the output radiation. It is the project goal to push towards full FEL saturation, initially in the visible, but ultimately at UV and VUV wavelengths

  10. CodeRAnts: A recommendation method based on collaborative searching and ant colonies, applied to reusing of open source code

    Directory of Open Access Journals (Sweden)

    Isaac Caicedo-Castro

    2014-01-01

This paper presents CodeRAnts, a new recommendation method based on a collaborative searching technique and inspired by the ant colony metaphor. The method aims to fill a gap in the current state of the art regarding recommender systems for software reuse, where prior works present two problems: first, recommender systems based on these works cannot learn from the collaboration of programmers; second, assessments of these systems report low precision and recall measures, and in some systems these metrics have not been evaluated at all. The work presented in this paper contributes a recommendation method that solves these problems.
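The ant colony metaphor amounts to evaporating and reinforcing "pheromone" trails between queries and reusable code snippets, so that successful reuse by one programmer raises a snippet's ranking for later searchers. A minimal sketch using the standard evaporation/deposit update; the constants and reward scheme are illustrative, not the CodeRAnts formulas:

```python
def update_pheromone(tau, reward, rho=0.1):
    """Standard ant-colony trail update: evaporate at rate rho, then
    deposit the reward earned on this trail (illustrative constants)."""
    return (1.0 - rho) * tau + reward

# Trails from one query to two candidate snippets; programmers who reuse
# snippet B keep reinforcing its trail.
trails = {"snippet_A": 1.0, "snippet_B": 1.0}
for _ in range(5):                     # five successful reuses of B
    trails["snippet_B"] = update_pheromone(trails["snippet_B"], reward=0.5)
    trails["snippet_A"] = update_pheromone(trails["snippet_A"], reward=0.0)
ranking = sorted(trails, key=trails.get, reverse=True)
```

Evaporation lets stale recommendations fade, while repeated reuse keeps good snippets at the top; this is the collaborative-learning behavior the paper targets.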

  11. Large-eddy simulation of convective boundary layer generated by highly heated source with open source code, OpenFOAM

    International Nuclear Information System (INIS)

    Hattori, Yasuo; Suto, Hitoshi; Eguchi, Yuzuru; Sano, Tadashi; Shirai, Koji; Ishihara, Shuji

    2011-01-01

Spatial and temporal characteristics of turbulence structures in the close vicinity of a heat source, a horizontal upward-facing round plate heated to high temperature, are examined using well-resolved large-eddy simulations. Verification is carried out through comparison with experiments: the predicted statistics, including the PDF of temperature fluctuations, agree well with measurements, indicating that the present simulations can appropriately reproduce turbulence structures near the heat source. The reproduced three-dimensional thermal and fluid fields in the close vicinity of the heat source reveal the developing processes of coherent structures along the surface: stationary, streaky flow patterns appear near the edge and randomly shift to cell-like patterns with incursion into the center region, resulting in thermal-plume meandering. Both patterns have very thin structures, but the depth of the streaky structures is considerably smaller than that of the cell-like patterns; this discrepancy causes the layered structures. These structures are the source of peculiar turbulence characteristics whose prediction is quite difficult with RANS-type turbulence models. The understanding of such structures obtained in the present study should help improve the turbulence models used in nuclear engineering. (author)

  12. Development of a nuclear data uncertainties propagation code on the residual power in fast neutron reactors

    International Nuclear Information System (INIS)

    Benoit, J.-C.

    2012-01-01

This PhD study is in the field of nuclear energy, the back end of the nuclear fuel cycle, and uncertainty calculations. The CEA must design the prototype ASTRID, a sodium cooled fast reactor (SFR) and one of the concepts selected by the Generation IV forum, for which the value and the uncertainty of the decay heat have a significant impact. In this study a code for propagating nuclear data uncertainties to the decay heat in SFRs is developed. The work took place in three stages. The first step limited the number of parameters involved in the calculation of the decay heat. For this, a decay heat experiment on the PHENIX reactor (PUIREX 2008) was studied to validate the DARWIN package experimentally for SFRs and to quantify the source terms of the decay heat. The second step was to develop a code for propagating uncertainties: CyRUS (Cycle Reactor Uncertainty and Sensitivity). A deterministic propagation method was chosen because the calculations are fast and reliable; the underlying assumptions of linearity and normality were validated theoretically, and the code was successfully compared with a stochastic code on the example of the thermal burst fission curve of 235 U. The last part applied the code to several experiments: the decay heat of a reactor, the isotopic composition of a fuel pin, and the burst fission curve of 235 U. The code demonstrated the possibility of feedback on the nuclear data driving the uncertainty of these problems. Two main results were highlighted. First, the simplifying assumptions of deterministic codes are compatible with a precise calculation of the uncertainty of the decay heat. Second, the method developed is intrusive and allows feedback on nuclear data from experiments on the back end of the nuclear fuel cycle. In particular, this study showed how important it is to measure independent fission yields precisely, along with their covariance matrices, in order to improve the accuracy of the calculation of
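The deterministic propagation method referred to is, in essence, the first-order "sandwich" rule: the variance of a response R is S^T V S, with S the vector of sensitivities of R to the nuclear data parameters and V their covariance matrix. A sketch with illustrative numbers (the sensitivities and covariances below are made up for the example):

```python
import numpy as np

# First-order ("sandwich") uncertainty propagation: var(R) = S^T V S
S = np.array([0.8, -0.3, 0.1])            # sensitivities dR/dp_i (relative)
V = np.array([[0.04, 0.01, 0.00],         # parameter covariance matrix
              [0.01, 0.09, 0.00],         # (relative variances/covariances)
              [0.00, 0.00, 0.01]])
var_R = S @ V @ S                          # propagated relative variance of R
sigma_R = np.sqrt(var_R)                   # relative 1-sigma uncertainty on R
```

The linearity and normality assumptions mentioned above are exactly what make this closed-form propagation valid, and the off-diagonal covariance terms are why measured fission-yield covariance matrices matter.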

  13. Reactor Systems Technology Division code development and configuration/quality control procedures

    International Nuclear Information System (INIS)

    Johnson, E.C.

    1985-06-01

    Procedures are prescribed for executing a code development task and implementing the resulting coding in an official version of a computer code. The responsibilities of the project manager, development staff members, and the Code Configuration/Quality Control Group are defined. Examples of forms, logs, computer job control language, and suggested outlines for reports associated with software production and implementation are included in Appendix A. 1 ref., 2 figs

  14. Neutrons Flux Distributions of the Pu-Be Source and its Simulation by the MCNP-4B Code

    Science.gov (United States)

    Faghihi, F.; Mehdizadeh, S.; Hadad, K.

    The neutron fluence rate of a low-intensity Pu-Be source is measured by Neutron Activation Analysis (NAA) of 197Au foils. The neutron fluence rate distribution versus energy is also calculated using the MCNP-4B code based on the ENDF/B-V library. Both the simulation and the experiment are a first such exercise in Iran, establishing confidence in the code for further research. In the theoretical investigation, an isotropic Pu-Be source with a cylindrical volume distribution is simulated and the relative neutron fluence rate versus energy is calculated using the MCNP-4B code. The fast and thermal neutron fluence rates obtained by the NAA method and by the MCNP code are compared.

  15. Development of ASME Code Section XI visual examination requirements

    International Nuclear Information System (INIS)

    Cook, J.F.

    1990-01-01

    Section XI of the American Society for Mechanical Engineers Boiler and Pressure Vessel Code (ASME Code) defines three types of nondestructive examinations, visual, surface, and volumetric. Visual examination is important since it is the primary examination method for many safety-related components and systems and is also used as a backup examination for the components and systems which receive surface or volumetric examinations. Recent activity in the Section XI Code organization to improve the rules for visual examinations is reviewed and the technical basis for the new rules, which cover illumination, vision acuity, and performance demonstration, is explained

  16. Management of development of renewable energy sources

    Directory of Open Access Journals (Sweden)

    Inić Branimir P.

    2014-01-01

    Full Text Available The aim of the paper 'Management of development of renewable energy sources' is to point out possible solutions for neutralizing the threat of energy shortages. The paper outlines the major short- and long-term energy problems facing humanity. The growth of the world's population is inevitably accompanied by higher energy consumption. The decline in reserves of non-renewable energy sources such as oil, gas, and coal is a major threat to maintaining current living conditions, and thus requires solutions to neutralize that threat. This is why the management of development of renewable energy sources is an imperative for Serbia. The paper emphasizes the use of solar energy, because the annual average of solar radiation in Serbia is about 40% higher than the European average; nevertheless, the actual use of the sun's energy to generate electricity in Serbia lags far behind the countries of the European Union. Solar energy is clean and renewable, and the fact that 4.2 kilowatt-hours are received daily per square meter, averaged over the entire surface of the planet, makes it an almost untapped energy source. Compared with EU countries, the price of energy derived from non-renewable sources is, on average, higher in Serbia. Taking this into consideration, the use of solar energy, as an untapped resource, imposes itself as indispensable.

  17. Development of multiampere negative ion sources

    International Nuclear Information System (INIS)

    Alessi, J.; Hershcovitch, A.; Prelec, K.; Sluyters, T.

    1981-01-01

    The Neutral Beam Development Group at BNL is developing H-/D- surface plasma sources as part of a high energy neutral beam injector. Uncooled Penning and magnetron sources have operated at a maximum beam current of 1 A (10 ms pulses, Mk III) and a maximum pulse length of 200 ms (0.3 A, Mk IV). A magnetron source with focusing grooves on the cathode and an asymmetric anode-cathode geometry operates at a power efficiency of 8 kW/A and a 6% gas efficiency. As the next step, a water-cooled magnetron, designed to give a steady-state beam of 1 to 2 A, has been constructed. Experiments are in progress to test a modification of the magnetron which may significantly improve its performance. By injecting a sheet of plasma, produced by a highly gas-efficient hollow cathode discharge, into a magnetron-type anode-cathode geometry, we anticipate a reduction of the source operating pressure by at least three orders of magnitude. Initial experiments have given indications of H- production. The next plasma injection experiment is designed to give a steady-state beam of approximately 1 A

  18. OpenMC: A state-of-the-art Monte Carlo code for research and development

    International Nuclear Information System (INIS)

    Romano, Paul K.; Horelik, Nicholas E.; Herman, Bryan R.; Nelson, Adam G.; Forget, Benoit; Smith, Kord

    2015-01-01

    Highlights: • OpenMC is an open source Monte Carlo particle transport code. • Solid geometry and continuous-energy physics allow high-fidelity simulations. • Development has focused on high performance and modern I/O techniques. • OpenMC is capable of scaling up to hundreds of thousands of processors. • Other features include plotting, CMFD acceleration, and variance reduction. - Abstract: This paper gives an overview of OpenMC, an open source Monte Carlo particle transport code recently developed at the Massachusetts Institute of Technology. OpenMC uses continuous-energy cross sections and a constructive solid geometry representation, enabling high-fidelity modeling of nuclear reactors and other systems. Modern, portable input/output file formats are used in OpenMC: XML for input, and HDF5 for output. High performance parallel algorithms in OpenMC have demonstrated near-linear scaling to over 100,000 processors on modern supercomputers. Other topics discussed in this paper include plotting, CMFD acceleration, variance reduction, eigenvalue calculations, and software development processes

  19. Development of a helicon ion source: Simulations and preliminary experiments

    Science.gov (United States)

    Afsharmanesh, M.; Habibi, M.

    2018-03-01

    The extraction system of a helicon ion source has been simulated and constructed. Results of ion source commissioning at up to 20 kV are presented, together with simulations of the ion beam extraction system. An argon current of more than 200 μA at up to 20 kV is extracted and is characterized with a Faraday cup and a beam profile monitoring grid. By varying ion source parameters such as RF power, extraction voltage, and working pressure, an ion beam with a current distribution exhibiting a central core has been detected. A jump in the ion beam current appears at an RF power near 700 W, which reveals that helicon mode excitation is reached at this power. Measuring the emission line intensity of Ar II at 434.8 nm is a second way we have demonstrated the mode transition from inductively coupled plasma to helicon. A half-helix helicon antenna is used for the ion source development because of its asymmetrical longitudinal power absorption. The modeling of the plasma part of the ion source has been carried out using the HELIC code. Simulations take into account a Gaussian radial plasma density profile, for plasma densities in the range of 1018-1019 m-3. The power absorption spectrum and the excited helicon mode number are obtained. Longitudinal RF power absorption for two different antenna positions is compared; our results indicate that positioning the antenna near the plasma electrode is desirable for ion beam extraction. The simulation of the extraction system was performed with the ion optical code IBSimu, making it the first helicon ion source extraction designed with the code. The ion beam emittance and the Twiss parameters of the emittance ellipse are calculated at different iterations and mesh sizes, and the best values of mesh size and iteration number have been obtained for the calculations.
    The simulated ion beam extraction system has been evaluated using optimized parameters such

  20. The role of the uncertainty in code development

    Energy Technology Data Exchange (ETDEWEB)

    Barre, F. [CEA-Grenoble (France)

    1997-07-01

    From a general point of view, all the results of a calculation should be given with their uncertainty. This is of utmost importance in nuclear safety, where the sizing of the safety systems, and therefore the protection of the population and the environment, essentially depends on the calculation results. Until recent years, safety analysis was performed with conservative tools. Two criticisms can be made. Firstly, conservative margins can be too large, and it may be possible to reduce the cost of the plant or of its operation with a best-estimate approach. Secondly, some of the conservative hypotheses may not really be conservative over the full range of physical events which can occur during an accident. Simpson gives an interesting example: in some cases, overestimating the residual power during a small-break LOCA can lead to an overprediction of the swell level and thus to an overprediction of the core cooling, which is the opposite of a conservative prediction. A last question is: does the accumulation of conservative hypotheses for a problem always give a conservative result? Two-phase flow physics, dealing mainly with situations of mechanical and thermal non-equilibrium, is too complicated to answer these questions with simple engineering judgment. The objective of this paper is to review the quantification of the uncertainties which can be made during code development and validation.

  1. The role of the uncertainty in code development

    International Nuclear Information System (INIS)

    Barre, F.

    1997-01-01

    From a general point of view, all the results of a calculation should be given with their uncertainty. This is of utmost importance in nuclear safety, where the sizing of the safety systems, and therefore the protection of the population and the environment, essentially depends on the calculation results. Until recent years, safety analysis was performed with conservative tools. Two criticisms can be made. Firstly, conservative margins can be too large, and it may be possible to reduce the cost of the plant or of its operation with a best-estimate approach. Secondly, some of the conservative hypotheses may not really be conservative over the full range of physical events which can occur during an accident. Simpson gives an interesting example: in some cases, overestimating the residual power during a small-break LOCA can lead to an overprediction of the swell level and thus to an overprediction of the core cooling, which is the opposite of a conservative prediction. A last question is: does the accumulation of conservative hypotheses for a problem always give a conservative result? Two-phase flow physics, dealing mainly with situations of mechanical and thermal non-equilibrium, is too complicated to answer these questions with simple engineering judgment. The objective of this paper is to review the quantification of the uncertainties which can be made during code development and validation

  2. Calculation Of Fuel Burnup And Radionuclide Inventory In The Syrian Miniature Neutron Source Reactor Using The GETERA Code

    International Nuclear Information System (INIS)

    Khattab, K.; Dawahra, S.

    2011-01-01

    Calculations of the fuel burnup and radionuclide inventory in the Syrian Miniature Neutron Source Reactor (MNSR) after 10 years of operation (the expected life of the reactor core) are presented in this paper using the GETERA code. The code is used to calculate the fuel group constants and the infinite multiplication factor versus the reactor operating time for 10, 20, and 30 kW operating power levels. The amounts of uranium burnt and plutonium produced in the reactor core, the concentrations of the most important fission product and actinide radionuclides accumulated in the reactor core, and the total radioactivity of the reactor core were calculated using the GETERA code as well. It is found that the GETERA code is better suited than the WIMSD4 code for fuel burnup calculations in the MNSR, since it is newer, has a larger isotope library, and is more accurate. (author)
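    As a toy illustration of the kind of quantity such burnup calculations track (this sketch is not GETERA; the one-group cross section, flux, and duty factor are assumed round numbers), 235U depletion under a constant flux follows a simple exponential:

```python
import math

SIGMA_A_235 = 680e-24  # one-group 235U absorption cross section (cm^2), assumed
PHI = 1e12             # neutron flux (n/cm^2/s), assumed for a low-power core

def u235_remaining(n0, t_seconds):
    """Atoms of 235U remaining after irradiation time t at constant flux."""
    return n0 * math.exp(-SIGMA_A_235 * PHI * t_seconds)

n0 = 1.0e24
t = 10 * 365.25 * 86400 * 0.01   # 10 years at an assumed 1% duty factor
burnt_fraction = 1.0 - u235_remaining(n0, t) / n0
print(f"burnt fraction: {burnt_fraction:.4%}")
```

    A production lattice code solves the full coupled Bateman chains with self-shielded cross sections; the exponential point model above is the single-nuclide limiting case it reduces to.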

  3. Development and preliminary verification of 2-D transport module of radiation shielding code ARES

    International Nuclear Information System (INIS)

    Zhang Penghe; Chen Yixue; Zhang Bin; Zang Qiyong; Yuan Longjun; Chen Mengteng

    2013-01-01

    The 2-D transport module of the radiation shielding code ARES is a two-dimensional neutron transport and radiation shielding code. Its theoretical model is based on the first-order steady-state neutron transport equation, using the discrete ordinates method to discretize the angular variable. The resulting set of differential equations is solved with the source iteration method. The 2-D transport module of ARES is capable of calculating k_eff and fixed source problems with isotropic or anisotropic scattering in x-y geometry. The theoretical model is briefly introduced and a series of benchmark problems was verified in this paper. Compared with the results given by the benchmarks, the maximum relative deviation of k_eff is 0.09% and the average relative deviation of the flux density is about 0.60% in the BWR cells benchmark problem. For fixed source problems with isotropic and anisotropic scattering, the results of the 2-D transport module of ARES agree very well with DORT. These numerical results preliminarily demonstrate that the development of the 2-D transport module of ARES is correct and that it is able to provide high precision results. (authors)
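    The source iteration scheme described above can be sketched in one dimension (the ARES module itself is 2-D x-y; this is only the solution pattern, with S2 quadrature and diamond differencing, and all material values below are illustrative):

```python
import numpy as np

# Discrete-ordinates source iteration for a 1-D slab with isotropic
# scattering: mu dpsi/dx + sigma_t psi = (sigma_s phi + q)/2, vacuum BCs.

def source_iteration(sigma_t, sigma_s, q, width, nx=50, tol=1e-8, max_it=500):
    dx = width / nx
    mus = np.array([-1.0, 1.0]) / np.sqrt(3.0)   # S2 Gauss-Legendre nodes
    wts = np.array([1.0, 1.0])                   # weights sum to 2
    phi = np.zeros(nx)
    for _ in range(max_it):
        src = 0.5 * (sigma_s * phi + q)          # isotropic emission density
        phi_new = np.zeros(nx)
        for mu, w in zip(mus, wts):
            psi_in = 0.0                         # vacuum incoming flux
            cells = range(nx) if mu > 0 else range(nx - 1, -1, -1)
            for i in cells:
                # diamond-difference cell balance solved for the cell average
                psi_avg = (src[i] + 2 * abs(mu) / dx * psi_in) \
                          / (sigma_t + 2 * abs(mu) / dx)
                psi_in = 2 * psi_avg - psi_in    # outgoing edge flux
                phi_new[i] += w * psi_avg
        done = np.max(np.abs(phi_new - phi)) < tol
        phi = phi_new
        if done:
            break
    return phi

# Thick slab: the centre approaches the infinite-medium limit
# phi = q / (sigma_t - sigma_s) = 2.0 for the values below.
phi = source_iteration(sigma_t=1.0, sigma_s=0.5, q=1.0, width=50.0)
print("centre flux:", phi[len(phi) // 2])
```

    The iteration converges geometrically with the scattering ratio c = sigma_s/sigma_t, which is why production codes add acceleration for optically thick, highly scattering problems.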

  4. Development and Verification of Tritium Analyses Code for a Very High Temperature Reactor

    International Nuclear Information System (INIS)

    Oh, Chang H.; Kim, Eung S.

    2009-01-01

    A tritium permeation analysis code (TPAC) has been developed by Idaho National Laboratory for the purpose of analyzing tritium distributions in VHTR systems, including integrated hydrogen production systems. The MATLAB SIMULINK software package was used for development of the code. The TPAC is based on mass balance equations for tritium-containing species in various chemical forms (i.e., HT, H2, HTO, HTSO4, and TI), coupled with a variety of tritium source, sink, and permeation models. In the TPAC, ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He were taken into consideration as tritium sources. Purification and leakage models were implemented as the main tritium sinks. Permeation of HT and H2 through pipes, vessels, and heat exchangers was considered as the main tritium transport path. In addition, electrolyzer and isotope exchange models were developed for analyzing hydrogen production systems, including both high-temperature electrolysis and the sulfur-iodine process. The TPAC has unlimited flexibility for system configurations and provides easy drag-and-drop model building through a graphical user interface. Verification of the code has been performed by comparisons with analytical solutions and with experimental data based on the Peach Bottom reactor design. The preliminary results calculated with a former tritium analysis code, THYTAN, which was developed in Japan and adopted by the Japan Atomic Energy Agency, were also compared with the TPAC solutions. This report contains descriptions of the basic tritium pathways, theory, a simple user guide, verifications, sensitivity studies, sample cases, and code tutorials. Tritium behavior in a very high temperature reactor/high temperature steam electrolysis system has been analyzed by the TPAC based on the reference indirect parallel configuration proposed by Oh et al. (2007).
    This analysis showed that only 0.4% of the tritium released from the core is transferred to the product hydrogen
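    A lumped, single-volume version of such a mass balance can be sketched as follows; the rate constants and source strength are invented illustrative values, and `tritium_inventory` is a hypothetical helper, not a TPAC routine:

```python
import numpy as np

LAMBDA_T = np.log(2) / (12.32 * 365.25 * 86400)  # tritium decay constant (1/s)

def tritium_inventory(t, S_gen, k_purif, k_perm, N0=0.0):
    """Analytic solution of dN/dt = S_gen - (lambda + k_purif + k_perm) * N
    for constant coefficients (a single well-mixed volume)."""
    k = LAMBDA_T + k_purif + k_perm
    return S_gen / k + (N0 - S_gen / k) * np.exp(-k * t)

S = 1.0e12    # tritium birth rate (atoms/s), assumed
kp = 1.0e-5   # purification rate constant (1/s), assumed
km = 1.0e-6   # permeation rate constant (1/s), assumed
N_ss = S / (LAMBDA_T + kp + km)   # steady-state inventory
print("steady-state inventory (atoms):", N_ss)
```

    A system code couples many such balances, one per species and volume, with permeation terms linking them; here the sinks are folded into first-order rate constants so the balance has a closed-form solution.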

  5. Development of Unified Code for Environmental Research by Neutron Activation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Seung Yeon; Kim, Young Sik; Lee, Sang Mi; Chung, Sang Uk; Lee, Kyu Sung; Kang, Sang Hun; Cheon, Ki Hong [Yonsei University, Seoul (Korea, Republic of)

    1997-07-01

    Three codes were developed to improve the accuracy and precision of neutron activation analysis through the adoption of the IAEA's recommended 'GANAAS' program, which has better peak identification and efficiency calibration algorithms than the commercial program currently in use. The quantitative analytical ability for trace elements was improved with these codes, such that the number of detectable elements, including environmentally important elements, was increased. Small and overlapped peaks can be detected more efficiently with the good peak shape calibration (energy dependence of peak height, peak base width, and FWHM). Several efficiency functions were added to determine the detector efficiency more accurately, which was the main source of error in neutron activation analysis. Errors caused by the nuclear data themselves were reduced with the introduction of the k0 method. A new graphical program called 'POWER NAA' was developed for the recent personal computer environment, Windows 95, and for data compatibility. It also reduced errors caused by operator mistakes thanks to easy and comfortable operation of the code. 11 refs., 3 tabs., 9 figs. (author)

  6. Bug-Fixing and Code-Writing: The Private Provision of Open Source Software

    DEFF Research Database (Denmark)

    Bitzer, Jürgen; Schröder, Philipp

    2002-01-01

    Open source software (OSS) is a public good. A self-interested individual would consider providing such software, if the benefits he gained from having it justified the cost of programming. Nevertheless each agent is tempted to free ride and wait for others to develop the software instead...

  7. Development of an application simulating radioactive sources

    International Nuclear Information System (INIS)

    Riffault, V.; Locoge, N.; Leblanc, E.; Vermeulen, M.

    2011-01-01

    This paper presents an application simulating radioactive gamma sources developed at the Ecole des Mines of Douai (France). It generates raw counting data as an XML file, which can then be statistically exploited to illustrate the various concepts of radioactivity (the exponential decay law, the isotropy of the radiation, the attenuation of radiation in matter). The application, together with a spreadsheet for data analysis and lab procedures, has been released under a free license. (authors)
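    The counting data such a simulator emits can be sketched as below (the actual Douai application and its XML schema are not reproduced; the rate, half-life, and binning are invented):

```python
import math
import numpy as np

def simulate_counts(rate0, half_life, dt, n_intervals, seed=0):
    """Sampled counts per acquisition interval for a decaying source:
    expected counts follow the exponential decay law, sampled with
    Poisson counting statistics."""
    rng = np.random.default_rng(seed)
    lam = math.log(2) / half_life
    edges = np.arange(n_intervals + 1) * dt
    # integral of rate0 * exp(-lam * t) over each interval
    expected = rate0 / lam * (np.exp(-lam * edges[:-1]) - np.exp(-lam * edges[1:]))
    return rng.poisson(expected)

# Hypothetical short-lived source: 500 counts/s, 60 s half-life, 10 s bins.
counts = simulate_counts(rate0=500.0, half_life=60.0, dt=10.0, n_intervals=24)
print(counts)
```

    Summing the bins over successive half-lives should roughly halve the total each time, which is exactly the behaviour a student exercise would fit to recover the half-life.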

  8. Development of computing code system for level 3 PSA

    International Nuclear Information System (INIS)

    Jeong, Jong Tae; Yu, Dong Han; Kim, Seung Hwan.

    1997-07-01

    Among the various research areas of level 3 PSA, the effect of terrain on the transport of radioactive material was investigated through wind tunnel experiments. These results will give physical insight for the development of a new dispersion model. Because there are some discrepancies between the results from the Gaussian plume model and those from field tests, the effect of terrain on atmospheric dispersion was investigated using the CTDMPLUS code. Through this study we find that a model which can treat terrain effects is essential for the atmospheric dispersion of radioactive materials and that the CTDMPLUS model can be used as a useful tool; it is also suggested that model modifications and experimental studies be pursued through continuous effort. A health effect assessment near the Yonggwang site using IPE (individual plant examination) results and site data was performed. The health effect assessment is an important part of the consequence analysis of a nuclear power plant site. MACCS was used in the assessment. Based on the calculation of the CCDF for each risk measure, it is shown that the CCDF has a shallow slope, and thus a wide probability distribution, for early fatality, early injury, total early fatality risk, and total weighted early fatality risk; for cancer fatality and population dose within 48 km and 80 km, the CCDF curves have a steep slope and thus a narrow probability distribution. Methodologies for the models needed for consequence analysis following a severe accident in a nuclear power plant were established and a program for consequence analysis was developed. The models include atmospheric transport and diffusion, calculation of exposure doses for various pathways, and assessment of health effects and associated risks. Finally, the economic impact resulting from an accident in a nuclear power plant was investigated. In this study, estimation models for each cost term considered in economic
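    The flat-terrain Gaussian plume model that the abstract contrasts with terrain-aware codes such as CTDMPLUS can be written down directly (sigma_y and sigma_z are passed in rather than derived from stability classes, to keep the sketch short; all numbers are illustrative):

```python
import math

# C(x,y,z) = Q/(2 pi u sy sz) * exp(-y^2/(2 sy^2))
#            * [exp(-(z-H)^2/(2 sz^2)) + exp(-(z+H)^2/(2 sz^2))]
# The second bracket term is the ground-reflection image source.

def gaussian_plume(Q, u, H, y, z, sigma_y, sigma_z):
    """Concentration (g/m^3) for source strength Q (g/s), wind speed u (m/s),
    effective stack height H (m), at crosswind offset y and height z (m)."""
    a = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    b = math.exp(-(z - H) ** 2 / (2 * sigma_z ** 2)) \
        + math.exp(-(z + H) ** 2 / (2 * sigma_z ** 2))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * a * b

# Ground-level centreline concentration downwind of a 50 m stack.
c = gaussian_plume(Q=100.0, u=5.0, H=50.0, y=0.0, z=0.0,
                   sigma_y=80.0, sigma_z=40.0)
print(f"{c:.3e} g/m^3")
```

    The model's flat-terrain assumption is precisely what breaks down in complex terrain, motivating the CTDMPLUS comparison above.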

  9. Development of computing code system for level 3 PSA

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Jong Tae; Yu, Dong Han; Kim, Seung Hwan

    1997-07-01

    Among the various research areas of level 3 PSA, the effect of terrain on the transport of radioactive material was investigated through wind tunnel experiments. These results will give physical insight for the development of a new dispersion model. Because there are some discrepancies between the results from the Gaussian plume model and those from field tests, the effect of terrain on atmospheric dispersion was investigated using the CTDMPLUS code. Through this study we find that a model which can treat terrain effects is essential for the atmospheric dispersion of radioactive materials and that the CTDMPLUS model can be used as a useful tool; it is also suggested that model modifications and experimental studies be pursued through continuous effort. A health effect assessment near the Yonggwang site using IPE (individual plant examination) results and site data was performed. The health effect assessment is an important part of the consequence analysis of a nuclear power plant site. MACCS was used in the assessment. Based on the calculation of the CCDF for each risk measure, it is shown that the CCDF has a shallow slope, and thus a wide probability distribution, for early fatality, early injury, total early fatality risk, and total weighted early fatality risk; for cancer fatality and population dose within 48 km and 80 km, the CCDF curves have a steep slope and thus a narrow probability distribution. Methodologies for the models needed for consequence analysis following a severe accident in a nuclear power plant were established and a program for consequence analysis was developed. The models include atmospheric transport and diffusion, calculation of exposure doses for various pathways, and assessment of health effects and associated risks. Finally, the economic impact resulting from an accident in a nuclear power plant was investigated. In this study, estimation models for each cost term considered in economic

  10. PCCS model development for SBWR using the CONTAIN code

    International Nuclear Information System (INIS)

    Tills, J.; Murata, K.K.; Washington, K.E.

    1994-01-01

    The General Electric Simplified Boiling Water Reactor (SBWR) employs a passive containment cooling system (PCCS) to maintain long-term containment gas pressure and temperature below design limits during accidents. This system consists of a steam supply line that connects the upper portion of the drywell with a vertical shell-and-tube single pass heat exchanger located in an open water pool outside of the containment safety envelope. The heat exchanger tube outlet is connected to a vent line that is submerged below the suppression pool surface but above the main suppression pool horizontal vents. Steam generated in the post-shutdown period flows into the heat exchanger tubes as the result of suction and/or a low pressure differential between the drywell and suppression chamber. Operation of the PCCS is complicated by the presence of noncondensables in the flow stream. Build-up of noncondensables in the exchanger and vent line for the periods when the vent is not cleared causes a reduction in the exchanger heat removal capacity. As flow to the exchanger is reduced due to the noncondensable gas build-up, the drywell pressure increases until the vent line is cleared and the noncondensables are purged into the suppression chamber, restoring the heat removal capability of the PCCS. This paper reports on progress made in modeling SBWR containment loads using the CONTAIN code. As a central part of this effort, a PCCS model development effort has recently been undertaken to implement an appropriate model in CONTAIN. The CONTAIN PCCS modeling approach is discussed and validated. A full SBWR containment input deck has also been developed for CONTAIN. The plant response to a postulated design basis accident (DBA) has been calculated with the CONTAIN PCCS model and plant deck, and the preliminary results are discussed

  11. Recent developments in seismic analysis in the code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Guihot, P.; Devesa, G.; Dumond, A.; Panet, M.; Waeckel, F.

    1996-12-31

    Progress made these last few years in seismic qualification and design methods allows the physical phenomena actually at play to be better taken into account, while cutting down the conservatism associated with some simplified design methods. Following the evolution of these methods and developing the most advantageous among them thus contributes to the assessment of seismic margins and to the preparation of new design tools for future series. In this paper, the main developments and improvements in methods made over the last two years in Code Aster to improve seismic calculation methods and seismic margin assessment are presented. The first development concerns making the MISS3D soil-structure interaction code available through an interface with Code Aster. The second concerns the possibility of performing modal-basis time-domain calculations on multi-supported structures while considering local nonlinearities such as impact, friction, or squeeze-film fluid forces. Recent developments in random dynamics and in post-processing devoted to earthquake design are then mentioned. Three applications of these developments are then put forward. The first application is a test case for soil-structure interaction design using the MISS3D-Aster coupling. The second is a test case for a multi-supported structure. The last application, of a more industrial nature, concerns the seismic qualification of main live steam stop valves. First results of the independent validation of the Code Aster seismic design functionalities, which ensure and improve the quality of the software, are also recalled. (authors). 11 refs.

  12. OpenSWPC: an open-source integrated parallel simulation code for modeling seismic wave propagation in 3D heterogeneous viscoelastic media

    Science.gov (United States)

    Maeda, Takuto; Takemura, Shunsuke; Furumura, Takashi

    2017-07-01

    We have developed an open-source software package, Open-source Seismic Wave Propagation Code (OpenSWPC), for parallel numerical simulations of seismic wave propagation in 3D and 2D (P-SV and SH) viscoelastic media based on the finite difference method at local-to-regional scales. This code is equipped with a frequency-independent attenuation model based on the generalized Zener body and an efficient perfectly matched layer absorbing boundary condition. A hybrid-style programming model using OpenMP and the Message Passing Interface (MPI) is adopted for efficient parallel computation. OpenSWPC has wide applicability for seismological studies and great portability, allowing excellent performance from PC clusters to supercomputers. Without modifying the code, users can conduct seismic wave propagation simulations using their own velocity structure models and the necessary source representations by specifying them in an input parameter file. The code has various modes for different types of velocity structure model input and different source representations, such as single force, moment tensor, and plane-wave incidence, which can easily be selected via the input parameters. Widely used binary data formats, the Network Common Data Form (NetCDF) and the Seismic Analysis Code (SAC) format, are adopted for the input of the heterogeneous structure model and the outputs of the simulation results, so users can easily handle the input/output datasets. All codes are written in Fortran 2003 and are available with detailed documentation in a public repository.
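    A minimal 1-D relative of the finite-difference approach (no attenuation, PML, or parallelism, and not OpenSWPC code) illustrates the leapfrog update for the scalar wave equation u_tt = c^2 u_xx; all parameter values below are illustrative:

```python
import numpy as np

def propagate(c=1.0, L=1.0, nx=201, t_end=0.25):
    """Second-order leapfrog update of the 1-D scalar wave equation,
    starting from a Gaussian displacement with zero initial velocity."""
    dx = L / (nx - 1)
    dt = 0.5 * dx / c                 # CFL number 0.5, stable for leapfrog
    C2 = (c * dt / dx) ** 2
    x = np.linspace(0.0, L, nx)
    u_prev = np.exp(-((x - 0.5) / 0.05) ** 2)   # initial Gaussian pulse

    def lap(v):
        out = np.zeros_like(v)
        out[1:-1] = v[2:] - 2.0 * v[1:-1] + v[:-2]
        return out

    u = u_prev + 0.5 * C2 * lap(u_prev)         # special first step (v0 = 0)
    for _ in range(int(round(t_end / dt)) - 1):
        u, u_prev = 2.0 * u - u_prev + C2 * lap(u), u
    return x, u

# The pulse splits into two halves travelling at +/- c; at t = 0.25 their
# crests sit near x = 0.25 and x = 0.75 with roughly half the amplitude.
x, u = propagate()
```

    The full 3-D viscoelastic scheme adds memory variables for the Zener attenuation model and exchanges halo layers between MPI ranks, but the update stencil follows this same pattern.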

  13. SFACTOR: a computer code for calculating dose equivalent to a target organ per microcurie-day residence of a radionuclide in a source organ

    International Nuclear Information System (INIS)

    Dunning, D.E. Jr.; Pleasant, J.C.; Killough, G.G.

    1977-11-01

    A computer code SFACTOR was developed to estimate the average dose equivalent S (rem/μCi-day) to each of a specified list of target organs per microcurie-day residence of a radionuclide in source organs in man. Source and target organs of interest are specified in the input data stream, along with the nuclear decay information. The SFACTOR code computes components of the dose equivalent rate from each type of decay present for a particular radionuclide, including alpha, electron, and gamma radiation. For those transuranic isotopes which also decay by spontaneous fission, components of S from the resulting fission fragments, neutrons, betas, and gammas are included in the tabulation. Tabulations of all components of S are provided for an array of 22 source organs and 24 target organs for 52 radionuclides in an adult
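    The tabulated quantity has the standard MIRD form S = k * sum_i(y_i E_i AF_i Q_i) / m_target; a hedged sketch follows, in which the emission data are invented for illustration and not taken from the report's tables:

```python
# k = 2.13 rad*g/(uCi*h*MeV) is the classic MIRD conversion constant,
# i.e. 2.13 * 24 per uCi-day; quality factors Q_i turn rad into rem.
K_PER_UCI_DAY = 2.13 * 24.0

def s_factor(emissions, m_target_g):
    """emissions: iterable of (yield_per_decay, E_MeV, absorbed_fraction, Q);
    returns S in rem per uCi-day residence in the source organ."""
    return K_PER_UCI_DAY * sum(y * E * af * q
                               for y, E, af, q in emissions) / m_target_g

# Invented example: one beta (mean 0.3 MeV, fully absorbed, Q=1) and one
# gamma (0.6 MeV, 30% absorbed fraction, Q=1) per decay, 310 g target organ.
s = s_factor([(1.0, 0.3, 1.0, 1.0), (0.9, 0.6, 0.3, 1.0)], 310.0)
print(f"S = {s:.4f} rem per uCi-day")
```

    The code described above additionally folds in spontaneous-fission fragments and neutrons for transuranics; those enter as further terms of the same sum.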

  14. SFACTOR: a computer code for calculating dose equivalent to a target organ per microcurie-day residence of a radionuclide in a source organ

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, D.E. Jr.; Pleasant, J.C.; Killough, G.G.

    1977-11-01

    A computer code SFACTOR was developed to estimate the average dose equivalent S (rem/μCi-day) to each of a specified list of target organs per microcurie-day residence of a radionuclide in source organs in man. Source and target organs of interest are specified in the input data stream, along with the nuclear decay information. The SFACTOR code computes components of the dose equivalent rate from each type of decay present for a particular radionuclide, including alpha, electron, and gamma radiation. For those transuranic isotopes which also decay by spontaneous fission, components of S from the resulting fission fragments, neutrons, betas, and gammas are included in the tabulation. Tabulations of all components of S are provided for an array of 22 source organs and 24 target organs for 52 radionuclides in an adult.

  15. Developing and modifying behavioral coding schemes in pediatric psychology: a practical guide.

    Science.gov (United States)

    Chorney, Jill MacLaren; McMurtry, C Meghan; Chambers, Christine T; Bakeman, Roger

    2015-01-01

    To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. © The Author 2014. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Development of severe accident analysis code - Development of a finite element code for lower head failure analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Hoon; Lee, Choong Ho; Choi, Tae Hoon; Kim, Hyun Sup; Kim, Se Ho; Kang, Woo Jong; Seo, Chong Kwan [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-08-01

    The study concerns the development of analysis models and computer codes for lower head failure analysis when a severe accident occurs in a nuclear reactor system. Although the lower head has several failure modes, this year's study focused on global rupture, with the collapse pressure and mode obtained by limit analysis and elastic deformation analysis. The behavior of molten core causes elevation of temperature in the reactor vessel wall and deterioration of the load-carrying capacity of the reactor vessel. The behavior of molten core and the heat transfer modes were, therefore, postulated in several types, and the temperature distributions corresponding to the assumed heat flux modes were calculated. The collapse pressure of a nuclear reactor lower head decreases rapidly with elevation of temperature as time passes. The calculation shows that the safety of a nuclear reactor is enhanced, with a larger collapse pressure, when the hot spot is located far from the pole. 42 refs., 2 tabs., 31 figs. (author)

  17. An Assessment of Some Design Constraints on Heat Production of a 3D Conceptual EGS Model Using an Open-Source Geothermal Reservoir Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Yidong Xia; Mitch Plummer; Robert Podgorney; Ahmad Ghassemi

    2016-02-01

    Performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km of depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that 1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, 2) the downward deviation angle of the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and 3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite element based, fully implicit, fully coupled hydrothermal code, FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercial, this new open-source code demonstrates a code development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.

  18. Development of Teaching Materials for a Physical Chemistry Experiment Using the QR Code

    OpenAIRE

    吉村, 忠与志

    2008-01-01

    The development of teaching materials with the QR code was attempted in an educational environment using a mobile telephone. The QR code is not sufficiently utilized in education, and the current study is one of the first in the field. The content of a QR code is not directly human-readable; it can, however, be deciphered by mobile telephones, thus enabling the expression of text in a small space. Contents of "Physical Chemistry Experiment" which are available on the Internet are briefly summarized and simplified. T...

  19. Development of the vacuum system pressure response analysis code PRAC

    International Nuclear Information System (INIS)

    Horie, Tomoyoshi; Kawasaki, Kouzou; Noshiroya, Shyoji; Koizumi, Jun-ichi.

    1985-03-01

    In this report, we describe the method and numerical results of the vacuum system pressure response analysis code. Since a fusion apparatus is made up of many vacuum components, it is necessary to analyze the pressure response at any point of the system when the vacuum system is designed or evaluated. For that purpose, evaluation by theoretical solutions is insufficient; a numerical analysis procedure such as the finite difference method is useful. In the PRAC code (Pressure Response Analysis Code), the pressure response is obtained by solving differential equations which are derived from the equilibrium of throughputs and contain the time derivative of pressure. Because the code considers both molecular and viscous flows, the coefficients of the equations depend on the pressure and the equations become non-linear. This non-linearity is treated as piecewise linear within each time step. Verification of the code is performed for simple problems, and the agreement between numerical and theoretical solutions is good. For comparison with measured results, a complicated model of a gas puffing system is analyzed; the agreement is good enough for practical use. This code will be a useful analytical tool for designing and evaluating vacuum systems such as fusion apparatus. (author)
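The piecewise-linear treatment described in the abstract can be illustrated with a single-volume pump-down model. This is a hedged sketch (the conductance model and all names are illustrative, not PRAC's): within each time step the pressure-dependent conductance is frozen at its start-of-step value, so each step reduces to a linear ODE with an exact exponential solution.

```python
import math

def conductance(p, c_mol=1.0, c_vis=0.5):
    """Illustrative total conductance [m^3/s]: a constant molecular-flow
    term plus a viscous-flow term proportional to pressure."""
    return c_mol + c_vis * p

def pressure_response(p0, p_pump, volume, q_leak, dt, steps):
    """Integrate V dp/dt = q_leak - C(p) * (p - p_pump).

    C(p) is frozen at the start of each step (piecewise-linear), so each
    step has the exact solution p -> p_inf + (p - p_inf) * exp(-C dt / V).
    """
    p = p0
    history = [p]
    for _ in range(steps):
        c = conductance(p)
        p_inf = p_pump + q_leak / c      # equilibrium of the linearized step
        p = p_inf + (p - p_inf) * math.exp(-c * dt / volume)
        history.append(p)
    return history

# Pump-down of a 10 m^3 volume from 1 Pa with no leak:
h = pressure_response(1.0, 0.0, 10.0, 0.0, dt=0.1, steps=1000)
```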

  20. Development of ECR ion source for VEC

    Energy Technology Data Exchange (ETDEWEB)

    Bose, D K; Taki, G S; Nabhiraj, P Y; Pal, G; Mallik, C; Bhandari, R K [Variable Energy Cyclotron Centre, Calcutta (India)

    1997-12-01

    A 6.4 GHz Electron Cyclotron Resonance Ion Source (ECRIS) was developed at the VEC centre to enable acceleration of heavy ions with the K=130 Variable Energy Cyclotron (VEC). Heavy ions, which will be sufficiently energetic after acceleration in the cyclotron, will be utilised to explore new fields of research. The VEC ECRIS was first made operational in April 1991. Initially, the stability and intensity of the high charge state (z) beams were poor. Constant efforts were made to improve the source performance. Finally, by going to high-field operation, which improves the plasma confinement, the desired stability and high output current were achieved. At present a stable {sup 16}O beam of up to 50 e{mu}A is available from the VEC ECRIS. Many other high-z ion beams of gaseous species are also available. (author) 16 refs., 14 figs., 2 tabs.

  1. Development of ECR ion source for VEC

    International Nuclear Information System (INIS)

    Bose, D.K.; Taki, G.S.; Nabhiraj, P.Y.; Pal, G.; Mallik, C.; Bhandari, R.K.

    1997-01-01

    A 6.4 GHz Electron Cyclotron Resonance Ion Source (ECRIS) was developed at the VEC centre to enable acceleration of heavy ions with the K=130 Variable Energy Cyclotron (VEC). Heavy ions, which will be sufficiently energetic after acceleration in the cyclotron, will be utilised to explore new fields of research. The VEC ECRIS was first made operational in April 1991. Initially, the stability and intensity of the high charge state (z) beams were poor. Constant efforts were made to improve the source performance. Finally, by going to high-field operation, which improves the plasma confinement, the desired stability and high output current were achieved. At present a stable 16O beam of up to 50 eμA is available from the VEC ECRIS. Many other high-z ion beams of gaseous species are also available. (author)

  2. Polarized H- source development at BNL

    International Nuclear Information System (INIS)

    Alessi, J.G.; Hershcovitch, A.; Kponou, A.; Niinikoski, T.; Sluyters, T.

    1986-01-01

    The AGS polarized H- source (PONI-1) now produces currents of 25-40 μA, and has operated reliably during polarized physics runs. A new polarized source, having as its goal mA's of polarized H-, is now under development. An atomic hydrogen beam has been cooled to about 20 K with a forward flux of approximately 10^19 atoms/s/sr. A superconducting solenoid having a calculated acceptance angle of 0.1 sr for the cold H0 beam is now being built. An ionizer for the resulting polarized H0 beam, based on resonant charge exchange of H0 with D-, is being tested. 500 μA of H- have been produced by ionizing an unpolarized H0 beam using this ionizer.

  3. Comparison of TG‐43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes

    Science.gov (United States)

    Zaker, Neda; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S.

    2016-01-01

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross‐sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross‐sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in  125I and  103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code — MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low‐energy sources such as  125I and  103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for  103Pd and 10 cm for  125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for  192Ir and less than 1.2% for  137Cs between the three codes. PACS number(s): 87.56.bg PMID:27074460

  4. Comparison of TG-43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes.

    Science.gov (United States)

    Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S

    2016-03-08

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes.

  5. Development of Compton gamma-ray sources at LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Albert, F.; Anderson, S. G.; Ebbers, C. A.; Gibson, D. J.; Hartemann, F. V.; Marsh, R. A.; Messerly, M. J.; Prantil, M. A.; Wu, S.; Barty, C. P. J. [Lawrence Livermore National Laboratory, NIF and Photon Science, 7000 East avenue, Livermore, CA 94550 (United States)

    2012-12-21

    Compact Compton scattering gamma-ray sources offer the potential of studying nuclear photonics with new tools. The optimization of such sources depends on the final application, but generally requires maximizing the spectral density (photons/eV) of the gamma-ray beam while simultaneously reducing the overall bandwidth on target to minimize noise. We have developed an advanced design for one such system, comprising the RF drive, photoinjector, accelerator, and electron-generating and electron-scattering laser systems. This system uses a 120 Hz, 250 pC, 2 ps, 0.35 mm mrad electron beam with 250 MeV maximum energy in an X-band accelerator scattering off a 150 mJ, 10 ps, 532 nm laser to generate 5 × 10{sup 10} photons/eV/s/Sr at 0.5 MeV with an overall bandwidth of less than 1%. The source will be able to produce photons up to energies of 2.5 MeV. We also discuss Compton scattering gamma-ray source predictions given by numerical codes.
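As a sanity check on the quoted parameters, the standard relativistic Compton back-scattering formula can be evaluated; this is a hedged sketch using generic textbook physics (the function and constant names are illustrative, not the laboratory's design codes):

```python
M_E_EV = 0.511e6  # electron rest energy [eV]

def compton_peak_energy(e_electron_ev, laser_wavelength_nm):
    """Maximum scattered photon energy for head-on Compton scattering,
    E_max = 4 * gamma^2 * E_L / (1 + 4 * gamma * E_L / (m_e c^2))."""
    gamma = e_electron_ev / M_E_EV
    e_laser = 1239.84 / laser_wavelength_nm  # photon energy [eV]
    return 4 * gamma**2 * e_laser / (1 + 4 * gamma * e_laser / M_E_EV)

# 250 MeV electrons on a 532 nm laser give roughly 2.2 MeV, the same
# order as the 2.5 MeV maximum quoted in the abstract:
e_max = compton_peak_energy(250e6, 532.0)
```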

  6. SWAAM-code development and verification and application to steam generator designs

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes which were developed by Argonne National Laboratory to analyze the effects of sodium-water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The paper discusses the theoretical foundations and numerical treatments on which the codes are based, followed by a description of code capabilities and limitations, verification of the codes and applications to steam generator and IHTS designs. 25 refs., 14 figs

  7. The development of the code package PERMAK-3D//SC-1

    International Nuclear Information System (INIS)

    Bolobov, P. A.; Oleksuk, D. A.

    2011-01-01

    The code package PERMAK-3D//SC-1 was developed for performing pin-by-pin coupled neutronic and thermal hydraulic calculations of a core fragment of seven fuel assemblies. It was designed on the basis of the 3D multigroup pin-by-pin code PERMAK-3D and the 3D subchannel thermal hydraulic code SC-1. The code package predicts the axial and radial pin-by-pin power distribution and the coolant parameters in the simulated region (enthalpies, velocities, void fractions, boiling and DNBR margins). The report describes some new steps in the code package development. Some PERMAK-3D//SC-1 results of WWER calculations are presented in the report. (Authors)

  8. Development of FEMAG. Calculation code of magnetic field generated by ferritic plates in the tokamak devices

    Energy Technology Data Exchange (ETDEWEB)

    Urata, Kazuhiro [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2003-03-01

    In the design of future fusion devices in which low-activation ferritic steel is planned to be used as the plasma facing material and/or as inserts for ripple reduction, the assessment of the error field effect on the plasma, as well as the optimization of the ferritic plate arrangement to reduce the toroidal field ripple, requires calculation of the magnetic field generated by the ferritic steel. However, the iterative calculations needed to handle the non-linearity of the B-H curve of ferritic steel preclude the high-speed calculation required of a design tool. In the strong toroidal magnetic field that is characteristic of tokamak fusion devices, the ferritic steel is fully magnetically saturated. Hence the distribution of magnetic charges acting as the magnetic field source is determined straightforwardly, and no iterative calculations are necessary. Additionally, the ferritic steel geometry considered is limited to thin plates, and the ferritic plates are installed along the toroidal magnetic field. Taking these special conditions into account, the high-speed calculation code 'FEMAG' has been developed. In this report, the formulation of the 'FEMAG' code, how to use 'FEMAG', and the validity check of 'FEMAG' in comparison with a 3D FEM code and with measurements of the magnetic field in JFT-2M are described. The presented examples are numerical results of design studies for the JT-60 modification. (author)
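The magnetic-charge idea in the abstract can be made concrete with a direct panel sum. This is a hedged sketch under simplifying assumptions (point-like charge panels with a surface charge density fixed by saturation); the names and the discretization are illustrative, not FEMAG's formulation:

```python
import math

def stray_field(point, panels):
    """H at `point` from saturated plate faces, each discretized into
    panels of (center_xyz, area, sigma):
    H(r) = (1 / 4 pi) * sum_i sigma_i * A_i * (r - r_i) / |r - r_i|^3.
    No B-H iteration is needed because sigma is fixed by full saturation.
    """
    hx = hy = hz = 0.0
    for (cx, cy, cz), area, sigma in panels:
        dx, dy, dz = point[0] - cx, point[1] - cy, point[2] - cz
        r3 = (dx * dx + dy * dy + dz * dz) ** 1.5
        k = sigma * area / (4.0 * math.pi * r3)
        hx += k * dx
        hy += k * dy
        hz += k * dz
    return hx, hy, hz

# A single unit-charge panel at the origin, observed on the +z axis:
h = stray_field((0.0, 0.0, 2.0), [((0.0, 0.0, 0.0), 1.0, 1.0)])
```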

  9. AREVA Developments for an Efficient and Reliable use of Monte Carlo codes for Radiation Transport Applications

    Science.gov (United States)

    Chapoutier, Nicolas; Mollier, François; Nolin, Guillaume; Culioli, Matthieu; Mace, Jean-Reynald

    2017-09-01

    In the context of the rise of Monte Carlo transport calculations for all kinds of applications, AREVA recently improved its suite of engineering tools in order to produce an efficient Monte Carlo workflow. Monte Carlo codes, such as MCNP or TRIPOLI, are recognized as reference codes for a large range of radiation transport problems. However, the inherent drawbacks of these codes - laborious input file creation and long computation times - contrast with the maturity of their treatment of the physical phenomena. The goal of the recent AREVA developments was to reach an efficiency similar to that of other mature engineering sciences such as finite element analysis (e.g. structural or fluid dynamics). Among the main objectives, the creation of a graphical user interface offering CAD tools for geometry creation and other graphical features dedicated to the radiation field (source definition, tally definition) has been reached. Computation times are drastically reduced compared to a few years ago thanks to the use of massively parallel runs and, above all, the implementation of hybrid variance reduction techniques. Engineering teams are now able to deliver much more prompt support to any nuclear project dealing with reactors or fuel cycle facilities, from the conceptual phase to decommissioning.

  10. AREVA Developments for an Efficient and Reliable use of Monte Carlo codes for Radiation Transport Applications

    Directory of Open Access Journals (Sweden)

    Chapoutier Nicolas

    2017-01-01

    Full Text Available In the context of the rise of Monte Carlo transport calculations for all kinds of applications, AREVA recently improved its suite of engineering tools in order to produce an efficient Monte Carlo workflow. Monte Carlo codes, such as MCNP or TRIPOLI, are recognized as reference codes for a large range of radiation transport problems. However, the inherent drawbacks of these codes - laborious input file creation and long computation times - contrast with the maturity of their treatment of the physical phenomena. The goal of the recent AREVA developments was to reach an efficiency similar to that of other mature engineering sciences such as finite element analysis (e.g. structural or fluid dynamics). Among the main objectives, the creation of a graphical user interface offering CAD tools for geometry creation and other graphical features dedicated to the radiation field (source definition, tally definition) has been reached. Computation times are drastically reduced compared to a few years ago thanks to the use of massively parallel runs and, above all, the implementation of hybrid variance reduction techniques. Engineering teams are now able to deliver much more prompt support to any nuclear project dealing with reactors or fuel cycle facilities, from the conceptual phase to decommissioning.

  11. Development and verification of a coupled code system RETRAN-MASTER-TORC

    International Nuclear Information System (INIS)

    Cho, J.Y.; Song, J.S.; Joo, H.G.; Zee, S.Q.

    2004-01-01

    Recently, coupled thermal-hydraulics (T-H) and three-dimensional kinetics codes have been widely used for best-estimate simulations such as the main steam line break (MSLB) and locked rotor problems. This work develops and verifies one such code by coupling the system T-H code RETRAN, the 3-D kinetics code MASTER and the sub-channel analysis code TORC. The MASTER code has already been applied to such simulations after coupling with the MARS or RETRAN-3D multi-dimensional system T-H codes. The MASTER code contains a sub-channel analysis code, COBRA-III C/P, and the coupled systems MARS-MASTER-COBRA and RETRAN-MASTER-COBRA have already been developed and verified. Building on these previous studies, the new coupled system RETRAN-MASTER-TORC is developed and verified as the standard best-estimate simulation code package in Korea. The TORC code has already been applied to the thermal hydraulics design of several ABB/CE type plants and the Korean Standard Nuclear Power Plants (KSNP); this justifies the choice of TORC rather than COBRA. Because the coupling between the RETRAN and MASTER codes is already established and verified, this work reduces to coupling the TORC sub-channel T-H code with the MASTER neutronics code. TORC is a standalone code that solves the T-H equations for a given core problem, reading an input file and finally printing the converged solutions. In the coupled system, however, TORC receives the pin power distributions from the neutronics code MASTER and transfers the T-H results back to MASTER iteratively; TORC therefore needs to be controlled by the MASTER code and does not need to solve the given problem completely at each iteration step. For this reason, the coupling of the TORC code with the MASTER code requires several modifications in the I/O treatment, flow iteration and calculation logic. The next section of this paper describes the modifications in the TORC code; the TORC control logic of the MASTER code then follows.
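The iterative exchange this abstract describes (the neutronics code hands pin powers to the T-H code, which returns feedback) is essentially a fixed-point loop. Below is a hedged sketch with toy scalar stand-ins for the two solvers; all names, the toy physics, and the convergence test are illustrative, not the actual code package logic:

```python
def couple(neutronics_solve, th_solve, power0, tol=1e-6, max_iter=50):
    """Picard iteration between a neutronics solver and a T-H solver:
    power -> T-H feedback -> updated power, until the relative change
    in power falls below `tol`."""
    power = power0
    for it in range(1, max_iter + 1):
        feedback = th_solve(power)              # stand-in for the T-H code
        new_power = neutronics_solve(feedback)  # stand-in for the kinetics code
        if abs(new_power - power) <= tol * abs(power):
            return new_power, it
        power = new_power
    raise RuntimeError("coupling iteration did not converge")

# Toy solvers: fuel temperature rises with power; power falls with temperature.
th = lambda p: 300.0 + 0.1 * p
neutronics = lambda t: 1000.0 - 0.5 * (t - 300.0)
power, iters = couple(neutronics, th, power0=800.0)
```

Because the composed map here is a strong contraction (slope 0.05), the loop converges in a handful of iterations; real coupled codes face the same structure with far weaker contraction and added relaxation.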

  12. Renewable energy sources for tenable development

    International Nuclear Information System (INIS)

    Manazza, G.

    1992-01-01

    Planning criteria for feasible tenable development strategies for industrialized and developing countries are discussed. Attention is given to the role to be played by industrial countries in renewable energy source development and in technology transfer to curb the onslaught of environmental problems related to the global greenhouse effect. The paper cautions against the use of the expression 'tenable' in combination with 'growth'. It recommends, instead, replacing the expression 'tenable growth', which implies the indefinite growth of something physical, with 'tenable development', a preferred term, since it denotes the realization of an optimum strategy, compatible with environmental ecosystems, for the betterment of living conditions. An assessment is made of the overall socio-economic impacts of such a strategy on the proposed European free trade market and on developing countries struggling to survive in a fiercely competitive world. Here, the paper notes that, for the effective implementation of a tenable development strategy, it is of prime importance to make optimum use of the education system to instil a new set of social values and modify individual behaviour relative to the development and use of natural resources.

  13. Cold source vessel development for the advanced neutron source

    Energy Technology Data Exchange (ETDEWEB)

    Williams, P.T.; Lucas, A.T. [Oak Ridge National Lab., TN (United States)

    1995-09-01

    The Advanced Neutron Source (ANS), in its conceptual design phase at Oak Ridge National Laboratory (ORNL), will be a user-oriented neutron research facility that will produce the most intense flux of neutrons in the world. Among its many scientific applications, the production of cold neutrons is a significant research mission for the ANS. The cold neutrons come from two independent cold sources positioned near the reactor core. Contained by an aluminum alloy vessel, each cold source is a 410 mm diameter sphere of liquid deuterium that functions both as a neutron moderator and as a cryogenic coolant. With nuclear heating of the containment vessel and internal baffling, steady-state operation requires close control of the liquid deuterium flow near the vessel's inner surface. Preliminary thermal-hydraulic analyses supporting the cold source design are being performed with multi-dimensional computational fluid dynamics simulations of the liquid deuterium flow and heat transfer. This paper presents the starting phase of a challenging program and describes the cold source conceptual design, the thermal-hydraulic feasibility studies of the containment vessel, and the future computational and experimental studies that will be used to verify the final design.

  14. Development of solid radioactive sources in acrylamide

    International Nuclear Information System (INIS)

    Yamazaki, I.M.; Koskinas, M.F.; Dias, M.S.; Andrade e Silva, L.G.; Vieira, J.M.

    2004-01-01

    The development of water-equivalent solid sources of 133Ba, prepared from an aqueous solution of acrylamide polymerized by high-dose 60Co irradiation, is described. The main resin characteristics were measured: density, effective atomic number and uniformity. These parameters were in the range of 1.08 to 1.16 g.cm-3 for density, 3.7 to 4.0 for effective atomic number and 2.8 to 7.2% for uniformity. These values are in agreement with the literature. (author)

  15. Large-Signal Code TESLA: Current Status and Recent Development

    National Research Council Canada - National Science Library

    Chernyavskiy, Igor A; Vlasov, Alexander N; Cooke, Simon J; Abe, David K; Levush, Baruch; Antonsen, Jr., Thomas M; Nguyen, Khanh T

    2008-01-01

    .... One such tool is the large-signal code TESLA, which was successfully applied for the modeling of single-beam and multiple-beam klystron devices at the Naval Research Laboratory and which is now used by a number of U.S. companies...

  16. Development of irradiator 60Co sources

    International Nuclear Information System (INIS)

    Mosca, Rodrigo C.; Moura, Eduardo S.; Zeituni, Carlos A.; Mathor, Monica B.

    2011-01-01

    According to a recent report by the International Agency for Research on Cancer (IARC)/WHO (2008-2010), the global impact of cancer more than doubled in 30 years. In this report, it was estimated that about 12 million new cancer cases and 7 million deaths occurred. For Brazil, estimates made in 2010 for the year 2011 point to the occurrence of 489,270 new cases of cancer. Among the possibilities for cancer treatment, radiotherapy is one of the most important therapeutic resources used to combat it. However, complications inherent to the treatment can occur, such as tiredness, loss of appetite, radiodermatitis and, in more extreme cases, late radionecrosis. In order to reproduce a point of radionecrosis in the vicinity of radiodermatitis, mimicking these effects in animals and producing a model for the assessment of tissue repair, we propose the construction of an irradiator with collimated 60 Co sources. The development was based on 11 sources of 60 Co, 1 mm thick, that were inserted by interference fit in a stainless steel 'gate-source' screw (patent pending) and later adjusted in a reinforced cross-shaped arrangement so that the radiation beam is directed to a target point, sparing the other regions around this target point. The main use of this irradiator with 60 Co sources is precisely to cause a point of radionecrosis (target point) of approximately 5 mm 2 , with a surrounding and adjacent area of radiodermatitis of about 8 to 10 mm 2 , in laboratory animals, for subsequent coating with an epidermal-dermal matrix populated by a cell culture of human fibroblasts, keratinocytes and mesenchymal stem cells. Its use will thus be valuable for the evaluation of curative, rather than merely palliative, treatments of radionecrosis. (author)

  17. Development of irradiator {sup 60}Co sources

    Energy Technology Data Exchange (ETDEWEB)

    Mosca, Rodrigo C.; Moura, Eduardo S.; Zeituni, Carlos A.; Mathor, Monica B., E-mail: rcmosca@usp.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    According to a recent report by the International Agency for Research on Cancer (IARC)/WHO (2008-2010), the global impact of cancer more than doubled in 30 years. In this report, it was estimated that about 12 million new cancer cases and 7 million deaths occurred. For Brazil, estimates made in 2010 for the year 2011 point to the occurrence of 489,270 new cases of cancer. Among the possibilities for cancer treatment, radiotherapy is one of the most important therapeutic resources used to combat it. However, complications inherent to the treatment can occur, such as tiredness, loss of appetite, radiodermatitis and, in more extreme cases, late radionecrosis. In order to reproduce a point of radionecrosis in the vicinity of radiodermatitis, mimicking these effects in animals and producing a model for the assessment of tissue repair, we propose the construction of an irradiator with collimated {sup 60}Co sources. The development was based on 11 sources of {sup 60}Co, 1 mm thick, that were inserted by interference fit in a stainless steel 'gate-source' screw (patent pending) and later adjusted in a reinforced cross-shaped arrangement so that the radiation beam is directed to a target point, sparing the other regions around this target point. The main use of this irradiator with {sup 60}Co sources is precisely to cause a point of radionecrosis (target point) of approximately 5 mm{sup 2}, with a surrounding and adjacent area of radiodermatitis of about 8 to 10 mm{sup 2}, in laboratory animals, for subsequent coating with an epidermal-dermal matrix populated by a cell culture of human fibroblasts, keratinocytes and mesenchymal stem cells. Its use will thus be valuable for the evaluation of curative, rather than merely palliative, treatments of radionecrosis. (author)

  18. Development of a large proton accelerator for innovative researches; development of high power RF source

    Energy Technology Data Exchange (ETDEWEB)

    Chung, K. H.; Lee, K. O.; Shin, H. M.; Chung, I. Y. [KAPRA, Seoul (Korea); Kim, D. I. [Inha University, Incheon (Korea); Noh, S. J. [Dankook University, Seoul (Korea); Ko, S. K. [Ulsan University, Ulsan (Korea); Lee, H. J. [Cheju National University, Cheju (Korea); Choi, W. H. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2002-05-01

    This study was performed with the objective of designing and developing the KOMAC proton accelerator RF system. For the development of the high power RF source for the CCDTL (coupled cavity drift tube linac), a medium power RF system using a UHF klystron for broadcasting was integrated, and with this RF system we obtained basic design data, operating experience and code-validity test data. Based on the medium power RF system experimental data, the high power RF system for the CCDTL was designed and its performance was analyzed. 16 refs., 64 figs., 27 tabs. (Author)

  19. Advanced Neutron Source Dynamic Model (ANSDM) code description and user guide

    International Nuclear Information System (INIS)

    March-Leuba, J.

    1995-08-01

    A mathematical model is designed that simulates the dynamic behavior of the Advanced Neutron Source (ANS) reactor. Its main objective is to model important characteristics of the ANS systems as they are being designed, updated, and employed; its primary design goal is to aid in the development of safety and control features. During the simulations the model is also found to aid in making design decisions for thermal-hydraulic systems. Model components, empirical correlations, and model parameters are discussed; sample procedures are also given. Modifications are cited, and significant development and application efforts are noted, focusing on the examination of instrumentation required during and after accidents to ensure adequate monitoring during transient conditions.

  20. Source coherence impairments in a direct detection direct sequence optical code-division multiple-access system.

    Science.gov (United States)

    Fsaifes, Ihsan; Lepers, Catherine; Lourdiane, Mounia; Gallion, Philippe; Beugin, Vincent; Guignard, Philippe

    2007-02-01

    We demonstrate that direct sequence optical code-division multiple-access (DS-OCDMA) encoders and decoders using sampled fiber Bragg gratings (S-FBGs) behave as multipath interferometers. In that case, chip pulses of the prime sequence codes, generated by spreading time-coherent data pulses, can result from multiple reflections in the interferometers that superimpose within a chip time duration. We show that the autocorrelation function has to be considered as the sum of the complex amplitudes of the combined chips when the laser source coherence time is much greater than the integration time of the photodetector. To reduce the sensitivity of the DS-OCDMA system to the coherence time of the laser source, we analyze the use of sparse and nonperiodic quadratic congruence and extended quadratic congruence codes.

  2. Gaze strategies can reveal the impact of source code features on the cognitive load of novice programmers

    DEFF Research Database (Denmark)

    Wulff-Jensen, Andreas; Ruder, Kevin Vignola; Triantafyllou, Evangelia

    2018-01-01

    As shown by several studies, the readability of source code for programmers is influenced by its structural and textual features. In order to assess the importance of these features, we conducted an eye-tracking experiment with programming students. To assess the readability and comprehensibility of...

  3. Use of WIMS-E lattice code for prediction of the transuranic source term for spent fuel dose estimation

    International Nuclear Information System (INIS)

    Schwinkendorf, K.N.

    1996-01-01

    A recent source term analysis has shown a discrepancy between ORIGEN2 transuranic isotopic production estimates and those produced with the WIMS-E lattice physics code. Excellent agreement between relevant experimental measurements and WIMS-E was shown, thus exposing an error in the cross-section library used by ORIGEN2.

  4. Application of software quality assurance to a specific scientific code development task

    International Nuclear Information System (INIS)

    Dronkers, J.J.

    1986-03-01

    This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development.

  5. Development of System Based Code: Case Study of Life-Cycle Margin Evaluation

    International Nuclear Information System (INIS)

    Tai Asayama; Masaki Morishita; Masanori Tashimo

    2006-01-01

    For a leap of progress in the structural design of nuclear plant components, the late Professor Emeritus Yasuhide Asada proposed the System Based Code. The key concepts of the System Based Code are: (1) life-cycle margin optimization, (2) expansion of technical options, as well as combinations of technical options, beyond the current codes and standards, and (3) designing to clearly defined target reliabilities. These concepts are very new to most nuclear power plant designers, who are naturally obliged to design to current codes and standards; applying the concepts of the System Based Code to design will lead to an entire change of the practices that designers have long been accustomed to. On the other hand, experienced designers are supposed to have expertise that can support and accelerate the development of the System Based Code. Therefore, interfacing with experienced designers is of crucial importance for the development of the System Based Code. The authors conducted a survey on the acceptability of the System Based Code concept. The results were analyzed for the possibility of improving structural design, both in terms of reliability and cost effectiveness, through the introduction of the System Based Code concept. It was concluded that the System Based Code is beneficial for those purposes. Also described is the expertise elicited from the results of the survey that can be reflected in the development of the System Based Code. (authors)

  6. Assessment of gamma irradiation heating and damage in miniature neutron source reactor vessel using computational methods and SRIM - TRIM code

    International Nuclear Information System (INIS)

    Appiah-Ofori, F. F.

    2014-07-01

    The effects of gamma radiation heating and irradiation damage in the reactor vessel of Ghana Research Reactor 1 (GHARR-1), a Miniature Neutron Source Reactor (MNSR), were assessed using an implicit control-volume finite-difference numerical computation and validated by the SRIM - TRIM code. It was assumed that 5.0 MeV gamma rays from the reactor core generate heat that is completely absorbed by the interior surface of the MNSR vessel, which affects its performance through the induced displacement damage. This displacement damage is a result of lattice defects being created, which impair the vessel through the formation of point-defect clusters such as vacancies and interstitials; these can result in dislocation loops and networks, voids, and bubbles, and cause changes through the thickness of the vessel. The microscopic defects produced in the vessel due to γ-radiation are referred to as radiation damage, while the changes these defects produce in the macroscopic properties of the vessel are known as radiation effects. These radiation damage effects are of major concern for materials used in nuclear energy production. In this study, the overall objective was to assess the effects of gamma radiation heating and damage in the GHARR-1 MNSR vessel by means of a well-developed mathematical model, with analytical and numerical solutions, simulating the radiation damage in the vessel. The SRIM - TRIM code was used as a computational tool to determine the displacements per atom (dpa) associated with radiation damage, while the implicit control-volume finite-difference method was used to determine the temperature profile within the vessel due to γ-radiation heating. The methodology adopted in assessing γ-radiation heating in the vessel involved development of the one-dimensional steady-state Fourier heat conduction equation with volumetric heat generation, using both an analytical and an implicit control-volume finite-difference approach to determine the maximum temperature and
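
    The heat-conduction part of the abstract above can be sketched numerically. A minimal illustration (hypothetical slab geometry and material values, not the MNSR vessel data) of the one-dimensional steady-state Fourier heat conduction equation with uniform volumetric heat generation, discretized by an implicit finite-difference scheme with fixed wall temperatures and checked against the exact parabolic profile T(x) = T_wall + q/(2k) x (L - x):

```python
def solve_conduction(n=51, length=0.01, k=40.0, q=1.0e7, t_wall=300.0):
    """1-D steady conduction with uniform heating and Dirichlet walls (Thomas algorithm)."""
    dx = length / (n - 1)
    m = n - 2                          # number of interior unknowns
    a = [-1.0] * m                     # sub-diagonal of  -T[i-1] + 2 T[i] - T[i+1] = q dx^2 / k
    b = [2.0] * m                      # main diagonal
    c = [-1.0] * m                     # super-diagonal
    d = [q * dx * dx / k] * m          # volumetric heat-generation source term
    d[0] += t_wall                     # fold the fixed wall temperatures into the RHS
    d[-1] += t_wall
    for i in range(1, m):              # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    t = [0.0] * m                      # back substitution
    t[-1] = d[-1] / b[-1]
    for i in range(m - 2, -1, -1):
        t[i] = (d[i] - c[i] * t[i + 1]) / b[i]
    return [t_wall] + t + [t_wall]

temps = solve_conduction()
t_max = max(temps)
# exact peak of the parabolic profile: T_wall + q * L**2 / (8 * k)
t_exact = 300.0 + 1.0e7 * 0.01 ** 2 / (8 * 40.0)
```

    Because the exact solution is quadratic, the second-order central difference reproduces it at the nodes, so the computed peak matches the analytical one.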

  7. Sources of Finance for Entrepreneurship Development

    Directory of Open Access Journals (Sweden)

    Balaban Mladenka

    2016-06-01

    Full Text Available Entrepreneurship is one of the most important categories now associated with small and medium-sized enterprises, employment, and the creation of new jobs and new businesses. Entrepreneurial behavior in finance implies a readiness to take risk and a taste for independence and self-fulfillment. It can develop in any sector of the economy and in any type of business. Through entrepreneurship, personal resources are strengthened - not only material resources but also the motives of self-realization, freedom, independence, and challenge. A large number of small and medium enterprises provide a huge range of products, giving customers and service users greater choice and lower prices. Considering that entrepreneurship represents the future, this work aims to highlight the role the financial sector plays in its development. The authors suggest that the financial sector has a very important role in the development of entrepreneurship, pointing to the different possibilities of cheaper funding and the development of guidelines for small and medium enterprises; on the other hand, in some cases the financial sector has a negative impact on growth through expensive sources of financing for development.

  8. Development of a friendly interface for ORIGEN 2.1 code using MatLab software

    International Nuclear Information System (INIS)

    Vieira, Joao Paulo

    2011-01-01

    In the event of an accidental release of radioactive material to the environment from a nuclear power plant, decisions must be taken quickly to support mitigating actions. Fast, clear, and safe access to all information about the source term is therefore important. This work describes an initiative to develop a graphical interface to the output data of the ORIGEN 2.1 code, intended to provide a friendly and secure approach to the output data and other parameters important for analysis in an emergency, given the operating history of a PWR nuclear power plant. Using the MATLAB software, it is possible to develop an output routine that presents graphically the data needed for an emergency analysis. The interface must be able to render the conventional ORIGEN tables as graphics. Preliminary results are presented. (author)

  9. OpenMC: a state-of-the-Art Monte Carlo code for research and development

    International Nuclear Information System (INIS)

    Romano, P.K.; Horelik, N.E.; Herman, B.R.; Forget, B.; Smith, K.; Nelson, A.G.

    2013-01-01

    This paper gives an overview of OpenMC, an open source Monte Carlo particle transport code recently developed at the Massachusetts Institute of Technology. OpenMC uses continuous-energy cross sections and a constructive solid geometry representation, enabling high-fidelity modeling of nuclear reactors and other systems. Modern, portable input/output file formats are used in OpenMC: XML for input, and HDF5 for output. High performance parallel algorithms in OpenMC have demonstrated near-linear scaling to over 100,000 processors on modern supercomputers. Other topics discussed in this paper include plotting, CMFD acceleration, variance reduction, eigenvalue calculations, and software development processes. (authors)

  10. Development of statistical analysis code for meteorological data (W-View)

    International Nuclear Information System (INIS)

    Tachibana, Haruo; Sekita, Tsutomu; Yamaguchi, Takenori

    2003-03-01

    A computer code (W-View: Weather View) was developed to analyze meteorological data statistically based on 'the guideline of meteorological statistics for the safety analysis of nuclear power reactors' (Nuclear Safety Commission, January 28, 1982; revised March 29, 2001). The code provides the statistical meteorological data needed to assess the public dose, in cases of both normal operation and severe accident, for licensing nuclear reactor operation. The code was revised from the original version, which ran on a large office computer, to enable a personal computer user to analyze the meteorological data simply and conveniently and to produce statistical meteorological tables and figures. (author)

  11. EHDViz: clinical dashboard development using open-source technologies.

    Science.gov (United States)

    Badgeley, Marcus A; Shameer, Khader; Glicksberg, Benjamin S; Tomlinson, Max S; Levin, Matthew A; McCormick, Patrick J; Kasarskis, Andrew; Reich, David L; Dudley, Joel T

    2016-03-24

    To design, develop and prototype clinical dashboards to integrate high-frequency health and wellness data streams using interactive and real-time data visualisation and analytics modalities. We developed a clinical dashboard development framework called the electronic healthcare data visualization (EHDViz) toolkit for generating web-based, real-time clinical dashboards for visualising heterogeneous biomedical, healthcare and wellness data. EHDViz is an extensible toolkit that uses R packages for data management and normalisation and produces high-quality visualisations over the web using the R/Shiny web server architecture. We have developed use cases to illustrate the utility of EHDViz in different clinical and wellness settings as a visualisation aid for improving healthcare delivery. Using EHDViz, we prototyped clinical dashboards to demonstrate the contextual versatility of the toolkit. An outpatient cohort was used to visualise population health management tasks (n=14,221), an inpatient cohort was used to visualise real-time acuity risk in a clinical unit (n=445), and a quantified-self example using wellness data from a fitness activity monitor worn by a single individual was also discussed (n-of-1). The back-end system retrieves relevant data from data sources, populates the main panel of the application, integrates user-defined data features in real-time, and renders output using modern web browsers. The visualisation elements can be customised using health features, disease names, procedure names or medical codes to populate the visualisations. The source code of EHDViz and various prototypes developed using it are available in the public domain at http://ehdviz.dudleylab.org. Collaborative data visualisations, wellness trend predictions, risk estimation, proactive acuity status monitoring and knowledge of complex disease indicators are essential components of implementing data-driven precision medicine. As an open-source visualisation

  12. Development of a 3D FEL code for the simulation of a high-gain harmonic generation experiment

    International Nuclear Information System (INIS)

    Biedron, S. G.

    1999-01-01

    Over the last few years, there has been a growing interest in self-amplified spontaneous emission (SASE) free-electron lasers (FELs) as a means for achieving a fourth-generation light source. In order to correctly and easily simulate the many configurations that have been suggested, such as multi-segmented wigglers and the method of high-gain harmonic generation, we have developed a robust three-dimensional code. The specifics of the code, the comparison to the linear theory, as well as future plans will be presented.

  13. DEVELOPMENT OF SALES APPLICATION OF PREPAID ELECTRICITY VOUCHER BASED ON ANDROID PLATFORM USING QUICK RESPONSE CODE (QR CODE)

    Directory of Open Access Journals (Sweden)

    Ricky Akbar

    2017-09-01

    Full Text Available Perusahaan Listrik Negara (PLN) has implemented a smart electricity system, or prepaid electricity. Customers pay for an electricity voucher before using the electricity. The token contained in the electricity voucher purchased by the customer is entered into the Meter Prabayar (MPB, prepaid meter) installed at the customer's location. When customers purchase a voucher, they receive a receipt that contains their identity and the 20-digit voucher code (token) to be entered into the MPB as electrical energy credit. The receipt obtained by the customer is of course vulnerable to loss or to misuse by unauthorized parties. In this study, the authors designed and developed an Android-based application utilizing QR code technology as a replacement for the printed prepaid electricity receipt, encoding the identity of the customer and the 20-digit voucher code. The application was developed following the waterfall methodology. The steps of the waterfall process used are: (1) analysis of the functional requirements of the system through a preliminary study and data collection based on field studies and literature; (2) system design using UML diagrams, Business Process Model and Notation (BPMN), and an Entity Relationship Diagram (ERD); (3) implementation of the design using object-oriented programming (OOP) techniques - the web application was developed using the Laravel PHP framework and a MySQL database, while the mobile application was developed using B4A; (4) testing of the developed system using the black-box method. The final result of this research is a web and mobile application for the sale of electricity vouchers using QR code technology.

  14. SOURCE 2.0 model development: UO2 thermal properties

    International Nuclear Information System (INIS)

    Reid, P.J.; Richards, M.J.; Iglesias, F.C.; Brito, A.C.

    1997-01-01

    During analysis of CANDU postulated accidents, the reactor fuel is estimated to experience large temperature variations and to be exposed to a variety of environments, from highly oxidizing to mildly reducing. The exposure of CANDU fuel to these environments and temperatures may affect fission product releases from the fuel and cause degradation of the fuel thermal properties. SOURCE 2.0 is a safety analysis code that will model the mechanisms required to calculate fission product release for a variety of accident scenarios, including large-break loss-of-coolant accidents (LOCAs) with or without emergency core cooling. The goal of the model development is to generate models that are consistent with each other and phenomenologically based, insofar as that is possible given the state of theoretical understanding.

  15. Development of thermal hydraulic analysis code for IHX of FBR

    International Nuclear Information System (INIS)

    Kumagai, Hiromichi; Naohara, Nobuyuki

    1991-01-01

    In order to obtain flow resistance correlations for a thermal-hydraulic analysis code for an intermediate heat exchanger (IHX) of an FBR, hydraulic experiments with air were carried out on bundles of tubes arranged in in-line and staggered fashions. The main results are summarized as follows. (1) For the pressure loss per unit length of a tube bundle in a dense regular-triangle arrangement, the in-line fashion is almost the same as the staggered one. (2) In the case of a 30deg sector model of the IHX tube bundle, the pressure loss is 1/3 of that of the in-line or staggered arrangement. (3) From these experimental data, flow resistance correlations for the thermal-hydraulic analysis code are obtained. (author)

  16. Code Development for Control Design Applications: Phase I: Structural Modeling

    International Nuclear Information System (INIS)

    Bir, G. S.; Robinson, M.

    1998-01-01

    The design of integrated controls for a complex system like a wind turbine relies on a system model in an explicit format, e.g., state-space format. Current wind turbine codes focus on turbine simulation and not on system characterization, which is desired for controls design as well as for applications like operating turbine model analysis, optimal design, and aeroelastic stability analysis. This paper reviews structural modeling, which comprises three major steps: formation of component equations, assembly into system equations, and linearization.

  17. Development of an object oriented lattice QCD code ''Bridge++''

    International Nuclear Information System (INIS)

    Ueda, S; Aoki, S; Aoyama, T; Kanaya, K; Taniguchi, Y; Matsufuru, H; Motoki, S; Namekawa, Y; Nemura, H; Ukita, N

    2014-01-01

    We are developing a new lattice QCD code set, ''Bridge++'', aiming at an extensible, readable, and portable workbench for QCD simulations while maintaining high performance at the same time. Bridge++ covers conventional lattice actions and numerical algorithms. The code set is written in C++ in an object-oriented style. In this paper we describe the fundamental ingredients of the code and the current status of its development.

  18. ASME Section XI trends in developing nuclear codes and standards

    International Nuclear Information System (INIS)

    Hedden, O.F.

    1995-01-01

    When the author began working on nuclear power many years ago, he knew that perfection was the only acceptable technical standard. Unfortunately, this became an obsession with perfection that has had unfavorable consequences in some of the non-technical areas of work in ASME nuclear power Codes and Standards. However, the economic problems of the nuclear power industry now demand a more pragmatic approach if the industry is to continue. Not only does each item considered for action need to be evaluated against criteria that may in some cases be less than perfection, but one also needs to consider whether it contributes tangibly to safety or to a reduction in technical or administrative burden. These should be the governing criteria. The introduction of risk-based inspection methodologies will certainly be an important element in doing this successfully. One needs to consider these criteria collectively, as one discusses each item at the committee level, and individually, as one votes on each item. In the past, the author has been concerned that the industry was not acting quickly enough in taking advantage of opportunities offered by the Code to increase safety or to reduce cost. While he still has some concern, he thinks communication channels have been greatly improved. Now he is becoming more concerned with both the collective and individual actions that delay beneficial changes. The second part of the author's talk has to do with the relevance of the code committees in the nuclear power industry regulatory process.

  19. Development of a parallelization strategy for the VARIANT code

    International Nuclear Information System (INIS)

    Hanebutte, U.R.; Khalil, H.S.; Palmiotti, G.; Tatsumi, M.

    1996-01-01

    The VARIANT code solves the multigroup steady-state neutron diffusion and transport equation in three-dimensional Cartesian and hexagonal geometries using the variational nodal method. VARIANT consists of four major parts that must be executed sequentially: input handling, calculation of response matrices, the solution algorithm (i.e., inner-outer iteration), and output of results. The objective of the parallelization effort was to reduce the overall computing time by distributing the work of the two computationally intensive (sequential) tasks, the coupling-coefficient calculation and the iterative solver, equally among a group of processors. This report describes the code's calculations and gives performance results on one of the benchmark problems used to test the code. The performance analysis on the IBM SPx system shows good efficiency for well-load-balanced programs. Even for relatively small problem sizes, respectable efficiencies are seen on the SPx. An extension to achieve a higher degree of parallelism will be addressed in future work. 7 refs., 1 tab
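
    The payoff of parallelizing only the two compute-intensive stages, while input handling and output stay sequential, can be sketched with Amdahl's law. A minimal illustration with a hypothetical serial fraction (not a measurement from the VARIANT report):

```python
def speedup(serial_frac, n_procs):
    """Amdahl's law: S(p) = 1 / (s + (1 - s) / p) for serial runtime fraction s."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / n_procs)

# assume, hypothetically, that 5% of the runtime stays sequential
for p in (1, 4, 16, 64):
    s = speedup(0.05, p)
    print(f"{p:3d} processors: speedup {s:5.2f}, efficiency {s / p:.2f}")
```

    The serial fraction caps the achievable speedup at 1/s, which is why efficiency degrades as processors are added even when the parallel stages are perfectly load-balanced.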

  20. Development of Ultrasonic Pulse Compression Using Golay Codes

    International Nuclear Information System (INIS)

    Kim, Young H.; Kim, Young Gil; Jeong, Peter

    1994-01-01

    A conventional ultrasonic flaw detection system uses a large-amplitude narrow pulse to excite a transducer. However, such systems are limited in pulse energy: an excessively large amplitude causes dielectric breakdown of the transducer, and an excessively long pulse decreases the resolution. Using pulse compression, a long pseudorandom signal can be used without sacrificing resolution, by correlating the received signal with the transmitted code. In the present work, the pulse compression technique was implemented in an ultrasonic system. A Golay code was used as the pseudorandom signal, since the pair sum of the autocorrelations of a complementary Golay pair has no sidelobes. The equivalent input pulse of the Golay code was derived to analyze the pulse compression system. Throughout the experiments, the pulse compression technique demonstrated improved SNR (signal-to-noise ratio) by reducing the system's white noise. The experimental data also indicated that the SNR enhancement was proportional to the square root of the code length used. The technique seems to perform particularly well with highly energy-absorbent materials such as polymers, plastics and rubbers.
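
    The sidelobe-cancellation property used above can be shown in a few lines. A minimal sketch (illustrative code length, not the authors' hardware setup): a complementary Golay pair is built by the standard recursive construction, and the sum of the two autocorrelation channels collapses to a single main lobe of height twice the code length, with every sidelobe cancelling exactly:

```python
def golay_pair(n_stages):
    """Build a complementary Golay pair of length 2**n_stages recursively."""
    a, b = [1], [1]
    for _ in range(n_stages):
        a, b = a + b, a + [-x for x in b]   # (a|b, a|-b) stays complementary
    return a, b

def xcorr(x, y):
    """Cross-correlation of two equal-length sequences over all 2n-1 lags."""
    n = len(x)
    return [sum(x[i] * y[i - lag] for i in range(max(lag, 0), min(n, n + lag)))
            for lag in range(-(n - 1), n)]

a, b = golay_pair(4)                        # complementary pair of length 16
# pulse compression: correlate each channel with its own code, then sum
compressed = [ra + rb for ra, rb in zip(xcorr(a, a), xcorr(b, b))]
# main lobe (zero lag) has height 2 * len(a); all other lags vanish
```

    In a two-transmission system each code of the pair is fired separately and the correlated echoes are summed, which is what makes the exact cancellation achievable in practice.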