WorldWideScience

Sample records for ls code generator

  1. Extending JPEG-LS for low-complexity scalable video coding

    DEFF Research Database (Denmark)

    Ukhanova, Anna; Sergeev, Anton; Forchhammer, Søren

    2011-01-01

    JPEG-LS, the well-known international standard for lossless and near-lossless image compression, was originally designed for non-scalable applications. In this paper we propose a scalable modification of JPEG-LS and compare it with the leading image and video coding standards JPEG2000 and H.264/SVC...

  2. Code Generation with Templates

    CERN Document Server

    Arnoldus, Jeroen; Serebrenik, A

    2012-01-01

    Templates are used to generate all kinds of text, including computer code. Over the last decade, the use of templates has gained a lot of popularity due to the increase of dynamic web applications. Templates are a tool for programmers, and implementations of template engines are most often based on practical experience rather than on a theoretical background. This book reveals the mathematical background of templates and shows interesting findings for improving the practical use of templates. First, a framework to determine the necessary computational power for the template metalanguage is presented...
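
The kind of template-based code generation the book analyzes can be sketched in a few lines. The `CLASS_TEMPLATE` and `generate_class` names below are illustrative, not taken from the book:

```python
# A minimal sketch of template-based code generation: a text template with
# $placeholders is instantiated with concrete identifiers to produce source.
from string import Template

CLASS_TEMPLATE = Template(
    "class ${name}:\n"
    "    def __init__(self, value):\n"
    "        self._value = value\n"
    "    def get_${field}(self):\n"
    "        return self._value\n"
)

def generate_class(name: str, field: str) -> str:
    """Instantiate the template with concrete identifiers."""
    return CLASS_TEMPLATE.substitute(name=name, field=field)

source = generate_class("Counter", "count")
namespace = {}
exec(source, namespace)          # compile and load the generated code
obj = namespace["Counter"](42)
print(obj.get_count())           # -> 42
```

This is the practical baseline the book formalizes: the template text is the object language, and the placeholder substitution is the (here very weak) metalanguage.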

  3. Studi Eksperimen dan Numerik Pengaruh Penambahan Vortex Generator pada Airfoil NASA LS-0417

    Directory of Open Access Journals (Sweden)

    Ulul Azmi

    2017-03-01

    Full Text Available Boundary layer separation is an important phenomenon affecting airfoil performance. One way to delay or eliminate flow separation is to increase the momentum of the fluid so that it can overcome the adverse pressure gradient and surface shear stress, pushing separation further downstream. This can be achieved by adding a turbulent generator to the upper surface of the airfoil. A vortex generator (VG) is one type of turbulent generator that can accelerate the transition from a laminar to a turbulent boundary layer. This study therefore aims to determine the effect of VG placement and height on the development of the turbulent boundary layer, and thereby on airfoil performance. The study was conducted experimentally and numerically at Re = 1.41x10^5 with an angle of attack of 16°. The test model was a NASA LS-0417 airfoil with and without VG. The placement and height variations were x/c = 0.1; 0.2; 0.3; 0.4 (h = 1 mm; 3 mm; 5 mm). The most effective configuration was found to be the vortex generator with x/c = 0.3 and h = 1 mm, for which the CL/CD value increased by 14.337%.

  4. Automatic code generation in practice

    DEFF Research Database (Denmark)

    Adam, Marian Sorin; Kuhrmann, Marco; Schultz, Ulrik Pagh

    2016-01-01

    Mobile robots often use a distributed architecture in which software components are deployed to heterogeneous hardware modules. Ensuring the consistency with the designed architecture is a complex task, notably if functional safety requirements have to be fulfilled. We propose to use a domain-specific language to specify those requirements and to allow for generating a safety-enforcing layer of code, which is deployed to the robot. The paper at hand reports experiences in practically applying code generation to mobile robots. For two cases, we discuss how we addressed challenges, e.g., regarding weaving code generation into proprietary development environments and testing of manually written code. We find that a DSL based on the same conceptual model can be used across different kinds of hardware modules, but a significant adaptation effort is required in practical scenarios involving different kinds...

  5. Effects of surface roughness and vortex generators on the LS(1)-0417MOD airfoil

    Energy Technology Data Exchange (ETDEWEB)

    Reuss, R.L.; Hoffman, M.J.; Gregorek, G.M. [Ohio State Univ., Columbus, OH (United States)]

    1995-12-01

    An 18-inch constant-chord model of the LS(1)-0417MOD airfoil section was tested under two-dimensional steady-state conditions at the Ohio State University 7×10 Subsonic Wind Tunnel. The objective was to document section lift and moment characteristics for various model and air flow conditions. Surface pressure data were acquired at −60° through +230° geometric angles of attack, at a nominal 1 million Reynolds number. Cases with and without leading edge grit roughness were investigated; the leading edge roughness simulated blade conditions in the field. Additionally, surface pressure data were acquired for Reynolds numbers of 1.5 and 2.0 million, with and without leading edge grit roughness; the angle of attack was limited to a −20° to 40° range. In general, results showed lift curve slope sensitivities to Reynolds number and roughness. The maximum lift coefficient was reduced as much as 29% by leading edge roughness. Moment coefficient showed little sensitivity to roughness beyond 50° angle of attack, but the expected decambering effect of a thicker boundary layer with roughness did show at lower angles. Tests were also conducted with vortex generators located at the 30% chord location on the upper surface only, at 1 and 1.5 million Reynolds numbers, with and without leading edge grit roughness. In general, with leading edge grit roughness applied, the vortex generators restored 85 percent of the baseline level of maximum lift coefficient, but with a more sudden stall break and at a higher angle of attack than the baseline.

  6. MAGNETOHYDRODYNAMIC EQUATIONS (MHD GENERATION CODE)

    Directory of Open Access Journals (Sweden)

    Francisco Frutos Alfaro

    2017-04-01

    Full Text Available A program to generate code in Fortran and C for the full magnetohydrodynamic equations is presented. The program uses the free computer algebra system REDUCE. This software has a package called EXCALC, which is an exterior calculus program. The advantage of this program is that it can be modified to include other complex metrics or spacetimes. The output of this program is modified by means of a Linux script, which creates a new REDUCE program to manipulate the magnetohydrodynamic equations to obtain a code that can be used as a seed for a magnetohydrodynamic code for numerical applications. As an example, we present part of the output of our programs for Cartesian coordinates and how to do the discretization.
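
The pipeline above (symbolic equations in, Fortran and C source out) can be illustrated with a toy emitter. The tuple-based expression tree and the sample discretized term are invented for this sketch; they are not REDUCE/EXCALC output:

```python
# Minimal sketch of emitting the same symbolic expression as both C and
# Fortran source, mirroring the dual-language output described above.
OPS = {"add": "+", "sub": "-", "mul": "*", "div": "/"}

def emit(node, lang="c"):
    """Recursively turn a tuple-based expression tree into source text."""
    if isinstance(node, str):
        return node
    if isinstance(node, (int, float)):
        # Fortran spells double-precision literals with d0, C with a plain dot
        return f"{node}d0" if lang == "fortran" else f"{node}.0"
    op, left, right = node
    return f"({emit(left, lang)} {OPS[op]} {emit(right, lang)})"

# A central-difference term like v * (B(x-h) - B(x+h)) / (2h), already
# discretized; B_im and B_ip stand for the neighbouring grid values.
expr = ("div", ("mul", "v", ("sub", "B_im", "B_ip")), ("mul", 2, "h"))
print(emit(expr, "c"))        # ((v * (B_im - B_ip)) / (2.0 * h))
print(emit(expr, "fortran"))  # ((v * (B_im - B_ip)) / (2d0 * h))
```

A real system like the one described additionally carries the tensor index structure and the metric; the sketch only shows the final emit step.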

  7. A preliminary neutronic evaluation and depletion study of VHTR and LS-VHTR reactors using the codes: WIMSD5 and MCNPX

    International Nuclear Information System (INIS)

    Silva, Fabiano C.; Pereira, Claubia; Costa, Antonella Lombardi; Veloso, Maria Auxiliadora Fortini

    2009-01-01

    It is expected that, in the future, besides electricity generation, reactors will also take on secondary activities, such as hydrogen generation and seawater desalinization. Generation IV reactors are expected to possess special characteristics, like high safety, minimization of the amount of radioactive waste, and the ability to use reprocessed fuel with non-proliferating projects in their cycles. Among the Generation IV reactor projects available nowadays, the High Temperature Reactors (HTR) are highlighted due to these desirable characteristics. Under such circumstances, such a reactor may be able to have significantly higher thermal power ratings to be used for hydrogen production, without loss of safety, even in an emergency. For this work, we have chosen two HTR concepts of a prismatic reactor: the Very High Temperature Reactor (VHTR) and the Liquid-Salt-cooled Very High Temperature Reactor (LS-VHTR). The principal difference between them is the coolant. The VHTR uses helium gas as a coolant and has a burnup of 101,661 MWd/THM, while the LS-VHTR uses a low-pressure liquid coolant, molten fluoride salt, with a boiling point near 1500 °C, working at 155,946 MWd/THM. The ultimate power output is limited by the capacity of the passive decay heat removal system; this capacity is limited by the reactor vessel temperature. The goal was to evaluate the neutronic behavior and fuel composition during burnup using the WIMSD5 (Winfrith Improved Multi-Group Scheme) code and the MCNPX2.6 code, the first deterministic and the second stochastic. For both reactors, burned fuel of type 'C' coming from the Angra-I nuclear plant in Brazil was used, with 3.1% initial enrichment, burned to 33,000 MWd/THM using the ORIGEN2.1 code in three steps of 11,000 MWd/THM, with an average power density of 37.75 MWd/THM and 5 years of cooling in the pool. Finally, the fuel was reprocessed by the Purex technique, extracting 99.9% of the Pu, and the desired amount of fissile material (15%) to achieve the final mixed oxide was

  8. New GOES satellite synchronized time code generation

    Science.gov (United States)

    Fossler, D. E.; Olson, R. K.

    1984-01-01

    The TRAK Systems' GOES Satellite Synchronized Time Code Generator is described. TRAK Systems has developed this timing instrument to supply improved accuracy over most existing GOES receiver clocks. A classical time code generator is integrated with a GOES receiver.

  9. A study on the modeling techniques using LS-INGRID

    Energy Technology Data Exchange (ETDEWEB)

    Ku, J. H.; Park, S. W.

    2001-03-01

    For the development of radioactive material transport packages, verification of the structural safety of a package against the free drop impact accident should be carried out. The use of LS-DYNA, a code specially developed for impact analysis, is essential for impact analysis of the package. LS-INGRID is a pre-processor for LS-DYNA with considerable capability to deal with complex geometries, and it allows for parametric modeling. LS-INGRID is most effective in combination with the LS-DYNA code. Although LS-INGRID seems difficult to use relative to many commercial mesh generators, the productivity of users performing parametric modeling tasks with LS-INGRID can be much higher in some cases. Therefore, LS-INGRID has to be used with LS-DYNA. This report presents basic explanations of the structure and commands of LS-INGRID, basic modelling examples, and advanced modelling for use in the impact analysis of various packages. New users can easily build complex models by studying the basic examples presented in this report, from the modelling through to the loading and constraint conditions.

  10. Grid code requirements for wind power generation

    International Nuclear Information System (INIS)

    Djagarov, N.; Filchev, S.; Grozdev, Z.; Bonev, M.

    2011-01-01

    In this paper, production data of wind power in Europe and Bulgaria and plans for their development up to 2030 are reviewed. The main characteristics of wind generators used in Bulgaria are listed. A review of the grid codes in different European countries, which regulate the requirements for renewable sources, is made. European recommendations for the harmonization of requirements are analyzed. Suggestions for the Bulgarian grid code are made.

  11. Radionuclide daughter inventory generator code: DIG

    International Nuclear Information System (INIS)

    Fields, D.E.; Sharp, R.D.

    1985-09-01

    The Daughter Inventory Generator (DIG) code accepts a tabulation of the radionuclides initially present in a waste stream, specified as amounts present either by mass or by activity, and produces a tabulation of the radionuclides present after a user-specified elapsed time. This resultant radionuclide inventory characterizes wastes that have undergone daughter ingrowth during subsequent processes, such as leaching and transport, and includes daughter radionuclides that should be considered in these subsequent processes or for inclusion in a pollutant source term. Output of the DIG code also summarizes radionuclide decay constants. The DIG code was developed specifically to assist the user of the PRESTO-II methodology and code in preparing data sets and accounting for possible daughter ingrowth in wastes buried in shallow-land disposal areas. The DIG code is also useful in preparing data sets for the PRESTO-EPA code. Daughter ingrowth is considered both for buried radionuclides and for radionuclides that have been leached from the wastes and are undergoing hydrologic transport, and the quantities of daughter radionuclides are calculated. Radionuclide decay constants generated by DIG and included in the DIG output are required in the PRESTO-II code input data set. DIG accesses some subroutines written for use with the CRRIS system and accesses files containing radionuclide data compiled by D.C. Kocher. 11 refs
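
The daughter-ingrowth computation that DIG automates rests on the Bateman equations. A minimal sketch for a two-member decay chain, with illustrative half-lives rather than data from the code's nuclide files, is:

```python
# Two-member Bateman solution: atoms of a daughter nuclide grown in from a
# pure parent (N2(0) = 0) after elapsed time t. Half-lives are illustrative.
import math

def daughter_atoms(n1_0, lam1, lam2, t):
    """Bateman solution for the daughter of a two-member chain."""
    return n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))

lam_parent = math.log(2) / 10.0      # parent half-life: 10 years
lam_daughter = math.log(2) / 2.0     # daughter half-life: 2 years
n2 = daughter_atoms(1.0e6, lam_parent, lam_daughter, 5.0)
print(f"daughter atoms after 5 y: {n2:.0f}")
```

A production code chains this over every decay path in the nuclide library and also tracks activity (lambda times atoms); the sketch shows only the single-link ingrowth term.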

  12. Two-Level Semantics and Code Generation

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nielson, Hanne Riis

    1988-01-01

    A two-level denotational metalanguage that is suitable for defining the semantics of Pascal-like languages is presented. The two levels allow for an explicit distinction between computations taking place at compile-time and computations taking place at run-time. While this distinction is perhaps not absolutely necessary for describing the input-output semantics of programming languages, it is necessary when issues such as data flow analysis and code generation are considered. For an example stack-machine, the authors show how to generate code for the run-time computations and still perform the compile-time...
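
The compile-time/run-time split can be illustrated by compiling an expression to stack-machine code at "compile time" and executing it in a separate phase. The instruction set below is invented for illustration and is not the stack machine of the paper:

```python
# Compile time: walk the expression tree and emit stack-machine instructions.
# Run time: only execute the emitted instructions.
def compile_expr(e):
    """e is a nested tuple like ('+', 1, ('*', 2, 3)); returns instructions."""
    if isinstance(e, int):
        return [("PUSH", e)]
    op, a, b = e
    return compile_expr(a) + compile_expr(b) + [(op,)]

def run(code):
    """The run-time phase: a plain stack interpreter."""
    stack = []
    for instr in code:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if instr[0] == "+" else a * b)
    return stack[0]

code = compile_expr(("+", 1, ("*", 2, 3)))
print(code)        # [('PUSH', 1), ('PUSH', 2), ('PUSH', 3), ('*',), ('+',)]
print(run(code))   # 7
```

Tree-walking and instruction emission happen entirely before execution, which is exactly the distinction the two-level metalanguage makes explicit.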

  13. Code generation of RHIC accelerator device objects

    International Nuclear Information System (INIS)

    Olsen, R.H.; Hoff, L.; Clifford, T.

    1995-01-01

    A RHIC Accelerator Device Object is an abstraction which provides a software view of a collection of collider control points known as parameters. A grammar has been defined which allows these parameters, along with code describing methods for acquiring and modifying them, to be specified efficiently in compact definition files. These definition files are processed to produce C++ source code. This source code is compiled to produce an object file which can be loaded into a front end computer. Each loaded object serves as an Accelerator Device Object class definition. The collider will be controlled by applications which set and get the parameters in instances of these classes using a suite of interface routines. Significant features of the grammar are described with details about the generated C++ code
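
The workflow of compact definition files processed into C++ class source can be sketched as follows. The definition grammar and the generated class layout here are hypothetical, not RHIC's actual grammar:

```python
# Sketch: a compact device definition is parsed and expanded into C++ source
# for an accelerator-device-object-style class with get/set accessors.
DEFINITION = """\
device quadrupole
param current float
param polarity int
"""

def generate_cpp(definition: str) -> str:
    lines = [l.split() for l in definition.strip().splitlines()]
    name = lines[0][1]
    out = [f"class {name.capitalize()}Device {{", "public:"]
    for _, pname, ptype in lines[1:]:
        ctype = {"float": "double", "int": "int"}[ptype]
        out.append(f"    {ctype} get_{pname}() const {{ return {pname}_; }}")
        out.append(f"    void set_{pname}({ctype} v) {{ {pname}_ = v; }}")
    out.append("private:")
    for _, pname, ptype in lines[1:]:
        ctype = {"float": "double", "int": "int"}[ptype]
        out.append(f"    {ctype} {pname}_;")
    out.append("};")
    return "\n".join(out)

print(generate_cpp(DEFINITION))
```

The real system also generates the acquisition/modification method bodies described in the definition files; the sketch shows only the accessor expansion.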

  14. (U) Ristra Next Generation Code Report

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Daniel, David John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-22

    LANL’s Weapons Physics management (ADX) and ASC program office have defined a strategy for exascale-class application codes that follows two supportive, and mutually risk-mitigating paths: evolution for established codes (with a strong pedigree within the user community) based upon existing programming paradigms (MPI+X); and Ristra (formerly known as NGC), a high-risk/high-reward push for a next-generation multi-physics, multi-scale simulation toolkit based on emerging advanced programming systems (with an initial focus on data-flow task-based models exemplified by Legion [5]). Development along these paths is supported by the ATDM, IC, and CSSE elements of the ASC program, with the resulting codes forming a common ecosystem, and with algorithm and code exchange between them anticipated. Furthermore, solution of some of the more challenging problems of the future will require a federation of codes working together, using established-pedigree codes in partnership with new capabilities as they come on line. The role of Ristra as the high-risk/high-reward path for LANL’s codes is fully consistent with its role in the Advanced Technology Development and Mitigation (ATDM) sub-program of ASC (see Appendix C), in particular its emphasis on evolving ASC capabilities through novel programming models and data management technologies.

  15. Towards Product Lining Model-Driven Development Code Generators

    OpenAIRE

    Roth, Alexander; Rumpe, Bernhard

    2015-01-01

    A code generator systematically transforms compact models to detailed code. Today, code generation is regarded as an integral part of model-driven development (MDD). Despite its relevance, the development of code generators is an inherently complex task and common methodologies and architectures are lacking. Additionally, reuse and extension of existing code generators only exist on individual parts. A systematic development and reuse based on a code generator product line is still in its infancy...

  16. New coding technique for computer generated holograms.

    Science.gov (United States)

    Haskell, R. E.; Culver, B. C.

    1972-01-01

    A coding technique is developed for recording computer generated holograms on a computer controlled CRT in which each resolution cell contains two beam spots of equal size and equal intensity. This provides a binary hologram in which only the position of the two dots is varied from cell to cell. The amplitude associated with each resolution cell is controlled by selectively diffracting unwanted light into a higher diffraction order. The recording of the holograms is fast and simple.

  17. MEMOPS: data modelling and automatic code generation.

    Science.gov (United States)

    Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D

    2010-03-25

    In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources, and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability whereas science tends to constantly move, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces--APIs--currently in Python, C and Java) and data storage (in XML files or databases). For the individual project these libraries obviate the need for writing code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code will be presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology.
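
The core Memops idea, a single abstract model driving generated data-access code with built-in validity checking, can be sketched as follows. The model format and the class factory are illustrative; they are not the Memops API:

```python
# Sketch: a declarative model (here a dict, standing in for UML) is turned
# into a data-access class whose constructor enforces the declared types.
MODEL = {"Molecule": {"name": str, "residue_count": int}}

def make_class(class_name, attrs):
    """Generate a class whose constructor type-checks against the model."""
    def __init__(self, **kwargs):
        for attr, typ in attrs.items():
            value = kwargs[attr]
            if not isinstance(value, typ):     # generated validity check
                raise TypeError(f"{attr} must be {typ.__name__}")
            setattr(self, attr, value)
    return type(class_name, (), {"__init__": __init__})

Molecule = make_class("Molecule", MODEL["Molecule"])
m = Molecule(name="lysozyme", residue_count=129)
print(m.name, m.residue_count)   # lysozyme 129
```

Because the class is derived from the model rather than hand-written, a model change regenerates consistent access code everywhere, which is the maintenance benefit the abstract describes.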

  18. FCG: a code generator for lazy functional languages

    NARCIS (Netherlands)

    Kastens, U.; Langendoen, K.G.; Hartel, Pieter H.; Pfahler, P.

    1992-01-01

    The FCG code generator produces portable code that supports efficient two-space copying garbage collection. The code generator transforms the output of the FAST compiler front end into an abstract machine code. This code explicitly uses a call stack, which is accessible to the garbage collector. In...

  19. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable.

  20. Evaluation of the efficiency and fault density of software generated by code generators

    Science.gov (United States)

    Schreur, Barbara

    1993-01-01

    Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs supplied through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking: some check only the finished product, while some allow checking of individual modules and combined sets of modules as well. Considering NASA's requirement for reliability, a comparison with in-house manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification is warranted.

  1. ISOGEN: Interactive isotope generation and depletion code

    International Nuclear Information System (INIS)

    Venkata Subbaiah, Kamatam

    2016-01-01

    ISOGEN is an interactive code for solving first-order coupled linear differential equations with constant coefficients for a large number of isotopes, which are produced or depleted by the processes of radioactive decay or through neutron transmutation or fission. These coupled equations can be written in matrix notation involving radioactive decay constants and transmutation coefficients; the eigenvalues of the resulting matrix vary widely (by several tens of orders of magnitude), and hence no single method of solution is suitable for obtaining precise estimates of the concentrations of isotopes. Therefore, different methods of solution are followed, namely, the matrix exponential method, the Bateman series method, and the Gauss-Seidel iteration method, as in the ORIGEN-2 code. The ISOGEN code is written in a modern computer language, VB.NET version 2013, for Windows operating system version 7, which enables many interactive features between the user and the program. The output results depend on the input neutron database employed and the time step involved in the calculations. The program can display information about the database files, and the user has to select the one which suits the current need. The program prints 'WARNING' information if the time step is too large, which is decided based on the built-in convergence criterion. Other salient interactive features provided are (i) inspection of the input data that goes into the calculation, (ii) viewing of the radioactive decay sequence of isotopes (daughters, precursors, photons emitted) in a graphical format, (iii) solution for parent and daughter products by the direct Bateman series solution method, (iv) a quick input method and context-sensitive prompts for guiding the novice user, (v) viewing of output tables for any parameter of interest, and (vi) reading of the output file to generate new information, which can be viewed or printed, since the program stores basic nuclide concentrations, unlike other batch jobs. The sample
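
The matrix exponential method mentioned above solves N' = AN as N(t) = exp(At) N(0). A sketch on a toy two-member decay chain, using a naive truncated-series exponential (adequate only for small, well-scaled matrices, unlike the production solvers):

```python
# Matrix-exponential solution of a two-member decay chain N' = A N.
def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(A, terms=30):
    """exp(A) by truncated Taylor series (fine for small, well-scaled A)."""
    n = len(A)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = [[v / k for v in row] for row in mat_mul(term, A)]
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

lam1, lam2 = 0.1, 0.5            # decay constants, arbitrary units
A = [[-lam1, 0.0],               # dN1/dt = -lam1*N1
     [lam1, -lam2]]              # dN2/dt = +lam1*N1 - lam2*N2
t = 2.0
T = expm([[a * t for a in row] for row in A])
N0 = [1000.0, 0.0]
N = [sum(T[i][j] * N0[j] for j in range(2)) for i in range(2)]
print(N)
```

For this small chain the result agrees with the closed-form Bateman solution; the widely spread eigenvalues of real transmutation matrices are exactly why ISOGEN falls back on the Bateman series or Gauss-Seidel iteration where the exponential is ill-conditioned.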

  2. SWAAM code development, verification and application to steam generator design

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes developed by Argonne National Laboratory to analyze the effects of sodium/water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and to predict the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The theoretical foundations and numerical treatments on which the codes are based are discussed, followed by a description of code capabilities and limitations, verification of the codes by comparison with experiment, and applications to steam generator and IHTS design. (author). 25 refs, 14 figs

  3. Research on Primary Shielding Calculation Source Generation Codes

    Science.gov (United States)

    Zheng, Zheng; Mei, Qiliang; Li, Hui; Shangguan, Danhua; Zhang, Guangchun

    2017-09-01

    Primary Shielding Calculation (PSC) plays an important role in reactor shielding design and analysis. In order to facilitate PSC, a source generation code is developed to generate cumulative distribution functions (CDFs) for the source particle sample code of the J Monte Carlo Transport (JMCT) code, and a source particle sample code is developed to sample source particle directions, types, coordinates, energies and weights from the CDFs. A source generation code is also developed to transform three-dimensional (3D) power distributions in xyz geometry to source distributions in r-θ-z geometry for the J Discrete Ordinate Transport (JSNT) code. Validations on the PSC models of the Qinshan No.1 nuclear power plant (NPP), CAP1400 and CAP1700 reactors are performed. Numerical results show that the theoretical model and the codes are both correct.
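
Building a CDF from a discrete source distribution and then sampling it by inverse transform, as the source generation and sample codes above do, can be sketched as follows. The power distribution here is made up:

```python
# Build a cumulative distribution from relative cell powers, then draw
# source cells by inverse-transform sampling of a uniform variate.
import bisect
import random

powers = {"cell_1": 0.5, "cell_2": 1.5, "cell_3": 2.0}   # relative power
cells = list(powers)
total = sum(powers.values())
cdf = []
acc = 0.0
for c in cells:
    acc += powers[c] / total
    cdf.append(acc)                      # 0.125, 0.5, 1.0

def sample_cell(rng):
    """Inverse-transform sampling: locate a uniform variate in the CDF."""
    return cells[bisect.bisect_left(cdf, rng.random())]

rng = random.Random(1)
counts = {c: 0 for c in cells}
for _ in range(10000):
    counts[sample_cell(rng)] += 1
print(counts)   # roughly proportional to 0.5 : 1.5 : 2.0
```

A real source sampler builds one such CDF per sampled quantity (direction, type, coordinate, energy, weight); the sketch shows the mechanism for a single discrete axis.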

  4. Analysis of visual coding variables on CRT generated displays

    International Nuclear Information System (INIS)

    Blackman, H.S.; Gilmore, W.E.

    1985-01-01

    Cathode ray tube generated safety parameter display systems in a nuclear power plant control room situation have been found to be improved in effectiveness when color coding is employed. Research has indicated strong support for graphic coding techniques particularly in redundant coding schemes. In addition, findings on pictographs, as applied in coding schemes, indicate the need for careful application and for further research in the development of a standardized set of symbols

  5. Automatic generation of data merging program codes.

    OpenAIRE

    Hyensook, Kim; Oussena, Samia; Zhang, Ying; Clark, Tony

    2010-01-01

    Data merging is an essential part of ETL (Extract-Transform-Load) processes to build a data warehouse system. To avoid rewheeling merging techniques, we propose a Data Merging Meta-model (DMM) and its transformation into executable program codes in the manner of model driven engineering. DMM allows defining relationships of different model entities and their merging types in conceptual level. Our formalized transformation described using ATL (ATLAS Transformation Language) enables automatic g...

  6. PetriCode: A Tool for Template-Based Code Generation from CPN Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Code generation is an important part of model driven methodologies. In this paper, we present PetriCode, a software tool for generating protocol software from a subclass of Coloured Petri Nets (CPNs). The CPN subclass is comprised of hierarchical CPN models describing a protocol system at different...

  7. Code Generation from Pragmatics Annotated Coloured Petri Nets

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    limited work has been done on transforming CPN models to protocol implementations. The goal of the thesis is to be able to automatically generate high-quality implementations of communication protocols based on CPN models. In this thesis, we develop a methodology for generating implementations of protocols...... third party libraries and the code should be easily usable by third party code. Finally, the code should be readable by developers with expertise on the considered platforms. In this thesis, we show that our code generation approach is able to generate code for a wide range of platforms without altering...... such as games and rich web applications. Finally, we conclude the evaluation of the criteria of our approach by using the WebSocket PA-CPN model to show that we are able to verify fairly large protocols....

  8. Improved side information generation for distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2008-01-01

    As a new coding paradigm, distributed video coding (DVC) deals with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. The performance of DVC highly depends on the quality of side information. With a better side information generation method, fewer bits will be requested from the encoder and more reliable decoded frames will be obtained. In this paper, a side information generation method is introduced to further improve the rate-distortion (RD) performance of transform domain distributed video coding. This algorithm...

  9. COSINE software development based on code generation technology

    International Nuclear Information System (INIS)

    Ren Hao; Mo Wentao; Liu Shuo; Zhao Guang

    2013-01-01

    Code generation technology can significantly improve the quality and productivity of software development and reduce software development risk. At present, code generators are usually based on UML model-driven technology, which cannot satisfy the development demands of nuclear power calculation software. The features of scientific computing programs were analyzed, and a FORTRAN code generator (FCG) based on C# was developed in this paper. FCG can automatically generate FORTRAN code for module variable definitions according to input metadata. FCG can also generate memory allocation interfaces for dynamic variables as well as data access interfaces. FCG was applied to the development of the core and system integrated engine for design and analysis (COSINE) software. The results show that FCG can greatly improve the development efficiency of nuclear power calculation software and reduce the defect rate of software development. (authors)
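
Generating module variable definitions from input metadata, as FCG does, can be sketched as follows. The metadata layout and type names are assumptions for illustration, not COSINE's actual format:

```python
# Sketch: emit a Fortran module with variable declarations from metadata.
METADATA = [
    ("pressure", "real8", (100,)),     # name, type tag, array shape
    ("node_id", "int4", ()),           # scalar
]
FTYPES = {"real8": "real(kind=8)", "int4": "integer(kind=4)"}

def generate_module(name, metadata):
    """Build the source text of a Fortran module from the metadata table."""
    lines = [f"module {name}", "  implicit none"]
    for var, typ, shape in metadata:
        dims = f", dimension({','.join(map(str, shape))})" if shape else ""
        lines.append(f"  {FTYPES[typ]}{dims} :: {var}")
    lines.append(f"end module {name}")
    return "\n".join(lines)

print(generate_module("core_state", METADATA))
```

The same metadata table can drive the generation of matching allocation and accessor routines, which is how a generator keeps declarations and access code consistent.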

  10. Optimized Method for Generating and Acquiring GPS Gold Codes

    Directory of Open Access Journals (Sweden)

    Khaled Rouabah

    2015-01-01

    Full Text Available We propose a simpler and faster Gold codes generator, which can be efficiently initialized to any desired code, with a minimum delay. Its principle consists of generating only one sequence (code number 1) from which we can produce all the other different signal codes. This is realized by simply shifting this sequence by different delays that are judiciously determined by using the bicorrelation function characteristics. This is in contrast to the classical Linear Feedback Shift Register (LFSR) based Gold codes generator that requires, in addition to the shift process, a significant number of logic XOR gates and a phase selector to change the code. The presence of all these logic XOR gates in the classical LFSR based Gold codes generator provokes the consumption of additional time in the generation and acquisition processes. In addition to its simplicity and its rapidity, the proposed architecture, due to the total absence of XOR gates, has fewer resources than the conventional Gold generator and can thus be produced at lower cost. The Digital Signal Processing (DSP) implementations have shown that the proposed architecture presents a solution for acquiring Global Positioning System (GPS) satellite signals optimally and in a parallel way.
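
The classical LFSR-based baseline that the proposed architecture improves upon can be sketched for GPS C/A codes (a Gold code family). The register and tap layout follows the standard G1/G2 design; the Gold code is G1 XORed with a delayed G2, where the delay selects the satellite PRN (5 chips is the published delay for PRN 1):

```python
# Classical LFSR-based Gold code generation for GPS C/A codes.
def lfsr(taps, length=1023):
    """10-stage Fibonacci LFSR, all-ones seed; returns one full period."""
    reg = [1] * 10
    out = []
    for _ in range(length):
        out.append(reg[9])               # output is the last stage
        fb = 0
        for t in taps:                   # feedback = XOR of tapped stages
            fb ^= reg[t - 1]
        reg = [fb] + reg[:9]             # shift, feeding back into stage 1
    return out

g1 = lfsr([3, 10])                 # G1 polynomial: 1 + x^3 + x^10
g2 = lfsr([2, 3, 6, 8, 9, 10])     # G2 polynomial
delay = 5                          # G2 delay in chips selects the PRN
gold = [g1[i] ^ g2[(i - delay) % 1023] for i in range(1023)]
print(len(gold), sum(g1))          # period 1023; G1 is balanced: 512 ones
```

Changing `delay` (or, in hardware, the phase-selector taps) switches PRN; the paper's contribution is replacing this XOR-and-phase-selector structure with precomputed shifts of a single stored sequence.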

  11. gCSP occam Code Generation for RMoX

    NARCIS (Netherlands)

    Groothuis, M.A.; Liet, Geert K.; Broenink, Johannes F.; Roebbers, H.W.; Sunter, J.P.E.; Welch, P.H.; Wood, D.C.

    2005-01-01

    gCSP is a graphical tool for creating and editing CSP diagrams. gCSP is used in our labs to generate the embedded software framework for our control systems. As a further extension to our gCSP tool, an occam code generator has been constructed. Generating occam from CSP diagrams gives opportunities

  12. Wavelet-Coded OFDM for Next Generation Mobile Communications

    DEFF Research Database (Denmark)

    Cavalcante, Lucas Costa Pereira; Vegas Olmos, Juan José; Tafur Monroy, Idelfonso

    2016-01-01

    In this work, we evaluate the performance of Wavelet-Coding in offering robustness for OFDM signals against the combined effects of varying fading and noise bursts. Wavelet-Coding enables high diversity gains with a low-complexity receiver, and, most notably, without compromising the system’s spectr......-wave frequencies in future generation mobile communication due to its robustness against multipath fading....

  13. Novel power saving architecture for FBG based OCDMA code generation

    Science.gov (United States)

    Osadola, Tolulope B.; Idris, Siti K.; Glesk, Ivan

    2013-10-01

    A novel architecture for generating incoherent, two-dimensional wavelength hopping-time spreading optical CDMA codes is presented. The architecture is designed to facilitate the reuse of the optical source signal that remains unused after an OCDMA code has been generated using fiber Bragg grating based encoders. Effective utilization of the available optical power is therefore achieved by cascading several OCDMA encoders, enabling 3 dB savings in optical power.

  14. Automatic code generation for distributed robotic systems

    International Nuclear Information System (INIS)

    Jones, J.P.

    1993-01-01

    Hetero Helix is a software environment which supports relatively large robotic system development projects. The environment supports a heterogeneous set of message-passing LAN-connected common-bus multiprocessors, but the programming model seen by software developers is a simple shared memory. The conceptual simplicity of shared memory makes it an extremely attractive programming model, especially in large projects where coordinating a large number of people can itself become a significant source of complexity. We present results from three system development efforts conducted at Oak Ridge National Laboratory over the past several years. Each of these efforts used automatic software generation to create 10 to 20 percent of the system

  15. Next generation Zero-Code control system UI

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Developing ergonomic user interfaces for control systems is challenging, especially during machine upgrade and commissioning where several small changes may suddenly be required. Zero-code systems, such as *Inspector*, provide agile features for creating and maintaining control system interfaces. Moreover, these next generation Zero-code systems bring simplicity and uniformity and break the boundaries between Users and Developers. In this talk we present *Inspector*, a CERN made Zero-code application development system, and we introduce the major differences and advantages of using Zero-code control systems to develop operational UIs.

  16. AMZ, multigroup constant library for EXPANDA code, generated by NJOY code from ENDF/B-IV

    International Nuclear Information System (INIS)

    Chalhoub, E.S.; Moraes, Marisa de

    1985-01-01

    A library of multigroup constants with 70 energy groups and 37 isotopes for fast reactor calculations is described. The cross sections, scattering matrices and self-shielding factors were generated by the NJOY code and the RGENDF interface program from ENDF/B-IV evaluated data. The library is edited in a format adequate for use by the EXPANDA code. (M.C.K.) [pt

  17. Improved mesh generator for the POISSON Group Codes

    International Nuclear Information System (INIS)

    Gupta, R.C.

    1987-01-01

    This paper describes the improved mesh generator of the POISSON Group Codes. These improvements enable one to have full control over the way the mesh is generated and, in particular, the way the mesh density is distributed throughout the model. A higher mesh density in certain regions, coupled with a successively lower mesh density in others, keeps the accuracy of the field computation high and the requirements on computer time and memory low. The mesh is generated with the help of the codes AUTOMESH and LATTICE; both have gone through a major upgrade. Modifications have also been made in the POISSON part of these codes. We present an example of a superconducting dipole magnet to explain how to use this code. The results of field computations are found to be reliable within a few parts in a hundred thousand even in such complex geometries
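
    The idea of grading mesh density — fine where field accuracy matters, coarse elsewhere — can be illustrated with a one-dimensional sketch. This is not code from AUTOMESH/LATTICE; the constant-ratio scheme below is a generic illustration.

```python
# One-dimensional graded mesh: successive interval widths grow by a
# constant ratio, packing nodes tightly near x0 (where accuracy matters)
# and spreading them out toward x1 (saving memory and CPU time).

def graded_mesh(x0, x1, n, ratio):
    """Return n+1 node positions covering [x0, x1]; ratio > 1 makes the
    mesh densest at x0 and progressively coarser toward x1."""
    widths = [ratio ** i for i in range(n)]
    scale = (x1 - x0) / sum(widths)   # normalize widths to span the domain
    nodes = [x0]
    for w in widths:
        nodes.append(nodes[-1] + scale * w)
    return nodes

nodes = graded_mesh(0.0, 1.0, 10, 1.3)
```

    A 2-D mesh generator applies the same idea independently along each logical coordinate of the model regions.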

  18. Forced circulation type steam generator simulation code: HT4

    International Nuclear Information System (INIS)

    Okamoto, Masaharu; Tadokoro, Yoshihiro

    1982-08-01

    The purpose of this code is to provide an understanding of the dynamic characteristics of the steam generator, which is a component of the High-temperature Heat Transfer Components Test Unit. This unit is the fourth test section of the Helium Engineering Demonstration Loop (HENDEL). The features of this report are as follows: modeling of the steam generator, a basic relationship for the continuity equation, numerical analysis techniques for a non-linear simultaneous equation system, and computer graphics output techniques. The forced circulation type steam generator with straight tubes and horizontal cut baffles treated in this code was designed in the overall system design of the VHTRex. The code is for use with JAERI's digital computer FACOM M200. About 1.5 sec is required for each time-step iteration, so about 40 sec of CPU time is required for a standard problem. (author)

  19. Summary: after LS1

    International Nuclear Information System (INIS)

    Pojer, M.; Schmidt, R.

    2012-01-01

    After LS1 the energy will be about 6.5 TeV. The physics potential of LHC is determined by the integrated luminosity useful for the experiments and not only by the peak luminosity. The integrated luminosity is determined by the peak luminosity, the luminosity decay and the efficiency of operation (availability). In this session two of these parameters are addressed, the peak luminosity and the availability. In this paper peak luminosity is discussed through the performance potential of the injectors and through the global performance reach of LHC after LS1. LHC availability is tackled through the issues of the reliability of the magnet powering and of the beam system and of the occurrence of quenches

  20. pix2code: Generating Code from a Graphical User Interface Screenshot

    OpenAIRE

    Beltramelli, Tony

    2017-01-01

    Transforming a graphical user interface screenshot created by a designer into computer code is a typical task conducted by a developer in order to build customized software, websites, and mobile applications. In this paper, we show that deep learning methods can be leveraged to train a model end-to-end to automatically generate code from a single input image with over 77% of accuracy for three different platforms (i.e. iOS, Android and web-based technologies).

  1. Modeling Guidelines for Code Generation in the Railway Signaling Context

    Science.gov (United States)

    Ferrari, Alessio; Bacherini, Stefano; Fantechi, Alessandro; Zingoni, Niccolo

    2009-01-01

    Modeling guidelines constitute one of the fundamental cornerstones for Model Based Development. Their relevance is essential when dealing with code generation in the safety-critical domain. This article presents the experience of a railway signaling systems manufacturer on this issue. Introduction of Model-Based Development (MBD) and code generation in the industrial safety-critical sector created a crucial paradigm shift in the development process of dependable systems. While traditional software development focuses on the code, with MBD practices the focus shifts to model abstractions. The change has fundamental implications for safety-critical systems, which still need to guarantee a high degree of confidence also at code level. Usage of the Simulink/Stateflow platform for modeling, which is a de facto standard in control software development, does not ensure by itself production of high-quality dependable code. This issue has been addressed by companies through the definition of modeling rules imposing restrictions on the usage of design tools components, in order to enable production of qualified code. The MAAB Control Algorithm Modeling Guidelines (MathWorks Automotive Advisory Board)[3] is a well established set of publicly available rules for modeling with Simulink/Stateflow. This set of recommendations has been developed by a group of OEMs and suppliers of the automotive sector with the objective of enforcing and easing the usage of the MathWorks tools within the automotive industry. The guidelines were published in 2001 and afterwards revised in 2007 in order to integrate some additional rules developed by the Japanese division of MAAB [5]. The scope of the current edition of the guidelines ranges from model maintainability and readability to code generation issues. The rules are conceived as a reference baseline and therefore they need to be tailored to comply with the characteristics of each industrial context. Customization of these

  2. Approximation generation for correlations in thermal-hydraulic analysis codes

    International Nuclear Information System (INIS)

    Pereira, Luiz C.M.; Carmo, Eduardo G.D. do

    1997-01-01

    A fast and precise evaluation of fluid thermodynamic and transport properties is needed for the efficient simulation of the mass, energy and momentum transport phenomena related to nuclear plant power generation. A fully automatic code capable of generating suitable approximations for correlations with one or two independent variables is presented. Comparisons of access speed and precision with the original correlations currently in use show the adequacy of the approximations obtained. (author). 4 refs., 8 figs., 1 tab
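
    The payoff of such approximations can be sketched with a toy example. The correlation below is a made-up stand-in (not one of the paper's correlations), replaced here by a piecewise-linear table lookup whose relative error over the range of validity is then checked.

```python
# Replace a (hypothetically expensive) one-variable property correlation
# by a cheap tabulated piecewise-linear approximation, then verify the
# approximation error over the range of validity. The correlation itself
# is an illustrative stand-in.
import math
from bisect import bisect_right

def correlation(t):
    """Stand-in one-variable property correlation (e.g. a property vs. T)."""
    return 1.0e-3 * math.exp(-0.02 * (t - 300.0)) + 5.0e-4 * math.sqrt(t)

def build_table(f, t0, t1, n):
    """Tabulate f at n+1 equally spaced points for later interpolation."""
    ts = [t0 + (t1 - t0) * i / n for i in range(n + 1)]
    return ts, [f(t) for t in ts]

def interp(ts, ys, t):
    """Piecewise-linear evaluation of the tabulated correlation."""
    i = min(max(bisect_right(ts, t) - 1, 0), len(ts) - 2)
    w = (t - ts[i]) / (ts[i + 1] - ts[i])
    return ys[i] * (1.0 - w) + ys[i + 1] * w

ts, ys = build_table(correlation, 300.0, 600.0, 128)
worst = max(abs(interp(ts, ys, t) - correlation(t)) / correlation(t)
            for t in [300.0 + 0.37 * k for k in range(810)])
```

    Each table evaluation costs one binary search and one multiply-add, independent of how expensive the original correlation is; the table resolution trades memory for precision.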

  3. Interpretation and code generation based on intermediate languages

    DEFF Research Database (Denmark)

    Kornerup, Peter; Kristensen, Bent Bruun; Madsen, Ole Lehrmann

    1980-01-01

    The possibility of supporting high level languages through intermediate languages to be used for direct interpretation and as intermediate forms in compilers is investigated. An accomplished project in the construction of an interpreter and a code generator using one common intermediate form...

  4. Development of the next generation reactor analysis code system, MARBLE

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hazama, Taira; Nagaya, Yasunobu; Chiba, Go; Kugo, Teruhiko; Ishikawa, Makoto; Tatsumi, Masahiro; Hirai, Yasushi; Hyoudou, Hideaki; Numata, Kazuyuki; Iwai, Takehiko; Jin, Tomoyuki

    2011-03-01

    A next generation reactor analysis code system, MARBLE, has been developed. MARBLE is a successor of the fast reactor neutronics analysis code systems JOINT-FR and SAGEP-FR (the conventional systems), which were developed for the so-called JUPITER standard analysis methods. MARBLE has analysis capability equivalent to the conventional systems, because it can utilize the sub-codes included in them without any change. On the other hand, burnup analysis functionality for power reactors is improved compared with the conventional systems by introducing models of fuel exchange treatment, control rod operation and so on. In addition, MARBLE has newly developed solvers and some new features, such as burnup calculation by the Krylov sub-space method and nuclear design accuracy evaluation by the extended bias factor method. In the development of MARBLE, object-oriented technology was adopted from the viewpoint of improving software quality, namely flexibility, extensibility, ease of verification through modularization, and support for co-development. Furthermore, a two-layer software structure consisting of a scripting language and a system development language was applied. As a result, MARBLE is not an independent analysis code system which simply receives input and returns output, but an assembly of components for building analysis code systems (i.e., a framework). Furthermore, MARBLE provides some pre-built analysis code systems, such as the fast reactor neutronics analysis code system SCHEME, which corresponds to the conventional code, and the fast reactor burnup analysis code system ORPHEUS. (author)

  5. Post LS1 schedule

    CERN Document Server

    Lamont, M

    2014-01-01

    The scheduling limits for a typical long year taking into account technical stops, machine development, special physics runs are presented. An attempt is then made to outline a ten year post LS1 schedule taking into account the disparate requirements outlined in the previous talks in this session. The demands on the planned long shutdowns and the impact of these demands on their proposed length will be discussed. The option of using ion running as a pre-shutdown cool-down period will be addressed.

  6. Code Generation for Protocols from CPN models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael; Kindler, Ekkart

    ...modelling languages, MDE further has the advantage that models are amenable to model checking which allows key behavioural properties of the software design to be verified. The combination of formally verified models and automated code generation contributes to a high degree of assurance that the resulting software implementation satisfies the properties verified for the model. Coloured Petri Nets (CPNs) have been widely used to model and verify protocol software, but limited work exists on using CPN models of protocol software as a basis for automated code generation. In this report, we present an approach for generating protocol software from a restricted class of CPN models. The class of CPN models considered aims at being descriptive in that the models are intended to be helpful in understanding and conveying the operation of the protocol. At the same time, a descriptive model is close to a verifiable version...

  7. Automatic Structure-Based Code Generation from Coloured Petri Nets

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Westergaard, Michael

    2010-01-01

    Automatic code generation based on Coloured Petri Net (CPN) models is challenging because CPNs allow for the construction of abstract models that intermix control flow and data processing, making translation into conventional programming constructs difficult. We introduce Process-Partitioned CPNs (PP-CPNs), a subclass of CPNs equipped with an explicit separation of process control flow, message passing, and access to shared and local data. We show how PP-CPNs cater for a four-phase structure-based automatic code generation process directed by the control flow of processes. The viability of our approach is demonstrated by applying it to automatically generate an Erlang implementation of the Dynamic MANET On-demand (DYMO) routing protocol specified by the Internet Engineering Task Force (IETF).

  8. Standard LS-TTL IC data book

    International Nuclear Information System (INIS)

    1997-05-01

    This book is one of a series of semiconductor device data manuals. It introduces the standard logic 74LS and FAST series. It contains general information, circuit characteristics, notes on design and test, and FAST data sheets, which include gates, flip-flops, decoders and encoders, counters, counters with master reset, shift registers, octal buffers/line drivers/3-state, generators/checkers, full adders, error detection and correction circuit controllers and synchronous address multiplexers. It also lists LS data sheets, including NAND gates, NOR gates, hex inverters, delay elements, frequency dividers, decade counters, function generators, dual decade counters, memory cycle controllers and voltage controlled oscillators.

  9. PCS a code system for generating production cross section libraries

    International Nuclear Information System (INIS)

    Cox, L.J.

    1997-01-01

    This document outlines the use of the PCS Code System. It summarizes the execution process for generating FORMAT2000 production cross section files from FORMAT2000 reaction cross section files. It also describes the process of assembling the ASCII versions of the high energy production files made from ENDL and Mark Chadwick's calculations. Descriptions of the function of each code along with its input and output and use are given. This document is under construction. Please submit entries, suggestions, questions, and corrections to (ljc at sign llnl.gov) 3 tabs

  10. Steam generator and circulator model for the HELAP code

    International Nuclear Information System (INIS)

    Ludewig, H.

    1975-07-01

    An outline is presented of the work carried out in the 1974 fiscal year on the GCFBR safety research project, consisting of the development of improved steam generator and circulator (steam turbine driven helium compressor) models which will eventually be inserted in the HELAP (1) code. Furthermore, a code was developed which will be used to generate steady state input for the primary and secondary sides of the steam generator. The following conclusions and suggestions for further work are made: (1) the steam generator and circulator models are consistent with the volume and junction layout used in HELAP; (2) with minor changes these models, when incorporated in HELAP, could be used to simulate a direct cycle plant; (3) an explicit control valve model is still to be developed and would be very desirable for controlling the flow to the turbine during a transient (initially this flow will be controlled by using the existing check valve model); (4) the friction factor in the laminar flow region is computed inaccurately, which might cause significant errors in loss-of-flow accidents; and (5) it is felt that HELAP will still use a large amount of computer time and will thus be limited to design basis accidents without scram or loss of flow transients with and without scram. Finally it may also be used as a test bed for the development of prototype component models which would be incorporated in a more sophisticated system code, developed specifically for GCFBRs

  11. BBU code development for high-power microwave generators

    International Nuclear Information System (INIS)

    Houck, T.L.; Westenskow, G.A.; Yu, S.S.

    1992-01-01

    We are developing a two-dimensional, time-dependent computer code for the simulation of transverse instabilities in support of relativistic klystron-two beam accelerator research at LLNL. The code addresses transient effects as well as both cumulative and regenerative beam breakup modes. Although designed specifically for the transport of high current (kA) beams through traveling-wave structures, it is applicable to devices consisting of multiple combinations of standing-wave, traveling-wave, and induction accelerator structures. In this paper we compare code simulations to analytical solutions for the case where there is no rf coupling between cavities, to theoretical scaling parameters for coupled cavity structures, and to experimental data involving beam breakup in the two traveling-wave output structure of our microwave generator. (Author) 4 figs., tab., 5 refs

  12. Automatic ID heat load generation in ANSYS code

    International Nuclear Information System (INIS)

    Wang, Zhibi.

    1992-01-01

    Detailed power density profiles are critical in the execution of a thermal analysis using a finite element (FE) code such as ANSYS. Unfortunately, as yet there is no easy way to directly input precise power profiles into ANSYS. A straightforward way to do this is to hand-calculate the power of each node or element and then type the data into the code. Every time a change is made to the FE model, the data must be recalculated and reentered. One way to solve this problem is to generate a set of discrete data, using another code such as PHOTON2, and curve-fit the data. Using curve-fitted formulae has several disadvantages. It is time consuming because of the need to run a second code for generation of the data, curve-fitting, doing the data check, etc. Additionally, because there is no generality for different beamlines or different parameters, the above work must be repeated for each case. And errors in the power profiles due to curve-fitting result in errors in the analysis. To solve the problem once and for all, and with the capability to apply to any insertion device (ID), a program for the ID power profile was written in the ANSYS Parametric Design Language (APDL). This program is implemented as an ANSYS command with input parameters of peak magnetic field, deflection parameter, length of ID, and distance from the source. Once the command is issued, all the heat load will be automatically generated by the code
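
    The idea behind such a command can be sketched outside APDL: evaluate an analytic power-density profile at each element centroid and assign the element heat load programmatically, rather than hand-typing nodal values. The bell-shaped profile and the uniform mesh below are illustrative stand-ins, not the actual ID power formulas.

```python
# Programmatic heat-load assignment: evaluate a power-density profile at
# element centroids instead of hand-calculating nodal loads. The Gaussian
# profile and mesh dimensions are illustrative, not real ID physics.
import math

def power_density(x, y, peak=1.0, sx=0.01, sy=0.002):
    """Illustrative bell-shaped power-density profile (units arbitrary)."""
    return peak * math.exp(-0.5 * ((x / sx) ** 2 + (y / sy) ** 2))

def element_heat_loads(nx, ny, lx, ly):
    """Heat load per element for a uniform nx-by-ny mesh covering
    [-lx/2, lx/2] x [-ly/2, ly/2]: density at the centroid times area."""
    dx, dy = lx / nx, ly / ny
    loads = {}
    for i in range(nx):
        for j in range(ny):
            xc = -lx / 2 + (i + 0.5) * dx   # element centroid
            yc = -ly / 2 + (j + 0.5) * dy
            loads[(i, j)] = power_density(xc, yc) * dx * dy
    return loads

loads = element_heat_loads(40, 20, 0.08, 0.016)
total = sum(loads.values())   # should approximate the profile's integral
```

    When the mesh or the beam parameters change, only the profile arguments change; every element load is regenerated consistently, which is the advantage the APDL command provides inside ANSYS.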

  13. An Infrastructure for UML-Based Code Generation Tools

    Science.gov (United States)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance as a way to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach, which uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects, which have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a Platform-Independent Model (PIM) created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  14. Code Generation for a Simple First-Order Prover

    DEFF Research Database (Denmark)

    Villadsen, Jørgen; Schlichtkrull, Anders; Halkjær From, Andreas

    2016-01-01

    We present Standard ML code generation in Isabelle/HOL of a sound and complete prover for first-order logic, taking formalizations by Tom Ridge and others as the starting point. We also define a set of so-called unfolding rules and show how to use these as a simple prover, with the aim of using the approach for teaching logic and verification to computer science students at the bachelor level.

  15. Improved diffusion coefficients generated from Monte Carlo codes

    International Nuclear Information System (INIS)

    Herman, B. R.; Forget, B.; Smith, K.; Aviles, B. N.

    2013-01-01

    Monte Carlo codes are becoming more widely used for reactor analysis. Some of these applications involve the generation of diffusion theory parameters including macroscopic cross sections and diffusion coefficients. Two approximations used to generate diffusion coefficients are assessed using the Monte Carlo code MC21. The first is the method of homogenization; whether to weight either fine-group transport cross sections or fine-group diffusion coefficients when collapsing to few-group diffusion coefficients. The second is a fundamental approximation made to the energy-dependent P1 equations to derive the energy-dependent diffusion equations. Standard Monte Carlo codes usually generate a flux-weighted transport cross section with no correction to the diffusion approximation. Results indicate that this causes noticeable tilting in reconstructed pin powers in simple test lattices with L2 norm error of 3.6%. This error is reduced significantly to 0.27% when weighting fine-group diffusion coefficients by the flux and applying a correction to the diffusion approximation. Noticeable tilting in reconstructed fluxes and pin powers was reduced when applying these corrections. (authors)
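
    The first approximation assessed above — which quantity to flux-weight during collapse — can be made concrete with a toy calculation. The fluxes and transport cross sections below are illustrative numbers, not MC21 output; the point is that the two recipes give different few-group diffusion coefficients.

```python
# Two homogenization recipes for collapsing fine-group data to a single
# few-group diffusion coefficient, using D_g = 1/(3*sigma_tr_g).
# Numbers are illustrative, not from MC21.

def collapse_via_transport(phi, sigma_tr):
    """Flux-weight the transport cross section, then take D = 1/(3*sigma)."""
    sig = sum(p * s for p, s in zip(phi, sigma_tr)) / sum(phi)
    return 1.0 / (3.0 * sig)

def collapse_via_diffusion(phi, sigma_tr):
    """Flux-weight the fine-group diffusion coefficients directly."""
    d_fine = [1.0 / (3.0 * s) for s in sigma_tr]
    return sum(p * d for p, d in zip(phi, d_fine)) / sum(phi)

phi = [1.0, 2.0, 1.5]           # illustrative fine-group fluxes
sigma_tr = [0.30, 0.25, 0.40]   # illustrative transport cross sections (1/cm)

d_from_sigma = collapse_via_transport(phi, sigma_tr)
d_from_d = collapse_via_diffusion(phi, sigma_tr)
print(d_from_sigma, d_from_d)   # the two recipes disagree
```

    Because 1/x is convex, weighting the cross section and then inverting never gives the same answer as weighting the inverted values; which recipe better preserves reaction rates and leakage is exactly what the abstract's pin-power comparison measures.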

  16. Computer codes for simulation of Angra 1 reactor steam generator

    International Nuclear Information System (INIS)

    Pinto, A.C.

    1978-01-01

    A digital computer code is developed for the simulation of the steady-state operation of a U-tube steam generator with natural recirculation used in Pressurized Water Reactors. The steam generator is simulated with two flow channels separated by a metallic wall, with a preheating section in counterflow and a vaporizing section in parallel flow. The program permits changes in flow patterns and heat transfer correlations in accordance with the local conditions along the vaporizing section. Various subroutines are developed for the determination of steam and water properties, and a mathematical model is established for the simulation of transients in the same steam generator. The steady-state operating conditions in one of the steam generators of the ANGRA 1 reactor are determined using this program. The global results obtained agree with published values [pt

  17. The effect and contribution of wind generated rotation on outlet temperature and heat gain of LS-2 parabolic trough solar collector

    Directory of Open Access Journals (Sweden)

    Sadaghiyani Omid Karimi

    2013-01-01

    Full Text Available The Monte Carlo ray tracing (MCRT) method is applied and coupled with finite volume numerical methods to study the effect of rotation on the outlet temperature and heat gain of the LS-2 parabolic trough concentrator (PTC). Based on the effect of sunshape, the curvature of the mirror and the use of MCRT, the heat flux distribution around the inner wall of the evacuated tube is calculated. After calculation of the heat flux, the geometry of the LS-2 Luz collector is created and the finite volume method is applied for the simulation. The obtained results are compared with the test results of Dudley et al for the irrotational cases to validate the numerical models; the rotational models are validated separately against K.S. Ball's results. In this work, according to the structure of the mentioned collector, a plug is used as a flow restriction. In the rotational case studies, the inner wall rotates with different angular speeds, and the results of the rotational collector are compared with the irrotational one. Also, for these two main states, the location of the plug is changed and the outlet temperature and heat gain of the collector are studied. The results show that rotation plays a positive role in the heat transfer process and that a rotational plug in the bottom half of the tube is more effective than in the upper half. The contribution of rotation is calculated in all of the case studies. The working fluid is the oil derivative Syltherm-800. The power of the wind can be used to rotate the tube of the collector.

  18. Modeling Vortex Generators in a Navier-Stokes Code

    Science.gov (United States)

    Dudek, Julianne C.

    2011-01-01

    A source-term model that simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force that would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single vane vortex generator, subsonic flow in an S-duct with 22 corotating vortex generators, and supersonic flow in a rectangular duct with a counter-rotating vortex-generator pair. The model was also used to successfully simulate microramps in supersonic flow by treating each microramp as a pair of vanes with opposite angles of incidence. The validation results indicate that the source-term vortex-generator model provides a useful tool for screening vortex-generator configurations and gives comparable results to solutions computed using gridded vanes.
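
    The magnitude of the force such a model injects can be estimated from the two quantities the user supplies per vane, planform area and angle of incidence. The thin-airfoil lift slope Cl ≈ 2πα used below is a textbook stand-in, not the actual Wind-US source-term formula.

```python
# Toy estimate of the lift force a single vane would contribute, using
# the thin-airfoil approximation Cl ~ 2*pi*alpha. This is an illustrative
# stand-in for the vortex-generator source term, not the Wind-US model.
import math

def vane_lift_force(rho, velocity, planform_area, incidence_deg):
    """Lift L = q * S * Cl with q the dynamic pressure, S the planform
    area, and Cl from the thin-airfoil slope (alpha in radians)."""
    alpha = math.radians(incidence_deg)
    q = 0.5 * rho * velocity ** 2          # dynamic pressure
    return q * planform_area * 2.0 * math.pi * alpha

# e.g. a hypothetical 1 cm^2 vane at 16 deg incidence in a 50 m/s airstream
force = vane_lift_force(1.225, 50.0, 1.0e-4, 16.0)
```

    In a source-term model this force is distributed over the user-specified range of grid points, so reversing the sign of the incidence yields the opposite vortex sense — which is how a counter-rotating pair (or a microramp treated as two opposed vanes) is represented.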

  19. Study on random number generator in Monte Carlo code

    International Nuclear Information System (INIS)

    Oya, Kentaro; Kitada, Takanori; Tanaka, Shinichi

    2011-01-01

    The Monte Carlo code uses a sequence of pseudo-random numbers produced by a random number generator (RNG) to simulate particle histories. A pseudo-random number sequence has its own period, depending on the generation method, and the period should be long enough not to be exhausted during one Monte Carlo calculation, to ensure correctness, especially of the standard deviation of the results. The linear congruential generator (LCG) is widely used as the Monte Carlo RNG, but its period is not so long considering the increasing number of simulation histories in a Monte Carlo calculation that follows from the remarkable enhancement of computer performance. Recently, many kinds of RNG have been developed, and some of their features are better than those of the LCG. In this study, we investigate an appropriate RNG for a Monte Carlo code as an alternative to the LCG, especially for the case of enormous numbers of histories. It is found that xorshift has desirable features compared with the LCG: a larger period, comparable speed in generating random numbers, better randomness, and good applicability to parallel calculation. (author)
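
    The two generator families compared above fit in a few lines each. The LCG below uses the well-known Park-Miller "minimal standard" constants, and the xorshift variant is Marsaglia's xorshift32 with the (13, 17, 5) shift triple; neither is claimed to be the specific implementation the study tested.

```python
# Minimal sketches of the two RNG families discussed above: a linear
# congruential generator (period 2^31 - 2 with these constants) and
# Marsaglia's xorshift32 (period 2^32 - 1 over nonzero 32-bit states).
from itertools import islice

def lcg(seed):
    """Park-Miller 'minimal standard' LCG: x <- 16807 * x mod (2^31 - 1)."""
    m = 2 ** 31 - 1
    x = seed % m
    while True:
        x = (16807 * x) % m
        yield x / m          # uniform in (0, 1)

def xorshift32(seed):
    """Marsaglia xorshift32 with the (13, 17, 5) shift triple."""
    x = seed & 0xFFFFFFFF
    while True:
        x ^= (x << 13) & 0xFFFFFFFF   # mask keeps the state 32-bit
        x ^= x >> 17
        x ^= (x << 5) & 0xFFFFFFFF
        yield x / 2 ** 32    # uniform in [0, 1)

lcg_sample = list(islice(lcg(12345), 10000))
xs_sample = list(islice(xorshift32(12345), 10000))
```

    Longer-period xorshift family members (e.g. 64- or 128-bit states) follow the same pattern with wider registers, which is what makes them attractive when history counts grow into the billions.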

  20. Modular code supervisor. Automatic generation of command language

    International Nuclear Information System (INIS)

    Dumas, M.; Thomas, J.B.

    1988-01-01

    It is shown how, starting from a problem formulated by the user, the adequate calculation procedure can be generated in the command language, and the data necessary for the calculation acquired while verifying their validity. Modular codes are used because of their flexibility and wide utilisation. Modules are written in Fortran, and calculations are run in batches according to an algorithm written in the GIBIANE command language. The action plans are based on the STRIPS and WARPLAN families. The elementary representation of a module and special instructions are illustrated, as are the dynamic construction of macro-actions and the acquisition of the specification (which allows users to express the goal of a program without indicating which algorithm is used to reach the goal). The final phase consists in translating the algorithm into the command language [fr

  1. Machine-Checked Sequencer for Critical Embedded Code Generator

    Science.gov (United States)

    Izerrouken, Nassima; Pantel, Marc; Thirioux, Xavier

    This paper presents the development of a correct-by-construction block sequencer for GeneAuto, a qualifiable (according to the DO-178B/ED-12B recommendation) automatic code generator. It transforms Simulink models to MISRA C code for safety critical systems. Our approach, which combines a classical development process with formal specification and verification using proof assistants, led to preliminary fruitful exchanges with certification authorities. We present parts of the classical user and tool requirements and the derived formal specifications, implementation and verification for the correctness and termination of the block sequencer. This sequencer has been successfully applied to real-size industrial use cases from various transportation domain partners and led to the detection of requirement errors and a correct-by-construction implementation.

  2. Quenches after LS1

    International Nuclear Information System (INIS)

    Verweij, A.P.

    2012-01-01

    In this paper I will give an overview of the different types of quenches that occur in the LHC, followed by an estimate of the number of quenches that we can expect after LS1. Beam-induced quenches and false triggering of the QPS will be the main causes of those quenches that cause a beam dump, possibly up to 10-20 in total per year. After consolidation of the 13 kA joints, the approach for the BLM settings can be less conservative than in 2010-2012 in order to maximize beam time. This will cause some quenches, but a beam-induced quench is in any case not more risky than a quench provoked by false triggering. It is not easy to predict the number of BLM-triggered beam dumps needed to avoid magnet quenches, because it is not sure how to scale beam losses and UFOs from 3.5 TeV to 6.5 TeV, and it is not sure whether the thresholds at 3.5 TeV are correct. Quench events will be much more massive (e.g. an RB quench at 6 kA → 2 MJ, an RB quench at 11 kA → 6-20 MJ), and as a result cryo recuperation will take much longer. There will also be more ramp-induced quenches after an FPA in other circuits due to higher ramp rates and smaller temperature margins (mutual coupling)

  3. Interoperable domain-specific languages families for code generation

    Czech Academy of Sciences Publication Activity Database

    Malohlava, M.; Plášil, F.; Bureš, Tomáš; Hnětynka, P.

    2013-01-01

    Roč. 43, č. 5 (2013), s. 479-499 ISSN 0038-0644 R&D Projects: GA ČR GD201/09/H057 EU Projects: European Commission(XE) ASCENS 257414 Grant - others:GA AV ČR(CZ) GAP103/11/1489 Program:FP7 Institutional research plan: CEZ:AV0Z10300504 Keywords : code generation * domain specific languages * models reuse * extensible languages * specification * program synthesis Subject RIV: JC - Computer Hardware ; Software Impact factor: 1.148, year: 2013

  4. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  5. An improved steam generator model for the SASSYS code

    International Nuclear Information System (INIS)

    Pizzica, P.A.

    1989-01-01

    A new steam generator model has been developed for the SASSYS computer code, which analyzes accident conditions in a liquid-metal-cooled fast reactor. It has been incorporated into the new SASSYS balance-of-plant model, but it can also function as a stand-alone model. The model provides a full solution of the steady-state condition before the transient calculation begins for given sodium and water flow rates, inlet and outlet sodium temperatures, and inlet enthalpy and region lengths on the water side

  6. ANL/CANTIA code for steam generator tube integrity assessment

    International Nuclear Information System (INIS)

    Revankar, S.T.; Wolf, B.; Majumdar, S.; Riznic, J.R.

    2009-01-01

    Steam generator (SG) tubes have an important safety role in CANDU-type reactors and Pressurized Water Reactors (PWRs) because they constitute one of the primary barriers between the radioactive and non-radioactive sides of the nuclear plant. The SG tubes are susceptible to corrosion and damage. A failure of a single steam generator tube, or even a few tubes, would not be a serious safety-related event in a CANDU reactor. The leakage from a ruptured tube is within the makeup capacity of the primary heat transport system, so that as long as the operator takes the correct actions, the off-site consequences will be negligible. A sufficient safety margin against tube rupture used to be the basis for a variety of maintenance strategies developed to maintain a suitable level of plant safety and reliability. Several through-wall flaws may remain in operation and potentially contribute to the total primary-to-secondary leak rate. Assessment of the conditional probabilities of tube failures, leak rates, and ultimately the risk of exceeding licensing dose limits has been used for steam generator tube fitness-for-service assessment. The advantage of this type of analysis is that it avoids the excessive conservatism typically present in deterministic methodologies. However, it requires considerable effort and expense to develop all of the failure, leakage, probability-of-detection, and flaw-growth distributions and models necessary to obtain meaningful results from a probabilistic model. The Canadian Nuclear Safety Commission (CNSC) recently developed the CANTIA methodology for probabilistic assessment of inspection strategies for steam generator tubes and their direct effect on the probability of tube failure and the primary-to-secondary leak rate. Recently, Argonne National Laboratory has developed tube integrity and leak rate models under the Integrated Steam Generator Tube Integrity Program (ISGTIP-2). These models have been incorporated in the ANL/CANTIA code. This paper presents the ANL

  7. UNICOS CPC6: automated code generation for process control applications

    International Nuclear Information System (INIS)

    Fernandez Adiego, B.; Blanco Vinuela, E.; Prieto Barreiro, I.

    2012-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS). As a part of this framework, UNICOS-CPC provides a well-defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software: it discovers and calls the different plug-ins dynamically and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA generator (based on PVSS) and the CPC wizard, a dedicated plug-in created to provide the user with a friendly GUI (Graphical User Interface). A tool called UAB Bootstrap manages the different UAB components, such as CPC, and their dependencies on the resource packages. This tool guides the control system developer during the installation, update and execution of the UAB components. (authors)
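    As an illustration of the plug-in architecture described above, the following sketch shows a generic core discovering and dispatching to platform-oriented generator plug-ins. All class, device, and output names here are hypothetical; this is not the actual UAB API.

```python
# Hypothetical sketch of a plug-in-based code generator core: the generic
# part discovers registered platform plug-ins and dispatches generation
# to each one, analogous to how the UAB core drives the CPC plug-ins.

class CodeGenPlugin:
    """Base class for platform-oriented generator plug-ins."""
    platform = "generic"

    def generate(self, device_instances):
        raise NotImplementedError

class SiemensLogicGenerator(CodeGenPlugin):
    platform = "Siemens"

    def generate(self, device_instances):
        # Emit one SCL-like block per device instance (illustrative only).
        return "\n".join(f'FUNCTION_BLOCK "{d}"' for d in device_instances)

class ScadaGenerator(CodeGenPlugin):
    platform = "SCADA"

    def generate(self, device_instances):
        return "\n".join(f"datapoint {d}" for d in device_instances)

class GeneratorCore:
    """Generic core: registers plug-ins and provides the common dispatch."""
    def __init__(self):
        self._plugins = {}

    def register(self, plugin):
        self._plugins[plugin.platform] = plugin

    def build(self, platform, device_instances):
        return self._plugins[platform].generate(device_instances)

core = GeneratorCore()
core.register(SiemensLogicGenerator())
core.register(ScadaGenerator())
plc_code = core.build("Siemens", ["Valve_001", "Pump_A"])
print(plc_code)
```

The point of the pattern is that the core never needs to know which platforms exist; adding a new target (a new PLC brand, a new SCADA) means registering one more plug-in.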

  8. Generating performance portable geoscientific simulation code with Firedrake (Invited)

    Science.gov (United States)

    Ham, D. A.; Bercea, G.; Cotter, C. J.; Kelly, P. H.; Loriant, N.; Luporini, F.; McRae, A. T.; Mitchell, L.; Rathgeber, F.

    2013-12-01

    , can be written as short C kernels operating locally on the underlying mesh, with no explicit parallelism. The executable code is then generated in C, CUDA or OpenCL and executed in parallel on the target architecture. The system also offers features of special relevance to the geosciences. In particular, the large scale separation between the vertical and horizontal directions in many geoscientific processes can be exploited to offer the flexibility of unstructured meshes in the horizontal direction, without the performance penalty usually associated with those methods.
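    The local-kernel programming model described above can be illustrated with a small sketch, with plain Python standing in for the kernel language and the generated C/CUDA/OpenCL; the mesh and function names are made up, not Firedrake's actual API.

```python
# Illustrative sketch of the kernel-plus-iteration model: the user writes a
# small kernel that works on one mesh entity at a time, with no explicit
# parallelism; the framework owns the global loop and is free to generate
# parallel C, CUDA or OpenCL for it.
def cell_kernel(coords):
    # Local work only: area of one triangle from its three vertices.
    (x0, y0), (x1, y1), (x2, y2) = coords
    return abs((x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)) / 2.0

def par_loop(kernel, mesh_cells):
    # Stand-in for the framework's generated, parallelizable iteration.
    return [kernel(c) for c in mesh_cells]

# A unit square split into two triangles (a toy "mesh").
unit_square = [
    [(0, 0), (1, 0), (0, 1)],
    [(1, 0), (1, 1), (0, 1)],
]
areas = par_loop(cell_kernel, unit_square)
print(sum(areas))  # 1.0
```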

  9. C code generation applied to nonlinear model predictive control for an artificial pancreas

    DEFF Research Database (Denmark)

    Boiroux, Dimitri; Jørgensen, John Bagterp

    2017-01-01

    This paper presents a method to generate C code from MATLAB code applied to a nonlinear model predictive control (NMPC) algorithm. The C code generation uses the MATLAB Coder Toolbox. It can drastically reduce the time required for development compared to a manual porting of code from MATLAB to C...
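    As a generic illustration of this kind of high-level-to-C translation (a sketch only, not the MATLAB Coder Toolbox itself; the emitter and function names are invented), a small generator can turn a coefficient list into a C function, using Horner's scheme so the emitted C needs no pow() calls:

```python
# Toy high-level-to-C emitter: given polynomial coefficients (highest degree
# first), produce the source of a C function evaluating it via Horner's rule.
def emit_c_poly(name, coeffs):
    body = "0.0"
    for c in coeffs:
        body = f"({body}) * x + {c}"
    return (f"double {name}(double x) {{\n"
            f"    return {body};\n"
            f"}}\n")

# Emit C for 2x^2 - 3x + 1 (an arbitrary stand-in for a controller law).
c_src = emit_c_poly("control_law", [2.0, -3.0, 1.0])
print(c_src)
```

The emitted text is valid C and can be compiled into the target project; the same idea, at much larger scale and with type and memory analysis, is what model-to-C generators automate.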

  10. UNICOS CPC6: Automated Code Generation for Process Control Applications

    CERN Document Server

    Fernandez Adiego, B; Prieto Barreiro, I

    2011-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS) [1]. As a part of this framework, UNICOS-CPC provides a well-defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) [2] to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software: it discovers and calls the different plug-ins dynamically and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA g...

  11. An improved steam generator model for the SASSYS code

    International Nuclear Information System (INIS)

    Pizzica, P.A.

    1989-01-01

    A new steam generator model has been developed for the SASSYS computer code, which analyzes accident conditions in a liquid-metal-cooled fast reactor. It has been incorporated into the new SASSYS balance-of-plant model, but it can also function on a stand-alone basis. The steam generator can be used in a once-through mode, or a variant of the model can be used as a separate evaporator and a superheater with a recirculation loop. The new model provides an exact steady-state solution as well as the transient calculation. There was a need for a faster and more flexible model than the old steam generator model. The new model provides more detail with its multi-node treatment, as opposed to the previous model's one-node-per-region approach. Numerical instability problems, which were the result of cell-centered spatial differencing, fully explicit time differencing, and the moving-boundary treatment of the boiling crisis point in the boiling region, have been reduced. This leads to an increase in speed, as larger time steps can now be taken. The new model is an improvement in many respects. 2 refs., 3 figs

  12. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master thesis was performed at CERN, in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms, based on the CODESYS development tool, into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the Siemens SCADA system, WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and a software tool for automatic generation of the PLC code based on this library, called UAB. The integration aimed to provide a solution shared by both PLC platforms and was based on the PLCopen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and testing the behavior of the library code.

  13. Environmental codes of practice for steam electric power generation

    International Nuclear Information System (INIS)

    1985-03-01

    The Design Phase Code is one of a series of documents being developed for the steam electric power generation industry. This industry includes fossil-fuelled stations (gas, oil and coal-fired boilers) and nuclear-powered stations (CANDU heavy water reactors). In this document, environmental concerns associated with water-related and solid waste activities of steam electric plants are discussed. Design recommendations are presented that will minimize the detrimental environmental effects of once-through cooling water systems, of wastewaters discharged to surface waters and groundwaters, and of solid waste disposal sites. Recommendations are also presented for the design of water-related monitoring systems and programs. Cost estimates associated with the implementation of these recommendations are included. These technical guides for new or modified steam electric stations are the result of consultation with a federal-provincial-industry task force.

  14. Forsskåls Fiskeherbarium

    DEFF Research Database (Denmark)

    Provencal, Philippe

    2017-01-01

    A description of Peter Forsskål's fish herbarium and of his methods in marine-biological fieldwork during his travels in Egypt, the Red Sea and Yemen as a member of the Arabian Expedition of 1761-67.

  15. Controlling a Conventional LS-pump based on Electrically Measured LS-pressure

    DEFF Research Database (Denmark)

    Pedersen, Henrik Clemmensen; Andersen, Torben Ole; Hansen, Michael Rygaard

    2008-01-01

    As a result of the increasing use of sensors in mobile hydraulic equipment, the need for hydraulic pilot lines is decreasing, as they are replaced by electrical wiring and electrically controllable components. For controlling some of the existing hydraulic components there is, however, still a need...... this system, by either generating a copy of the LS-pressure, the LS-pressure being the output, or letting the output be the pump pressure. The focus of the current paper is on the controller design based on the first approach. Specifically, a controlled leakage flow is used to avoid the need for a switching...

  16. Low-Power, Rad-hard Reconfigurable, Bi-directional Flexfet™ Level Shifter ReBiLS for Multiple Generation Technology Integration for Space Exploration, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The many different generations of integrated circuit (IC) technologies required for new space exploration systems demand that designs operate at multiple and often...

  17. A novel method of generating and remembering international morse codes

    Digital Repository Service at National Institute of Oceanography (India)

    Charyulu, R.J.K.

    untethered communications have been advanced; nevertheless, International Morse Code (as in S.O.S) will come to the rescue as an emergency tool when all other modes fail. The details of the method and the actual codes have been enumerated....

  18. MUXS: a code to generate multigroup cross sections for sputtering calculations

    International Nuclear Information System (INIS)

    Hoffman, T.J.; Robinson, M.T.; Dodds, H.L. Jr.

    1982-10-01

    This report documents MUXS, a computer code to generate multigroup cross sections for charged-particle transport problems. Cross sections generated by MUXS can be used in many multigroup transport codes, with minor modifications to these codes, to calculate sputtering yields, reflection coefficients, penetration distances, etc.
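    The core operation of any multigroup constant generator of this kind is a flux-weighted group collapse: the coarse-group cross section is the fine-group cross section averaged over a weighting spectrum, sigma_g = sum_i(phi_i * sigma_i) / sum_i(phi_i) over the fine groups i belonging to g. A minimal sketch with made-up numbers (not MUXS data or its actual algorithm):

```python
# Flux-weighted collapse of fine-group cross sections into coarse groups.
# group_bounds gives, for each coarse group, the half-open fine-group range.
def collapse(fine_sigma, fine_flux, group_bounds):
    coarse = []
    for lo, hi in group_bounds:
        num = sum(s * f for s, f in zip(fine_sigma[lo:hi], fine_flux[lo:hi]))
        den = sum(fine_flux[lo:hi])
        coarse.append(num / den)
    return coarse

sigma = [10.0, 8.0, 4.0, 2.0]   # fine-group cross sections (barns), made up
flux = [1.0, 3.0, 2.0, 2.0]     # weighting spectrum, made up
two_group = collapse(sigma, flux, [(0, 2), (2, 4)])
print(two_group)  # [8.5, 3.0]
```

The weighting spectrum is what distinguishes one library from another: the same fine-group data collapsed with a fission spectrum or with a slowing-down spectrum gives different coarse constants.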

  19. SWAAM-code development and verification and application to steam generator designs

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes which were developed by Argonne National Laboratory to analyze the effects of sodium-water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The paper discusses the theoretical foundations and numerical treatments on which the codes are based, followed by a description of code capabilities and limitations, verification of the codes and applications to steam generator and IHTS designs. 25 refs., 14 figs

  20. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Science.gov (United States)

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the
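    One core step of deisotoping can be sketched as follows; this is a toy illustration with hypothetical peak values, not Decon2LS's actual algorithm (which fits full theoretical isotope patterns). Within an isotopic envelope, adjacent peaks are spaced roughly 1/z m/z units apart, so the charge state, and from it the monoisotopic mass, can be read off the spacing:

```python
# Toy deisotoping step: infer charge from isotope spacing, then convert the
# first (monoisotopic) peak's m/z to a neutral mass.
NEUTRON = 1.00235   # approximate average isotopic spacing in Da
PROTON = 1.00728    # proton mass in Da

def charge_from_spacing(mz_peaks):
    spacing = mz_peaks[1] - mz_peaks[0]
    return round(NEUTRON / spacing)

def monoisotopic_mass(mz_peaks):
    z = charge_from_spacing(mz_peaks)
    return z * (mz_peaks[0] - PROTON), z

envelope = [500.75, 501.25, 501.75]   # hypothetical 2+ isotopic envelope
mass, z = monoisotopic_mass(envelope)
print(z)  # 2
```

Real data adds the hard parts this sketch ignores: overlapping envelopes, noise peaks, and abundance patterns that must match a theoretical isotope distribution before a feature is accepted.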

  1. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  2. Improvements to the COBRA-TF (EPRI) computer code for steam generator analysis. Final report

    International Nuclear Information System (INIS)

    Stewart, C.W.; Barnhart, J.S.; Koontz, A.S.

    1980-09-01

    The COBRA-TF (EPRI) code has been improved and extended for pressurized water reactor steam generator analysis. New features and models have been added in the areas of subcooled boiling and heat transfer, turbulence, numerics, and global steam generator modeling. The code's new capabilities are qualified against selected experimental data and demonstrated for typical global and microscale steam generator analysis

  3. A Case for Dynamic Reverse-code Generation to Debug Non-deterministic Programs

    Directory of Open Access Journals (Sweden)

    Jooyong Yi

    2013-09-01

    Full Text Available Backtracking (i.e., reverse execution) helps the user of a debugger to naturally think backwards along the execution path of a program, and thinking backwards makes it easy to locate the origin of a bug. So far backtracking has been implemented mostly by state saving or by checkpointing. These implementations, however, inherently do not scale. Meanwhile, a more recent backtracking method based on reverse-code generation seems promising, because executing reverse code can restore the previous states of a program without state saving. In the literature, two methods that generate reverse code can be found: (a) static reverse-code generation, which pre-generates reverse code through static analysis before starting a debugging session, and (b) dynamic reverse-code generation, which generates reverse code by applying dynamic analysis on the fly during a debugging session. In particular, we espoused the latter in our previous work to accommodate non-determinism of a program caused by, e.g., multi-threading. To demonstrate the usefulness of our dynamic reverse-code generation, this article presents a case study of various backtracking methods including ours. We compare the memory usage of various backtracking methods in a simple but nontrivial example, a bounded-buffer program. In the case of non-deterministic programs such as this bounded-buffer program, our dynamic reverse-code generation outperforms the existing backtracking methods in terms of memory efficiency.
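    The reverse-code idea can be illustrated with a toy sketch, far simpler than the dynamic analysis described in the article: for a constructive update such as x += k, the generated reverse statement x -= k restores the previous state without any state saving (real systems must also handle destructive updates, where the old value is lost and must be recovered by other means).

```python
# Toy reverse-code generation: while executing forward, emit the inverse of
# each constructive update; backtracking replays the inverses in reverse order.
def forward(state, program):
    reverse_code = []
    for var, op, k in program:
        if op == "+=":
            state[var] += k
            reverse_code.append((var, "-=", k))  # generated on the fly
    reverse_code.reverse()                       # undo in opposite order
    return reverse_code

def backtrack(state, reverse_code):
    for var, op, k in reverse_code:
        if op == "-=":
            state[var] -= k

state = {"x": 0}
rev = forward(state, [("x", "+=", 5), ("x", "+=", 2)])
assert state["x"] == 7      # forward execution reached this state
backtrack(state, rev)
print(state["x"])  # 0, the initial state, restored without state saving
```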

  4. AMZ, library of multigroup constants for EXPANDA computer codes, generated by NJOY computer code from ENDF/B-IV

    International Nuclear Information System (INIS)

    Chalhoub, E.S.; Moraes, M. de.

    1984-01-01

    A 70-group, 37-isotope library of multigroup constants for fast reactor nuclear design calculations is described. Nuclear cross sections, transfer matrices, and self-shielding factors were generated with the NJOY code and an auxiliary program, RGENDF, using evaluated data from ENDF/B-IV. The output is issued in a format suitable for the EXPANDA code. Comparisons with the JFS-2 library, as well as test results for 14 CSEWG benchmark critical assemblies, are presented. (Author)

  5. A method for generating subgroup parameters from resonance tables and the SPART code

    International Nuclear Information System (INIS)

    Devan, K.; Mohanakrishnan, P.

    1995-01-01

    A method for generating subgroup or band parameters from resonance tables is described. A computer code, SPART, was written using this method. This code generates the subgroup parameters for any number of bands within the specified broad groups, at different temperatures, by reading the required input data from the binary cross-section library in the Cadarache format. The results obtained with the SPART code for two bands were compared with those obtained from the GROUPIE code, and good agreement was found. Results of the generation of subgroup parameters in four bands for a sample case of 239Pu from the resonance tables of the Cadarache Ver.2 library are also presented. 6 refs, 2 tabs

  6. Generation of Efficient High-Level Hardware Code from Dataflow Programs

    OpenAIRE

    Siret , Nicolas; Wipliez , Matthieu; Nezan , Jean François; Palumbo , Francesca

    2012-01-01

    High-level synthesis (HLS) aims at reducing the time-to-market by providing an automated design process that interprets and compiles high-level abstraction programs into hardware. However, HLS tools still face limitations regarding the performance of the generated code, due to the difficulties of compiling input imperative languages into efficient hardware code. Moreover, the hardware code generated by HLS tools is usually target-dependent and at a low level of abstraction (i.e. gate-level...

  7. LS1: exciting times ahead

    CERN Multimedia

    Caroline Duc

    2013-01-01

    As the first and last proton-lead run of 2013 draws to a close, the extensive upgrade and maintenance programme of the LHC's first long shutdown (LS1) is about to get under way.   The LHC has provided physicists with a huge quantity of data to analyse since the first physics run in 2009. Now it's time for the machine, along with CERN's other accelerators, to get a facelift. LS1 will start on 13 February 2013, but this doesn’t mean that life at the Laboratory will be any less rich and exciting. Although there will be no collisions for a period of almost two years, the whole CERN site will be a hive of activity, with large-scale work under way to modernise the infrastructure and prepare the LHC for operation at higher energy. "A whole series of renovation work will be carried out around the LHC during LS1,” explains Simon Baird, deputy head of the EN Department. "The key driver is of course the consolidation of the 10,170 high-curren...

  8. Gene-Auto: Automatic Software Code Generation for Real-Time Embedded Systems

    Science.gov (United States)

    Rugina, A.-E.; Thomas, D.; Olive, X.; Veran, G.

    2008-08-01

    This paper gives an overview of the Gene-Auto ITEA European project, which aims at building a qualified C code generator from mathematical models under Matlab-Simulink and Scilab-Scicos. The project is driven by major European industry partners active in the real-time embedded systems domain. The Gene-Auto code generator will significantly improve the current development processes in such domains by shortening the time to market and by guaranteeing the quality of the generated code through the use of formal methods. The first version of the Gene-Auto code generator has already been released and has gone through a validation phase on real-life case studies defined by each project partner. The validation results are being taken into account in the implementation of the second version of the code generator. The partners aim at introducing the Gene-Auto results into industrial development by 2010.

  9. Global Convergence of a Modified LS Method

    Directory of Open Access Journals (Sweden)

    Liu JinKui

    2012-01-01

    Full Text Available The LS method is one of the effective conjugate gradient methods for solving unconstrained optimization problems. This paper presents a modified LS method based on the classical LS method and proves its strong global convergence for uniformly convex functions and its global convergence for general functions under the strong Wolfe line search. Numerical experiments show that the modified LS method is very effective in practice.
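    The LS (Liu-Storey) conjugate gradient direction uses beta_k = g_k^T (g_k - g_{k-1}) / (-d_{k-1}^T g_{k-1}). The sketch below is a minimal illustration on a quadratic test function, for which the exact line-search step is available in closed form; the paper itself analyzes the method under the (inexact) strong Wolfe line search, and this sketch omits the modifications the paper introduces.

```python
# Liu-Storey conjugate gradient on f(x) = x1^2 + 10*x2^2 (minimizer at 0).
def f(x):
    return x[0] ** 2 + 10.0 * x[1] ** 2

def grad(x):
    return [2.0 * x[0], 20.0 * x[1]]

def hess_vec(d):                    # A*d for A = diag(2, 20)
    return [2.0 * d[0], 20.0 * d[1]]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def ls_cg(x, iters=10, tol=1e-20):
    g = grad(x)
    d = [-gi for gi in g]           # first direction: steepest descent
    for _ in range(iters):
        if dot(g, g) < tol:
            break
        t = -dot(g, d) / dot(d, hess_vec(d))     # exact step along d
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x)
        y = [gn - go for gn, go in zip(g_new, g)]
        beta = dot(g_new, y) / (-dot(d, g))      # the Liu-Storey formula
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x_star = ls_cg([3.0, 1.0])
print(f(x_star) < 1e-10)  # True: converged to the minimizer (0, 0)
```

On a quadratic with exact line search the LS beta coincides with the classical conjugate gradient choice, so the iteration terminates in at most n steps; the interest of the paper's analysis is precisely the general, non-quadratic case with an inexact line search.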

  10. Bursts generate a non-reducible spike-pattern code

    Directory of Open Access Journals (Sweden)

    Hugo G Eyherabide

    2009-05-01

    Full Text Available On the single-neuron level, precisely timed spikes can either constitute firing-rate codes or spike-pattern codes that utilize the relative timing between consecutive spikes. There has been little experimental support for the hypothesis that such temporal patterns contribute substantially to information transmission. Using grasshopper auditory receptors as a model system, we show that correlations between spikes can be used to represent behaviorally relevant stimuli. The correlations reflect the inner structure of the spike train: a succession of burst-like patterns. We demonstrate that bursts with different spike counts encode different stimulus features, such that about 20% of the transmitted information corresponds to discriminating between different features, and the remaining 80% is used to allocate these features in time. In this spike-pattern code, the "what" and the "when" of the stimuli are encoded in the duration of each burst and the time of burst onset, respectively. Given the ubiquity of burst firing, we expect similar findings also for other neural systems.

  11. CITADEL: a computer code for the analysis of iodine behavior in steam generator tube rupture accidents

    International Nuclear Information System (INIS)

    1982-04-01

    The computer code CITADEL was written to analyze iodine behavior during steam generator tube rupture accidents. The code models the transport and deposition of iodine from its point of escape at the steam generator primary break until its release to the environment. This report provides a brief description of the code, including its input requirements and the nature and form of its output. A user's guide describing how the input data must be set up to run the code is also provided.

  12. Texture side information generation for distributed coding of video-plus-depth

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Raket, Lars Lau; Zamarin, Marco

    2013-01-01

    We consider distributed video coding in a monoview video-plus-depth scenario, aiming at coding textures jointly with their corresponding depth stream. Distributed Video Coding (DVC) is a video coding paradigm in which the complexity is shifted from the encoder to the decoder. The Side Information...... components) is strongly correlated, so the additional depth information may be used to generate more accurate SI for the texture stream, increasing the efficiency of the system. In this paper we propose various methods for accurate texture SI generation, comparing them with other state-of-the-art solutions...

  13. Auto Code Generation for Simulink-Based Attitude Determination Control System

    Science.gov (United States)

    MolinaFraticelli, Jose Carlos

    2012-01-01

    This paper details the work done to auto-generate C code from a Simulink-based Attitude Determination Control System (ADCS) for use on target platforms. NASA Marshall engineers have developed an ADCS Simulink simulation to be used as a component of the flight software of a satellite. The generated code can be used for carrying out hardware-in-the-loop testing of satellite components in a convenient manner with easily tunable parameters. Due to the nature of embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as it is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can bring new complications into the simulation: the execution order of these models can change as a result. Great care must be taken in order to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, it can be said that the process is a success, since all the output requirements are met. Based on these results, it can be argued that this generated C code can be effectively used by any desired platform as long as it follows the specific memory requirements established in the Simulink model.

  14. Detector Plans for LS1

    Energy Technology Data Exchange (ETDEWEB)

    Nessi, M [European Organization for Nuclear Research, Geneva (Switzerland)]

    2012-07-01

    All experiments plan an effective usage of the LS1 shutdown period. After three years of running they will go through a consolidation phase, mostly to fix problems that have emerged over time, like single points of failure in the infrastructure, failures of low voltage power supplies and optical links. Upgrades of some detector components will start, mainly related to the beam pipe, the innermost tracker elements and the trigger system. Detector components, which had to be staged for cost reasons in 2003, will then enter into the detector setup. The goal is to be fully ready for the new energy regime at nominal luminosity.

  15. On the use of SERPENT Monte Carlo code to generate few group diffusion constants

    Energy Technology Data Exchange (ETDEWEB)

    Piovezan, Pamela, E-mail: pamela.piovezan@ctmsp.mar.mil.b [Centro Tecnologico da Marinha em Sao Paulo (CTMSP), Sao Paulo, SP (Brazil)]; Carluccio, Thiago; Domingos, Douglas Borges; Rossi, Pedro Russo; Mura, Luiz Felipe, E-mail: fermium@cietec.org.b, E-mail: thiagoc@ipen.b [Fermium Tecnologia Nuclear, Sao Paulo, SP (Brazil); Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]

    2011-07-01

    The accuracy of diffusion reactor codes strongly depends on the quality of the group constants processing. For many years, the generation of such constants was based on 1-D infinite cell transport calculations. Some developments using collision probability or the method of characteristics allow, nowadays, 2-D assembly group constant calculations. However, these 1-D and 2-D codes show some limitations, for example, on complex geometries and in the neighborhood of heavy absorbers. On the other hand, since Monte Carlo (MC) codes provide accurate neutron flux distributions, the possibility of using these solutions to provide group constants to full-core reactor diffusion simulators has been recently investigated, especially for the cases in which the geometry and reactor types are beyond the capability of the conventional deterministic lattice codes. The two greatest difficulties in the use of MC codes for group constant generation are the computational costs and the methodological incompatibility between analog MC particle transport simulation and deterministic transport methods based on several approximations. The SERPENT code is a 3-D continuous energy MC transport code with built-in burnup capability that was specially optimized to generate these group constants. In this work, we present the preliminary results of using the SERPENT MC code to generate 3-D two-group diffusion constants for a PWR-like assembly. These constants were used in the CITATION diffusion code to investigate the effects of the MC group constants determination on the neutron multiplication factor diffusion estimate. (author)
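The few-group condensation that such lattice codes perform amounts to flux-weighted averaging of fine-group cross sections. A minimal sketch with purely illustrative numbers (the values, group boundaries, and function names are assumptions, not data from the paper):

```python
# Hypothetical fine-group data (illustrative numbers, not from the paper).
fine_sigma = [0.30, 0.28, 0.50, 1.20]   # fine-group cross sections (1/cm)
fine_flux  = [1.0, 0.8, 0.4, 0.1]       # fine-group scalar flux weights

# Collapse fine groups 0-1 into a fast group and 2-3 into a thermal group.
groups = {"fast": [0, 1], "thermal": [2, 3]}

def collapse(sigma, flux, members):
    """Flux-weighted condensation: sum(sigma_i * phi_i) / sum(phi_i)."""
    num = sum(sigma[i] * flux[i] for i in members)
    den = sum(flux[i] for i in members)
    return num / den

two_group = {g: collapse(fine_sigma, fine_flux, m) for g, m in groups.items()}
```

The same reaction-rate-preserving average underlies both deterministic and MC-based group constant generation; what differs is where the weighting flux comes from.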

  16. Generating Importance Map for Geometry Splitting using Discrete Ordinates Code in Deep Shielding Problem

    International Nuclear Information System (INIS)

    Kim, Jong Woon; Lee, Young Ouk

    2016-01-01

    When we use the MCNP code for a deep shielding problem, we prefer to use a variance reduction technique such as geometry splitting, weight windows, or source biasing to keep the relative error within a reliable confidence interval. To generate an importance map for geometry splitting in an MCNP calculation, we should know the number of tracks entering each cell and the previous importance of each cell, since a new importance is calculated based on this information. In a deep shielding problem where zero tracks enter a cell, we cannot generate a new importance map. In this case, a discrete ordinates code can provide the information to generate the importance map easily. In this paper, we use the AETIUS code as the discrete ordinates code. The importance map for MCNP is generated based on the zone-averaged flux of the AETIUS calculation. The discretization of space, angle, and energy is not necessary for an MCNP calculation. This is the big merit of the MCNP code compared to a deterministic code. However, a deterministic code (i.e., AETIUS) can provide a rough estimate of the flux throughout a problem relatively quickly. This can help MCNP by providing variance reduction parameters. Recently, the ADVANTG code was released. This is an automated tool for generating variance reduction parameters for fixed-source continuous-energy Monte Carlo simulations with MCNP5 v1.60.
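One common recipe, sketched here with hypothetical flux values, is to take cell importances inversely proportional to the deterministic zone-averaged flux, normalized to the source cell (the exact scheme used with AETIUS may differ):

```python
# Illustrative zone-averaged fluxes from a deterministic (discrete ordinates)
# calculation, falling off through a thick shield (hypothetical values).
zone_flux = [1.0, 1e-2, 1e-4, 1e-6]

# For geometry splitting, a common recipe is to set cell importance inversely
# proportional to the flux, normalized so the source cell has importance 1;
# particles are then split as they move toward higher-importance cells.
importances = [zone_flux[0] / phi for phi in zone_flux]  # ratios ~1, 1e2, 1e4, 1e6
```

Cells that the analog simulation barely populates receive large importances, so the splitting keeps a usable particle population deep in the shield.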

  17. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    Following the relevant technical standards (e.g. IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost and effort, a tool should be used that is developed independently from the development of the code generator. For this purpose ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  18. Developments in the Generation and Interpretation of Wire Codes (invited paper)

    International Nuclear Information System (INIS)

    Ebi, K.L.

    1999-01-01

    Three new developments in the generation and interpretation of wire codes are discussed. First, a method was developed to computer generate wire codes using data gathered from a utility database of the local distribution system and from tax assessor records. This method was used to wire code more than 250,000 residences in the greater Denver metropolitan area. There was an approximate 75% agreement with field wire coding. Other research in Denver suggests that wire codes predict some characteristics of a residence and its neighbourhood, including age, assessed value, street layout and traffic density. A third new development is the case-specular method to study the association between wire codes and childhood cancers. Recent results from applying the method to the Savitz et al and London et al studies suggest that the associations between childhood cancer and VHCC residences were strongest for residences with a backyard rather than street service drop, and for VHCC residences with LCC speculars. (author)

  19. Strong normalization by type-directed partial evaluation and run-time code generation

    DEFF Research Database (Denmark)

    Balat, Vincent; Danvy, Olivier

    1998-01-01

    We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long βη-normal form. Caml uses a virtual machine and has the capability...... to load byte code at run time. Representing the long βη-normal forms as byte code gives us the ability to strongly normalize higher-order values (i.e., weak head normal forms in ML), to compile the resulting strong normal forms into byte code, and to load this byte code all in one go, at run time. We...... conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language....

  20. Strong Normalization by Type-Directed Partial Evaluation and Run-Time Code Generation

    DEFF Research Database (Denmark)

    Balat, Vincent; Danvy, Olivier

    1997-01-01

    We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long βη-normal form. Caml uses a virtual machine and has the capability...... to load byte code at run time. Representing the long βη-normal forms as byte code gives us the ability to strongly normalize higher-order values (i.e., weak head normal forms in ML), to compile the resulting strong normal forms into byte code, and to load this byte code all in one go, at run time. We...... conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language....

  1. Perspectives on the development of next generation reactor systems safety analysis codes

    International Nuclear Information System (INIS)

    Zhang, H.

    2015-01-01

    'Full text:' Existing reactor system analysis codes, such as RELAP5-3D and TRAC, have gained worldwide success in supporting reactor safety analyses, as well as design and licensing of new reactors. These codes are important assets to the nuclear engineering research community, as well as to the nuclear industry. However, most of these codes were originally developed during the 1970s, and it becomes necessary to develop next-generation reactor system analysis codes for several reasons. Firstly, as new reactor designs emerge, there are new challenges emerging in numerical simulations of reactor systems such as long-lasting transients and multi-physics phenomena. These new requirements are beyond the range of applicability of the existing system analysis codes. Advanced modeling and numerical methods must be taken into consideration to improve the existing capabilities. Secondly, by developing next-generation reactor system analysis codes, the knowledge (know-how) in two-phase flow modeling and the highly complex constitutive models will be transferred to the young generation of nuclear engineers. And thirdly, all computer codes have limited shelf life. It becomes less and less cost-effective to maintain a legacy code, due to the fast change of computer hardware and software environment. There are several critical perspectives in terms of developing next-generation reactor system analysis codes: 1) The success of the next-generation codes must be built upon the success of the existing codes. The knowledge of the existing codes, not just simply the manuals and codes, but knowing why and how, must be transferred to the next-generation codes. The next-generation codes should encompass the capability of the existing codes. The shortcomings of existing codes should be identified, understood, and properly categorized, for example into model deficiencies or numerical method deficiencies. 2) State-of-the-art models and numerical methods must be considered to

  2. Perspectives on the development of next generation reactor systems safety analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, H., E-mail: Hongbin.Zhang@inl.gov [Idaho National Laboratory, Idaho Falls, ID (United States)

    2015-07-01

    'Full text:' Existing reactor system analysis codes, such as RELAP5-3D and TRAC, have gained worldwide success in supporting reactor safety analyses, as well as design and licensing of new reactors. These codes are important assets to the nuclear engineering research community, as well as to the nuclear industry. However, most of these codes were originally developed during the 1970s, and it becomes necessary to develop next-generation reactor system analysis codes for several reasons. Firstly, as new reactor designs emerge, there are new challenges emerging in numerical simulations of reactor systems such as long-lasting transients and multi-physics phenomena. These new requirements are beyond the range of applicability of the existing system analysis codes. Advanced modeling and numerical methods must be taken into consideration to improve the existing capabilities. Secondly, by developing next-generation reactor system analysis codes, the knowledge (know-how) in two-phase flow modeling and the highly complex constitutive models will be transferred to the young generation of nuclear engineers. And thirdly, all computer codes have limited shelf life. It becomes less and less cost-effective to maintain a legacy code, due to the fast change of computer hardware and software environment. There are several critical perspectives in terms of developing next-generation reactor system analysis codes: 1) The success of the next-generation codes must be built upon the success of the existing codes. The knowledge of the existing codes, not just simply the manuals and codes, but knowing why and how, must be transferred to the next-generation codes. The next-generation codes should encompass the capability of the existing codes. The shortcomings of existing codes should be identified, understood, and properly categorized, for example into model deficiencies or numerical method deficiencies. 2) State-of-the-art models and numerical methods must be considered to

  3. SPS completes LS1 activities

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    On 27 June, the SPS closed its doors to the LS1 engineers, bringing to an end almost 17 months of activities. The machine now enters the hardware-testing phase in preparation for an October restart.   Photo 1: The SPS transfer tunnel, TT10, reinforced with steel beams. Having completed their LS1 activities right on schedule (to the day!), the SPS team is now preparing the machine for its restart. Over the next eight weeks, hardware tests of the SPS dipole and quadrupole power converters will be underway, led by the TE-EPC (Electrical Power Converters) team. "OP start-up test activities will also be running in parallel, utilising the off hours when EPC is not using the machine," says David McFarlane, the SPS technical coordinator from the Engineering Department. "The primary beam testing phase will start at the beginning of September, once hardware tests and DSO safety tests have been completed." It has been a long journey to this point, with several major...

  4. FASOR - A second generation shell of revolution code

    Science.gov (United States)

    Cohen, G. A.

    1978-01-01

    An integrated computer program entitled Field Analysis of Shells of Revolution (FASOR) currently under development for NASA is described. When completed, this code will treat prebuckling, buckling, initial postbuckling and vibrations under axisymmetric static loads as well as linear response and bifurcation under asymmetric static loads. Although these modes of response are treated by existing programs, FASOR extends the class of problems treated to include general anisotropy and transverse shear deformations of stiffened laminated shells. At the same time, a primary goal is to develop a program which is free of the usual problems of modeling, numerical convergence and ill-conditioning, laborious problem setup, limitations on problem size and interpretation of output. The field method is briefly described, the shell differential equations are cast in a suitable form for solution by this method and essential aspects of the input format are presented. Numerical results are given for both unstiffened and stiffened anisotropic cylindrical shells and compared with previously published analytical solutions.

  5. Information Theoretic Secret Key Generation: Structured Codes and Tree Packing

    Science.gov (United States)

    Nitinawarat, Sirin

    2010-01-01

    This dissertation deals with a multiterminal source model for secret key generation by multiple network terminals with prior and privileged access to a set of correlated signals complemented by public discussion among themselves. Emphasis is placed on a characterization of secret key capacity, i.e., the largest rate of an achievable secret key,…

  6. KCUT, code to generate minimal cut sets for fault trees

    International Nuclear Information System (INIS)

    Han, Sang Hoon

    2008-01-01

    1 - Description of program or function: KCUT is a software tool to generate minimal cut sets for fault trees. 2 - Methods: Expand the fault tree into cut sets and delete non-minimal cut sets. 3 - Restrictions on the complexity of the problem: Size and complexity of the fault tree
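The two-step method described (expand the tree into cut sets, then delete non-minimal ones) can be sketched for a small hypothetical fault tree; the gate names and basic events below are invented for illustration:

```python
from itertools import product

# Hypothetical fault tree: gates map to ("AND"/"OR", [children]);
# names not present as keys are basic events.
tree = {
    "TOP": ("OR", ["G1", "A"]),
    "G1":  ("AND", ["B", "G2"]),
    "G2":  ("OR", ["A", "C"]),
}

def cut_sets(node):
    """Expand a gate into its cut sets (sets of basic events)."""
    if node not in tree:                         # basic event
        return [frozenset([node])]
    op, children = tree[node]
    child_sets = [cut_sets(c) for c in children]
    if op == "OR":                               # union of the children's cut sets
        return [cs for sets in child_sets for cs in sets]
    # AND: cross-product, merging one cut set taken from each child
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal(sets):
    """Delete every cut set that strictly contains another cut set."""
    return [s for s in sets if not any(t < s for t in sets)]

mcs = minimal(cut_sets("TOP"))
```

Here {A, B} is discarded because {A} alone already fails the top event, leaving the minimal cut sets {A} and {B, C}.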

  7. ADGEN: An automated adjoint code generator for large-scale sensitivity analysis

    International Nuclear Information System (INIS)

    Pin, F.G.; Oblow, E.M.; Horwedel, J.E.; Lucius, J.L.

    1987-01-01

    This paper describes a new automated system, named ADGEN, which makes use of the strengths of computer calculus to automate the costly and time-consuming calculation of derivatives in FORTRAN computer codes, and automatically generate adjoint solutions of computer codes

  8. Generation of neutron cross sections library for the Thermos code of the Fuel management System (FMS)

    International Nuclear Information System (INIS)

    Alonso V, G.; Viais J, J.

    1990-10-01

    A method is developed to generate the library of neutron cross sections for the Thermos code using the ENDF/B-IV database and the NJOY code. The obtained results are compared with the previous version of the library of neutron cross sections, which was processed using ENDF/B-III. (Author)

  9. SLACINPT - a FORTRAN program that generates boundary data for the SLAC gun code

    International Nuclear Information System (INIS)

    Michel, W.L.; Hepburn, J.D.

    1982-03-01

    The FORTRAN program SLACINPT was written to simplify the preparation of boundary data for the SLAC gun code. In SLACINPT, the boundary is described by a sequence of straight line or arc segments. From these, the program generates the individual boundary mesh point data, required as input by the SLAC gun code

  10. INGEN: a general-purpose mesh generator for finite element codes

    International Nuclear Information System (INIS)

    Cook, W.A.

    1979-05-01

    INGEN is a general-purpose mesh generator for two- and three-dimensional finite element codes. The basic parts of the code are surface and three-dimensional region generators that use linear-blending interpolation formulas. These generators are based on an i, j, k index scheme that is used to number nodal points, construct elements, and develop displacement and traction boundary conditions. This code can generate truss elements (2 nodal points); plane stress, plane strain, and axisymmetry two-dimensional continuum elements (4 to 8 nodal points); plate elements (4 to 8 nodal points); and three-dimensional continuum elements (8 to 21 nodal points). The traction loads generated are consistent with the element generated. The expansion--contraction option is of special interest. This option makes it possible to change an existing mesh such that some regions are refined and others are made coarser than the original mesh. 9 figures
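The linear-blending interpolation such generators use builds interior mesh points from the four boundary curves of a region. A minimal 2-D sketch, assuming straight-line boundaries on the unit square (an illustration of the general formula, not INGEN's actual implementation):

```python
# Transfinite (linear-blending) interpolation: an interior point of a region
# is blended from its four parametric boundary curves.
def lerp(p, q, t):
    """Linear interpolation between two points."""
    return tuple((1 - t) * a + t * b for a, b in zip(p, q))

def blend(bottom, top, left, right, u, v):
    """Coons-style linear blending of four boundary curves at (u, v)."""
    ruled_uv = lerp(bottom(u), top(u), v)        # blend bottom-top
    ruled_vu = lerp(left(v), right(v), u)        # blend left-right
    corner = lerp(lerp(bottom(0), top(0), v),    # bilinear corner correction
                  lerp(bottom(1), top(1), v), u)
    return tuple(x + y - z for x, y, z in zip(ruled_uv, ruled_vu, corner))

# Illustrative straight boundaries of the unit square.
bottom = lambda u: (u, 0.0)
top    = lambda u: (u, 1.0)
left   = lambda v: (0.0, v)
right  = lambda v: (1.0, v)
p = blend(bottom, top, left, right, 0.25, 0.75)  # an interior mesh node
```

With curved boundary functions the same formula produces body-fitted interior nodes, which is what makes it attractive for mesh generation.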

  11. Collective Sensing of β-Cells Generates the Metabolic Code

    Directory of Open Access Journals (Sweden)

    Dean Korošak

    2018-01-01

    Full Text Available Major part of a pancreatic islet is composed of β-cells that secrete insulin, a key hormone regulating influx of nutrients into all cells in a vertebrate organism to support nutrition, housekeeping or energy storage. β-cells constantly communicate with each other using both direct, short-range interactions through gap junctions, and paracrine long-range signaling. However, how these cell interactions shape collective sensing and cell behavior in islets that leads to insulin release is unknown. When stimulated by specific ligands, primarily glucose, β-cells collectively respond with expression of a series of transient Ca2+ changes on several temporal scales. Here we reanalyze a set of Ca2+ spike trains recorded in acute rodent pancreatic tissue slice under physiological conditions. We found strongly correlated states of co-spiking cells coexisting with mostly weak pairwise correlations widespread across the islet. Furthermore, the collective Ca2+ spiking activity in islet shows on-off intermittency with scaling of spiking amplitudes, and stimulus dependent autoassociative memory features. We use a simple spin glass-like model for the functional network of a β-cell collective to describe these findings and argue that Ca2+ spike trains produced by collective sensing of β-cells constitute part of the islet metabolic code that regulates insulin release and limits the islet size.

  12. Detectors plans for LS1

    International Nuclear Information System (INIS)

    Nessi, M.

    2012-01-01

    All experiments plan an effective usage of the LS1 shutdown period. After three years of running they will go through a consolidation phase, mostly to fix problems that have emerged over time, like single points of failure in the infrastructure, failures of low-voltage power supplies and optical links. Upgrades of some detector components will start, mainly related to the beam pipe, the innermost tracker elements and the trigger system. Detector components, which had to be staged for cost reasons in 2003, will then enter into the detector setup. The goal is to be fully ready for the new energy regime at nominal luminosity. This article reviews the planned maintenance and modification works for ATLAS, CMS, LHCb and ALICE experiments. (author)

  13. Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder

    Science.gov (United States)

    Staats, Matt

    2009-01-01

    We present work on a prototype tool based on the JavaPathfinder (JPF) model checker for automatically generating tests satisfying the MC/DC code coverage criterion. Using the Eclipse IDE, developers and testers can quickly instrument Java source code with JPF annotations covering all MC/DC coverage obligations, and JPF can then be used to automatically generate tests that satisfy these obligations. The prototype extension to JPF enables various tasks useful in automatic test generation to be performed, such as test suite reduction and execution of generated tests.
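MC/DC requires showing, for each condition, a pair of tests that differ only in that condition and flip the decision outcome. A small illustrative check (the decision expression is invented, and this brute-force search is not how JPF generates tests):

```python
from itertools import product

def decision(a, b, c):
    """Invented three-condition decision expression."""
    return a and (b or c)

# MC/DC: for each condition, find two test vectors that differ only in that
# condition and produce different decision outcomes (an "independence pair").
conds = ["a", "b", "c"]
pairs = {}
for i, name in enumerate(conds):
    for v in product([False, True], repeat=3):
        w = list(v)
        w[i] = not w[i]
        w = tuple(w)
        if decision(*v) != decision(*w):
            pairs[name] = (v, w)   # this pair covers condition `name`
            break
```

A model checker like JPF instead searches the program's state space for executions that discharge each such obligation.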

  14. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    with pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol software implementing the IETF WebSocket protocol. This demonstrates the scalability of our approach to real...... protocols. Furthermore, we perform formal verification of the CPN model prior to code generation, and test the implementation for interoperability against the Autobahn WebSocket test-suite resulting in 97% and 99% success rate for the client and server implementation, respectively. The tests show...

  15. A platform independent framework for Statecharts code generation

    International Nuclear Information System (INIS)

    Andolfato, L.; Chiozzi, G.; Migliorini, N.; Morales, C.

    2012-01-01

    Control systems for telescopes and their instruments are reactive systems very well suited to be modelled using the Statecharts formalism. The World Wide Web Consortium is working on a new standard called SCXML that specifies XML notation to describe Statecharts and provides a well-defined operational semantics for run-time interpretation of the SCXML models. This paper presents a generic application framework for reactive non-real-time systems based on interpreted Statecharts. The framework consists of a model-to-text transformation tool and an SCXML interpreter. The tool generates from UML state machine models the SCXML representation of the state machines as well as the application skeletons for the supported software platforms. An abstraction layer propagates the events from the middleware to the SCXML interpreter, facilitating the support for different software platforms. This project benefits from the positive experience gained in several years of development of coordination and monitoring applications for the telescope control software domain using Model Driven Development technologies. (authors)

  16. Domain-specific modeling enabling full code generation

    CERN Document Server

    Kelly, Steven

    2007-01-01

    Domain-Specific Modeling (DSM) is the latest approach to software development, promising to greatly increase the speed and ease of software creation. Early adopters of DSM have been enjoying productivity increases of 500–1000% in production for over a decade. This book introduces DSM and offers examples from various fields to illustrate to experienced developers how DSM can improve software development in their teams. Two authorities in the field explain what DSM is, why it works, and how to successfully create and use a DSM solution to improve productivity and quality. Divided into four parts, the book covers: background and motivation; fundamentals; in-depth examples; and creating DSM solutions. There is an emphasis throughout the book on practical guidelines for implementing DSM, including how to identify the necessary language constructs, how to generate full code from models, and how to provide tool support for a new DSM language. The example cases described in the book are available on the book's Website, www.dsmbook....

  17. Multiplicative Structure and Hecke Rings of Generator Matrices for Codes over Quotient Rings of Euclidean Domains

    Directory of Open Access Journals (Sweden)

    Hajime Matsui

    2017-12-01

    Full Text Available In this study, we consider codes over Euclidean domains modulo their ideals. In the first half of the study, we deal with arbitrary Euclidean domains. We show that the product of generator matrices of codes over the rings mod a and mod b produces generator matrices of all codes over the ring mod ab, i.e., this correspondence is onto. Moreover, we show that if a and b are coprime, then this correspondence is one-to-one, i.e., there exist unique codes over the rings mod a and mod b that produce any given code over the ring mod ab through the product of their generator matrices. In the second half of the study, we focus on the typical Euclidean domains such as the rational integer ring, one-variable polynomial rings, rings of Gaussian and Eisenstein integers, p-adic integer rings and rings of one-variable formal power series. We define the reduced generator matrices of codes over Euclidean domains modulo their ideals and show their uniqueness. Finally, we apply our theory of reduced generator matrices to the Hecke rings of matrices over these Euclidean domains.
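The construction can be tried on a toy example over Z/6 = Z/(2·3). The generator matrices below are invented for illustration, and the check only verifies that the product's row span is a code over Z/6 (closed under addition), not the full onto/one-to-one theorem:

```python
a, b = 2, 3   # coprime moduli, so Z/6 decomposes as Z/2 x Z/3 by the CRT

# Invented 2x2 generator matrices, read modulo a and modulo b respectively.
Ga = [[1, 1], [0, 2]]
Gb = [[1, 2], [0, 1]]

def matmul_mod(X, Y, m):
    """Matrix product with entries reduced modulo m."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y))) % m
             for j in range(len(Y[0]))] for i in range(len(X))]

Gab = matmul_mod(Ga, Gb, a * b)   # generator matrix of a code over Z/6

# The code is the row span over Z/6; verify closure under addition.
m = a * b
span = {tuple((x * Gab[0][j] + y * Gab[1][j]) % m for j in range(2))
        for x in range(m) for y in range(m)}
closed = all(tuple((u[j] + v[j]) % m for j in range(2)) in span
             for u in span for v in span)
```

Here the product is Gab = [[1, 3], [0, 2]] and its row span has 18 of the 36 length-2 words over Z/6.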

  18. Verification of 3-D generation code package for neutronic calculations of WWERs

    International Nuclear Information System (INIS)

    Sidorenko, V.D.; Aleshin, S.S.; Bolobov, P.A.; Bolshagin, S.N.; Lazarenko, A.P.; Markov, A.V.; Morozov, V.V.; Syslov, A.A.; Tsvetkov, V.M.

    2000-01-01

    Materials on verification of the third-generation code package for WWER neutronic calculations are presented. The package includes: the spectral code TVS-M; the 2-D fine-mesh diffusion code PERMAK-A for 4- or 6-group calculations of WWER core burnup; and the 3-D coarse-mesh diffusion code BIPR-7A for 2-group calculations of quasi-stationary WWER regimes. The materials include both TVS-M verification data and verification data on the PERMAK-A and BIPR-7A codes using constant libraries generated with TVS-M. All materials are related to fuel without Gd. TVS-M verification materials include results of comparison both with benchmark calculations obtained by other codes and with experiments carried out at the ZR-6 critical facility. PERMAK-A verification materials contain results of comparison with TVS-M calculations and with ZR-6 experiments. BIPR-7A materials include comparison with operation data for the Dukovany-2 and Loviisa-1 NPPs (WWER-440) and for Balakovo NPP Unit 4 (WWER-1000). The verification materials demonstrate rather good accuracy of the calculations obtained with the third-generation code package. (Authors)

  19. Development of the next generation code system as an engineering modeling language. (2). Study with prototyping

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Uto, Nariaki; Kasahara, Naoto; Ishikawa, Makoto

    2003-04-01

    In fast reactor development, numerical simulation using analytical codes plays an important role in complementing theory and experiment. It is necessary that the engineering models and analysis methods can be flexibly changed, because the phenomena to be investigated become more complicated due to the diversity of research needs. In addition, there are large problems in combining physical properties and engineering models from many different fields. Aiming at the realization of a next-generation code system which can solve those problems, the authors adopted three methods to make a prototype of the next-generation code system: (1) multi-language programming (SoftWIRE.NET, Visual Basic.NET and Fortran); (2) Fortran 90; and (3) Python. As a result, the following was confirmed. (1) It is possible to reuse a function of the existing codes written in Fortran as an object of the next-generation code system by using Visual Basic.NET. (2) The maintainability of existing code written in Fortran 77 can be improved by using the new features of Fortran 90. (3) A toolbox-type code system can be built by using Python. (author)

  20. Architectural and Algorithmic Requirements for a Next-Generation System Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    V.A. Mousseau

    2010-05-01

    This document presents high-level architectural and system requirements for a next-generation system analysis code (NGSAC) to support reactor safety decision-making by plant operators and others, especially in the context of light water reactor plant life extension. The capabilities of NGSAC will be different from those of current-generation codes, not only because computers have evolved significantly in the generations since the current paradigm was first implemented, but because the decision-making processes that need the support of next-generation codes are very different from the decision-making processes that drove the licensing and design of the current fleet of commercial nuclear power reactors. The implications of these newer decision-making processes for NGSAC requirements are discussed, and resulting top-level goals for the NGSAC are formulated. From these goals, the general architectural and system requirements for the NGSAC are derived.

  1. Validation of the WIMSD4M cross-section generation code with benchmark results

    International Nuclear Information System (INIS)

    Deen, J.R.; Woodruff, W.L.; Leal, L.E.

    1995-01-01

    The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D2O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.

  2. Validation of the WIMSD4M cross-section generation code with benchmark results

    Energy Technology Data Exchange (ETDEWEB)

    Deen, J.R.; Woodruff, W.L. [Argonne National Lab., IL (United States); Leal, L.E. [Oak Ridge National Lab., TN (United States)

    1995-01-01

    The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D{sub 2}O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.

  3. Mesh generation and energy group condensation studies for the jaguar deterministic transport code

    International Nuclear Information System (INIS)

    Kennedy, R. A.; Watson, A. M.; Iwueke, C. I.; Edwards, E. J.

    2012-01-01

    The deterministic transport code Jaguar is introduced, and the modeling process for Jaguar is demonstrated using a two-dimensional assembly model of the Hoogenboom-Martin Performance Benchmark Problem. This single assembly model is being used to test and analyze optimal modeling methodologies and techniques for Jaguar. This paper focuses on spatial mesh generation and energy condensation techniques. In this summary, the models and processes are defined as well as thermal flux solution comparisons with the Monte Carlo code MC21. (authors)

  4. Mesh generation and energy group condensation studies for the jaguar deterministic transport code

    Energy Technology Data Exchange (ETDEWEB)

    Kennedy, R. A.; Watson, A. M.; Iwueke, C. I.; Edwards, E. J. [Knolls Atomic Power Laboratory, Bechtel Marine Propulsion Corporation, P.O. Box 1072, Schenectady, NY 12301-1072 (United States)

    2012-07-01

    The deterministic transport code Jaguar is introduced, and the modeling process for Jaguar is demonstrated using a two-dimensional assembly model of the Hoogenboom-Martin Performance Benchmark Problem. This single assembly model is being used to test and analyze optimal modeling methodologies and techniques for Jaguar. This paper focuses on spatial mesh generation and energy condensation techniques. In this summary, the models and processes are defined as well as thermal flux solution comparisons with the Monte Carlo code MC21. (authors)

  5. Demonstration of Automatically-Generated Adjoint Code for Use in Aerodynamic Shape Optimization

    Science.gov (United States)

    Green, Lawrence; Carle, Alan; Fagan, Mike

    1999-01-01

    Gradient-based optimization requires accurate derivatives of the objective function and constraints. These gradients may have previously been obtained by manual differentiation of analysis codes, symbolic manipulators, finite-difference approximations, or existing automatic differentiation (AD) tools such as ADIFOR (Automatic Differentiation in FORTRAN). Each of these methods has certain deficiencies, particularly when applied to complex, coupled analyses with many design variables. Recently, a new AD tool called ADJIFOR (Automatic Adjoint Generation in FORTRAN), based upon ADIFOR, was developed and demonstrated. Whereas ADIFOR implements forward-mode (direct) differentiation throughout an analysis program to obtain exact derivatives via the chain rule of calculus, ADJIFOR implements the reverse-mode counterpart of the chain rule to obtain exact adjoint form derivatives from FORTRAN code. Automatically-generated adjoint versions of the widely-used CFL3D computational fluid dynamics (CFD) code and an algebraic wing grid generation code were obtained with just a few hours processing time using the ADJIFOR tool. The codes were verified for accuracy and were shown to compute the exact gradient of the wing lift-to-drag ratio, with respect to any number of shape parameters, in about the time required for 7 to 20 function evaluations. The codes have now been executed on various computers with typical memory and disk space for problems with up to 129 x 65 x 33 grid points, and for hundreds to thousands of independent variables. These adjoint codes are now used in a gradient-based aerodynamic shape optimization problem for a swept, tapered wing. For each design iteration, the optimization package constructs an approximate, linear optimization problem, based upon the current objective function, constraints, and gradient values. 
The optimizer subroutines are called within a design loop employing the approximate linear problem until an optimum shape is found, the design loop
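The reverse-mode chain rule that ADJIFOR applies to FORTRAN source can be illustrated with a toy sketch (Python here for brevity; `Var` and `backward` are illustrative names, not part of ADJIFOR): each operation records its local derivatives, and one backward sweep accumulates the gradient of a single output with respect to all inputs, which is why the cost is comparable to a few function evaluations regardless of the number of design variables.

```python
# Minimal reverse-mode automatic differentiation sketch (illustrative only;
# ADJIFOR itself transforms FORTRAN source, it does not work like this).

class Var:
    """A scalar that records the local derivatives of the op that made it."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents      # list of (parent_var, local_gradient)
        self.grad = 0.0

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

def backward(output):
    """Propagate adjoints via the chain rule; each stack entry carries the
    incremental adjoint flowing along one edge, so ordering is irrelevant."""
    stack = [(output, 1.0)]
    while stack:
        node, adj = stack.pop()
        node.grad += adj
        for parent, local in node.parents:
            stack.append((parent, adj * local))

x = Var(3.0)
y = Var(4.0)
f = x * y + x          # f = x*y + x
backward(f)
print(x.grad, y.grad)  # -> 5.0 3.0  (df/dx = y + 1, df/dy = x)
```

Forward mode would instead need one sweep per independent variable, which is the distinction the abstract draws between ADIFOR and ADJIFOR.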

  6. Generation of cross-sections and reference solutions using the code Serpent

    International Nuclear Information System (INIS)

    Gomez T, A. M.; Delfin L, A.; Del Valle G, E.

    2012-10-01

Serpent is a code that solves the neutron transport equations using the Monte Carlo method. Besides generating steady-state reference solutions for complex-geometry problems, it has been specially designed for lattice physics applications, which include the generation of homogenized cross-sections for several energy groups. In this work, a calculation methodology is described that uses the Serpent code, through an interface programmed in Octave, to generate the cross-sections needed to carry out calculations with the TNXY code, developed in 1993 in the Nuclear Engineering Department of the Instituto Politecnico Nacional (Mexico). TNXY solves the steady-state neutron transport equations for several energy groups in X-Y geometry using the discrete ordinates (SN) method. To verify and validate the methodology, the TNXY results were compared with those calculated by Serpent, with differences below 0.55% in the value of the multiplication factor. (Author)
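The group-collapse step mentioned above (homogenized cross-sections for several energy groups) amounts to flux-weighted averaging over fine groups. A minimal sketch with made-up numbers; the `collapse` helper is illustrative and not any Serpent or TNXY interface:

```python
# Flux-weighted group collapse: sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g)
# over the fine groups g belonging to each broad group G. In a real workflow
# the fine-group data would be read from the lattice code's output files.

def collapse(sigma_fine, phi_fine, boundaries):
    """Collapse fine-group cross sections to broad groups.

    boundaries: list of (start, end) fine-group index ranges per broad group.
    """
    broad = []
    for start, end in boundaries:
        flux = sum(phi_fine[start:end])
        broad.append(sum(s * p for s, p in zip(sigma_fine[start:end],
                                               phi_fine[start:end])) / flux)
    return broad

sigma = [1.0, 2.0, 2.0, 8.0]   # fine-group cross sections (barns, made up)
phi   = [0.5, 0.5, 1.0, 3.0]   # fine-group scalar flux (arbitrary units)
print(collapse(sigma, phi, [(0, 2), (2, 4)]))   # -> [1.5, 6.5]
```

The same weighting preserves reaction rates: the broad-group constant times the broad-group flux reproduces the fine-group reaction-rate sum.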

  7. LS1 Report: the clouds are lifting

    CERN Multimedia

    Anaïs Schaeffer

    2014-01-01

To combat the problem of electron clouds, which perturb the environment of the particle beams in our accelerators, the Vacuum team have turned to amorphous carbon. This material is being applied to the interior of 16 magnets in the SPS during LS1 and will help prevent the formation of the secondary particles which are responsible for these clouds.   This photo shows the familiar coils of an SPS dipole magnet in brown. The vacuum chamber is the metallic rectangular part in the centre. The small wheeled device you can see in the vacuum chamber carries the hollow cathodes along the length of the chamber. When a particle beam circulates at high energy in a vacuum chamber, it unavoidably generates secondary particles. These include electrons produced by the ionisation of residual molecules in the vacuum or indirectly generated by synchrotron radiation. When these electrons hit the surface of the vacuum chamber, they produce other electrons which, through an avalanche-like process, re...

  8. Consistency and accuracy of diagnostic cancer codes generated by automated registration: comparison with manual registration

    Directory of Open Access Journals (Sweden)

    Codazzi Tiziana

    2006-09-01

Full Text Available Abstract Background Automated procedures are increasingly used in cancer registration, and it is important that the data produced are systematically checked for consistency and accuracy. We evaluated an automated procedure for cancer registration adopted by the Lombardy Cancer Registry in 1997, comparing automatically-generated diagnostic codes with those produced manually over one year (1997). Methods The automatically generated cancer cases were produced by Open Registry algorithms. For manual registration, trained staff consulted clinical records, pathology reports and death certificates. The social security code, present and checked in both databases in all cases, was used to match the files in the automatic and manual databases. The cancer cases generated by the two methods were compared by manual revision. Results The automated procedure generated 5027 cases: 2959 (59%) were accepted automatically and 2068 (41%) were flagged for manual checking. Among the cases accepted automatically, discrepancies in data items (surname, first name, sex and date of birth) constituted 8.5% of cases, and discrepancies in the first three digits of the ICD-9 code constituted 1.6%. Among flagged cases, cancers of female genital tract, hematopoietic system, metastatic and ill-defined sites, and oropharynx predominated. The usual reasons were use of specific vs. generic codes, presence of multiple primaries, and use of extranodal vs. nodal codes for lymphomas. The percentage of automatically accepted cases ranged from 83% for breast and thyroid cancers to 13% for metastatic and ill-defined cancer sites. Conclusion Since 59% of cases were accepted automatically and contained relatively few, mostly trivial discrepancies, the automatic procedure is efficient for routine case generation, effectively cutting the workload required for routine case checking by this amount. 
Among cases not accepted automatically, discrepancies were mainly due to variations in coding practice.
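The comparison step described above, matching records on the social security code and flagging discrepancies in identity fields and in the first three digits of the ICD-9 code, can be sketched as follows. The field names and `compare` helper are illustrative assumptions, not the registry's actual schema:

```python
# Hedged sketch of record linkage between an automatic and a manual cancer
# registry database, keyed on the social security code (SSN).

def compare(auto_db, manual_db):
    """Return, per matched SSN, the list of fields that disagree."""
    flags = {}
    for ssn, a in auto_db.items():
        m = manual_db.get(ssn)
        if m is None:
            continue                      # unmatched cases handled separately
        issues = [f for f in ("surname", "first_name", "sex", "birth_date")
                  if a[f] != m[f]]
        # Only the first three ICD-9 digits (the topography/site code) are compared.
        if a["icd9"][:3] != m["icd9"][:3]:
            issues.append("icd9_3digit")
        if issues:
            flags[ssn] = issues
    return flags

auto_db   = {"A1": {"surname": "Rossi", "first_name": "Maria", "sex": "F",
                    "birth_date": "1950-02-01", "icd9": "174.9"}}
manual_db = {"A1": {"surname": "Rossi", "first_name": "Maria", "sex": "F",
                    "birth_date": "1950-02-01", "icd9": "174.8"}}
print(compare(auto_db, manual_db))   # -> {} (first three ICD-9 digits agree)
```

A fourth-digit difference (174.9 vs. 174.8) is not flagged here, mirroring the paper's choice to check only the three-digit ICD-9 site code.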

  9. On the use of the Serpent Monte Carlo code for few-group cross section generation

    International Nuclear Information System (INIS)

    Fridman, E.; Leppaenen, J.

    2011-01-01

    Research highlights: → B1 methodology was used for generation of leakage-corrected few-group cross sections in the Serpent Monte-Carlo code. → Few-group constants generated by Serpent were compared with those calculated by Helios deterministic lattice transport code. → 3D analysis of a PWR core was performed by a nodal diffusion code DYN3D employing two-group cross section sets generated by Serpent and Helios. → An excellent agreement in the results of 3D core calculations obtained with Helios and Serpent generated cross-section libraries was observed. - Abstract: Serpent is a recently developed 3D continuous-energy Monte Carlo (MC) reactor physics burnup calculation code. Serpent is specifically designed for lattice physics applications including generation of homogenized few-group constants for full-core core simulators. Currently in Serpent, the few-group constants are obtained from the infinite-lattice calculations with zero neutron current at the outer boundary. In this study, in order to account for the non-physical infinite-lattice approximation, B1 methodology, routinely used by deterministic lattice transport codes, was considered for generation of leakage-corrected few-group cross sections in the Serpent code. A preliminary assessment of the applicability of the B1 methodology for generation of few-group constants in the Serpent code was carried out according to the following steps. Initially, the two-group constants generated by Serpent were compared with those calculated by Helios deterministic lattice transport code. Then, a 3D analysis of a Pressurized Water Reactor (PWR) core was performed by the nodal diffusion code DYN3D employing two-group cross section sets generated by Serpent and Helios. At this stage thermal-hydraulic (T-H) feedback was neglected. The DYN3D results were compared with those obtained from the 3D full core Serpent MC calculations. Finally, the full core DYN3D calculations were repeated taking into account T-H feedback and

  10. NSLINK, Coupling of NJOY Cross-Sections Generator Code to SCALE-3 System

    International Nuclear Information System (INIS)

De Leege, P.F.A.

    1991-01-01

    1 - Description of program or function: NSLINK (NJOY - SCALE - LINK) is a set of computer codes to couple the NJOY cross-section generation code to the SCALE-3 code system (using AMPX-2 master library format) retaining the Nordheim resolved resonance treatment option. 2 - Method of solution: The following module and codes are included in NSLINK: XLACSR: This module is a stripped-down version of the XLACS-2 code. The module passes all l=0 resonance parameters as well as the contribution from all other resonances to the group cross-sections, the contribution from the wings of the l=0 resonances, the background cross-section and possible interference for multilevel Breit-Wigner resonance parameters. The group cross-sections are stored in the appropriate 1-D cross-section arrays. The output file has AMPX-2 master format. The original NJOY code is used to calculate all other data. The XLACSR module is included in the NJOY code. MILER: This code converts NJOY output (GENDF format) to AMPX-2 master format. The code is an extensively revised version of the original MILER code. In addition, the treatment of thermal scattering matrices at different temperatures is included. UNITABR: This code is a revised version of the UNITAB code. It merges the output of XLACSR and MILER in such a way that contributions from the bodies of the l=0 resonances in the resolved energy range, calculated by XLACSR, are subtracted from the 1-D group cross-section arrays for fission (MT=18) and neutron capture (MT=102). The l=0 resonance parameters and the contributions from the bodies of these resonances are added separately (MT=1023, 1022 and 1021). The total cross-section (MT=1), the absorption cross- section (MT=27) and the neutron removal cross-section (MT=101) values are adjusted. In the case of Bondarenko data, infinite dilution values of the cross-sections (MT=1, 18 and 102) are changed in the same way as the 1-D cross-section. The output file of UNITABR is in AMPX-2 master format and

  11. ANNarchy: a code generation approach to neural simulations on parallel hardware

    Science.gov (United States)

    Vitay, Julien; Dinkelbach, Helge Ü.; Hamker, Fred H.

    2015-01-01

Many modern neural simulators focus on the simulation of networks of spiking neurons on parallel hardware. Another important framework in computational neuroscience, rate-coded neural networks, is mostly difficult or impossible to implement using these simulators. We present here the ANNarchy (Artificial Neural Networks architect) neural simulator, which makes it easy to define and simulate rate-coded and spiking networks, as well as combinations of both. The interface in Python has been designed to be close to the PyNN interface, while the definition of neuron and synapse models can be specified using an equation-oriented mathematical description similar to the Brian neural simulator. This information is used to generate C++ code that will efficiently perform the simulation on the chosen parallel hardware (multi-core system or graphical processing unit). Several numerical methods are available to transform ordinary differential equations into efficient C++ code. We compare the parallel performance of the simulator to existing solutions. PMID:26283957
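The equation-oriented code generation described above can be illustrated with a toy text-templating sketch (this is not ANNarchy's actual template pipeline; the equation syntax and `generate_update` helper are assumptions for illustration):

```python
# Toy equation-to-C++ code generation: a rate equation given as a string is
# turned into a C++ Euler update kernel via simple text templating.

CPP_TEMPLATE = """\
void update(std::vector<double>& r, const std::vector<double>& inp, double dt) {{
    for (std::size_t i = 0; i < r.size(); ++i) {{
        double drdt = {rhs};
        r[i] += dt * drdt;
    }}
}}
"""

def generate_update(equation):
    """Translate 'dr/dt = <expr in r and inp>' into a C++ update kernel."""
    lhs, rhs = (s.strip() for s in equation.split("="))
    assert lhs == "dr/dt", "only a single rate variable r is supported here"
    # Naive token substitution; a real generator parses the expression.
    rhs = rhs.replace("inp", "inp[i]").replace("r", "r[i]")
    return CPP_TEMPLATE.format(rhs=rhs)

print(generate_update("dr/dt = -r + inp"))
```

The point of the approach is that the model stays a declarative equation while the emitted loop can be specialized for the target hardware (OpenMP, CUDA, etc.).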

  12. A genetic code alteration is a phenotype diversity generator in the human pathogen Candida albicans.

    Directory of Open Access Journals (Sweden)

    Isabel Miranda

Full Text Available BACKGROUND: The discovery of genetic code alterations and expansions in both prokaryotes and eukaryotes abolished the hypothesis of a frozen and universal genetic code and exposed unanticipated flexibility in codon and amino acid assignments. It is now clear that codon identity alterations involve sense and non-sense codons and can occur in organisms with complex genomes and proteomes. However, the biological functions, the molecular mechanisms of evolution and the diversity of genetic code alterations remain largely unknown. In various species of the genus Candida, the leucine CUG codon is decoded as serine by a unique serine tRNA that contains a leucine 5'-CAG-3' anticodon (tRNA(Ser)CAG). We are using this codon identity redefinition as a model system to elucidate the evolution of genetic code alterations. METHODOLOGY/PRINCIPAL FINDINGS: We have reconstructed the early stages of the Candida genetic code alteration by engineering tRNAs that partially reverted the identity of serine CUG codons back to their standard leucine meaning. Such genetic code manipulation had profound cellular consequences as it exposed important morphological variation, altered gene expression, re-arranged the karyotype, increased cell-cell adhesion and secretion of hydrolytic enzymes. CONCLUSION/SIGNIFICANCE: Our study provides the first experimental evidence for an important role of genetic code alterations as generators of phenotypic diversity of high selective potential and supports the hypothesis that they speed up evolution of new phenotypes.

  13. ANT: Software for Generating and Evaluating Degenerate Codons for Natural and Expanded Genetic Codes.

    Science.gov (United States)

    Engqvist, Martin K M; Nielsen, Jens

    2015-08-21

    The Ambiguous Nucleotide Tool (ANT) is a desktop application that generates and evaluates degenerate codons. Degenerate codons are used to represent DNA positions that have multiple possible nucleotide alternatives. This is useful for protein engineering and directed evolution, where primers specified with degenerate codons are used as a basis for generating libraries of protein sequences. ANT is intuitive and can be used in a graphical user interface or by interacting with the code through a defined application programming interface. ANT comes with full support for nonstandard, user-defined, or expanded genetic codes (translation tables), which is important because synthetic biology is being applied to an ever widening range of natural and engineered organisms. The Python source code for ANT is freely distributed so that it may be used without restriction, modified, and incorporated in other software or custom data pipelines.
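The core computation behind a degenerate-codon tool can be sketched in a few lines (illustrative only; ANT's actual API differs): expand an IUPAC degenerate codon into its concrete codons and translate them under a chosen translation table, which is also where support for nonstandard or expanded genetic codes enters.

```python
# Expand IUPAC degenerate codons and translate them under a genetic code.
from itertools import product

IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "S": "CG", "W": "AT", "K": "GT", "M": "AC",
         "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT"}

def expand(degenerate_codon):
    """All concrete codons matched by a degenerate codon, e.g. 'TGS'."""
    return ["".join(p) for p in product(*(IUPAC[b] for b in degenerate_codon))]

# A tiny slice of the standard code, enough for the example. A user-defined
# or expanded genetic code would simply be a different dictionary here.
CODE = {"TGG": "W", "TGC": "C", "TGT": "C", "TGA": "*"}

print(expand("TGS"))                        # -> ['TGC', 'TGG']
print({c: CODE[c] for c in expand("TGS")})  # -> {'TGC': 'C', 'TGG': 'W'}
```

Evaluating a candidate degenerate codon for library design then reduces to scoring the multiset of amino acids its expansion encodes.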

  14. An Empirical Model for Vane-Type Vortex Generators in a Navier-Stokes Code

    Science.gov (United States)

    Dudek, Julianne C.

    2005-01-01

    An empirical model which simulates the effects of vane-type vortex generators in ducts was incorporated into the Wind-US Navier-Stokes computational fluid dynamics code. The model enables the effects of the vortex generators to be simulated without defining the details of the geometry within the grid, and makes it practical for researchers to evaluate multiple combinations of vortex generator arrangements. The model determines the strength of each vortex based on the generator geometry and the local flow conditions. Validation results are presented for flow in a straight pipe with a counter-rotating vortex generator arrangement, and the results are compared with experimental data and computational simulations using a gridded vane generator. Results are also presented for vortex generator arrays in two S-duct diffusers, along with accompanying experimental data. The effects of grid resolution and turbulence model are also examined.

  15. Simplification of coding of NRU loop experiment software with dimensional generator

    International Nuclear Information System (INIS)

    Davis, R. S.

    2006-01-01

The following are specific topics of this paper: 1. There is much creativity in the manner in which Dimensional Generator can be applied to a specific programming task [2]. This paper tells how Dimensional Generator was applied to a reactor-physics task. 2. In this first practical use, Dimensional Generator itself proved not to need change, but a better user interface was found necessary, essentially because the relevance of Dimensional Generator to reactor physics was initially underestimated. It is briefly described. 3. The use of Dimensional Generator helps make reactor-physics source code somewhat simpler. That is explained here with brief examples from BURFEL-PC and WIMSBURF. 4. Most importantly, with the help of Dimensional Generator, all erroneous physical expressions were automatically detected. The errors are detailed here (in spite of the author's embarrassment) because they show clearly, both in theory and in practice, how Dimensional Generator offers quality enhancement of reactor-physics programming. (authors)
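The kind of check such a tool automates, rejecting physically inconsistent expressions by their dimensions, can be sketched as follows (the real Dimensional Generator works at the source-code level; this hand-rolled `Quantity` class is only an illustration of the principle):

```python
# Track base-unit exponents through arithmetic and reject inconsistent sums.

class Quantity:
    def __init__(self, value, dims):
        self.value, self.dims = value, dims   # dims: e.g. {"m": 1, "s": -1}

    def __add__(self, other):
        # Addition is only defined between quantities of identical dimension.
        if self.dims != other.dims:
            raise TypeError(f"dimension mismatch: {self.dims} vs {other.dims}")
        return Quantity(self.value + other.value, self.dims)

    def __mul__(self, other):
        # Multiplication adds exponents; zero exponents are dropped.
        dims = dict(self.dims)
        for d, e in other.dims.items():
            dims[d] = dims.get(d, 0) + e
        return Quantity(self.value * other.value,
                        {d: e for d, e in dims.items() if e})

speed = Quantity(3.0, {"m": 1, "s": -1})
time  = Quantity(2.0, {"s": 1})
dist  = speed * time                  # dims collapse to {"m": 1}
try:
    bad = dist + time                 # metres + seconds: caught automatically
except TypeError as e:
    print(e)
```

An erroneous physical expression thus fails at the point it is formed, which is exactly the class of error the paper reports being detected automatically.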

  16. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  17. Code Generation by Model Transformation : A Case Study in Transformation Modularity

    NARCIS (Netherlands)

    Hemel, Z.; Kats, L.C.L.; Visser, E.

    2008-01-01

    Preprint of paper published in: Theory and Practice of Model Transformations (ICMT 2008), Lecture Notes in Computer Science 5063; doi:10.1007/978-3-540-69927-9_13 The realization of model-driven software development requires effective techniques for implementing code generators for domain-specific

  18. A UML profile for code generation of component based distributed systems

    International Nuclear Information System (INIS)

    Chiozzi, G.; Karban, R.; Andolfato, L.; Tejeda, A.

    2012-01-01

A consistent and unambiguous implementation of code generation (model to text transformation) from UML must rely on a well-defined UML (Unified Modelling Language) profile, customizing UML for a particular application domain. Such a profile must have a solid foundation in a formally correct ontology, formalizing the concepts and their relations in the specific domain, in order to avoid a maze of wildly created stereotypes. The paper describes a generic profile for the code generation of component based distributed systems for control applications, the process to distill the ontology and define the profile, and the strategy followed to implement the code generator. The main steps that take place iteratively include: defining the terms and relations with an ontology, mapping the ontology to the appropriate UML meta-classes, testing the profile by creating modelling examples, and generating the code. This has allowed us to work on the modelling of the E-ELT (European Extremely Large Telescope) control system and instrumentation without knowing what infrastructure will finally be used.

  19. Towards provably correct code generation for a hard real-time programming language

    DEFF Research Database (Denmark)

    Fränzle, Martin; Müller-Olm, Markus

    1994-01-01

    This paper sketches a hard real-time programming language featuring operators for expressing timeliness requirements in an abstract, implementation-independent way and presents parts of the design and verification of a provably correct code generator for that language. The notion of implementation...

  20. Process of cross section generation for radiation shielding calculations, using the NJOY code

    International Nuclear Information System (INIS)

    Ono, S.; Corcuera, R.P.

    1986-10-01

The process of multigroup cross section generation for radiation shielding calculations, using the NJOY code, is explained. Photon production cross sections, processed by the GROUPR module, and photon interaction cross sections, processed by the GAMINR module, are given. These data are compared with the data produced by the AMPX system and with published data. (author) [pt

  1. Generating Code with Polymorphic let: A Ballad of Value Restriction, Copying and Sharing

    Directory of Open Access Journals (Sweden)

    Oleg Kiselyov

    2017-02-01

    Full Text Available Getting polymorphism and effects such as mutation to live together in the same language is a tale worth telling, under the recurring refrain of copying vs. sharing. We add new stanzas to the tale, about the ordeal to generate code with polymorphism and effects, and be sure it type-checks. Generating well-typed-by-construction polymorphic let-expressions is impossible in the Hindley-Milner type system: even the author believed that. The polymorphic-let generator turns out to exist. We present its derivation and the application for the lightweight implementation of quotation via a novel and unexpectedly simple source-to-source transformation to code-generating combinators. However, generating let-expressions with polymorphic functions demands more than even the relaxed value restriction can deliver. We need a new deal for let-polymorphism in ML. We conjecture the weaker restriction and implement it in a practically-useful code-generation library. Its formal justification is formulated as the research program.

  2. Towards Qualifiable Code Generation from a Clocked Synchronous Subset of Modelica

    Directory of Open Access Journals (Sweden)

    Bernhard Thiele

    2015-01-01

Full Text Available So far no qualifiable automatic code generators (ACGs) are available for Modelica. Hence, digital control applications can be modeled and simulated in Modelica, but require tedious additional efforts (e.g., manual reprogramming) to produce qualifiable target system production code. In order to more fully leverage the potential of a model-based development (MBD) process in Modelica, a qualifiable automatic code generator is needed. Typical Modelica code generation is a fairly complex process, which imposes a huge development burden on any tool qualification effort. This work aims at mapping a Modelica subset for digital control function development to a well-understood synchronous data-flow kernel language. This kernel language makes it possible to fall back on established compilation techniques for data-flow languages, which are well enough understood to be accepted by certification authorities. The mapping is established by providing a translational semantics from the Modelica subset to the synchronous data-flow kernel language. However, this translation turned out to be more intricate than initially expected and has given rise to several interesting issues that require suitable design decisions regarding the mapping and the language subset.
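The execution model of such a synchronous data-flow kernel can be sketched as step functions over clocked streams, with a Lustre-style unit delay (`pre`) as the only stateful primitive (a hedged illustration; `unit_delay` and `running_sum` are invented names, not the paper's kernel language):

```python
# Synchronous data-flow sketch: each node is a step function called once per
# clock tick; state lives only in unit delays, read before being committed.

def unit_delay(init):
    """Lustre-style `pre`: this tick's output = last tick's committed input."""
    state = [init]
    def read():          # phase 1: read the delayed value
        return state[0]
    def commit(x):       # phase 2: store this tick's input for the next tick
        state[0] = x
    return read, commit

def running_sum():
    """The clocked equation y = u + pre(y), with pre(y) initialised to 0.0."""
    read_pre, commit_pre = unit_delay(0.0)
    def step(u):
        y = u + read_pre()
        commit_pre(y)
        return y
    return step

node = running_sum()
print([node(u) for u in [1.0, 2.0, 3.0]])   # -> [1.0, 3.0, 6.0]
```

The two-phase read/commit of the delay mirrors how synchronous compilers schedule equations causally within a tick, which is a large part of what makes such compilation schemes amenable to qualification.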

  3. Generation of the WIMS code library from the ENDF/B-VI basic library

    International Nuclear Information System (INIS)

    Aboustta, Mohamed Ali Bashir.

    1994-01-01

The WIMS code is presently used in many research centres and educational institutions around the world. It has proven to be versatile and reliable, as it is used to calculate diverse reactor systems. Its data library is rich in useful information that can even be condensed to serve other codes, but the copy distributed with the code is not up to date: some of its data have never been changed, other data have been changed many times to accommodate certain experimental setups, and some data are simply not included. This work is an attempt to master the techniques used in generating a multigroup library, as applied to the WIMS data library. The new library is called UFMGLIB. A new set of consistent data was generated from the basic ENDF/B-VI library, including complete data for the fission product nuclides and more elaborate burnup chains. The performance of the library is comparable to that of the standard library accompanying the code and to a later library, WIMKAL 88, generated by a group of the Korean Research Institute of Atomic Energy. (author). 38 refs., 40 figs., 30 tabs

  4. Development of Multi-Scale Finite Element Analysis Codes for High Formability Sheet Metal Generation

    International Nuclear Information System (INIS)

    Nnakamachi, Eiji; Kuramae, Hiroyuki; Ngoc Tam, Nguyen; Nakamura, Yasunori; Sakamoto, Hidetoshi; Morimoto, Hideo

    2007-01-01

In this study, dynamic- and static-explicit multi-scale finite element (F.E.) codes are developed by employing the homogenization method, the crystal plasticity constitutive equation and a polycrystal model based on SEM-EBSD measurement. These can predict the crystal morphological change and the hardening evolution at the micro level, and the macroscopic plastic anisotropy evolution. These codes are applied to analyze the asymmetrical rolling process, which is introduced to control the crystal texture of the sheet metal in order to generate a high-formability sheet metal. The codes can predict the yield surface and the sheet formability by analyzing strain-path-dependent yield and simple sheet forming processes, such as the limit dome height test and cylindrical deep drawing problems. It is shown that a shear-dominant rolling process, such as asymmetric rolling, generates "high formability" textures and eventually a high-formability sheet. The texture evolution and the high formability of the newly generated sheet metal were confirmed experimentally by SEM-EBSD measurement and the LDH test. It is concluded that these explicit-type crystallographic homogenized multi-scale F.E. codes could be a comprehensive tool to predict the plastic-induced texture evolution, anisotropy and formability through rolling process and limit dome height test analyses.

  5. MC2-3: Multigroup Cross Section Generation Code for Fast Reactor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C. H. [Argonne National Lab. (ANL), Argonne, IL (United States); Yang, W. S. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-11-08

The MC2-3 code is a Multigroup Cross section generation Code for fast reactor analysis, developed by improving the resonance self-shielding and spectrum calculation methods of MC2-2 and integrating the one-dimensional cell calculation capabilities of SDX. The code solves the consistent P1 multigroup transport equation using basic neutron data from ENDF/B data files to determine the fundamental mode spectra for use in generating multigroup neutron cross sections. A homogeneous medium or a heterogeneous slab or cylindrical unit cell problem is solved at ultrafine (~2000) or hyperfine (~400,000) group levels. In the resolved resonance range, pointwise cross sections are reconstructed with Doppler broadening at specified isotopic temperatures. The pointwise cross sections are directly used in the hyperfine group calculation, whereas for the ultrafine group calculation, self-shielded cross sections are prepared by numerical integration of the pointwise cross sections based upon the narrow resonance approximation. For both the hyperfine and ultrafine group calculations, unresolved resonances are self-shielded using the analytic resonance integral method. The ultrafine group calculation can also be performed for two-dimensional whole-core problems to generate region-dependent broad-group cross sections. Multigroup cross sections are written in the ISOTXS format for a user-specified group structure. The code is executable on UNIX, Linux, and PC Windows systems, and its library includes all isotopes of the ENDF/B-VII.0 data.
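The narrow resonance (NR) self-shielding mentioned above can be illustrated numerically (a toy sketch with a made-up Lorentzian resonance, not MC2-3's algorithms): the group constant is weighted by the NR flux, roughly 1 / (E (sigma_t(E) + sigma_0)), so the flux dip at a resonance peak reduces its contribution relative to a flat-flux average.

```python
# Toy narrow-resonance self-shielding: flux-weighted average of sigma(E)
# with the NR weighting spectrum phi(E) ~ 1 / (E * (sigma_t(E) + sigma_0)),
# evaluated by midpoint quadrature on a uniform energy grid.

def nr_shielded(sigma, sigma0, e_lo, e_hi, n=20000):
    """Self-shielded group cross section under the NR approximation."""
    num = den = 0.0
    for k in range(n):
        e = e_lo + (k + 0.5) * (e_hi - e_lo) / n
        w = 1.0 / (e * (sigma(e) + sigma0))
        num += sigma(e) * w
        den += w
    return num / den

# Toy Lorentzian resonance at 6.7 eV on a 10-barn background (made-up numbers).
sigma = lambda e: 10.0 + 2000.0 / (1.0 + ((e - 6.7) / 0.05) ** 2)

dilute   = nr_shielded(sigma, sigma0=1e6, e_lo=5.0, e_hi=9.0)   # ~flat-flux average
shielded = nr_shielded(sigma, sigma0=10.0, e_lo=5.0, e_hi=9.0)  # much smaller
print(dilute, shielded)
```

The large-sigma_0 limit recovers the infinite-dilution value, which is the basis of Bondarenko-style tabulations of the shielding factor versus background cross section.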

  6. Code Assessment of SPACE 2.19 using LSTF Steam Generator Tube Rupture Test

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Minhee; Kim, Seyun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    SPACE is a best-estimate, two-phase, three-field thermal-hydraulic analysis code used to analyze the safety and performance of pressurized water reactors. As a result of the development effort, version 2.19 of the code was released after successive verification and validation work. The present work extends the work by Kim et al. In this study, results produced by the SPACE 2.19 code were compared with experimental data from JAERI's LSTF Run SB-SG-06 experiment, which simulates a Steam Generator Tube Rupture (SGTR) transient. In order to assess the predictive capability of SPACE 2.19, the LSTF steam generator tube rupture test was simulated. To evaluate the computed results, the LSTF SB-SG-06 test data and RELAP5/MOD3.1 calculations are used. The calculation results indicate that the SPACE 2.19 code predicted well the sequence of events and the major phenomena during the transient, such as the asymmetric loop behavior, reactor coolant system cooldown and heat transfer by natural circulation, and the primary and secondary system depressurization by the pressurizer auxiliary spray and the steam dump using the intact-loop steam generator relief valve.

  7. NULIF: neutron spectrum generator, few-group constant calculator, and fuel depletion code

    International Nuclear Information System (INIS)

    Wittkopf, W.A.; Tilford, J.M.; Andrews, J.B. II; Kirschner, G.; Hassan, N.M.; Colpo, P.N.

    1977-02-01

    The NULIF code generates a microgroup neutron spectrum and calculates spectrum-weighted few-group parameters for use in a spatial diffusion code. A wide variety of fuel cells, non-fuel cells, and fuel lattices, typical of PWR (or BWR) lattices, are treated. A fuel depletion routine and a change-card capability allow a broad range of problems to be studied. Coefficient variation with fuel burnup, fuel temperature change, moderator temperature change, soluble boron concentration change, burnable poison variation, and control rod insertion is readily obtained. Heterogeneous effects, including resonance shielding and thermal flux depressions, are treated. Coefficients are obtained for one thermal group and up to three epithermal groups. A special output routine writes the few-group coefficient data in a specified format on an output tape for automated fitting in the PDQ07-HARMONY system of spatial diffusion-depletion codes.

  8. The CAIN computer code for the generation of MABEL input data sets: a user's manual

    International Nuclear Information System (INIS)

    Tilley, D.R.

    1983-03-01

    CAIN is an interactive FORTRAN computer code designed to overcome the substantial effort involved in manually creating the thermal-hydraulics input data required by MABEL-2. CAIN achieves this by processing output from either of the whole-core codes, RELAP or TRAC, interpolating where necessary, and by scanning RELAP/TRAC output in order to generate additional information. This user's manual describes the actions required in order to create RELAP/TRAC data sets from magnetic tape, to create the other input data sets required by CAIN, and to operate the interactive command procedure for the execution of CAIN. In addition, the CAIN code is described in detail. This programme of work is part of the Nuclear Installations Inspectorate (NII)'s contribution to the United Kingdom Atomic Energy Authority's independent safety assessment of pressurized water reactors. (author)

  9. Mr.CAS-A minimalistic (pure) Ruby CAS for fast prototyping and code generation

    Science.gov (United States)

    Ragni, Matteo

    There are Computer Algebra System (CAS) systems on the market with complete solutions for the manipulation of analytical models. But exporting a model that implements specific algorithms on specific platforms, for a particular target language or numerical library, is often a rigid procedure that requires manual post-processing. This work presents a Ruby library that exposes core CAS capabilities, i.e. simplification, substitution, evaluation, etc. The library aims at programmers who need to rapidly prototype and generate numerical code for different target languages, while keeping the mathematical expressions separate from the code generation rules, where best practices for numerical conditioning are implemented. The library is written in pure Ruby and is compatible with most Ruby interpreters.
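
    The core CAS operations named above (substitution and evaluation over an expression tree) follow a simple recursive pattern. The Python sketch below mirrors that pattern for illustration only; it is not Mr.CAS's actual Ruby API, and all class and method names are invented.

    ```python
    # Minimal expression-tree sketch of the substitute/evaluate pattern a
    # small CAS exposes. Each node knows how to substitute symbols and how
    # to evaluate itself once all symbols are bound.

    class Sym:
        def __init__(self, name): self.name = name
        def subs(self, env): return env.get(self.name, self)
        def ev(self): raise ValueError(f"unbound symbol {self.name}")

    class Const:
        def __init__(self, v): self.v = v
        def subs(self, env): return self
        def ev(self): return self.v

    class Add:
        def __init__(self, a, b): self.a, self.b = a, b
        def subs(self, env): return Add(self.a.subs(env), self.b.subs(env))
        def ev(self): return self.a.ev() + self.b.ev()

    class Mul:
        def __init__(self, a, b): self.a, self.b = a, b
        def subs(self, env): return Mul(self.a.subs(env), self.b.subs(env))
        def ev(self): return self.a.ev() * self.b.ev()

    expr = Add(Mul(Sym("x"), Sym("x")), Const(1))   # x*x + 1
    bound = expr.subs({"x": Const(3)})              # substitute x = 3
    print(bound.ev())  # → 10
    ```

    Code generation then becomes a third recursive walk that emits target-language text instead of values, which is what keeps the expressions separate from the generation rules.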

  10. Mr.CAS—A minimalistic (pure) Ruby CAS for fast prototyping and code generation

    Directory of Open Access Journals (Sweden)

    Matteo Ragni

    2017-01-01

    Full Text Available There are Computer Algebra System (CAS) systems on the market with complete solutions for the manipulation of analytical models. But exporting a model that implements specific algorithms on specific platforms, for a particular target language or numerical library, is often a rigid procedure that requires manual post-processing. This work presents a Ruby library that exposes core CAS capabilities, i.e. simplification, substitution, evaluation, etc. The library aims at programmers who need to rapidly prototype and generate numerical code for different target languages, while keeping the mathematical expressions separate from the code generation rules, where best practices for numerical conditioning are implemented. The library is written in pure Ruby and is compatible with most Ruby interpreters.

  11. Steam generator transient studies using a simplified two-fluid computer code

    International Nuclear Information System (INIS)

    Munshi, P.; Bhatnagar, R.; Ram, K.S.

    1985-01-01

    A simplified two-fluid computer code has been used to simulate reactor-side (or primary-side) transients in a PWR steam generator. The disturbances are modelled as ramp inputs for pressure, internal energy and mass flow-rate for the primary fluid. The CPU time for a transient duration of 4 s is approx. 10 min on a DEC-1090 computer system. The results are thermodynamically consistent and encouraging for further studies. (author)

  12. Automated importance generation and biasing techniques for Monte Carlo shielding techniques by the TRIPOLI-3 code

    International Nuclear Information System (INIS)

    Both, J.P.; Nimal, J.C.; Vergnaud, T.

    1990-01-01

    We discuss an automated biasing procedure for generating the parameters necessary to achieve efficient biased Monte Carlo shielding calculations. The biasing techniques considered here are the exponential transform and collision biasing, deriving from the concept of a biased game based on the importance function. We use a simple model of the importance function with exponential attenuation as the distance to the detector increases. This importance function is generated on a three-dimensional mesh covering the geometry, using graph theory algorithms. This scheme is currently being implemented in the third version of the neutron and gamma ray transport code TRIPOLI-3. (author)
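
    The simple importance model described above can be sketched as follows. The regular Cartesian mesh, the attenuation constant, and the detector position are assumptions for illustration; TRIPOLI-3 itself builds its importance map on the problem geometry using graph-theory algorithms.

    ```python
    import numpy as np

    # Importance falls off exponentially with distance to the detector:
    # I(r) = exp(-k * |r - r_detector|), tabulated on a regular 3-D mesh.

    def importance_map(shape, spacing, detector, k):
        """Evaluate the exponential importance model on a regular grid."""
        zs, ys, xs = np.indices(shape)
        pts = np.stack([xs, ys, zs], axis=-1) * spacing   # cell coordinates
        d = np.linalg.norm(pts - np.asarray(detector), axis=-1)
        return np.exp(-k * d)

    imp = importance_map((10, 10, 10), 1.0, detector=(9.0, 9.0, 9.0), k=0.5)
    # Cells nearer the detector carry higher importance.
    print(imp[9, 9, 9], imp[0, 0, 0])
    ```

    In the biased game, such a map drives the exponential transform (stretching paths toward high-importance regions) and the survival weights at collisions.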

  13. Fortran code for generating random probability vectors, unitaries, and quantum states

    Directory of Open Access Journals (Sweden)

    Jonas eMaziero

    2016-03-01

    Full Text Available The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area ever since. Several ongoing projects aimed at its improvement indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
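
    The random objects listed above can be generated with a few standard constructions, sketched here in Python/NumPy rather than Fortran: a flat Dirichlet sample (normalized exponentials) for probability vectors, and QR decomposition of a Ginibre matrix with the usual phase correction for Haar-random unitaries. This is an illustration of the constructions, not a translation of the article's actual routines.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def random_probability_vector(d):
        """Uniform sample from the probability simplex: normalized
        exponential random variables (flat Dirichlet distribution)."""
        x = rng.exponential(scale=1.0, size=d)
        return x / x.sum()

    def random_unitary(d):
        """Haar-distributed unitary via QR of a complex Ginibre matrix,
        with the standard phase correction on the diagonal of R."""
        z = (rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))) / np.sqrt(2)
        q, r = np.linalg.qr(z)
        return q * (np.diag(r) / np.abs(np.diag(r)))

    def random_state_vector(d):
        """Haar-random pure state: a column of a Haar-random unitary."""
        return random_unitary(d)[:, 0]

    print(random_probability_vector(4).sum())   # normalized to 1
    ```

    A random density matrix can then be built as a probability-weighted mixture of random pure states.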

  14. Experimental benchmark and code validation for airfoils equipped with passive vortex generators

    International Nuclear Information System (INIS)

    Baldacchino, D; Ferreira, C; Florentie, L; Timmer, N; Van Zuijlen, A; Manolesos, M; Chaviaropoulos, T; Diakakis, K; Papadakis, G; Voutsinas, S; González Salcedo, Á; Aparicio, M; García, N R.; Sørensen, N N.; Troldborg, N

    2016-01-01

    Experimental results and complementary computations for airfoils with vortex generators are compared in this paper, as part of an effort within the AVATAR project to develop tools for wind turbine blade control devices. Measurements from two airfoils equipped with passive vortex generators, a 30% thick DU97W300 and an 18% thick NTUA T18, have been used for benchmarking several simulation tools. These tools span low-to-high complexity, ranging from engineering-level integral boundary layer tools to fully-resolved computational fluid dynamics codes. Results indicate that with appropriate calibration, engineering-type tools can capture the effects of vortex generators and outperform more complex tools. Fully resolved CFD comes at a much higher computational cost and does not necessarily capture the increased lift due to the VGs. However, given the limited experimental data available for calibration, high fidelity tools are still required for assessing the effect of vortex generators on airfoil performance. (paper)

  15. LS1: electrical engineering upgrades and consolidation

    International Nuclear Information System (INIS)

    Duval, F.

    2012-01-01

    Three different types of activities are planned by the Engineering Department Electrical Engineering (EN-EL) Group for the first long shutdown (LS1). First, the consolidation of EN-EL's ageing infrastructure elements. This is part of a 15-year programme aiming at increasing the reliability and availability of the power distribution network. Secondly, the maintenance of the accelerators' infrastructure. In addition to the usual periodic operations and those delayed until LS1, the group will address more demanding activities such as the replacement campaigns for irradiated cables and non-radiation-resistant fibres, as well as the removal of unused cables in particularly overcrowded areas. Thirdly, a vast number of user copper and optical fibre cabling requests: EN-EL estimates that only 50% of LS1 requests are currently known. The main activities will be EN-EL's contributions to the R2E project, BE-BI upgrade projects, and the RF upgrade project in SPS BA3.

  16. LS1 Report: Shielding operations

    CERN Multimedia

    CERN Bulletin

    2013-01-01

    At the LHC, the SMACC project’s consolidation train has just entered Sector 7-8, the third sector to be consolidated. It has moved on from Sector 6-7, which is now in the closure phase.   This week saw the start of the replacement campaign for the compensators on the LHC’s cryogenic distribution lines (QRL), involving all sectors of the machine. Nine compensators in total will be replaced between now and the end of the year (see the article in this week's Bulletin). Operations are advancing very quickly on the R2E (radiation to electronics) project. At Point 1, for example, the teams have successfully managed to get several weeks ahead of the activity schedule. Tests on the back-up electrical supply have also been completed. The diesel generators, designed to take over in the event of a failure of the main electrical supply, were put through their paces during a simulated power cut and passed with flying colours.   At the PS, 29 pillars have been ...

  17. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    Science.gov (United States)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is
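
    The translation step described above can be sketched as mapping each row of a tabular spec (a DSL keyword plus its arguments) to an output statement through a template. The keywords, table layout, and structured-text-like output below are invented for illustration; the abstract does not detail the actual LCS DSL or its ladder-logic target.

    ```python
    # Sketch of spec-to-code translation: a code generator walks the rows
    # of a tabular specification and instantiates a template per keyword.
    # All names below are hypothetical examples, not the real LCS DSL.

    TEMPLATES = {
        "set":    "{target} := {value};",
        "verify": "IF {target} <> {value} THEN fault := TRUE; END_IF;",
    }

    def generate(rows):
        """Translate (keyword, arguments) rows into output statements."""
        return "\n".join(TEMPLATES[kw].format(**args) for kw, args in rows)

    spec = [
        ("set",    {"target": "valve_LO2_fill", "value": "OPEN"}),
        ("verify", {"target": "pressure_ok",    "value": "TRUE"}),
    ]
    print(generate(spec))
    ```

    The point of the tabular form is that domain users fill in the rows without programming, while the generator owns the mapping to executable PLC logic.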

  18. Development and validation of gui based input file generation code for relap

    International Nuclear Information System (INIS)

    Anwar, M.M.; Khan, A.A.; Chughati, I.R.; Chaudri, K.S.; Inyat, M.H.; Hayat, T.

    2009-01-01

    Reactor Excursion and Leak Analysis Program (RELAP) is a widely accepted computer code for thermal-hydraulic modeling of nuclear power plants. It calculates thermal-hydraulic transients in water-cooled nuclear reactors by solving approximations to the one-dimensional, two-phase equations of hydraulics in an arbitrarily connected system of nodes. However, the preparation of the input file and the subsequent analysis of results in this code is a tedious task. A Graphical User Interface (GUI) for the preparation of the RELAP-5 input file has been developed, together with validation of the GUI-generated input file. The GUI is developed in Microsoft Visual Studio using Visual C Sharp (C#) as the programming language. The nodalization diagram is drawn graphically, and the program contains various component forms along with a starting data form, which are launched for property assignment to generate the input file cards, serving as a GUI for the user. The GUI is provided with an Open/Save function to store and recall the nodalization diagram along with the components' properties. The GUI-generated input file was validated for several case studies, and the individual component cards were compared with the originally required format. The generated input file was found consistent with the requirements of RELAP. The GUI provides a useful platform for simulating complex hydrodynamic problems efficiently with RELAP. (author)
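
    The card-generation step such a GUI performs can be sketched as serializing a component's property form into numbered input cards. The card numbers and field layout below are invented for illustration and do not follow the actual RELAP5 input specification.

    ```python
    # Sketch of form-to-cards serialization: each component's property
    # dict becomes a header card plus one card per property. Numbering
    # and layout are hypothetical, not RELAP5's real input format.

    def make_cards(base, component):
        """Serialize one component into numbered text cards."""
        cards = [f"{base:>8}  {component['name']}  {component['type']}"]
        for i, (key, value) in enumerate(sorted(component["data"].items()), 1):
            cards.append(f"{base + i:>8}  {key:<12} {value}")
        return "\n".join(cards)

    pipe = {"name": "pipe-100", "type": "snglvol",
            "data": {"area": 0.1, "length": 2.5}}
    print(make_cards(1000100, pipe))
    ```

    The GUI's Open/Save feature then only needs to persist the property dicts and diagram layout; the cards are regenerated on demand.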

  19. LHC-GCS a model-driven approach for automatic PLC and SCADA code generation

    CERN Document Server

    Thomas, Geraldine; Barillère, Renaud; Cabaret, Sebastien; Kulman, Nikolay; Pons, Xavier; Rochez, Jacques

    2005-01-01

    The LHC experiments’ Gas Control System (LHC GCS) project [1] aims to provide the four LHC experiments (ALICE, ATLAS, CMS and LHCb) with control for their 23 gas systems. To ease the production and maintenance of 23 control systems, a model-driven approach has been adopted to generate automatically the code for the Programmable Logic Controllers (PLCs) and for the Supervision Control And Data Acquisition (SCADA) systems. The first milestones of the project have been achieved. The LHC GCS framework [4] and the generation tools have been produced. A first control application has actually been generated and is in production, and a second is in preparation. This paper describes the principle and the architecture of the model-driven solution. It will in particular detail how the model-driven solution fits with the LHC GCS framework and with the UNICOS [5] data-driven tools.

  20. DOG -II input generator program for DOT3.5 code

    International Nuclear Information System (INIS)

    Hayashi, Katsumi; Handa, Hiroyuki; Yamada, Koubun; Kamogawa, Susumu; Takatsu, Hideyuki; Koizumi, Kouichi; Seki, Yasushi

    1992-01-01

    DOT3.5 is widely used for radiation transport analysis of fission reactors, fusion experimental facilities and particle accelerators. We developed the input generator program for DOT3.5 code in aim to prepare input data effectively. Formar program DOG was developed and used internally in Hitachi Engineering Company. In this new version DOG-II, limitation for R-Θ geometry was removed. All the input data is created by interactive method in front of color display without using DOT3.5 manual. Also the geometry related input are easily created without calculation of precise curved mesh point. By using DOG-II, reliable input data for DOT3.5 code is obtained easily and quickly

  1. Improved Side Information Generation for Distributed Video Coding by Exploiting Spatial and Temporal Correlations

    Directory of Open Access Journals (Sweden)

    Ye Shuiming

    2009-01-01

    Full Text Available Distributed video coding (DVC) is a video coding paradigm allowing low complexity encoding for emerging applications such as wireless video surveillance. Side information (SI) generation is a key function in the DVC decoder, and plays a key role in determining the performance of the codec. This paper proposes an improved SI generation for DVC, which exploits both spatial and temporal correlations in the sequences. Partially decoded Wyner-Ziv (WZ) frames, based on initial SI obtained by motion compensated temporal interpolation, are exploited to improve the performance of the whole SI generation. More specifically, an enhanced temporal frame interpolation is proposed, including motion vector refinement and smoothing, optimal compensation mode selection, and a new matching criterion for motion estimation. The improved SI technique is also applied to a new hybrid spatial and temporal error concealment scheme to conceal errors in WZ frames. Simulation results show that the proposed scheme can achieve up to 1.0 dB improvement in rate distortion performance in WZ frames for video with high motion, when compared to state-of-the-art DVC. In addition, both the objective and perceptual qualities of the corrupted sequences are significantly improved by the proposed hybrid error concealment scheme, outperforming both spatial and temporal concealment alone.
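
    As a minimal illustration of the temporal-interpolation step, the zero-motion special case simply averages the two neighbouring key frames, and a matching criterion such as SAD drives the actual per-block motion search. Both are sketched below; this is illustrative only, not the authors' refined scheme.

    ```python
    import numpy as np

    def sad(block_a, block_b):
        """Sum of absolute differences: a basic block-matching criterion."""
        return int(np.abs(block_a.astype(int) - block_b.astype(int)).sum())

    def interpolate_si(prev_frame, next_frame):
        """Zero-motion temporal interpolation of the missing WZ frame:
        per-pixel average of the adjacent key frames."""
        mean = (prev_frame.astype(int) + next_frame.astype(int)) // 2
        return mean.astype(np.uint8)

    prev_f = np.full((4, 4), 100, dtype=np.uint8)
    next_f = np.full((4, 4), 120, dtype=np.uint8)
    si = interpolate_si(prev_f, next_f)
    print(si[0, 0])  # → 110
    ```

    A motion-compensated interpolator replaces the per-pixel average with an average along the best motion trajectory, chosen by minimizing the matching criterion over candidate vectors.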

  2. LIBVERSIONINGCOMPILER: An easy-to-use library for dynamic generation and invocation of multiple code versions

    Science.gov (United States)

    Cherubin, S.; Agosta, G.

    2018-01-01

    We present LIBVERSIONINGCOMPILER, a C++ library designed to support the dynamic generation of multiple versions of the same compute kernel in a HPC scenario. It can be used to provide continuous optimization, code specialization based on the input data or on workload changes, or otherwise to dynamically adjust the application, without the burden of a full dynamic compiler. The library supports multiple underlying compilers but specifically targets the LLVM framework. We also provide examples of use, showing the overhead of the library, and providing guidelines for its efficient use.

  3. Improved numerical grid generation techniques for the B2 edge plasma code

    International Nuclear Information System (INIS)

    Stotler, D.P.; Coster, D.P.

    1992-06-01

    Techniques used to generate grids for edge fluid codes such as B2 from numerically computed equilibria are discussed. Fully orthogonal, numerically derived grids closely resembling analytically prescribed meshes can be obtained. But, the details of the poloidal field can vary, yielding significantly different plasma parameters in the simulations. The magnitude of these differences is consistent with the predictions of an analytic model of the scrape-off layer. Both numerical and analytic grids are insensitive to changes in their defining parameters. Methods for implementing nonorthogonal boundaries in these meshes are also presented; they differ slightly from those required for fully orthogonal grids

  4. Design and construction of a graphical interface for automatic generation of simulation code GEANT4

    International Nuclear Information System (INIS)

    Driss, Mozher; Bouzaine Ismail

    2007-01-01

    This work was carried out as an engineering studies final project, accomplished at the center of nuclear sciences and technologies in Sidi Thabet. The project consists of designing and developing a system based on a graphical user interface that allows automatic generation of simulation code for the GEANT4 engine. This system aims to facilitate the use of GEANT4 by scientists who are not necessarily experts in this engine, and to be usable in different areas: research, industry and education. The implementation of this project uses the ROOT library and several languages such as XML and XSL. (Author). 5 refs

  5. Evaluation of angular integrals in the generation of transfer matrices for multigroup transport codes

    International Nuclear Information System (INIS)

    Garcia, R.D.M.

    1985-01-01

    The generalization of a semi-analytical technique for the evaluation of angular integrals that appear in the generation of elastic and discrete inelastic transfer matrices for transport codes is carried out. The generalized series expansions are found to be too complex, and thus of little practical value, compared with the Gaussian quadrature technique, whereas the recursion relations developed in this work are superior to the quadrature scheme in those cases where round-off error propagation is not significant. (Author) [pt
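
    The angular integrals in question reduce to Legendre moments of a scattering kernel, which the Gaussian quadrature technique evaluates as a weighted sum over the quadrature abscissas. The sketch below uses an illustrative isotropic kernel, not a real scattering law.

    ```python
    import numpy as np
    from numpy.polynomial.legendre import leggauss, legval

    def legendre_moment(f, l, n=16):
        """Approximate int_{-1}^{1} f(mu) P_l(mu) dmu by n-point
        Gauss-Legendre quadrature (exact for polynomial f P_l of
        degree <= 2n - 1)."""
        mu, w = leggauss(n)                 # abscissas and weights
        coeffs = np.zeros(l + 1)
        coeffs[l] = 1.0                     # selects P_l in the series
        return float(np.sum(w * f(mu) * legval(mu, coeffs)))

    # Isotropic kernel f(mu) = 1: moment is 2 for l = 0, 0 for l >= 1.
    print(legendre_moment(lambda mu: np.ones_like(mu), 0))  # → 2.0
    print(legendre_moment(lambda mu: np.ones_like(mu), 2))  # → 0.0
    ```

    The recursion relations of the paper target the same moments analytically, trading quadrature error for round-off error propagation through the recursion.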

  6. Development of FEMAG. Calculation code of magnetic field generated by ferritic plates in the tokamak devices

    Energy Technology Data Exchange (ETDEWEB)

    Urata, Kazuhiro [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2003-03-01

    In the design of future fusion devices in which low-activation ferritic steel is planned to be used as the plasma-facing material and/or as inserts for ripple reduction, assessing the error-field effect on the plasma, as well as optimizing the ferritic plate arrangement to reduce the toroidal field ripple, requires calculation of the magnetic field generated by the ferritic steel. However, iterative calculations to handle the non-linearity of the B-H curve of ferritic steel prevent the high-speed calculation required of a design tool. In the strong toroidal magnetic field characteristic of tokamak fusion devices, the ferritic steel is fully magnetically saturated. Hence the distribution of magnetic charges acting as the magnetic field source is determined straightforwardly, and no iterative calculations are necessary. Additionally, the ferritic steel geometry under consideration is limited to thin plates, and the ferritic plates are installed along the toroidal magnetic field. Taking these special conditions into account, the high-speed calculation code ''FEMAG'' has been developed. This report describes the formulation of the FEMAG code, how to use FEMAG, and the validity checks of FEMAG in comparison with a 3D FEM code and with measurements of the magnetic field in JFT-2M. The presented examples are numerical results of design studies for the JT-60 modification. (author)

  7. ARCADIA(R) - A New Generation of Coupled Neutronics/Core Thermal-Hydraulics Code System at AREVA NP

    International Nuclear Information System (INIS)

    Curca-Tivig, Florin; Merk, Stephan; Pautz, Andreas; Thareau, Sebastien

    2007-01-01

    Anticipating the future needs of our customers and seeking to concentrate the synergies and competences existing in the company for the benefit of our customers, AREVA NP decided in 2002 to develop the next generation of coupled neutronics/core thermal-hydraulics (TH) code systems for fuel assembly and core design calculations for both PWR and BWR applications. The global CONVERGENCE project was born: after a feasibility study of one year (2002) and a conceptual phase of another year (2003), development was started at the beginning of 2004. The present paper introduces the CONVERGENCE project, presents the main features of the new code system ARCADIA(R), and concludes on the customer benefits. ARCADIA(R) is designed to meet the requirements of AREVA NP's markets and customers worldwide. Besides state-of-the-art physical modeling, numerical performance and industrial functionality, the ARCADIA(R) system features state-of-the-art software engineering. The new code system will bring a series of benefits to our customers: e.g. improved accuracy for heterogeneous cores (MOX/UOX, Gd...), better description of nuclide chains, and access to local neutronics/thermal-hydraulics and possibly thermal-mechanical information (3D pin-by-pin full core modeling). ARCADIA is a registered trademark of AREVA NP. (authors)

  8. A dynamic, dependent type system for nuclear fuel cycle code generation

    Energy Technology Data Exchange (ETDEWEB)

    Scopatz, A. [The University of Chicago 5754 S. Ellis Ave, Chicago, IL 60637 (United States)

    2013-07-01

    The nuclear fuel cycle may be interpreted as a network or graph, thus allowing methods from formal graph theory to be used. Nodes are often idealized as nuclear fuel cycle facilities (reactors, enrichment cascades, deep geologic repositories). With the advent of modern object-oriented programming languages - and fuel cycle simulators implemented in these languages - it is natural to define a class hierarchy of facility types. Bright is a quasi-static simulator, meaning that the number of material passes through a facility is tracked rather than natural time. Bright is implemented as a C++ library that models many canonical components such as reactors, storage facilities, and more. Cyclus is a discrete time simulator, meaning that natural time is tracked throughout the simulation. Therefore a robust, dependent type system was developed to enable interoperability between Bright and Cyclus. This system is capable of representing any fuel cycle facility. Types declared in this system can then be used to automatically generate code which binds a facility implementation to a simulator front end. Facility model wrappers may be used either internally to a fuel cycle simulator or as a mechanism for interoperating multiple simulators. While such a tool has many potential use cases, it has two main purposes: enabling easy code-to-code comparisons and the verification and validation of user input.

  9. A program code generator for multiphysics biological simulation using markup languages.

    Science.gov (United States)

    Amano, Akira; Kawabata, Masanari; Yamashita, Yoshiharu; Rusty Punzalan, Florencio; Shimayoshi, Takao; Kuwabara, Hiroaki; Kunieda, Yoshitoshi

    2012-01-01

    To cope with the complexity of biological function simulation models, model representation with description languages is becoming popular. However, the simulation software itself becomes complex in these environments, making it difficult to modify the simulation conditions, target computation resources, or calculation methods. Complex biological function simulation software involves 1) model equations, 2) boundary conditions and 3) calculation schemes. A model description file is useful for the first point and partly for the second, but the third is difficult to handle, since various calculation schemes are required for simulation models constructed from two or more elementary models. We introduce a simulation software generation system that uses a description-language-based specification of the coupling calculation scheme together with the cell model description file. Using this software, we can easily generate biological simulation code with a variety of coupling calculation schemes. To show the efficiency of our system, an example of a coupling calculation scheme with three elementary models is shown.
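
    A coupling calculation scheme of the kind such a generator targets can be sketched as a staggered (operator-splitting) update of two elementary models, each advanced with the other's latest state. The two relaxation models below are illustrative stand-ins, not actual cell-model components.

    ```python
    # Staggered coupling sketch: model A steps with the old state of B,
    # then model B steps with the already-updated state of A. Swapping
    # this loop body is exactly the kind of "calculation scheme" choice
    # the code generator is meant to keep separate from the models.

    def step_model_a(a, b, dt):
        return a + dt * (b - a)        # A relaxes toward B

    def step_model_b(b, a, dt):
        return b + dt * (a - b)        # B relaxes toward A

    def staggered_coupling(a, b, dt, n_steps):
        for _ in range(n_steps):
            a = step_model_a(a, b, dt)   # uses old b
            b = step_model_b(b, a, dt)   # uses updated a
        return a, b

    a, b = staggered_coupling(0.0, 1.0, 0.1, 200)
    print(a, b)  # the two states converge to a common value
    ```

    A generator emitting a fully explicit (Jacobi-style) or an iterative implicit coupling would change only the loop body, leaving the elementary model functions untouched.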

  10. Computer code to simulate transients in a steam generator of PWR nuclear power plants

    International Nuclear Information System (INIS)

    Silva, J.M. da.

    1979-01-01

    A digital computer code, KIBE, was developed to simulate the transient behavior of a steam generator used in pressurized water reactor power plants. The equations of conservation of mass, energy and momentum were numerically integrated by an implicit method, progressing through the several axial sections into which the steam generator was divided. Forced convection heat transfer was assumed on the primary side, while on the secondary side all the different modes of heat transfer were permitted and determined from the various correlations. The stability of the stationary state was verified by its reproducibility during the integration of the conservation equations without any perturbation. Transient behavior resulting from perturbations in the flow and the internal energy (temperature) at the inlet of the primary side was simulated. The results obtained exhibited satisfactory behaviour. (author) [pt
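
    As a minimal illustration of the implicit time stepping applied per axial section, consider a single well-mixed node with the energy balance dT/dt = (T_in - T)/tau; backward Euler gives an unconditionally stable update. This toy model is an assumption for illustration, not KIBE's full mass/energy/momentum system.

    ```python
    # Backward-Euler update for dT/dt = (T_in - T) / tau:
    #   T_{n+1} = (T_n + (dt/tau) * T_in) / (1 + dt/tau)
    # Stable for any dt, which is why implicit methods suit stiff
    # thermal-hydraulic nodes.

    def implicit_step(T, T_in, tau, dt):
        return (T + dt / tau * T_in) / (1.0 + dt / tau)

    T, T_in, tau, dt = 280.0, 300.0, 2.0, 0.5
    for _ in range(100):
        T = implicit_step(T, T_in, tau, dt)
    print(round(T, 3))  # → 300.0 (node relaxes to the inlet temperature)
    ```

    Marching such an update section by section along the axial direction mirrors the "progressive" integration strategy described in the abstract.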

  11. Multi-scale Material Parameter Identification Using LS-DYNA® and LS-OPT®

    Energy Technology Data Exchange (ETDEWEB)

    Stander, Nielen [Livermore Software Technology Corporation, CA (United States); Basudhar, Anirban [Livermore Software Technology Corporation, CA (United States); Basu, Ushnish [Livermore Software Technology Corporation, CA (United States); Gandikota, Imtiaz [Livermore Software Technology Corporation, CA (United States); Savic, Vesna [General Motors, Flint, MI (United States); Sun, Xin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hu, XiaoHua [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pourboghrat, Farhang [The Ohio State Univ., Columbus, OH (United States); Park, Taejoon [The Ohio State Univ., Columbus, OH (United States); Mapar, Aboozar [Michigan State Univ., East Lansing, MI (United States); Kumar, Sharvan [Brown Univ., Providence, RI (United States); Ghassemi-Armaki, Hassan [Brown Univ., Providence, RI (United States); Abu-Farha, Fadi [Clemson Univ., SC (United States)

    2015-06-15

    Ever-tightening regulations on fuel economy and carbon emissions demand continual innovation in finding ways for reducing vehicle mass. Classical methods for computational mass reduction include sizing, shape and topology optimization. One of the few remaining options for weight reduction can be found in materials engineering and material design optimization. Apart from considering different types of materials by adding material diversity, an appealing option in automotive design is to engineer steel alloys for the purpose of reducing thickness while retaining sufficient strength and ductility required for durability and safety. Such a project was proposed and is currently being executed under the auspices of the United States Automotive Materials Partnership (USAMP) funded by the Department of Energy. Under this program, new steel alloys (Third Generation Advanced High Strength Steel or 3GAHSS) are being designed, tested and integrated with the remaining design variables of a benchmark vehicle Finite Element model. In this project the principal phases identified are (i) material identification, (ii) formability optimization and (iii) multi-disciplinary vehicle optimization. This paper serves as an introduction to the LS-OPT methodology and therefore mainly focuses on the first phase, namely an approach to integrate material identification using material models of different length scales. For this purpose, a multi-scale material identification strategy, consisting of a Crystal Plasticity (CP) material model and a Homogenized State Variable (SV) model, is discussed and demonstrated. The paper concludes with proposals for integrating the multi-scale methodology into the overall vehicle design.

  12. Generation of the library of neutron cross sections for the Record code of the Fuel Management System (FMS)

    International Nuclear Information System (INIS)

    Alonso V, G.; Hernandez L, H.

    1991-11-01

    Based on the library structure of the RECORD code, a method has been developed to generate neutron cross sections from the ENDF/B-IV database using the NJOY code. The resulting cross sections are compared with those of the current library, which was processed from the ENDF/B-III version. (Author)

  13. Generating multi-photon W-like states for perfect quantum teleportation and superdense coding

    Science.gov (United States)

    Li, Ke; Kong, Fan-Zhen; Yang, Ming; Ozaydin, Fatih; Yang, Qing; Cao, Zhuo-Liang

    2016-08-01

    An interesting aspect of multipartite entanglement is that for perfect teleportation and superdense coding, not the maximally entangled W states but a special class of non-maximally entangled W-like states are required. Therefore, efficient preparation of such W-like states is of great importance in quantum communications, which has not been studied as much as the preparation of W states. In this paper, we propose a simple optical scheme for efficient preparation of large-scale polarization-based entangled W-like states by fusing two W-like states or expanding a W-like state with an ancilla photon. Our scheme can also generate large-scale W states by fusing or expanding W or even W-like states. The cost analysis shows that in generating large-scale W states, the fusion mechanism achieves a higher efficiency with non-maximally entangled W-like states than maximally entangled W states. Our scheme can also start fusion or expansion with Bell states, and it is composed of a polarization-dependent beam splitter, two polarizing beam splitters and photon detectors. Requiring no ancilla photon or controlled gate to operate, our scheme can be realized with current photonics technology, and we believe it enables advances in quantum teleportation and superdense coding in multipartite settings.
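    The distinction the abstract draws between maximally entangled W states and non-maximally entangled W-like states can be made concrete with a few lines of NumPy. The coefficients below are illustrative only; they are not the specific class of W-like states the paper identifies for perfect teleportation.

```python
import numpy as np

def three_qubit_state(a, b, c):
    """Normalized three-qubit state a|001> + b|010> + c|100>."""
    psi = np.zeros(8, dtype=complex)
    psi[0b001], psi[0b010], psi[0b100] = a, b, c
    return psi / np.linalg.norm(psi)

# Maximally entangled W state: equal amplitudes on the three terms.
w = three_qubit_state(1, 1, 1)

# A W-like state keeps the single-excitation structure but has unequal
# amplitudes (these particular values are made up for illustration).
w_like = three_qubit_state(1, 1, np.sqrt(2))

print(np.round(np.abs(w) ** 2, 3))       # occupied basis states each carry probability 1/3
print(np.round(np.abs(w_like) ** 2, 3))  # probabilities 1/4, 1/4, 1/2
```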

  14. Salt Selection for the LS-VHTR

    International Nuclear Information System (INIS)

    Williams, D.F.; Clarno, K.T.

    2006-01-01

    Molten fluorides were initially developed for use in the nuclear industry as the high temperature fluid-fuel for a Molten Salt Reactor (MSR). The Office of Nuclear Energy is exploring the use of molten fluorides as a primary coolant (rather than helium) in an Advanced High Temperature Reactor (AHTR) design, also known as the Liquid-Salt cooled Very High Temperature Reactor (LS-VHTR). This paper provides a review of relevant properties for use in evaluation and ranking of candidate coolants for the LS-VHTR. Nuclear, physical, and chemical properties were reviewed and metrics for evaluation are recommended. Chemical properties of the salt were examined for the purpose of identifying factors that affect materials compatibility (i.e., corrosion). Some preliminary consideration of economic factors for the candidate salts is also presented. (authors)

  15. Multiple optical code-label processing using multi-wavelength frequency comb generator and multi-port optical spectrum synthesizer.

    Science.gov (United States)

    Moritsuka, Fumi; Wada, Naoya; Sakamoto, Takahide; Kawanishi, Tetsuya; Komai, Yuki; Anzai, Shimako; Izutsu, Masayuki; Kodate, Kashiko

    2007-06-11

    In optical packet switching (OPS) and optical code division multiple access (OCDMA) systems, label generation and processing are key technologies. Recently, several label processors have been proposed and demonstrated. However, in order to recognize N different labels, N separate devices are required. Here, we propose and experimentally demonstrate a large-scale, multiple optical code (OC)-label generation and processing technology based on a multi-port, fully tunable optical spectrum synthesizer (OSS) and a multi-wavelength electro-optic frequency comb generator. The OSS can generate 80 different OC-labels simultaneously and can perform 80-parallel matched filtering. We also demonstrated its application to OCDMA.

  16. The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics

    Science.gov (United States)

    Ganander, Hans

    2003-10-01

    For many reasons, the size of wind turbines on the rapidly growing wind energy market is increasing. Relations between the aeroelastic properties of these new large turbines change. Modifications of turbine designs and control concepts are also influenced by growing size. All these trends require the development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite element codes, normally allow such modifications and improvements of existing wind turbine models relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. The use of an automatic code-generating system is an alternative that addresses both key issues: the code itself and design optimization. This technique can be used for rapid generation of codes for particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific, efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as the need has arisen, and the interest in design optimization is growing.
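    The workflow described in the abstract, deriving the equations of motion from the Lagrangian and emitting them in Fortran code format, can be sketched with SymPy standing in for Mathematica®. A simple pendulum replaces the wind turbine model here; the symbol names are chosen for the example only.

```python
import sympy as sp
from sympy.physics.mechanics import dynamicsymbols

t = sp.symbols('t')
m, g, l = sp.symbols('m g l', positive=True)
theta = dynamicsymbols('theta')  # generalized coordinate theta(t)

# Lagrangian L = T - V for a simple pendulum.
T = sp.Rational(1, 2) * m * (l * theta.diff(t)) ** 2
V = -m * g * l * sp.cos(theta)
L = T - V

# Lagrange's equation: d/dt(dL/d(theta')) - dL/d(theta) = 0
eom = (L.diff(theta.diff(t)).diff(t) - L.diff(theta)).simplify()

# Solve for the angular acceleration and emit it as Fortran source,
# mirroring the Mathematica-to-Fortran step described for VIDYN.
theta_ddot = sp.solve(sp.Eq(eom, 0), theta.diff(t, 2))[0]
print(sp.fcode(theta_ddot, assign_to='thddot'))
```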

  17. Modelling of WWER-1000 steam generators by RELAP5/MOD3.2 code

    Energy Technology Data Exchange (ETDEWEB)

    D'Auria, F.; Galassi, G.M. [Univ. of Pisa (Italy)]; Frogheri, M. [Univ. of Genova (Italy)]

    1997-12-31

    The presentation summarises the results of best estimate calculations carried out with reference to the WWER-1000 Nuclear Power Plant, utilizing a qualified nodalization set-up for the Relap5/Mod3.2 code. The nodalization development has been based on the data of the Kozloduy Bulgarian Plant. The geometry of the steam generator imposed drastic changes in noding philosophy with respect to what is suitable for U-tube steam generators. For the secondary side a symmetry axis was chosen to separate (in the nodalization) the hot and the cold sides of the tubes. In this way the secondary side of the steam generators was divided into three zones: (a) the hot zone including the hot collector and the hot 1/2 parts of the tubes; (b) the cold zone including the cold collector and the cold 1/2 parts of the tubes; (c) the downcomer region, where down flow is assumed. As a consequence of the above, in the primary side more nodes are placed on the hot side of the tubes. Steady state and transient qualification has been achieved, considering the criteria proposed at the University of Pisa, utilizing plant transient data from the Kozloduy and the Ukrainian Zaporosche Plants. The results of the application of the qualified WWER-1000 Relap5/Mod3.2 nodalization to various transients including large break LOCA, small break LOCA and steam generator tube rupture, together with a sensitivity analysis on the steam generators, are reported in the presentation. Emphasis is given to the prediction of the steam generators' performance. 23 refs.

  19. Impact of Distributed Generation Grid Code Requirements on Islanding Detection in LV Networks

    Directory of Open Access Journals (Sweden)

    Fabio Bignucolo

    2017-01-01

    Full Text Available The recent growing diffusion of dispersed generation in low voltage (LV) distribution networks is entailing new rules to make local generators participate in network stability. Consequently, national and international grid codes, which define the connection rules for stability and safety of electrical power systems, have been updated, requiring distributed generators and electrical storage systems to supply stabilizing contributions. In this scenario, specific attention has to be paid to the uncontrolled islanding issue, since the currently required anti-islanding protection systems, based on relays locally measuring voltage and frequency, could no longer be suitable. In this paper, the effects of different LV generators’ stabilizing functions on the interface protection performance are analysed. The study takes into account existing requirements, such as the generators’ active power regulation (according to the measured frequency) and reactive power regulation (depending on the locally measured voltage). In addition, the paper focuses on other stabilizing features under discussion, derived from the medium voltage (MV) distribution network grid codes or proposed in the literature, such as fast voltage support (FVS) and inertia emulation. The stabilizing functions have been reproduced in the DIgSILENT PowerFactory 2016 software environment, making use of its native programming language. They are then tested both alone and together, aiming at a comprehensive analysis of their impact on the anti-islanding protection effectiveness. Through dynamic simulations in several network scenarios, the paper demonstrates the detrimental impact that such stabilizing regulations may have on loss-of-main protection effectiveness, leading to an increased risk of unintentional islanding.
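    The interface (loss-of-mains) protections discussed above act on locally measured voltage and frequency. A minimal sketch of such a relay check follows; the window limits are illustrative placeholders, not the settings of any particular grid code.

```python
def interface_protection_trips(voltage_pu, freq_hz,
                               v_min=0.85, v_max=1.10,
                               f_min=47.5, f_max=51.5):
    """Return True if a local voltage/frequency relay would trip.

    The threshold values are illustrative only; real settings are
    dictated by the applicable national grid code.
    """
    in_window = (v_min <= voltage_pu <= v_max) and (f_min <= freq_hz <= f_max)
    return not in_window

print(interface_protection_trips(1.0, 50.0))   # nominal operation -> False
print(interface_protection_trips(1.02, 52.3))  # frequency drift in an island -> True
```

The paper's point is precisely that stabilizing functions can hold voltage and frequency inside such windows after the grid connection is lost, so a check of this kind may never trip.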

  20. Analysis of steam generator loss-of-feedwater experiments with APROS and RELAP5/MOD3.1 computer codes

    International Nuclear Information System (INIS)

    Virtanen, E.; Haapalehto, T.; Kouhia, J.

    1997-01-01

    Three experiments were conducted to study the behaviour of the new horizontal steam generator construction of the PACTEL test facility. In the experiments the secondary side coolant level was reduced stepwise. The experiments were calculated with two computer codes, RELAP5/MOD3.1 and APROS version 2.11. A similar nodalization scheme was used for both codes so that the results may be compared. Only the steam generator was modeled and the rest of the facility was given as a boundary condition. The results show that both codes calculate the behaviour of the primary side of the steam generator well. On the secondary side both codes calculate lower steam temperatures in the upper part of the heat exchange tube bundle than were measured in the experiments. (orig.)

  2. The generation of absorbed dose profiles of proton beam in water using Geant4 code

    International Nuclear Information System (INIS)

    Christovao, Marilia T.; Campos, Tarcisio Passos R. de

    2007-01-01

    The present article approaches simulations of proton beam radiation therapy, using an application based on the GEANT4 code, with OpenGL as a visualization driver and the JAS3 (Java Analysis Studio) data analysis tools, implementing the AIDA interfaces. Proton radiotherapy is suited to treating cancers or other benign tumors that are close to sensitive structures, since it allows precise irradiation of the target with high doses while the healthy tissues adjacent to vital organs are preserved, due to the physical properties of the dose profile. GEANT4 is a toolkit for simulating the transport of particles through matter in complex geometries. Taking advantage of its object-oriented design, the user can adapt or extend the tool in every domain, due to the flexibility of the code, which provides groups of subroutines for defining materials, geometries and particle properties according to the user's needs for the Monte Carlo simulation. In this paper, the beam-line parameters used in the simulation include adjustable elements, such as the range shifter (composition and dimensions) and the beam (energy, intensity, length), according to the physical processes applied. The simulation results are depth-dose profiles in water for various incident beam energies. Starting from those profiles, one can define appropriate conditions for proton radiotherapy of the ocular region. (author)
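    The position of the Bragg peak in such depth-dose profiles can be estimated without a transport code via the Bragg-Kleeman rule, R = αE^p. The constants below are approximate literature fit values for protons in water; they are assumptions for illustration, not values from the study above.

```python
def proton_range_cm(energy_mev, alpha=0.0022, p=1.77):
    """Bragg-Kleeman estimate of the proton range in water (cm).

    alpha and p are approximate fitted constants for water; treat them
    as illustrative rather than authoritative.
    """
    return alpha * energy_mev ** p

for e in (60.0, 150.0, 250.0):
    print(f"{e:5.0f} MeV -> range ~ {proton_range_cm(e):5.1f} cm")
```

For ocular treatments, which the abstract mentions, beam energies of roughly 60-70 MeV give ranges of about 3 cm, consistent with the depth of the eye.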

  3. GeNN: a code generation framework for accelerated brain simulations

    Science.gov (United States)

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-01

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that a 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance-based Hodgkin-Huxley neurons, but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/.
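    The code-generation idea behind GeNN can be shown in miniature: the neuron model is described as data, and source code for the update step is generated from that description and compiled at build time. This toy sketch emits Python instead of the C++/CUDA that GeNN produces, and the leaky integrate-and-fire model and its parameter values are invented for the example.

```python
# A neuron model described as data: state, parameters, update expressions.
LIF_MODEL = {
    "name": "lif",
    "params": {"tau": 20.0, "v_rest": -65.0, "v_thresh": -50.0, "v_reset": -65.0},
    "update": "v += dt * ((v_rest - v) / tau + i_in)",
    "spike": "v >= v_thresh",
    "reset": "v = v_reset",
}

def generate_update_source(model):
    """Emit source code for one integration step of the given model."""
    lines = [f"def {model['name']}_step(v, i_in, dt):"]
    for name, value in model["params"].items():
        lines.append(f"    {name} = {value}")
    lines.append(f"    {model['update']}")
    lines.append(f"    spiked = {model['spike']}")
    lines.append(f"    if spiked: {model['reset']}")
    lines.append("    return v, spiked")
    return "\n".join(lines)

src = generate_update_source(LIF_MODEL)
namespace = {}
exec(src, namespace)  # "build step": compile the generated kernel
lif_step = namespace["lif_step"]

v, spiked = lif_step(-64.0, 0.0, 1.0)  # one step with no input current
print(v, spiked)
```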

  4. CCFL in hot legs and steam generators and its prediction with the CATHARE code

    International Nuclear Information System (INIS)

    Geffraye, G.; Bazin, P.; Pichon, P.

    1995-01-01

    This paper presents a study about the Counter-Current Flow Limitation (CCFL) prediction in hot legs and steam generators (SG) in both system test facilities and pressurized water reactors. Experimental data are analyzed, particularly the recent MHYRESA test data. Geometrical and scale effects on the flooding behavior are shown. The CATHARE code modelling problems concerning the CCFL prediction are discussed. A method which gives the user the possibility of controlling the flooding limit at a given location is developed. In order to minimize the user effect, a methodology is proposed to the user in case of a calculation with a counter-current flow between the upper plenum and the SG U-tubes. The following questions have to be made clear for the user: when to use the CATHARE CCFL option, which correlation to use, and where to locate the flooding limit.
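    Flooding limits of this kind are commonly expressed in the Wallis form √(j*g) + m·√(j*f) = C, relating the dimensionless gas upflow and liquid downflow. The sketch below inverts that relation; the constants m and C are illustrative, since the real values depend on geometry and on the correlation the user selects.

```python
import math

def max_liquid_downflow(jg_star, m=1.0, c=0.725):
    """Largest dimensionless liquid downflow jf* allowed by a Wallis-type
    flooding correlation sqrt(jg*) + m*sqrt(jf*) = C.

    m and c are illustrative placeholders; in practice they depend on the
    geometry (hot leg, SG inlet) and on the chosen CCFL correlation.
    """
    residual = (c - math.sqrt(jg_star)) / m
    return max(residual, 0.0) ** 2

print(max_liquid_downflow(0.0))         # no gas upflow: jf* = (C/m)**2
print(max_liquid_downflow(0.725 ** 2))  # at the flooding point: no downflow
```

Rising gas upflow monotonically reduces the permitted liquid downflow until it is cut off entirely, which is the behavior a code-level CCFL option imposes at the chosen location.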

  6. Evaluation of Material Models within LS-DYNA(Registered TradeMark) for a Kevlar/Epoxy Composite Honeycomb

    Science.gov (United States)

    Polanco, Michael A.; Kellas, Sotiris; Jackson, Karen

    2009-01-01

    The performance of material models to simulate a novel composite honeycomb Deployable Energy Absorber (DEA) was evaluated using the nonlinear explicit dynamic finite element code LS-DYNA(Registered TradeMark). Prototypes of the DEA concept were manufactured using a Kevlar/Epoxy composite material in which the fibers are oriented at +/-45 degrees with respect to the loading axis. The development of the DEA has included laboratory tests at subcomponent and component levels such as three-point bend testing of single hexagonal cells, dynamic crush testing of single multi-cell components, and impact testing of a full-scale fuselage section fitted with a system of DEA components onto multi-terrain environments. Due to the thin nature of the cell walls, the DEA was modeled using shell elements. In an attempt to simulate the dynamic response of the DEA, it was first represented using *MAT_LAMINATED_COMPOSITE_FABRIC, or *MAT_58, in LS-DYNA. Values for each parameter within the material model were generated such that an in-plane isotropic configuration for the DEA material was assumed. Analytical predictions showed that the load-deflection behavior of a single-cell during three-point bending was within the range of test data, but predicted the DEA crush response to be very stiff. In addition, a *MAT_PIECEWISE_LINEAR_PLASTICITY, or *MAT_24, material model in LS-DYNA was developed, which represented the Kevlar/Epoxy composite as an isotropic elastic-plastic material with input from +/-45 degrees tensile coupon data. The predicted crush response matched that of the test and localized folding patterns of the DEA were captured under compression, but the model failed to predict the single-cell three-point bending response.

  7. Aspects of the design of the automated system for code generation of electrical items of technological equipment

    Directory of Open Access Journals (Sweden)

    Erokhin V.V.

    2017-09-01

    Full Text Available The article presents aspects of designing an automated system for generating codes for the electrical elements of process equipment using CASE tools. We propose our own methodology for the iterative development of such systems. The proposed methodology uses the ERwin Data Modeler database development tool from Computer Associates and the author's tool for automatic code generation, ERwin Class Builder. The implemented design tool is a superstructure over ERwin Data Modeler from Computer Associates, which extends its functionality. ERwin Data Modeler works with logical and physical data models and allows you to generate a database description and DDL scripts.

  8. SPRINT: A Tool to Generate Concurrent Transaction-Level Models from Sequential Code

    Directory of Open Access Journals (Sweden)

    Richard Stahl

    2007-01-01

    Full Text Available A high-level concurrent model such as a SystemC transaction-level model can provide early feedback during the exploration of implementation alternatives for state-of-the-art signal processing applications like video codecs on a multiprocessor platform. However, the creation of such a model starting from sequential code is a time-consuming and error-prone task. It is typically done only once, if at all, for a given design. This lack of exploration of the design space often leads to a suboptimal implementation. To support our systematic C-based design flow, we have developed a tool to generate a concurrent SystemC transaction-level model for user-selected task boundaries. Using this tool, different parallelization alternatives have been evaluated during the design of an MPEG-4 simple profile encoder and an embedded zero-tree coder. Generation plus evaluation of an alternative was possible in less than six minutes. This is fast enough to allow extensive exploration of the design space.

  9. Using MathWorks' Simulink® and Real-Time Workshop® Code Generator to Produce Attitude Control Test and Flight Code

    OpenAIRE

    Salada, Mark; Dellinger, Wayne

    1998-01-01

    This paper describes the use of a commercial product, MathWorks' Real-Time Workshop® (RTW), to generate actual flight code for NASA's Thermosphere, Ionosphere, Mesosphere Energetics and Dynamics (TIMED) mission. The Johns Hopkins University Applied Physics Laboratory is handling the design and construction of this satellite for NASA. As TIMED is scheduled to launch in May of the year 2000, software development for both ground and flight systems is well on its way. However, based on experien...

  10. Multi-scale Material Parameter Identification Using LS-DYNA® and LS-OPT®

    Energy Technology Data Exchange (ETDEWEB)

    Stander, Nielen; Basudhar, Anirban; Basu, Ushnish; Gandikota, Imtiaz; Savic, Vesna; Sun, Xin; Choi, Kyoo Sil; Hu, Xiaohua; Pourboghrat, F.; Park, Taejoon; Mapar, Aboozar; Kumar, Sharvan; Ghassemi-Armaki, Hassan; Abu-Farha, Fadi

    2015-09-14

    Ever-tightening regulations on fuel economy, and the likely future regulation of carbon emissions, demand persistent innovation in vehicle design to reduce vehicle mass. Classical methods for computational mass reduction include sizing, shape and topology optimization. One of the few remaining options for weight reduction can be found in materials engineering and material design optimization. Apart from considering different types of materials, by adding material diversity and composite materials, an appealing option in automotive design is to engineer steel alloys for the purpose of reducing plate thickness while retaining sufficient strength and ductility required for durability and safety. A project to develop computational material models for advanced high strength steel is currently being executed under the auspices of the United States Automotive Materials Partnership (USAMP), funded by the US Department of Energy. Under this program, new Third Generation Advanced High Strength Steels (3GAHSS) are being designed, tested and integrated with the remaining design variables of a benchmark vehicle Finite Element model. The objectives of the project are to integrate atomistic, microstructural, forming and performance models to create an integrated computational materials engineering (ICME) toolkit for 3GAHSS. The mechanical properties of Advanced High Strength Steels (AHSS) are controlled by many factors, including phase composition and distribution in the overall microstructure; volume fraction, size and morphology of phase constituents; as well as stability of the metastable retained austenite phase. The complex phase transformation and deformation mechanisms in these steels make the well-established traditional techniques obsolete, and a multi-scale, microstructure-based modeling approach following the ICME strategy was therefore chosen in this project. Multi-scale modeling as a major area of research and development is an outgrowth of the Comprehensive

  11. LS1 Report: nearing the finish line

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    The LS1 team will be popping the champagne next week, on Tuesday 27 May, celebrating the completion of the consolidation of the splices in the framework of the SMACC project.   A technician works on one of the final shunts during LS1. "It has been a long journey into the heart of the LHC, tackling over 27,000 shunts*," says Luca Bottura, TE-MSC Group leader. "We are happy that the final train has, at last, reached its rest station, and look forward to sending it on many new adventures," confirm Frédéric Savary, TE-MSC Large Magnet Facility Section leader, and Jean-Philippe Tock, SMACC Project leader. Also in the LHC, pressure tests in Sector 1-2 - the third sector to be tackled - are almost complete. The temperature in Sector 6-7 is around 100K and it will be accessible again from next week. As for the SPS, all the LSS1 beam elements excluding one monitor are back in position. Vacuum teams are now ...

  12. Non-stop training during LS1!

    CERN Multimedia

    HSE Unit

    2013-01-01

    The year 2013 is a busy year for the Safety Training team, who are seeing a dramatic increase in their activities during LS1. The Safety Training Service within the HSE Unit offers training courses all year round to people working on the CERN site who are exposed to a variety of potential hazards (e.g. chemical hazards, fire hazards, etc.) either because of the activities they perform (e.g. work in confined spaces or on machines) and/or their place of work (e.g. workshops, laboratories, underground areas, etc.).   LS1 has triggered an increase in the number of requests for training, mainly from people requiring to carry out work on the LHC. Indeed, in order to access the underground areas, it is obligatory to have taken certain safety courses such as the self-rescue mask or radiation protection training courses. Consequently, the number of training sessions and the number of people trained is currently twice what it was during the same period in 2012, with almost 4,600 people trained in 530 s...

  13. Thermal-hydraulic analysis of SMART steam generator tube rupture using TASS/SMR-S code

    International Nuclear Information System (INIS)

    Kim, Hee-Kyung; Kim, Soo Hyoung; Chung, Young-Jong; Kim, Hyeon-Soo

    2013-01-01

    Highlights: ► The analysis was performed from the viewpoint of primary coolant leakage. ► The thermal hydraulic responses and the maximum leakage have been identified. ► There is no direct release into the atmosphere caused by an SGTR accident. ► SMART safety system works well against an SGTR accident. - Abstract: A steam generator tube rupture (SGTR) accident analysis for SMART was performed using the TASS/SMR-S code. SMART, with a rated thermal power of 330 MWt, has been developed at the Korea Atomic Energy Research Institute. The TASS/SMR-S code can analyze the thermal hydraulic phenomena of SMART over the full range of reactor operating conditions. An SGTR is one of the most important accidents from a thermal hydraulic and radiological viewpoint. A conservative analysis of a SMART SGTR was performed. The major concern of this analysis is to find the thermal hydraulic responses and the maximum leakage from the primary to the secondary side caused by an SGTR accident. A sensitivity study searching for the conservative thermal hydraulic conditions, break locations, reactivity and other conditions was performed. The dominant parameters related to the integral leak are a high RCS pressure, a low core inlet coolant temperature and a low break location in the SG cassette. The largest integral leak comes to 28 tons over 1 h in the most conservative case. But there is no direct release into the atmosphere because the secondary system pressure is maintained with a sufficient margin to the design pressure. All leaks go to the condenser. The analysis results show that the primary and secondary system pressures are maintained below the design pressure and the SMART safety system works well against an SGTR accident

  14. Calculation of neutron spectra produced in neutron generator target: Code testing.

    Science.gov (United States)

    Gaganov, V V

    2018-03-01

    DT-neutron spectra calculated using the SRIANG code were benchmarked against the results obtained by the widely used Monte Carlo codes PROFIL, SHORIN, TARGET, ENEA-JSI, MCUNED, DDT and NEUSDESC. The comparison of the spectra obtained by the different codes confirmed the correctness of the SRIANG calculations. Cross-checking of the compared spectra revealed some systematic features and possible errors in the analysed codes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    Directory of Open Access Journals (Sweden)

    CARVALHO, J. S. C.

    2008-12-01

    Full Text Available During the development of software, one of the most visible risks, and perhaps the biggest implementation obstacle, relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which gives rise to a development tool for the automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.
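    The kind of metamodel-driven generation the abstract proposes can be sketched in a few lines: the class is described once as data, and the source code that implements it is produced from a template, keeping the implementation pattern transparent to the programmer. The class and field names below are invented for the example.

```python
# A toy metamodel: the class is described once as data.
CLASS_MODEL = {
    "name": "Customer",
    "fields": [("name", "str"), ("email", "str"), ("active", "bool")],
}

def generate_class(model):
    """Emit Python source for a class whose __init__ stores each field."""
    lines = [f"class {model['name']}:"]
    args = ", ".join(f"{f}: {t}" for f, t in model["fields"])
    lines.append(f"    def __init__(self, {args}):")
    for f, _ in model["fields"]:
        lines.append(f"        self.{f} = {f}")
    return "\n".join(lines)

src = generate_class(CLASS_MODEL)
ns = {}
exec(src, ns)  # "build step": compile the generated artifact
Customer = ns["Customer"]

c = Customer("Ada", "ada@example.com", True)
print(src)
print(c.name, c.email, c.active)
```

Changing the metamodel regenerates the implementation, so the coding pattern never has to be written by hand again, which is the time saving the abstract targets.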

  16. The large-scale blast score ratio (LS-BSR) pipeline: a method to rapidly compare genetic content between bacterial genomes

    Directory of Open Access Journals (Sweden)

    Jason W. Sahl

    2014-04-01

    Full Text Available Background. As whole genome sequence data from bacterial isolates becomes cheaper to generate, computational methods are needed to correlate sequence data with biological observations. Here we present the large-scale BLAST score ratio (LS-BSR) pipeline, which rapidly compares the genetic content of hundreds to thousands of bacterial genomes, and returns a matrix that describes the relatedness of all coding sequences (CDSs) in all genomes surveyed. This matrix can be easily parsed in order to identify genetic relationships between bacterial genomes. Although pipelines have been published that group peptides by sequence similarity, no other software performs the rapid, large-scale, full-genome comparative analyses carried out by LS-BSR. Results. To demonstrate the utility of the method, the LS-BSR pipeline was tested on 96 Escherichia coli and Shigella genomes; the pipeline ran in 163 min using 16 processors, a greater than 7-fold speedup compared to using a single processor. The BSR values for each CDS, which indicate a relative level of relatedness, were then mapped to each genome on an independent core genome single nucleotide polymorphism (SNP)-based phylogeny. Comparisons were then used to identify clade-specific CDS markers and validate the LS-BSR pipeline based on molecular markers that delineate between classical E. coli pathogenic variant (pathovar) designations. Scalability tests demonstrated that the LS-BSR pipeline can process 1,000 E. coli genomes in 27–57 h, depending upon the alignment method, using 16 processors. Conclusions. LS-BSR is an open-source, parallel implementation of the BSR algorithm, enabling rapid comparison of the genetic content of large numbers of genomes. The results of the pipeline can be used to identify specific markers between user-defined phylogenetic groups, and to identify the loss and/or acquisition of genetic information between bacterial isolates. Taxa-specific genetic markers can then be translated
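
    The BLAST score ratio at the heart of LS-BSR can be sketched as follows: for each CDS, the bit score of its best hit in each genome is divided by the bit score of the CDS aligned against itself, giving a value near 1.0 for conserved genes and near 0 for absent ones. The bit scores below are invented for illustration, and `bsr_matrix` is a hypothetical helper, not part of the pipeline's actual API.

    ```python
    # Illustrative sketch of the BLAST score ratio (BSR) idea behind LS-BSR.
    # Bit scores are made-up numbers, not real BLAST output.

    def bsr_matrix(self_scores, hit_scores):
        """Return {cds: {genome: best_hit_score / self_score}} clamped to [0, 1]."""
        matrix = {}
        for cds, self_score in self_scores.items():
            row = {}
            for genome, score in hit_scores.get(cds, {}).items():
                row[genome] = min(score / self_score, 1.0)
            matrix[cds] = row
        return matrix

    self_scores = {"cdsA": 200.0, "cdsB": 150.0}
    hit_scores = {
        "cdsA": {"genome1": 198.0, "genome2": 40.0},
        "cdsB": {"genome1": 150.0, "genome2": 0.0},
    }
    m = bsr_matrix(self_scores, hit_scores)
    # cdsA is conserved in genome1 (BSR 0.99) but divergent in genome2 (BSR 0.2)
    ```

    Parsing such a matrix against a set of genome groups is how clade-specific markers like those described above would be identified.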

  17. Development of a 3D FEL code for the simulation of a high-gain harmonic generation experiment

    International Nuclear Information System (INIS)

    Biedron, S. G.

    1999-01-01

    Over the last few years, there has been growing interest in self-amplified spontaneous emission (SASE) free-electron lasers (FELs) as a means of achieving a fourth-generation light source. In order to correctly and easily simulate the many configurations that have been suggested, such as multi-segmented wigglers and the method of high-gain harmonic generation, we have developed a robust three-dimensional code. The specifics of the code, its comparison with linear theory, and future plans are presented.

  18. XML-Based Generator of C++ Code for Integration With GUIs

    Science.gov (United States)

    Hua, Hook; Oyafuso, Fabiano; Klimeck, Gerhard

    2003-01-01

    An open source computer program has been developed to satisfy a need for simplified organization of structured input data for scientific simulation programs. Typically, such input data are parsed from a flat American Standard Code for Information Interchange (ASCII) text file into computational data structures. Also typically, when a graphical user interface (GUI) is used, there is a need to completely duplicate the input information while providing it to the user in a more structured form. Heretofore, this duplication of the input information has entailed duplicated software effort and increased susceptibility to software errors because of the concomitant need to maintain two independent input-handling mechanisms. The present program implements a method in which the input data for a simulation program are completely specified in an Extensible Markup Language (XML)-based text file. A key benefit of XML is that it stores input data in a structured manner; more importantly, XML allows not just storing the data but also describing what each data item is. The XML file thus contains information useful for rendering the data by other applications. The program then generates data structures in the C++ language that are to be used in the simulation program. In this method, all input data are specified in one place only, and it is easy to integrate the data structures into both the simulation program and the GUI. XML-to-C is useful in two ways: (1) as an executable, it generates the corresponding C++ classes, and (2) as a library, it automatically fills the objects with the input data values.
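
    The core idea of generating C++ data structures from an XML input specification can be sketched in a few lines. The `<input>`/`<param>` schema below is invented for illustration and is not XML-to-C's actual format.

    ```python
    # Minimal sketch: read a structured XML input specification and emit a
    # matching C++ struct. The schema here is hypothetical.
    import xml.etree.ElementTree as ET

    SPEC = """<input name="SimulationInput">
      <param name="num_steps" type="int"/>
      <param name="tolerance" type="double"/>
    </input>"""

    def generate_cpp(xml_text):
        """Emit a C++ struct declaration mirroring the XML specification."""
        root = ET.fromstring(xml_text)
        lines = ["struct %s {" % root.get("name")]
        for p in root.findall("param"):
            lines.append("    %s %s;" % (p.get("type"), p.get("name")))
        lines.append("};")
        return "\n".join(lines)

    code = generate_cpp(SPEC)
    ```

    Because both the generated C++ classes and any GUI rendering read from the same XML file, the input is specified in exactly one place, which is the duplication problem the abstract describes.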

  19. Pulse laser-induced generation of cluster codes from metal nanoparticles for immunoassay applications

    Directory of Open Access Journals (Sweden)

    Chia-Yin Chang

    2017-05-01

    Full Text Available In this work, we have developed an assay for the detection of proteins by functionalized nanomaterials coupled with laser-induced desorption/ionization mass spectrometry (LDI-MS) by monitoring the generation of metal cluster ions. We achieved selective detection of three proteins [thrombin, vascular endothelial growth factor-A165 (VEGF-A165), and platelet-derived growth factor-BB (PDGF-BB)] by modifying nanoparticles (NPs) of three different metals (Au, Ag, and Pt) with the corresponding aptamer or antibody in one assay. The Au, Ag, and Pt acted as metal bio-codes for the analysis of thrombin, VEGF-A165, and PDGF-BB, respectively, and a microporous cellulose acetate membrane (CAM) served as a medium for an in situ separation of target protein-bound and -unbound NPs. The functionalized metal nanoparticles bound to their specific proteins were subjected to LDI-MS on the CAM. The functional nanoparticles/CAM system can function as a signal transducer and amplifier by transforming the protein concentration into an intense metal cluster ion signal during LDI-MS analysis. This system can selectively detect proteins at picomolar concentrations. Most importantly, the system has great potential for the detection of multiple proteins without any pre-concentration, separation, or purification process because LDI-MS coupled with CAM effectively removes all signals except for those from the metal cluster ions.

  20. UFOs in the LHC after LS1

    International Nuclear Information System (INIS)

    Baer, T.; Barnes, M.J.; Carlier, E.; Cerutti, F.; Dehning, B.; Ducimetiere, L.; Ferrari, A.; Garrel, N.; Gerardin, A.; Goddard, B.; Holzer, E.B.; Jackson, S.; Jimenez, J.M.; Kain, V.; Lechner, A.; Mertens, V.; Misiowiec, M.; Moron Ballester, R.; Nebot del Busto, E.; Norderhaug Drosdal, L.; Nordt, A.; Uythoven, J.; Velghe, B.; Vlachoudis, V.; Wenninger, J.; Zamantzas, C.; Zimmermann, F.; Fuster Martinez, N.

    2012-01-01

    UFOs (Unidentified Falling Objects) are potentially a major luminosity limitation for nominal LHC operation. With large-scale increases of the BLM thresholds, their impact on LHC availability was mitigated in the second half of 2011. For higher beam energy and lower magnet quench limits, the problem is expected to be considerably worse, though. Therefore, in 2011, the diagnostics for UFO events were significantly improved, dedicated experiments and measurements in the LHC and in the laboratory were made and complemented by FLUKA simulations and theoretical studies. In this paper, the state of knowledge is summarized and extrapolations for LHC operation after LS1 are presented. Mitigation strategies are proposed and related tests and measures for 2012 are specified. (authors)

  1. UFOs in the LHC after LS1

    CERN Document Server

    Baer, T; Carlier, E; Cerutti, F; Dehning, B; Ducimetière, L; Ferrari, A; Garrel, N; Gérardin, A; Goddard, B; Holzer, E B; Jackson, S; Jimenez, J M; Kain, V; Lechner, A; Mertens, V; Misiowiec, M; Morón Ballester, R; Nebot del Busto, E; Norderhaug Drosdal, L; Nordt, A; Uythoven, J; Velghe, B; Vlachoudis, V; Wenninger, J; Zamantzas, C; Zimmermann, F; Fuster Martinez, N

    2012-01-01

    UFOs (Unidentified Falling Objects) are potentially a major luminosity limitation for nominal LHC operation. With large-scale increases of the BLM thresholds, their impact on LHC availability was mitigated in the second half of 2011. For higher beam energy and lower magnet quench limits, the problem is expected to be considerably worse, though. Therefore, in 2011, the diagnostics for UFO events were significantly improved, dedicated experiments and measurements in the LHC and in the laboratory were made and complemented by FLUKA simulations and theoretical studies. In this paper, the state of knowledge is summarized and extrapolations for LHC operation after LS1 are presented. Mitigation strategies are proposed and related tests and measures for 2012 are specified.

  2. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    Energy Technology Data Exchange (ETDEWEB)

    Arndt, S.A.

    1997-07-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current-generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirements of real-time applications. The next generation of thermo-hydraulic codes will need to include in their specifications the explicit requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and for PRA practitioners who will increasingly use real-time simulation to evaluate PRA success criteria in near real time, validating PRA results for specific configurations and plant system unavailabilities.

  3. GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

  4. GENII [Generation II]: The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    International Nuclear Information System (INIS)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs

  5. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    International Nuclear Information System (INIS)

    Arndt, S.A.

    1997-01-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current-generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirements of real-time applications. The next generation of thermo-hydraulic codes will need to include in their specifications the explicit requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and for PRA practitioners who will increasingly use real-time simulation to evaluate PRA success criteria in near real time, validating PRA results for specific configurations and plant system unavailabilities.

  6. A code system to generate multigroup cross-sections using basic data

    International Nuclear Information System (INIS)

    Garg, S.B.; Kumar, Ashok

    1978-01-01

    For the neutronic studies of nuclear reactors, multigroup cross-sections derived from basic energy-point data are needed. In order to carry out design-based studies, these cross-sections should also incorporate temperature and fuel-concentration effects. To meet these requirements, a code system comprising the RESRES, UNRES, FIGERO, INSCAT, FUNMO, AVER1 and BGPONE codes has been adopted. The function of each of these codes is discussed. (author)
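
    The central operation in condensing pointwise data into multigroup constants is flux weighting: within each group g, the group cross-section is the weighted average sigma_g = sum(sigma_i * phi_i) / sum(phi_i) over the energy points in the group. The energy points, cross-sections and weighting spectrum below are illustrative, not taken from the code system described.

    ```python
    # Hedged sketch of flux-weighted multigroup condensation.

    def collapse(energies, sigma, flux, group_edges):
        """Collapse pointwise cross-sections into groups by flux weighting."""
        groups = []
        for lo, hi in zip(group_edges[:-1], group_edges[1:]):
            num = den = 0.0
            for e, s, f in zip(energies, sigma, flux):
                if lo <= e < hi:
                    num += s * f   # flux-weighted cross-section sum
                    den += f       # total flux weight in the group
            groups.append(num / den if den else 0.0)
        return groups

    energies = [1.0, 2.0, 3.0, 4.0]   # eV, illustrative energy points
    sigma    = [10.0, 8.0, 4.0, 2.0]  # barns, illustrative cross-sections
    flux     = [1.0, 3.0, 2.0, 2.0]   # arbitrary weighting spectrum
    g = collapse(energies, sigma, flux, [0.0, 2.5, 5.0])
    # group 1: (10*1 + 8*3)/(1+3) = 8.5 ; group 2: (4*2 + 2*2)/(2+2) = 3.0
    ```

    Temperature and concentration effects enter by changing the pointwise sigma (e.g. Doppler-broadened resonances) before this averaging step.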

  7. ARTEMIS: The core simulator of AREVA NP's next generation coupled neutronics/thermal-hydraulics code system ARCADIA®

    International Nuclear Information System (INIS)

    Hobson, Greg; Merk, Stephan; Bolloni, Hans-Wilhelm; Breith, Karl-Albert; Curca-Tivig, Florin; Van Geemert, Rene; Heinecke, Jochen; Hartmann, Bettina; Porsch, Dieter; Tiles, Viatcheslav; Dall'Osso, Aldo; Pothet, Baptiste

    2008-01-01

    AREVA NP has developed a next-generation coupled neutronics/thermal-hydraulics code system, ARCADIA®, to fulfil customers' current demands and even anticipate their future demands in terms of accuracy and performance. The new code system will be implemented world-wide and will replace several code systems currently used in various global regions. An extensive phase of verification and validation of the new code system is currently in progress. One of the principal components of this new system is the core simulator, ARTEMIS. Besides the stand-alone tests on the individual computational modules, integrated tests on the overall code are being performed in order to check for non-regression as well as for verification of the code. Several benchmark problems have been successfully calculated. Full-core depletion cycles of different plant types from AREVA's French, American and German regions (e.g. N4 and KONVOI types) have been performed with ARTEMIS (using APOLLO2-A cross sections) and compared directly with current production codes, e.g. with SCIENCE and CASCADE-3D, and additionally with measurements. (authors)

  8. Evaluation of four-dimensional nonbinary LDPC-coded modulation for next-generation long-haul optical transport networks.

    Science.gov (United States)

    Zhang, Yequn; Arabaci, Murat; Djordjevic, Ivan B

    2012-04-09

    Leveraging the advanced coherent optical communication technologies, this paper explores the feasibility of using four-dimensional (4D) nonbinary LDPC-coded modulation (4D-NB-LDPC-CM) schemes for long-haul transmission in future optical transport networks. In contrast to our previous works on 4D-NB-LDPC-CM which considered amplified spontaneous emission (ASE) noise as the dominant impairment, this paper undertakes transmission in a more realistic optical fiber transmission environment, taking into account impairments due to dispersion effects, nonlinear phase noise, Kerr nonlinearities, and stimulated Raman scattering in addition to ASE noise. We first reveal the advantages of using 4D modulation formats in LDPC-coded modulation instead of conventional two-dimensional (2D) modulation formats used with polarization-division multiplexing (PDM). Then we demonstrate that 4D LDPC-coded modulation schemes with nonbinary LDPC component codes significantly outperform not only their conventional PDM-2D counterparts but also the corresponding 4D bit-interleaved LDPC-coded modulation (4D-BI-LDPC-CM) schemes, which employ binary LDPC codes as component codes. We also show that the transmission reach improvement offered by the 4D-NB-LDPC-CM over 4D-BI-LDPC-CM increases as the underlying constellation size and hence the spectral efficiency of transmission increases. Our results suggest that 4D-NB-LDPC-CM can be an excellent candidate for long-haul transmission in next-generation optical networks.
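
    Some back-of-envelope bookkeeping, not taken from the paper, clarifies the spectral-efficiency comparison: a 4D constellation of size M carries log2(M) coded bits per dual-polarization symbol, and multiplying by the code rate R gives the information bits. PDM-QPSK viewed as a single 4D format has M = 4 × 4 = 16 points; the M and R values below are illustrative.

    ```python
    import math

    # Information bits per 4D (dual-polarization) symbol for coded modulation.
    def info_bits_per_4d_symbol(m, code_rate):
        return code_rate * math.log2(m)

    pdm_qpsk = info_bits_per_4d_symbol(16, 0.8)    # 2 pols x QPSK, rate-0.8 code
    pdm_16qam = info_bits_per_4d_symbol(256, 0.8)  # 2 pols x 16-QAM, rate-0.8 code
    ```

    The bit count is the same whether the M points form a Cartesian product of two 2D constellations (the PDM view) or are placed freely in four dimensions; the gain of genuinely 4D formats discussed above comes from that extra placement freedom, not from extra bits.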

  9. Application of the thermal-hydraulic codes in VVER-440 steam generators modelling

    Energy Technology Data Exchange (ETDEWEB)

    Matejovic, P.; Vranca, L.; Vaclav, E. [Nuclear Power Plant Research Inst. VUJE (Slovakia)

    1995-12-31

    Performances of the CATHARE2 V1.3U and RELAP5/MOD3.0 codes applied to VVER-440 SG modelling during normal conditions and during a transient with secondary water lowering are described. A similar recirculation model was chosen for both codes. In the CATHARE calculation, no special measures were taken to artificially optimize the flow rate distribution coefficients for the junction between the SG riser and the steam dome. Contrary to the RELAP code, the CATHARE code is able to predict the secondary swell level in nominal conditions reasonably well. Both codes are able to model properly the natural phase separation on the SG water level. 6 refs.

  10. Application of the thermal-hydraulic codes in VVER-440 steam generators modelling

    Energy Technology Data Exchange (ETDEWEB)

    Matejovic, P; Vranca, L; Vaclav, E [Nuclear Power Plant Research Inst. VUJE (Slovakia)

    1996-12-31

    Performances of the CATHARE2 V1.3U and RELAP5/MOD3.0 codes applied to VVER-440 SG modelling during normal conditions and during a transient with secondary water lowering are described. A similar recirculation model was chosen for both codes. In the CATHARE calculation, no special measures were taken to artificially optimize the flow rate distribution coefficients for the junction between the SG riser and the steam dome. Contrary to the RELAP code, the CATHARE code is able to predict the secondary swell level in nominal conditions reasonably well. Both codes are able to model properly the natural phase separation on the SG water level. 6 refs.

  11. Data on genome analysis of Bacillus velezensis LS69.

    Science.gov (United States)

    Liu, Guoqiang; Kong, Yingying; Fan, Yajing; Geng, Ce; Peng, Donghai; Sun, Ming

    2017-08-01

    The data presented in this article are related to the published article entitled "Whole-genome sequencing of Bacillus velezensis LS69, a strain with a broad inhibitory spectrum against pathogenic bacteria" (Liu et al., 2017) [1]. Genome analysis revealed that B. velezensis LS69 has good potential for biocontrol and plant growth promotion. This article provides an extended analysis of the genetic islands, core genes and amylolysin loci of B. velezensis LS69.

  12. Data on genome analysis of Bacillus velezensis LS69

    OpenAIRE

    Liu, Guoqiang; Kong, Yingying; Fan, Yajing; Geng, Ce; Peng, Donghai; Sun, Ming

    2017-01-01

    The data presented in this article are related to the published article entitled “Whole-genome sequencing of Bacillus velezensis LS69, a strain with a broad inhibitory spectrum against pathogenic bacteria” (Liu et al., 2017) [1]. Genome analysis revealed that B. velezensis LS69 has good potential for biocontrol and plant growth promotion. This article provides an extended analysis of the genetic islands, core genes and amylolysin loci of B. velezensis LS69.

  13. Data on genome analysis of Bacillus velezensis LS69

    Directory of Open Access Journals (Sweden)

    Guoqiang Liu

    2017-08-01

    Full Text Available The data presented in this article are related to the published article entitled “Whole-genome sequencing of Bacillus velezensis LS69, a strain with a broad inhibitory spectrum against pathogenic bacteria” (Liu et al., 2017) [1]. Genome analysis revealed that B. velezensis LS69 has good potential for biocontrol and plant growth promotion. This article provides an extended analysis of the genetic islands, core genes and amylolysin loci of B. velezensis LS69.

  14. Influence of Terraced area DEM Resolution on RUSLE LS Factor

    Science.gov (United States)

    Zhang, Hongming; Baartman, Jantiene E. M.; Yang, Xiaomei; Gai, Lingtong; Geissen, Violette

    2017-04-01

    Topography has a large impact on the erosion of soil by water. Slope steepness and slope length are combined (the LS factor) in the universal soil-loss equation (USLE) and its revised version (RUSLE) for predicting soil erosion. The LS factor is usually extracted from a digital elevation model (DEM). The grid size of the DEM will thus influence the LS factor and the subsequent calculation of soil loss. Terracing is considered a support practice factor (P) in the USLE/RUSLE equations, which is multiplied with the other USLE/RUSLE factors. However, as terraces change the slope length and steepness, they also affect the LS factor. The effect of DEM grid size on the LS factor has not been investigated for a terraced area. We obtained a high-resolution DEM by unmanned aerial vehicle (UAV) photogrammetry, from which the slope steepness, slope length, and LS factor were extracted. The changes in these parameters at various DEM resolutions were then analysed. The DEM produced detailed LS-factor maps, particularly for low LS factors. High (small valleys, gullies, and terrace ridges) and low (flats and terrace fields) spatial frequencies were both sensitive to changes in resolution, so the areas of higher and lower slope steepness both decreased with increasing grid size. Average slope steepness decreased and average slope length increased with grid size. Slope length, however, had a larger effect than slope steepness on the LS factor as the grid size varied. The LS factor increased when the grid size increased from 0.5 to 30 m, and increased significantly at grid sizes >5 m. The LS factor was increasingly overestimated as grid size decreased. The LS factor decreased from grid sizes of 30 to 100 m, because the details of the terraced terrain were gradually lost, but the factor was still overestimated.
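
    The slope-length/steepness interplay described above can be sketched with one widely used USLE formulation of the LS factor (after Wischmeier & Smith): LS = (L/22.13)^m × (65.41 sin²θ + 4.56 sinθ + 0.065), with slope length L in metres. The slope-class thresholds for the exponent m follow a common convention; treat all numbers as illustrative, not as this study's parameterization.

    ```python
    import math

    # One common USLE form of the LS factor; inputs and thresholds are
    # illustrative assumptions, not the paper's own model.
    def ls_factor(slope_length_m, slope_percent):
        theta = math.atan(slope_percent / 100.0)  # slope angle from percent slope
        if slope_percent >= 5.0:
            m = 0.5
        elif slope_percent >= 3.5:
            m = 0.4
        elif slope_percent >= 1.0:
            m = 0.3
        else:
            m = 0.2
        s = 65.41 * math.sin(theta) ** 2 + 4.56 * math.sin(theta) + 0.065
        return (slope_length_m / 22.13) ** m * s

    # At fixed steepness, a longer derived slope length raises LS -- one
    # mechanism behind the grid-size sensitivity described in the abstract.
    ls_coarse = ls_factor(100.0, 9.0)  # longer slope length from a coarse DEM
    ls_fine = ls_factor(50.0, 9.0)     # shorter slope length from a finer DEM
    ```

    Because the grid size changes both the derived slope length (via the exponent m) and the derived steepness (via s), their opposing trends explain why LS does not vary monotonically with resolution.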

  15. Requests from use experience of ORIGEN code. Activity of the working group on evaluation of nuclide generation and depletion

    International Nuclear Information System (INIS)

    Matsumura, Tetsuo

    2005-01-01

    A questionnaire survey on the accuracy demanded of the ORIGEN code, which is used widely in various fields of design analysis and evaluation, was carried out through the committee members of the working group on evaluation of nuclide generation and depletion. The WG committee polled the ORIGEN users in each organization and obtained replies from various fields. (author)

  16. Analysis of the VVER-440 reactor steam generator secondary side with the RELAP5/MOD3 code

    International Nuclear Information System (INIS)

    Tuunanen, J.

    1993-01-01

    The Nuclear Engineering Laboratory of the Technical Research Centre of Finland has widely used the RELAP5/MOD2 and MOD3 codes to simulate horizontal steam generators. Several models have been developed and successfully used in VVER safety analysis. Nevertheless, the models developed have included only rather few nodes on the steam generator secondary side, which has normally been divided into about 10 to 15 nodes. Since the secondary side of the steam generators of VVER-440 type reactors consists of a rather large water pool, these models were only roughly capable of predicting secondary side flows. The paper describes an attempt to use the RELAP5/MOD3 code to predict secondary side flows in a steam generator of a VVER-440 reactor. A 2D/3D model has been developed using the cross-flow junctions of the RELAP5/MOD3 code. The model includes 90 volumes on the steam generator secondary side and has been used to calculate steady-state flow conditions on the secondary side of a VVER-440 reactor steam generator. (orig.) (1 ref., 9 figs., 2 tabs.)

  17. Information rates of next-generation long-haul optical fiber systems using coded modulation

    NARCIS (Netherlands)

    Liga, G.; Alvarado, A.; Agrell, E.; Bayvel, P.

    2017-01-01

    A comprehensive study of the coded performance of long-haul spectrally-efficient WDM optical fiber transmission systems with different coded modulation decoding structures is presented. Achievable information rates are derived for three different square QAM formats and the optimal format is

  18. A Comparison of Nuclear Power Plant Simulator with RELAP5/MOD3 code about Steam Generator Tube Rupture

    International Nuclear Information System (INIS)

    Kim, Sung Hyun; Moon, Chan Ki; Park, Sung Baek; Na, Man Gyun

    2013-01-01

    The RELAP5/MOD3 code, introduced in cooperation with the U.S. NRC, has been utilized mainly for validation of accident analyses submitted by licensees in Korea. The Korea Institute of Nuclear Safety has built a verification system for LWR accident analysis with the RELAP5/MOD3 code engine. The simulator replicates design basis accidents, and comparing its results with RELAP5/MOD3 results will have important implications for the verification of the simulator in the future. In this study, SGTR simulations were performed on the simulator and the results were compared with those from the RELAP5/MOD3 code; the results can thus be used as material for building the verification system of the nuclear power plant simulator. We compared the major parameters of a steam generator tube rupture replicated on the OPR-1000 simulator at the Yonggwang training center against the RELAP5/MOD3 verification code. Comparing the changes in temperature, pressure and inventory of the reactor coolant system and main steam system during the SGTR confirmed that the main behaviors shown by the simulator and the RELAP5/MOD3 code are similar. However, the behaviors of SG pressure and level, which are important parameters for diagnosing the accident, differed somewhat. We estimate that the RELAP5/MOD3 model did not reflect in detail the major control systems, such as the FWCS, SBCS and PPCS; the differing behaviors of SG level and pressure found in this study need additional review. Overall, the major simulation parameter behaviors from the RELAP5/MOD3 code agreed well with those from the simulator, so it is expected that the RELAP5/MOD3 code can be used as a tool for validation of NPP simulators in the near future.

  19. ORIGEN2: a revised and updated version of the Oak Ridge isotope generation and depletion code

    International Nuclear Information System (INIS)

    Croff, A.G.

    1980-07-01

    ORIGEN2 is a versatile point depletion and decay computer code for use in simulating nuclear fuel cycles and calculating the nuclide compositions of materials contained therein. This code represents a revision and update of the original ORIGEN computer code which has been distributed world-wide beginning in the early 1970s. The purpose of this report is to give a summary description of a revised and updated version of the original ORIGEN computer code, which has been designated ORIGEN2. A detailed description of the computer code ORIGEN2 is presented. The methods used by ORIGEN2 to solve the nuclear depletion and decay equations are included. Input information necessary to use ORIGEN2 that has not been documented in supporting reports is documented
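
    The depletion and decay equations ORIGEN-type codes solve can be sketched on a two-member chain A → B → (stable) with decay constants lam_a and lam_b, where the analytic Bateman solution for the daughter is checked against a naive explicit-Euler integration. All numbers are illustrative; ORIGEN itself handles on the order of a thousand nuclides with a matrix exponential method, not the toy scheme below.

    ```python
    import math

    def bateman_daughter(n_a0, lam_a, lam_b, t):
        """Analytic daughter population N_B(t) for a pure two-member chain."""
        return (n_a0 * lam_a / (lam_b - lam_a)
                * (math.exp(-lam_a * t) - math.exp(-lam_b * t)))

    def euler_daughter(n_a0, lam_a, lam_b, t, steps=100000):
        """Explicit Euler for dNa/dt = -la*Na, dNb/dt = la*Na - lb*Nb."""
        dt = t / steps
        n_a, n_b = n_a0, 0.0
        for _ in range(steps):
            # simultaneous update: both derivatives use the old populations
            n_a, n_b = (n_a - lam_a * n_a * dt,
                        n_b + (lam_a * n_a - lam_b * n_b) * dt)
        return n_b

    exact = bateman_daughter(1.0e6, 0.10, 0.05, 10.0)
    approx = euler_daughter(1.0e6, 0.10, 0.05, 10.0)
    ```

    In a full depletion code the same first-order system gains neutron-induced production and destruction terms, so the "decay constants" become flux-dependent and the matrix is regenerated each burnup step.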

  20. A computer code for calculation of radioactive nuclide generation and depletion, decay heat and γ ray spectrum. FPGS90

    International Nuclear Information System (INIS)

    Ihara, Hitoshi; Katakura, Jun-ichi; Nakagawa, Tsuneo

    1995-11-01

    In a nuclear reactor, radioactive nuclides are generated and depleted as the nuclear fuel burns up. These radioactive nuclides, emitting γ rays and β rays, act as the source of decay heat in a reactor and of radiation exposure. In the safety evaluation of nuclear reactors and the nuclear fuel cycle, it is necessary to estimate the number of nuclides generated in nuclear fuel under the various burn-up conditions of the many kinds of nuclear fuel used in a reactor. FPGS90 is a code that calculates the number of nuclides, the decay heat and the spectrum of emitted γ rays from fission products produced in nuclear fuel under various burn-up conditions. The nuclear data library used in the FPGS90 code is the 'JNDC Nuclear Data Library of Fission Products - second version -', compiled by a working group of the Japanese Nuclear Data Committee for evaluating decay heat in a reactor. The code can process evaluated nuclear data files such as ENDF/B, JENDL and ENSDF, and can also plot the calculated results. Using the FPGS90 code it is possible to do all the work from building the library and calculating nuclide generation and decay heat through to plotting the calculated results. (author)

  1. A computer code for calculation of radioactive nuclide generation and depletion, decay heat and {gamma} ray spectrum. FPGS90

    Energy Technology Data Exchange (ETDEWEB)

    Ihara, Hitoshi; Katakura, Jun-ichi; Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1995-11-01

    In a nuclear reactor, radioactive nuclides are generated and depleted as the nuclear fuel burns up. These radioactive nuclides, emitting γ rays and β rays, act as the source of decay heat in a reactor and of radiation exposure. In the safety evaluation of nuclear reactors and the nuclear fuel cycle, it is necessary to estimate the number of nuclides generated in nuclear fuel under the various burn-up conditions of the many kinds of nuclear fuel used in a reactor. FPGS90 is a code that calculates the number of nuclides, the decay heat and the spectrum of emitted γ rays from fission products produced in nuclear fuel under various burn-up conditions. The nuclear data library used in the FPGS90 code is the 'JNDC Nuclear Data Library of Fission Products - second version -', compiled by a working group of the Japanese Nuclear Data Committee for evaluating decay heat in a reactor. The code can process evaluated nuclear data files such as ENDF/B, JENDL and ENSDF, and can also plot the calculated results. Using the FPGS90 code it is possible to do all the work from building the library and calculating nuclide generation and decay heat through to plotting the calculated results. (author).

  2. FERMI/LAT OBSERVATIONS OF LS 5039

    International Nuclear Information System (INIS)

    Abdo, A. A.; Ackermann, M.; Ajello, M.; Bechtol, K.; Berenji, B.; Blandford, R. D.; Bloom, E. D.; Borgland, A. W.; Atwood, W. B.; Axelsson, M.; Baldini, L.; Bellazzini, R.; Bregeon, J.; Brez, A.; Ballet, J.; Barbiellini, G.; Bastieri, D.; Baughman, B. M.; Bonamente, E.; Brigida, M.

    2009-01-01

    The first results from observations of the high-mass X-ray binary LS 5039 using the Fermi Gamma-ray Space Telescope data between 2008 August and 2009 June are presented. Our results indicate variability that is consistent with the binary period, with the emission being modulated with a period of 3.903 ± 0.005 days; the first detection of this modulation at GeV energies. The light curve is characterized by a broad peak around superior conjunction in agreement with inverse Compton scattering models. The spectrum is represented by a power law with an exponential cutoff, yielding an overall flux (100 MeV-300 GeV) of (4.9 ± 0.5(stat) ± 1.8(syst)) × 10⁻⁷ photons cm⁻² s⁻¹, with a cutoff at 2.1 ± 0.3(stat) ± 1.1(syst) GeV and photon index Γ = 1.9 ± 0.1(stat) ± 0.3(syst). The spectrum is observed to vary with orbital phase, specifically between inferior and superior conjunction. We suggest that the presence of a cutoff in the spectrum may be indicative of magnetospheric emission similar to the emission seen in many pulsars by Fermi.
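
    The spectral form fitted above, a power law with an exponential cutoff, can be sketched as dN/dE = K · E^(−Γ) · exp(−E/E_cut). The values Γ = 1.9 and E_cut = 2.1 GeV are the reported best fits; the normalization K is arbitrary here.

    ```python
    import math

    # Power law with exponential cutoff, the spectral shape reported for LS 5039.
    def dnde(e_gev, k=1.0, gamma=1.9, e_cut=2.1):
        return k * e_gev ** (-gamma) * math.exp(-e_gev / e_cut)

    # Relative to a pure power law, the cutoff suppresses the flux by
    # exp(-E/E_cut): mild well below the cutoff, a factor of e at E = E_cut.
    ratio_low = dnde(0.1) / 0.1 ** -1.9   # exp(-0.1/2.1), close to 1
    ratio_cut = dnde(2.1) / 2.1 ** -1.9   # exactly exp(-1)
    ```

    This GeV-range suppression, steeper than any single power law, is what distinguishes magnetospheric (pulsar-like) spectra from unbroken inverse-Compton power laws.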

  3. LS1 Report: short-circuit tests

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    As the LS1 draws to an end, teams move from installation projects to a phase of intense testing. Among these are the so-called 'short-circuit tests'. Currently under way at Point 7, these tests verify the cables, the interlocks, the energy extraction systems, the power converters that provide current to the superconducting magnets and the cooling system.   Thermal camera images taken during tests at point 4 (IP4). Before putting beam into the LHC, all of the machine's hardware components need to be put to the test. Out of these, the most complicated are the superconducting circuits, which have a myriad of different failure modes with interlock and control systems. While these will be tested at cold - during powering tests to be done in August - work can still be done beforehand. "While the circuits in the magnets themselves cannot be tested at warm, what we can do is verify the power converter and the circuits right up to the place the cables go into the magn...

  4. LS1 Report: achieving the unachievable

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    The dismantling and extraction of a defective DFBA module from LHC Point 6, announced a few weeks ago, has been completed without a hitch. The DFBAs in the LHC are unique and irreplaceable components that must be handled with care.   The Transport team extracts the defective module in one of the two DFBAs at Point 6. This module was brought to the surface, where it is currently being repaired. Dismantling and extracting part of an electrical feed box (DFBA) had not been planned and could not have been foreseen. Nonetheless, that is what had to be done. When the LS1 teams discovered that the bellows of one of the DFBAs in Sector 5-6 were damaged - and completely inaccessible - they were not exactly overwhelmed with solutions. In fact, they had only one option: to dismantle them and take them up to the surface. Step 1: measure the alignment of the module to be taken out in relation to the beam lines to ensure that when the DFBA is put back in, it is in the right position for the beam to pass thr...

  5. LS1 Report: Setting the bar high

    CERN Multimedia

    Anaïs Schaeffer

    2014-01-01

    This week LS1 successfully passed an important milestone: the first pressure test of a complete sector, sector 6-7.  The objective of this test was to check the mechanical integrity and overall leak-tightness of this section of the LHC by injecting it with pressurised helium.   The team in charge of the preparation and of the realisation of the pressure tests in sector 6-7. “Given the scale of the work and of the operations carried out during 2013, particularly in the framework of the SMACC project and of the repair of the compensators of the cryogenic distribution line (QRL), we need to revalidate the integrity of the systems before the accelerator starts up again,” explains Olivier Pirotte, who is in charge of the pressure tests (TE-CRG). The pressure tests are performed over a single day after two weeks of intensive activity to prepare and specially configure the cryogenic instrumentation in the tunnel, and the pressure within a sector is increased in stages,...

  6. LS1 Report: Summer cool down

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    As the final LS1 activities are carried out in the machine, teams have been cooling down the accelerator sector by sector in preparation for beams.   The third sector of the LHC to be cooled down - sector 1-2 - has seen the process begin this week. During the cool-down phase, survey teams are measuring and smoothing (or realigning) the magnets at cold. By the end of August, five sectors of the machine will be in the process of cooling down, with one (sector 6-7) at cold. The LHC Access Safety System (LASS) is now being commissioned, and will be validated during the DSO tests at the beginning of October. As teams consolidate the modifications made to LASS during the shutdown, many points were closed for testing purposes. The CSCM (copper stabiliser continuity measurement) tests have been completed in the first sector (6-7) and no defect has been found. These results will be presented to the LHC Machine Committee next week. CSCM tests will start in the second sector in mid-August. Following many...

  7. Parallel Calculations in LS-DYNA

    Science.gov (United States)

    Vartanovich Mkrtychev, Oleg; Aleksandrovich Reshetov, Andrey

    2017-11-01

    Nowadays, structural mechanics exhibits a trend towards numerical solutions of increasingly extensive and detailed problems, which requires the capacity of computing systems to be enhanced. Such enhancement can be achieved by different means. For example, if the computing system is a workstation, its components (CPU, memory, etc.) can be replaced or extended. In essence, such modification eventually entails replacement of the entire workstation, since replacement of certain components necessitates exchange of others (faster CPUs and memory devices require buses with higher throughput, etc.). Special consideration must be given to the capabilities of modern video cards, which constitute powerful computing systems capable of processing data in parallel. Interestingly, tools originally designed to render high-performance graphics can be applied to problems not immediately related to graphics (CUDA, OpenCL, shaders, etc.). However, not all software suites utilize video cards' capacities. Another way to increase the capacity of a computing system is to implement a cluster architecture: to add cluster nodes (workstations) and to increase the network communication speed between the nodes. The advantage of this approach is extensive growth: a quite powerful system can be obtained by combining not particularly powerful nodes, and separate nodes may possess different capacities. This paper considers the use of a clustered computing system for solving problems of structural mechanics with LS-DYNA software. To establish a range of dependencies, a mere two-node cluster proved sufficient.

  8. CMS outreach event to close LS1

    CERN Multimedia

    Achintya Rao

    2015-01-01

    CMS opened its doors to about 700 students from schools near CERN, who visited the detector on 16 and 17 February during the last major CMS outreach event of LS1.   Enthusiastic CMS guides spent a day and a half showing the equally enthusiastic visitors, aged 10 to 18, the beauty of CMS and particle physics. The recently installed wheelchair lift was called into action and enabled a visitor who arrived on crutches to access the detector cavern unimpeded.  The CMS collaboration had previously devoted a day to school visits after the successful “Neighbourhood Days” in May 2014 and, encouraged by the turnout, decided to extend an invitation to local schools once again. The complement of nearly 40 guides and crowd marshals was aided by a support team that coordinated the transportation of the young guests and received them at Point 5, where a dedicated safety team including first-aiders, security...

  9. LS1 Report: onwards and upwards

    CERN Multimedia

    Katarina Anthony

    2013-01-01

    For the first time since 2008, engineers have taken most of the LHC’s electromagnetic circuits up to the current needed for magnets to guide beams around the machine at the design energy of 7 TeV. This first phase of intensive tests has been instrumental for the planning of upcoming machine interventions.   All of the circuits in Sector 67 were powered to a 7 TeV equivalent current, with the main circuits (to be consolidated during LS1) powered at 4 TeV. Around 1700 magnet circuits are needed to circulate beams in the LHC. Come 2015, each and every one of these circuits will have to be able to accept their 7 TeV equivalent current. For the LHC’s 24 main dipole and quadrupole circuits, this will mean the consolidation of all their interconnections. But what about the rest of the LHC’s circuits that had been mostly operating at around 60% of the nominal value? How will they handle the ramp-up to design energy? Those questions were asked and answered during the rec...

  10. LS1 Report: Thank you magnetic horn!

    CERN Multimedia

    Antonella Del Rosso & Katarina Anthony

    2014-01-01

    Experiments at the Antimatter Decelerator (AD) have been receiving beams since the beginning of this week. There is a crucial element at the heart of the chain that prepares the antiproton beam: the so-called magnetic horn, a delicate piece of equipment that had to be refurbished during LS1 and that is now showing just how well it can perform.   View from the top of the target and horn trolley, along the direction of the beam. Antiprotons for the AD are produced by smashing a beam of protons from the PS onto an iridium target. However, the particles produced by the nuclear interactions are emitted at very wide angles; without a focussing element, all these precious particles would be lost. “A magnetic horn is placed at the exit of the target to focus back a large fraction of the negative particles, including antiprotons, parallel to the beam line and with the right momentum,” explains Marco Calviani, physicist in the EN Department and the expert in charge of the AD targe...

  11. LS1 Report: ALICE ups the ante

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    SPS up and running... LHC almost cold... CCC Operators back at their desks... all telltale signs of the start of Run 2! For the experiments, that means there are just a few short months left for them to prepare for beams. The CERN Bulletin will be checking in with each of the Big Four to see how they are getting on during these closing months...   It has been a long road for the ALICE LS1 team. From major improvements to the 19 sub-detectors to a full re-cabling and replacement of LEP-era electrical infrastructure, no part of the ALICE cavern has gone untouched.* With the experiment set to close in early December, the teams are making finishing touches before turning their focus towards re-commissioning and calibration. "Earlier this week, we installed the last two modules of the di-jet calorimeter," explains Werner Riegler, ALICE technical coordinator. "These are the final parts of a 60 degree calorimeter extension that is installed opposite the present calorimeter, c...

  12. LS1 Report: LHCb's early Christmas

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    Accelerator chain up and running... CCC Operators back at their desks... all telltale signs of the start of Run 2! For the experiments, that means there are just a few short weeks left for them to prepare for beams. Over at LHCb, teams have kept ahead of the curve by focusing on new installations and improvements.   A delicate task: re-connecting the beam pipe in LHCb. From the primary detector services to the DAQ system to the high level trigger, November's injector test beams saw their way through a well-prepared LHCb experiment. “We set the transfer line tests as our deadline for the restart - the entire experiment had to be at nominal position and conditions,” says Eric Thomas, LHCb deputy Technical Coordinator and LHCb LS1 Project Coordinator. “Achieving this was a major milestone for the collaboration. If beam were to come tomorrow, we would be ready.” The injector tests gave the LHCb team a chance to synchronise their detectors, and to al...

  13. TRISO fuel thermal simulations in the LS-VHTR

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Mario C.; Scari, Maria E.; Costa, Antonella L.; Pereira, Claubia; Veloso, Maria A.F., E-mail: marc5663@gmail.com, E-mail: melizabethscari@yahoo.com, E-mail: antonella@nuclear.ufmg.br, E-mail: claubia@nuclear.ufmg.br, E-mail: dora@nuclear.ufmg.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear; Instituto Nacional de Ciência e Tecnologia de Reatores Nucleares Inovadores/CNPq (Brazil)

    2017-07-01

    The liquid-salt-cooled very high-temperature reactor (LS-VHTR) is a reactor that presents very good characteristics in terms of energy production and safety. It uses as fuel TRISO particles immersed in a cylindrical graphite matrix called the fuel compact, graphite as moderator, and the liquid salt Li{sub 2}BeF{sub 4} (Flibe) as coolant. This work evaluates the thermal-hydraulic performance of the heat removal system and the reactor core by applying different simplifications to represent the reactor core and the fuel compact under steady-state conditions, starting the modeling from a single fuel element and ending with an entire core model developed in the RELAP5-3D code. Two models, homogeneous and non-homogeneous, were considered for the representation of the fuel compact, and different geometries of the heat structures were also considered. The aim of developing several models was to compare the thermal-hydraulic characteristics of a more economical, less discretized model with those of much more refined models, which lead to more complex analyses representing the effect of the TRISO particles in the fuel compact. The different results found, mainly for the core temperature distributions, are presented and discussed. (author)

  14. RELAP5/MOD2 code modifications to obtain better predictions for the once-through steam generator

    International Nuclear Information System (INIS)

    Blanchat, T.; Hassan, Y.

    1989-01-01

    The steam generator is a major component in pressurized water reactors. Predicting the response of a steam generator during both steady-state and transient conditions is essential in studying the thermal-hydraulic behavior of a nuclear reactor coolant system. Therefore, many analytical and experimental efforts have been performed to investigate the thermal-hydraulic behavior of the steam generators during operational and accident transients. The objective of this study is to predict the behavior of the secondary side of the once-through steam generator (OTSG) using the RELAP5/MOD2 computer code. Steady-state conditions were predicted with the current version of the RELAP5/MOD2 code and compared with experimental plant data. The code predictions consistently underpredict the degree of superheat. A new interface friction model has been implemented in a modified version of RELAP5/MOD2. This modification, along with changes to the flow regime transition criteria and the heat transfer correlations, correctly predicts the degree of superheat and matches plant data

  15. Phase-coded microwave signal generation based on a single electro-optical modulator and its application in accurate distance measurement.

    Science.gov (United States)

    Zhang, Fangzheng; Ge, Xiaozhong; Gao, Bindong; Pan, Shilong

    2015-08-24

    A novel scheme for photonic generation of a phase-coded microwave signal is proposed and its application in one-dimensional distance measurement is demonstrated. The proposed signal generator has a simple and compact structure based on a single dual-polarization modulator. Besides, the generated phase-coded signal is stable and free from DC and low-frequency backgrounds. An experiment is carried out. A 2 Gb/s phase-coded signal at 20 GHz is successfully generated, and the recovered phase information agrees well with the input 13-bit Barker code. To further investigate the performance of the proposed signal generator, its application in one-dimensional distance measurement is demonstrated. The measurement accuracy is better than 1.7 centimeters within a measurement range of ~2 meters. The experimental results verify the feasibility of the proposed phase-coded microwave signal generator and provide strong evidence to support its practical applications.
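
The 13-bit Barker code mentioned in the abstract is a standard sequence whose aperiodic autocorrelation sidelobes never exceed 1, which is why it is the usual choice for pulse compression and ranging. A minimal sketch of the code and the property that makes it useful; the photonic carrier and modulator details of the actual setup are omitted:

```python
# The 13-bit Barker sequence, written as BPSK symbols (+1 = phase 0, -1 = phase pi).
BARKER_13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]

def autocorrelation(code):
    """Aperiodic autocorrelation of a +/-1 sequence.
    For a Barker code the zero-lag peak equals the code length,
    while every sidelobe has magnitude at most 1."""
    n = len(code)
    return [sum(code[i] * code[i + k] for i in range(n - k)) for k in range(n)]

acf = autocorrelation(BARKER_13)
print("peak:", acf[0])                         # 13
print("max sidelobe:", max(map(abs, acf[1:]))) # 1
```

The 13:1 peak-to-sidelobe ratio is what allows the correlation receiver in a ranging system to resolve the target return sharply, which underlies the centimetre-level accuracy reported above.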

  16. Uncertainties in source term calculations generated by the ORIGEN2 computer code for Hanford Production Reactors

    International Nuclear Information System (INIS)

    Heeb, C.M.

    1991-03-01

    The ORIGEN2 computer code is the primary calculational tool for computing isotopic source terms for the Hanford Environmental Dose Reconstruction (HEDR) Project. The ORIGEN2 code computes the amounts of radionuclides that are created or remain in spent nuclear fuel after neutron irradiation and radioactive decay have occurred as a result of nuclear reactor operation. ORIGEN2 was chosen as the primary code for these calculations because it is widely used and accepted by the nuclear industry, both in the United States and the rest of the world. Its comprehensive library of over 1,600 nuclides includes any possible isotope of interest to the HEDR Project. It is important to evaluate the uncertainties expected from use of ORIGEN2 in the HEDR Project because these uncertainties may have a pivotal impact on the final accuracy and credibility of the results of the project. There are three primary sources of uncertainty in an ORIGEN2 calculation: basic nuclear data uncertainty in neutron cross sections, radioactive decay constants, energy per fission, and fission product yields; calculational uncertainty due to input data; and code uncertainties (i.e., numerical approximations, and neutron spectrum-averaged cross-section values from the code library). 15 refs., 5 figs., 5 tabs

  17. Development of the next generation code system as an engineering modelling language. 3. Study with prototyping. 2

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Chiba, Go; Kasahara, Naoto; Ishikawa, Makoto

    2004-04-01

    In fast reactor development, numerical simulation using analysis codes plays an important role in complementing theory and experiment. In order to efficiently advance the research and development of fast reactors, JNC promotes the development of a next generation simulation code (NGSC). This report describes the results of a prototyping study carried out for the conceptual design of the NGSC. In the context of the cooperative research with CEA (Commissariat a l'Energie Atomique) in France, a survey of several platforms for numerical analysis and an evaluation of the applicability of CEA's SALOME platform to the NGSC were carried out. The evaluation confirmed that SALOME satisfies the requirements of efficiency, openness, universality, extensibility and completeness set for the NGSC. In addition, it was confirmed that SALOME implements the control-layer concept required by the NGSC and is an important candidate platform for it. In the field of structural analysis, the prototype of the PRTS.NET code was reexamined from the viewpoint of class structure and input/output specification in order to improve data processing efficiency and maintainability. In the field of reactor physics analysis, a development test of a new code in C++ and a reuse test of an existing code written in Fortran were carried out with a view to utilizing SALOME for the NGSC. (author)

  18. Development of a new generation solid rocket motor ignition computer code

    Science.gov (United States)

    Foster, Winfred A., Jr.; Jenkins, Rhonald M.; Ciucci, Alessandro; Johnson, Shelby D.

    1994-01-01

    This report presents the results of experimental and numerical investigations of the flow field in the head-end star grain slots of the Space Shuttle Solid Rocket Motor. This work provided the basis for the development of an improved solid rocket motor ignition transient code which is also described in this report. The correlation between the experimental and numerical results is excellent and provides a firm basis for the development of a fully three-dimensional solid rocket motor ignition transient computer code.

  19. Automatic Generation of Agents using Reusable Soft Computing Code Libraries to develop Multi Agent System for Healthcare

    OpenAIRE

    Priti Srinivas Sajja

    2015-01-01

    This paper illustrates an architecture for a multi-agent system in the healthcare domain. The architecture is generic and designed in the form of multiple layers. One layer of the architecture contains many proactive, co-operative and intelligent agents such as a resource management agent, query agent, pattern detection agent and patient management agent. Another layer of the architecture is a collection of libraries to auto-generate code for agents using soft computing techni...

  20. FORIG: a computer code for calculating radionuclide generation and depletion in fusion and fission reactors. User's manual

    International Nuclear Information System (INIS)

    Blink, J.A.

    1985-03-01

    In this manual we describe the use of the FORIG computer code to solve isotope-generation and depletion problems in fusion and fission reactors. FORIG runs on a Cray-1 computer and accepts more extensive activation cross sections than ORIGEN2 from which it was adapted. This report is an updated and a combined version of the previous ORIGEN2 and FORIG manuals. 7 refs., 15 figs., 13 tabs
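
FORIG and ORIGEN-type codes solve coupled isotope generation-and-depletion equations for thousands of nuclides under irradiation and decay. The mathematics underneath can be sketched with the analytic Bateman solution for a hypothetical two-member decay chain; the decay constants below are made up for illustration, and real codes add neutron-induced production and destruction terms to a large matrix version of these equations:

```python
import math

def bateman_two(n1_0, lam1, lam2, t):
    """Analytic Bateman solution for a chain 1 -> 2 -> (stable),
    starting from pure parent (N2(0) = 0):
        N1(t) = N1(0) * exp(-lam1*t)
        N2(t) = N1(0) * lam1/(lam2 - lam1) * (exp(-lam1*t) - exp(-lam2*t))
    Assumes lam1 != lam2."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Hypothetical decay constants (per hour), purely for illustration.
n1, n2 = bateman_two(1.0e6, lam1=0.10, lam2=0.50, t=5.0)
print(f"parent: {n1:.3e}  daughter: {n2:.3e}")
```

Codes such as FORIG generalize this to full transition matrices with activation cross sections folded in, which is why library quality (as in the ORIGEN2 record above) dominates the accuracy of the result.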

  1. Verification of SIGACE code for generating ACE format cross-section files with continuous energy at high temperature

    International Nuclear Information System (INIS)

    Li Zhifeng; Yu Tao; Xie Jinsen; Qin Mian

    2012-01-01

    Based on the recently released ENDF/B-VII.1 library, high temperature neutron cross-section files are generated through the SIGACE code using low temperature ACE format files. To verify the ACE files processed by SIGACE, benchmark calculations are performed in this paper. The calculated results of selected ICT, standard CANDU assembly, LWR Doppler coefficient and SEFOR benchmarks conform well with the reference values, which indicates that high temperature ACE files processed by SIGACE can be used in related neutronics calculations. (authors)

  2. Development of the next generation code system as an engineering modeling language (1)

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Uto, Nariaki; Kasahara, Naoto; Nagura, Fuminori; Ishikawa, Makoto; Ohira, Masanori; Kato, Masayuki

    2002-11-01

    In fast reactor development, numerical simulation using analytical codes plays an important role in complementing theory and experiment. It is necessary that the engineering models and analysis methods can be flexibly changed, because the phenomena to be investigated become more complicated due to the diversity of research needs. Moreover, there are large problems in combining physical properties and engineering models from many different fields. In this study, the goal is to develop a flexible and general-purpose analysis system in which the physical properties and engineering models are represented as a programming language or diagrams that are easily understandable by humans and executable by computers. The authors named this concept the Engineering Modeling Language (EML). This report describes the results of an investigation of the latest computer technologies and software development techniques which appear usable for realizing the analysis code system for nuclear engineering as an EML. (author)

  3. PREP-PWR-1.0: a WIMS-D/4 pre-processor code for the generation of data for PWR fuel assemblies

    International Nuclear Information System (INIS)

    Ball, G.

    1991-06-01

    The PREP-PWR-1.0 computer code is a substantially modified version of the PREWIM code which formed part of the original MARIA System (Report J.E.N. 543). PREP-PWR-1.0 is a comprehensive pre-processor code which generates input data for the WIMS-D/4.1 code (Report PEL 294) for PWR fuel assemblies, with or without control and burnable poison rods. This data is generated at various base and off-base conditions. The overall cross section generation methodology is described, followed by a brief overview of the model. Aspects of the base/off-base calculational scheme are outlined. Additional features of the code are described while the input data format of PREP-PWR-1.0 is listed. The sample problems and suggestions for further improvements to the code are also described. 2 figs., 2 tabs., 12 refs

  4. User instructions for levelized power generation cost codes using an IBM-type PC

    International Nuclear Information System (INIS)

    Coen, J.J.; Delene, J.G.

    1989-01-01

    Programs for the calculation of levelized power generation costs using an IBM or compatible PC are described. Cost calculations for nuclear plants and coal-fired plants include capital investment cost, operation and maintenance cost, fuel cycle cost, decommissioning cost, and total levelized power generation cost. 7 refs., 36 figs., 4 tabs
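
A levelized power generation cost of the kind these programs compute is the present value of all lifetime costs divided by the present value of lifetime electricity generation. A minimal sketch with entirely hypothetical plant parameters; the actual codes break investment, O&M, fuel-cycle and decommissioning costs into far more detail:

```python
def levelized_cost(capital, annual_om, annual_fuel, annual_mwh, discount_rate, years):
    """Levelized power generation cost in $/MWh: present value of all costs
    divided by present value of energy generated. Capital is treated as an
    overnight cost at year 0; O&M and fuel recur annually (a deliberate
    simplification of the code described above)."""
    pv_costs = capital
    pv_energy = 0.0
    for y in range(1, years + 1):
        df = (1.0 + discount_rate) ** -y        # discount factor for year y
        pv_costs += (annual_om + annual_fuel) * df
        pv_energy += annual_mwh * df
    return pv_costs / pv_energy

# Illustrative (made-up) plant: $2.0e9 capital, $60M/yr O&M, $40M/yr fuel,
# 7 TWh/yr output, 5% discount rate, 30-year life.
lcoe = levelized_cost(2.0e9, 6.0e7, 4.0e7, 7.0e6, 0.05, 30)
print(f"LCOE = ${lcoe:.1f}/MWh")
```

Discounting the energy as well as the costs is the defining feature of the levelized-cost method: it makes the result the constant price per MWh that would exactly recover all costs over the plant's life.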

  5. Experimental benchmark and code validation for airfoils equipped with passive vortex generators

    DEFF Research Database (Denmark)

    Baldacchino, D.; Manolesos, M.; Ferreira, Célia Maria Dias

    2016-01-01

    Experimental results and complementary computations for airfoils with vortex generators are compared in this paper, as part of an effort within the AVATAR project to develop tools for wind turbine blade control devices. Measurements from two airfoils equipped with passive vortex generators, a 30...

  6. Characteristics of CdLS (Cornelia de Lange Syndrome)

    Science.gov (United States)

    ... 25 percent of individuals with CdLS. Behavioral and communication issues and developmental delays often exist. Major Characteristics ...

  7. Model tests of a once-through steam generator for lane-blocker assessment and THEDA code verification. Final report

    International Nuclear Information System (INIS)

    Carter, H.R.; Childerson, M.T.; Moskal, T.E.

    1983-06-01

    The Babcock and Wilcox Company (B and W) operating Once-Through Steam Generators (OTSGs) have experienced leaking tubes in a region adjacent to the untubed inspection lane. The tube leaks have been attributed to an environmentally-assisted fatigue mechanism with moisture transported up the inspection lane being a major factor in the tube-failure process. B and W has developed a hardware modification (lane blockers) to mitigate the detrimental effects of inspection lane moisture. A 30-tube Laboratory Once-through Steam Generator (Designated OTSGC) was designed, fabricated, and tested. Tests were performed with and without five flat-plate lane blockers installed on tube-support plates (TSPs) 10, 11, 12, 13, and 14. The test results were utilized to determine the effectiveness of lane blockers for eliminating moisture transport to the upper tubesheet in the inspection lanes and to benchmark the predictive capabilities of a three-dimensional steam-generator computer code, THEDA

  8. Migros-3: a code for the generation of group constants for reactor calculations from neutron nuclear data in KEDAK format

    International Nuclear Information System (INIS)

    Broeders, I.; Krieg, B.

    1977-01-01

    The code MIGROS-3 was developed from MIGROS-2. The main advantage of MIGROS-3 is its compatibility with the new conventions of the latest version of the Karlsruhe nuclear data library, KEDAK-3. Moreover, to some extent refined physical models were used and numerical methods were improved. MIGROS-3 allows the calculation of microscopic group cross sections of the ABBN type from isotopic neutron data given in KEDAK format. All group constants necessary for diffusion, consistent P1 and Ssub(N) calculations can be generated. Anisotropy of elastic scattering can be taken into account up to P5. A description of the code and the underlying theory is given. The input and output description, a sample problem and the program lists are provided. (orig.) [de

  9. Range and railgun development results at LS and PA ''Soyuz''

    International Nuclear Information System (INIS)

    Babakov, Y.P.; Plekhanov, A.V.; Zheleznyi, V.B.

    1995-01-01

    A rail electromagnetic accelerator is one of the most reliable and simple devices for accelerating macroparticles up to high velocities. These accelerators allow scientists to carry out fundamental and applied investigations to study both the equation of state of materials at high pressure, produced by high-velocity impacts, and the creation of conditions for shock thermonuclear fusion. A test range was created in the ''Energophyzika'' department of LS and PA ''Soyuz''. It was equipped with an inductor with storage capacity up to 12.5 MJ, energized by a solid propellant MHD generator, and a capacitor bank (energy capacity up to 6 MJ). These systems deliver currents of 1 MA and 2 MA, respectively. Diagnostic, recording, and automatic calculation systems allow the use of as many as 120 data channels with acquisition frequency up to 10 MHz. Recent technical successes in railgun construction - special methods to compact the plasma armature and produce a high-velocity trailing contact, the creation of hybrid armatures, and the optimization of acceleration - made it possible to attain velocities of 6.2 to 6.8 km/s (masses of 3.8 to 10 g) and of 2.7 to 3.8 km/s (masses of 50 to 100 g) on railguns 2 to 4 m in length

  10. Development of nuclear decay data library JDDL, and nuclear generation and decay calculation code COMRAD

    International Nuclear Information System (INIS)

    Naito, Yoshitaka; Ihara, Hitoshi; Katakura, Jun-ichi; Hara, Toshiharu.

    1986-08-01

    For the safety evaluation of nuclear fuel facilities, a nuclear decay data library named JDDL and a computer code COMRAD have been developed to calculate the isotopic composition of each nuclide, the radiation source intensity, the energy spectra of γ-rays and neutrons, and the decay heat of spent fuel. JDDL has been produced mainly from the evaluated nuclear data file ENSDF in order to use new nuclear data. To supplement the data file for short-lived nuclides, the JNDC data set evaluated by the Japan Nuclear Data Committee was also used. Using these data, calculations are possible from short to long periods after irradiation. (author)

  11. LS1 Report: alive and kicking!

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    Following eleven months of meticulous maintenance and consolidation works, the LHC's extraction kicker magnets (MKDs) and their pulse generators are back in the accelerator for a new phase of tests. Used to dump the beam, these kicker magnets are essential for the safety of the machine.   Pulse generators for the extraction kicker magnets at Point 6. The high voltage cables leading to the magnets can be seen in red. The LHC's kicker magnets are something rather special. Unlike most of the accelerator's extraction magnets, they only operate for a short period of time and focus on providing a quick "kick" to deflect the beam. In fact, they are permanently under voltage to be ready to go, and have only 3 microseconds in order to establish their kicking pulse! This means they have to be very powerful - with the help of their own high-powered pulse generators - and extremely well in synch - with the help of control and electronic specialists. "Du...

  12. Status report on multigroup cross section generation code development for high-fidelity deterministic neutronics simulation system

    International Nuclear Information System (INIS)

    Yang, W.S.; Lee, C.H.

    2008-01-01

    Under the fast reactor simulation program launched in April 2007, development of an advanced multigroup cross section generation code was initiated in July 2007, in conjunction with the development of the high-fidelity deterministic neutron transport code UNIC. The general objectives are to simplify the existing multi-step schemes and to improve the resolved and unresolved resonance treatments. Based on the review results of current methods and the fact that they have been applied successfully to fast critical experiment analyses and fast reactor designs for the last three decades, the methodologies of the ETOE-2/MC²-2/SDX code system were selected as the starting set of methodologies for multigroup cross section generation for fast reactor analysis. As the first step for coupling with the UNIC code and use in a parallel computing environment, the MC²-2 code was updated by modernizing the memory structure and replacing old data management package subroutines and functions with FORTRAN 90 based routines. Various modifications were also made in the ETOE-2 and MC²-2 codes to process the ENDF/B-VII.0 data properly. Using the updated ETOE-2/MC²-2 code system, the ENDF/B-VII.0 data was successfully processed for major heavy and intermediate nuclides employed in sodium-cooled fast reactors. Initial verification tests of the MC²-2 libraries generated from ENDF/B-VII.0 data were performed by inter-comparison of twenty-one group infinite dilute total cross sections obtained from MC²-2, VIM, and NJOY. For almost all nuclides considered, MC²-2 cross sections agreed very well with those from VIM and NJOY. Preliminary validation tests of the ENDF/B-VII.0 libraries of MC²-2 were also performed using a set of sixteen fast critical benchmark problems. The deterministic results based on MC²-2/TWODANT calculations were in good agreement with MCNP solutions within ∼0.25% Δρ, except a few small LANL fast assemblies. Relative to the MCNP solution, the MC²-2/TWODANT
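
The core operation of any multigroup cross-section generation code is condensation: fine-group cross sections are collapsed into broad groups using a weighting flux, sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g) over the fine groups g in broad group G. A minimal sketch with made-up numbers; real systems of this kind also treat resonance self-shielding and spatial weighting, which this toy example ignores:

```python
def collapse(fine_xs, fine_flux, groups):
    """Flux-weighted collapse of fine-group cross sections into broad groups:
        sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g),  g in G
    `groups` lists (start, end) fine-group index ranges, one per broad group."""
    broad = []
    for start, end in groups:
        flux = sum(fine_flux[start:end])
        weighted = sum(s * p for s, p in zip(fine_xs[start:end], fine_flux[start:end]))
        broad.append(weighted / flux)
    return broad

# Hypothetical 6-fine-group data collapsed into 2 broad groups.
xs   = [2.0, 2.5, 3.0, 8.0, 12.0, 20.0]   # cross sections, barns
flux = [4.0, 3.0, 2.0, 1.0, 0.5, 0.25]    # weighting flux, arbitrary units
broad_xs = collapse(xs, flux, [(0, 3), (3, 6)])
print(broad_xs)
```

Because the broad-group value depends on the weighting flux, the choice of spectrum (and its self-shielded resonance structure) is what the verification against continuous-energy codes like VIM and MCNP ultimately tests.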

  13. Status report on multigroup cross section generation code development for high-fidelity deterministic neutronics simulation system.

    Energy Technology Data Exchange (ETDEWEB)

    Yang, W. S.; Lee, C. H. (Nuclear Engineering Division)

    2008-05-16

    Under the fast reactor simulation program launched in April 2007, development of an advanced multigroup cross section generation code was initiated in July 2007, in conjunction with the development of the high-fidelity deterministic neutron transport code UNIC. The general objectives are to simplify the existing multi-step schemes and to improve the resolved and unresolved resonance treatments. Based on the review results of current methods and the fact that they have been applied successfully to fast critical experiment analyses and fast reactor designs for the last three decades, the methodologies of the ETOE-2/MC²-2/SDX code system were selected as the starting set of methodologies for multigroup cross section generation for fast reactor analysis. As the first step for coupling with the UNIC code and use in a parallel computing environment, the MC²-2 code was updated by modernizing the memory structure and replacing old data management package subroutines and functions with FORTRAN 90 based routines. Various modifications were also made in the ETOE-2 and MC²-2 codes to process the ENDF/B-VII.0 data properly. Using the updated ETOE-2/MC²-2 code system, the ENDF/B-VII.0 data was successfully processed for major heavy and intermediate nuclides employed in sodium-cooled fast reactors. Initial verification tests of the MC²-2 libraries generated from ENDF/B-VII.0 data were performed by inter-comparison of twenty-one-group infinite dilute total cross sections obtained from MC²-2, VIM, and NJOY. For almost all nuclides considered, MC²-2 cross sections agreed very well with those from VIM and NJOY. Preliminary validation tests of the ENDF/B-VII.0 libraries of MC²-2 were also performed using a set of sixteen fast critical benchmark problems. The deterministic results based on MC²-2/TWODANT calculations were in good agreement with MCNP solutions within ∼0.25% Δρ, except for a few small LANL fast assemblies

  14. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept is aimed at automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  15. ORIGEN-2.2, Isotope Generation and Depletion Code Matrix Exponential Method

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of problem or function: ORIGEN is a computer code system for calculating the buildup, decay, and processing of radioactive materials. ORIGEN2 is a revised version of ORIGEN and incorporates updates of the reactor models, cross sections, fission product yields, decay data, and decay photon data, as well as the source code. ORIGEN-2.1 replaces ORIGEN and includes additional libraries for standard and extended-burnup PWR and BWR calculations, which are documented in ORNL/TM-11018. ORIGEN2.1 was first released in August 1991 and was replaced with ORIGEN2 Version 2.2 in June 2002. Version 2.2 was the first update to ORIGEN2 in over 10 years and was stimulated by a user discovering a discrepancy in the mass of fission products calculated using ORIGEN2 V2.1. Code modifications, as well as reducing the irradiation time step to no more than 100 days/step, reduced the discrepancy from ∼10% to 0.16%. The bug does not noticeably affect the fission product mass in typical ORIGEN2 calculations involving reactor fuels because essentially all of the fissions come from actinides that have explicit fission product yield libraries. Thus, most previous ORIGEN2 calculations that were otherwise set up properly should not be affected. 2 - Method of solution: ORIGEN uses a matrix exponential method to solve a large system of coupled, linear, first-order ordinary differential equations with constant coefficients. ORIGEN2 has been variably dimensioned to allow the user to tailor the size of the executable module to the problem size and/or the available computer space. Dimensioned arrays have been set large enough to handle almost any size problem, using virtual memory capabilities available on most mainframes and 386/486-based PCs. 
The user is provided with much of the framework necessary to put some of the arrays to several different uses, call for the subroutines that perform the desired operations, and provide a mechanism to execute multiple ORIGEN2 problems with a single
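The matrix exponential method mentioned above solves the coupled depletion equations dN/dt = A·N as N(t) = exp(A·t)·N(0). A minimal sketch of the idea for a hypothetical two-step decay chain (made-up decay constants, not ORIGEN's data or implementation):

```python
import numpy as np

def expm_series(M, terms=60):
    """Matrix exponential by a truncated Taylor series; adequate here
    because the entries of M = A*t are of order one."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

# Hypothetical decay chain A -> B -> C (stable), illustrating the matrix
# exponential solution N(t) = exp(A*t) N(0) of dN/dt = A N.
lam_a, lam_b = 1.0e-3, 5.0e-4      # made-up decay constants [1/s]
A = np.array([[-lam_a,    0.0, 0.0],
              [ lam_a, -lam_b, 0.0],
              [   0.0,  lam_b, 0.0]])
n0 = np.array([1.0e6, 0.0, 0.0])   # initial atom inventory
t = 3600.0                          # one hour
n_t = expm_series(A * t) @ n0
# Total atoms are conserved because every loss term feeds a daughter.
```

The columns of A sum to zero, so the transformation conserves total atom count, which is a useful sanity check on any depletion solver.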

  16. Program EAGLE User’s Manual. Volume 3. Grid Generation Code

    Science.gov (United States)

    1988-09-01

    ...in principle it is possible to establish a correspondence between any physical region and a single empty rectangular block for general three... differences. Since this second surrounding layer is not involved in the grid generation, no further account will be taken of its presence in the present

  17. A robust SRAM-PUF key generation scheme based on polar codes

    NARCIS (Netherlands)

    Chen, Bin; Ignatenko, Tanya; Willems, Frans M.J.; Maes, Roel; van der Sluis, Erik; Selimis, Georgios

    2017-01-01

    Physical unclonable functions (PUFs) are relatively new security primitives used for device authentication and device-specific secret key generation. In this paper we focus on SRAM- PUFs. The SRAM-PUFs enjoy uniqueness and randomness properties stemming from the intrinsic randomness of SRAM memory

  18. Generation of one energy group cross section library with MC2 computer code

    International Nuclear Information System (INIS)

    Cunha Menezes Filho, A. da; Souza, A.L. de.

    1982-01-01

    One-group temperature-dependent cross sections are generated via MC² for Pu-242, Ni-58, Fe-56, U-235, U-238, Pu-239, Pu-240, Pu-241, Be-9 and Th-232. The influence of the buckling and the weighting functions is studied through calculations of an important integral parameter: the critical radius. (author) [pt

  19. Development of a methodology to generate materials constant for the FLARE-G computer code

    International Nuclear Information System (INIS)

    Martinez, A.S.; Rosier, C.J.; Schirru, R.; Silva, F.C. da; Thome Filho, Z.D.

    1983-01-01

    A calculation methodology for determining the parametrization constants of the multiplication factor and migration area is presented. These physical parameters are necessary in the solution of the diffusion equation with the nodal method, and they represent the appropriate form of the macrogroup constants in the cell calculation. An automatic system was developed to generate the parametrization constants. (E.G.) [pt

  20. State-of-the-art of wind turbine design codes: main features overview for cost-effective generation

    Energy Technology Data Exchange (ETDEWEB)

    Molenaar, D-P.; Dijkstra, S. [Delft University of Technology (Netherlands). Mechanical Engineering Systems and Control Group

    1999-07-01

    For successful large-scale application of wind energy, the price of electricity generated by wind turbines should decrease. Model-based control can be important since it has the potential to reduce fatigue loads, while simultaneously maintaining a desired amount of energy production. The controller synthesis, however, requires a mathematical model describing the most important dynamics of the complete wind turbine. In the wind energy community there is a wide variety in codes used to model a wind turbine's dynamic behaviour or to carry out design calculations. In this paper, the main features of the state-of-the-art wind turbine design codes have been investigated in order to judge the appropriateness of using one of these for the modeling, identification and control of flexible, variable speed wind turbines. It can be concluded that, although the sophistication of the design codes has increased enormously over the last two decades, they are, in general, not suitable for the design, and easy implementation of optimal operating strategies.

  1. Generation of initial geometries for the simulation of the physical system in the DualSPHysics code

    International Nuclear Information System (INIS)

    Segura Q, E.

    2013-01-01

    In the diverse research areas of the Instituto Nacional de Investigaciones Nucleares (ININ) there are various activities related to science and technology; one of great interest is the study and treatment of the collection and storage of radioactive waste. The ININ project on the simulation of pollutant diffusion in the soil through a porous medium (third stage) therefore requires, as a first step, the generation of the initial geometry of the physical system. The simulation is carried out with the smoothed particle hydrodynamics (SPH) method, implemented in the DualSPHysics code, which has great versatility and the ability to simulate phenomena of any physical system in which hydrodynamic aspects combine. In order to simulate a physical system with the DualSPHysics code, the initial geometry of the system of interest must be preset and then included in the input file of the code. The simulation sets the initial geometry through regular geometric bodies positioned at different points in space; this is done through a programming language (Fortran, C++, Java, etc.). This methodology will provide the basis for simulating more complex geometries and positions in the future. (Author)
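Presetting an initial geometry of regular bodies for an SPH run amounts to filling each body with particles on a regular lattice. A minimal sketch (the function name and spacing are illustrative, not DualSPHysics' actual preprocessing API):

```python
import numpy as np

def fill_box(origin, size, dp):
    """Fill an axis-aligned box with SPH particles on a regular lattice.

    origin: (x, y, z) lower corner; size: (lx, ly, lz); dp: particle
    spacing. Returns an (N, 3) array of particle positions -- the kind of
    initial geometry described by regular bodies in an SPH input file.
    """
    axes = [np.arange(o + dp / 2, o + s, dp) for o, s in zip(origin, size)]
    xx, yy, zz = np.meshgrid(*axes, indexing="ij")
    return np.column_stack([xx.ravel(), yy.ravel(), zz.ravel()])

# A 1.0 x 0.5 x 0.5 m block of fluid at 0.1 m spacing: 10 x 5 x 5 lattice.
particles = fill_box((0.0, 0.0, 0.0), (1.0, 0.5, 0.5), dp=0.1)
```

More complex geometries follow the same pattern: generate lattices for each regular body, then concatenate and de-duplicate the position arrays.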

  2. Field programmable gate array (FPGA) implementation of novel complex PN-code-generator-based data scrambler and descrambler

    Directory of Open Access Journals (Sweden)

    Shabir A. Parah

    2010-04-01

    Full Text Available A novel technique for the generation of complex and lengthy code sequences using low-length linear feedback shift registers (LFSRs) for data scrambling and descrambling is proposed. The scheme has been implemented using a VHSIC hardware description language (VHDL) approach, which allows the reconfigurability of the proposed system such that the length of the generated sequences can be changed as per the security requirements. In the present design the power consumption and chip area requirements are small and the operating speed is high compared to conventional discrete I.C. design, which is a pre-requisite for any system designer. The design has been synthesised on device EP2S15F484C3 of the Stratix II FPGA family, using Altera Quartus version 8.1. The simulation results have been found satisfactory and are in conformity with the theoretical observations.

  3. Simulation of sludge deposit onto a 900 MW steam generator tubesheet with the 3D code GENEPI

    International Nuclear Information System (INIS)

    Pascal-Ribot, S.; Debec-Mathet, E.; Soussan, D.; Grandotto, M.

    1998-01-01

    Heat transfer processes use fluids which are generally not pure and can react with transfer surfaces. These surfaces are subject to deposits which can be sediments harmful to heat transfer and to the integrity of materials. For nuclear plant steam generators, sludge build-up accelerates secondary side corrosion by concentrating chemical species. A major safety problem involved with such corrosion is the growth of circumferential cracks, which are very difficult to detect and size with eddy current probes. With a view to understanding and controlling this problem, it is necessary to develop a mathematical model for the prediction of sludge behavior in PWR steam generators. Based on fundamental principles, this work intends to use different models available in the literature for the prediction of the phenomena leading to the accumulation of sludge particles at the bottom (the tubesheet) of a PWR. For that, a three-dimensional simulation of magnetite particulate fouling with the finite elements code GENEPI is performed on a 900 MWe steam generator. The use of the GENEPI code, originally designed and qualified for the analysis of steam generator thermalhydraulics, is done in two steps. First, the local thermalhydraulic conditions of the carrier phase are calculated with the classical conservation equations of mass, momentum and enthalpy for the steam/water mixture (homogeneous model). Then, they are used for the solving of a particle transport equation. The mass transfer processes, which have been taken into account, are gravitational settling, sticking probability and reentrainment, describing respectively the transport of sludge particles to the tubesheet, the particle attachment to this surface and the re-suspension of deposited particles from the tubesheet. A sink term characterizing the blowdown effect is also considered in the calculations. Deposition on the tube bundle surface area is not modelled. 
For this first approach, the simulation is made with a single particle size and
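The three mass-transfer processes named above (gravitational settling with a sticking probability, plus re-entrainment once the wall shear stress exceeds a critical value) combine into a net deposit growth rate. A minimal sketch with placeholder closure coefficients, not GENEPI's actual correlations:

```python
def net_deposit_rate(conc, v_settle, p_stick, m_dep, k_resusp,
                     tau_wall, tau_crit):
    """Net sludge deposit growth rate on the tubesheet [kg/m^2/s].

    Deposition = settling flux (conc * v_settle) times a sticking
    probability; re-entrainment removes existing deposit mass m_dep only
    when the wall shear stress tau_wall exceeds the critical value
    tau_crit. All coefficients here are illustrative placeholders.
    """
    deposition = conc * v_settle * p_stick
    resuspension = k_resusp * m_dep * max(0.0, tau_wall / tau_crit - 1.0)
    return deposition - resuspension

# Example: below the critical shear stress nothing re-entrains, so the
# deposit grows at the sticking-weighted settling flux.
rate = net_deposit_rate(conc=5.0, v_settle=1.0e-4, p_stick=0.5,
                        m_dep=0.1, k_resusp=1.0e-3,
                        tau_wall=0.5, tau_crit=1.0)
```

In a 3D code this balance is evaluated per tubesheet cell from the local thermalhydraulic solution, with a further sink term for blowdown.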

  4. A comparative study of the parabolized Navier-Stokes code using various grid-generation techniques

    Science.gov (United States)

    Kaul, U. K.; Chaussee, D. S.

    1985-01-01

    The parabolized Navier-Stokes (PNS) equations are used to calculate the flow-field characteristics about the hypersonic research aircraft X-24C. A comparison of the results obtained using elliptic, hyperbolic and algebraic grid generators is presented. The outer bow shock is treated as a sharp discontinuity, and the discontinuities within the shock layer are captured. Surface pressures and heat-transfer results at angles of attack of 6 deg and 20 deg, obtained using the three grid generators, are compared. The PNS equations are marched downstream over the body in both Cartesian and cylindrical base coordinate systems, and the results are compared. A robust marching procedure is demonstrated by successfully using large marching-step sizes with the implicit shock fitting procedure. A correlation is found between the marching-step size, Reynolds number and the angle of attack at fixed values of smoothing and stability coefficients for the marching scheme.

  5. A Robust SRAM-PUF Key Generation Scheme Based on Polar Codes

    OpenAIRE

    Chen, Bin; Ignatenko, Tanya; Willems, Frans M. J.; Maes, Roel; van der Sluis, Erik; Selimis, Georgios

    2017-01-01

    Physical unclonable functions (PUFs) are relatively new security primitives used for device authentication and device-specific secret key generation. In this paper we focus on SRAM-PUFs. The SRAM-PUFs enjoy uniqueness and randomness properties stemming from the intrinsic randomness of SRAM memory cells, which is a result of manufacturing variations. This randomness can be translated into the cryptographic keys thus avoiding the need to store and manage the device cryptographic keys. Therefore...
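Such key-generation schemes are usually built as a code-offset (fuzzy extractor) construction: the helper data is the XOR of the PUF response with a codeword, and decoding removes the noise in a re-measured response. A minimal sketch, with a simple repetition code standing in for the polar codes of the paper:

```python
import secrets

R = 5  # repetition factor; the paper uses polar codes, this is a stand-in

def rep_encode(bits):
    """Repetition-code encoder: repeat each key bit R times."""
    return [b for b in bits for _ in range(R)]

def rep_decode(bits):
    """Majority-vote decoder over each block of R bits."""
    return [1 if sum(bits[i:i + R]) > R // 2 else 0
            for i in range(0, len(bits), R)]

def enroll(puf_response, key):
    """Enrollment: helper data = PUF response XOR codeword(key)."""
    return [p ^ c for p, c in zip(puf_response, rep_encode(key))]

def reconstruct(noisy_response, helper):
    """Reconstruction: decode (noisy response XOR helper) back to key."""
    return rep_decode([p ^ h for p, h in zip(noisy_response, helper)])

key = [1, 0, 1, 1]
puf = [secrets.randbits(1) for _ in range(len(key) * R)]  # SRAM power-up bits
helper = enroll(puf, key)
noisy = list(puf)
noisy[3] ^= 1
noisy[9] ^= 1            # two SRAM cells flip between power-ups
assert reconstruct(noisy, helper) == key
```

The helper data leaks nothing about the key beyond the code's redundancy; the practical engineering lies in choosing a code (polar codes here) whose rate and error-correcting capability match the SRAM bit-error statistics.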

  6. Development of a simple detector response function generation program: The CEARDRFs code

    Energy Technology Data Exchange (ETDEWEB)

    Wang Jiaxin, E-mail: jwang3@ncsu.edu [Center for Engineering Applications of Radioisotopes (CEAR), Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States); Wang Zhijian; Peeples, Johanna [Center for Engineering Applications of Radioisotopes (CEAR), Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States); Yu Huawei [Center for Engineering Applications of Radioisotopes (CEAR), Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States); College of Geo-Resources and Information, China University of Petroleum, Qingdao, Shandong 266555 (China); Gardner, Robin P. [Center for Engineering Applications of Radioisotopes (CEAR), Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States)

    2012-07-15

    A simple Monte Carlo program named CEARDRFs has been developed to generate very accurate detector response functions (DRFs) for scintillation detectors. It utilizes relatively rigorous gamma-ray transport with simple electron transport, and accounts for two phenomena that have rarely been treated: scintillator non-linearity and the variable flat continuum part of the DRF. It has been proven that these physics and treatments work well for 3×3″ and 6×6″ cylindrical NaI detectors in CEAR's previous work. Now this approach has been expanded to cover more scintillation detectors with various common shapes and sizes. Benchmark experiments with a 2×2″ cylindrical BGO detector and a 2×4×16″ rectangular NaI detector have been carried out at CEAR with various radioactive sources. The simulation results of CEARDRFs have also been compared with MCNP5 calculations. The benchmark and comparison show that CEARDRFs can generate very accurate DRFs (more accurate than MCNP5) at a very fast speed (hundreds of times faster than MCNP5). The use of this program can significantly increase the accuracy of applications relying on detector spectroscopy like prompt gamma-ray neutron activation analysis, X-ray fluorescence analysis, oil well logging and homeland security. - Highlights: ► CEARDRFs has been developed to generate detector response functions (DRFs) for scintillation detectors. ► Generated DRFs are very accurate. ► Simulation speed is hundreds of times faster than MCNP5. ► It utilizes rigorous gamma-ray transport with simple electron transport. ► It also accounts for scintillator non-linearity and the variable flat continuum part.

  7. Development of a simple detector response function generation program: The CEARDRFs code

    International Nuclear Information System (INIS)

    Wang Jiaxin; Wang Zhijian; Peeples, Johanna; Yu Huawei; Gardner, Robin P.

    2012-01-01

    A simple Monte Carlo program named CEARDRFs has been developed to generate very accurate detector response functions (DRFs) for scintillation detectors. It utilizes relatively rigorous gamma-ray transport with simple electron transport, and accounts for two phenomena that have rarely been treated: scintillator non-linearity and the variable flat continuum part of the DRF. It has been proven that these physics and treatments work well for 3×3″ and 6×6″ cylindrical NaI detectors in CEAR's previous work. Now this approach has been expanded to cover more scintillation detectors with various common shapes and sizes. Benchmark experiments with a 2×2″ cylindrical BGO detector and a 2×4×16″ rectangular NaI detector have been carried out at CEAR with various radioactive sources. The simulation results of CEARDRFs have also been compared with MCNP5 calculations. The benchmark and comparison show that CEARDRFs can generate very accurate DRFs (more accurate than MCNP5) at a very fast speed (hundreds of times faster than MCNP5). The use of this program can significantly increase the accuracy of applications relying on detector spectroscopy like prompt gamma-ray neutron activation analysis, X-ray fluorescence analysis, oil well logging and homeland security. - Highlights: ► CEARDRFs has been developed to generate detector response functions (DRFs) for scintillation detectors. ► Generated DRFs are very accurate. ► Simulation speed is hundreds of times faster than MCNP5. ► It utilizes rigorous gamma-ray transport with simple electron transport. ► It also accounts for scintillator non-linearity and the variable flat continuum part.

  8. The neural coding of creative idea generation across adolescence and early adulthood

    Directory of Open Access Journals (Sweden)

    Sietske Kleibeuker

    2013-12-01

    Full Text Available Creativity is considered key to human prosperity, yet the neurocognitive principles underlying creative performance, and their development, are still poorly understood. To fill this void, we examined the neural correlates of divergent thinking in adults (25-30 yrs) and adolescents (15-17 yrs). Participants generated alternative uses (AU) or ordinary characteristics (OC) for common objects while brain activity was assessed using fMRI. Adults outperformed adolescents on the number of solutions for AU and OC trials. Contrasting neural activity for AU with OC trials revealed increased recruitment of left angular gyrus, left supramarginal gyrus, and bilateral middle temporal gyrus in both adults and adolescents. When only trials with multiple alternative uses were included in the analysis, participants showed additional left inferior frontal gyrus (IFG)/middle frontal gyrus (MFG) activation for AU compared to OC trials. Correspondingly, individual difference analyses showed a positive correlation between activations for AU relative to OC trials in left IFG/MFG and divergent thinking performance, and activations were more pronounced in adults than in adolescents. Taken together, the results of this study demonstrated that creative idea generation involves recruitment of mainly left-lateralized parietal and temporal brain regions. Generating multiple creative ideas, a hallmark of divergent thinking, shows additional lateral PFC activation that is not yet optimized in adolescence.

  9. Conception and development of an adaptive energy mesher for multigroup library generation of the transport codes

    International Nuclear Information System (INIS)

    Mosca, P.

    2009-12-01

    The deterministic transport codes solve the stationary Boltzmann equation in a discretized energy formalism called multigroup. The transformation of continuous data into multigroup form is obtained by averaging the highly variable cross sections of the resonant isotopes with the solution of the self-shielding models, and the remaining ones with the coarse energy spectrum of the reactor type. So far the error of such an approach could only be evaluated retrospectively. To remedy this, we studied in this thesis a set of methods to control a priori the accuracy and the cost of the multigroup transport computation. The energy mesh optimisation is achieved using a two-step process: the creation of a reference mesh and its optimized condensation. In the first stage, by refining the energy mesh locally and globally, we seek, on a fine energy mesh with subgroup self-shielding, a solution equivalent to a reference solver (Monte Carlo or pointwise deterministic solver). In the second step, once the number of groups is fixed, depending on the acceptable computational cost, and the self-shielding models most appropriate to the reactor type are chosen, we look for the best bounds of the reference mesh, minimizing reaction-rate errors with the particle swarm optimization algorithm. This new approach allows us to define new meshes for fast reactors as accurate as the currently used ones, but with fewer groups. (author)
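The particle swarm optimization step above can be illustrated with a minimal swarm; the quadratic objective below is only a placeholder for the thesis's reaction-rate error as a function of group-boundary positions:

```python
import random

def pso(objective, lo, hi, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization over a box [lo, hi]^dim.

    Each particle tracks its personal best; the swarm tracks a global
    best; velocities blend inertia with attraction to both bests.
    """
    random.seed(1)  # reproducible demo run
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            f = objective(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Placeholder objective with known minimum at (2, 2, 2).
best, err = pso(lambda x: sum((xi - 2.0) ** 2 for xi in x),
                0.0, 10.0, dim=3)
```

In the thesis's setting, `x` would be the candidate group boundaries drawn from the reference mesh and `objective` the reaction-rate error against the reference solution.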

  10. Update on the OPAL opacity code

    International Nuclear Information System (INIS)

    Rogers, F.J.; Iglesias, C.A.; Wilson, B.G.

    1990-01-01

    Persisting discrepancies between theory and observation in a number of astrophysical properties have led to the conjecture that opacity databases may be inaccurate. The OPAL opacity code has been developed to address this question. The physical basis of OPAL removes several of the approximations present in past calculations. For example, it utilizes a much larger and more detailed set of atomic data than was used to construct the Los Alamos Astrophysical Library. This data is generated online, in LS or intermediate coupling, from prefitted analytic effective potentials, and is of quality similar to single-configuration, relativistic, self-consistent-field calculations. The OPAL code has been used to calculate opacities for the solar core and for Cepheid variable stars. In both cases, significant increases in the opacity compared to the Los Alamos Astrophysical Library were found.

  11. Development of a 1D thermal-hydraulic analysis code for once-through steam generator in SMRs using straight tubes

    Energy Technology Data Exchange (ETDEWEB)

    Park, Youngjae; Kim, Iljin; Kim, Hyungdae [Kyung Hee University, Yongin (Korea, Republic of)

    2015-10-15

    Diverse integral/small-modular reactors (SMRs) have been developed. A once-through steam generator (OTSG), which generates superheated steam without a steam separator and dryer, was used in the SMRs to reduce the volume of the steam generator. It would be possible to design a new steam generator with best-estimate thermal-hydraulic codes such as RELAP and MARS. However, it is not convenient to use a general-purpose thermal-hydraulic analysis code to design a specific component of nuclear power plants. A widely used simulation tool for thermal-hydraulic analysis of drum-type steam generators is ATHOS, which allows 3D analysis. On the other hand, a simple 1D thermal-hydraulic analysis code might be accurate enough for the conceptual design of an OTSG. In this study, a thermal-hydraulic analysis code for conceptual design of an OTSG was developed using a 1D homogeneous equilibrium model (HEM). A benchmark calculation was also conducted to verify and validate the prediction accuracy of the developed code by comparing the analysis results with MARS. Finally, conceptual design of an OTSG was conducted with the developed code. In summary, a simple 1D thermal-hydraulic analysis code was developed for the conceptual design of OTSGs for SMRs; a set of benchmark calculations was conducted to verify and validate its accuracy against a best-estimate thermal-hydraulic analysis code, MARS; and analysis of two different OTSG design concepts with superheating and recirculation was demonstrated using the developed code.
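At steady state, a 1D homogeneous-equilibrium treatment reduces to marching simple balance equations along the tube. A sketch of the energy balance for a uniformly heated once-through tube, with illustrative numbers rather than any real SMR design data:

```python
def march_enthalpy(h_in, mdot, q_flux, perim, length, n_nodes=100):
    """March the steady 1-D energy balance mdot * dh/dz = q'' * P along a
    uniformly heated once-through tube.

    Homogeneous equilibrium model: a single mixture enthalpy, no separate
    phase equations. Returns the enthalpy profile [J/kg] at node edges.
    """
    dz = length / n_nodes
    h = [h_in]
    for _ in range(n_nodes):
        h.append(h[-1] + q_flux * perim * dz / mdot)
    return h

# Illustrative values: 1 m tube, 0.05 kg/s flow, 200 kW/m^2 heat flux
# over a 0.03 m heated perimeter.
profile = march_enthalpy(h_in=1.2e6, mdot=0.05, q_flux=2.0e5,
                         perim=0.03, length=1.0)
```

A real OTSG code would add the momentum balance, flow-regime-dependent heat transfer correlations on the secondary side, and steam-table property lookups to convert enthalpy into temperature and quality.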

  12. Computing LS factor by runoff paths on TIN

    Science.gov (United States)

    Kavka, Petr; Krasa, Josef; Bek, Stanislav

    2013-04-01

    The article shows results of topographic factor (the LS factor in USLE) derivation enhancement focused on detailed Airborne Laser Scanning (ALS) based DEMs. It describes a flow-path generation technique using a triangulated irregular network (TIN) for terrain morphology description, which is not yet established in soil loss computations. This technique was compared with other procedures of flow direction and flow-path generation based on the commonly used raster model (DEM). These overland flow characteristics, together with the therefrom derived flow accumulation, are significant inputs for many scientific models. In particular they are used in all USLE-based soil erosion models, of which USLE2D, RUSLE3D, Watem/Sedem or USPED can be named as the most acknowledged. Flow routing characteristics are also essential parameters in physically based hydrological and soil erosion models like HEC-HMS, Wepp, Erosion3D, LISEM, SMODERP, etc. The mentioned models are based on regular raster grids, where the identification of runoff direction is problematic. The most common method is steepest descent (one-directional flow), which corresponds well with the concentration of surface runoff into concentrated flow. The steepest-descent algorithm for flow routing does not, however, provide satisfactory results: it often creates parallel and narrow flow lines while not respecting real morphological conditions. To overcome this problem, other methods (such as Flux Decomposition, Multiple Flow, the Deterministic Infinity algorithm, etc.) separate the outflow into several components. This approach leads to unrealistic diffuse propagation of the runoff and makes it impossible to use for simulation of dominant morphological features, such as artificial rills, hedges, sediment traps etc. The modern methods of mapping ground elevations, especially ALS, provide very detailed models even for large river basins, including morphological details. 
New algorithms for deriving a runoff direction have been developed as
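The steepest-descent (D8) raster routing that the abstract contrasts with TIN-based flow paths can be sketched as follows; the toy DEM is illustrative:

```python
import math

def d8_direction(dem, row, col, cellsize=1.0):
    """Steepest-descent (D8) flow direction on a raster DEM.

    Returns the (drow, dcol) step toward the steepest downslope of the
    eight neighbours, or None for a pit. This is the single-direction
    scheme whose parallel, narrow flow lines motivate multiple-flow and
    TIN-based alternatives.
    """
    best, best_slope = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            r, c = row + dr, col + dc
            if 0 <= r < len(dem) and 0 <= c < len(dem[0]):
                dist = cellsize * math.hypot(dr, dc)  # diagonal = sqrt(2)
                slope = (dem[row][col] - dem[r][c]) / dist
                if slope > best_slope:
                    best, best_slope = (dr, dc), slope
    return best

dem = [[9.0, 8.0, 7.0],
       [8.0, 6.0, 4.0],
       [7.0, 5.0, 2.0]]
assert d8_direction(dem, 1, 1) == (1, 1)   # centre drains to the 2.0 corner
```

Flow accumulation (and from it the LS factor) follows by walking these directions cell to cell and summing contributing areas; multiple-flow methods instead split the outflow across several downslope neighbours.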

  13. AMPX: a modular code system for generating coupled multigroup neutron-gamma libraries from ENDF/B

    Energy Technology Data Exchange (ETDEWEB)

    Greene, N.M.; Lucius, J.L.; Petrie, L.M.; Ford, W.E. III; White, J.E.; Wright, R.Q.

    1976-03-01

    AMPX is a modular system for producing coupled multigroup neutron-gamma cross section sets. Basic neutron and gamma cross-section data for AMPX are obtained from ENDF/B libraries. Most commonly used operations required to generate and collapse multigroup cross-section sets are provided in the system. AMPX is flexibly dimensioned; neutron group structures, gamma group structures, and expansion orders to represent anisotropic processes are all arbitrary and limited only by available computer core and budget. The basic processes provided will (1) generate multigroup neutron cross sections; (2) generate multigroup gamma cross sections; (3) generate gamma yields for gamma-producing neutron interactions; (4) combine neutron cross sections, gamma cross sections, and gamma yields into final ''coupled sets''; (5) perform one-dimensional discrete ordinates transport or diffusion theory calculations for neutrons and gammas and, on option, collapse the cross sections to a broad-group structure, using the one-dimensional results as weighting functions; (6) plot cross sections, on option, to facilitate the ''evaluation'' of a particular multigroup set of data; (7) update and maintain multigroup cross section libraries in such a manner as to make it not only easy to combine new data with previously processed data but also to do it in a single pass on the computer; and (8) output multigroup cross sections in convenient formats for other codes. (auth)
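Step (5), collapsing fine-group cross sections to a broad-group structure with flux weighting, amounts to a flux-weighted average over the fine groups in each broad group. A minimal sketch with made-up numbers:

```python
def collapse_xs(sigma_fine, flux_fine, broad_map):
    """Flux-weighted collapse of fine-group cross sections:

        sigma_G = sum_{g in G} sigma_g * phi_g / sum_{g in G} phi_g

    broad_map[g] gives the broad-group index of fine group g; phi_g is
    the weighting flux (in AMPX, taken from a 1-D transport or diffusion
    calculation).
    """
    n_broad = max(broad_map) + 1
    num = [0.0] * n_broad
    den = [0.0] * n_broad
    for sig, phi, G in zip(sigma_fine, flux_fine, broad_map):
        num[G] += sig * phi
        den[G] += phi
    return [n / d for n, d in zip(num, den)]

# Four fine groups collapsed to two broad groups (illustrative numbers).
sigma = [10.0, 8.0, 2.0, 1.0]
flux = [1.0, 3.0, 2.0, 2.0]
broad = collapse_xs(sigma, flux, [0, 0, 1, 1])   # [8.5, 1.5]
```

The construction preserves reaction rates by design: sigma_G * phi_G equals the sum of the fine-group reaction rates within G.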

  14. Fast GPU-based Monte Carlo code for SPECT/CT reconstructions generates improved 177Lu images.

    Science.gov (United States)

    Rydén, T; Heydorn Lagerlöf, J; Hemmingsson, J; Marin, I; Svensson, J; Båth, M; Gjertsson, P; Bernhardt, P

    2018-01-04

    clearly improved with MC-based OSEM reconstruction, e.g., the activity recovery was 88% for the largest sphere, while it was 66% for AC-OSEM and 79% for RRC-OSEM. The GPU-based MC code generated an MC-based SPECT/CT reconstruction within a few minutes, and reconstructed patient images of 177Lu-DOTATATE treatments revealed clearly improved resolution and contrast.

  15. PLUTON: Three-group neutronic code for burnup analysis of isotope generation and depletion in highly irradiated LWR fuel rods

    Energy Technology Data Exchange (ETDEWEB)

    Lemehov, Sergei E; Suzuki, Motoe [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-08-01

    PLUTON is a three-group neutronic code analyzing, as functions of time and burnup, the change of radial profiles, together with average values, of power density, burnup, concentration of trans-uranium elements, plutonium buildup, depletion of fissile elements, and fission product generation in water reactor fuel rods with standard UO2, UO2-Gd2O3, inhomogeneous MOX, and UO2-ThO2. The PLUTON code, which has been designed to run on a Windows PC, has adopted a theoretical shape function of neutron attenuation in the pellet, which enables users to perform a very fast and accurate calculation easily. The present code includes the irradiation conditions of the Halden Reactor, which provides verification data for the code. The total list of trans-uranium elements included in the calculations consists of U-233 to U-239, Np-237 to Np-239, Pu-238 to Pu-243, Am-241 to Am-244 (including isomers), and Cm-242 to Cm-245. Poisoning fission products are represented by Xe-131, Xe-133, Xe-135, Cd-113, Sm-149, Sm-151, Sm-152, Gd-154 to Gd-160, Eu-153, Eu-155, Kr-83, Kr-85, Mo-95, Tc-99, Rh-103, Ag-109, I-127, I-129, I-131, Cs-133, La-139, Pr-141, Nd-143 to Nd-150, and Pm-147. Fission gases and volatiles included in the code are Kr-83 to Kr-86, Xe-129 to Xe-136, Te-125 to Te-130, I-127 to I-131, Cs-133 to Cs-137, and Ba-135 to Ba-140. Verification has been performed up to 83 GWd/tU, and a satisfactory agreement has been obtained. (author)

  16. Automated Narratives and Journalistic Text Generation: The Lead Organization Structure Translated into Code.

    Directory of Open Access Journals (Sweden)

    Márcio Carneiro dos Santos

    2016-07-01

Full Text Available This paper describes an experiment in building software capable of generating newspaper leads and headlines automatically from information obtained from the Internet. The theoretical possibility, already noted by Lage at the end of the last century, rests on the relatively rigid and simple structure of this type of story, which makes it possible to represent, or translate, its syntax as instructions a computer can execute. The paper also discusses the relationship between society, technique and technology, giving a brief history of the introduction of digital solutions in newsrooms and their impacts. The system was developed in the Python programming language with the NLTK (Natural Language Toolkit) library, using the results of the 2013 Brazilian Soccer Championship published on an Internet portal as its data source.
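The "relatively rigid and simple structure" of the lead lends itself to a template sketch along these lines. This is a minimal illustration in Python (the language the paper used); the template wording and match data are invented, and the original system additionally used NLTK for language processing:

```python
# Minimal sketch of template-based lead generation in the spirit of the
# approach described above. The template text and the match record are
# hypothetical; the original system scraped real 2013 championship results.

LEAD_TEMPLATE = ("{winner} beat {loser} {score} on {day}, "
                 "moving to {points} points in the championship.")

def generate_lead(match):
    """Fill the rigid lead structure (who, what, when) with match facts."""
    return LEAD_TEMPLATE.format(**match)

match = {
    "winner": "Cruzeiro", "loser": "Vitoria", "score": "5-3",
    "day": "Sunday", "points": 76,
}
print(generate_lead(match))
```

The key design point carried over from Lage's observation is that the journalist's lead is already a fixed syntactic frame, so "writing" reduces to slot filling from structured data.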

  17. Cracking the Code of Human Diseases Using Next-Generation Sequencing: Applications, Challenges, and Perspectives

    Directory of Open Access Journals (Sweden)

    Vincenza Precone

    2015-01-01

    Full Text Available Next-generation sequencing (NGS technologies have greatly impacted on every field of molecular research mainly because they reduce costs and increase throughput of DNA sequencing. These features, together with the technology’s flexibility, have opened the way to a variety of applications including the study of the molecular basis of human diseases. Several analytical approaches have been developed to selectively enrich regions of interest from the whole genome in order to identify germinal and/or somatic sequence variants and to study DNA methylation. These approaches are now widely used in research, and they are already being used in routine molecular diagnostics. However, some issues are still controversial, namely, standardization of methods, data analysis and storage, and ethical aspects. Besides providing an overview of the NGS-based approaches most frequently used to study the molecular basis of human diseases at DNA level, we discuss the principal challenges and applications of NGS in the field of human genomics.

  18. Effects of coding dictionary on signal generation: a consideration of use of MedDRA compared with WHO-ART.

    Science.gov (United States)

    Brown, Elliot G

    2002-01-01

    To support signal generation a terminology should facilitate recognition of medical conditions by using terms which represent unique concepts, providing appropriate, homogeneous grouping of related terms. It should allow intuitive or mathematical identification of adverse events reaching a threshold frequency or with disproportionate incidence, permit identification of important events which are commonly drug-related, and support recognition of new syndromes. It is probable that the Medical Dictionary for Regulatory Activities (MedDRA) preferred terms (PTs) or high level terms (HLTs) will be used to represent adverse events for the purposes of signal generation. A comparison with 315 WHO Adverse Reaction Terminology (WHO-ART) PTs showed that for about 72% of WHO-ART PTs, there were one or two corresponding MedDRA PTs. However, there were instances where there were many MedDRA PTs corresponding to single WHO-ART PTs. In many cases, MedDRA HLTs grouped large numbers of PTs and sometimes there could be problems when a single HLT comprises PTs which represent very different medical concepts, or conditions which differ greatly in their clinical importance. Further studies are needed to compare the way in which identical data sets coded with MedDRA and with other terminologies actually function in generating and exploring signals using the same methods of detection and evaluation.
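As a concrete illustration of "disproportionate incidence", the sketch below computes the proportional reporting ratio (PRR), one common signal-detection statistic. The counts are hypothetical; the point the abstract makes is that the term grouping chosen (one WHO-ART PT versus several finer MedDRA PTs) changes the counts entering such a calculation, and hence whether a signal threshold is reached:

```python
# Illustrative sketch of one disproportionality measure used in signal
# generation: the proportional reporting ratio (PRR). All counts below are
# hypothetical.

def prr(a, b, c, d):
    """PRR = [a/(a+b)] / [c/(c+d)] for a 2x2 drug/event report table:
    a = reports of event E with drug D,   b = other events with drug D,
    c = reports of event E, other drugs,  d = other events, other drugs."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical effect of splitting one grouped term into several finer terms:
# the same reports spread over more PTs dilute each per-term count.
print(round(prr(20, 480, 100, 99900), 2))  # single grouped term
print(round(prr(5, 495, 100, 99900), 2))   # one of several finer terms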

  19. MIRANDA - a module based on multiregion resonance theory for generating cross sections within the AUS neutronics code system

    International Nuclear Information System (INIS)

    Robinson, G.S.

    1985-12-01

MIRANDA is the cross-section generation module of the AUS neutronics code system, used to prepare multigroup cross-section data pertinent to a particular study from a general purpose multigroup library of cross sections. Libraries have been prepared from ENDF/B which are suitable for thermal and fast fission reactors and for fusion blanket studies. The libraries include temperature dependent data and resonance cross sections represented by subgroup parameters, and may contain photon as well as neutron data. The MIRANDA module includes a multiregion resonance calculation in slab, cylinder or cluster geometry, a homogeneous B{sub L} flux solution, and a group condensation facility. This report documents the modifications to an earlier version of MIRANDA and provides a complete user's manual.

  20. Computer code for the analysis of destructive pressure generation process during a fuel failure accident, PULSE-2

    International Nuclear Information System (INIS)

    Fujishiro, Toshio

    1978-03-01

    The computer code PULSE-2 has been developed for the analysis of pressure pulse generation process when hot fuel particles come into contact with the coolant in a fuel rod failure accident. In the program, it is assumed that hot fuel fragments mix with the coolant instantly and homogeneously in the failure region. Then, the rapid vaporization of the coolant and transient pressure rise in failure region, and the movement of ejected coolant slugs are calculated. The effect of a fuel-particle size distribution is taken into consideration. Heat conduction in the fuel particles and heat transfer at fuel-coolant interface are calculated. Temperature, pressure and void fraction in the mixed region are calculated from the average enthalpy. With physical property subroutines for liquid sodium and water, the model is usable for both LMFBR and LWR conditions. (auth.)

  1. Cost on Reliability and Production Loss for Power Converters in the Doubly Fed Induction Generator to Support Modern Grid Codes

    DEFF Research Database (Denmark)

    Zhou, Dao; Blaabjerg, Frede; Lau, Mogens

    2016-01-01

As wind farms are normally located in remote areas, many grid codes have been issued, especially related to reactive power support. Although the Doubly-Fed Induction Generator (DFIG) based power converter is able to control active power and reactive power independently, the effects of providing reactive power on the lifetime of the power converter and on the cost-of-energy of the whole system are seldom evaluated, even though this is an important topic. In this paper, loss models of the DFIG system are established for various conditions of reactive power injection. If the mission profile is taken into account, the lifespan of the power semiconductors as well as the cost of the reactive power can be calculated. It is concluded that over-excited reactive power injection significantly reduces the power converter lifetime, to only 1/4 of the case where there is no reactive power...

  2. PUFF-IV, Code System to Generate Multigroup Covariance Matrices from ENDF/B-VI Uncertainty Files

    International Nuclear Information System (INIS)

    2007-01-01

    1 - Description of program or function: The PUFF-IV code system processes ENDF/B-VI formatted nuclear cross section covariance data into multigroup covariance matrices. PUFF-IV is the newest release in this series of codes used to process ENDF uncertainty information and to generate the desired multi-group correlation matrix for the evaluation of interest. This version includes corrections and enhancements over previous versions. It is written in Fortran 90 and allows for a more modular design, thus facilitating future upgrades. PUFF-IV enhances support for resonance parameter covariance formats described in the ENDF standard and now handles almost all resonance parameter covariance information in the resolved region, with the exception of the long range covariance sub-subsections. PUFF-IV is normally used in conjunction with an AMPX master library containing group averaged cross section data. Two utility modules are included in this package to facilitate the data interface. The module SMILER allows one to use NJOY generated GENDF files containing group averaged cross section data in conjunction with PUFF-IV. The module COVCOMP allows one to compare two files written in COVERX format. 2 - Methods: Cross section and flux values on a 'super energy grid,' consisting of the union of the required energy group structure and the energy data points in the ENDF/B-V file, are interpolated from the input cross sections and fluxes. Covariance matrices are calculated for this grid and then collapsed to the required group structure. 3 - Restrictions on the complexity of the problem: PUFF-IV cannot process covariance information for energy and angular distributions of secondary particles. PUFF-IV does not process covariance information in Files 34 and 35; nor does it process covariance information in File 40. These new formats will be addressed in a future version of PUFF
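The collapse from the "super energy grid" to the required group structure can be illustrated with a small sketch. This is a hedged reconstruction under an assumed flux-weighting convention, not PUFF-IV's actual Fortran; the function and variable names are invented for illustration:

```python
import numpy as np

# Hedged sketch of a flux-weighted collapse of a fine-group covariance matrix
# to a coarse group structure, the kind of operation described above. The
# weighting convention is an illustrative assumption, not PUFF-IV's algorithm.

def collapse_covariance(cov_fine, flux, groups):
    """cov_fine: (n, n) fine-group covariance; flux: (n,) weights;
    groups: list of index lists partitioning the fine grid."""
    m = len(groups)
    cov_coarse = np.zeros((m, m))
    for I, gi in enumerate(groups):
        for J, gj in enumerate(groups):
            wi = flux[gi] / flux[gi].sum()   # normalized weights in group I
            wj = flux[gj] / flux[gj].sum()   # normalized weights in group J
            cov_coarse[I, J] = wi @ cov_fine[np.ix_(gi, gj)] @ wj
    return cov_coarse

# Sanity check: a flat covariance collapses to the same flat value.
groups = [[0, 1], [2, 3]]
print(collapse_covariance(np.full((4, 4), 0.04), np.ones(4), groups))
```

Because the weights in each coarse group sum to one, a constant fine-group covariance is preserved exactly, which is a useful invariant to test any collapsing routine against.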

  3. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for coding theory than codes over classical finite fields.
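For comparison, the classical-field version of the generator/parity-check characterization looks as follows. This sketch uses the familiar [7,4] Hamming code over GF(2); it does not model the multivalued hyperaddition of a Krasner hyperfield, so it only illustrates the objects the paper generalizes:

```python
import numpy as np

# Classical-field analogue of the generator/parity-check characterization:
# the [7,4] Hamming code over GF(2). For a systematic generator G = [I | A],
# the parity check matrix is H = [A^T | I], and H annihilates every codeword.

G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])              # generator matrix [I | A]
H = np.hstack([G[:, 4:].T, np.eye(3, dtype=int)])  # parity check [A^T | I]

codeword = (np.array([1, 0, 1, 1]) @ G) % 2        # encode message 1011
syndrome = (H @ codeword) % 2
print(codeword, syndrome)  # a valid codeword has zero syndrome
```

Over a hyperfield, addition returns a set of values rather than a single one, so the matrix products above become set-valued; that is the structural difference the paper exploits.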

  4. A mid-term report for LS1

    CERN Multimedia

    2014-01-01

    As the LHC’s first long shutdown, LS1, enters its second calendar year, it’s a good time for a mid-term report on how things are progressing.    Towards the end of last year, I had the pleasure to go down to the LHC tunnel to witness the closure of the first of the machine’s sectors to be completed. As I write, three sectors are now closed up, with a fourth not far behind. These are important milestones, and you can follow progress in detail in the regular LS1 reports in the Bulletin. They show that we’re on schedule for physics to resume in about a year from now, but more than that, they are an important reminder of the LS1 motto: safety, quality, schedule. It is fantastic news that we are on schedule, and testimony to the rigour that went into the detailed and complex planning of all the work that had to be undertaken in LS1. But more important than the schedule is the fact that we’ve carried out the work safely and that the qualit...

  5. Safety, Quality, Schedule: the motto of LS1

    CERN Multimedia

    2013-01-01

    The LHC’s first long shutdown, LS1, is a marathon that began on 16 February and will take us through to the beginning of 2015. Just as Olympic marathon runners have a motto, Citius, Altius, Fortius, so the athletes of LS1 work to the mantra of Safety, Quality, Schedule. Four months into LS1, they have settled into their rhythm, and things are going to plan.   The first task of LS1 was to bring the LHC up to room temperature - this was achieved in just 10 weeks. In parallel, preliminary tests for electrical quality assurance and leaks revealed essentially the level of wear and tear we’d expect after three years of running. One slightly anxious moment came when we looked at the RF fingers – the devices that ensure electrical contact in the beam pipes as they pass from one magnet to the next. Those of you with long memories will recall that before start-up, some of these got damaged at warm-up. The good news today is that with all eight sectors test...

  6. The LS-STAG immersed boundary/cut-cell method for non-Newtonian flows in 3D extruded geometries

    Science.gov (United States)

    Nikfarjam, F.; Cheny, Y.; Botella, O.

    2018-05-01

The LS-STAG method is an immersed boundary/cut-cell method for viscous incompressible flows based on the staggered MAC arrangement for Cartesian grids, in which the irregular boundary is sharply represented by its level-set function. This results in a significant gain in computer resources (wall time, memory usage) compared to commercial body-fitted CFD codes. The 2D version of the LS-STAG method is now well established (Cheny and Botella, 2010), and this paper presents its extension to 3D geometries with translational symmetry in the z direction (hereinafter called 3D extruded configurations). This intermediate step towards the fully 3D implementation can be applied to a wide variety of canonical flows and is regarded as the keystone for the full 3D solver, since both discretization and implementation issues on distributed memory machines are tackled at this stage of development. The LS-STAG method is then applied to various Newtonian and non-Newtonian flows in 3D extruded geometries (axisymmetric pipe, circular cylinder, duct with an abrupt expansion) for which benchmark results and experimental data are available. The purposes of these investigations are (a) to investigate the formal order of accuracy of the LS-STAG method, (b) to assess the versatility of the method for flow applications at various regimes (Newtonian and shear-thinning fluids, steady and unsteady laminar to turbulent flows), and (c) to compare its performance with well-established numerical methods (body-fitted and immersed boundary methods).

  7. Generating Health Estimates by Zip Code: A Semiparametric Small Area Estimation Approach Using the California Health Interview Survey.

    Science.gov (United States)

    Wang, Yueyan; Ponce, Ninez A; Wang, Pan; Opsomer, Jean D; Yu, Hongjian

    2015-12-01

    We propose a method to meet challenges in generating health estimates for granular geographic areas in which the survey sample size is extremely small. Our generalized linear mixed model predicts health outcomes using both individual-level and neighborhood-level predictors. The model's feature of nonparametric smoothing function on neighborhood-level variables better captures the association between neighborhood environment and the outcome. Using 2011 to 2012 data from the California Health Interview Survey, we demonstrate an empirical application of this method to estimate the fraction of residents without health insurance for Zip Code Tabulation Areas (ZCTAs). Our method generated stable estimates of uninsurance for 1519 of 1765 ZCTAs (86%) in California. For some areas with great socioeconomic diversity across adjacent neighborhoods, such as Los Angeles County, the modeled uninsured estimates revealed much heterogeneity among geographically adjacent ZCTAs. The proposed method can increase the value of health surveys by providing modeled estimates for health data at a granular geographic level. It can account for variations in health outcomes at the neighborhood level as a result of both socioeconomic characteristics and geographic locations.
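The "borrowing strength" idea behind such small area models can be illustrated with a toy composite estimator: shrink each area's noisy direct survey estimate toward a model-based prediction, with a weight that grows with the area's sample size. This is a deliberately simplified stand-in for the paper's semiparametric generalized linear mixed model; all names and numbers are hypothetical:

```python
# Toy illustration of small area estimation: a composite estimator that
# blends a direct survey estimate with a model prediction. The shrinkage
# constant n0 and all inputs are hypothetical; the paper's actual method is
# a semiparametric GLMM, not this formula.

def small_area_estimate(direct, model_pred, n, n0=50):
    """Weight the direct estimate by n/(n + n0); tiny samples lean on the model."""
    w = n / (n + n0)
    return w * direct + (1 - w) * model_pred

# A hypothetical ZCTA with only 5 respondents: the unstable direct estimate
# of 40% uninsured is pulled strongly toward the model prediction of 18%.
print(small_area_estimate(direct=0.40, model_pred=0.18, n=5))
```

With large n the estimate stays close to the survey data, which mirrors why the paper obtains stable estimates for 86% of ZCTAs while still reflecting local heterogeneity.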

  8. Changing priorities of codes and standards: An A/E's perspective for operating units and new generation

    International Nuclear Information System (INIS)

    Meyers, B.L.; Jackson, R.W.; Morowski, B.D.

    1994-01-01

As the nuclear power industry has shifted emphasis from the construction of new plants to the reliability and maintenance of operating units, the industry's commitment to safety has been well guarded and maintained. Many other important indicators of nuclear industry performance are also positive. Unfortunately, by some projections, as many as 25 operating nuclear units could prematurely shut down because of increasing O&M and total operating costs. The immediate impact of higher generating costs on the nuclear industry is evident. However, when viewed over the longer term, high generating costs will also affect license renewals, progress in the development of advanced light water reactor designs, and prospects for a return to the building of new plants. Today's challenge is to leverage the expertise and contribution of the nuclear industry partner organizations to steadily improve the work processes and methods necessary to reduce operating costs, to achieve higher levels in the performance of operating units, and to maintain high standards of technical excellence and safety. From the experience and perspective of an A/E and partner in the nuclear industry, this paper discusses the changing priorities of codes and standards as they relate to opportunities for the communication of lessons learned and improving the responsiveness to industry needs

  9. Report of the generation of the nuclear bank Presto-Hot for the SVEA-96 fuel with the FMS codes

    International Nuclear Information System (INIS)

    Alonso V, G.

    1991-12-01

This report describes, in general terms, how the database of the SVEA-96 fuel for Laguna Verde was generated. The bank was produced with the ECLIPSE 86-2D, RECORD 89-1A and POLGEN 88-1B codes of the FMS package installed on the VAX system at the offices of the National Commission of Nuclear Safety and Safeguards in Mexico City, following procedure '6F3/I/CN029/90/P1'; the resulting bank is named 'LlPG9102'. Using the MERGE code of the FMS package installed on the VAX system at the offices of the Federal Commission of Electricity in Mexico City, this information was appended to the existing bank 'LlPG3314' to generate the bank 'LlPG9701', which contains the information for the 5 fuel types of the initial load of Unit 1 and of the first reload of Laguna Verde, as well as the information corresponding to the SVEA-96 fuel. The results obtained during the formation of the fuel data bank, in terms of the behavior of the different cell parameters with respect to fuel burnup and void variation in the coolant channel, are compared with those reported in the fuel design documents provided by ABB-ATOM. Although not exhaustive, these comparisons show the general tendency of the results, which is quite favorable. The generated database contains sufficient information, in the form of two-group constants dependent on burnup and instantaneous voids, for the different fuel rod arrangements present in an assembly, as well as the coefficients that account for the presence of the control rod, the variation in fuel temperature, and the effect of the 'historical' voids. All this is included in what is known as the SUPER option of the bank for PRESTO, with the options PRCOEF and POLRAM. In addition, Annex G of this report provides separately the M-Factor, the Xenon coefficients, and the control rod burnup parameters for PRESTO

  10. Testing of a Code for the Calculation of Spectra of Neutrons Produced in a Target of a Neutron Generator

    Science.gov (United States)

    Gaganov, V. V.

    2017-12-01

    The correctness of calculations performed with the SRIANG code for modeling the spectra of DT neutrons is estimated by comparing the obtained spectra to the results of calculations carried out with five different codes based on the Monte Carlo method.

  11. Bacillus amyloliquefaciens L-S60 Reforms the Rhizosphere Bacterial Community and Improves Growth Conditions in Cucumber Plug Seedling

    Directory of Open Access Journals (Sweden)

    Yuxuan Qin

    2017-12-01

    Full Text Available Vegetable plug seedling has become the most important way to produce vegetable seedlings in China. This seedling method can significantly improve the quality and yield of vegetables compared to conventional methods. In the process of plug seedling, chemical fertilizers or pesticides are often used to improve the yield of the seedlings albeit with increasing concerns. Meanwhile, little is known about the impact of beneficial bacteria on the rhizosphere microbiota and the growth conditions of vegetables during plug seedling. In this study, we applied a culture-independent next-generation sequencing-based approach and investigated the impact of a plant beneficial bacterium, Bacillus amyloliquefaciens L-S60, on the composition and dynamics of rhizosphere microbiota and the growth conditions of cucumbers during plug seedling. Our results showed that application of L-S60 significantly altered the structure of the bacterial community associated with the cucumber seedling; presence of beneficial rhizosphere species such as Bacillus, Rhodanobacter, Paenibacillus, Pseudomonas, Nonomuraea, and Agrobacterium was higher upon L-S60 treatment than in the control group. We also measured the impact of L-S60 application on the physiological properties of the cucumber seedlings as well as the availability of main mineral elements in the seedling at different time points during the plug seedling. Results from those measurements indicated that L-S60 application promoted growth conditions of cucumber seedlings and that more available mineral elements were detected in the cucumber seedlings from the L-S60 treated group than from the control group. The findings in this study provided evidence for the beneficial effects of plant growth-promoting rhizosphere bacteria on the bacterial community composition and growth conditions of the vegetables during plug seedling.

  12. A Monte Carlo Code (PHOEL) for generating initial energies of photoelectrons and compton electrons produced by photons in water

    International Nuclear Information System (INIS)

    Turner, J.E.; Modolo, J.T.; Sordi, G.M.A.A.; Hamm, R.N.; Wright, H.A.

    1979-01-01

PHOEL provides a source term for a Monte Carlo code which calculates electron transport and energy degradation in liquid water. This code is used to study the relative biological effectiveness (RBE) of low-LET radiation at low doses. The basic numerical data used and their mathematical treatment are described, as well as the operation of the code

  13. Hypertension Knowledge-Level Scale (HK-LS): A Study on Development, Validity and Reliability

    OpenAIRE

    Erkoc, Sultan Baliz; Isikli, Burhanettin; Metintas, Selma; Kalyoncu, Cemalettin

    2012-01-01

    This study was conducted to develop a scale to measure knowledge about hypertension among Turkish adults. The Hypertension Knowledge-Level Scale (HK-LS) was generated based on content, face, and construct validity, internal consistency, test re-test reliability, and discriminative validity procedures. The final scale had 22 items with six sub-dimensions. The scale was applied to 457 individuals aged ≥18 years, and 414 of them were re-evaluated for test-retest reliability. The six sub-dimensio...

  14. Performance potential of the injectors after LS1

    International Nuclear Information System (INIS)

    Bartosik, H.; Carli, C.; Damerau, H.; Garoby, R.; Gilardoni, S.; Goddard, B.; Hancock, S.; Hanke, K.; Lombardi, A.; Mikulec, B.; Raginel, V.; Rumolo, G.; Shaposhnikova, E.; Vretenar, M.

    2012-01-01

The main upgrades of the injector chain in the framework of the LIU Project will only be implemented in the second long shutdown (LS2), in particular the increase of the PSB-PS transfer energy to 2 GeV and the implementation of cures/solutions against instabilities, e-cloud effects, etc. in the SPS. On the other hand, Linac4 will become available by the end of 2014. Until the end of 2015 it may replace Linac2 at short notice, taking 50 MeV protons into the PSB via the existing injection system but with reduced performance. Afterwards, the H{sup -} injection equipment will be ready and Linac4 could be connected for 160 MeV H{sup -} injection into the PSB during a prolonged winter shutdown before LS2. The anticipated beam performance of the LHC injectors after LS1 in these different cases is presented. Space charge on the PS flat-bottom will remain a limitation because the PSB-PS transfer energy will stay at 1.4 GeV. As a mitigation measure, new RF manipulations are presented which can improve brightness for 25 ns bunch spacing, allowing for more than nominal luminosity in the LHC. (authors)

  15. Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data

    Science.gov (United States)

    Baxes, Gregory; Mixon, Brian; Linger, TIm

    2013-01-01

Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application is continually issuing data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets (in particular, massively sized datasets) has been developed. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent dynamically generated KML code that directs the client application to make follow-on requests for higher level of detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset. The approach provides an efficient data traversal path and mechanism that can be
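The cascading strategy described above can be sketched in a few lines: each server response embeds a Region-gated NetworkLink, so the client fetches finer-LOD KML only once the user zooms into that tile's bounding box. This is an illustrative reconstruction, not NASA's actual generator; the function name, tile naming, and URL are hypothetical, while the KML elements used (Region, LatLonAltBox, Lod, NetworkLink, viewRefreshMode) are standard OGC KML:

```python
# Sketch of dynamic cascading KML: the <Region>-gated <NetworkLink> makes the
# client request the child (higher-LOD) KML only when the region occupies at
# least minLodPixels on screen. URLs and tile names are hypothetical.

def lod_kml(name, north, south, east, west, child_url):
    """Return KML that shows this tile and cascades to a finer-LOD child."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <NetworkLink>
      <name>{name}</name>
      <Region>
        <LatLonAltBox>
          <north>{north}</north><south>{south}</south>
          <east>{east}</east><west>{west}</west>
        </LatLonAltBox>
        <Lod><minLodPixels>256</minLodPixels></Lod>
      </Region>
      <Link>
        <href>{child_url}</href>
        <viewRefreshMode>onRegion</viewRefreshMode>
      </Link>
    </NetworkLink>
  </Document>
</kml>"""

print(lod_kml("tile_0_0", 45.0, 44.0, -110.0, -111.0,
              "https://example.com/tiles/z1/tile_0_0.kml"))
```

Each child KML is generated on the fly in the same form, so the server never precomputes the whole hierarchy and the client walks an efficient LOD path driven purely by its own pan/zoom actions.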

  16. Revitalising Mathematics Classroom Teaching through Lesson Study (LS): A Malaysian Case Study

    Science.gov (United States)

    Lim, Chap Sam; Kor, Liew Kee; Chia, Hui Min

    2016-01-01

    This paper discusses how implementation of Lesson Study (LS) has brought about evolving changes in the quality of mathematics classroom teaching in one Chinese primary school. The Japanese model of LS was adapted as a teacher professional development to improve mathematics teachers' teaching practices. The LS group consisted of five mathematics…

  17. Development of tube rupture evaluation code for FBR steam generator (II). Modification of heat transfer model in sodium side

    International Nuclear Information System (INIS)

    Hamada, H.; Kurihara, A.

    2003-05-01

The thermal effect of a sodium-water reaction jet on neighboring heat transfer tubes was examined to rationally evaluate the structural integrity of the tube against overheating rupture under a water leak in an FBR steam generator. A new heat transfer model was then developed and application analyses were carried out. The main results in this paper are as follows. (1) An evaluation method for the heat flux and heat transfer coefficient (HTC) on a tube exposed to the reaction jet was developed. Using this method, it was confirmed that the heat flux could be evaluated realistically in comparison with the previous method. (2) The HTC between the reaction jet and the tube was theoretically examined in a two-phase flow model, and a new heat transfer model considering the effect of fluid temperature and cover gas pressure was developed. By applying the model, a tentative experimental correlation was conservatively obtained using SWAT-1R test data. (3) The new model was incorporated into the Tube Rupture Evaluation Code (TRUE), and the conservatism of the model was confirmed using sodium-water reaction data such as the SWAT-3 tests. (4) In the application analysis of the PFR large leak event, there was no significant difference in calculation results between the new model and the previous one; the importance of depressurization in the tube was confirmed. (5) In the application analysis of the Monju evaporator, it was confirmed that the calculation result with the previous model would be more conservative than that with the new one, and that the maximum cumulative damage of 25% could be reduced in the new model. (author)

  18. LHC Experimental Beam Pipe Upgrade during LS1

    CERN Document Server

    Lanza, G; Baglin, V; Chiggiato, P

    2014-01-01

    The LHC experimental beam pipes are being improved during the ongoing Long Shutdown 1 (LS1). Several vacuum chambers have been tested and validated before their installation inside the detectors. The validation tests include: leak tightness, ultimate vacuum pressure, material outgassing rate, and residual gas composition. NEG coatings are assessed by sticking probability measurement with the help of Monte Carlo simulations. In this paper the motivation for the beam pipe upgrade, the validation tests of the components and the results are presented and discussed.

  19. Analysis of flow-induced vibration of heat exchanger and steam generator tube bundles using the AECL computer code PIPEAU-2

    International Nuclear Information System (INIS)

    Gorman, D.J.

    1983-12-01

    PIPEAU-2 is a computer code developed at the Chalk River Nuclear Laboratories for the flow-induced vibration analysis of heat exchanger and steam generator tube bundles. It can perform this analysis for straight and 'U' tubes. All the theoretical work underlying the code is analytical rather than numerical in nature. Highly accurate evaluation of the free vibration frequencies and mode shapes is therefore obtained. Using the latest experimentally determined parameters available, the free vibration analysis is followed by a forced vibration analysis. Tube response due to fluid turbulence and vortex shedding is determined, as well as critical fluid velocity associated with fluid-elastic instability

  20. LS1 Report: The cryogenic line goes through the scanner

    CERN Multimedia

    CERN Bulletin

    2013-01-01

    In spite of the complexity of LS1, with many different activities taking place in parallel and sometimes overlapping, the dashboard shows that work is progressing on schedule. This week, teams have started X-raying the cryogenic line to examine its condition in minute detail.   The LS1 schedule is pretty unfathomable for those who don't work in the tunnels or installations, but if you look down all the columns and stop at the line indicating today’s date, you can see that all of the priority and critical items are bang on time, like a Swiss watch. More specifically: the SMACC project in the LHC is on schedule, with a new testing phase for the interconnections which have already been consolidated; preparations are under way for the cable replacement campaign at Point 1 of the SPS (about 20% of the cables will not be replaced as they are completely unused); and the demineralised water distribution line is back in service, as are the electrical substations for the 400 and 66 kV line...

  1. LS1 Report: Handing in the ATLAS keys

    CERN Multimedia

    Antonella Del Rosso, Katarina Anthony

    2014-01-01

    After completing more than 250 work packages concerning the whole detector and experimental site, the ATLAS and CERN teams involved with LS1 operations are now wrapping things up before starting the commissioning phase in preparation for the LHC restart. The giant detector is now more efficient, safer and even greener than ever thanks to the huge amount of work carried out over the past two years.   Cleaning up the ATLAS cavern and detector in preparation for Run 2. Hundreds of people, more than 3000 certified interventions, huge and delicate parts of the detector completely refurbished: the ATLAS detector that will take data during Run 2 is a brand new machine, which will soon be back in the hands of the thousands of scientists who are preparing for the high-energy run of the LHC accelerator. “During LS1, we have upgraded the detector’s basic infrastructure and a few of its sub-detectors,” explains Beniamino Di Girolamo, ATLAS Technical Coordinator. &...

  2. EN-CV during LS1: upgrade, consolidation, maintenance, operation

    International Nuclear Information System (INIS)

    Nonis, M.

    2012-01-01

    The Cooling and Ventilation (CV) Group in the Engineering Department (EN) will be heavily involved in several projects and activities during the long shutdown in 2013 and 2014 (LS1) within a time-frame limited to around twelve months. According to the requests received so far, most projects are related to the upgrade of users' equipment, consolidation work, and the construction of new plants. However, through the experience gained from the first years of the LHC run, some projects are also needed to adapt the existing installations to the new operating parameters. Some of these projects are presented hereafter, outlining the impact that they will have on operational working conditions or risks of breakdown. Among these projects we find: the PM32 raising pumps, the cooling of the CERN Control Center, R2E, the backup cooling towers for ATLAS and cryogenics, a thermosyphon for ATLAS, or new pumps in UWs. Finally, EN-CV activities during LS1 for maintenance, operation, and commissioning will be mentioned since they represent a major workload for the Group

  3. Lymphoscintigraphy (LS) in infants and children: techniques and scintigraphic patterns

    International Nuclear Information System (INIS)

    Somerville, J.; Parsons, G.; Howman-Giles, R.; Lewis, G.; Uren, R.; Mansberg, R.

    1999-01-01

    Full text: Radionuclide imaging of the lymphatic system with intradermal injection of a radiopharmaceutical is a rapid, safe and simple technique for the evaluation of lymphatic abnormalities in infants and children. 99mTc-antimony sulphide colloid is the agent of choice. The radiopharmaceutical (dose 5 MBq in 0.1 ml) is injected intradermally into both limbs being investigated. EMLA cream is useful to reduce the initial discomfort of the injection. Imaging is performed immediately for approximately 30-60 min to assess the flow rate and lymph channels to the draining node fields. Further imaging at 2 and 4 h may be necessary. Normal LS in the lower limbs shows the tracer passing into lymphatics almost immediately, and channels are usually visualized within 5 min. In the lower limbs, symmetrical lymph flow to nodes is seen in the groin, iliac and paravertebral regions, with activity later seen in the liver. Of 31 patients studied over the last 3 years, 17 studies were normal and 14 were abnormal: Klippel-Trenaunay-Weber syndrome (n = 4) with delayed flow and dermal backflow; congenital lymph/vascular malformations (n = 6) with various delayed flow patterns and focal accumulations; congenital lymphoedema (n = 3); and pulmonary lymphangiectasia (n = 1). Aplasia and hypoplasia of the lymph system are readily identified. In conclusion, LS is a valuable diagnostic technique to assess lymph flow and diagnose lymphatic malformations and the causes of lymphoedema in children

  4. People typically experience extended periods of relative happiness or unhappiness due to positive feedback loops between LS and variables which are both causes and consequences of LS

    NARCIS (Netherlands)

    Headey, Bruce; Muffels, R.J.A.

    2015-01-01

    Long term panel data enable researchers to construct trajectories of LS for individuals over time. Bar charts of trajectories, and subsequent statistical analysis, show that respondents typically spend multiple consecutive years above and below their own long-term mean level of LS. We attempt to

  5. Short-term hydro generation scheduling of Xiluodu and Xiangjiaba cascade hydropower stations using improved binary-real coded bee colony optimization algorithm

    International Nuclear Information System (INIS)

    Lu, Peng; Zhou, Jianzhong; Wang, Chao; Qiao, Qi; Mo, Li

    2015-01-01

    Highlights: • The STHGS problem is decomposed into two parallel sub-problems of UC and ELD. • Binary coded BCO is used to solve the UC sub-problem with 0–1 discrete variables. • Real coded BCO is used to solve the ELD sub-problem with continuous variables. • Heuristic repairing strategies are designed to handle various constraints. • The STHGS of the Xiluodu and Xiangjiaba cascade stations is solved by IB-RBCO. - Abstract: Short-term hydro generation scheduling (STHGS) of cascade hydropower stations is a typical nonlinear mixed-integer optimization problem: minimize the total water consumption while simultaneously meeting the grid requirements and other hydraulic and electrical constraints. In this paper, the STHGS problem is decomposed into two parallel sub-problems of unit commitment (UC) and economic load dispatch (ELD), and an improved binary-real coded bee colony optimization (IB-RBCO) algorithm is proposed to solve them. First, the improved binary coded BCO is used to solve the UC sub-problem with 0–1 discrete variables, and a heuristic repairing strategy for unit state constraints is applied to generate a feasible unit commitment schedule. Then, the improved real coded BCO is used to solve the ELD sub-problem with continuous variables, and an effective method is introduced to handle various unit operation constraints. In particular, a new updating strategy, the DE/best/2/bin method with a dynamic parameter control mechanism, is applied to the real coded BCO to improve the search ability of IB-RBCO. Finally, to verify the feasibility and effectiveness of the proposed IB-RBCO method, it is applied to solve the STHGS problem of the Xiluodu and Xiangjiaba cascade hydropower stations, and the simulation results are compared with those of other intelligence algorithms. The results demonstrate that the proposed IB-RBCO method can obtain higher-quality solutions with less water consumption and shorter computing time when facing the complex STHGS problem
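    The UC/ELD decomposition described in this abstract can be sketched in a few lines. The following is an illustrative toy, not the paper's IB-RBCO: a binary vector commits units, a continuous vector dispatches the load among committed units, and a repair step enforces capacity limits and the power balance; plain random search stands in for the bee colony, and the unit limits and water-rate coefficients are hypothetical.

```python
# Toy UC/ELD decomposition with constraint repair (hypothetical data).
import random

P_MIN, P_MAX = [20.0, 30.0, 25.0], [100.0, 150.0, 120.0]
# hypothetical water-use curves q(p) = a + b*p + c*p^2
A, B, C = [5.0, 6.0, 4.0], [0.9, 0.8, 1.0], [0.002, 0.001, 0.003]

def repair_dispatch(u, p, load):
    """Clip committed units into their limits, then scale to meet the load."""
    p = [min(max(pi, lo), hi) if ui else 0.0
         for ui, pi, lo, hi in zip(u, p, P_MIN, P_MAX)]
    total = sum(p)
    if total == 0.0:
        return None                      # infeasible commitment (no unit on)
    scale = load / total
    p = [min(max(pi * scale, lo), hi) if ui else 0.0
         for ui, pi, lo, hi in zip(u, p, P_MIN, P_MAX)]
    return p if abs(sum(p) - load) < 1.0 else None  # balance within tolerance

def water_use(p):
    return sum(a + b * pi + c * pi * pi
               for a, b, c, pi in zip(A, B, C, p) if pi > 0.0)

random.seed(1)
load = 180.0
best = None
for _ in range(2000):                    # random search stands in for BCO
    u = [random.randint(0, 1) for _ in range(3)]
    p = [random.uniform(lo, hi) for lo, hi in zip(P_MIN, P_MAX)]
    p = repair_dispatch(u, p, load)
    if p is not None:
        cost = water_use(p)
        if best is None or cost < best[0]:
            best = (cost, u, p)
print(best)
```

    A real implementation would replace the random sampling with the binary and real coded BCO updating rules and add the hydraulic coupling between the cascaded reservoirs.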

  6. Development of a new nuclide generation and depletion code using a topological solver based on graph theory

    Energy Technology Data Exchange (ETDEWEB)

    Kasselmann, S., E-mail: s.kasselmann@fz-juelich.de [Forschungszentrum Jülich, 52425 Jülich (Germany); Schitthelm, O. [Forschungszentrum Jülich, 52425 Jülich (Germany); Tantillo, F. [Forschungszentrum Jülich, 52425 Jülich (Germany); Institute for Reactor Safety and Reactor Technology, RWTH-Aachen, 52064 Aachen (Germany); Scholthaus, S.; Rössel, C. [Forschungszentrum Jülich, 52425 Jülich (Germany); Allelein, H.-J. [Forschungszentrum Jülich, 52425 Jülich (Germany); Institute for Reactor Safety and Reactor Technology, RWTH-Aachen, 52064 Aachen (Germany)

    2016-09-15

    The problem of calculating the amounts of a coupled nuclide system varying with time, especially when exposed to a neutron flux, is a well-known problem and has been addressed by a number of computer codes. These codes cover a broad spectrum of applications, are based on comprehensive validation work and are therefore justifiably renowned among their users. However, due to their long development history, they lack a modern interface, which impedes a fast and robust internal coupling to other codes applied in the field of nuclear reactor physics. Therefore a project has been initiated to develop a new object-oriented nuclide transmutation code. It comprises an innovative solver based on graph theory, which exploits the topology of nuclide chains and therefore speeds up the calculation scheme. Highest priority has been given to a generic software interface as well as to easy handling by making use of XML files for the user input. In this paper we report on the status of the code development and present first benchmark results, which prove the applicability of the selected approach.
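    The core idea of a topological solver can be illustrated compactly. The sketch below is not the paper's solver: it uses a hypothetical three-nuclide chain with made-up decay constants, orders the nuclide graph with Kahn's algorithm so that every parent is processed before its decay products, and then integrates the depletion equations with a simple explicit Euler step.

```python
# Topological ordering of a (hypothetical) decay chain A -> B -> C,
# followed by explicit-Euler depletion in that order.
from collections import deque

edges = {"A": ["B"], "B": ["C"], "C": []}   # decay graph (C is stable)
lam = {"A": 1e-2, "B": 5e-3, "C": 0.0}      # decay constants in 1/s

def topo_order(edges):
    """Kahn's algorithm: parents always precede their decay products."""
    indeg = {n: 0 for n in edges}
    for n in edges:
        for m in edges[n]:
            indeg[m] += 1
    queue = deque(n for n in edges if indeg[n] == 0)
    order = []
    while queue:
        n = queue.popleft()
        order.append(n)
        for m in edges[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                queue.append(m)
    return order

def deplete(n0, t, steps=100000):
    """Integrate dN_i/dt = -lam_i N_i + sum over parents j of lam_j N_j."""
    order = topo_order(edges)
    n = dict(n0)
    dt = t / steps
    for _ in range(steps):
        loss = {k: lam[k] * n[k] * dt for k in order}  # pre-update losses
        for k in order:
            n[k] -= loss[k]
            for child in edges[k]:
                n[child] += loss[k]                    # feed the daughter
    return n

n = deplete({"A": 1.0, "B": 0.0, "C": 0.0}, t=300.0)
print(n)  # A has decayed to about exp(-3); the total number is conserved
```

    For branching chains the same ordering applies; only cycles (e.g. via neutron capture back-reactions) need special treatment, which is where a dedicated graph solver earns its keep.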

  7. RADHEAT-V4: a code system to generate multigroup constants and analyze radiation transport for shielding safety evaluation

    International Nuclear Information System (INIS)

    Yamano, Naoki; Minami, Kazuyoshi; Koyama, Kinji; Naito, Yoshitaka.

    1989-03-01

    A modular code system RADHEAT-V4 has been developed for performing precisely neutron and photon transport analyses, and shielding safety evaluations. The system consists of the functional modules for producing coupled multi-group neutron and photon cross section sets, for analyzing the neutron and photon transport, and for calculating the atom displacement and the energy deposition due to radiations in nuclear reactor or shielding material. A precise method named Direct Angular Representation (DAR) has been developed for eliminating an error associated with the method of the finite Legendre expansion in evaluating angular distributions of cross sections and radiation fluxes. The DAR method implemented in the code system has been described in detail. To evaluate the accuracy and applicability of the code system, some test calculations on strong anisotropy problems have been performed. From the results, it has been concluded that RADHEAT-V4 is successfully applicable to evaluating shielding problems accurately for fission and fusion reactors and radiation sources. The method employed in the code system is very effective in eliminating negative values and oscillations of angular fluxes in a medium having an anisotropic source or strong streaming. Definitions of the input data required in various options of the code system and the sample problems are also presented. (author)

  8. Development of a new nuclide generation and depletion code using a topological solver based on graph theory

    International Nuclear Information System (INIS)

    Kasselmann, S.; Scholthaus, S.; Rössel, C.; Allelein, H.-J.

    2014-01-01

    The problem of calculating the amounts of a coupled nuclide system varying with time, especially when exposed to a neutron flux, is a well-known problem and has been addressed by a number of computer codes. These codes cover a broad spectrum of applications, are based on comprehensive validation work and are therefore justifiably renowned among their users. However, due to their long development history, they lack a modern interface, which impedes a fast and robust internal coupling to other codes applied in the field of nuclear reactor physics. Therefore a project has been initiated to develop a new object-oriented nuclide transmutation code. It comprises an innovative solver based on graph theory, which exploits the topology of nuclide chains. This makes it possible to always deal with the smallest nuclide system relevant to the problem of interest. Highest priority has been given to a generic software interface as well as to easy handling by making use of XML files for input and output. In this paper we report on the status of the code development and present first benchmark results, which prove the applicability of the selected approach. (author)

  9. Characterization of open-cycle coal-fired MHD generators. Quarterly technical summary report No. 6, October 1--December 31, 1977. [PACKAGE code

    Energy Technology Data Exchange (ETDEWEB)

    Kolb, C.E.; Yousefian, V.; Wormhoudt, J.; Haimes, R.; Martinez-Sanchez, M.; Kerrebrock, J.L.

    1978-01-30

    Research has included theoretical modeling of important plasma chemical effects such as: conductivity reductions due to condensed slag/electron interactions; conductivity and generator efficiency reductions due to the formation of slag-related negative ion species; and the loss of alkali seed due to chemical combination with condensed slag. A summary of the major conclusions in each of these areas is presented. A major output of the modeling effort has been the development of an MHD plasma chemistry core flow model. This model has been formulated into a computer program designated the PACKAGE code (Plasma Analysis, Chemical Kinetics, And Generator Efficiency). The PACKAGE code is designed to calculate the effect of coal rank, ash percentage, ash composition, air preheat temperatures, equivalence ratio, and various generator channel parameters on the overall efficiency of open-cycle, coal-fired MHD generators. A complete description of the PACKAGE code and a preliminary version of the PACKAGE user's manual are included. A laboratory measurements program involving direct, mass spectrometric sampling of the positive and negative ions formed in a one-atmosphere coal combustion plasma was also completed during the contract's initial phase. The relative ion concentrations formed in a plasma due to the methane-augmented combustion of pulverized Montana Rosebud coal with potassium carbonate seed and preheated air are summarized. Positive ions measured include K⁺, KO⁺, Na⁺, Rb⁺, Cs⁺, and CsO⁺, while negative ions identified include PO₃⁻, PO₂⁻, BO₂⁻, OH⁻, SH⁻, and probably HCrO₃⁻, HMoO₄⁻, and HWO₃⁻. A comparison of the measurements with PACKAGE code predictions is presented. Preliminary design considerations for a mass spectrometric sampling probe capable of characterizing coal combustion plasmas from full-scale combustors and flow trains are presented

  10. RADHEAT-V3, a code system for generating coupled neutron and gamma-ray group constants and analyzing radiation transport

    International Nuclear Information System (INIS)

    Koyama, Kinji; Taji, Yukichi; Miyasaka, Shun-ichi; Minami, Kazuyoshi.

    1977-07-01

    The modular code system RADHEAT is designed for producing coupled multigroup neutron and gamma-ray cross-section sets, analyzing the neutron and gamma-ray transport, and calculating the energy deposition and atomic displacements due to these radiations in a nuclear reactor or shield. The basic neutron cross sections and secondary gamma-ray production data are taken from the ENDF/B and POPOP4 libraries, respectively. The system (1) generates multigroup neutron cross sections, energy deposition coefficients and atomic displacement factors due to neutron reactions, (2) generates multigroup gamma-ray cross sections and energy transfer coefficients, (3) generates secondary gamma-ray production cross sections, (4) combines these cross sections into the coupled set, (5) outputs and updates the multigroup cross-section libraries in convenient formats for other transport codes, (6) analyzes the neutron and gamma-ray transport and calculates the energy deposition and the number density of atomic displacements in a medium, (7) collapses the cross sections to a broad-group structure, by option, using the weighting functions obtained by a one-dimensional transport calculation, and (8) plots, by option, multigroup cross sections and neutron and gamma-ray distributions. Definitions of the input data required in the various options of the code system are also given. (auth.)

  11. R2E strategy and activities during LS1

    International Nuclear Information System (INIS)

    Perrot, A.L.

    2012-01-01

    The level of the flux of hadrons with energies in the multi-MeV range expected from the collimation system at Point 7 and from the collisions at interaction Points 1, 5 and 8 will induce Single Event Errors (SEEs) in the standard electronics present in the equipment located around these Points. Such events would perturb LHC operation. As a consequence, within the framework of the R2E (Radiation to Electronics) Mitigation Project, the sensitive equipment will be shielded or relocated to safer areas. These mitigation activities will be performed mainly during Long Shutdown 1 (LS1). About 15 groups (including equipment owners) will be involved in these activities, with work periods ranging from a few days to several months. Some of them will have to work in parallel at several LHC points. This document presents these mitigation activities with their associated planning, organization process, and main concerns as identified today. (author)

  12. ATLAS TDAQ application gateway upgrade during LS1

    CERN Document Server

    KOROL, A; The ATLAS collaboration; BOGDANCHIKOV, A; BRASOLIN, F; CONTESCU, A C; DUBROV, S; HAFEEZ, M; LEE, C J; SCANNICCHIO, D A; TWOMEY, M; VORONKOV, A; ZAYTSEV, A

    2014-01-01

    The ATLAS Gateway service is implemented with a set of dedicated computer nodes to provide fine-grained access control between the CERN General Public Network (GPN) and the ATLAS Technical Control Network (ATCN). ATCN connects the ATLAS online farm used for ATLAS operations and data taking, including the ATLAS TDAQ (Trigger and Data Acquisition) and DCS (Detector Control System) nodes. In particular, it provides restricted access to the web services (proxy), general login sessions (via SSH and RDP protocols), NAT and mail relay from ATCN. At the operating-system level the implementation is based on virtualization technologies. Here we report on the Gateway upgrade during the Long Shutdown 1 (LS1) period: it includes the transition to the latest production release of the CERN Linux distribution (SLC6), the migration to the centralized configuration management system (based on Puppet) and the redesign of the internal system architecture.

  13. Development of next generation code system as an engineering modeling language (5). Investigation on restructuring method of conventional codes into two-layer system

    International Nuclear Information System (INIS)

    Yokoyama, Kenji

    2006-10-01

    A proposed method for gradually restructuring the conventional analysis system into the two-layer system of the next-generation analysis system, called the 'incremental method', was applied and evaluated. The following functions were selected for the evaluation: neutron diffusion calculation for the three-dimensional XYZ system based on the finite-difference method, and input utilities for the cross-section data file used in the conventional system. In order to evaluate the effect of the restructuring, the 'Module Coupling Index (MCI)' and 'McCabe's Cyclomatic Complexity (MCC)' were used to quantify the quality of the modular design and the complexity of the program sequence of each module. Although the MCIs of the modules before restructuring were mainly of degree 6-7, it was possible to reduce them to under degree 4 in most modules by restructuring with the incremental method. It was also found that modules under degree 4 of MCI can easily be combined with different programming languages, which is necessary for building the two-layer system. Meanwhile, the MCCs of most modules before restructuring were over 20, and some were over 50. The incremental method could reduce them to under 10 when C++ was used, and to under 20 when FORTRAN was used. This corresponds to a reduction of the error frequency in modifications from 20-40% to 5-10%. The total MCC could be reduced to 1/3 when C++ was used, and to 1/2 when FORTRAN was used. By using the functions restructured in the present study and some previously developed functions, a reactor analysis tool was systematized and applied to criticality analysis of the Experimental Fast Reactor 'JOYO' MK-I. In addition, the following two functionality-expansion tests were performed: adding a cross-section direct-perturbation function, and adding a control-rod criticality-position search function. In the tests, both functionality expansions were carried out satisfying the condition
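    The McCabe Cyclomatic Complexity used as a metric above can be computed directly from a control-flow graph as M = E - N + 2P (edges, nodes, connected components). The sketch below uses a hypothetical tiny graph, not any module from the paper:

```python
# Cyclomatic complexity M = E - N + 2P from a directed control-flow graph.

def cyclomatic_complexity(edges, nodes):
    """edges: set of (src, dst) pairs; nodes: set of node names."""
    # count weakly connected components with a small union-find
    parent = {n: n for n in nodes}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for a, b in edges:
        parent[find(a)] = find(b)
    p = len({find(n) for n in nodes})
    return len(edges) - len(nodes) + 2 * p

# an if/else followed by a loop: 7 nodes, 8 edges, one component -> M = 3
nodes = {"entry", "cond", "then", "else", "loop", "body", "exit"}
edges = {("entry", "cond"), ("cond", "then"), ("cond", "else"),
         ("then", "loop"), ("else", "loop"),
         ("loop", "body"), ("body", "loop"), ("loop", "exit")}
print(cyclomatic_complexity(edges, nodes))  # -> 3
```

    The result matches the rule of thumb "one plus the number of decision points": the if/else and the loop each add one to the base complexity of one.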

  14. The little-studied cluster Berkeley 90. I. LS III +46 11: a very massive O3.5 If* + O3.5 If* binary

    Science.gov (United States)

    Maíz Apellániz, J.; Negueruela, I.; Barbá, R. H.; Walborn, N. R.; Pellerin, A.; Simón-Díaz, S.; Sota, A.; Marco, A.; Alonso-Santiago, J.; Sanchez Bermudez, J.; Gamen, R. C.; Lorenzo, J.

    2015-07-01

    Context. It appears that most (if not all) massive stars are born in multiple systems. At the same time, the most massive binaries are hard to find owing to their low numbers throughout the Galaxy and the implied large distances and extinctions. Aims: We want to study LS III +46 11, identified in this paper as a very massive binary; another nearby massive system, LS III +46 12; and the surrounding stellar cluster, Berkeley 90. Methods: Most of the data used in this paper are multi-epoch high S/N optical spectra, although we also use Lucky Imaging and archival photometry. The spectra are reduced with dedicated pipelines and processed with our own software, such as a spectroscopic-orbit code, CHORIZOS, and MGB. Results: LS III +46 11 is identified as a new very early O-type spectroscopic binary [O3.5 If* + O3.5 If*] and LS III +46 12 as another early O-type system [O4.5 V((f))]. We measure a 97.2-day period for LS III +46 11 and derive minimum masses of 38.80 ± 0.83 M⊙ and 35.60 ± 0.77 M⊙ for its two stars. We measure the extinction to both stars, estimate the distance, search for optical companions, and study the surrounding cluster. In doing so, a variable extinction is found as well as discrepant results for the distance. We discuss possible explanations and suggest that LS III +46 12 may be a hidden binary system where the companion is currently undetected.
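    The "minimum masses" quoted in this record come from the standard double-lined spectroscopic binary relation (a textbook formula, not reproduced from the paper itself): with velocity semi-amplitudes $K_1$, $K_2$ in km/s, period $P$ in days and eccentricity $e$,

```latex
m_{1,2}\,\sin^3 i \;=\; 1.036 \times 10^{-7}\,(1 - e^2)^{3/2}\,(K_1 + K_2)^2\,K_{2,1}\,P \quad [M_\odot],
```

    so without a constraint on the orbital inclination $i$ only lower limits on the component masses follow, which is why the paper reports 38.80 and 35.60 solar masses as minima.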

  15. Simulation of thermal fluid dynamics in parabolic trough receiver tubes with direct steam generation using the computer code ATHLET

    Energy Technology Data Exchange (ETDEWEB)

    Hoffmann, Alexander; Merk, Bruno [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany); Hirsch, Tobias; Pitz-Paal, Robert [DLR Deutsches Zentrum fuer Luft- und Raumfahrt e.V., Stuttgart (Germany). Inst. fuer Solarforschung

    2014-06-15

    In the present feasibility study the system code ATHLET, which originates from nuclear engineering, is applied to a parabolic trough test facility. A model of the DISS (DIrect Solar Steam) test facility at Plataforma Solar de Almeria in Spain is assembled and the results of the simulations are compared to measured data and the simulation results of the Modelica library 'DissDyn'. A profound comparison between ATHLET Mod 3.0 Cycle A and the 'DissDyn' library reveals the capabilities of these codes. The calculated mass and energy balance in the ATHLET simulations are in good agreement with the results of the measurements and confirm the applicability for thermodynamic simulations of DSG processes in principle. Supplementary, the capabilities of the 6-equation model with transient momentum balances in ATHLET are used to study the slip between liquid and gas phases and to investigate pressure wave oscillations after a sudden valve closure. (orig.)

  16. BOT3P: a mesh generation software package for the transport analysis codes Dort, Tort, Twodant, Threedant and MCNP

    International Nuclear Information System (INIS)

    Orsi, R.

    2003-01-01

    Bot3p consists of a set of standard Fortran 77 language programs that gives the users of the deterministic transport codes Dort and Tort some useful diagnostic tools to prepare and check the geometry of their input data files for both Cartesian and cylindrical geometries including graphical display modules. Bot3p produces at the same time the geometrical and material distribution data for the deterministic transport codes Twodant and Threedant and, only in three-dimensional (3D) Cartesian geometry, for the Monte Carlo Transport Code MCNP. This makes it possible to compare directly for the same geometry the effects stemming from the use of different data libraries and solution approaches on transport analysis results. Through the use of Bot3p, radiation transport problems with complex 3D geometrical structures can be modelled easily, as a relatively small amount of engineer-time is required and refinement is achieved by changing few parameters. This tool is useful for solving very large challenging problems. (author)

  17. SMILEI: A collaborative, open-source, multi-purpose PIC code for the next generation of super-computers

    Science.gov (United States)

    Grech, Mickael; Derouillat, J.; Beck, A.; Chiaramello, M.; Grassi, A.; Niel, F.; Perez, F.; Vinci, T.; Fle, M.; Aunai, N.; Dargent, J.; Plotnikov, I.; Bouchard, G.; Savoini, P.; Riconda, C.

    2016-10-01

    Over the last decades, Particle-In-Cell (PIC) codes have been central tools for plasma simulations. Today, new trends in High-Performance Computing (HPC) are emerging, dramatically changing HPC-relevant software design and putting some - if not most - legacy codes far beyond the level of performance expected on the new and future massively parallel supercomputers. SMILEI is a new open-source PIC code co-developed by plasma physicists and HPC specialists, and applied to a wide range of physics studies: from laser-plasma interaction to astrophysical plasmas. It benefits from an innovative parallelization strategy that relies on a super-domain decomposition allowing for enhanced cache use and efficient dynamic load balancing. Beyond these HPC-related developments, SMILEI also benefits from additional physics modules allowing it to deal with binary collisions, field and collisional ionization, and radiation back-reaction. This poster presents the SMILEI project and its HPC capabilities, and illustrates some of the physics problems tackled with SMILEI.

  18. Nonlinear Time Series Prediction Using LS-SVM with Chaotic Mutation Evolutionary Programming for Parameter Optimization

    International Nuclear Information System (INIS)

    Xu Ruirui; Chen Tianlun; Gao Chengfeng

    2006-01-01

    Nonlinear time series prediction is studied by using an improved least squares support vector machine (LS-SVM) regression based on chaotic mutation evolutionary programming (CMEP) approach for parameter optimization. We analyze how the prediction error varies with different parameters (σ, γ) in LS-SVM. In order to select appropriate parameters for the prediction model, we employ CMEP algorithm. Finally, Nasdaq stock data are predicted by using this LS-SVM regression based on CMEP, and satisfactory results are obtained.
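    The LS-SVM regression referenced above reduces training to a single linear system: with an RBF kernel K and targets y, solve [[0, 1ᵀ], [1, K + I/γ]]·[b; α] = [0; y]. The sketch below implements that textbook formulation with fixed (σ, γ); the CMEP hyper-parameter search from the paper is not reproduced, and the sine-fitting data are illustrative.

```python
# Minimal LS-SVM regression: one linear solve, RBF kernel, fixed (sigma, gamma).
import numpy as np

def rbf_kernel(xa, xb, sigma):
    d2 = (xa[:, None] - xb[None, :]) ** 2
    return np.exp(-d2 / sigma ** 2)

def lssvm_fit(x, y, sigma=0.5, gamma=100.0):
    n = len(x)
    K = rbf_kernel(x, x, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                      # [[0, 1^T],
    A[1:, 0] = 1.0                      #  [1, K + I/gamma]]
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    # predictor f(x) = sum_i alpha_i K(x, x_i) + b
    return lambda xq: rbf_kernel(np.atleast_1d(xq), x, sigma) @ alpha + b

rng = np.random.default_rng(42)
x = np.linspace(0.0, 2.0 * np.pi, 60)
y = np.sin(x) + 0.05 * rng.normal(size=60)
predict = lssvm_fit(x, y)
print(predict(np.pi / 2)[0])    # close to sin(pi/2) = 1
```

    Because every training point becomes a support vector, the quality of the fit hinges entirely on (σ, γ), which is why the paper wraps this solve inside an evolutionary parameter search.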

  19. ANITA-2000 activation code package - updating of the decay data libraries and validation on the experimental data of the 14 MeV Frascati Neutron Generator

    Directory of Open Access Journals (Sweden)

    Frisoni Manuela

    2016-01-01

    ANITA-2000 is a code package for the activation characterization of materials exposed to neutron irradiation, released by ENEA to the OECD-NEADB and ORNL-RSICC. The main component of the package is the activation code ANITA-4M, which computes the radioactive inventory of a material exposed to neutron irradiation. The code requires the decay data library (file fl1) containing the quantities describing the decay properties of the unstable nuclides and the library (file fl2) containing the gamma-ray spectra emitted by the radioactive nuclei. The fl1 and fl2 files of the ANITA-2000 code package, originally based on the evaluated nuclear data library FENDL/D-2.0, were recently updated on the basis of the JEFF-3.1.1 Radioactive Decay Data Library. This paper presents the results of the validation of the new fl1 decay data library through the comparison of the ANITA-4M calculated values with the measured electron and photon decay heats and activities of fusion material samples irradiated at the 14 MeV Frascati Neutron Generator (FNG) of the ENEA Frascati Research Centre. Twelve material samples were considered, namely: Mo, Cu, Hf, Mg, Ni, Cd, Sn, Re, Ti, W, Ag and Al. The ratios between calculated and experimental values (C/E) are shown and discussed in this paper.

  20. CMS DAQ current and future hardware upgrades up to post Long Shutdown 3 (LS3) times

    CERN Document Server

    Racz, Attila; Behrens, Ulf; Branson, James; Chaze, Olivier; Cittolin, Sergio; Contescu, Cristian; da Silva Gomes, Diego; Darlea, Georgiana-Lavinia; Deldicque, Christian; Demiragli, Zeynep; Dobson, Marc; Doualot, Nicolas; Erhan, Samim; Fulcher, Jonathan Richard; Gigi, Dominique; Gladki, Maciej; Glege, Frank; Gomez-Ceballos, Guillelmo; Hegeman, Jeroen; Holzner, Andre; Janulis, Mindaugas; Lettrich, Michael; Meijers, Frans; Meschi, Emilio; Mommsen, Remigius K; Morovic, Srecko; O'Dell, Vivian; Orn, Samuel Johan; Orsini, Luciano; Papakrivopoulos, Ioannis; Paus, Christoph; Petrova, Petia; Petrucci, Andrea; Pieri, Marco; Rabady, Dinyar; Reis, Thomas; Sakulin, Hannes; Schwick, Christoph; Simelevicius, Dainius; Vazquez Velez, Cristina; Vougioukas, Michail; Zejdl, Petr

    2017-01-01

    Following the first LHC collisions seen and recorded by CMS in 2009, the DAQ hardware went through a major upgrade during LS1 (2013-2014), and new detectors have been connected during the 2015-2016 and 2016-2017 winter shutdowns. Now, LS2 (2019-2020) and LS3 (2024-mid 2026) are actively being prepared. This paper shows how the CMS DAQ hardware has evolved from the beginning and will continue to evolve in order to meet the future challenges posed by the High Luminosity LHC (HL-LHC) and the CMS detector evolution. In particular, post-LS3 DAQ architectures are focused upon.

  1. LS1 general planning and strategy for the LHC, LHC injectors

    International Nuclear Information System (INIS)

    Foraz, K.

    2012-01-01

    The goal of Long Shutdown 1 (LS1) is to perform the full maintenance of equipment and the necessary consolidation and upgrade activities in order to ensure reliable LHC operation at nominal performance from mid-2014. LS1 is scheduled to last 20 months. LS1 not only concerns the LHC but also its injectors. To ensure that resources will be available, an analysis is in progress to detect conflicts and overloads and to decide what is compulsory, what we can afford, and what can be postponed until LS2. The strategy, key time drivers, constraints, and draft schedule are presented here. (author)

  2. Generation of multigroup cross-sections from micro-group ones in code system SUHAM-U used for VVER-1000 reactor core calculations with MOX loading

    Energy Technology Data Exchange (ETDEWEB)

    Boyarinov, V.F.; Davidenko, V.D.; Polismakov, A.A.; Tsybulsky, V.F. [RRC Kurchatov Institute, Moscow (Russian Federation)

    2005-07-01

    At the present time, the new code system SUHAM-U for calculation of the neutron-physical processes in nuclear reactor cores with triangular and square lattices is being elaborated, based both on the modern micro-group (about 7000 groups) cross-section library of the code system UNK and on solving the multigroup (up to 89 groups) neutron transport equation by the Surface Harmonics Method. In this paper the procedure for generating multigroup cross-sections from micro-group ones for the calculation of a VVER-1000 reactor core with MOX loading is described. The validation consisted of computing VVER-1000 fuel assemblies with uranium and MOX fuel and showed sufficiently high accuracy given an appropriate selection of the number and boundaries of the energy groups. This work was carried out in the framework of the ISTC project 'System Analyses of Nuclear Safety for VVER Reactors with MOX Fuels'.
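    The micro-group-to-multigroup condensation described here follows the standard flux-weighted collapse: for each broad group G, σ_G = Σ_g σ_g φ_g / Σ_g φ_g over the fine groups g in G. The sketch below uses illustrative numbers, not the SUHAM-U/UNK data:

```python
# Flux-weighted collapse of fine-group cross sections to broad groups.

def collapse(sigma_fine, phi_fine, boundaries):
    """Condense fine-group cross sections to broad groups.

    boundaries: (start, stop) fine-group index ranges, one per broad group.
    """
    out = []
    for start, stop in boundaries:
        flux = sum(phi_fine[start:stop])
        # reaction-rate-preserving weighting: sum(sigma_g * phi_g) / sum(phi_g)
        reaction = sum(s * p for s, p in zip(sigma_fine[start:stop],
                                             phi_fine[start:stop]))
        out.append(reaction / flux)
    return out

# hypothetical 6 fine groups condensed into 2 broad groups
sigma = [1.2, 1.0, 0.8, 2.0, 4.0, 8.0]
phi   = [0.5, 1.0, 1.5, 1.0, 0.5, 0.1]
print(collapse(sigma, phi, [(0, 3), (3, 6)]))
```

    The weighting preserves the reaction rate σφ of the fine-group solution, which is why the choice of the weighting spectrum φ (and of the group boundaries) drives the accuracy discussed in the abstract.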

  3. Further development of KAVERN and code development on gas generation from the containment basement during concrete decomposition

    International Nuclear Information System (INIS)

    Schwarzott, W.; Artnik, J.; Hassmann, K.; Kemner, H.; Stuckenberg, X.

    1983-04-01

    The events during the melt/concrete interaction, e.g. the shape of the cavity and the mass and energy of the gases released to the containment atmosphere, can be analysed with the computer code KAVERN. In the case of basaltic concrete, sump water contacts the melt surface after 7 hours. Overpressurization of the containment is calculated to occur after approx. 5 days. For different paths out of the reactor cavity to the containment atmosphere, STROMI calculates the mass flow of the gases released during the melt/concrete interaction. Results show maximum temperatures of up to 1200 °C, which is well above the self-ignition temperature of H₂. (orig.) [de]

  4. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is formed by the mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  5. Estimation of reactor core calculation by HELIOS/MASTER at power generating condition through DeCART, whole-core transport code

    International Nuclear Information System (INIS)

    Kim, H. Y.; Joo, H. G.; Kim, K. S.; Kim, G. Y.; Jang, M. H.

    2003-01-01

    The reactivity and power distribution errors of the HELIOS/MASTER core calculation under power generating conditions are assessed using the whole-core transport code DeCART. For this work, cross-section table sets were generated for a medium-sized PWR following the standard procedure, and two-group nodal core calculations were performed. The test cases include HELIOS calculations for 2-D assemblies at constant thermal conditions, MASTER 3-D assembly calculations at power generating conditions, and core calculations at HZP, HFP, and an abnormal power condition. In all these cases, the results of the DeCART code, in which pinwise thermal feedback effects are incorporated, are used as the reference. The core reactivity, assemblywise power distribution, axial power distribution, peaking factor, and thermal feedback effects are then compared. The comparison shows that the errors of the HELIOS/MASTER system in the core reactivity, assemblywise power distribution, and pin peaking factor are only 100∼300 pcm, 3%, and 2%, respectively. As far as the detailed pinwise power distribution is concerned, however, errors greater than 15% are observed.

  6. LS1 Report: antimatter research on the starting blocks

    CERN Multimedia

    Antonella Del Rosso

    2014-01-01

    The consolidation work at the Antiproton Decelerator (AD) has been very intensive and the operators now have a basically new machine to “drive”. Thanks to the accurate preparation work still ongoing, the machine will soon deliver its first beam of antiprotons to the experiments. The renewed efficiency of the whole complex will ensure the best performance of the whole of CERN’s antimatter research programme in the long term.   The test bench for the new Magnetic Horn stripline. On the left, high voltage cables are connected to the stripline, which then feeds a 6 kV 400 kA pulse to the Horn. The Horn itself (the cylindrical object on the right) can be seen mounted on its chariot. The consolidation programme at the AD planned during LS1 has involved some of the most vital parts of the decelerator such as the target area, the ring magnets, the stochastic cooling system, vacuum system, control system and various aspects of the instrumentation. In addit...

  7. QPS upgrade and machine protection during LS1

    International Nuclear Information System (INIS)

    Denz, R.

    2012-01-01

    The presentation will explain all the proposed changes and discuss the impact on other shutdown activities. The upgrade of the LHC Quench Protection System QPS during LS1 with respect to radiation to electronics will concern the re-location of equipment and installation of new radiation tolerant hardware. The mid-term plan for further R2E upgrades will be addressed. The protection systems for insertion region magnets and inner triplets will be equipped with a dedicated bus-bar splice supervision including some additional modifications in order to improve the EMC immunity. The extension of the supervision capabilities of the QPS will concern the quench heater circuits, the earth voltage feelers and some tools to ease the system maintenance. The protection of the undulators will be revised in order to allow more transparent operation. The installation of snubber capacitors and arc chambers for the main quad circuits will complete the upgrade of the energy extraction systems. Finally the re-commissioning of the protection systems prior to the powering tests will be addressed. (author)

  8. LS1 Report: the electric atmosphere of the LHC

    CERN Multimedia

    Simon Baird

    2013-01-01

    In the LHC, testing of the main magnet (dipole and quadrupole) circuits has been completed. At the same time, the extensive tests of all the other circuits up to current levels corresponding to 7 TeV beam operation have been performed, and now the final ElQA (Electrical Quality Assurance) tests of the electrical circuits are proceeding.   In Sectors 4-5 and 5-6, where the ElQA checks have been finished, the process of removing and storing the helium has started (see the article Heatwave warning for the LHC, in this issue). This is the first step in warming up the whole machine to room temperature so that the main LS1 activities, SMACC (Super Conducting Magnet and Circuit Consolidation) and the R2E (Radiation to Electronics) programmes, which are scheduled to start on 19 April and 22 March respectively, can get under way. As far as the LHC injectors are concerned, LINAC2 and the PS Booster are in shutdown mode, having completed their preparatory hardware test programmes, and shutdown work has alr...

  9. LS1 Report: on the home straight in 2014

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    At 7.24 a.m. on 14 February 2013 the last beams for physics were absorbed into the LHC, marking the end of Run 1. The achievements since then over the first ten months of LS1 have been remarkable. The excellent progress of the maintenance work on CERN's accelerators, which is overwhelmingly on schedule – and even ahead of schedule in some cases! – was praised by the CERN Council last week.   That being said, there is still a long way to go before the LHC re-start, with many challenges and potential pitfalls to be overcome. An overview of what still lies ahead: For the injectors (Linac 2, PS booster, LEIR, PS and AD), 2014 will begin with the recommissioning of all the access systems (scheduled for mid-February). The first power tests (to check the magnets and the power converters) will follow hot on its heels, starting in early April in the case of the PS booster. The final power tests of the injectors will be carried out at the Antiproton Decelerator in June. Th...

  10. LS1 Report: PS Booster prepares for beam

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    With Linac2 already up and running, the countdown to beam in the LHC has begun! The next in line is the PS Booster, which will close up shop to engineers early next week. The injector will be handed over to the Operations Group who are tasked with getting it ready for active duty.   Taken as we approach the end of LS1 activities, this image shows where protons will soon be injected from Linac2 into the four PS Booster rings. Over the coming two months, the Operations Group will be putting the Booster's new elements through their paces. "Because of the wide range of upgrades and repairs carried out in the Booster, we have a very full schedule of tests planned for the machine," says Bettina Mikulec, PS Booster Engineer in Charge. "We will begin with cold checks; these are a wide range of tests carried out without beam, including system tests with power on/off and with varying settings, as well as verification of the controls system and timings." Amon...

  11. Development of PARA-ID Code to Simulate Inelastic Constitutive Equations and Their Parameter Identifications for the Next Generation Reactor Designs

    International Nuclear Information System (INIS)

    Koo, Gyeong Hoi; Lee, J. H.

    2006-03-01

    The establishment of inelastic analysis technology is an essential issue for the development of next generation reactors subjected to elevated temperature operation. In this report, a review of constitutive equations from the standpoint of ratcheting and creep-fatigue analysis is carried out, and methods for extracting the constitutive parameters from experimental data are established. To perform simulations for each constitutive model, the PARA-ID (PARAmeter-IDentification) computer program was developed. Using this code, various simulations related to parameter identification of the constitutive models are carried out.

  12. Suggested Grid Code Modifications to Ensure Wide-Scale Adoption of Photovoltaic Energy in Distributed Power Generation Systems

    DEFF Research Database (Denmark)

    Yang, Yongheng; Enjeti, Prasad; Blaabjerg, Frede

    2013-01-01

    Current grid standards largely require low-power (e.g. several kilowatts) single-phase photovoltaic (PV) systems to operate at unity power factor with maximum power point tracking, and to disconnect from the grid under grid faults. However, in the case of a wide-scale penetration of single-phase PV systems in the distribution grid, disconnection under grid faults can contribute to: a) voltage flicker, b) power outages, and c) system instability. In this paper, grid code modifications are explored for wide-scale adoption of PV systems in the distribution grid. Recently, Italy and Japan have undertaken a major review of standards for PV power conversion systems connected to low-voltage networks. In view of this, the importance of low-voltage ride-through for single-phase PV power systems under grid faults, along with reactive power injection, is studied in this paper. Three...
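As a rough illustration of the low-voltage ride-through behavior discussed above, several grid codes require the inverter to inject reactive current in proportion to the voltage sag. The sketch below encodes one hypothetical profile; the gain k = 2, the 0.9 p.u. dead-band, and the function name are assumptions chosen for illustration, not values taken from this paper or from any particular standard.

```python
def reactive_current_ratio(v_pu, k=2.0, dead_band=0.9):
    """Hypothetical LVRT requirement: below the dead-band voltage,
    inject k p.u. of reactive current per p.u. of voltage sag,
    saturating at the converter's 1.0 p.u. current limit."""
    if v_pu >= dead_band:
        return 0.0  # normal operation: unity power factor
    return min(1.0, k * (dead_band - v_pu))

# At 0.5 p.u. grid voltage this profile demands 0.8 p.u. reactive current;
# below 0.4 p.u. the demand saturates at the full current rating.
```

A real grid code would additionally specify response times and the trade-off against active power during the fault.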

  13. On-line generation of the core monitoring power distribution in SCOMS coupled with the core design code

    International Nuclear Information System (INIS)

    Lee, K. B.; Kim, K. K.; In, W. K.; Ji, S. K.; Jang, M. H.

    2002-01-01

    This paper describes the methodology and the main program module for power distribution calculation in SCOMS (SMART COre Monitoring System). Simulation results for the SMART core using the developed SCOMS are included. The planar radial peaking factor (Fxy) is relatively high in the SMART core because control banks are inserted into the core during normal operation. If the conventional core monitoring method were adopted for SMART, the highly skewed planar radial peaking factor Fxy would yield excessive conservatism and reduce the operating margin. In addition, the error of the core monitoring would be enlarged and the operating margin thus degraded, because it is impossible to precalculate, at the design stage, core monitoring constants for all control bank configurations taking the operating history into account. To eliminate these drawbacks of the conventional power distribution calculation methodology, a new methodology for calculating the three-dimensional power distribution was developed. Core monitoring constants are calculated with the core design code (MASTER), which is on-line coupled with SCOMS. The three-dimensional (3D) power distribution and the various peaking factors are calculated in real time from the in-core detector signals and the supplied core monitoring constants. The developed methodology was applied to the SMART core and various core states were simulated. Based on the simulation results, it is found that the three-dimensional peaking factor used to calculate the linear power density and the pseudo hot-pin axial power distribution used to calculate the departure from nucleate boiling ratio are more conservative than those of the best-estimate core design code, and that SCOMS with the developed methodology secures more operating margin than the conventional methodology.
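The online synthesis described above can be caricatured as a linear mapping from in-core detector signals to nodal powers through monitoring constants supplied by the coupled design code. The sketch below illustrates only that general idea; the matrix values, signal values, and variable names are invented and do not represent the actual SCOMS algorithm.

```python
# Hypothetical monitoring constants: each row maps the in-core detector
# signals to the power of one core node. In SCOMS these constants would
# be recomputed on-line by the coupled design code (MASTER); the numbers
# here are purely illustrative.
monitoring_constants = [
    [0.6, 0.2, 0.2],
    [0.2, 0.6, 0.2],
    [0.2, 0.2, 0.6],
]

detector_signals = [1.02, 0.98, 1.05]  # normalized detector readings

# Nodal powers as the matrix-vector product of constants and signals.
node_powers = [
    sum(w * s for w, s in zip(row, detector_signals))
    for row in monitoring_constants
]

# A peaking factor: hottest node relative to the core-average power.
peaking_factor = max(node_powers) / (sum(node_powers) / len(node_powers))
```

In the real system the constants depend on the current control bank configuration and operating history, which is exactly why they must be generated online rather than precalculated.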

  14. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D/3D finite-element calculation code for structures developed by the R&D division of Electricité de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (material behaviour, large deformations, specific loads, indicators of unloading and loss of load proportionality, global algorithm, contact and friction); fracture mechanics (energy release rate G, energy release rate in thermo-elasto-plasticity, local 3D energy release rate, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and random dynamics, non-linear dynamics, dynamic sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and the metal industry (structural transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  15. Generation of the library of neutron cross sections for the Record code of the Fuel Management System (FMS); Generacion de la biblioteca de secciones eficaces de neutrones para el codigo Record del Sistema de Administracion de Combustible (FMS)

    Energy Technology Data Exchange (ETDEWEB)

    Alonso V, G; Hernandez L, H [ININ, 52045 Ocoyoacac, Estado de Mexico (Mexico)

    1991-11-15

    Based on the library structure of the RECORD code, a method to generate the neutron cross sections from the ENDF/B-IV database using the NJOY code has been developed. The obtained cross sections are compared with those of the current library, which was processed from the ENDF/B-III version. (Author)

  16. Status of reactor physics activities on cross section generation and functionalization for the prismatic very high temperature reactor, and development of spatially-heterogeneous codes

    International Nuclear Information System (INIS)

    Lee, C. H.; Zhong, Z.; Taiwo, T. A.; Yang, W. S.; Smith, M. A.; Palmiotti, G.

    2006-01-01

    The cross section generation methodology and procedure for design and analysis of the prismatic Very High Temperature Gas-cooled Reactor (VHTR) core have been addressed for the DRAGON and REBUS-3/DIF3D code suite. Approaches for tabulation and functionalization of cross sections have been investigated and implemented. The cross sections are provided at different burnup and fuel and moderator temperature states. In the tabulation approach, the multigroup cross sections are tabulated as a function of the state variables so that a cross section file is able to cover the range of core operating conditions. Cross sections for points between tabulated data points are fitted simply by linear interpolation. For the functionalization approach, the applicability of quadratic polynomials and linear coupling for fuel and moderator temperature changes has been investigated, based on the observation that cross sections change monotonically with fuel or moderator temperature. Preliminary results show that the functionalization makes it possible to cover a wide range of operating temperature conditions with only six sets of data per burnup, while maintaining good accuracy and significantly reducing the size of the cross section file. In these approaches, the number of fission products has been reduced to a few nuclides (I/Xe/Pm/Sm and a lumped fission product) to cut the overall computation time without sacrificing solution accuracy. Discontinuity factors (DFs) based on nodal equivalence theory have been introduced to accurately represent the significant change in neutron spectrum at the interface of the fuel and reflector regions as well as between different fuel blocks (e.g., fuel elements with burnable poisons or control rods). Using the DRAGON code, procedures have been established for generating cross sections for fuel and reflector blocks with and without control absorbers. The preliminary results indicate that the solution accuracy is improved.
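The tabulation approach described above reduces, per state variable, to piecewise-linear interpolation between tabulated points. Below is a minimal one-dimensional sketch of such a lookup; the temperature grid and cross-section values are invented for illustration and are not VHTR data.

```python
import bisect

def interp1(xs, ys, x):
    """Piecewise-linear interpolation in the table (xs, ys),
    clamped to the end values outside the tabulated range."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, x) - 1          # left bracketing index
    t = (x - xs[i]) / (xs[i + 1] - xs[i])       # fractional position
    return ys[i] * (1 - t) + ys[i + 1] * t

# Hypothetical table: a one-group absorption cross section versus fuel
# temperature (K) at a fixed burnup point; the values are invented.
t_fuel = [600.0, 900.0, 1200.0]
sigma_a = [1.20, 1.26, 1.31]

sigma = interp1(t_fuel, sigma_a, 1000.0)
```

The full scheme in the paper does this over several state variables at once (burnup, fuel temperature, moderator temperature), so a real lookup is a nested, multilinear version of this routine.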

  17. Generation of point isotropic source dose buildup factor data for the PFBR special concretes in a form compatible for usage in point kernel computer code QAD-CGGP

    International Nuclear Information System (INIS)

    Radhakrishnan, G.

    2003-01-01

    Full text: Around the PFBR (Prototype Fast Breeder Reactor) reactor assembly, special concretes of density 2.4 g/cm³ and 3.6 g/cm³ are to be used in complex geometrical shapes in the peripheral shields. A point-kernel computer code like QAD-CGGP, written for complex shield geometries, comes in handy for the shield design optimization of the peripheral shields. QAD-CGGP requires a database of buildup factor data, which currently covers only ordinary concrete of density 2.3 g/cm³. In order to extend the database to the PFBR special concretes, point isotropic source dose buildup factors have been generated by the Monte Carlo method using the computer code MCNP-4A. For the above-mentioned special concretes, buildup factor data have been generated in the energy range 0.5 MeV to 10.0 MeV for thicknesses ranging from 1 mean free path (mfp) to 40 mfp. A Capo's formula fit of the buildup factor data, compatible with QAD-CGGP, has been attempted
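The point-kernel method implemented by codes such as QAD-CGGP combines exponential attenuation from a point source with a dose buildup factor B(μr). The sketch below shows that kernel; the attenuation coefficient, source strength, and the linear buildup fit are illustrative assumptions, not PFBR concrete data (actual Capo fits are polynomial forms in μr).

```python
import math

def point_kernel_flux(source_strength, mu, r, buildup):
    """Photon flux at distance r (cm) from an isotropic point source:
    geometric spreading times exponential attenuation (attenuation
    coefficient mu, 1/cm) times a dose buildup factor B(mu*r)."""
    mfp = mu * r  # shield thickness in mean free paths
    return source_strength * buildup(mfp) * math.exp(-mfp) / (4.0 * math.pi * r * r)

# Hypothetical buildup fit B(mfp) = 1 + a*mfp; the coefficient is
# invented, standing in for a Capo-style polynomial fit to MCNP data.
def linear_buildup(mfp):
    return 1.0 + 0.8 * mfp

flux = point_kernel_flux(1e10, mu=0.15, r=30.0, buildup=linear_buildup)
```

For a distributed source, a code like QAD-CGGP sums this kernel over many source points, which is why an accurate buildup fit per shield material matters.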

  18. Report on the generation of the nuclear bank 'L1PG9121' for the 'collapsed' SVEA-96 assembly for the FCS-II program with the FMS codes

    International Nuclear Information System (INIS)

    Alonso V, G.

    1992-01-01

    This work describes in general terms how the collapsed nuclear data bank for the SVEA-96 fuel for Laguna Verde was generated. The bank was produced with the ECLIPSE 86-2D, RECORD 89-1A and POLGEN 88-1B codes of the FMS package installed on the VAX system at the office of the National Commission of Nuclear Safety and Safeguards in Mexico City. The resulting bank is designated 'L1PG9121'. All of this was carried out following procedure '6F3/I/CN029/90/P1'. To generate the bank, the two RECORD 'cells' that make up the assembly were 'collapsed' into a single one, representing the complete assembly with respect to the distribution of fuel rods and enrichment. The collapsing of the assembly was done by averaging the UO2 and Gd2O3 content of each fuel rod over the assembly. In this way the x-y array of fuel rods is preserved, but a single RECORD cell representative of the whole assembly is obtained, and this is the cell studied. In accordance with the nuclear data requirements of FCS-II, the nuclear data generated with RECORD were only of the type defined as series 1 in the nuclear bank generation procedure '6F3/I/CN029/90/P1'; that is, nuclear data were generated only as a function of fuel burnup and of void fraction in the fuel cell. Although the nuclear bank (L1PG9121) was generated under these conditions, data of the type defined as series 2, with the control rod present, were also generated for possible reactor analyses under those conditions. (Author)

  19. Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flow distribution among parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure drop or flowrate conditions, constant or varying in time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code; its complement, FLID, is a one-channel, two-dimensional code. (authors)

  20. Detailed study of spontaneous rotation generation in diverted H-mode plasma using the full-f gyrokinetic code XGC1

    Science.gov (United States)

    Seo, Janghoon; Chang, C. S.; Ku, S.; Kwon, J. M.; Yoon, E. S.

    2013-10-01

    The full-f gyrokinetic code XGC1 is used to study the details of toroidal momentum generation in H-mode plasma. A diverted DIII-D geometry is used, with Monte Carlo neutral particles recycled at the limiter wall. Nonlinear Coulomb collisions conserve particles, momentum, and energy. Gyrokinetic ions and adiabatic electrons are used in the present simulation to include the effects of ion gyrokinetic turbulence and neoclassical physics, under self-consistent radial electric field generation. Ion orbit loss physics is automatically included. Simulations show a strong co-Ip flow in the H-mode layer at the outboard midplane, similar to experimental observations from DIII-D and ASDEX-U. The co-Ip flow at the edge propagates inward into the core. It is found that the strong co-Ip flow generation comes mostly from neoclassical physics, while the inward momentum transport comes from turbulence physics, consistent with the theory of residual stress from symmetry breaking. Interaction between neoclassical and turbulence physics is therefore a key factor in spontaneous momentum generation.

  1. Hypertension Knowledge-Level Scale (HK-LS): A Study on Development, Validity and Reliability

    Directory of Open Access Journals (Sweden)

    Cemalettin Kalyoncu

    2012-03-01

    This study was conducted to develop a scale to measure knowledge about hypertension among Turkish adults. The Hypertension Knowledge-Level Scale (HK-LS) was generated based on content, face, and construct validity, internal consistency, test re-test reliability, and discriminative validity procedures. The final scale had 22 items with six sub-dimensions. The scale was applied to 457 individuals aged ≥18 years, and 414 of them were re-evaluated for test-retest reliability. The six sub-dimensions encompassed 60.3% of the total variance. Cronbach alpha coefficients were 0.82 for the entire scale and 0.92, 0.59, 0.67, 0.77, 0.72, and 0.76 for the sub-dimensions of definition, medical treatment, drug compliance, lifestyle, diet, and complications, respectively. The scale ensured internal consistency in reliability and construct validity, as well as stability over time. Significant relationships were found between knowledge score and age, gender, educational level, and history of hypertension of the participants. No correlation was found between knowledge score and working at an income-generating job. The present scale, developed to measure the knowledge level of hypertension among Turkish adults, was found to be valid and reliable.

  2. Hypertension Knowledge-Level Scale (HK-LS): a study on development, validity and reliability.

    Science.gov (United States)

    Erkoc, Sultan Baliz; Isikli, Burhanettin; Metintas, Selma; Kalyoncu, Cemalettin

    2012-03-01

    This study was conducted to develop a scale to measure knowledge about hypertension among Turkish adults. The Hypertension Knowledge-Level Scale (HK-LS) was generated based on content, face, and construct validity, internal consistency, test re-test reliability, and discriminative validity procedures. The final scale had 22 items with six sub-dimensions. The scale was applied to 457 individuals aged ≥ 18 years, and 414 of them were re-evaluated for test-retest reliability. The six sub-dimensions encompassed 60.3% of the total variance. Cronbach alpha coefficients were 0.82 for the entire scale and 0.92, 0.59, 0.67, 0.77, 0.72, and 0.76 for the sub-dimensions of definition, medical treatment, drug compliance, lifestyle, diet, and complications, respectively. The scale ensured internal consistency in reliability and construct validity, as well as stability over time. Significant relationships were found between knowledge score and age, gender, educational level, and history of hypertension of the participants. No correlation was found between knowledge score and working at an income-generating job. The present scale, developed to measure the knowledge level of hypertension among Turkish adults, was found to be valid and reliable.
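Cronbach's alpha, the internal-consistency statistic reported above, is computed from the item variances and the variance of the total score: alpha = k/(k-1) * (1 - sum of item variances / variance of total score). A minimal sketch, with an invented response matrix standing in for real questionnaire data:

```python
def cronbach_alpha(items):
    """items: one list of scores per scale item (same respondents in
    the same order). Returns Cronbach's alpha using population
    variances, i.e. k/(k-1) * (1 - sum(var_i) / var(total))."""
    k = len(items)               # number of items
    n = len(items[0])            # number of respondents

    def var(xs):                 # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Hypothetical responses: 3 knowledge items scored 0/1 by 4 respondents.
scores = [
    [1, 0, 1, 1],
    [1, 0, 1, 0],
    [1, 1, 1, 0],
]
alpha = cronbach_alpha(scores)  # 0.5625 for this toy matrix
```

With 22 items and 457 respondents, the same computation yields the 0.82 reported for the full scale.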

  3. Software design and code generation for the engineering graphical user interface of the ASTRI SST-2M prototype for the Cherenkov Telescope Array

    Science.gov (United States)

    Tanci, Claudio; Tosti, Gino; Antolini, Elisa; Gambini, Giorgio F.; Bruno, Pietro; Canestrari, Rodolfo; Conforti, Vito; Lombardi, Saverio; Russo, Federico; Sangiorgi, Pierluca; Scuderi, Salvatore

    2016-08-01

    ASTRI is an on-going project developed in the framework of the Cherenkov Telescope Array (CTA). An end-to-end prototype of a dual-mirror small-size telescope (SST-2M) has been installed at the INAF observing station on Mt. Etna, Italy. The next step is the development of the ASTRI mini-array, composed of nine ASTRI SST-2M telescopes, proposed for installation at the CTA southern site. The ASTRI mini-array is a collaborative international effort carried out by Italy, Brazil and South Africa and led by the Italian National Institute of Astrophysics, INAF. To control the ASTRI telescopes, a dedicated ASTRI Mini-Array Software System (MASS) was designed with a scalable and distributed architecture to monitor all the hardware devices of the telescopes. Using code generation, we automatically built from the ASTRI Interface Control Documents a set of communication libraries and extensive graphical user interfaces that provide full access to the capabilities offered by the telescope hardware subsystems for testing and maintenance. Leveraging these generated libraries and components, we then implemented a human-designed, integrated engineering GUI for MASS to perform the verification of the whole prototype and to test shared services such as alarms, configurations, control systems, and scientific on-line outcomes. In our experience, the use of code generation dramatically reduced the development, integration and testing effort for the more basic software components and resulted in a fast software release life cycle. This approach could be valuable for the whole CTA project, which is characterized by a large diversity of hardware components.
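As a toy illustration of the template-driven code generation described above, accessor stubs can be generated mechanically from a machine-readable interface description. The "ICD" fragment, template, and names below are invented for the sketch and bear no relation to the actual ASTRI ICD format or the MASS libraries.

```python
# Toy interface control "document": a device name plus its monitor
# points. Real ICDs are far richer; this only sketches the idea of
# deriving code from a declarative interface description.
icd = {
    "device": "ActiveMirror",
    "monitor_points": ["actuator_position", "panel_temperature"],
}

# One template instantiated per monitor point.
template = (
    "def get_{point}(client):\n"
    '    """Auto-generated accessor for {device}.{point}."""\n'
    '    return client.read("{device}/{point}")\n'
)

generated = "\n".join(
    template.format(device=icd["device"], point=p)
    for p in icd["monitor_points"]
)
```

Because every accessor is derived from the same template, a change to the ICD regenerates all the client code consistently, which is the maintenance benefit the abstract reports.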

  4. Comparison of electron dose-point kernels in water generated by the Monte Carlo codes, PENELOPE, GEANT4, MCNPX, and ETRAN.

    Science.gov (United States)

    Uusijärvi, Helena; Chouin, Nicolas; Bernhardt, Peter; Ferrer, Ludovic; Bardiès, Manuel; Forssell-Aronsson, Eva

    2009-08-01

    Point kernels describe the energy deposited at a certain distance from an isotropic point source and are useful for nuclear medicine dosimetry. They can be used for absorbed-dose calculations for sources of various shapes and are also a useful tool when comparing different Monte Carlo (MC) codes. The aim of this study was to compare point kernels calculated using the mixed MC code PENELOPE (v. 2006) with point kernels calculated using the condensed-history MC codes ETRAN, GEANT4 (v. 8.2), and MCNPX (v. 2.5.0). Point kernels for electrons with initial energies of 10 keV, 100 keV, 500 keV, and 1 MeV were simulated with PENELOPE. Spherical shells were placed around an isotropic point source at distances from 0 to 1.2 times the continuous-slowing-down-approximation range (R(CSDA)). Detailed (event-by-event) simulations were performed for electrons with initial energies below 1 MeV. For 1-MeV electrons, multiple scattering was included for energy losses below 10 keV, while energy losses greater than 10 keV were simulated in a detailed way. The point kernels generated were used to calculate cellular S-values for monoenergetic electron sources. The point kernels obtained with PENELOPE and ETRAN were also used to calculate cellular S-values for the high-energy beta emitter 90Y, the medium-energy beta emitter 177Lu, and the low-energy electron emitter 103mRh. These S-values were also compared with the Medical Internal Radiation Dose (MIRD) cellular S-values. The greatest differences between the point kernels (mean difference calculated over distances) were 1.4%, 2.5%, and 6.9% for ETRAN, GEANT4, and MCNPX, respectively, compared to PENELOPE, omitting the S-values for activity distributed on the cell surface for 10-keV electrons. The largest difference between the cellular S-values for the radionuclides, between PENELOPE and ETRAN, was seen for 177Lu (1.2%). There were large differences between the MIRD cellular S-values and those obtained from

  5. Biodegradation test of SPS-LS blends as polymer electrolyte membrane fuel cells

    International Nuclear Information System (INIS)

    Putri, Zufira; Arcana, I Made

    2014-01-01

    Sulfonated polystyrene (SPS) can be applied as a proton exchange membrane for fuel cells due to its fairly good chemical stability. In order to be applied as a polymer electrolyte membrane fuel cell (PEMFC) material, a polymer membrane should have good ionic conductivity, high proton conductivity, and high mechanical strength. Lignosulfonate (LS) is a complex biopolymer which has crosslinks and sulfonate groups. SPS-LS blends with added SiO2 are used to increase the proton conductivity and to improve the mechanical properties and thermal stability. However, a biodegradation test of the SPS-LS blends is required to determine whether these membranes can be applied as environmentally friendly membranes. In this study, the synthesis of SPS and biodegradability tests of SPS-LS blends with various LS and SiO2 compositions were carried out. The biodegradation test was performed in solid Luria Bertani (LB) medium with activated sludge as the source of microorganisms at an incubation temperature of 37°C. The results indicate that SPS-LS-SiO2 blends are more readily decomposed by microorganisms than SPS-LS blends. This result is supported by analysis of the weight-loss percentage, of functional groups by Fourier Transform Infrared (FTIR) spectroscopy, and of the surface morphology by Scanning Electron Microscopy (SEM)

  6. Biodegradation test of SPS-LS blends as polymer electrolyte membrane fuel cells

    Energy Technology Data Exchange (ETDEWEB)

    Putri, Zufira, E-mail: zufira.putri@gmail.com, E-mail: arcana@chem.itb.ac.id; Arcana, I Made, E-mail: zufira.putri@gmail.com, E-mail: arcana@chem.itb.ac.id [Inorganic and Physical Chemistry Research Groups, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung, Bandung (Indonesia)

    2014-03-24

    Sulfonated polystyrene (SPS) can be applied as a proton exchange membrane for fuel cells due to its fairly good chemical stability. In order to be applied as a polymer electrolyte membrane fuel cell (PEMFC) material, a polymer membrane should have good ionic conductivity, high proton conductivity, and high mechanical strength. Lignosulfonate (LS) is a complex biopolymer which has crosslinks and sulfonate groups. SPS-LS blends with added SiO2 are used to increase the proton conductivity and to improve the mechanical properties and thermal stability. However, a biodegradation test of the SPS-LS blends is required to determine whether these membranes can be applied as environmentally friendly membranes. In this study, the synthesis of SPS and biodegradability tests of SPS-LS blends with various LS and SiO2 compositions were carried out. The biodegradation test was performed in solid Luria Bertani (LB) medium with activated sludge as the source of microorganisms at an incubation temperature of 37°C. The results indicate that SPS-LS-SiO2 blends are more readily decomposed by microorganisms than SPS-LS blends. This result is supported by analysis of the weight-loss percentage, of functional groups by Fourier Transform Infrared (FTIR) spectroscopy, and of the surface morphology by Scanning Electron Microscopy (SEM)

  7. IMPARARE L’ITALIANO L2/LS CON TESTI TEATRALI

    Directory of Open Access Journals (Sweden)

    Erminia Ardissino

    2010-09-01

    Full Text Available This paper discusses the use of theatrical texts as sources of exercises for learning Italian as a second or foreign language (L2/FL). After some theoretical reflections, six project proposals with their solutions are presented, suitable for students at levels B2-C2 of the framework of reference. The exercises are drawn from one-act plays by Pirandello (La morsa and Lumìe di Sicilia), from Verga (a comparison between the dramatic and narrative versions of Cavalleria rusticana), and from Tommaso Landolfi (Ombre). Each proposal exploits a peculiarity of the theatrical text, which is constituted by the meeting of dialogue and stage directions. Students first examine how the characters are constructed, then consider the function of the stage directions in relation to the text, and finally reflect on the different ways the dialogues unfold, including forms of silence. Theatrical texts thus prove well suited to language exercises, because they engage students' interpretative and imaginative abilities and get them discussing and speaking on the basis of their own intuitions.

  8. LS1 general planning and strategy for the LHC, LHC injectors

    CERN Document Server

    Foraz, K

    2012-01-01

    The goal of Long Shutdown 1 (LS1) is to perform the full maintenance of equipment, together with the necessary consolidation and upgrade activities, in order to ensure reliable LHC operation at nominal performance from mid-2014. LS1 concerns not only the LHC but also its injectors. To ensure that resources will be available, an analysis is in progress to detect conflicts and overloads and to decide what is compulsory, what can be afforded, and what can be postponed to LS2. The strategy, key time drivers, constraints, and draft schedule are presented here.

  9. Steady Modeling for an Ammonia Synthesis Reactor Based on a Novel CDEAS-LS-SVM Model

    Directory of Open Access Journals (Sweden)

    Zhuoqian Liu

    2014-01-01

    Full Text Available A steady-state mathematical model is built in order to represent plant behavior under stationary operating conditions. A novel modeling approach is proposed in which a least squares support vector machine (LS-SVM) is tuned by Cultural Differential Evolution with Ant Search (CDEAS). The LS-SVM is adopted to establish a model of the net value of ammonia. The method has fast convergence and good global adaptability for identification of the ammonia synthesis process, and the LS-SVM model was established accordingly. Simulation results verify the validity of the method.
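For readers unfamiliar with the LS-SVM formulation used in records like this one, here is a minimal least-squares SVM regression sketch (kernel width, regularization constant, and all names are illustrative assumptions, not values from the paper). Unlike a standard SVM, training reduces to solving a single linear system rather than a quadratic program:

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between two sample sets
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM regression: solve the single linear system
    #   [[0, 1^T], [1, K + I/gamma]] @ [b, alpha] = [0, y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    # Prediction is a kernel expansion over the training points
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

A useful sanity check falls out of the system itself: on the training set, the residual y − ŷ equals α/γ exactly.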

  10. Real-time generation of images with pixel-by-pixel spectra for a coded aperture imager with high spectral resolution

    International Nuclear Information System (INIS)

    Ziock, K.P.; Burks, M.T.; Craig, W.; Fabris, L.; Hull, E.L.; Madden, N.W.

    2003-01-01

    The capabilities of a coded aperture imager are significantly enhanced when a detector with excellent energy resolution is used. We are constructing such an imager with a 1.1 cm thick, crossed-strip, planar detector, which has 38 strips of 2 mm pitch in each dimension, followed by a large coaxial detector. Full value from this system is obtained only when the images are 'fully deconvolved', meaning that the energy spectrum is available from each pixel in the image. The large number of energy bins associated with the spectral resolution of the detector, and the fixed pixel size, present significant computational challenges in generating an image in a timely manner at the conclusion of a data acquisition. The long computation times previously precluded the generation of intermediate images during the acquisition itself. We have solved this problem by building the images on-line, as each event comes in, using pre-imaged arrays of the system response. The generation of these arrays and the use of fractional mask-to-detector pixel sampling are discussed
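The on-line accumulation scheme can be sketched as follows, assuming the per-detector-pixel response images have been precomputed before the run starts (array sizes and all names here are hypothetical, not the instrument's actual parameters):

```python
import numpy as np

# Hypothetical sizes: detector pixels indexed 0..npix-1, a small sky
# image, and n_ebins energy bins for the pixel-by-pixel spectra.
npix, sky_shape, n_ebins = 38 * 38, (16, 16), 128

rng = np.random.default_rng(0)
# Pre-imaged system response: one decoded sky image per detector pixel,
# computed once from the mask pattern before the acquisition starts.
response = rng.random((npix, *sky_shape))

# Image hypercube: a full energy spectrum for every sky pixel.
cube = np.zeros((n_ebins, *sky_shape))

def accumulate(det_pixel, energy_bin):
    # Per event: add the precomputed response image into that event's
    # energy plane -- no deconvolution is needed at readout time.
    cube[energy_bin] += response[det_pixel]
```

Because each event costs only one array addition, intermediate images are available at any moment during the acquisition.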

  11. Simulation of electron, positron and Bremsstrahlung spectrum generated due to electromagnetic cascade by 2.5 GeV electron hitting lead target using FLUKA code

    International Nuclear Information System (INIS)

    Sahani, P.K.; Dev, Vipin; Haridas, G.; Thakkar, K.K.; Singh, Gurnam; Sarkar, P.K.; Sharma, D.N.

    2009-01-01

    INDUS-2 is a high energy electron accelerator facility in which electrons are accelerated in a circular ring up to a maximum energy of 2.5 GeV to generate synchrotron radiation. During normal operation of the machine a fraction of these electrons is lost; they interact with the accelerator structures and components, such as the vacuum chamber and the residual gases in the cavity, and hence generate a significant amount of Bremsstrahlung radiation. The Bremsstrahlung radiation depends strongly on the incident electron energy and on the target material and its thickness, and it dominates the radiation environment in such electron storage rings. Because of its broad spectrum, extending up to the incident electron energy, and its pulsed nature, it is very difficult to separate the Bremsstrahlung component from the mixed-field environment in accelerators. With the help of the FLUKA Monte Carlo code, the Bremsstrahlung spectrum generated by 2.5 GeV electrons bombarding a high-Z lead target is simulated. To study the variation of the Bremsstrahlung spectrum with target thickness, lead targets of 3, 6, 9, 12, 15, and 18 mm thickness were used. The energy spectra of the emerging electrons and positrons are also simulated. The study shows that as the target thickness increases, the emergent Bremsstrahlung photon fluence increases, with the low-energy part of the spectrum growing dominant and the high-energy part degrading. The electron and positron spectra also extend up to the incident electron energy. (author)

  12. Steady-state simulations of a 30-tube once-through steam generator with the RELAP5/MOD3 and RELAP5/MOD2 computer codes

    International Nuclear Information System (INIS)

    Hassan, Y.A.; Salim, P.

    1991-01-01

    This paper reports on a steady-state analysis of a 30-tube once-through steam generator that has been performed with the RELAP5/MOD3 and RELAP5/MOD2 computer codes for 100, 75, and 65% loads. The results obtained are compared with experimental data. The RELAP5/MOD3 results for the test facility generally agree reasonably well with the data for the primary-side temperature profiles. The secondary-side temperature profile predicted by RELAP5/MOD3 at 75 and 65% loads agrees fairly well with the data and is better than the RELAP5/MOD2 results. However, the RELAP5/MOD3 calculated secondary-side temperature profile does not compare well with the 100% load data.

  13. MOPABA-H2 - Computer code for calculation of hydrogen generation and distribution in the equipment of power plants with WWER type reactors in design modes of operation

    International Nuclear Information System (INIS)

    Arkhipov, O.P.; Kharitonov, Yu.V.; Shumskiy, A.M.; Kabakchi, S.A.

    2002-01-01

    With the aim of ensuring hydrogen explosion safety in the reactor plant, a program of scientific research work was carried out, including the following: revealing the mechanisms of generation and release of hydrogen in the primary equipment components under design operation modes of reactor plants with WWER; and development of a calculation procedure and the computer code MOPABA-H2, which determines the hydrogen content in reactor plant equipment components under design operation modes. During development of the procedure it was found that the calculation of hydrogen content in the plant equipment requires the following main mathematical models: radiochemical processes in the primary coolant, which contains impurities and added special reagents; absorption of the core ionizing radiation by the coolant; the steam-zirconium reaction (during a design-basis accident of the LOCA type); and coolant mass transfer through the reactor plant equipment, including crossing of the phase boundary by components of the coolant. (author)

  14. Hebbian learning in a model with dynamic rate-coded neurons: an alternative to the generative model approach for learning receptive fields from natural scenes.

    Science.gov (United States)

    Hamker, Fred H; Wiltschut, Jan

    2007-09-01

    Most computational models of coding are based on a generative model according to which the feedback signal aims to reconstruct the visual scene as closely as possible. We here explore an alternative model of feedback. It is derived from studies of attention and is thus probably more flexible with respect to attentive processing in higher brain areas. According to this model, feedback implements a gain increase of the feedforward signal. We use a dynamic model with presynaptic inhibition and Hebbian learning to simultaneously learn feedforward and feedback weights. The weights converge to localized, oriented, bandpass filters similar to the ones found in V1. Due to presynaptic inhibition the model predicts the organization of receptive fields within the feedforward pathway, whereas feedback primarily serves to tune early visual processing according to the needs of the task.
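A minimal sketch of the gain-modulation idea, assuming a simple rate-coded layer and plain Hebbian learning with row normalization (parameter names and the normalization scheme are illustrative; this is not the paper's full dynamic model with presynaptic inhibition):

```python
import numpy as np

def hebbian_step(W, x, feedback_gain=1.0, lr=0.01):
    # Feedforward response; feedback multiplies (gains up) the
    # feedforward signal instead of subtracting a reconstruction,
    # as a generative model would.
    y = feedback_gain * (W @ x)
    # Hebbian update: strengthen weights where pre- and post-synaptic
    # activity coincide, then renormalize rows to keep weights bounded.
    W = W + lr * np.outer(y, x)
    W /= np.linalg.norm(W, axis=1, keepdims=True)
    return W, y
```

Run over many input patches, this kind of normalized Hebbian rule is the standard mechanism by which feedforward weights converge toward structured filters.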

  15. LS-SNP/PDB: annotated non-synonymous SNPs mapped to Protein Data Bank structures.

    Science.gov (United States)

    Ryan, Michael; Diekhans, Mark; Lien, Stephanie; Liu, Yun; Karchin, Rachel

    2009-06-01

    LS-SNP/PDB is a new WWW resource for genome-wide annotation of human non-synonymous (amino acid changing) SNPs. It serves high-quality protein graphics rendered with the UCSF Chimera molecular visualization software. The system is kept up-to-date by an automated, high-throughput build pipeline that systematically maps human nsSNPs onto Protein Data Bank structures and annotates several biologically relevant features. LS-SNP/PDB is available at http://ls-snp.icm.jhu.edu/ls-snp-pdb and via links from Protein Data Bank (PDB) biology and chemistry tabs, UCSC Genome Browser Gene Details and SNP Details pages, and PharmGKB Gene Variants Downloads/Cross-References pages.

  16. Shock Transmission Analyses of a Simplified Frigate Compartment Using LS-DYNA

    National Research Council Canada - National Science Library

    Trouwborst, W

    1999-01-01

    This report gives results as obtained with finite element analyses using the explicit finite element program LS-DYNA for a longitudinal slice of a frigate's compartment loaded with a shock pulse based...

  17. Diagnosis of Elevator Faults with LS-SVM Based on Optimization by K-CV

    Directory of Open Access Journals (Sweden)

    Zhou Wan

    2015-01-01

    Full Text Available Several common elevator malfunctions were diagnosed with a least squares support vector machine (LS-SVM). After vibration signals for various elevator conditions were acquired, their energy characteristics and time-domain indicators were extracted by optimal wavelet packet analysis in order to construct a feature vector of malfunctions, used as the input to the LS-SVM for identifying the causes of the malfunctions. The parameters of the LS-SVM were optimized by K-fold cross validation (K-CV). Diagnoses of a deviated elevator guide rail, a deviated guide shoe, abnormal running of the tractor, an erroneous rope groove of the traction sheave, a deviated guide wheel, and wire rope tension suggested that the LS-SVM optimized by K-CV is an effective method for diagnosing elevator malfunctions.
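The K-CV parameter search used here can be sketched generically: split the training data into K folds, score each candidate parameter set by its mean validation error, and keep the best. The ridge-regression stand-in below and all names are illustrative assumptions (the paper tunes an LS-SVM, but the selection loop is the same):

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    # Shuffle once, then split indices into k roughly equal folds.
    idx = np.random.default_rng(seed).permutation(n)
    return np.array_split(idx, k)

def kcv_select(X, y, fit, predict, param_grid, k=5):
    # Pick the parameter set with the lowest mean validation MSE
    # over the k folds -- the K-CV criterion.
    best, best_mse = None, np.inf
    folds = kfold_indices(len(y), k)
    for params in param_grid:
        mse = 0.0
        for i in range(k):
            val = folds[i]
            trn = np.concatenate([folds[j] for j in range(k) if j != i])
            model = fit(X[trn], y[trn], **params)
            mse += np.mean((predict(model, X[val]) - y[val]) ** 2)
        if mse / k < best_mse:
            best, best_mse = params, mse / k
    return best
```

Because every candidate is scored on data held out from its own fit, K-CV guards against picking parameters that merely overfit the training folds.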

  18. Summary of sessions 5 and 6: long shutdown 1 (LS1) 2013-2014

    International Nuclear Information System (INIS)

    Bordry, F.; Foraz, K.

    2012-01-01

    The minimal duration for LS1 is 20 months, meaning the time from physics to physics will be about two years. The actual start of LS1 is set for 17 November 2012, which will allow the liquid helium to be emptied from the machine before Christmas. However, delivery dates for certain components are on the critical path of the experiments, so the first beams for beam commissioning cannot be expected before September 2014. Depending on the results of the mid-2012 physics, the start date of LS1 will be reviewed. The current plan for the injectors is in line with the LHC plan, but the risk of running the injectors for two years has to be assessed. The analysis of resources is progressing well throughout the complex (collaborations and internal mobility) and is being done according to priorities. Certain activities have already been postponed to LS2, and new requests will be carefully analyzed, as will the open issues.

  19. SPECT3D - A multi-dimensional collisional-radiative code for generating diagnostic signatures based on hydrodynamics and PIC simulation output

    Science.gov (United States)

    MacFarlane, J. J.; Golovkin, I. E.; Wang, P.; Woodruff, P. R.; Pereyra, N. A.

    2007-05-01

    SPECT3D is a multi-dimensional collisional-radiative code used to post-process the output from radiation-hydrodynamics (RH) and particle-in-cell (PIC) codes to generate diagnostic signatures (e.g. images, spectra) that can be compared directly with experimental measurements. This ability to post-process simulation code output plays a pivotal role in assessing the reliability of RH and PIC simulation codes and their physics models. SPECT3D has the capability to operate on plasmas in 1D, 2D, and 3D geometries. It computes a variety of diagnostic signatures that can be compared with experimental measurements, including: time-resolved and time-integrated spectra, space-resolved spectra and streaked spectra; filtered and monochromatic images; and X-ray diode signals. Simulated images and spectra can include the effects of backlighters, as well as the effects of instrumental broadening and time-gating. SPECT3D also includes a drilldown capability that shows where frequency-dependent radiation is emitted and absorbed as it propagates through the plasma towards the detector, thereby providing insights on where the radiation seen by a detector originates within the plasma. SPECT3D has the capability to model a variety of complex atomic and radiative processes that affect the radiation seen by imaging and spectral detectors in high energy density physics (HEDP) experiments. LTE (local thermodynamic equilibrium) or non-LTE atomic level populations can be computed for plasmas. Photoabsorption rates can be computed using either escape probability models or, for selected 1D and 2D geometries, multi-angle radiative transfer models. The effects of non-thermal (i.e. non-Maxwellian) electron distributions can also be included. To study the influence of energetic particles on spectra and images recorded in intense short-pulse laser experiments, the effects of both relativistic electrons and energetic proton beams can be simulated. 
SPECT3D is a user-friendly software package that runs

  20. LS-DYNA Analysis of a Full-Scale Helicopter Crash Test

    Science.gov (United States)

    Annett, Martin S.

    2010-01-01

    A full-scale crash test of an MD-500 helicopter was conducted in December 2009 at NASA Langley's Landing and Impact Research facility (LandIR). The MD-500 helicopter was fitted with a composite honeycomb Deployable Energy Absorber (DEA) and tested under vertical and horizontal impact velocities of 26 ft/sec and 40 ft/sec, respectively. The objectives of the test were to evaluate the performance of the DEA concept under realistic crash conditions and to generate test data for validation of a system-integrated LS-DYNA finite element model. In preparation for the full-scale crash test, a series of sub-scale and MD-500 mass simulator tests was conducted to evaluate the impact performance of various components, including a new crush tube and the DEA blocks. Parameters defined within the system-integrated finite element model were determined from these tests. The objective of this paper is to summarize the finite element models developed and analyses performed, beginning with pre-test analysis and continuing through post-test validation.

  1. The Control System of CERN Accelerators Vacuum (LS1 Activities and New Developments)

    CERN Document Server

    Gomes, P; Bellorini, F; Blanchard, S; Boivin, J P; Gama, J; Girardot, G; Pigny, G; Rio, B; Vestergard, H; Kopylov, L; Merker, S; Mikheev, M

    2014-01-01

    After 3 years of operation, the LHC entered its first Long Shutdown period (LS1) in February 2013 [1]. Major consolidation and maintenance works are being performed across the whole of CERN's accelerator chain, in order to prepare the LHC to restart at higher energy in 2015. The injector chain shall resume earlier, in mid-2014. We report on the on-going vacuum-controls projects. Some of them concern the renovation of the controls of certain machines; others are associated with the consolidation of the vacuum systems of the LHC and its injectors; and a few are completely new installations. Due to the wide age-span of the existing vacuum installations, there is a mix of design philosophies and of control-equipment generations. The renovations and the novel projects offer an opportunity to improve the uniformity and efficiency of vacuum controls by: reducing the number of equipment versions with similar functionality; identifying, naming, labelling, and documenting all pieces of equipment; homogenizing the contr...

  2. Comparison of central corneal thickness measured by Lenstar LS900, Orbscan Ⅱ and ultrasonic pachymetry

    Directory of Open Access Journals (Sweden)

    Hong-Tao Zhang

    2013-09-01

    Full Text Available AIM: To investigate the differences in central corneal thickness (CCT) measured by the Lenstar LS900, the Orbscan Ⅱ system, and ultrasonic pachymetry, and to evaluate the correlation and consistency of the results, so as to provide a theoretical basis for clinical application. METHODS: The mean CCT in 70 eyes of 35 patients, measured three times each with the Lenstar LS900, the Orbscan Ⅱ system, and ultrasonic pachymetry, underwent statistical analysis. The differences in CCT were compared, and the correlation and consistency of the three measurement methods were analyzed. CCT values measured by the different methods were analyzed with randomized-block analysis of variance, with the LSD-t test used for pairwise comparisons between groups; the correlation of the three methods was analyzed by linear correlation analysis, and Bland-Altman analysis was used to assess consistency. RESULTS: The mean CCT values measured by the Lenstar LS900, Orbscan Ⅱ, and ultrasonic pachymetry were 542.75±40.06 μm, 528.74±39.59 μm, and 538.54±40.93 μm, respectively. The mean difference in CCT was 4.21±8.78 μm between Lenstar LS900 and ultrasonic pachymetry, 14.01±13.39 μm between Lenstar LS900 and Orbscan Ⅱ, and 9.8±10.57 μm between ultrasonic pachymetry and Orbscan Ⅱ; the differences were statistically significant. CCT measured with the Lenstar LS900 was positively correlated with the ultrasonic pachymetry and Orbscan Ⅱ values (r=0.977, 0.944). CONCLUSION: There is excellent correlation among the Lenstar LS900, Orbscan Ⅱ, and ultrasonic pachymetry. The Lenstar LS900 can be used as a non-contact tool for CCT measurement.

  3. Report of generation of the 'L1PG9321' nuclear bank of the CAORSO 3.26 collapsed assembly with the FMS codes

    International Nuclear Information System (INIS)

    Alonso V, G.

    1991-03-01

    This work describes, in general terms, how the collapsed nuclear bank for the CAORSO 3.26 fuel, a possible reload fuel for Laguna Verde, was generated. The bank was produced with the ECLIPSE 86-2D, RECORD 89-1A, and POLGEN 88-1B codes of the FMS package installed on the VAX system at the offices of the National Commission of Nuclear Safety and Safeguards in Mexico City, and it is designated L1PG9321. The whole process followed procedure '6F3/I/CN029/90/P1'. To generate the bank, the five RECORD cells that compose the assembly were 'collapsed' into a single one that represents the complete assembly with respect to the fuel rod distribution and enrichment. The collapsing of the assembly is done by averaging the UO2 and Gd2O3 content in each fuel rod of the assembly; in this way the x-y arrangement of fuel rods is preserved while a single representative fuel cell is obtained for the whole assembly, and this is the RECORD cell studied. In accordance with the nuclear-data requirements of FCS-II, the nuclear data generated with RECORD should be of the type defined as series 1 in the nuclear bank generation procedure '6F3/I/CN029/90/P1'. This means that only the nuclear data as a function of fuel burnup and of the void fraction in the fuel cell are required; however, the bank also contains data of the type defined as series 2, with the control rod present, for possible analyses of the reactor under these conditions. (Author)
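The collapsing step, averaging per-rod compositions over the axial RECORD cells while preserving the x-y rod layout, can be sketched as follows (array shapes and enrichment values are illustrative only, not CAORSO data):

```python
import numpy as np

# Hypothetical data: 5 axial RECORD cells, each a 10x10 lattice of
# per-rod fuel compositions (e.g., wt% enrichment per rod).
rng = np.random.default_rng(42)
axial_cells = rng.uniform(1.5, 3.26, size=(5, 10, 10))

def collapse(cells):
    # Average each rod's composition over the axial cells: the result
    # keeps the x-y rod arrangement but represents the whole assembly
    # with a single representative lattice.
    return cells.mean(axis=0)

collapsed = collapse(axial_cells)
```

The same averaging would be applied separately to the UO2 and Gd2O3 content arrays to build the single representative cell.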

  4. Effects of grit roughness and pitch oscillations on the LS(1)-0417MOD airfoil

    Energy Technology Data Exchange (ETDEWEB)

    Janiszewska, J.M.; Ramsay, R.R.; Hoffman, M.J.; Gregorek, G.M. [Ohio State Univ., Columbus, OH (United States)

    1996-01-01

    Horizontal axis wind turbine rotors experience unsteady aerodynamics due to wind shear when the rotor is yawed, when rotor blades pass through the support tower wake, and when the wind is gusting. An understanding of this unsteady behavior is necessary to assist in the calculation of rotor performance and loads. The rotors also experience performance degradation caused by surface roughness. These surface irregularities are due to the accumulation of insect debris, ice, and/or the aging process. Wind tunnel studies that examine both the steady and unsteady behavior of airfoils can help define pertinent flow phenomena, and the resultant data can be used to validate analytical computer codes. An LS(1)-0417MOD airfoil model was tested in The Ohio State University Aeronautical and Astronautical Research Laboratory (OSU/AARL) 3x5 subsonic wind tunnel under steady flow and stationary model conditions, as well as with the model undergoing pitch oscillations. To study the possible extent of performance loss due to surface roughness, a standard grit pattern (LEGR) was used to simulate leading edge contamination. After baseline cases were completed, the LEGR was applied for both the steady-state and the model pitch oscillation cases. The Reynolds numbers for steady-state conditions were 0.75, 1, 1.25, and 1.5 million, while the angle of attack ranged from -20° to +40°. With the model undergoing pitch oscillations, data were acquired at Reynolds numbers of 0.75, 1, 1.25, and 1.5 million, at frequencies of 0.6, 1.2, and 1.8 Hz. Two sine-wave forcing functions were used, ±5.5° and ±10°, at mean angles of attack of 8°, 14°, and 20°. For the purposes herein, any reference to unsteady conditions means the airfoil model was in pitch oscillation about the quarter chord.

  5. ORIGEN-ARP 2.00, Isotope Generation and Depletion Code System-Matrix Exponential Method with GUI and Graphics Capability

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: ORIGEN-ARP was developed for the Nuclear Regulatory Commission and the Department of Energy to satisfy a need for an easy-to-use, standardized method of isotope depletion/decay analysis for spent fuel, fissile material, and radioactive material. It can be used to solve for spent fuel characterization, isotopic inventory, radiation source terms, and decay heat. This release of ORIGEN-ARP is a standalone code package that contains an updated version of the SCALE-4.4a ORIGEN-S code. It contains a subset of the modules, data libraries, and miscellaneous utilities in SCALE-4.4a. This package is intended for users who do not need the entire SCALE package. ORIGEN-ARP 2.00 (2-12-2002) differs from the previous release, ORIGEN-ARP 1.0 (July 2001), in the following ways: 1. The neutron source and energy spectrum routines were replaced with computational algorithms and data from the SOURCES-4B code (RSICC package CCC-661) to provide more accurate spontaneous fission and (alpha,n) neutron sources, and a delayed neutron source capability was added. 2. The printout of the fixed energy group structure photon tables was removed. Gamma sources and spectra are now printed for calculations using the Master Photon Library only. 2 - Methods: ORIGEN-ARP is an automated sequence to perform isotopic depletion/decay calculations using the ARP and ORIGEN-S codes of the SCALE system. The sequence includes the OrigenArp for Windows graphical user interface (GUI) that prepares input for ARP (Automated Rapid Processing) and ORIGEN-S. ARP automatically interpolates cross sections for the ORIGEN-S depletion/decay analysis using enrichment, burnup, and, optionally, moderator density, from a set of libraries generated with the SCALE SAS2 depletion sequence. Library sets for four LWR fuel assembly designs (BWR 8 x 8, PWR 14 x 14, 15 x 15, 17 x 17) are included. The libraries span enrichments from 1.5 to 5 wt% U-235 and burnups of 0 to 60,000 MWD/MTU. Other
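The "matrix exponential method" named in the package title solves the coupled depletion/decay equations dN/dt = M·N directly as N(t) = exp(Mt)·N(0). A toy two-nuclide sketch follows; the chain, decay constants, and the naive truncated-series exponential are illustrative only, not ORIGEN-S data or algorithms:

```python
import numpy as np

def expm(A, terms=40):
    # Truncated Taylor series for the matrix exponential -- adequate
    # for this small, well-scaled illustration (production codes use
    # more robust algorithms).
    out, term = np.eye(len(A)), np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k
        out += term
    return out

# Hypothetical two-nuclide chain A -> B -> (stable): losses sit on the
# diagonal of the transition matrix, gains off the diagonal.
lam_a, lam_b = 0.5, 0.1            # illustrative decay constants, 1/s
M = np.array([[-lam_a, 0.0],
              [ lam_a, -lam_b]])

def deplete(N0, t):
    # The matrix exponential method: N(t) = exp(M t) @ N(0).
    return expm(M * t) @ N0
```

For this simple chain the result can be checked against the analytic Bateman solution, which is exactly what the matrix exponential reproduces.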

  6. Effect of Lactobacillus salivarius Ls-33 on fecal microbiota in obese adolescents.

    Science.gov (United States)

    Larsen, Nadja; Vogensen, Finn K; Gøbel, Rikke Juul; Michaelsen, Kim F; Forssten, Sofia D; Lahtinen, Sampo J; Jakobsen, Mogens

    2013-12-01

    This study is a part of the clinical trials with probiotic bacterium Lactobacillus salivarius Ls-33 conducted in obese adolescents. Previously reported clinical studies showed no effect of Ls-33 consumption on the metabolic syndrome in the subject group. The aim of the study was to investigate the impact of L. salivarius Ls-33 on fecal microbiota in obese adolescents. The study was a double-blinded intervention with 50 subjects randomized to intake of L. salivarius Ls-33 or placebo for 12 weeks. The fecal microbiota was assessed by real-time quantitative PCR before and after intervention. Concentrations of fecal short chain fatty acids were determined using gas chromatography. Ratios of Bacteroides-Prevotella-Porphyromonas group to Firmicutes belonging bacteria, including Clostridium cluster XIV, Blautia coccoides_Eubacteria rectale group and Roseburia intestinalis, were significantly increased (p ≤ 0.05) after administration of Ls-33. The cell numbers of fecal bacteria, including the groups above as well as Clostridium cluster I, Clostridium cluster IV, Faecalibacterium prausnitzii, Enterobacteriaceae, Enterococcus, the Lactobacillus group and Bifidobacterium were not significantly altered by intervention. Similarly, short chain fatty acids remained unaffected. L. salivarius Ls-33 might modify the fecal microbiota in obese adolescents in a way not related to metabolic syndrome. NCT 01020617. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  7. Prediction of the strength of concrete radiation shielding based on LS-SVM

    International Nuclear Information System (INIS)

    Juncai, Xu; Qingwen, Ren; Zhenzhong, Shen

    2015-01-01

    Highlights: • LS-SVM was introduced for prediction of the strength of RSC. • A model for prediction of the strength of RSC was implemented. • The grid search algorithm was used to optimize the parameters of the LS-SVM. • The performance of LS-SVM in predicting the strength of RSC was evaluated. - Abstract: Radiation-shielding concrete (RSC) and conventional concrete differ in strength because of their distinct constituents. Predicting the strength of RSC with different constituents plays a vital role in radiation shielding (RS) engineering design. In this study, a model to predict the strength of RSC is established using a least squares support vector machine (LS-SVM) whose parameters are optimized by a grid search algorithm, building on traditional prediction methods for conventional concrete. The predicted results of the LS-SVM model are compared with experimental data; the predictions are stable and consistent with the experimental results. In addition, the studied parameters exhibit significant effects on the simulation results. Therefore, the proposed method can be applied to predicting the strength of RSC, and the predicted results can be adopted as an important reference for RS engineering design.

  8. GIS-based Analysis of LS Factor under Coal Mining Subsidence Impacts in Sandy Region

    Directory of Open Access Journals (Sweden)

    W. Xiao

    2014-09-01

    Full Text Available Coal deposits in the adjacent regions of Shanxi, Shaanxi, and Inner Mongolia provinces (SSI) account for approximately two-thirds of the coal in China; the SSI region has therefore become the frontier of coal mining as it moves westward. Numerous adverse impacts on land and environment have arisen in these sandy, arid, and ecologically fragile areas. Underground coal mining causes land subsidence and subsequent soil erosion, for which the slope length and slope steepness (LS) factor is a key influence. In this investigation, an SSI mining site was chosen as a case study area, and (1) the pre-mining LS factor was obtained from a digital elevation model (DEM) dataset; (2) a mining subsidence prediction was implemented with revised subsidence prediction factors; and (3) the post-mining LS factor was calculated by integrating the pre-mining DEM dataset with the coal mining subsidence prediction data. The results revealed that the LS factor changes somewhat at the bottom of the subsidence basin and considerably at the edges of the basin, and that it becomes larger in steeper terrain under subsidence impacts. This integrated method can quantitatively analyse LS changes and their spatial distribution under mining impacts, which will benefit and provide references for soil erosion evaluations in this region.
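For illustration, one widely used grid form of the LS factor (the Moore and Burch formulation) can be computed from slope and flow accumulation as follows. The formula choice, constants, and cell size are a standard textbook variant and an assumption here, not necessarily the authors' exact method:

```python
import numpy as np

def ls_factor(flow_acc, slope_deg, cell_size=30.0):
    # Moore & Burch grid form of the RUSLE LS factor:
    #   LS = (A_s / 22.13)^0.4 * (sin(beta) / 0.0896)^1.3
    # where A_s is the specific contributing area per unit width and
    # beta is the slope angle. flow_acc is the upslope cell count
    # from a DEM flow-accumulation pass.
    As = (flow_acc + 1.0) * cell_size      # specific contributing area
    beta = np.radians(slope_deg)
    return (As / 22.13) ** 0.4 * (np.sin(beta) / 0.0896) ** 1.3
```

Applied cell-by-cell to pre- and post-mining DEMs, the difference of the two LS grids quantifies how subsidence reshapes erosion potential.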

  9. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrating software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  10. ertCPN: The adaptations of the coloured Petri-Net theory for real-time embedded system modeling and automatic code generation

    Directory of Open Access Journals (Sweden)

    Wattanapong Kurdthongmee

    2003-05-01

    Full Text Available A real-time system is a computer system that monitors or controls an external environment. The system must meet various timing and other constraints that are imposed on it by the real-time behaviour of the external world. One of the differences between real-time and conventional software is that a real-time program must be both logically and temporally correct. To successfully design and implement a real-time system, some analysis is typically done to assure that requirements or designs are consistent and that they satisfy certain desirable properties that may not be immediately obvious from the specification. Executable specifications, prototypes and simulation are particularly useful in real-time systems for debugging specifications. In this paper, we propose adaptations to the coloured Petri-net theory to ease the modeling, simulation and code generation process for an embedded, microcontroller-based, real-time system. The benefits of the proposed approach are demonstrated by use of our prototype software tool called ENVisAge (an Extended Coloured Petri-Net Based Visual Application Generator Tool).
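The firing semantics that such a tool simulates can be sketched in a few lines. This is a minimal, hypothetical model (not the ENVisAge tool): places hold multisets of coloured tokens, and a transition fires only when its input places can supply the required colours.

```python
# Minimal sketch of coloured Petri-net firing semantics (hypothetical model,
# not ENVisAge): places hold multisets of coloured tokens.
from collections import Counter

places = {"ready": Counter({"red": 1, "blue": 2}), "done": Counter()}

def can_fire(need):
    # a transition is enabled if every (place, colour) demand can be met
    return all(places[p][c] >= n for (p, c), n in need.items())

def fire(need, produce):
    if not can_fire(need):
        return False
    for (p, c), n in need.items():
        places[p][c] -= n          # consume input tokens
    for (p, c), n in produce.items():
        places[p][c] += n          # produce output tokens
    return True

# transition t: consume one blue token from "ready", produce one in "done"
fire({("ready", "blue"): 1}, {("done", "blue"): 1})
print(places["done"]["blue"])  # 1
```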

  11. ASSOCIATING LONG-TERM γ-RAY VARIABILITY WITH THE SUPERORBITAL PERIOD OF LS I +61°303

    Energy Technology Data Exchange (ETDEWEB)

    Ackermann, M.; Buehler, R. [Deutsches Elektronen Synchrotron DESY, D-15738 Zeuthen (Germany); Ajello, M. [Space Sciences Laboratory, 7 Gauss Way, University of California, Berkeley, CA 94720-7450 (United States); Ballet, J.; Casandjian, J. M. [Laboratoire AIM, CEA-IRFU/CNRS/Universite Paris Diderot, Service d'Astrophysique, CEA Saclay, F-91191 Gif sur Yvette (France); Barbiellini, G. [Istituto Nazionale di Fisica Nucleare, Sezione di Trieste, I-34127 Trieste (Italy); Bastieri, D.; Buson, S. [Istituto Nazionale di Fisica Nucleare, Sezione di Padova, I-35131 Padova (Italy); Bellazzini, R.; Bregeon, J. [Istituto Nazionale di Fisica Nucleare, Sezione di Pisa, I-56127 Pisa (Italy); Bonamente, E.; Cecchi, C. [Istituto Nazionale di Fisica Nucleare, Sezione di Perugia, I-06123 Perugia (Italy); Brandt, T. J. [NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Brigida, M. [Dipartimento di Fisica 'M. Merlin' dell'Universita e del Politecnico di Bari, I-70126 Bari (Italy); Bruel, P. [Laboratoire Leprince-Ringuet, Ecole polytechnique, CNRS/IN2P3, F-91128 Palaiseau (France); Caliandro, G. A. [Institute of Space Sciences (IEEC-CSIC), Campus UAB, E-08193 Barcelona (Spain); Cameron, R. A. [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States); Caraveo, P. A. [INAF-Istituto di Astrofisica Spaziale e Fisica Cosmica, I-20133 Milano (Italy); Cavazzuti, E. [Agenzia Spaziale Italiana (ASI) Science Data Center, I-00044 Frascati (Roma) (Italy); Chekhtman, A., E-mail: andrea.caliandro@ieec.uab.es, E-mail: hadasch@ieec.uab.es, E-mail: dtorres@ieec.uab.es [Center for Earth Observing and Space Research, College of Science, George Mason University, Fairfax, VA 22030 (United States); and others

    2013-08-20

    Gamma-ray binaries are stellar systems for which the spectral energy distribution (discounting the thermal stellar emission) peaks at high energies. Detected from radio to TeV gamma rays, the γ-ray binary LS I +61°303 is highly variable across all frequencies. One aspect of this system's variability is the modulation of its emission with the timescale set by the ∼26.4960 day orbital period. Here we show that, during the time of our observations, the γ-ray emission of LS I +61°303 also presents a sinusoidal variability consistent with the previously known superorbital period of 1667 days. This modulation is more prominently seen at orbital phases around apastron, whereas it does not introduce a visible change close to periastron. It is also found in the appearance and disappearance of variability at the orbital period in the power spectrum of the data. This behavior could be explained by a quasi-cyclical evolution of the equatorial outflow of the Be companion star, whose features influence the conditions for generating gamma rays. These findings open the possibility to use γ-ray observations to study the outflows of massive stars in eccentric binary systems.
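The sinusoidal modulation search described above can be sketched as an ordinary least-squares fit at a fixed trial period. The data below are synthetic stand-ins for a light curve, not Fermi-LAT measurements; the amplitude, mean level, and noise are illustrative assumptions.

```python
# Sketch: least-squares fit of a sinusoid at a fixed trial period
# (here the ~1667 d superorbital period) to a synthetic flux time series.
import math, random

P = 1667.0                       # trial period in days
w = 2 * math.pi / P
ts = [10.0 * k for k in range(300)]
random.seed(1)
flux = [5.0 + 2.0 * math.sin(w * t) + random.gauss(0, 0.1) for t in ts]

# design columns: sin(wt), cos(wt), constant -> 3x3 normal equations
cols = [[math.sin(w * t) for t in ts],
        [math.cos(w * t) for t in ts],
        [1.0] * len(ts)]
A = [[sum(ci * cj for ci, cj in zip(cols[i], cols[j])) for j in range(3)]
     for i in range(3)]
b = [sum(ci * y for ci, y in zip(cols[i], flux)) for i in range(3)]

# tiny Gaussian elimination for the 3x3 system
for i in range(3):
    for j in range(i + 1, 3):
        f = A[j][i] / A[i][i]
        A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
        b[j] -= f * b[i]
x = [0.0, 0.0, 0.0]
for i in (2, 1, 0):
    x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, 3))) / A[i][i]

amplitude = math.hypot(x[0], x[1])
print(round(amplitude, 2))  # close to the injected amplitude of 2.0
```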

  12. ASSOCIATING LONG-TERM γ-RAY VARIABILITY WITH THE SUPERORBITAL PERIOD OF LS I +61°303

    International Nuclear Information System (INIS)

    Ackermann, M.; Buehler, R.; Ajello, M.; Ballet, J.; Casandjian, J. M.; Barbiellini, G.; Bastieri, D.; Buson, S.; Bellazzini, R.; Bregeon, J.; Bonamente, E.; Cecchi, C.; Brandt, T. J.; Brigida, M.; Bruel, P.; Caliandro, G. A.; Cameron, R. A.; Caraveo, P. A.; Cavazzuti, E.; Chekhtman, A.

    2013-01-01

    Gamma-ray binaries are stellar systems for which the spectral energy distribution (discounting the thermal stellar emission) peaks at high energies. Detected from radio to TeV gamma rays, the γ-ray binary LS I +61°303 is highly variable across all frequencies. One aspect of this system's variability is the modulation of its emission with the timescale set by the ∼26.4960 day orbital period. Here we show that, during the time of our observations, the γ-ray emission of LS I +61°303 also presents a sinusoidal variability consistent with the previously known superorbital period of 1667 days. This modulation is more prominently seen at orbital phases around apastron, whereas it does not introduce a visible change close to periastron. It is also found in the appearance and disappearance of variability at the orbital period in the power spectrum of the data. This behavior could be explained by a quasi-cyclical evolution of the equatorial outflow of the Be companion star, whose features influence the conditions for generating gamma rays. These findings open the possibility to use γ-ray observations to study the outflows of massive stars in eccentric binary systems

  13. Automatic generation of 3D fine mesh geometries for the analysis of the venus-3 shielding benchmark experiment with the Tort code

    International Nuclear Information System (INIS)

    Pescarini, M.; Orsi, R.; Martinelli, T.

    2003-01-01

    In many practical radiation transport applications today, the cost of solving refined, large-size and complex multi-dimensional problems lies not so much in computing as in the cumbersome effort required by an expert to prepare a detailed geometrical model and to verify and validate that it is correct and represents, to a specified tolerance, the real design or facility. This situation is particularly relevant and frequent in reactor core criticality and shielding calculations with three-dimensional (3D) general purpose radiation transport codes, which require a very large number of meshes and high performance computers. The need has clearly emerged for tools that make this task easier for the physicist or engineer by reducing the time required, that facilitate the verification of correctness through effective graphical display and, finally, that help the interpretation of the results obtained. The paper shows the results of efforts in this field through detailed simulations of a complex shielding benchmark experiment. In the context of the activities proposed by the OECD/NEA Nuclear Science Committee (NSC) Task Force on Computing Radiation Dose and Modelling of Radiation-Induced Degradation of Reactor Components (TFRDD), the ENEA-Bologna Nuclear Data Centre contributed with an analysis of the VENUS-3 low-flux neutron shielding benchmark experiment (SCK/CEN-Mol, Belgium). One of the targets of the work was to test the BOT3P system, originally developed at the Nuclear Data Centre in ENEA-Bologna and released to the OECD/NEA Data Bank for free distribution. BOT3P, an ancillary system for the DORT (2D) and TORT (3D) SN codes, permits flexible automatic generation of spatial mesh grids in Cartesian or cylindrical geometry, through combinatorial geometry algorithms, following a simplified user-friendly approach. This system also demonstrated its validity in core criticality analyses, as for example the Lewis MOX fuel benchmark, permitting to easily

  14. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated code, etc.). The presentation relates this artistic fascination with code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation from the computer’s materiality. Cramer is thus the voice of a new ‘code avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  15. LHCb: The LHCb Trigger Architecture beyond LS1

    CERN Multimedia

    Albrecht, J; Neubert, S; Raven, G; Sokoloff, M D; Williams, M

    2013-01-01

    The LHCb experiment is a spectrometer dedicated to the study of heavy flavor at the LHC. The rate of proton-proton collisions at the LHC is 15 MHz, but resource limitations mean that only 5 kHz can be written to storage for offline analysis. For this reason the LHCb data acquisition system -- trigger -- plays a key role in selecting signal events and rejecting background. In contrast to previous experiments at hadron colliders, such as CDF or D0, the bulk of the LHCb trigger is implemented in software and deployed on a farm of 20k parallel processing nodes. This system, called the High Level Trigger (HLT), is responsible for reducing the rate from the maximum at which the detector can be read out, 1.1 MHz, to the 5 kHz which can be processed offline, and has 20 ms in which to process and accept/reject each event. In order to minimize systematic uncertainties, the HLT was designed from the outset to reuse the offline reconstruction and selection code. During the long shutdown it is proposed to extend th...

  16. ASME power test code ptc 4.1 for steam generators; Codigo de pruebas de potencia ASME ptc 4.1 para generadores de vapor

    Energy Technology Data Exchange (ETDEWEB)

    Plauchu Alcantara, Jorge Alberto [Plauchu Consultores, Morelia, Michoacan (Mexico)

    2001-07-01

    This presentation is oriented towards those who have experience in design and equipment specification, plant projects, factory and field testing, operation, or analysis of results. An important fraction of the national energy supply, approximately 13%, is applied to steam generation in different branches of industrial activity, in the electrical industry, a public service, and in the commercial and services sector. The development of the national energy efficiency programs confirms this by dedicating important projects to this use of energy, some of them with support from USAID. The measurement of the energy utilization or the efficiency of steam generators (boilers) is made by applying a procedure agreed upon by the parties, and the most widely accepted and best known in Mexico and internationally is the ASME Power Test Code PTC 4.1 for Steam Generators. The purpose and formality of the determination of efficiency and of steam generation capacity behavior, basic thermal regime, or fulfillment of guarantees radically change the requirements for strict adherence to PTC 4.1. This definition will determine the importance of the selected test method, the agreed deviations and exceptions, the influence of precision and measurement errors, the consideration of auxiliary equipment, etc. An incorrect interpretation or application of the Test Code has led and will lead to unreliable results and decisions.

  17. Experimental and Analytical Studies on Improved Feedforward ML Estimation Based on LS-SVR

    Directory of Open Access Journals (Sweden)

    Xueqian Liu

    2013-01-01

    Full Text Available The maximum likelihood (ML) algorithm is the most common and effective parameter estimation method. However, when dealing with small samples and low signal-to-noise ratio (SNR), threshold effects occur and estimation performance degrades greatly. It has been proved that the support vector machine (SVM) is suitable for small samples. Consequently, we exploit the linear relationship between the inputs and outputs of least squares support vector regression (LS-SVR) and regard the LS-SVR process as a time-varying linear filter that increases the input SNR of received signals and decreases the threshold value of the mean square error (MSE) curve. Furthermore, taking single-tone sinusoidal frequency estimation as an example and combining data analysis with experimental validation, we verify that if the LS-SVR's parameters are set appropriately, the LS-SVR process not only preserves the single-tone sinusoid and additive white Gaussian noise (AWGN) channel characteristics of the original signals, but also improves the frequency estimation performance. In the simulations, the LS-SVR process is applied to two common and representative single-tone sinusoidal ML frequency estimation algorithms, the DFT-based frequency-domain periodogram (FDP) and the phase-based Kay algorithm. The threshold values of their MSE curves are decreased by 0.3 dB and 1.2 dB, respectively, which clearly exhibits the advantage of the proposed algorithm.
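The DFT-based frequency-domain periodogram (FDP) baseline mentioned above can be sketched as follows: pick the frequency bin with maximum squared DFT magnitude. The sample rate, tone frequency, and noise level are illustrative assumptions.

```python
# Sketch of the DFT-based periodogram (FDP) single-tone frequency estimator.
import cmath, math, random

fs = 1000.0                  # sample rate, Hz (assumed)
f0 = 123.0                   # true tone frequency (assumed)
N = 512
random.seed(0)
x = [math.sin(2 * math.pi * f0 * n / fs) + random.gauss(0, 0.5) for n in range(N)]

def periodogram_peak(sig, fs):
    # coarse ML estimate: frequency of the strongest DFT bin (O(N^2) DFT)
    n_samp = len(sig)
    best_k, best_p = 0, -1.0
    for k in range(n_samp // 2):
        X = sum(s * cmath.exp(-2j * math.pi * k * n / n_samp)
                for n, s in enumerate(sig))
        p = abs(X) ** 2
        if p > best_p:
            best_k, best_p = k, p
    return best_k * fs / n_samp

est = periodogram_peak(x, fs)
print(round(est, 1))  # within one bin (fs/N ~ 2 Hz) of f0
```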

  18. [Measurement of soil organic matter and available K based on SPA-LS-SVM].

    Science.gov (United States)

    Zhang, Hai-Liang; Liu, Xue-Mei; He, Yong

    2014-05-01

    Visible and short wave infrared spectroscopy (Vis/SW-NIRS) was investigated in the present study for measurement of soil organic matter (OM) and available potassium (K). Four types of pretreatments, including smoothing, SNV, MSC and SG smoothing+first derivative, were adopted to eliminate system noise and external disturbances. Then partial least squares regression (PLSR) and least squares-support vector machine (LS-SVM) models were implemented as calibration models. The LS-SVM model was built using characteristic wavelengths selected by the successive projections algorithm (SPA). Simultaneously, the performance of the LS-SVM models was compared with that of the PLSR models. The results indicated that LS-SVM models using SPA-selected characteristic wavelengths as inputs outperformed PLSR models. The optimal SPA-LS-SVM models were achieved, with a correlation coefficient (r) and RMSEP of 0.8602 and 2.98 for OM and 0.7305 and 15.78 for K, respectively. The results indicated that visible and short wave near infrared spectroscopy (Vis/SW-NIRS) (325-1075 nm) combined with LS-SVM based on SPA could be utilized as a precision method for the determination of soil properties.
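LS-SVM regression itself reduces to solving a single linear system, which a short sketch can make concrete. The kernel width, regularization constant, and toy data below are assumptions for illustration, not the paper's settings.

```python
# Sketch of LS-SVM regression: training solves the dual linear system
#   [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
import math

def rbf(u, v, s=1.0):
    # Gaussian (RBF) kernel on scalars
    return math.exp(-(u - v) ** 2 / (2 * s * s))

def solve(M, r):
    # Gaussian elimination with partial pivoting for a small dense system
    n = len(M)
    M = [row[:] + [ri] for row, ri in zip(M, r)]
    for i in range(n):
        p = max(range(i, n), key=lambda k: abs(M[k][i]))
        M[i], M[p] = M[p], M[i]
        for j in range(i + 1, n):
            f = M[j][i] / M[i][i]
            M[j] = [mj - f * mi for mj, mi in zip(M[j], M[i])]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def lssvm_train(xs, ys, gamma=1e4):
    n = len(xs)
    rows = [[0.0] + [1.0] * n]
    for i in range(n):
        rows.append([1.0] + [rbf(xs[i], xs[j]) + (1.0 / gamma if i == j else 0.0)
                             for j in range(n)])
    sol = solve(rows, [0.0] + list(ys))
    return sol[0], sol[1:]            # bias b, dual weights alpha

def lssvm_predict(xs, bias, alphas, x):
    return bias + sum(a * rbf(xi, x) for a, xi in zip(alphas, xs))

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 1.0, 4.0, 9.0, 16.0]       # toy target y = x^2
bias, alphas = lssvm_train(xs, ys)
pred = lssvm_predict(xs, bias, alphas, 2.0)
print(round(pred, 2))  # close to 4.0
```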

  19. A Modified LS+AR Model to Improve the Accuracy of the Short-term Polar Motion Prediction

    Science.gov (United States)

    Wang, Z. W.; Wang, Q. X.; Ding, Y. Q.; Zhang, J. J.; Liu, S. S.

    2017-03-01

    There are two problems with the LS (Least Squares)+AR (AutoRegressive) model in polar motion forecasting: the inner residual value of the LS fit is reasonable, but the residual value of the LS extrapolation is poor; and the LS fitting residual sequence is non-linear, so it is unsuitable to establish an AR model for the residual sequence to be forecast based on the residual sequence before the forecast epoch. In this paper, we solve these two problems in two steps. First, restrictions are added to the two endpoints of the LS fitting data to fix them on the LS fitting curve, so that the fitted values next to the two endpoints are very close to the observed values. Secondly, we select the interpolation residual sequence of an inward LS fitting curve, which has a variation trend similar to that of the LS extrapolation residual sequence, as the modeling object of the AR residual forecast. Calculation examples show that this solution can effectively improve the short-term polar motion prediction accuracy of the LS+AR model. In addition, comparison with the RLS (Robustified Least Squares)+AR, RLS+ARIMA (AutoRegressive Integrated Moving Average), and LS+ANN (Artificial Neural Network) forecast models confirms the feasibility and effectiveness of the solution. The results, especially for polar motion forecasts over 1-10 days, show that the forecast accuracy of the proposed model can reach the world level.
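The LS+AR idea can be sketched with a straight-line LS fit plus an AR(1) forecast of its residuals. This is a toy illustration of the scheme, not the paper's full harmonic LS model or higher-order AR; the data are synthetic.

```python
# Toy LS+AR sketch: deterministic LS fit, then AR(1) forecast of residuals.

def ls_ar_forecast(y):
    n = len(y)
    # 1) LS straight-line fit (closed-form slope and intercept)
    tbar = (n - 1) / 2.0
    ybar = sum(y) / n
    slope = (sum((t - tbar) * (yt - ybar) for t, yt in enumerate(y))
             / sum((t - tbar) ** 2 for t in range(n)))
    intercept = ybar - slope * tbar
    resid = [yt - (intercept + slope * t) for t, yt in enumerate(y)]
    # 2) AR(1) coefficient estimated from the residual sequence
    phi = (sum(a * b for a, b in zip(resid[1:], resid[:-1]))
           / sum(r * r for r in resid[:-1]))
    # forecast for epoch n: extrapolated trend + AR(1)-propagated residual
    return intercept + slope * n + phi * resid[-1]

# trend plus an alternating (strongly autocorrelated) residual pattern
y = [0.1 * t + (0.5 if t % 2 == 0 else -0.5) for t in range(20)]
fcast = ls_ar_forecast(y)
print(round(fcast, 3))  # near the true next value 2.5
```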

  20. Installation of a second superconducting wiggler at SAGA-LS

    Energy Technology Data Exchange (ETDEWEB)

    Kaneyasu, T., E-mail: kaneyasu@saga-ls.jp; Takabayashi, Y.; Iwasaki, Y.; Koda, S. [SAGA Light Source, 8-7 Yayoigaoka, Tosu 841-0005 (Japan)

    2016-07-27

    The SAGA Light Source is a synchrotron radiation facility consisting of a 255 MeV injector linac and a 1.4 GeV storage ring with a circumference of 75.6 m. A superconducting wiggler (SCW) with a peak magnetic field of 4 T has been routinely operating for generating hard X-rays since its installation in 2010. In light of this success, it was decided to install a second SCW as a part of the beamline construction by Sumitomo Electric Industries. To achieve this, machine modifications including installation of a new magnet power supply, improvement of the magnet control system, and replacement of the vacuum chambers in the storage ring were carried out. Along with beamline construction, installation and commissioning of the second SCW are scheduled to take place in 2015.

  1. Calculation of “LS-curves” for coincidence summing corrections in gamma ray spectrometry

    Science.gov (United States)

    Vidmar, Tim; Korun, Matjaž

    2006-01-01

    When coincidence summing correction factors for extended samples are calculated in gamma-ray spectrometry from full-energy-peak and total efficiencies, their variation over the sample volume needs to be considered. In other words, the correction factors cannot be computed as if the sample were a point source. A method developed by Blaauw and Gelsema takes the variation of the efficiencies over the sample volume into account. It introduces the so-called LS-curve in the calibration procedure and only requires the preparation of a single standard for each sample geometry. We propose to replace the standard preparation by calculation and we show that the LS-curves resulting from our method yield coincidence summing correction factors that are consistent with the LS values obtained from experimental data.

  2. A threshold-based fixed predictor for JPEG-LS image compression

    Science.gov (United States)

    Deng, Lihua; Huang, Zhenghua; Yao, Shoukui

    2018-03-01

    In JPEG-LS, the fixed predictor based on the median edge detector (MED) detects only horizontal and vertical edges, and thus produces large prediction errors near diagonal edges. In this paper, we propose a threshold-based edge detection scheme for the fixed predictor. The proposed scheme can detect not only horizontal and vertical edges, but also diagonal edges. For certain thresholds, the proposed scheme simplifies to other existing schemes, so it can also be regarded as an integration of these schemes. For a suitable threshold, the accuracy of horizontal and vertical edge detection is higher than that of the existing median edge detection in JPEG-LS. Thus, the proposed fixed predictor outperforms the existing JPEG-LS predictors for all images tested, while the complexity of the overall algorithm is maintained at a similar level.
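For reference, the standard MED fixed predictor of JPEG-LS (which the proposed scheme generalizes) is only a few lines: given the causal neighbours a (left), b (above), and c (above-left), it switches between edge and smooth-region predictions.

```python
# The JPEG-LS median edge detector (MED) fixed predictor:
#   min(a, b)  if c >= max(a, b)   (edge detected)
#   max(a, b)  if c <= min(a, b)   (edge detected)
#   a + b - c  otherwise           (smooth region, planar prediction)

def med_predict(a, b, c):
    if c >= max(a, b):
        return min(a, b)
    if c <= min(a, b):
        return max(a, b)
    return a + b - c

print(med_predict(100, 60, 80))   # smooth region -> 100 + 60 - 80 = 80
print(med_predict(10, 200, 220))  # c above both neighbours -> min = 10
```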

  3. Can the proton injectors meet the HL-LHC requirements after LS2?

    International Nuclear Information System (INIS)

    Goddard, B.; Bartosik, H.; Bracco, C.; Bruening, O.; Carli, C.; Cornelis, K.; Damerau, H.; Garoby, R.; Gilardoni, S.; Hancock, S.; Hanke, K.; Kain, V.; Meddahi, M.; Mikulec, B.; Papaphilippou, Y.; Rumolo, G.; Shaposhnikova, E.; Steerenberg, R.; Vretenar, M.

    2012-01-01

    The LIU project has as mandate the upgrade of the LHC injector chain to match the requirements of HL-LHC. The present planning assumes that the upgrade work will be completed in LS2, for commissioning in the following operational year. The known limitations in the different injectors are described, together with the various upgrades planned to improve the performance. The expected performance reach after the upgrade with 25 and 50 ns beams is examined. The project planning is discussed in view of the present LS1 and LS2 planning. The main unresolved questions and associated decision points are presented, and the key issues to be addressed by the end of 2012 are detailed in the context of the machine development programs and hardware construction activities. (authors)

  4. Essential idempotents and simplex codes

    Directory of Open Access Journals (Sweden)

    Gladys Chalom

    2017-01-01

    Full Text Available We define essential idempotents in group algebras and use them to prove that every minimal abelian non-cyclic code is a repetition code. We also use them to prove that every minimal abelian code is equivalent to a minimal cyclic code of the same length. Finally, we show that a binary cyclic code is a simplex code if and only if it has length of the form $n=2^k-1$ and is generated by an essential idempotent.
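The simplex characterization can be checked directly on the smallest nontrivial case, the [7, 3] binary simplex code with $n = 2^3 - 1$. The generator polynomial below is a standard choice, used here for illustration; all nonzero codewords of a simplex code share the same weight.

```python
# The [7,3] binary simplex code as a cyclic code of length 2^3 - 1,
# generated by g(x) = (x^7 + 1)/(x^3 + x + 1) = 1 + x + x^2 + x^4 over GF(2).
from itertools import product

n, k = 7, 3
g = [1, 1, 1, 0, 1]  # coefficients of g(x), ascending degree

def encode(msg):
    # polynomial product msg(x) * g(x) over GF(2), reduced mod x^n + 1
    word = [0] * n
    for i, m in enumerate(msg):
        if m:
            for j, gj in enumerate(g):
                word[(i + j) % n] ^= gj
    return word

codewords = [encode(m) for m in product([0, 1], repeat=k)]
weights = sorted(set(sum(w) for w in codewords if any(w)))
print(weights)  # [4] -> every nonzero codeword has weight 4
```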

  5. BOT3P5.2, 3D Mesh Generator and Graphical Display of Geometry for Radiation Transport Codes, Display of Results

    International Nuclear Information System (INIS)

    Orsi, Roberto; Bidaud, Adrien

    2007-01-01

    1 - Description of program or function: BOT3P was originally conceived as a set of standard FORTRAN 77 language programs in order to give the users of the DORT and TORT deterministic transport codes some useful diagnostic tools to prepare and check their input data files. Later versions extended this capability, producing geometry, material distribution and fixed neutron source data for other deterministic transport codes such as TWODANT/THREEDANT of the DANTSYS system, PARTISN and, potentially, any transport code, through BOT3P binary output files that can be easily interfaced (see, for example, the Russian two-dimensional (2D) and three-dimensional (3D) discrete ordinates neutron, photon and charged particle transport codes KASKAD-S-2.5 and KATRIN-2.0). As of Version 5.1, BOT3P contains important additions specifically addressed to radiation transport analysis for medical applications. BOT3P-5.2 contains new graphics capabilities. Some of them enable users to select spatial sub-domains of the total mesh grid in order to improve the zoom simulation of the geometry, both in 2D cuts and in 3D. Moreover, the new BOT3P module (PDTM) may improve the interface of BOT3P geometrical models to transport analysis codes. The following programs are included in the BOT3P software package: GGDM, DDM, GGTM, DTM2, DTM3, RVARSCL, COMPARE, MKSRC, CATSM, DTET, and PDTM. The main features of these different programs are described. 2 - Methods: GGDM and GGTM work similarly from the logical point of view. Since the 3D case is more general, the following description refers to GGTM. All the co-ordinate values that characterise the geometrical scheme at the basis of the 3D transport code's geometrical and material model are read, sorted, and stored if they differ from the neighbouring values by more than an input tolerance established by the user. These co-ordinates are always present in the fine-mesh boundary arrays independently of the mesh grid refinement options, because they
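The coordinate-collection step described above (read, sort, and keep only values that differ from their neighbours by more than a tolerance) can be sketched with a hypothetical helper; this is not BOT3P code.

```python
# Sketch of tolerance-based mesh-boundary extraction (hypothetical helper):
# sort all candidate plane coordinates and keep a value only if it differs
# from the previously kept one by more than a user tolerance.

def mesh_boundaries(coords, tol):
    out = []
    for c in sorted(coords):
        if not out or c - out[-1] > tol:
            out.append(c)
    return out

# x-planes collected from several overlapping bodies; near-duplicates merge
print(mesh_boundaries([0.0, 5.0, 5.0001, 10.0, 9.9999, 20.0], tol=1e-3))
# -> [0.0, 5.0, 9.9999, 20.0]
```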

  6. Wien Automatic System Package (WASP). A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 2: Appendices

    International Nuclear Information System (INIS)

    1995-01-01

    With several Member States, the IAEA has completed a new version of the WASP program, which has been called WASP-III Plus since it follows quite closely the methodology of the WASP-III model. The major enhancements in WASP-III Plus with respect to the WASP-III version are: increase in the number of thermal fuel types (from 5 to 10); verification of which configurations generated by CONGEN have already been simulated in previous iterations with MERSIM; direct calculation of the combined loading order of FIXSYS and VARSYS plants; simulation of system operation including consideration of physical constraints imposed on some fuel types (i.e., fuel availability for electricity generation); extended output of the resimulation of the optimal solution; generation of a file that can be used for graphical representation of the results of the resimulation of the optimal solution and cash flows of the investment costs; calculation of cash flows allowing inclusion of the capital costs of plants firmly committed or in construction (FIXSYS plants); user control of the distribution of capital cost expenditures during the construction period (if required to be different from the general 'S' curve distribution used as default). This second volume of the document to support use of the WASP-III Plus computer code consists of 5 appendices giving some additional information about the WASP-III Plus program. Appendix A is mainly addressed to the WASP-III Plus system analyst and supplies some information which could help in the implementation of the program on the user's computer facilities. This appendix also includes some aspects of WASP-III Plus that could not be treated in detail in Chapters 1 to 11. Appendix B identifies all error and warning messages that may appear in the WASP printouts and advises the user how to overcome the problem. Appendix C presents the flow charts of the programs along with a brief description of the objectives and structure of each module.
Appendix D describes the

  7. LA ICLASSE – SMARTPHONE E ITALIANO L2/LS

    Directory of Open Access Journals (Sweden)

    Filippo Zanoli

    2012-02-01

    Full Text Available The 1st class – Smartphones and Italian L2/FL. Classrooms (for Italian L2, but not only) have never before been so strongly digitally oriented. This invasion is made tangible by smartphones and tablets equipped with internet connections and infinite potential in the form of applications, which can help students in their studies or, if used incorrectly, hinder them from undertaking a profitable learning path. The language teacher's task is to circumscribe the influence and use of these tools – which are not mere techné, but the way an entire generation functions – not to prohibit using them but to propose using them constructively, thanks to their great potential for foreign language learning.

  8. Intermediate coupling collision strengths from LS coupled R-matrix elements

    International Nuclear Information System (INIS)

    Clark, R.E.H.

    1978-01-01

    Fine structure collision strengths for transitions between two groups of states in intermediate coupling, with inclusion of configuration mixing, are obtained from LS coupled reactance matrix elements (R-matrix elements) and a set of mixing coefficients. The LS coupled R-matrix elements are transformed to pair coupling using Wigner 6-j coefficients. From these pair coupled R-matrix elements, together with a set of mixing coefficients, R-matrix elements are obtained which include the intermediate coupling and configuration mixing effects. Finally, from the latter R-matrix elements, collision strengths for fine structure transitions are computed (with inclusion of both intermediate coupling and configuration mixing). (Auth.)

  9. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-IV. User's manual

    International Nuclear Information System (INIS)

    2001-01-01

    generated by each plant and the user-specified characteristics of fuels used. Expanded dimensions for handling up to 90 types of plants and a larger number of configurations (up to 500 per year and 5000 for the study period). The present manual supports the use of the WASP-IV version and illustrates the capabilities of the model. This manual contains 13 chapters. Chapter 1 gives a summary description of the WASP-IV computer code and its modules and file system. Chapter 2 explains the hardware requirements and the installation of the package. The sequence of execution of WASP-IV is also briefly introduced in this chapter. Chapters 3 to 9 explain, in detail, how to execute each module of the WASP-IV package, the organisation of input files and the output from a run of the model. Special attention was paid to the description of the linkage of modules. Chapter 10 gives special guidance on how to effectively search for an optimal solution. Chapter 11 describes the execution of sensitivity analyses that can be (and are recommended to be) performed with WASP-IV. To ease debugging while running the software, Chapter 12 provides technical details of the new features incorporated in this version. Chapter 13 provides a list of error and warning messages produced by each module of WASP. The reader of this manual is assumed to have experience in the field of power generation expansion planning and to be familiar with all concepts related to such analysis; these aspects are therefore not treated in this manual. Additional information on power generation expansion planning can be found in the IAEA publication 'Expansion Planning for Electrical Generating Systems, A Guidebook', Technical Reports Series No. 241 (1984) or User's Manual of WASP-IV Plus, Computer Manual Series No. 8, (1995)

  10. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling of each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  11. Application and evaluation of LS-PIV technique for the monitoring of river surface velocities in high flow conditions

    OpenAIRE

    Jodeau , M.; Hauet , A.; Paquier , A.; Le Coz , J.; Dramais , G.

    2008-01-01

    Large Scale Particle Image Velocimetry (LS-PIV) is used to measure the surface flow velocities in a mountain stream during high flow conditions due to a reservoir release. A complete installation including video acquisition from a mobile elevated viewpoint and artificial flow seeding has been developed and implemented. The LS-PIV method was adapted in order to take into account the specific constraints of these high flow conditions. Using a usual LS-PIV data processing, significant variations...

  12. Z₂-double cyclic codes

    OpenAIRE

    Borges, J.

    2014-01-01

    A binary linear code C is a Z2-double cyclic code if the set of coordinates can be partitioned into two subsets such that any cyclic shift of the coordinates of both subsets leaves the code invariant. These codes can be identified as submodules of the Z2[x]-module Z2[x]/(x^r − 1) × Z2[x]/(x^s − 1). We determine the structure of Z2-double cyclic codes, giving the generator polynomials of these codes. The related polynomial representation of Z2-double cyclic codes and their duals, and the relation...
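
    The defining shift-invariance above is easy to check computationally. The sketch below uses a hypothetical generator word and small parameters r = 3, s = 4 (not taken from the paper): it builds the smallest binary code containing all simultaneous cyclic shifts of the generator and verifies that the result is indeed Z2-double cyclic.

    ```python
    from itertools import product

    r, s = 3, 4
    n = r + s

    def double_shift(c):
        """Simultaneous cyclic shift of the first r and the last s coordinates."""
        a, b = c[:r], c[r:]
        return (a[-1],) + a[:-1] + (b[-1],) + b[:-1]

    g = (1, 1, 0, 1, 0, 1, 0)  # hypothetical generator word

    # Collect the full shift orbit of g (the orbit length divides lcm(r, s)).
    shifts = []
    c = g
    for _ in range(r * s):
        if c not in shifts:
            shifts.append(c)
        c = double_shift(c)

    # GF(2) span of the orbit: the smallest Z2-double cyclic code containing g.
    code = set()
    for coeffs in product((0, 1), repeat=len(shifts)):
        w = tuple(sum(k * v[i] for k, v in zip(coeffs, shifts)) % 2 for i in range(n))
        code.add(w)

    # Double-cyclic invariance: shifting any codeword stays inside the code.
    assert all(double_shift(w) in code for w in code)
    print(len(code))  # the code size is a power of 2
    ```

    Since the orbit is closed under the double shift and the code is its linear span, shift-invariance of the whole code follows; the assertion makes this concrete for the toy generator.
    
    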

  13. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  14. Fundamentals, current state of the development of, and prospects for further improvement of the new-generation thermal-hydraulic computational HYDRA-IBRAE/LM code for simulation of fast reactor systems

    Science.gov (United States)

    Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.

    2016-02-01

    The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational HYDRA-IBRAE/LM code are presented. The code is intended to simulate the thermal-hydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal cooled fast reactor systems under normal operation and anticipated operational occurrences and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and justifies the need for the development of a new-generation HYDRA-IBRAE/LM code. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and phenomena are singled out that require a detailed analysis and the development of models in order to be correctly described by the system thermal-hydraulic code in question. Information on the functionality of the computational code is provided, viz., the thermal-hydraulic two-phase model, the properties of the sodium and the lead coolants, the closing equations for simulation of the heat-mass exchange processes, the models describing the processes that take place during a steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as the possibilities of taking advantage of modern computer technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code; it also presents information on the principles of constructing and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, introduction of new models into it, and enhancement of its usability. 
It is shown that the program of development and

  15. Development and Validation of an Instrument to Measure the Impact of Genetic Testing on Self-Concept in Lynch Syndrome (LS)

    Science.gov (United States)

    Esplen, Mary Jane; Stuckless, Noreen; Wong, Jiahui; Gallinger, Steve; Aronson, Melyssa; Rothenmund, Heidi; Semotiuk, Kara; Stokes, Jackie; Way, Chris; Green, Jane; Butler, Kate; Petersen, Helle Vendel

    2011-01-01

    Background A positive genetic test result may impact a person’s self-concept and affect quality of life. Purpose The purpose of the study was to develop a self-concept scale to measure such impact for individuals carrying mutations for a heritable colorectal cancer, Lynch syndrome (LS). Methods Two distinct phases were involved: Phase I generated colorectal-specific self-concept candidate scale items from interviews with eight LS carriers and five genetic counselors, which were added to a previously developed self-concept scale for BRCA1/2 mutation carriers. In Phase II, 115 LS carriers completed the candidate scale and a battery of validating measures. Results A 20-item scale was developed, with two dimensions identified through factor analysis: stigma/vulnerability and bowel symptom-related anxiety. The scale demonstrated excellent reliability (Cronbach’s α = .93), good convergent validity through a high correlation with the Impact of Event Scale (r(102) = .55), and divergent validity against the Rosenberg Self-Esteem Scale (r(108) = −.59); the scale’s performance was stable across participant characteristics. Conclusions This new scale for measuring self-concept has potential to be used as a clinical tool and as a measure in future studies. PMID:21883167

  16. LS-SVM: uma nova ferramenta quimiométrica para regressão multivariada. Comparação de modelos de regressão LS-SVM e PLS na quantificação de adulterantes em leite em pó empregando NIR LS-SVM: a new chemometric tool for multivariate regression. Comparison of LS-SVM and pls regression for determination of common adulterants in powdered milk by nir spectroscopy

    Directory of Open Access Journals (Sweden)

    Marco F. Ferrão

    2007-08-01

    Full Text Available Least-squares support vector machines (LS-SVM were used as an alternative multivariate calibration method for the simultaneous quantification of some common adulterants found in powdered milk samples, using near-infrared spectroscopy. Excellent models were built using LS-SVM, as judged by the R², RMSECV and RMSEP values. LS-SVM showed superior performance to PLSR for quantifying starch, whey and sucrose in powdered milk samples. This study shows that it is possible to determine precisely the amount of one or two common adulterants simultaneously in powdered milk samples using LS-SVM and NIR spectra.

  17. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD, we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
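
    Unique decipherability itself is decidable for finite codes by the classical Sardinas-Patterson test, which repeatedly computes "dangling suffixes" and fails if one of them is a codeword. A minimal sketch (this is the standard UD test, not the canonical-partition algorithm of the paper):

    ```python
    def is_uniquely_decipherable(code):
        """Sardinas-Patterson test: a code is UD iff no dangling suffix is a codeword."""
        code = set(code)

        def residuals(A, B):
            # suffixes left over when a word of A is a proper prefix of a word of B
            out = set()
            for a in A:
                for b in B:
                    if a != b and b.startswith(a):
                        out.add(b[len(a):])
            return out

        current = residuals(code, code)
        seen = set()
        while current:
            if current & code:
                return False          # a dangling suffix is itself a codeword
            if frozenset(current) in seen:
                return True           # suffix sets cycle without hitting the code
            seen.add(frozenset(current))
            current = residuals(current, code) | residuals(code, current)
        return True

    print(is_uniquely_decipherable({"0", "01", "11"}))  # True: a suffix code, hence UD
    print(is_uniquely_decipherable({"0", "01", "10"}))  # False: "010" parses two ways
    ```

    Termination is guaranteed because every suffix set is drawn from the finite set of suffixes of codewords, so the sets must eventually repeat or become empty.
    
    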

  18. Loft CIS analysis 2''-LS-118-AB outside containment penetration S5-D

    International Nuclear Information System (INIS)

    Morton, D.K.

    1978-01-01

    A stress analysis was performed on the 2''-LS-118-AB pipe system outside containment penetration S5-D. Deadweight, thermal expansion, and seismic loads were considered. The results indicate that this piping will meet ASME Section III, Class 2 requirements provided a U-bolt (S4) is installed as indicated in this report

  19. Remediation of Learning Disabled Children Following L.S. Vygotsky's Approach

    Directory of Open Access Journals (Sweden)

    Janna M. Glozman

    2011-01-01

    Full Text Available The paper defines remediating education, its peculiarities in contrast to traditional education, and its main tasks and principles, based upon the cultural-historical theory of L.S. Vygotsky. The basic functional systems formed during remediation are discussed. The peculiarities of individual, group, and dyadic methods of remediation are described with regard to their potential for mediating the child's activity.

  20. Introducing instrumental variables in the LS-SVM based identification framework

    NARCIS (Netherlands)

    Laurain, V.; Zheng, W-X.; Toth, R.

    2011-01-01

    Least-Squares Support Vector Machines (LS-SVM) represent a promising approach to identify nonlinear systems via nonparametric estimation of the nonlinearities in a computationally and stochastically attractive way. All the methods dedicated to the solution of this problem rely on the minimization of

  1. Loft CIS analysis 2''-LS-118-AB outside containment penetration S5-D

    Energy Technology Data Exchange (ETDEWEB)

    Morton, D.K.

    1978-09-28

    A stress analysis was performed on the 2''-LS-118-AB pipe system outside containment penetration S5-D. Deadweight, thermal expansion, and seismic loads were considered. The results indicate that this piping will meet ASME Section III, Class 2 requirements provided a U-bolt (S4) is installed as indicated in this report.

  2. Regional-scale calculation of the LS factor using parallel processing

    Science.gov (United States)

    Liu, Kai; Tang, Guoan; Jiang, Ling; Zhu, A.-Xing; Yang, Jianyi; Song, Xiaodong

    2015-05-01

    With the increase of data resolution and the increasing application of the USLE over large areas, the existing serial implementations of algorithms for computing the LS factor are becoming a bottleneck. In this paper, a parallel processing model based on the message passing interface (MPI) is presented for the calculation of the LS factor, so that massive datasets at a regional scale can be processed efficiently. The parallel model contains algorithms for calculating flow direction, flow accumulation, drainage network, slope, slope length and the LS factor. According to the existence of data dependence, the algorithms are divided into local algorithms and global algorithms. Parallel strategies are designed according to the algorithms' characteristics, including a decomposition method that maintains the integrity of the results, an optimized workflow that reduces the time spent exporting unnecessary intermediate data, and a buffer-communication-computation strategy that improves communication efficiency. Experiments on a multi-node system show that the proposed parallel model allows efficient calculation of the LS factor at a regional scale with a massive dataset.
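
    The per-cell LS computation that such a parallel model distributes is itself cheap. A commonly used RUSLE formulation (McCool et al.; quoted here from the general literature as an assumption, since the abstract does not spell out its equations) can be sketched as:

    ```python
    import math

    def rusle_ls(slope_length_m, slope_deg):
        """LS factor for one cell, following a common RUSLE formulation (assumed)."""
        theta = math.radians(slope_deg)
        # slope steepness factor S, with the usual 9 % breakpoint
        if math.tan(theta) < 0.09:
            S = 10.8 * math.sin(theta) + 0.03
        else:
            S = 16.8 * math.sin(theta) - 0.50
        # slope length exponent m from the rill/interrill ratio beta
        beta = (math.sin(theta) / 0.0896) / (3.0 * math.sin(theta) ** 0.8 + 0.56)
        m = beta / (1.0 + beta)
        L = (slope_length_m / 22.13) ** m   # 22.13 m is the unit-plot length
        return L * S

    print(round(rusle_ls(22.13, 5.0), 3))  # → 0.971 (L = 1 at unit length, so LS = S)
    ```

    On a raster, this function would be applied per cell after slope and slope length are derived from the flow-direction and flow-accumulation grids mentioned above.
    
    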

  3. Extension of a GIS procedure for calculating the RUSLE equation LS factor

    NARCIS (Netherlands)

    Zhang, H.; Yang, Q.; Li, R.; Liu, Q.; Moore, D.; He, P.; Ritsema, C.J.; Geissen, V.

    2013-01-01

    The Universal Soil Loss Equation (USLE) and the revised USLE (RUSLE) are often used to estimate soil erosion at regional landscape scales; however, a major limitation is the difficulty of extracting the LS factor. The geographic information system-based (GIS-based) methods which have been developed for

  4. Electromagnetic transitions of heavy quarkonia in the boosted LS-coupling scheme

    International Nuclear Information System (INIS)

    Ishida, Shin; Morikawa, Akiyoshi; Oda, Masuho

    1998-01-01

    Radiative transitions among heavy quarkonium systems are investigated in a general framework of the boosted LS-coupling (BLS) scheme, where mesons are treated in a manifestly covariant way and conserved effective currents are explicitly given. As a result it is shown that our theory reproduces the qualitative features of experiments remarkably well, giving evidence for the validity of the BLS scheme. (author)

  5. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 1: Chapters 1-11

    International Nuclear Information System (INIS)

    1995-01-01

    (FIXSYS plants); user control of the distribution of capital cost expenditures during the construction period (if required to be different from the general 'S' curve distribution used as default). The present document has been produced to support use of the WASP-III Plus computer code and to illustrate the capabilities of the program. This Manual is organized in two separate volumes. This first one includes 11 main chapters describing how to use the WASP-III Plus computer program. Chapter 1 gives a summary description and some background information about the program. Chapter 2 introduces some concepts, mainly related to the computer requirements imposed by the program, that are used throughout the Manual. Chapters 3 to 9 describe how to execute each of the various programs (or modules) of the WASP-III Plus package. The description for each module shows the user how to prepare the Job Control statements and input data needed to execute the module and how to interpret the printed output produced. The iterative process that should be followed in order to obtain the 'optimal solution' for a WASP case study is covered in Chapters 6 to 8. Chapter 10 explains the use of an auxiliary program of the WASP package which is mainly intended for saving computer time. Lastly, Chapter 11 recapitulates the use of WASP-III Plus for executing a generation expansion planning study; describes the several phases normally involved in this type of study; and provides the user with practical hints about the most important aspects that need to be verified at each phase while executing the various WASP modules

  6. Scalable-to-lossless transform domain distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Ukhanova, Ann; Veselov, Anton

    2010-01-01

    Distributed video coding (DVC) is a novel approach providing new features such as low-complexity encoding, by mainly exploiting the source statistics at the decoder based on the availability of decoder side information. In this paper, scalable-to-lossless DVC is presented based on extending a lossy Tran...... codec provides frame by frame encoding. Comparing the lossless coding efficiency, the proposed scalable-to-lossless TDWZ video codec can save up to 5%-13% in bits compared to JPEG LS and H.264 Intra-frame lossless coding, and do so as a scalable-to-lossless coding....

  7. Method to generate the first design of the reload pattern to be used with the Presto-B code in the simulation of the CNLV U-1 reactor

    International Nuclear Information System (INIS)

    Montes T, J.L.; Cortes C, C.C.

    1992-08-01

    This guide applies to the formation of reload patterns with mirror symmetry over a quarter of the core, in accordance with the Control Cell Core (CCC) technique, for the PRESTO-B code. (Author)

  8. Dual Coding, Reasoning and Fallacies.

    Science.gov (United States)

    Hample, Dale

    1982-01-01

    Develops the theory that a fallacy is not a comparison of a rhetorical text to a set of definitions but a comparison of one person's cognition with another's. Reviews Paivio's dual coding theory, relates nonverbal coding to reasoning processes, and generates a limited fallacy theory based on dual coding theory. (PD)

  9. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 1: Chapters 1-11

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    As a continuation of its effort to provide comprehensive and impartial guidance to Member States facing the need for introducing nuclear power, the IAEA has completed a new version of the Wien Automatic System Planning (WASP) Package for carrying out power generation expansion planning studies. WASP was originally developed in 1972 in the USA to meet the IAEA's needs to analyze the economic competitiveness of nuclear power in comparison to other generation expansion alternatives for supplying the future electricity requirements of a country or region. The model was first used by the IAEA to conduct global studies (Market Survey for Nuclear Power Plants in Developing Countries, 1972-1973) and to carry out Nuclear Power Planning Studies for several Member States. The WASP system developed into a very comprehensive planning tool for electric power system expansion analysis. Following these developments, the so-called WASP-III version was produced in 1979. This version introduced important improvements to the system, namely in the treatment of hydroelectric power plants. The WASP-III version has been continually updated and maintained in order to incorporate needed enhancements. In 1981, the Model for Analysis of Energy Demand (MAED) was developed in order to allow the determination of electricity demand, consistent with the overall requirements for final energy, and thus, to provide a more adequate forecast of electricity needs to be considered in the WASP study. MAED and WASP have been used by the Agency for the conduct of Energy and Nuclear Power Planning Studies for interested Member States. More recently, the VALORAGUA model was completed in 1992 as a means for helping in the preparation of the hydro plant characteristics to be input in the WASP study and to verify that the WASP overall optimized expansion plan takes also into account an optimization of the use of water for electricity generation. The combined application of VALORAGUA and WASP permits the

  10. Linda Leen's self-image, her image in the print media and in fan publics

    OpenAIRE

    Komarovskis, Jānis

    2011-01-01

    The topic of this bachelor's thesis is "Linda Leen's self-image, her image in the print media and in fan publics". The aim of the thesis is to study Linda Leen's self-image and her image in the print media and among fan publics. The theoretical part of the thesis is based on Klaus Merten's theory of image formation, which holds that the formation of an image in the audience's perception is closely tied to the information about the given object published in the media. Four research methods were used for the study: a semi-structured interview, media con...

  11. Random linear codes in steganography

    Directory of Open Access Journals (Sweden)

    Kamil Kaczyński

    2016-12-01

    Full Text Available Syndrome coding using linear codes is a technique that allows improvement of the parameters of steganographic algorithms. The use of random linear codes gives great flexibility in choosing the parameters of the linear code; in parallel, it offers easy generation of the parity-check matrix. In this paper, a modification of the LSB algorithm is presented. A random linear code [8, 2] was used as the basis for the modification. The proposed algorithm was implemented, along with a practical evaluation of its parameters based on test images. Keywords: steganography, random linear codes, RLC, LSB
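
    Syndrome coding (matrix embedding) can be sketched in a few lines: the message is carried as the syndrome H·x of the cover bits, and embedding searches for a minimum-weight change that produces the desired syndrome. The H below is a random systematic parity-check matrix chosen for illustration; the paper's exact [8, 2] code is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Systematic parity-check matrix H = [I | A] of a random binary [n, k] code,
    # here n = 8, k = 2; H is (n-k) x n and has full rank by construction.
    n, k = 8, 2
    H = np.hstack([np.eye(n - k, dtype=int), rng.integers(0, 2, size=(n - k, k))])

    def embed(cover, message):
        """Flip the fewest cover bits so that H @ stego = message (mod 2)."""
        target = (message - H @ cover) % 2
        best = None
        for i in range(2 ** n):  # brute-force coset-leader search; fine for n = 8
            e = np.array([(i >> j) & 1 for j in range(n)])
            if np.array_equal(H @ e % 2, target) and (best is None or e.sum() < best.sum()):
                best = e
        return (cover + best) % 2

    def extract(stego):
        # the receiver only needs H: the message is the syndrome of the stego bits
        return H @ stego % 2

    cover = rng.integers(0, 2, size=n)
    msg = rng.integers(0, 2, size=n - k)
    stego = embed(cover, msg)
    assert np.array_equal(extract(stego), msg)
    print(int((stego != cover).sum()), "bit(s) changed out of", n)
    ```

    In a real LSB scheme the cover bits would be the least significant bits of pixel blocks; the brute-force coset-leader search would be replaced by precomputed coset leaders for efficiency.
    
    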

  12. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs

  13. Extension of ANISN and DOT 3.5 transport computer codes to calculate heat generation by radiation and temperature distribution in nuclear reactors

    International Nuclear Information System (INIS)

    Torres, L.M.R.; Gomes, I.C.; Maiorino, J.R.

    1986-01-01

    The ANISN and DOT 3.5 codes solve the transport equation using the discrete ordinates method, in one and two dimensions, respectively. The objective of this study was to modify these two codes, frequently used in reactor shielding problems, to include nuclear heating calculations due to the interaction of neutrons and gamma rays with matter. In order to determine the temperature distribution, a numerical algorithm was developed using the finite difference method to solve the heat conduction equation, in one and two dimensions, with the nuclear heating from neutrons and gamma rays as the source term. (Author) [pt
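
    The finite-difference step described above can be illustrated in one dimension. The sketch below (illustrative material values, not taken from the report) solves steady conduction with a uniform volumetric heating source and checks against the analytic parabolic profile.

    ```python
    import numpy as np

    # 1-D steady heat conduction with a volumetric nuclear heating source:
    #   -k * d2T/dx2 = q(x),   T(0) = T(L) = T_wall
    k = 20.0          # W/(m K), illustrative conductivity
    L = 0.1           # m, slab thickness
    T_wall = 300.0    # K
    N = 101
    x = np.linspace(0.0, L, N)
    h = x[1] - x[0]
    q = 1e6 * np.ones(N)   # W/m^3, uniform heating from neutron/gamma deposition

    # Tridiagonal system from central differences on the interior nodes:
    #   2*T[i] - T[i-1] - T[i+1] = q[i] * h^2 / k
    A = np.zeros((N - 2, N - 2))
    np.fill_diagonal(A, 2.0)
    np.fill_diagonal(A[1:], -1.0)
    np.fill_diagonal(A[:, 1:], -1.0)
    b = q[1:-1] * h ** 2 / k
    b[0] += T_wall          # fold the Dirichlet boundary values into the RHS
    b[-1] += T_wall
    T = np.full(N, T_wall)
    T[1:-1] = np.linalg.solve(A, b)

    # Analytic check for uniform q: T(x) = T_wall + q x (L - x) / (2 k)
    T_exact = T_wall + q * x * (L - x) / (2 * k)
    print(float(abs(T - T_exact).max()))  # central differences are exact for a quadratic T
    ```

    In the modified transport codes, q(x) would come from the computed neutron and gamma reaction rates instead of being prescribed.
    
    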

  14. Semantic Web applications and tools for the life sciences: SWAT4LS 2010.

    Science.gov (United States)

    Burger, Albert; Paschke, Adrian; Romano, Paolo; Marshall, M Scott; Splendiani, Andrea

    2012-01-25

    As Semantic Web technologies mature and new releases of key elements, such as SPARQL 1.1 and OWL 2.0, become available, the Life Sciences continue to push the boundaries of these technologies with ever more sophisticated tools and applications. Unsurprisingly, therefore, interest in the SWAT4LS (Semantic Web Applications and Tools for the Life Sciences) activities has remained high, as was evident during the third international SWAT4LS workshop held in Berlin in December 2010. Contributors to this workshop were invited to submit extended versions of their papers, the best of which are now made available in this special supplement of BMC Bioinformatics. The papers reflect the wide range of work in this area, covering the storage and querying of Life Sciences data in RDF triple stores, tools for the development of biomedical ontologies, and the semantics-based integration of Life Sciences as well as clinical data.

  15. LS1 “First Long Shutdown of LHC and its Injector Chains”

    CERN Multimedia

    Foraz, K; Barberan, M; Bernardini, M; Coupard, J; Gilbert, N; Hay, D; Mataguez, S; McFarlane, D

    2014-01-01

    The LHC and its Injectors were stopped in February 2013, in order to maintain, consolidate and upgrade the different equipment of the accelerator chain, with the goal of achieving LHC operation at the design energy of 14 TeV in the centre-of-mass. Prior to the start of this First Long Shutdown (LS1), a major effort of preparation was performed in order to optimize the schedule and the use of resources across the different machines, with the aim of resuming LHC physics in early 2015. The rest of the CERN complex will restart beam operation in the second half of 2014. This paper presents the schedule of the LS1, describes the organizational set-up for the coordination of the works, the main activities, the different main milestones, which have been achieved so far, and the decisions taken in order to mitigate the issues encountered.

  16. Calculating the LS factor of Universal Soil Loss Equation (USLE for the watershed of River Silver, Castelo-ES = Cálculo do fator LS da Equação Universal de Perdas de Solos (EUPS para a bacia do Rio da Prata, Castelo-ES

    Directory of Open Access Journals (Sweden)

    Luciano Melo Coutinho

    2014-01-01

    Full Text Available Erosion is considered the main cause of the depletion of agricultural land, generating approximate annual losses of billions of dollars in Brazil. Water erosion, caused by effective precipitation over the basin, is the most common form, since its erosive potential makes it the main agent reshaping the land. The Universal Soil Loss Equation (USLE) shows great applicability for estimating erosion in watersheds from their physical and geographical elements (PS = R*K*L*S*C*P). The intensity of erosion is influenced by the profile of the slope, measured by its length (L) and gradient (S). The topographic factor (LS) of the USLE is the most difficult to obtain for large and/or diverse relief areas. We calculated the spatial map of the LS factor for the Silver (Prata) watershed (Castelo-ES) by processing cartographic data in a Geographic Information Systems (GIS) environment. The relief of the study area was represented by interpolation from contour lines supported by the mapped hydrography, yielding a hydrologically consistent digital elevation model (MDEHC) and a slope map. For the generation of the LS factor, the equation developed by Bertoni and Lombardi Neto (2005), suitable for slopes of varying length and gradient, was used. The Silver watershed has diversified relief, marked by both flat and steeply sloping areas, which indicates erosive vulnerability. The minimum (0), mean (8.2) and maximum (80.5) values of LS were identified.
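
    The Bertoni and Lombardi Neto relation referenced above is commonly quoted in the literature as LS = 0.00984 · C^0.63 · D^1.18, with slope length C in metres and slope D in percent. The constants below are taken from that general literature, not from this paper, so treat them as an assumption.

    ```python
    def ls_bertoni_lombardi(slope_length_m, slope_percent):
        """LS factor per the commonly quoted Bertoni & Lombardi Neto relation (assumed):
        LS = 0.00984 * C^0.63 * D^1.18, C in metres, D in percent."""
        return 0.00984 * slope_length_m ** 0.63 * slope_percent ** 1.18

    print(round(ls_bertoni_lombardi(100.0, 10.0), 2))  # → 2.71
    ```

    In a GIS workflow, this function is applied per raster cell once slope length and slope gradient grids have been derived from the digital elevation model.
    
    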

  17. ABCB1 (P-glycoprotein) reduces bacterial attachment to human gastrointestinal LS174T epithelial cells.

    Science.gov (United States)

    Crowe, Andrew; Bebawy, Mary

    2012-08-15

    The aim of this project was to show that elevated P-glycoprotein (P-gp) expression decreases bacterial association with LS174T human gastrointestinal cells, and that this effect can be reversed by blocking functional P-gp efflux. Staphylococcus aureus, Klebsiella pneumoniae, Pseudomonas aeruginosa, Lactobacillus acidophilus and numerous strains of Escherichia coli, from commensal to enteropathogenic and enterohaemorrhagic strains (O157:H7), were fluorescently labelled and incubated on LS174T cultures either with or without P-gp amplification using rifampicin. PSC-833 was used as a potent functional P-gp blocking agent. Staphylococcus and Pseudomonas displayed the greatest association with the LS174T cells. Surprisingly, lactobacilli retained more fluorescence than enteropathogenic E. coli in this system. Irrespective of attachment differences between the bacterial species, the increase in P-gp protein expression decreased bacterial fluorescence by 25-30%. This included the GFP-labelled E. coli and enterohaemorrhagic E. coli (O157:H7). Blocking P-gp function through the co-administration of PSC-833 increased the amount of bacteria associated with P-gp-expressing LS174T cells back to control levels. As most bacteria were affected to the same degree, irrespective of pathogenicity, it is unlikely that P-gp has a direct influence on the adhesion of bacteria; instead, P-gp may play an indirect role by secreting a bank of endogenous factors or changing the local environment to one less suited to bacterial growth in general. Crown Copyright © 2012. Published by Elsevier B.V. All rights reserved.

  18. Frequency domain based LS channel estimation in OFDM based Power line communications

    OpenAIRE

    Bogdanović, Mario

    2015-01-01

    This paper focuses on the realization of low-voltage power line communication (PLC), with an emphasis on channel estimation techniques. The Orthogonal Frequency Division Multiplexing (OFDM) scheme is the preferred technology in PLC systems because it effectively combats the frequency-selective fading properties of the PLC channel. As channel estimation is one of the crucial problems in an OFDM-based PLC system, because of the problematic PLC signal attenuation and interference, the improved LS est...
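
    The plain LS estimator that such schemes improve upon is simply a per-pilot division of the received symbol by the known transmitted pilot, followed by interpolation across subcarriers. A minimal sketch with an illustrative 3-tap channel and an assumed comb pilot layout (neither taken from the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    N = 64                                       # subcarriers
    pilots = np.append(np.arange(0, N, 8), N - 1)  # assumed comb-type pilot positions
    X = np.ones(N, dtype=complex)                # BPSK pilots on all tones for simplicity

    h = np.array([0.8, 0.5j, 0.2])               # illustrative 3-tap channel impulse response
    H = np.fft.fft(h, N)                         # true channel frequency response
    noise = 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    Y = H * X + noise                            # received frequency-domain symbols

    # LS estimate at the pilot tones: H_ls = Y / X
    H_ls = Y[pilots] / X[pilots]

    # Interpolate to all subcarriers (linear, real and imaginary parts separately).
    H_hat = np.interp(np.arange(N), pilots, H_ls.real) + \
            1j * np.interp(np.arange(N), pilots, H_ls.imag)
    print(float(np.mean(np.abs(H_hat - H) ** 2)))  # small mean-squared error
    ```

    Improved estimators typically replace the linear interpolation with DFT-based or MMSE-style smoothing that exploits the channel's limited delay spread.
    
    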

  19. Quenching of the Gamow-Teller matrix element in closed LS-shell-plus-one nuclei

    International Nuclear Information System (INIS)

    Towner, I.S.

    1989-06-01

    It is evident that nuclear Gamow-Teller matrix elements determined from β-decay and charge-exchange reactions are significantly quenched compared to simple shell-model estimates based on one-body operators and free-nucleon coupling constants. Here we discuss the theoretical origins of this quenching, giving examples from light nuclei near LS-closed shells, such as 16O and 40Ca. (Author) 12 refs., 2 tabs

  20. Novel Hybrid of LS-SVM and Kalman Filter for GPS/INS Integration

    Science.gov (United States)

    Xu, Zhenkai; Li, Yong; Rizos, Chris; Xu, Xiaosu

    Integration of Global Positioning System (GPS) and Inertial Navigation System (INS) technologies can overcome the drawbacks of the individual systems. One of the advantages is that the integrated solution can provide continuous navigation capability even during GPS outages. However, bridging GPS outages is still a challenge when Micro-Electro-Mechanical System (MEMS) inertial sensors are used. Methods currently being explored by the research community include applying vehicle motion constraints, optimal smoothers, and artificial intelligence (AI) techniques. In the AI research area, the neural network (NN) approach has been extensively utilised up to the present. In an NN-based integrated system, a Kalman filter (KF) estimates position, velocity and attitude errors, as well as the inertial sensor errors, to output navigation solutions while GPS signals are available. At the same time, an NN is trained to map the vehicle dynamics to the corresponding KF states, and to correct the INS measurements when GPS measurements are unavailable. To achieve good performance, it is critical to select samples of suitable quality, and an optimal number of them, for the NN. This is sometimes too rigorous a requirement, which limits the real-world application of NN-based methods. The support vector machine (SVM) approach is based on the structural risk minimisation principle, instead of the minimised empirical error principle that is commonly implemented in an NN. The SVM can avoid the local minimisation and over-fitting problems of an NN, and therefore can potentially achieve a higher level of global performance. This paper focuses on the least squares support vector machine (LS-SVM), which can solve highly nonlinear and noisy black-box modelling problems. This paper explores the application of the LS-SVM to aid the GPS/INS integrated system, especially during GPS outages. The paper describes the principles of the LS-SVM and of the KF hybrid method, and introduces the LS-SVM regression algorithm. 
Field
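A minimal numerical sketch of the LS-SVM regression outlined above, in which training reduces to solving a single linear system rather than a quadratic program. The RBF kernel and the `gamma` and `sigma` values are illustrative choices, not parameters from the paper:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian RBF kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=1000.0, sigma=0.2):
    # LS-SVM regression: solve the bordered linear system
    #   [ 0   1^T         ] [b]       [0]
    #   [ 1   K + I/gamma ] [alpha] = [y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, support values alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=0.2):
    # f(x) = sum_i alpha_i k(x, x_i) + b
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

Because every training point contributes a support value, the model is dense, but fitting costs only one linear solve, which is what makes LS-SVMs attractive for the black-box modelling role described above.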

  1. Development of the next generation code system as an engineering modeling language (6). Development of a cross section adjustment and nuclear design accuracy evaluation solver

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Numata, Kazuyuki

    2008-01-01

    A new cross section adjustment and nuclear design accuracy evaluation solver was developed as a set of modules for MARBLE (multi-purpose advanced reactor physics analysis system based on language of engineering). In order to enhance the system's extendibility and flexibility, object-oriented design and analysis techniques were adopted in the development. In the new system, it is easy to add a new design accuracy evaluation method because each numerical calculation module is independent of the other modules. Further, several new functions, such as searching and editing calculation data, are provided in the new solver. These functions can be easily customised by users because they are designed to work cooperatively with the Python scripting language, which is used as the user interface of the MARBLE system. In order to validate the new solver, a test calculation was performed for a realistic case of creating a new unified cross section library. In the test calculation, results calculated by the new solver agreed well with those of the conventional code system. In addition, it is possible to reuse existing input data files prepared for the conventional code system because the new solver's utilities support the conventional formats. Because the new solver implements all the main functions of the conventional code system, MARBLE can be used as a new calculation code system for cross section adjustment and nuclear design accuracy evaluation.

  2. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    Directory of Open Access Journals (Sweden)

    Fuqiang Sun

    2017-06-01

    Full Text Available Lamb waves have been reported to be an efficient tool for non-destructive evaluation (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanisms of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage-sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of the GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases with different loadings and manufacturers were also included to further verify the robustness of the proposed method for crack quantification.

  3. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    Science.gov (United States)

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluation (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanisms of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage-sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of the GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases with different loadings and manufacturers were also included to further verify the robustness of the proposed method for crack quantification.
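The GA-based parameter optimisation can be sketched generically: a real-coded genetic algorithm searches over the two LS-SVM hyperparameters (gamma, sigma). The quadratic surrogate below merely stands in for the cross-validation error that such a GA would minimise; every setting here is an illustrative assumption, not the paper's configuration:

```python
import random

def genetic_minimise(fitness, bounds, pop_size=30, generations=60,
                     mutation_rate=0.2, seed=1):
    # Real-coded GA: elitism, blend crossover, bounded Gaussian mutation.
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness)[: pop_size // 2]  # keep the best half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            w = rng.random()
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]  # blend crossover
            if rng.random() < mutation_rate:
                i = rng.randrange(dim)
                lo, hi = bounds[i]
                child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.05 * (hi - lo))))
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

# Stand-in fitness: pretend the CV error is minimised at gamma=10, sigma=0.5.
def surrogate_cv_error(p):
    gamma, sigma = p
    return (gamma - 10.0) ** 2 + (sigma - 0.5) ** 2

best = genetic_minimise(surrogate_cv_error, bounds=[(0.1, 100.0), (0.01, 2.0)])
```

In the paper's setting, `surrogate_cv_error` would be replaced by the LS-SVM cross-validation error evaluated at each candidate (gamma, sigma) pair.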

  4. Nanoscale characterization and local piezoelectric properties of lead-free KNN-LT-LS thin films

    Science.gov (United States)

    Abazari, M.; Choi, T.; Cheong, S.-W.; Safari, A.

    2010-01-01

    We report the observation of the domain structure and piezoelectric properties of pure and Mn-doped (K0.44,Na0.52,Li0.04)(Nb0.84,Ta0.1,Sb0.06)O3 (KNN-LT-LS) thin films on SrTiO3 substrates. Using piezoresponse force microscopy, it is revealed that the ferroelectric domain structure in these 500 nm thin films comprises primarily 180° domains. This was in accordance with the tetragonal structure of the films, confirmed by relative permittivity measurements and x-ray diffraction patterns. The effective piezoelectric coefficient (d33) of the films was calculated using piezoelectric displacement curves and shown to be ~53 pm V-1 for pure KNN-LT-LS thin films. This value is among the highest reported for an epitaxial lead-free thin film and shows great potential for KNN-LT-LS to serve as an alternative to PZT thin films in future applications.
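Taking the reported effective coefficient at face value, the converse piezoelectric displacement under a given drive voltage follows from the linear relation displacement = d33 × V (an illustrative back-of-the-envelope calculation; the 10 V drive amplitude is an assumption, not a value from the paper):

```python
# Converse piezoelectric effect: displacement = d33 * applied voltage.
d33 = 53e-12        # m/V, effective coefficient reported for pure KNN-LT-LS
voltage = 10.0      # V, illustrative drive amplitude (assumed)
displacement = d33 * voltage   # ~0.53 nm of film displacement
```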

  5. Nanoscale characterization and local piezoelectric properties of lead-free KNN-LT-LS thin films

    Energy Technology Data Exchange (ETDEWEB)

    Abazari, M; Safari, A [Glenn Howatt Electroceramics Laboratories, Department of Materials Science and Engineering, Rutgers-The state University of New Jersey, Piscataway, NJ 08854 (United States); Choi, T; Cheong, S-W [Rutgers Center for Emergent Materials, Department of Physics and Astronomy, Rutgers-The state University of New Jersey, Piscataway, NJ 08854 (United States)

    2010-01-20

    We report the observation of the domain structure and piezoelectric properties of pure and Mn-doped (K0.44,Na0.52,Li0.04)(Nb0.84,Ta0.1,Sb0.06)O3 (KNN-LT-LS) thin films on SrTiO3 substrates. Using piezoresponse force microscopy, it is revealed that the ferroelectric domain structure in these 500 nm thin films comprises primarily 180° domains. This was in accordance with the tetragonal structure of the films, confirmed by relative permittivity measurements and x-ray diffraction patterns. The effective piezoelectric coefficient (d33) of the films was calculated using piezoelectric displacement curves and shown to be ~53 pm V-1 for pure KNN-LT-LS thin films. This value is among the highest reported for an epitaxial lead-free thin film and shows great potential for KNN-LT-LS to serve as an alternative to PZT thin films in future applications.

  6. Simple Detection of Large InDeLS by DHPLC: The ACE Gene as a Model

    Directory of Open Access Journals (Sweden)

    Renata Guedes Koyama

    2008-01-01

    Full Text Available Insertion-deletion polymorphism (InDeL) is the second most frequent type of genetic variation in the human genome. For the detection of large InDeLs, researchers usually resort to either PCR gel analysis or RFLP, but these are time consuming and dependent on human interpretation. Therefore, a more efficient method for genotyping this kind of genetic variation is needed. In this report, we describe a method that can detect large InDeLs by DHPLC (denaturing high-performance liquid chromatography), using the angiotensin-converting enzyme (ACE) gene I/D polymorphism as a model. The InDeL targeted in this study is characterized by a 288 bp Alu element insertion (I). We used DHPLC under nondenaturing conditions to analyze the PCR product, with flow through the chromatographic column under two different gradients based on the differences between the D and I sequences. The analysis described is quick and easy, making this technique a suitable and efficient means for DHPLC users to screen InDeLs in genetic epidemiological studies.

  7. Launch of new e-learning course “Safety during LS1”

    CERN Multimedia

    HSE Unit

    2013-01-01

    After 3 years of activity, the LHC and the rest of the accelerator chain have been shut down for about 2 years (from February 2013 to December 2014) for maintenance and upgrade work, on the surface as well as underground.   CERN has developed a new e-learning course related to this Long Shutdown period (LS1) so as to provide all the collaborators working in the LS1 areas with accurate safety-oriented information. The objectives of this new course are to: Present LS1 and its context, Identify CERN facilities’ major risks, Identify the main risks associated with our co-activities, Explain how safety issues are being handled at CERN, Present all the basic safety measures to be respected, Present all emergency and rescue instructions. The course is available via the e-learning SIR application. It is compulsory for all newcomers at CERN, along with the “CERN Safety Introduction” training. It is also highly recommended that people who were already w...

  8. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  9. Quasi-cyclic unit memory convolutional codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Paaske, Erik; Ballan, Mark

    1990-01-01

    Unit memory convolutional codes with generator matrices composed of circulant submatrices are introduced. This structure facilitates the analysis of, and an efficient search for, good codes. Equivalences among such codes and some of the basic structural properties are discussed. In particular, catastrophic encoders and minimal encoders are characterized and dual codes treated. Further, various distance measures are discussed, and a number of good codes, some of which result from efficient computer search and some of which result from known block codes, are presented...
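The circulant-submatrix structure described above can be sketched as follows; the particular first rows and dimensions are arbitrary illustrations, not codes from the paper. The defining quasi-cyclic property is that cyclically shifting the message cyclically shifts each block of the codeword:

```python
import numpy as np

def circulant(first_row):
    # Each row is the first row cyclically shifted one more place right.
    n = len(first_row)
    return np.array([np.roll(first_row, i) for i in range(n)], dtype=int)

# A generator matrix assembled from circulant blocks over GF(2).
C1 = circulant([1, 0, 1, 1, 0])
C2 = circulant([1, 1, 0, 0, 0])
G = np.hstack([C1, C2])

def encode(message, G):
    # Binary encoding: codeword = message . G (mod 2).
    return (np.array(message) @ G) % 2

word = encode([1, 0, 0, 1, 0], G)
```

This block structure is what makes the code search efficient: one first row per block determines the whole submatrix.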

  10. The use of the computer code PE2D in the electrostatic modelling of an electron beam generator vacuum diode interface

    International Nuclear Information System (INIS)

    Biddlecombe, C.S.; Edwards, C.B.; Shaw, M.J.

    1981-10-01

    The computer code PE2D has been used to optimise the design of a compact, 500kV, low inductance vacuum diode interface assembly for SPRITE, a sophisticated electron beam pumped exciplex laser system under construction at RAL. Electrostatic modelling of various dielectric interfaces has been achieved in cylindrical symmetry under conditions not amenable to more traditional methods of electrostatic field plotting. (author)

  11. Investigation and Applications of In-Source Oxidation in Liquid Sampling-Atmospheric Pressure Afterglow Microplasma Ionization (LS-APAG) Source.

    Science.gov (United States)

    Xie, Xiaobo; Wang, Zhenpeng; Li, Yafeng; Zhan, Lingpeng; Nie, Zongxiu

    2017-06-01

    A liquid sampling-atmospheric pressure afterglow microplasma ionization (LS-APAG) source is presented for the first time, which combines both electrospray ionization (ESI) and atmospheric pressure afterglow microplasma ionization (APAG) techniques. This ion source is capable of analyzing compounds with diverse molecular weights and polarities. An unseparated mixture sample was detected as a proof of concept, giving complementary information (both polar and nonpolar species) with the two ionization modes. It should also be noted that the molecular mass can be quickly identified by ESI with clean and simple spectra, while the structure can be directly studied using APAG with in-source oxidation. The ionization/oxidation mechanism and applications of the LS-APAG source have been further explored in the analysis of nonpolar alkanes and unsaturated fatty acids/esters. A unique [M + O - 3H]+ ion was observed in the case of individual alkanes (C5-C19) and complex hydrocarbon mixtures under optimized conditions. Moreover, branched alkanes generated significant in-source fragments, which could be further applied to the discrimination of isomeric alkanes. The technique also facilitates facile determination of double bond positions in unsaturated fatty acids/esters, owing to diagnostic fragments (the acid/ester-containing aldehyde and acid oxidation products) generated by on-line ozonolysis in APAG mode. Finally, some examples of in situ APAG analysis by gas sampling and surface sampling are given as well.

  12. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  13. SASSYS LMFBR systems code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time

  14. Study on the performance of the Particle Identification Detectors at LHCb after the LHC First Long Shutdown (LS1)

    CERN Document Server

    Fontana, Marianna

    2016-01-01

    During the First Long Shutdown (LS1), the LHCb experiment introduced major modifications in the data-processing procedure and modified parts of the detector to deal with the increased energy and the increased heavy-hadron production cross-section. In this contribution we review the performance of the particle identification detectors at LHCb (the RICH, calorimeters, and muon system) after LS1.

  15. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrices (LDGM codes) for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
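A systematic LDGM encoder of the kind described, with generator G = [I | P] and a sparse parity part P, can be sketched as follows (the dimensions, density and seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

k, m = 8, 4  # message bits, parity bits (illustrative sizes)
# Sparse parity part P: few ones per column, hence "low-density generator".
P = (rng.random((k, m)) < 0.3).astype(int)
G = np.hstack([np.eye(k, dtype=int), P])  # systematic generator [I | P]

def ldgm_encode(message):
    # Systematic encoding over GF(2): the message appears verbatim,
    # followed by sparse parity checks.
    return (np.array(message) @ G) % 2

msg = [1, 0, 1, 1, 0, 0, 1, 0]
codeword = ldgm_encode(msg)
```

Because the generator is systematic and sparse, encoding is linear-time in the number of ones of P, which is what makes LDGM codes attractive despite their error floor (addressed above by concatenation).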

  16. High-energy emissions from the gamma-ray binary LS 5039

    Energy Technology Data Exchange (ETDEWEB)

    Takata, J.; Leung, Gene C. K.; Cheng, K. S. [Department of Physics, University of Hong Kong, Pokfulam Road (Hong Kong); Tam, P. H. T.; Kong, A. K. H. [Institute of Astronomy and Department of Physics, National Tsing Hua University, Hsinchu, Taiwan (China); Hui, C. Y., E-mail: takata@hku.hk, E-mail: gene930@connect.hku.hk, E-mail: hrspksc@hku.hk [Department of Astronomy and Space Science, Chungnam National University, Daejeon (Korea, Republic of)

    2014-07-20

    We study the mechanisms of multi-wavelength emission (X-ray, GeV, and TeV gamma-rays) from the gamma-ray binary LS 5039. This paper is composed of two parts. In the first part, we report the results of an observational analysis using 4 yr of data from the Fermi Large Area Telescope. Due to the improved instrument response function and the increased statistics, the observational uncertainties of the spectrum in the ∼100-300 MeV band and above 10 GeV are significantly reduced. The present data analysis suggests that the 0.1-100 GeV emission from LS 5039 contains three different components: (1) the first component contributes to the <1 GeV emission around superior conjunction, (2) the second component dominates in the 1-10 GeV energy band, and (3) the third component is compatible with the lower-energy tail of the TeV emission. In the second part, we develop an emission model to explain the properties of the phase-resolved emission in multi-wavelength observations. Assuming that LS 5039 contains a pulsar, we argue that emission from both the magnetospheric outer gap and the inverse-Compton scattering of the cold-relativistic pulsar wind contributes to the observed GeV emission. We assume that the pulsar is wrapped by two kinds of termination shock: Shock-I, due to the interaction between the pulsar wind and the stellar wind, and Shock-II, due to the effect of the orbital motion. We propose that the X-rays are produced by synchrotron radiation in the Shock-I region and the TeV gamma-rays are produced by inverse-Compton scattering in the Shock-II region.

  17. SURF: a subroutine code to draw the axonometric projection of a surface generated by a scalar function over a discretized plane domain using finite element computations

    International Nuclear Information System (INIS)

    Giuliani, Giovanni; Giuliani, Silvano.

    1980-01-01

    The FORTRAN IV subroutine SURF has been designed to help visualise the results of Finite Element computations. It draws the axonometric projection of a surface generated in 3-dimensional space by a scalar function over a discretized plane domain. The most important characteristic of the routine is that it removes hidden lines, which enables a clear view of the details of the generated surface.

  18. The Bacillus subtilis Conjugative Plasmid pLS20 Encodes Two Ribbon-Helix-Helix Type Auxiliary Relaxosome Proteins That Are Essential for Conjugation.

    Science.gov (United States)

    Miguel-Arribas, Andrés; Hao, Jian-An; Luque-Ortega, Juan R; Ramachandran, Gayetri; Val-Calvo, Jorge; Gago-Córdoba, César; González-Álvarez, Daniel; Abia, David; Alfonso, Carlos; Wu, Ling J; Meijer, Wilfried J J

    2017-01-01

    Bacterial conjugation is the process by which a conjugative element (CE) is transferred horizontally from a donor to a recipient cell via a connecting pore. One of the first steps in the conjugation process is the formation of a nucleoprotein complex at the origin of transfer (oriT), where one of the components of the nucleoprotein complex, the relaxase, introduces a site- and strand-specific nick to initiate the transfer of a single DNA strand into the recipient cell. In most cases, the nucleoprotein complex involves, besides the relaxase, one or more additional proteins, named auxiliary proteins, which are encoded by the CE and/or the host. The conjugative plasmid pLS20 replicates in the Gram-positive Firmicute bacterium Bacillus subtilis. We have recently identified the relaxase gene and the oriT of pLS20, which are separated by a region of almost 1 kb. Here we show that this region contains two auxiliary genes that we name aux1LS20 and aux2LS20, and which we show are essential for conjugation. Both Aux1LS20 and Aux2LS20 are predicted to contain a Ribbon-Helix-Helix DNA binding motif near their N-terminus. Analyses of the purified proteins show that Aux1LS20 and Aux2LS20 form tetramers and hexamers in solution, respectively, and that they both bind preferentially to oriTLS20, although with different characteristics and specificities. In silico analyses revealed that genes encoding homologs of Aux1LS20 and/or Aux2LS20 are located upstream of almost 400 relaxase genes of the RelLS20 family (MOBL) of relaxases. Thus, Aux1LS20 and Aux2LS20 of pLS20 constitute the founding members of the first two families of auxiliary proteins described for CEs of Gram-positive origin.

  19. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal being corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. From a transmission point of view, digital transmission has therefore been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
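A classic instance of the waveform-coding approach described above is μ-law companding, used in G.711 telephony: samples are compressed logarithmically before uniform quantisation, spending more quantisation levels on quiet passages. A sketch of the compand/expand pair (the quantiser itself is omitted):

```python
import math

MU = 255.0  # mu-law parameter used in the North American/Japanese G.711 variant

def compand(x):
    # Compress a sample in [-1, 1] logarithmically before quantisation.
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def expand(y):
    # Inverse mapping applied at the receiver after dequantisation.
    return math.copysign((math.pow(1.0 + MU, abs(y)) - 1.0) / MU, y)
```

In a real codec the companded value would be quantised to 8 bits between these two calls; here the pair simply illustrates the logarithmic waveform representation.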

  20. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  1. TASS code topical report. V.1 TASS code technical manual

    International Nuclear Information System (INIS)

    Sim, Suk K.; Chang, W. P.; Kim, K. D.; Kim, H. C.; Yoon, H. Y.

    1997-02-01

    TASS 1.0 code has been developed at KAERI for the initial and reload non-LOCA safety analysis of the operating PWRs as well as the PWRs under construction in Korea. The TASS code will replace the various vendors' non-LOCA safety analysis codes currently used for the Westinghouse and ABB-CE type PWRs in Korea. This can be achieved through TASS code input modifications specific to each reactor type. The TASS code can be run interactively through keyboard operation. The semimodular configuration used in developing the TASS code enables the user to easily implement new models. The TASS code has been programmed in FORTRAN77, which makes it easy to install and port to different computer environments. The TASS code can be utilized for steady state simulation as well as non-LOCA transient simulations such as power excursions, reactor coolant pump trips, load rejections, loss of feedwater, steam line breaks, steam generator tube ruptures, rod withdrawal and drop, and anticipated transients without scram (ATWS). The malfunctions of the control systems, components and operator actions, and the transients caused by these malfunctions, can be easily simulated using the TASS code. This technical report describes the TASS 1.0 code models, including the reactor thermal hydraulic, reactor core and control models. This TASS code technical manual has been prepared as part of the TASS code manual, which includes the TASS code user's manual and the TASS code validation report, and will be submitted to the regulatory body as a TASS code topical report for licensing non-LOCA safety analysis of the Westinghouse and ABB-CE type PWRs operating and under construction in Korea. (author). 42 refs., 29 tabs., 32 figs

  2. Joint source-channel coding using variable length codes

    NARCIS (Netherlands)

    Balakirsky, V.B.

    2001-01-01

    We address the problem of joint source-channel coding when variable-length codes are used for information transmission over a discrete memoryless channel. Data transmitted over the channel are interpreted as pairs (mk, tk), where mk is a message generated by the source and tk is a time instant
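The variable-length source codes in question can be illustrated with the standard Huffman construction; the symbol probabilities below are illustrative, and the channel side of the joint scheme is not reproduced:

```python
import heapq

def huffman_code(freqs):
    # Build a prefix-free variable-length code from symbol frequencies.
    # Heap entries: (total frequency, tiebreak counter, partial codebook).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Merge the two least frequent subtrees, prefixing 0/1.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
```

For these dyadic probabilities the codeword lengths (1, 2, 3, 3) meet the entropy bound exactly, and the Kraft sum equals one.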

  3. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    a computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other, is a mix of a computer programming syntax and human language. In this sense queer code can be understood as both an object and subject of study that intervenes in the world’s ‘becoming' and how material bodies are produced via human and nonhuman practices. Through mixing the natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  4. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, the governing equations and formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault and their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a User's manual, describing simulation procedures, input data preparation, output and example test cases.

  5. Applications of the computer codes FLUX2D and PHI3D for the electromagnetic analysis of compressed magnetic field generators and power flow channels

    International Nuclear Information System (INIS)

    Hodgdon, M.L.; Oona, H.; Martinez, A.R.; Salon, S.; Wendling, P.; Krahenbuhl, L.; Nicolas, A.; Nicolas, L.

    1990-01-01

    The authors present the results of three electromagnetic field problems for compressed magnetic field generators and their associated power flow channels. The first problem is the computation of the transient magnetic field in a two-dimensional model of a helical generator during loading. The second problem is the three-dimensional eddy current pattern in a section of an armature beneath a bifurcation point of a helical winding. The third problem is the calculation of the three-dimensional electrostatic fields in a region known as the post-hole convolute, in which a rod connects the inner and outer walls of a system of three concentric cylinders through a hole in the middle cylinder. While analytic solutions exist for many electromagnetic field problems in cases of special and ideal geometries, the solution of these and similar problems for the proper analysis and design of compressed magnetic field generators and their related hardware requires computer simulations

  6. Implementation of LT codes based on chaos

    International Nuclear Information System (INIS)

    Zhou Qian; Li Liang; Chen Zengqiang; Zhao Jiaxiang

    2008-01-01

    Fountain codes provide an efficient way to transfer information over erasure channels like the Internet. LT codes are the first codes fully realizing the digital fountain concept. They are asymptotically optimal rateless erasure codes with highly efficient encoding and decoding algorithms. In theory, for each encoding symbol of an LT code, its degree is randomly chosen according to a predetermined degree distribution, and the neighbours used to generate that encoding symbol are chosen uniformly at random. Practical implementations of LT codes usually realize the randomness through pseudo-random number generators such as the linear congruential method. This paper applies the pseudo-randomness of chaotic sequences in the implementation of LT codes. Two Kent chaotic maps are used to determine the degree and neighbour(s) of each encoding symbol. It is shown that the implemented LT codes based on chaos perform better than LT codes implemented with a traditional pseudo-random number generator. (general)
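The chaotic implementation described above can be sketched as follows: a Kent (skew tent) map drives both the degree draw and the neighbour selection for each LT encoding symbol. The map parameter, initial condition and degree distribution are illustrative assumptions, not the paper's values:

```python
def kent_map_stream(x0=0.3, m=0.7):
    # Kent (skew tent) map: a chaotic iteration on (0, 1) used as a
    # deterministic stand-in for a pseudo-random number generator.
    x = x0
    while True:
        x = x / m if x < m else (1.0 - x) / (1.0 - m)
        yield x

def lt_encode_symbol(source, chaos, degree_probs):
    # Draw a degree from the cumulative distribution, then draw that many
    # distinct neighbours; the encoding symbol is the XOR of the neighbours.
    u = next(chaos)
    degree, acc = len(degree_probs), 0.0
    for d, p in enumerate(degree_probs, start=1):
        acc += p
        if u < acc:
            degree = d
            break
    neighbours = set()
    while len(neighbours) < degree:
        neighbours.add(int(next(chaos) * len(source)) % len(source))
    value = 0
    for i in neighbours:
        value ^= source[i]
    return sorted(neighbours), value

source = [1, 0, 1, 1, 0, 0, 1, 0]          # k = 8 source bits
chaos = kent_map_stream()
packets = [lt_encode_symbol(source, chaos, [0.4, 0.3, 0.2, 0.1])
           for _ in range(12)]
```

A receiver sharing the map parameter and initial condition can regenerate the same degrees and neighbours, so the packet headers need not carry them; decoding then proceeds by the usual belief-propagation peeling.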

  7. Finite Element Simulation of Medium-Range Blast Loading Using LS-DYNA

    Directory of Open Access Journals (Sweden)

    Yuzhen Han

    2015-01-01

    Full Text Available This study investigated the Finite Element simulation of blast loading using LS-DYNA. The objective is to identify approaches that reduce the required computation effort while maintaining reasonable accuracy, focusing on the blast loading scheme, element size, and their relationship with the scale of the explosion. The study made use of the recently developed blast loading scheme in LS-DYNA, which removes the need to model the explosive in the numerical models but still retains the advantages of nonlinear fluid-structure interaction. It was found that the blast loading technique could significantly reduce the computation effort. It was also found that the initial density of air in the numerical model could be purposely increased to partially compensate for the error induced by the use of relatively large air elements. Using the numerical approach, free air blast above a scaled distance of 0.4 m/kg^(1/3) was properly simulated, and the fluid-structure interaction at the same location could be properly duplicated using a proper Arbitrary Lagrangian Eulerian (ALE) coupling scheme. The study also showed that the centrifuge technique, which has been successfully employed in model tests to investigate blast effects, may be used when simulating the effects of medium- to large-scale explosions at small scaled distances.
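The scaled-distance threshold quoted above follows Hopkinson-Cranz scaling, Z = R / W^(1/3): doubling the standoff while increasing the charge eightfold leaves Z unchanged. A quick illustration of the 0.4 m/kg^(1/3) validity limit:

```python
def scaled_distance(standoff_m, charge_kg):
    # Hopkinson-Cranz scaled distance Z = R / W^(1/3).
    return standoff_m / charge_kg ** (1.0 / 3.0)

z1 = scaled_distance(0.4, 1.0)  # 0.4 m from 1 kg: Z at the validity limit
z2 = scaled_distance(0.8, 8.0)  # 0.8 m from 8 kg: the same scaled distance
```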

  8. LS1 to LHC Report: LHC key handed back to Operations

    CERN Multimedia

    CERN Bulletin

    2015-01-01

    This week, after 23 months of hard work involving about 1000 people every day, the key to the LHC was symbolically handed back to the Operations team. The first long shutdown is over and the machine is getting ready for a restart that will bring its beam to full energy in early spring.   Katy Foraz, LS1 activities coordinator, symbolically hands the LHC key to the operations team, represented, left to right, by Jorg Wenninger, Mike Lamont and Mirko Pojer. All the departments, all the machines and all the experimental areas were involved in the first long shutdown of the LHC that began in February 2013. Over the last two years, the Bulletin has closely followed  all the work and achievements that had been carefully included in the complex general schedule drawn up and managed by the team led by Katy Foraz from the Engineering Department. “The work on the schedule began two years before the start of LS1 and one of the first things we realised was that there was no commercial...

  9. Signal Simulation and Experimental Research on Acoustic Emission using LS-DYNA

    Directory of Open Access Journals (Sweden)

    Zhang Jianchao

    2015-09-01

    Full Text Available To calculate sound wave velocity, we simulated the Hsu-Nielsen lead break experiment using the ANSYS/LS-DYNA finite element software. First, we identified the key problems in the finite element analysis, such as selecting the exciting force, choosing the grid density, and setting the calculation time steps. Second, we established the finite element model of sound wave transmission in a plate under the simulated lead break. The results revealed the transmission characteristics of the sound wave and allowed the transmission velocities of the longitudinal and transverse waves to be calculated from the time-history curves of the vibration velocity of the sound wave at various nodes. Finally, the Hsu-Nielsen lead break experiment was carried out. The results of the theoretical calculation and simulation analysis were consistent with the experimental results, demonstrating that using the ANSYS/LS-DYNA software to simulate sound wave transmission in acoustic emission experiments is feasible and effective.
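    The longitudinal and transverse velocities extracted from the simulated node time histories can be cross-checked against the classical bulk-wave speeds of an isotropic elastic solid. A minimal sketch (the steel constants are assumed illustrative values, not taken from the paper):

    ```python
    import math

    def wave_speeds(E: float, nu: float, rho: float) -> tuple:
        """Bulk longitudinal and transverse wave speeds (m/s) of an isotropic
        elastic solid, from Young's modulus E (Pa), Poisson's ratio nu and
        density rho (kg/m^3)."""
        c_l = math.sqrt(E * (1 - nu) / (rho * (1 + nu) * (1 - 2 * nu)))
        c_t = math.sqrt(E / (2 * rho * (1 + nu)))
        return c_l, c_t

    if __name__ == "__main__":
        # Illustrative constants for structural steel (assumed, not from the paper)
        c_l, c_t = wave_speeds(E=210e9, nu=0.30, rho=7850.0)
        print(f"longitudinal ~{c_l:.0f} m/s, transverse ~{c_t:.0f} m/s")
    ```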

  10. The successful completion of LS1, consolidation and preparations for the future

    CERN Multimedia

    Antonella Del Rosso

    2014-01-01

    For CERN’s Technology (TE) Department, success in LS1 is more important than finishing. In other words, the aim is to reach the finish line having maintained the highest standards of safety, quality and performance. Other challenges need to be faced too, before, during and after LS1, and the Department always approaches them with optimism. The new Department Head tells us how his 750 colleagues work to keep the Laboratory at the cutting edge of high-energy physics technology.   José Miguel Jiménez. “We can only grow once we’ve stabilised our base.” The message presented by José Miguel Jiménez, who stepped into the role of TE Department Head in January 2014, is clear, as are his priorities going forward: “The Technology Department needs to be consolidated in terms of both its personnel and its assembly and test infrastructures, some of which are unique.” The TE Department is tasked with pr...

  11. Development status of TUF code

    International Nuclear Information System (INIS)

    Liu, W.S.; Tahir, A.; Zaltsgendler

    1996-01-01

    An overview of the important developments in the TUF code in 1995 is presented. Developments in the following areas are presented: control of round-off error propagation, gas resolution and release models, and condensation-induced water hammer. This development was mainly driven by station requests for operational support and code improvement. (author)

  12. Running codes through the web

    International Nuclear Information System (INIS)

    Clark, R.E.H.

    2001-01-01

    Dr. Clark presented a report and demonstration of running atomic physics codes through the WWW. The atomic physics data are generated by Los Alamos National Laboratory (LANL) codes that calculate electron impact excitation, ionization, photoionization, and autoionization, and the inverse processes through detailed balance. Samples of the Web interfaces, input and output are given in the report

  13. Development of Automated Procedures to Generate Reference Building Models for ASHRAE Standard 90.1 and India’s Building Energy Code and Implementation in OpenStudio

    Energy Technology Data Exchange (ETDEWEB)

    Parker, Andrew [National Renewable Energy Lab. (NREL), Golden, CO (United States); Haves, Philip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jegi, Subhash [International Institute of Information Technology, Hyderabad (India); Garg, Vishal [International Institute of Information Technology, Hyderabad (India); Ravache, Baptiste [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-09-14

    This paper describes a software system for automatically generating a reference (baseline) building energy model from the proposed (as-designed) building energy model. This system is built using the OpenStudio Software Development Kit (SDK) and is designed to operate on building energy models in the OpenStudio file format.

  14. Generation of initial geometries for the simulation of the physical system in the DualSPHysics code; Generacion de geometrias iniciales para la simulacion del sistema fisico en el codigo DualSPHysics

    Energy Technology Data Exchange (ETDEWEB)

    Segura Q, E.

    2013-07-01

    Among the various research areas of the Instituto Nacional de Investigaciones Nucleares (ININ), one of great interest is the study and treatment of the collection and storage of radioactive waste. The ININ project on the simulation of pollutant diffusion through a porous soil medium (third stage) therefore requires, as a first step, generating the initial geometry of the physical system. The simulation uses the smoothed particle hydrodynamics (SPH) method, as implemented in the DualSPHysics code, which has great versatility and the ability to simulate any physical system in which hydrodynamic aspects are involved. To simulate a physical system with the DualSPHysics code, the initial geometry of the system of interest must first be defined and included in the input file of the code. The initial geometry is specified through regular geometric bodies positioned at different points in space, generated with a programming language (Fortran, C++, Java, etc.). This methodology will provide the basis for simulating more complex geometries and configurations in the future. (Author)
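    The abstract describes assembling the initial geometry from regular geometric bodies positioned in space, generated with a general-purpose language. A minimal Python sketch of that idea (the Box/fill_box names are hypothetical; the actual DualSPHysics input file format is not reproduced here):

    ```python
    from dataclasses import dataclass

    @dataclass
    class Box:
        """Axis-aligned box: one of the 'regular geometric bodies' from which
        an initial geometry can be assembled (names are illustrative)."""
        x0: float
        y0: float
        z0: float
        lx: float
        ly: float
        lz: float

    def fill_box(box: Box, dp: float):
        """Particle positions on a regular lattice of spacing dp inside the
        box: the kind of point set an SPH initial geometry is built from."""
        nx = int(round(box.lx / dp)) + 1
        ny = int(round(box.ly / dp)) + 1
        nz = int(round(box.lz / dp)) + 1
        return [(box.x0 + i * dp, box.y0 + j * dp, box.z0 + k * dp)
                for i in range(nx) for j in range(ny) for k in range(nz)]

    if __name__ == "__main__":
        particles = fill_box(Box(0.0, 0.0, 0.0, 1.0, 1.0, 0.5), dp=0.5)
        print(len(particles))  # 3 x 3 x 2 = 18 lattice points
    ```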

  15. The Bacillus subtilis Conjugative Plasmid pLS20 Encodes Two Ribbon-Helix-Helix Type Auxiliary Relaxosome Proteins That Are Essential for Conjugation

    Directory of Open Access Journals (Sweden)

    Andrés Miguel-Arribas

    2017-11-01

    Full Text Available Bacterial conjugation is the process by which a conjugative element (CE) is transferred horizontally from a donor to a recipient cell via a connecting pore. One of the first steps in the conjugation process is the formation of a nucleoprotein complex at the origin of transfer (oriT), where one of the components of the nucleoprotein complex, the relaxase, introduces a site- and strand-specific nick to initiate the transfer of a single DNA strand into the recipient cell. In most cases, the nucleoprotein complex involves, besides the relaxase, one or more additional proteins, named auxiliary proteins, which are encoded by the CE and/or the host. The conjugative plasmid pLS20 replicates in the Gram-positive Firmicute bacterium Bacillus subtilis. We have recently identified the relaxase gene and the oriT of pLS20, which are separated by a region of almost 1 kb. Here we show that this region contains two auxiliary genes that we name aux1LS20 and aux2LS20, and which we show are essential for conjugation. Both Aux1LS20 and Aux2LS20 are predicted to contain a Ribbon-Helix-Helix DNA binding motif near their N-terminus. Analyses of the purified proteins show that Aux1LS20 and Aux2LS20 form tetramers and hexamers in solution, respectively, and that they both bind preferentially to oriTLS20, although with different characteristics and specificities. In silico analyses revealed that genes encoding homologs of Aux1LS20 and/or Aux2LS20 are located upstream of almost 400 relaxase genes of the RelLS20 family (MOBL) of relaxases. Thus, Aux1LS20 and Aux2LS20 of pLS20 constitute the founding members of the first two families of auxiliary proteins described for CEs of Gram-positive origin.

  16. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was launched in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator of the research and development environment Digitalisering i Skolen (DiS), from Institut for Skole og Læring at Professionshøjskolen Metropol; and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design, design thinking and design pedagogy, from Forskningslab: It og Læringsdesign (ILD-LAB) at the Department of Communication and Psychology, Aalborg University in Copenhagen. We followed and carried out the evaluation and documentation of the Coding Class project in the period November 2016 to May 2017...

  17. Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  18. ANIMAL code

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1979-01-01

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and presents its physical model. The temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm, and the functions of the algorithm's FORTRAN subroutines and variables are outlined

  19. Network Coding

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 15; Issue 7. Network Coding. K V Rashmi Nihar B Shah P Vijay Kumar. General Article Volume 15 Issue 7 July 2010 pp 604-621. Permanent link: https://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621 ...

  20. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendent of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids

  1. Expander Codes

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 10; Issue 1. Expander Codes - The Sipser–Spielman Construction. Priti Shankar. General Article Volume 10 ... Author Affiliations. Priti Shankar. Department of Computer Science and Automation, Indian Institute of Science Bangalore 560 012, India.

  2. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  3. The Ne3LS Network, Quebec's initiative to evaluate the impact and promote a responsible and sustainable development of nanotechnology

    International Nuclear Information System (INIS)

    Endo, Charles-Anica; Emond, Claude; Battista, Renaldo; Parizeau, Marie-Helene; Beaudry, Catherine

    2011-01-01

    The spectacular progress made by nanosciences and nanotechnologies elicits as much hope as fear. Consequently, a great number of research and training initiatives on the ethical, environmental, economic, legal and social issues of nanotechnology development (Ne3LS) are emerging worldwide. In Quebec, Canada, a Task Force was mandated by NanoQuebec to conceive a Ne3LS research and training strategy to assess those issues. This Task Force brought together experts from universities, governments and industry working in nanosciences and nanotechnologies or in Ne3LS. Their resulting action plan, made public in November 2006, contained several recommendations, including the creation of a knowledge network (the Ne3LS Network). In the following years, after consultation with numerous key players concerned with the possible impacts of nanosciences and nanotechnologies in Quebec, the Ne3LS Network was launched in January 2010 in partnership with the Fonds quebecois de la recherche sur la nature et les technologies, the Fonds quebecois de la recherche sur la societe et la culture, the Fonds de la recherche en sante du Quebec, NanoQuebec, the Institut de recherche Robert-Sauve en sante et en securite du travail as well as the University of Montreal. Its objectives are to 1) Foster the development of Ne3LS research activities (grants and fellowships); 2) Spearhead the Canadian and international Ne3LS network; 3) Take part in the training of researchers and experts; 4) Encourage the creation of interactive tools for the general public; 5) Facilitate collaboration between decision-makers and experts; 6) Involve the scientific community through a host of activities (symposia, conferences, thematic events); 7) Build multidisciplinary research teams to evaluate the impact of nanotechnology.

  4. Experimental study of the effect of Nd:YAG laser on dental hard tissues: comparison between multi-pulse and free-generation emission code

    International Nuclear Information System (INIS)

    Carballosa Amor, A.; Tellez Jimenez, H.; Ponce Flores, E.; Flores Reyes, T.

    2016-01-01

    The aim of this study is to compare and contrast the morphological changes in dental hard tissue when irradiated with a Nd:YAG laser in multi-pulse mode, with a passive Cr:YAG Q-switch, and in free-generation mode. The experimental sample consisted of 6 healthy third molars, which were divided equally and randomly between the two emission modes. The depth of each perforation was measured by optical coherence tomography (OCT). It was noted that, despite delivering less energy, the first three shots in multi-pulse mode achieved deeper cavities than those in free-generation mode. Less damage to the surrounding tissue was also observed in multi-pulse mode. (Author)

  5. Code Modernization of VPIC

    Science.gov (United States)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

    The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite breakthroughs in the areas of mini-app development, portable performance, and cache-oblivious algorithms the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorization with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by Los Alamos National Security, LLC, Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  6. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  7. Aztheca Code; Codigo Aztheca

    Energy Technology Data Exchange (ETDEWEB)

    Quezada G, S.; Espinosa P, G. [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Ciudad de Mexico (Mexico); Centeno P, J.; Sanchez M, H., E-mail: sequga@gmail.com [UNAM, Facultad de Ingenieria, Ciudad Universitaria, Circuito Exterior s/n, 04510 Ciudad de Mexico (Mexico)

    2017-09-15

    This paper presents the Aztheca code, which comprises mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  8. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)
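    The kind of computation a two-group, one-dimensional slab diffusion code performs can be illustrated with a small finite-difference k-eigenvalue sketch (this is not PANDA's algorithm or data; the cross-sections, slab size and zero-flux boundary conditions are illustrative assumptions):

    ```python
    import numpy as np

    def two_group_keff(L=200.0, N=200):
        """k-effective of a bare homogeneous slab: two-group diffusion with
        downscatter only, zero-flux boundaries, finite differences and power
        iteration. All constants are illustrative, not PANDA data."""
        D1, D2 = 1.4, 0.4            # diffusion coefficients (cm)
        sa1, sa2 = 0.010, 0.080      # absorption cross-sections (1/cm)
        s12 = 0.016                  # fast -> thermal scattering (1/cm)
        nf1, nf2 = 0.007, 0.140      # nu * fission cross-sections (1/cm)
        h = L / N

        def diffusion_matrix(D, removal):
            main = np.full(N - 1, 2.0 * D / h**2 + removal)
            off = np.full(N - 2, -D / h**2)
            return np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

        A1 = diffusion_matrix(D1, sa1 + s12)   # fast group
        A2 = diffusion_matrix(D2, sa2)         # thermal group

        phi1 = np.ones(N - 1)
        phi2 = np.ones(N - 1)
        k = 1.0
        for _ in range(500):                   # power iteration on fission source
            src = nf1 * phi1 + nf2 * phi2
            phi1 = np.linalg.solve(A1, src / k)
            phi2 = np.linalg.solve(A2, s12 * phi1)
            k_new = k * (nf1 * phi1 + nf2 * phi2).sum() / src.sum()
            if abs(k_new - k) < 1e-8:
                return k_new
            k = k_new
        return k

    if __name__ == "__main__":
        print(f"k_eff = {two_group_keff():.4f}")
    ```

    The same structure (leakage-plus-removal matrices per group, coupled through downscatter and fission, wrapped in a power iteration) underlies production codes of this class; PANDA adds the nonlinear xenon, enthalpy and Doppler feedbacks and the search facility on top.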

  9. Error-prone PCR mutation of Ls-EPSPS gene from Liriope spicata conferring to its enhanced glyphosate-resistance.

    Science.gov (United States)

    Mao, Chanjuan; Xie, Hongjie; Chen, Shiguo; Valverde, Bernal E; Qiang, Sheng

    2017-09-01

    Liriope spicata (Thunb.) Lour has a unique LsEPSPS structure contributing to the highest-ever-recognized natural glyphosate tolerance. The transformed LsEPSPS confers increased glyphosate resistance to E. coli and A. thaliana. However, the increased glyphosate-resistance level is not high enough to be of commercial value. Therefore, LsEPSPS was subjected to error-prone PCR to screen for mutant EPSPS genes capable of endowing higher resistance levels. A mutant designated ELs-EPSPS, having five mutated amino acids (37Val, 67Asn, 277Ser, 351Gly and 422Gly), was selected for its ability to confer improved resistance to glyphosate. Expression of ELs-EPSPS in recombinant E. coli BL21 (DE3) strains enhanced resistance to glyphosate in comparison to both the LsEPSPS-transformed and untransformed controls. Furthermore, transgenic ELs-EPSPS A. thaliana was about 5.4-fold and 2-fold more resistant to glyphosate than the wild-type and the Ls-EPSPS-transgenic plants, respectively. Therefore, the mutated ELs-EPSPS gene has potential value for the development of glyphosate-resistant crops. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Inflorescence Development and the Role of LsFT in Regulating Bolting in Lettuce (Lactuca sativa L.)

    Science.gov (United States)

    Chen, Zijing; Han, Yingyan; Ning, Kang; Ding, Yunyu; Zhao, Wensheng; Yan, Shuangshuang; Luo, Chen; Jiang, Xiaotang; Ge, Danfeng; Liu, Renyi; Wang, Qian; Zhang, Xiaolan

    2018-01-01

    Lettuce (Lactuca sativa L.) is one of the most important leafy vegetables and is consumed during its vegetative growth. The transition from vegetative to reproductive growth is induced by high temperature and has significant economic effects on lettuce production. However, the progression of floral transition and the molecular regulation of bolting are largely unknown. Here we morphologically characterized inflorescence development and functionally analyzed the FLOWERING LOCUS T (LsFT) gene during bolting regulation in lettuce. We described the eight developmental stages of the floral transition process. The expression of LsFT was negatively correlated with bolting in different lettuce varieties and was promoted by heat treatment. Overexpression of LsFT could rescue the late-flowering phenotype of the ft-2 mutant. Knockdown of LsFT by RNA interference dramatically delayed bolting in lettuce and abolished the response to high temperature. Therefore, this study dissects the process of inflorescence development and characterizes the role of LsFT in bolting regulation in lettuce. PMID:29403510

  11. Inflorescence Development and the Role of LsFT in Regulating Bolting in Lettuce (Lactuca sativa L.).

    Science.gov (United States)

    Chen, Zijing; Han, Yingyan; Ning, Kang; Ding, Yunyu; Zhao, Wensheng; Yan, Shuangshuang; Luo, Chen; Jiang, Xiaotang; Ge, Danfeng; Liu, Renyi; Wang, Qian; Zhang, Xiaolan

    2017-01-01

    Lettuce (Lactuca sativa L.) is one of the most important leafy vegetables and is consumed during its vegetative growth. The transition from vegetative to reproductive growth is induced by high temperature and has significant economic effects on lettuce production. However, the progression of floral transition and the molecular regulation of bolting are largely unknown. Here we morphologically characterized inflorescence development and functionally analyzed the FLOWERING LOCUS T (LsFT) gene during bolting regulation in lettuce. We described the eight developmental stages of the floral transition process. The expression of LsFT was negatively correlated with bolting in different lettuce varieties and was promoted by heat treatment. Overexpression of LsFT could rescue the late-flowering phenotype of the ft-2 mutant. Knockdown of LsFT by RNA interference dramatically delayed bolting in lettuce and abolished the response to high temperature. Therefore, this study dissects the process of inflorescence development and characterizes the role of LsFT in bolting regulation in lettuce.

  12. CANAL code

    International Nuclear Information System (INIS)

    Gara, P.; Martin, E.

    1983-01-01

    The CANAL code presented here optimizes a realistic iron-free extraction channel which has to provide a given transversal magnetic field law in the median plane: the current bars may be curved, have finite lengths and cooling ducts, and move in a restricted transversal area; terminal connectors may be added, and images of the bars in pole pieces may be included. A special option optimizes a real set of circular coils [fr

  13. Data analysis and analytical predictions of a steam generator tube bundle flow field for verification of 2-D T/H computer code

    International Nuclear Information System (INIS)

    Hwang, J.Y.; Reid, H.C.; Berringer, R.

    1981-01-01

    Analytical predictions of the flow field within a 60 deg segment flow model of a proposed sodium-heated steam generator are compared to experimental results obtained at several axial levels between baffles. The axial/crossflow field is developed by the use of alternating multi-ported baffling, accomplished through the radial distribution of perforations. Radial and axial porous-model predictions from an axisymmetric computational analysis are compared to intra-pitch experimental data at the mid-baffle-span location for various levels. The analytical model uses a cylindrical, axisymmetric, finite-difference formulation that solves the mass and momentum conservation equations. 6 refs

  14. Report of generation of the nuclear bank Presto-Warm (T=373 K) for the SVEA-96 fuel with the FMS codes

    International Nuclear Information System (INIS)

    Alonso V, G.

    1992-03-01

    This report describes, in general terms, how the Presto Warm database (TF=TM=373 K) for the SVEA-96 fuel of Laguna Verde was generated. The bank was formed with the ECLIPSE 86-2D, RECORD 89-1A and POLGEN 88-1B codes of the FMS package installed on the VAX system at the offices of the National Commission of Nuclear Safety and Safeguards in Mexico City, following procedure 6F3/I/CN029/90/P1. The resulting bank is designated L1PG9109. The generated database contains information on the 10 nuclear parameters required by Presto, with and without the control rod effect, for the different fuel rod arrangements present in the assembly, all included in what is known as the Super option of the Presto bank. (Author)

  15. Report of generation of the nuclear bank Presto-Warm (T=560 K) for the SVEA-96 fuel with the FMS codes

    International Nuclear Information System (INIS)

    Alonso V, G.

    1992-03-01

    This report describes, in general terms, how the Presto Warm database (TF=TM=560 K) for the SVEA-96 fuel of Laguna Verde was generated. The bank was formed with the ECLIPSE 86-2D, RECORD 89-1A and POLGEN 88-1B codes of the FMS package installed on the VAX system at the offices of the National Commission of Nuclear Safety and Safeguards in Mexico City, following procedure 6F3/I/CN029/90/P1. The resulting bank is designated L1PG9109. The generated database contains information on the 10 nuclear parameters required by Presto, with and without the control rod effect, for the different fuel rod arrangements present in the assembly, all included in what is known as the Super option of the Presto bank. (Author)

  16. Report of generation of the nuclear bank Presto-Cold (T=293 K) for the SVEA-96 fuel with the FMS codes

    International Nuclear Information System (INIS)

    Alonso V, G.

    1992-03-01

    This report describes, in general terms, how the Presto Cold database (TF=TM=293 K) for the SVEA-96 fuel of Laguna Verde was generated. The bank was formed with the ECLIPSE 86-2D, RECORD 89-1A and POLGEN 88-1B codes of the FMS package installed on the VAX system at the offices of the National Commission of Nuclear Safety and Safeguards in Mexico City, following procedure 6F3/I/CN029/90/P1. The resulting bank is designated L1PG9109. The generated database contains information on the 10 nuclear parameters required by Presto, with and without the control rod effect, for the different fuel rod arrangements present in the assembly, all included in what is known as the Super option of the Presto bank. (Author)

  17. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  18. Vacuum Acceptance Tests for the UHV Room Temperature Vacuum System of the LHC during LS1

    CERN Document Server

    Cattenoz, G; Bregliozzi, G; Calegari, D; Gallagher, J; Marraffa, A; Chiggiato, P

    2014-01-01

    During the CERN Large Hadron Collider (LHC) first long shutdown (LS1), a large number of vacuum tests are carried out on consolidated or newly fabricated devices. In this way, vacuum compatibility is assessed before installation in the UHV system of the LHC. According to the equipment’s nature, the vacuum acceptance tests consist of functional checks, leak tests, outgassing rate measurements, evaluation of contaminants by Residual Gas Analysis (RGA), pumping speed measurements and qualification of the H2 sticking probability of Non-Evaporable-Getter (NEG) coating. In this paper, the methods used for the tests and the acceptance criteria are described. A summary of the measured vacuum characteristics of the tested components is also given.

  19. Fuzzy Pruning Based LS-SVM Modeling Development for a Fermentation Process

    Directory of Open Access Journals (Sweden)

    Weili Xiong

    2014-01-01

    Full Text Available Due to the complexity and uncertainty of microbial fermentation processes, data coming from the plants often contain outliers. However, these data may be treated as normal support vectors, which always deteriorates the performance of soft sensor modeling. Since the outliers also contaminate the correlation structure of the least squares support vector machine (LS-SVM), a fuzzy pruning method is provided to deal with the problem. Furthermore, by assigning different fuzzy membership scores to the data samples, the sensitivity of the model to outliers can be reduced greatly. The effectiveness and efficiency of the proposed approach are demonstrated through two numerical examples as well as a simulated case of a penicillin fermentation process.
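
    The weighting scheme described above, in which fuzzy membership scores reduce the influence of outliers, corresponds to a weighted LS-SVM where each sample's regularization term is scaled by its membership. The following numpy sketch illustrates the idea; the RBF kernel, gamma, sigma, and the membership assignment are illustrative assumptions, not the paper's choices:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the row vectors of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_weighted_lssvm(X, y, v, gamma=10.0, sigma=1.0):
    """Weighted LS-SVM regression: a fuzzy membership v_i in (0, 1]
    down-weights a suspected outlier by inflating its slack penalty."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * v))  # membership-scaled regularization
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)               # KKT system of the LS-SVM dual
    b, alpha = sol[0], sol[1:]
    return lambda Xq: rbf_kernel(Xq, X, sigma) @ alpha + b

# synthetic 1-D example with one gross outlier at index 10
rng = np.random.default_rng(0)
X = np.linspace(0, 4, 40)[:, None]
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
y[10] += 5.0                         # the outlier
v = np.ones(40)
v[10] = 1e-3                         # near-zero fuzzy membership for the outlier
f = fit_weighted_lssvm(X, y, v)
print(abs(f(X)[10] - np.sin(X[10, 0])))  # small: the outlier barely pulls the fit
```

With all memberships equal to one, this reduces to the standard LS-SVM; the fuzzy scores only rescale the diagonal of the KKT system.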

  20. Collimated scanning LS-INAA for testing trace elements homogeneity in Brazilian coffee beans

    International Nuclear Information System (INIS)

    Tagliaferro, F.S.; Nadai Fernandes de, E.A.; Bode, P.; Baas, H.W.

    2008-01-01

    The degree of homogeneity is normally assessed from the variability of the results of independent analyses of several (e.g., 15) normal-scale replicates. Large sample instrumental neutron activation analysis (LS-INAA) with a collimated Ge detector allows inspecting the degree of homogeneity of the initial batch material using a kilogram-size sample. The test is based on the spatial distribution of induced radioactivity. This test was applied to samples of Brazilian whole (green) coffee beans (Coffea arabica and Coffea canephora) of approximately 1 kg in the framework of the development of a coffee reference material. Results indicated that the material does not contain significant element composition inhomogeneities between batches of approximately 30-50 g, masses typically forming the starting base of a reference material. (author)

  1. MIMO to LS-MIMO: A road to realization of 5G

    Science.gov (United States)

    Koppati, Naveena; Pavani, K.; Sharma, Dinesh; Sharma, Purnima K.

    2017-07-01

    MIMO stands for multiple-input multiple-output. As the name suggests, MIMO is an RF technology used in many recent systems to increase link capacity and spectral efficiency; it is employed in Wi-Fi, LTE, 4G, 5G and other wireless technologies. This paper describes the early history of MIMO-OFDM, the development of antenna beamforming in MIMO, and the types of MIMO, and it also reviews several decoding algorithms. MIMO combined with OFDM increases the channel capacity, but the main problem lies in estimating the transmitted signal from the received signal, so the channel must be known when estimating the channel capacity. The advancement of MIMO-OFDM into massive MIMO, which provides additional data capacity in environments of increased traffic, is described. Finally, various application scenarios of LS-MIMO that increase capacity are discussed.
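
    A common baseline for the channel-knowledge problem noted above is least-squares (LS) channel estimation from known pilot symbols, H_ls = Y P^H (P P^H)^-1. The following numpy sketch illustrates the idea on a synthetic flat-fading link; the antenna counts, pilot length, QPSK pilots, and noise level are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
nt, nr, npil = 4, 4, 64                 # tx antennas, rx antennas, pilot length

# unknown flat-fading channel (Rayleigh entries)
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)

# known unit-power QPSK pilot matrix, one row per transmit antenna
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
P = qpsk[rng.integers(0, 4, (nt, npil))]

# received pilots with additive noise
noise = 0.05 * (rng.standard_normal((nr, npil)) + 1j * rng.standard_normal((nr, npil)))
Y = H @ P + noise

# least-squares channel estimate: H_ls = Y P^H (P P^H)^-1
H_ls = Y @ P.conj().T @ np.linalg.inv(P @ P.conj().T)
print(np.linalg.norm(H_ls - H) / np.linalg.norm(H))  # small relative error
```

The LS estimate needs no knowledge of the channel statistics; MMSE estimators trade that simplicity for lower error at low SNR.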

  2. Radioiodinated VEGF to image tumor angiogenesis in a LS180 tumor xenograft model

    International Nuclear Information System (INIS)

    Yoshimoto, Mitsuyoshi; Kinuya, Seigo; Kawashima, Atsuhiro; Nishii, Ryuichi; Yokoyama, Kunihiko; Kawai, Keiichi

    2006-01-01

    Introduction: Angiogenesis is essential for tumor growth or metastasis. A method involving noninvasive detection of angiogenic activity in vivo would provide diagnostic information regarding antiangiogenic therapy targeting vascular endothelial cells as well as important insight into the role of vascular endothelial growth factor (VEGF) and its receptor (flt-1 and KDR) system in tumor biology. We evaluated radioiodinated VEGF121, which displays high binding affinity for KDR, and VEGF165, which possesses high binding affinity for flt-1 and low affinity for KDR, as angiogenesis imaging agents using the LS180 tumor xenograft model. Methods: VEGF121 and VEGF165 were labeled with 125I by the chloramine-T method. Biodistribution was observed in an LS180 human colon cancer xenograft model. Additionally, autoradiographic imaging and immunohistochemical staining of tumors were performed with 125I-VEGF121. Results: 125I-VEGF121 and 125I-VEGF165 exhibited strong, continuous uptake by tumors and the uterus, an organ characterized by angiogenesis. 125I-VEGF121 uptake in tumors was twofold higher than that of 125I-VEGF165 (9.12±98 and 4.79±1.08 %ID/g at 2 h, respectively). 125I-VEGF121 displayed higher tumor to nontumor (T/N) ratios in most normal organs in comparison with 125I-VEGF165. 125I-VEGF121 accumulation in tumors decreased with increasing tumor volume. Autoradiographic and immunohistochemical analyses confirmed that the difference in 125I-VEGF121 tumor accumulation correlated with the degree of tumor vascularity. Conclusion: Radioiodinated VEGF121 is a promising tracer for noninvasive delineation of angiogenesis in vivo.

  3. Trimodal distribution of ozone and water vapor in the UT/LS during boreal summer

    Science.gov (United States)

    Dunkerton, T. J.

    2004-12-01

    The relation of ozone and water vapor in the upper troposphere and lower stratosphere (UT/LS) is strongly influenced by the off-equatorial Asian and North American monsoons in boreal summer. Both regions experience hydration, presumably as a result of deep convection. This behavior contrasts sharply with the apparent dehydrating influence of near-equatorial deep convection in boreal winter. There is also a striking difference in ozone between Asia and North America in boreal summer. Over Asia, ozone concentrations are low, evidently a result of ubiquitous deep convection and the vertical transport of ozone-poor air, while over North America, ozone concentrations are much higher. Since deep convection also occurs in the North American monsoon, it appears that the difference in ozone concentration between Asia and North America in boreal summer reflects a differing influence of the large-scale circulation in the two regions: specifically, (i) isolation of the Tibetan anticyclone versus (ii) the intrusion of filaments of ozone-rich air from the stratosphere over North America. During boreal summer, as in winter, concentrations of ozone and water vapor are low near the equator. The result of these geographical variations is a trimodal distribution of ozone and water-vapor correlation. Our talk reviews the observational evidence of this trimodal distribution and possible dynamical and microphysical causes, focusing primarily on the quality and possible sampling bias of satellite and aircraft measurements. A key issue is the ability of HALOE to sample areas of ubiquitous deep convection. Other issues include the vertical structure of tracer anomalies, isentropic stirring in the UT/LS, horizontal transport of biomass burning products lofted by deep convection, and connections to the moist phase of the tropical 'tape recorder' signal in water vapor.

  4. Factors associated with falling in early, treated Parkinson's disease: The NET-PD LS1 cohort.

    Science.gov (United States)

    Chou, Kelvin L; Elm, Jordan J; Wielinski, Catherine L; Simon, David K; Aminoff, Michael J; Christine, Chadwick W; Liang, Grace S; Hauser, Robert A; Sudarsky, Lewis; Umeh, Chizoba C; Voss, Tiffini; Juncos, Jorge; Fang, John Y; Boyd, James T; Bodis-Wollner, Ivan; Mari, Zoltan; Morgan, John C; Wills, Anne-Marie; Lee, Stephen L; Parashos, Sotirios A

    2017-06-15

    Recognizing the factors associated with falling in Parkinson's disease (PD) would improve identification of at-risk individuals. To examine the frequency of falling and baseline characteristics associated with falling in PD using the National Institute of Neurological Disorders and Stroke (NINDS) Exploratory Trials in PD Long-term Study-1 (NET-PD LS-1) dataset. The LS-1 database included 1741 early treated PD subjects (median 4-year follow-up). Baseline characteristics were tested for a univariate association with post-baseline falling during the trial. Significant variables were included in a multivariable logistic regression model. A separate analysis using a negative binomial model investigated baseline factors on fall rate. 728 subjects (42%) fell during the trial, including at baseline. A baseline history of falls was the factor most associated with post-baseline falling. Men had lower odds of post-baseline falling compared to women, but for men, the probability of a post-baseline fall increased with age such that after age 70, men and women had similar odds of falling. Other baseline factors associated with a post-baseline fall and increased fall rate included the Unified PD Rating Scale (UPDRS) Activities of Daily Living (ADL) score, total functional capacity (TFC), baseline ambulatory capacity score and dopamine agonist monotherapy. Falls are common in early treated PD. The biggest risk factor for falls in PD remains a history of falling. Measures of functional ability (UPDRS ADL, TFC) and ambulatory capacity are novel clinical risk factors needing further study. A significant age by sex interaction may help to explain why age has been an inconsistent risk factor for falls in PD. Copyright © 2017 Elsevier B.V. All rights reserved.
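
    The age-by-sex interaction described above can be modeled by adding an explicit interaction column to the design matrix of a logistic regression. A minimal numpy sketch on synthetic data follows; the cohort and all coefficients are invented for illustration and are not the NET-PD LS-1 estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# hypothetical cohort: age in years and a sex indicator (1 = male)
age = rng.uniform(50, 80, n)
male = rng.integers(0, 2, n).astype(float)

# invented ground truth mimicking the reported pattern: men have lower
# odds of falling overall, but their risk rises faster with age
lin = -1.0 + 0.03 * (age - 65) - 0.8 * male + 0.06 * (age - 65) * male
fell = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(float)

# design matrix with an explicit age-by-sex interaction column
X = np.column_stack([np.ones(n), age - 65, male, (age - 65) * male])

# logistic regression fitted by Newton-Raphson (IRLS)
beta = np.zeros(4)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    w = p * (1.0 - p)
    beta += np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (fell - p))

print(beta.round(2))  # roughly recovers the generating coefficients
```

A negative male main effect together with a positive interaction reproduces the crossover: male and female odds of falling converge around the older ages.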

  5. All you need is shape: Predicting shear banding in sand with LS-DEM

    Science.gov (United States)

    Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.

    2018-02-01

    This paper presents discrete element method (DEM) simulations with experimental comparisons at multiple length scales, underscoring the crucial role of particle shape. The simulations build on technological advances in the DEM furnished by level sets (LS-DEM), which enable the mathematical representation of the surface of arbitrarily shaped particles such as grains of sand. We show that this ability to model shape enables unprecedented capture of the mechanics of granular materials across scales ranging from macroscopic behavior to local behavior to particle behavior. Specifically, the model is able to predict the onset and evolution of shear banding in sands, replicating the most advanced high-fidelity experiments in triaxial compression equipped with sequential X-ray tomography imaging. We present comparisons of the model and experiment at an unprecedented level of quantitative agreement, building a one-to-one model where every particle in the more-than-53,000-particle array has its own avatar or numerical twin. Furthermore, the boundary conditions of the experiment are faithfully captured by modeling the membrane effect as well as the platen displacement and tilting. The results show a computational tool that can give insight into the physics and mechanics of granular materials undergoing shear deformation and failure, with computational times comparable to those of the experiment. One quantitative measure extracted from the LS-DEM simulations that is currently not available experimentally is the evolution of three-dimensional force chains inside and outside of the shear band. We show that the rotations of the force chains are correlated with the rotations of the stress principal directions.
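
    The core idea of LS-DEM, representing a grain by a discretized level-set (signed-distance) function and querying it by interpolation for contact detection, can be sketched as follows. The grid resolution, the circular grain, and the bilinear interpolation scheme are simplifying assumptions for a 2-D toy case, not the paper's implementation:

```python
import numpy as np

# LS-DEM represents each grain by a discretized signed-distance (level-set)
# grid; a point is inside the grain where the interpolated value is negative.
n, L = 65, 2.0                               # grid nodes per axis, domain half-width
xs = np.linspace(-L, L, n)
X, Y = np.meshgrid(xs, xs, indexing='ij')
phi = np.sqrt(X**2 + Y**2) - 1.0             # exact signed distance to a unit disc

def interp_phi(p):
    """Bilinear interpolation of the level-set grid at point p = (x, y)."""
    h = xs[1] - xs[0]
    i = int(np.clip((p[0] + L) / h, 0, n - 2))
    j = int(np.clip((p[1] + L) / h, 0, n - 2))
    tx = (p[0] - xs[i]) / h
    ty = (p[1] - xs[j]) / h
    return ((1 - tx) * (1 - ty) * phi[i, j] + tx * (1 - ty) * phi[i + 1, j]
            + (1 - tx) * ty * phi[i, j + 1] + tx * ty * phi[i + 1, j + 1])

# penetration test for surface nodes of a neighbouring grain
print(interp_phi((0.5, 0.0)) < 0)   # True: point lies inside the grain
print(interp_phi((1.5, 0.0)) < 0)   # False: point lies outside
```

In the full method each particle carries such a grid in its own reference frame, and surface nodes of neighbouring particles are mapped into that frame before the interpolated value decides contact and penetration depth.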

  6. A coupling of empirical explosive blast loads to ALE air domains in LS-DYNA®

    International Nuclear Information System (INIS)

    Slavik, Todd P

    2010-01-01

    A coupling method recently implemented in LS-DYNA® allows empirical explosive blast loads to be applied to air domains treated with the multi-material arbitrary Lagrangian-Eulerian (ALE) formulation. Previously, when simulating structures subjected to blast loads, two methods of analysis were available: a purely Lagrangian approach or one involving the ALE and Lagrangian formulations coupled with a fluid-structure interaction (FSI) algorithm. In the former, air blast pressure is computed with empirical equations and directly applied to Lagrangian elements of the structure. In the latter approach, the explosive as well as the air are explicitly modeled, and the blast wave propagating through the ALE air domain impinges on the Lagrangian structure through FSI. Since the purely Lagrangian approach avoids modeling the air between the explosive and the structure, a significant computational cost savings can be realized, especially when large standoff distances are considered. The shortcoming of the empirical blast equations is their inability to account for focusing or shadowing of the blast waves due to their interaction with structures which may intervene between the explosive and the primary structure of interest. The new method presented here obviates modeling the explosive and the air leading up to the structure. Instead, only the air immediately surrounding the Lagrangian structures need be modeled with ALE, while effects of the far-field blast are applied to the outer face of that ALE air domain with the empirical blast equations; thus, focusing and shadowing effects can be accommodated yet computational costs are kept to a minimum. Comparison of the efficiency and accuracy of this new method with other approaches shows that the ability of LS-DYNA® to model a variety of new blast scenarios has been greatly extended.

  7. Whole-genome sequencing of Bacillus velezensis LS69, a strain with a broad inhibitory spectrum against pathogenic bacteria.

    Science.gov (United States)

    Liu, Guoqiang; Kong, Yingying; Fan, Yajing; Geng, Ce; Peng, Donghai; Sun, Ming

    2017-05-10

    Bacillus velezensis LS69 was found to exhibit antagonistic activity against a diverse spectrum of pathogenic bacteria. It has one circular chromosome of 3,917,761 bp with 3,643 open reading frames. Genome analysis identified ten gene clusters involved in the nonribosomal synthesis of polyketides (macrolactin, bacillaene and difficidin), lipopeptides (surfactin, fengycin, bacilysin and iturin A) and bacteriocins (amylolysin and amylocyclicin). In addition, B. velezensis LS69 was found to contain a series of genes involved in enhancing plant growth and triggering plant immunity. Whole-genome sequencing of Bacillus velezensis LS69 will provide a basis for the elucidation of its biocontrol mechanisms and facilitate its applications in the future. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Investigation of an Alternative Fuel Form for the Liquid Salt Cooled Very High Temperature Reactor (LS-VHTR)

    International Nuclear Information System (INIS)

    Casino, William A. Jr.

    2006-01-01

    Much of the recent research investigating the use of liquid salts as reactor coolants has utilized a core configuration of graphite prismatic fuel block assemblies with TRISO particles embedded into cylindrical fuel compacts arranged in a triangular pitch lattice. Although many calculations have been performed for this fuel form in gas-cooled reactors, it would be instructive to investigate whether an alternative fuel form may yield improved performance for the liquid salt-cooled Very High Temperature Reactor (LS-VHTR). This study investigates how variations in the fuel form impact the performance of the LS-VHTR during normal and accident conditions and compares the results with a similar analysis that was recently completed for an LS-VHTR core made up of prismatic block fuel. (author)

  9. BCM-2.0 - The new version of computer code "Basic Channeling with Mathematica©"

    Science.gov (United States)

    Abdrashitov, S. V.; Bogdanov, O. V.; Korotchenko, K. B.; Pivovarov, Yu. L.; Rozhkova, E. I.; Tukhfatullin, T. A.; Eikhorn, Yu. L.

    2017-07-01

    A new symbolic-numerical code devoted to the investigation of channeling phenomena in the periodic potential of a crystal has been developed. The code is written in the Wolfram Language, taking advantage of its analytical programming capabilities. The newly developed packages were successfully applied to simulate scattering, radiation, electron-positron pair production and other effects connected with the channeling of relativistic particles in aligned crystals. The results of the simulation have been validated against data from channeling experiments carried out at SAGA LS.

  10. Xerostomia Quality of Life Scale (XeQoLS) questionnaire: validation of Italian version in head and neck cancer patients.

    Science.gov (United States)

    Lastrucci, Luciana; Bertocci, Silvia; Bini, Vittorio; Borghesi, Simona; De Majo, Roberta; Rampini, Andrea; Gennari, Pietro Giovanni; Pernici, Paola

    2018-01-01

    To translate the Xerostomia Quality-of-Life Scale (XeQoLS) into the Italian language (XeQoLS-IT). Xerostomia is the most relevant acute and late toxicity in patients with head and neck cancer treated with radiotherapy (RT). Patient-reported outcome (PRO) instruments are subjective reports of patients' perception of their health status. The XeQoLS consists of 15 items and measures the impact of salivary gland dysfunction and xerostomia on the four major domains of oral health-related QoL. The XeQoLS-IT was created through a multi-step linguistic validation process: forward translation (TF), backward translation (TB) and administration of the questionnaire to 35 Italian patients with head and neck cancer. The forward translation was independently carried out by two radiation oncologists who were Italian native speakers. The two versions were compared and adapted to obtain a reconciled version, version 1 (V1). V1 was translated back into English by an Italian professional skilled in teaching English. After review of discrepancies and choice of the most appropriate wording for clarity and similarity to the original, version 2 (V2) was reached by consensus. To evaluate version 2, patients completed the XeQoLS-IT questionnaire and also underwent cognitive debriefing. The questionnaire was considered simple by the patients. The clarity of the instructions and the ease of answering the questions had a mean value of 4.5 (± 0.71) on a scale from 1 to 5. A valid multi-step process led to the creation of the final version of the XeQoLS-IT, a suitable instrument for assessing the perception of xerostomia in patients treated with RT.

  11. DESIGN AND EVALUATION OF A TEXTURE CLASSIFIER BASED ON LS-SVM (DISEÑO Y EVALUACIÓN DE UN CLASIFICADOR DE TEXTURAS BASADO EN LS-SVM)

    OpenAIRE

    Beitmantt Cárdenas Quintero; Nelson Enrique Vera Parra; Pablo Emilio Rozo García

    2013-01-01

    To evaluate the performance and computational cost of different Least Squares Support Vector Machine (LS-SVM) architectures and methodologies for texture-based image segmentation and, from those results, to propose a model of an LS-SVM texture classifier. Methodology: Given a binary classification problem represented by the segmentation of 32 images, organized into 4 groups and formed by pairs of typical textures (granite/bark, brick/upholstery, wood/marble, teji...

  12. Hermitian self-dual quasi-abelian codes

    Directory of Open Access Journals (Sweden)

    Herbert S. Palines

    2017-12-01

    Full Text Available Quasi-abelian codes constitute an important class of linear codes containing theoretically and practically interesting codes such as quasi-cyclic codes, abelian codes, and cyclic codes. In particular, the sub-class consisting of 1-generator quasi-abelian codes contains large families of good codes. Based on the well-known decomposition of quasi-abelian codes, the characterization and enumeration of Hermitian self-dual quasi-abelian codes are given. In the case of 1-generator quasi-abelian codes, we offer necessary and sufficient conditions for such codes to be Hermitian self-dual and give a formula for the number of these codes. In the case where the underlying groups are some $p$-groups, the actual number of resulting Hermitian self-dual quasi-abelian codes are determined.
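
    As background to the Hermitian duality used above: over GF(4) the Hermitian inner product of two codewords is the sum of x_i * y_i^2, with conjugation given by the Frobenius map a -> a^2. The following self-contained Python sketch checks Hermitian self-orthogonality of a generator matrix; the [2, 1] repetition-code example is an illustrative assumption, not taken from the paper:

```python
# GF(4) = {0, 1, ω, ω²} encoded as 0, 1, 2, 3; addition is XOR of bit patterns.
MUL = [[0, 0, 0, 0],
       [0, 1, 2, 3],
       [0, 2, 3, 1],   # ω·ω = ω²,  ω·ω² = 1
       [0, 3, 1, 2]]

def gf4_add(a, b):
    return a ^ b

def gf4_mul(a, b):
    return MUL[a][b]

def gf4_conj(a):
    return gf4_mul(a, a)      # Frobenius conjugation: a -> a²

def hermitian_dot(x, y):
    """Hermitian inner product <x, y> = sum_i x_i · y_i² over GF(4)."""
    s = 0
    for xi, yi in zip(x, y):
        s = gf4_add(s, gf4_mul(xi, gf4_conj(yi)))
    return s

def is_hermitian_self_orthogonal(G):
    # the code is Hermitian self-orthogonal iff all pairs of generator
    # rows are orthogonal under the Hermitian inner product
    return all(hermitian_dot(g, h) == 0 for g in G for h in G)

# the [2, 1] repetition code over GF(4) with generator (1, 1):
# its dimension is n/2 and it is self-orthogonal, hence Hermitian self-dual
G = [(1, 1)]
print(is_hermitian_self_orthogonal(G))   # True
```

Self-orthogonality plus dimension k = n/2 is exactly the Hermitian self-duality condition that the paper characterizes and counts for quasi-abelian codes.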

  13. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  14. The generation of recombinant influenza A viruses expressing a PB2 fusion protein requires the conservation of a packaging signal overlapping the coding and noncoding regions at the 5' end of the PB2 segment

    International Nuclear Information System (INIS)

    Dos Santos Afonso, Emmanuel; Escriou, Nicolas; Leclercq, India; Werf, Sylvie van der; Naffakh, Nadia

    2005-01-01

    We generated recombinant A/WSN/33 influenza A viruses expressing a PB2 protein fused to a Flag epitope at the N- (Flag-PB2) or C-terminus (PB2-Flag), which replicated efficiently and proved to be stable upon serial passage in vitro on MDCK cells. Rescue of PB2-Flag viruses required that the 5' end of the PB2 segment be kept identical to the wild type beyond the 34 noncoding terminal nucleotides. This feature was achieved by a duplication of the last 109 nucleotides encoding PB2 between the Flag sequence and the 5'NCR. In PB2 minigenome rescue experiments, both the 5' and 3' coding ends of the PB2 segment were found to promote the incorporation of minigenomes into virions. However, the presence of the Flag sequence at the junction between the 3'NCR and the coding sequence did not prevent the rescue of Flag-PB2 viruses. Our observations define requirements that may be useful for the purpose of engineering influenza RNAs.

  15. Cavity structural integrity evaluation of steam explosion using LS-DYNA

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dae-Young; Park, Chang-Hwan [FNC Technology Co. Ltd., Yongin (Korea, Republic of); Kim, Kap-sun [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    For investigating the mechanical response of the newly designed NPP against a steam explosion, a cavity structural integrity evaluation was performed, in which the mechanical load resulting from a steam explosion in the reactor cavity was calculated. In the evaluation, two kinds of approaches were considered: a deterministic one and a probabilistic one. In this report, the procedure and the results of the deterministic analysis are presented. When entering a severe accident, the core is relocated to the lower head. In this case, an Ex-Vessel Steam Explosion (EVSE) can occur. It can threaten the structural integrity of the cavity due to the load applied to the walls or slabs of the cavity. The large amount of energy transmitted from the interaction between the molten corium and the water causes a dynamic loading onto the concrete walls, which not only affects the survivability of various equipment but also threatens the integrity of the containment. In this report, the response of the cavity wall structure is analyzed using a nonlinear finite element analysis (FEA) code. The resulting stress and strain of the structure were evaluated against the criteria in NEI 07-13. A deterministic analysis was performed via finite element analysis for the dynamic load generated by the steam explosion to investigate the effect on the cavity structure, using specific values of the material properties and a clearly defined steam explosion pressure curve. The results showed that the rebar and the liner are kept intact even at the high pressure pulse given by the steam explosion. The liner integrity is more critical for judging the preservation of leak-tightness. In the meantime, cracks were found in the concrete media.

  16. Horizontal transfer generates genetic variation in an asexual pathogen

    Directory of Open Access Journals (Sweden)

    Xiaoqiu Huang

    2014-10-01

    Full Text Available There are major gaps in the understanding of how genetic variation is generated in the asexual pathogen Verticillium dahliae. On the one hand, V. dahliae is a haploid organism that reproduces clonally. On the other hand, single-nucleotide polymorphisms and chromosomal rearrangements have been found between V. dahliae strains. Lineage-specific (LS) regions comprising about 5% of the genome are highly variable between V. dahliae strains. Nonetheless, it is unknown whether horizontal gene transfer plays a major role in generating genetic variation in V. dahliae. Here, we analyzed a previously sequenced V. dahliae population of nine strains from various geographical locations and hosts. We found highly homologous elements in LS regions of each strain; LS regions of V. dahliae strain JR2 are much richer in highly homologous elements than the core genome. In addition, we discovered, in LS regions of JR2, several structural forms of nonhomologous recombination, and two or three homologous sequence types of each form, with almost every sequence type present in an LS region of another strain. A large section of one of the forms is known to be horizontally transferred between V. dahliae strains. We unexpectedly found that 350 kilobases of dynamic LS regions were much more conserved than the core genome between V. dahliae and a closely related species (V. albo-atrum), suggesting that these LS regions were horizontally transferred recently. Our results support the view that genetic variation in LS regions is generated by horizontal transfer between strains and by the chromosomal reshuffling reported previously.

  17. Immunomodulatory Effects of Lactobacillus salivarius LS01 and Bifidobacterium breve BR03, Alone and in Combination, on Peripheral Blood Mononuclear Cells of Allergic Asthmatics.

    Science.gov (United States)

    Drago, Lorenzo; De Vecchi, Elena; Gabrieli, Arianna; De Grandi, Roberta; Toscano, Marco

    2015-07-01

    The aim of this study was to evaluate the probiotic characteristics of Lactobacillus salivarius LS01 and Bifidobacterium breve BR03, alone and in combination, and their immunomodulatory activity in asthmatic subjects. Subjects affected by allergic asthma were recruited. Initially, LS01 and BR03 were analyzed for their growth compatibility by a broth compatibility assay. To study the antimicrobial activity of the probiotic strains, an agar diffusion assay was performed. Finally, cytokine production by peripheral blood mononuclear cells (PBMCs) stimulated with LS01 and BR03 was determined by means of a specific quantitative enzyme-linked immunosorbent assay (ELISA). The growth of some clinical pathogens was slightly inhibited by LS01 and LS01-BR03 co-culture supernatant not neutralized to pH 6.5, while only the growth of E. coli and S. aureus was inhibited by the supernatant of LS01 and LS01-BR03 neutralized to pH 6.5. Furthermore, the LS01 and BR03 combination was able to decrease the secretion of proinflammatory cytokines by PBMCs, leading to a marked increase in IL-10 production. L. salivarius LS01 and B. breve BR03 showed promising probiotic properties and beneficial immunomodulatory activity, which are increased when the 2 strains are used in combination in the same formulation.

  18. Discrete Sparse Coding.

    Science.gov (United States)

    Exarchakis, Georgios; Lücke, Jörg

    2017-11-01

    Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) apply the algorithm on the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.
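
    A minimal sketch of the kind of discrete-latent generative model described above, here with binary latents and an exact (non-truncated) E-step over all 2^H states for a small model. All dimensions, the fixed noise level, and the shared Bernoulli prior are illustrative assumptions rather than the authors' setup, which truncates the E-step and learns a general discrete prior:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
D, H, N = 8, 4, 500                 # data dimension, number of latents, samples
W_true = rng.standard_normal((D, H))
pi_true, sigma = 0.2, 0.1           # sparseness and (assumed known) noise level

# generate data from the binary sparse-coding model y = W s + noise
S = (rng.random((N, H)) < pi_true).astype(float)
Y = S @ W_true.T + sigma * rng.standard_normal((N, D))

# all 2^H binary latent states (feasible only for small H; expectation
# truncation would instead restrict the sum to the most probable states)
states = np.array(list(itertools.product([0, 1], repeat=H)), dtype=float)

W = rng.standard_normal((D, H))     # random initialization
pi = 0.5
for _ in range(50):
    # E-step: exact posterior over all discrete states
    mean = states @ W.T                                           # (2^H, D)
    log_lik = -((Y[:, None, :] - mean[None]) ** 2).sum(-1) / (2 * sigma**2)
    log_prior = (states * np.log(pi) + (1 - states) * np.log(1 - pi)).sum(1)
    log_post = log_lik + log_prior
    log_post -= log_post.max(1, keepdims=True)
    post = np.exp(log_post)
    post /= post.sum(1, keepdims=True)                            # (N, 2^H)
    # M-step: closed-form updates for W and the Bernoulli prior
    Es = post @ states                                            # E[s], (N, H)
    Ess = np.einsum('nk,kh,kg->hg', post, states, states)         # sum_n E[s s^T]
    W = np.linalg.solve(Ess + 1e-6 * np.eye(H), Es.T @ Y).T       # small ridge for stability
    pi = Es.mean()

print(round(float(pi), 2))  # learned sparseness (the data were generated with 0.2)
```

Replacing the enumeration of all 2^H states by a data-dependent subset of high-probability states is the essence of the truncated expectation-maximization the letter builds on.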

  19. IL WORKSHOP DI FONETICA IN ITALIANO L2/LS: ACCENTO DI PAROLA E DURATA SILLABICA

    Directory of Open Access Journals (Sweden)

    Lidia Calabrò

    2016-09-01

    Full Text Available Working on the phonetic and phonological aspects of a foreign language is extremely important and fundamental, especially when the learners' L1 is very distant from the L2/FL. Drawing on an experience begun with Chinese-speaking students of the Marco Polo / Turandot project, the activities included in the Italian L2 phonetics workshop are presented as a supplementary teaching proposal intended to sensitize students to the suprasegmental features of the L2 in contrast with those of their L1. This contribution presents some activities concerning the perception and realization of vowel length and of the stressed syllable through collaborative activities and body movement. All the activities call for the personal and total involvement of the individual learner and of the whole class, since the workshop draws on multimodality, multimedia and collaborative learning in order to discover the sounds of the L2 and reflect on their perception and production. Italian L2/LS phonetic workshops: word accent and syllable duration. Working on the phonetic and phonological aspects of a language is considered important and fundamental, above all if a learner's L1 is distant from the SL. The activities described in the paper took place during a phonetics workshop in Italian as a second language with Chinese students enrolled in the Marco Polo/Turandot project. They can be considered a teaching technique aimed at raising student awareness of the SL suprasegmental aspects in contrast to their L1. Some activities related to vowel length and stress will be presented: perception and production, cooperative learning and body movement aimed at discovering and practicing vowel length in stressed syllables (phonic stress). The activities involved the students in a personal, total and physical collaboration together with their classmates. The multimedia workshop was designed to be multimodal and cooperative, in order to

  20. The Redox Code.

    Science.gov (United States)

    Jones, Dean P; Sies, Helmut

    2015-09-20

    The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O₂ and H₂O₂ contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine.