WorldWideScience

Sample records for ls code generator

  1. Extending JPEG-LS for low-complexity scalable video coding

    DEFF Research Database (Denmark)

    Ukhanova, Anna; Sergeev, Anton; Forchhammer, Søren

    2011-01-01

    JPEG-LS, the well-known international standard for lossless and near-lossless image compression, was originally designed for non-scalable applications. In this paper we propose a scalable modification of JPEG-LS and compare it with the leading image and video coding standards JPEG2000 and H.264/SVC...

  2. A study on the modeling techniques using LS-INGRID

    Energy Technology Data Exchange (ETDEWEB)

    Ku, J. H.; Park, S. W

    2001-03-01

For the development of radioactive material transport packages, the structural safety of a package against the free-drop impact accident must be verified. The use of LS-DYNA, a code specially developed for impact analysis, is essential for impact analysis of such packages. LS-INGRID is a pre-processor for LS-DYNA with considerable capability to deal with complex geometries, and it allows parametric modeling. LS-INGRID is most effective in combination with the LS-DYNA code. Although LS-INGRID seems difficult to use relative to many commercial mesh generators, the productivity of users performing parametric modeling tasks with LS-INGRID can be much higher in some cases. Therefore, LS-INGRID has to be used with LS-DYNA. This report presents basic explanations of the structure and commands, basic modelling examples and advanced modelling with LS-INGRID, so that it can be used for the impact analysis of various packages. By studying the basic examples presented in this report, from the modelling through to the loading and constraint conditions, new users can easily build complex models.

  3. A preliminary neutronic evaluation and depletion study of VHTR and LS-VHTR reactors using the codes: WIMSD5 and MCNPX

    International Nuclear Information System (INIS)

    Silva, Fabiano C.; Pereira, Claubia; Costa, Antonella Lombardi; Veloso, Maria Auxiliadora Fortini

    2009-01-01

It is expected that, in the future, besides electricity generation, reactors will also support secondary activities, such as hydrogen generation and seawater desalination. Generation IV reactors are expected to possess special characteristics, such as high safety, minimization of the amount of radioactive waste, and the ability to use reprocessed fuel with non-proliferating projects in their cycles. Among the Generation IV reactor projects available nowadays, the High Temperature Reactors (HTR) are highlighted due to these desirable characteristics. Under such circumstances, such reactors may be able to reach significantly higher thermal power ratings, to be used for hydrogen production, without loss of safety, even in an emergency. For this work, we have chosen two prismatic HTR concepts: the Very High Temperature Reactor (VHTR) and the Liquid-Salt-cooled Very High Temperature Reactor (LS-VHTR). The principal difference between them is the coolant. The VHTR uses helium gas as coolant and has a burnup of 101,661 MWd/THM, while the LS-VHTR uses a low-pressure liquid molten fluoride salt coolant with a boiling point near 1500 °C, working at 155,946 MWd/THM. The ultimate power output is limited by the capacity of the passive decay system; this capacity is limited by the reactor vessel temperature. The goal was to evaluate the neutronic behavior and fuel composition during burnup using the codes WIMSD5 (Winfrith Improved Multi-Group Scheme) and MCNPX2.6, the first deterministic and the second stochastic. For both reactors, burned fuel of type 'C' coming from the Angra-I nuclear plant, in Brazil, was used, with 3.1% initial enrichment, burned to 33,000 MWd/THM using the ORIGEN2.1 code, divided in three steps of 11,000 MWd/THM, with an average power density of 37.75 MWd/THM and 5 years of cooling in the pool. Finally, the fuel was reprocessed by the Purex technique, extracting 99.9% of the Pu, and the desired amount of fissile material (15%) to achieve the final mixed oxide was

  4. Controlling a Conventional LS-pump based on Electrically Measured LS-pressure

    DEFF Research Database (Denmark)

    Pedersen, Henrik Clemmensen; Andersen, Torben Ole; Hansen, Michael Rygaard

    2008-01-01

As a result of the increasing use of sensors in mobile hydraulic equipment, the need for hydraulic pilot lines is decreasing, as they are being replaced by electrical wiring and electrically controllable components. For controlling some of the existing hydraulic components there is, however, still a need...... this system, by either generating a copy of the LS-pressure, the LS-pressure being the output, or letting the output be the pump pressure. The focus of the current paper is on the controller design based on the first approach. Specifically, a controlled leakage flow is used to avoid the need for a switching...

  5. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Science.gov (United States)

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the
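
    A central step described above is deisotoping: inferring charge states and monoisotopic masses from isotopic peak patterns. The following is a minimal illustrative sketch of that idea in Python; it is not the Decon2LS code or API (which is C#), and the peak values are made up.

      # Illustrative sketch (not the Decon2LS API): infer a peptide ion's charge state
      # from the spacing of its isotopic peaks and convert m/z to neutral monoisotopic mass.
      PROTON_MASS = 1.007276  # Da

      def charge_from_isotope_spacing(mz_peaks):
          """Isotopic peaks of a z-charged ion are spaced ~1.0/z apart in m/z."""
          spacing = mz_peaks[1] - mz_peaks[0]
          return round(1.0 / spacing)

      def neutral_monoisotopic_mass(mz, charge):
          """Remove the charge-carrying protons to recover the neutral mass."""
          return mz * charge - charge * PROTON_MASS

      peaks = [785.8421, 786.3435, 786.8450]   # hypothetical doubly charged peptide
      z = charge_from_isotope_spacing(peaks)   # -> 2
      print(z, neutral_monoisotopic_mass(peaks[0], z))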

  6. Standard LS-TTL IC data book

    International Nuclear Information System (INIS)

    1997-05-01

This book is one of a series of semiconductor device data manuals. It introduces the standard logic 74LS and FAST series. It contains general information, circuit characteristics, notes on design and test, and FAST data sheets, which cover gates, flip-flops, decoders and encoders, counters with master reset, shift registers, octal buffers/line drivers/3-state devices, generator/checkers, full adders, error detection and correction circuit controllers, and synchronous address multiplexers. It also lists LS data sheets including NAND gates, NOR gates, hex inverters, delay elements, frequency dividers, decade counters, function generators, dual decade counters, memory cycle controllers and voltage-controlled oscillators.

  7. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  8. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

Automated code generation is an important element of model driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable....

  9. Studi Eksperimen dan Numerik Pengaruh Penambahan Vortex Generator pada Airfoil NASA LS-0417

    Directory of Open Access Journals (Sweden)

    Ulul Azmi

    2017-03-01

Full Text Available Boundary layer separation is an important phenomenon affecting airfoil performance. One way to delay or eliminate flow separation is to increase the fluid momentum so that it can counter the adverse pressure gradient and the surface shear stress, which moves the separation point further downstream. This can be achieved by adding a turbulent generator on the upper surface of the airfoil. The vortex generator (VG) is one type of turbulent generator that accelerates the transition from a laminar to a turbulent boundary layer. This study therefore aims to determine the effect of VG placement and height on the development of the turbulent boundary layer and hence on airfoil performance. The study was carried out experimentally and numerically at Re = 1.41×10^5 with an angle of attack of 16°. The test model was the NASA LS-0417 airfoil with and without VGs. The variations of VG placement and height were x/c = 0.1, 0.2, 0.3, 0.4 and h = 1 mm, 3 mm, 5 mm. The results show that the most effective configuration is a vortex generator at x/c = 0.3 with h = 1 mm, for which the CL/CD value increases by 14.337%.

  10. Evaluation of the efficiency and fault density of software generated by code generators

    Science.gov (United States)

    Schreur, Barbara

    1993-01-01

Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs supplied through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking. Some check only the finished product, while some allow checking of individual modules and combined sets of modules as well. Considering NASA's requirements for reliability, it must be determined whether automatically generated code can match the reliability of in-house manually generated code. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification of these claims is therefore warranted.

  11. Towards Product Lining Model-Driven Development Code Generators

    OpenAIRE

    Roth, Alexander; Rumpe, Bernhard

    2015-01-01

A code generator systematically transforms compact models to detailed code. Today, code generation is regarded as an integral part of model-driven development (MDD). Despite its relevance, the development of code generators is an inherently complex task, and common methodologies and architectures are lacking. Additionally, reuse and extension of existing code generators exists only for individual parts. A systematic development and reuse based on a code generator product line is still in its inf...

  12. Automatic code generation in practice

    DEFF Research Database (Denmark)

    Adam, Marian Sorin; Kuhrmann, Marco; Schultz, Ulrik Pagh

    2016-01-01

Mobile robots often use a distributed architecture in which software components are deployed to heterogeneous hardware modules. Ensuring the consistency with the designed architecture is a complex task, notably if functional safety requirements have to be fulfilled. We propose to use a domain-specific language to specify those requirements and to allow for generating a safety-enforcing layer of code, which is deployed to the robot. The paper at hand reports experiences in practically applying code generation to mobile robots. For two cases, we discuss how we addressed challenges, e.g., regarding weaving code generation into proprietary development environments and testing of manually written code. We find that a DSL based on the same conceptual model can be used across different kinds of hardware modules, but a significant adaptation effort is required in practical scenarios involving different kinds...

  13. FCG: a code generator for lazy functional languages

    NARCIS (Netherlands)

    Kastens, U.; Langendoen, K.G.; Hartel, Pieter H.; Pfahler, P.

    1992-01-01

The FCG code generator produces portable code that supports efficient two-space copying garbage collection. The code generator transforms the output of the FAST compiler front end into an abstract machine code. This code explicitly uses a call stack, which is accessible to the garbage collector. In

  14. New GOES satellite synchronized time code generation

    Science.gov (United States)

    Fossler, D. E.; Olson, R. K.

    1984-01-01

    The TRAK Systems' GOES Satellite Synchronized Time Code Generator is described. TRAK Systems has developed this timing instrument to supply improved accuracy over most existing GOES receiver clocks. A classical time code generator is integrated with a GOES receiver.

  15. Research on Primary Shielding Calculation Source Generation Codes

    Science.gov (United States)

    Zheng, Zheng; Mei, Qiliang; Li, Hui; Shangguan, Danhua; Zhang, Guangchun

    2017-09-01

Primary Shielding Calculation (PSC) plays an important role in reactor shielding design and analysis. In order to facilitate PSC, a source generation code is developed to generate cumulative distribution functions (CDFs) for the source particle sampling code of the J Monte Carlo Transport (JMCT) code, and a source particle sampling code is developed to sample source particle directions, types, coordinates, energies and weights from the CDFs. A source generation code is also developed to transform three-dimensional (3D) power distributions in xyz geometry into source distributions in r-θ-z geometry for the J Discrete Ordinate Transport (JSNT) code. Validations on the PSC models of the Qinshan No. 1 nuclear power plant (NPP) and the CAP1400 and CAP1700 reactors are performed. Numerical results show that the theoretical model and the codes are both correct.
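
    The build-a-CDF-then-sample pattern described above can be sketched generically. The snippet below is only an illustration of inverse-transform sampling from a binned source distribution (helper names are hypothetical, not the JMCT/JSNT interfaces).

      # Minimal sketch of the CDF build/sample pattern (hypothetical helper names):
      # turn a binned power distribution into a cumulative distribution function and
      # sample source-cell indices from it by inverse-transform sampling.
      import bisect
      import random

      def build_cdf(power_per_cell):
          total = sum(power_per_cell)
          cdf, running = [], 0.0
          for p in power_per_cell:
              running += p / total
              cdf.append(running)
          return cdf

      def sample_cell(cdf):
          # Find the first bin whose cumulative probability reaches the random number.
          return bisect.bisect_left(cdf, random.random())

      cdf = build_cdf([4.0, 2.0, 1.0, 1.0])   # relative powers of four source cells
      counts = [0] * 4
      for _ in range(10000):
          counts[sample_cell(cdf)] += 1
      print(counts)   # roughly proportional to 4:2:1:1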

  16. PetriCode: A Tool for Template-Based Code Generation from CPN Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Code generation is an important part of model driven methodologies. In this paper, we present PetriCode, a software tool for generating protocol software from a subclass of Coloured Petri Nets (CPNs). The CPN subclass is comprised of hierarchical CPN models describing a protocol system at different...

  17. Code Generation from Pragmatics Annotated Coloured Petri Nets

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

limited work has been done on transforming CPN models into protocol implementations. The goal of the thesis is to be able to automatically generate high-quality implementations of communication protocols based on CPN models. In this thesis, we develop a methodology for generating implementations of protocols...... third party libraries and the code should be easily usable by third party code. Finally, the code should be readable by developers with expertise on the considered platforms. In this thesis, we show that our code generation approach is able to generate code for a wide range of platforms without altering...... such as games and rich web applications. Finally, we conclude the evaluation of the criteria of our approach by using the WebSocket PA-CPN model to show that we are able to verify fairly large protocols....

  18. COSINE software development based on code generation technology

    International Nuclear Information System (INIS)

    Ren Hao; Mo Wentao; Liu Shuo; Zhao Guang

    2013-01-01

Code generation technology can significantly improve the quality and productivity of software development and reduce software development risk. At present, code generators are usually based on UML model-driven technology, which cannot satisfy the development demands of nuclear power calculation software. The features of scientific computing programs were analyzed, and a FORTRAN code generator (FCG) based on C# was developed in this paper. FCG can automatically generate FORTRAN module variable definition code according to input metadata. FCG can also generate memory allocation interfaces for dynamic variables as well as data access interfaces. FCG was applied to the development of the core and system integrated engine for design and analysis (COSINE) software. The results show that FCG can greatly improve the development efficiency of nuclear power calculation software and reduce the defect rate of software development. (authors)
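
    The idea of metadata-driven generation of Fortran declarations can be sketched briefly. The snippet below is a hedged illustration in Python; the metadata schema and output layout are hypothetical and are not COSINE's or FCG's actual formats.

      # Hedged sketch of metadata-driven Fortran code generation in the spirit of FCG
      # (the metadata schema and output layout here are hypothetical, not COSINE's).
      variables = [
          {"name": "temp",  "type": "real(8)", "rank": 2},   # dynamic 2-D array
          {"name": "nstep", "type": "integer", "rank": 0},   # scalar
      ]

      def fortran_module(module, variables):
          lines = [f"module {module}_data", "  implicit none"]
          for v in variables:
              dims = "(" + ",".join(":" for _ in range(v["rank"])) + ")" if v["rank"] else ""
              alloc = ", allocatable" if v["rank"] else ""
              lines.append(f"  {v['type']}{alloc} :: {v['name']}{dims}")
          lines.append(f"end module {module}_data")
          return "\n".join(lines)

      print(fortran_module("core", variables))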

  19. Optimized Method for Generating and Acquiring GPS Gold Codes

    Directory of Open Access Journals (Sweden)

    Khaled Rouabah

    2015-01-01

Full Text Available We propose a simpler and faster Gold code generator, which can be efficiently initialized to any desired code with a minimum delay. Its principle consists of generating only one sequence (code number 1), from which all the other signal codes can be produced. This is realized by simply shifting this sequence by different delays that are judiciously determined by using the bicorrelation function characteristics. This is in contrast to the classical Linear Feedback Shift Register (LFSR)-based Gold code generator, which requires, in addition to the shift process, a significant number of logic XOR gates and a phase selector to change the code. The presence of all these logic XOR gates in the classical LFSR-based Gold code generator causes additional time to be consumed in the generation and acquisition processes. In addition to its simplicity and rapidity, the proposed architecture, due to the total absence of XOR gates, requires fewer resources than the conventional Gold generator and can thus be produced at lower cost. Digital Signal Processing (DSP) implementations have shown that the proposed architecture presents a solution for acquiring Global Positioning System (GPS) satellite signals optimally and in a parallel way.
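
    For reference, the classical two-LFSR scheme that the proposed architecture replaces can be sketched as follows. The G1/G2 feedback taps are the standard GPS C/A polynomials; the delay value below is only an example and does not correspond to a specific PRN.

      # Sketch of the classical LFSR-based Gold (GPS C/A) code generator that the
      # proposed architecture replaces.
      def lfsr_sequence(taps, length=1023, nbits=10):
          reg = [1] * nbits                      # registers initialised to all ones
          out = []
          for _ in range(length):
              out.append(reg[-1])                # output is the last stage
              fb = 0
              for t in taps:                     # feedback = XOR of tapped stages
                  fb ^= reg[t - 1]
              reg = [fb] + reg[:-1]              # shift, insert feedback at stage 1
          return out

      g1 = lfsr_sequence(taps=[3, 10])                 # G1: 1 + x^3 + x^10
      g2 = lfsr_sequence(taps=[2, 3, 6, 8, 9, 10])     # G2: 1 + x^2 + x^3 + x^6 + x^8 + x^9 + x^10
      delay = 5                                        # example chip delay selecting one Gold code
      ca_code = [g1[i] ^ g2[(i + delay) % 1023] for i in range(1023)]
      print(ca_code[:20])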

  20. Radionuclide daughter inventory generator code: DIG

    International Nuclear Information System (INIS)

    Fields, D.E.; Sharp, R.D.

    1985-09-01

The Daughter Inventory Generator (DIG) code accepts a tabulation of the radionuclides initially present in a waste stream, specified as amounts present either by mass or by activity, and produces a tabulation of the radionuclides present after a user-specified elapsed time. This resultant radionuclide inventory characterizes wastes that have undergone daughter ingrowth during subsequent processes, such as leaching and transport, and includes daughter radionuclides that should be considered in these subsequent processes or for inclusion in a pollutant source term. Output of the DIG code also summarizes radionuclide decay constants. The DIG code was developed specifically to assist the user of the PRESTO-II methodology and code in preparing data sets and accounting for possible daughter ingrowth in wastes buried in shallow-land disposal areas. The DIG code is also useful in preparing data sets for the PRESTO-EPA code. Daughter ingrowth in buried radionuclides and in radionuclides that have been leached from the wastes and are undergoing hydrologic transport is considered, and the quantities of daughter radionuclides are calculated. Radionuclide decay constants generated by DIG and included in the DIG output are required in the PRESTO-II code input data set. DIG accesses some subroutines written for use with the CRRIS system and accesses files containing radionuclide data compiled by D.C. Kocher. 11 refs
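
    Daughter ingrowth of this kind follows the Bateman equations. Below is a minimal sketch for a two-member chain (parent to daughter), illustrating the sort of inventory DIG tabulates; it is not the DIG code itself, and the example nuclide data are standard published half-lives.

      # Minimal Bateman-equation sketch for a two-member decay chain (parent -> daughter).
      import math

      def two_member_chain(n1_0, n2_0, lam1, lam2, t):
          """Return (N1, N2) atom counts after elapsed time t (time units of 1/lambda)."""
          n1 = n1_0 * math.exp(-lam1 * t)
          n2 = (n2_0 * math.exp(-lam2 * t)
                + n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t)))
          return n1, n2

      # Example: Sr-90 (T1/2 ~ 28.8 y) -> Y-90 (T1/2 ~ 64.1 h), decay constants in 1/year.
      lam_sr = math.log(2) / 28.8
      lam_y = math.log(2) / (64.1 / 8766.0)
      n_sr, n_y = two_member_chain(1.0e20, 0.0, lam_sr, lam_y, t=5.0)
      print(n_sr, n_y, lam_sr * n_sr, lam_y * n_y)   # atoms and activities after 5 years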

  1. Two-Level Semantics and Code Generation

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nielson, Hanne Riis

    1988-01-01

A two-level denotational metalanguage that is suitable for defining the semantics of Pascal-like languages is presented. The two levels allow for an explicit distinction between computations taking place at compile-time and computations taking place at run-time. While this distinction is perhaps not absolutely necessary for describing the input-output semantics of programming languages, it is necessary when issues such as data flow analysis and code generation are considered. For an example stack-machine, the authors show how to generate code for the run-time computations and still perform the compile...

  2. Improved side information generation for distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2008-01-01

As a new coding paradigm, distributed video coding (DVC) deals with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. The performance of DVC highly depends on the quality of side information. With a better side information generation method, fewer bits will be requested from the encoder and more reliable decoded frames will be obtained. In this paper, a side information generation method is introduced to further improve the rate-distortion (RD) performance of transform domain distributed video coding. This algorithm...

  3. SWAAM code development, verification and application to steam generator design

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes developed by Argonne National Laboratory to analyze the effects of sodium/water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and to predict the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The theoretical foundations and numerical treatments on which the codes are based are discussed, followed by a description of code capabilities and limitations, verification of the codes by comparison with experiment, and applications to steam generator and IHTS design. (author). 25 refs, 14 figs

  4. Forsskåls Fiskeherbarium

    DEFF Research Database (Denmark)

    Provencal, Philippe

    2017-01-01

A description of Peter Forsskål's fish herbarium and his methods in marine-biological fieldwork during his travels in Egypt, the Red Sea and Yemen as a member of the Arabian Journey expedition of 1761-67.

  5. The large-scale blast score ratio (LS-BSR) pipeline: a method to rapidly compare genetic content between bacterial genomes

    Directory of Open Access Journals (Sweden)

    Jason W. Sahl

    2014-04-01

Full Text Available Background. As whole genome sequence data from bacterial isolates becomes cheaper to generate, computational methods are needed to correlate sequence data with biological observations. Here we present the large-scale BLAST score ratio (LS-BSR) pipeline, which rapidly compares the genetic content of hundreds to thousands of bacterial genomes and returns a matrix that describes the relatedness of all coding sequences (CDSs) in all genomes surveyed. This matrix can be easily parsed in order to identify genetic relationships between bacterial genomes. Although pipelines have been published that group peptides by sequence similarity, no other software performs the rapid, large-scale, full-genome comparative analyses carried out by LS-BSR. Results. To demonstrate the utility of the method, the LS-BSR pipeline was tested on 96 Escherichia coli and Shigella genomes; the pipeline ran in 163 min using 16 processors, which is a greater than 7-fold speedup compared to using a single processor. The BSR values for each CDS, which indicate a relative level of relatedness, were then mapped to each genome on an independent core genome single nucleotide polymorphism (SNP)-based phylogeny. Comparisons were then used to identify clade-specific CDS markers and validate the LS-BSR pipeline based on molecular markers that delineate between classical E. coli pathogenic variant (pathovar) designations. Scalability tests demonstrated that the LS-BSR pipeline can process 1,000 E. coli genomes in 27–57 h, depending upon the alignment method, using 16 processors. Conclusions. LS-BSR is an open-source, parallel implementation of the BSR algorithm, enabling rapid comparison of the genetic content of large numbers of genomes. The results of the pipeline can be used to identify specific markers between user-defined phylogenetic groups, and to identify the loss and/or acquisition of genetic information between bacterial isolates. Taxa-specific genetic markers can then be translated
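
    The BLAST score ratio at the heart of the pipeline normalizes each CDS's best bit score against a genome by that CDS's self-alignment bit score. A tiny sketch of that computation is shown below; the bit scores are made-up values and the real pipeline obtains them from BLAST or another aligner, so this is not the LS-BSR code itself.

      # Sketch of the BLAST score ratio (BSR) computation underlying LS-BSR.
      self_scores = {"cdsA": 950.0, "cdsB": 620.0}        # self-alignment bit scores
      hit_scores = {                                       # best bit score of each CDS vs each genome
          "genome1": {"cdsA": 948.0, "cdsB": 15.0},
          "genome2": {"cdsA": 310.0, "cdsB": 610.0},
      }

      bsr_matrix = {
          genome: {cds: hits.get(cds, 0.0) / self_scores[cds] for cds in self_scores}
          for genome, hits in hit_scores.items()
      }
      for genome, row in sorted(bsr_matrix.items()):
          print(genome, {cds: round(v, 2) for cds, v in row.items()})
      # BSR near 1.0 means the CDS is conserved in that genome; near 0 means absent or highly diverged.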

  6. Auto Code Generation for Simulink-Based Attitude Determination Control System

    Science.gov (United States)

    MolinaFraticelli, Jose Carlos

    2012-01-01

    This paper details the work done to auto generate C code from a Simulink-Based Attitude Determination Control System (ADCS) to be used in target platforms. NASA Marshall Engineers have developed an ADCS Simulink simulation to be used as a component for the flight software of a satellite. This generated code can be used for carrying out Hardware in the loop testing of components for a satellite in a convenient manner with easily tunable parameters. Due to the nature of the embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as it is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation; it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations which can bring new complications into the simulation. The execution order of these models can change based on these modifications. Great care must be taken in order to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is between acceptable bounds. Thus, it can be said that the process is a success since all the output requirements are met. Based on these results, it can be argued that this generated C code can be effectively used by any desired platform as long as it follows the specific memory requirements established in the Simulink Model.

  7. Global Convergence of a Modified LS Method

    Directory of Open Access Journals (Sweden)

    Liu JinKui

    2012-01-01

Full Text Available The LS method is one of the effective conjugate gradient methods for solving unconstrained optimization problems. This paper presents a modified LS method, based on the well-known LS method, and proves its strong global convergence for uniformly convex functions and its global convergence for general functions under the strong Wolfe line search. The numerical experiments show that the modified LS method is very effective in practice.
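
    For context, the classical (unmodified) Liu-Storey update is beta_k = g_{k+1}^T (g_{k+1} - g_k) / (-d_k^T g_k). The sketch below shows plain nonlinear conjugate gradient with this beta; it uses a simple backtracking (Armijo) line search rather than the strong Wolfe search analysed in the paper, and it is not the authors' modified method.

      # Sketch of the classical Liu-Storey (LS) conjugate gradient iteration.
      import numpy as np

      def ls_conjugate_gradient(f, grad, x0, iters=200, tol=1e-8):
          x = np.asarray(x0, dtype=float)
          g = grad(x)
          d = -g
          for _ in range(iters):
              if np.linalg.norm(g) < tol:
                  break
              # Backtracking (Armijo) line search along d
              alpha, fx = 1.0, f(x)
              while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
                  alpha *= 0.5
              x_new = x + alpha * d
              g_new = grad(x_new)
              beta = g_new.dot(g_new - g) / (-d.dot(g))      # Liu-Storey formula
              d = -g_new + max(beta, 0.0) * d                # restart if beta < 0
              x, g = x_new, g_new
          return x

      f = lambda x: (x[0] - 3) ** 2 + 10 * (x[1] + 1) ** 2
      grad = lambda x: np.array([2 * (x[0] - 3), 20 * (x[1] + 1)])
      print(ls_conjugate_gradient(f, grad, [0.0, 0.0]))      # converges to ~[3, -1]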

  8. Improved mesh generator for the POISSON Group Codes

    International Nuclear Information System (INIS)

    Gupta, R.C.

    1987-01-01

    This paper describes the improved mesh generator of the POISSON Group Codes. These improvements enable one to have full control over the way the mesh is generated and in particular the way the mesh density is distributed throughout this model. A higher mesh density in certain regions coupled with a successively lower mesh density in others keeps the accuracy of the field computation high and the requirements on the computer time and computer memory low. The mesh is generated with the help of codes AUTOMESH and LATTICE; both have gone through a major upgrade. Modifications have also been made in the POISSON part of these codes. We shall present an example of a superconducting dipole magnet to explain how to use this code. The results of field computations are found to be reliable within a few parts in a hundred thousand even in such complex geometries

MAGNETOHYDRODYNAMIC EQUATIONS (MHD) GENERATION CODE

    Directory of Open Access Journals (Sweden)

    Francisco Frutos Alfaro

    2017-04-01

Full Text Available A program to generate Fortran and C code for the full magnetohydrodynamic equations is shown. The program uses the free computer algebra system REDUCE. This software has a package called EXCALC, which is an exterior calculus program. The advantage of this program is that it can be modified to include another complex metric or spacetime. The output of this program is modified by means of a Linux script, which creates a new REDUCE program to manipulate the magnetohydrodynamic equations and obtain a code that can be used as a seed for a magnetohydrodynamic code for numerical applications. As an example, we present part of the output of our programs for Cartesian coordinates and show how to do the discretization.
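
    The general workflow (derive an expression symbolically, then emit compilable code to seed a solver) can be illustrated with a different computer algebra system. The sketch below uses SymPy rather than the authors' REDUCE/EXCALC programs, and the discretized continuity term is only an example.

      # Analogous sketch (SymPy, not the authors' REDUCE/EXCALC programs): derive a
      # discretized expression symbolically and emit C code that could seed a solver.
      import sympy as sp

      # Grid symbols: rho and v at neighbouring nodes, grid spacing dx
      rho_m, rho_p, v_m, v_p, dx = sp.symbols("rho_m rho_p v_m v_p dx")

      # Central-difference form of the 1-D continuity term  -d(rho*v)/dx
      drho_dt = -(rho_p * v_p - rho_m * v_m) / (2 * dx)
      print(sp.ccode(drho_dt, assign_to="drho_dt"))
      # prints something like: drho_dt = -1.0/2.0*(rho_p*v_p - rho_m*v_m)/dx;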

  10. SWAAM-code development and verification and application to steam generator designs

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes which were developed by Argonne National Laboratory to analyze the effects of sodium-water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The paper discusses the theoretical foundations and numerical treatments on which the codes are based, followed by a description of code capabilities and limitations, verification of the codes and applications to steam generator and IHTS designs. 25 refs., 14 figs

  11. Next generation Zero-Code control system UI

    CERN Multimedia

    CERN. Geneva

    2017-01-01

Developing ergonomic user interfaces for control systems is challenging, especially during machine upgrade and commissioning, where several small changes may suddenly be required. Zero-code systems, such as *Inspector*, provide agile features for creating and maintaining control system interfaces. Moreover, these next-generation zero-code systems bring simplicity and uniformity and break the boundaries between users and developers. In this talk we present *Inspector*, a CERN-made zero-code application development system, and we introduce the major differences and advantages of using zero-code control systems to develop operational UIs.

  12. Update on the opal opacity code

    International Nuclear Information System (INIS)

    Rogers, F.J.; Iglesias, C.A.; Wilson, B.G.

    1990-01-01

Persisting discrepancies between theory and observation in a number of astrophysical properties have led to the conjecture that opacity databases may be inaccurate. The OPAL opacity code has been developed to address this question. The physical basis of OPAL removes several of the approximations present in past calculations. For example, it utilizes a much larger and more detailed set of atomic data than was used to construct the Los Alamos Astrophysical Library. These data are generated online, in LS or intermediate coupling, from prefitted analytic effective potentials and are of similar quality to single-configuration, relativistic, self-consistent-field calculations. The OPAL code has been used to calculate opacities for the solar core and for Cepheid variable stars. In both cases, significant increases in the opacity compared to the Los Alamos Astrophysical Library were found

  13. A Case for Dynamic Reverse-code Generation to Debug Non-deterministic Programs

    Directory of Open Access Journals (Sweden)

    Jooyong Yi

    2013-09-01

Full Text Available Backtracking (i.e., reverse execution) helps the user of a debugger to naturally think backwards along the execution path of a program, and thinking backwards makes it easy to locate the origin of a bug. So far backtracking has been implemented mostly by state saving or by checkpointing. These implementations, however, inherently do not scale. Meanwhile, a more recent backtracking method based on reverse-code generation seems promising because executing reverse code can restore the previous states of a program without state saving. In the literature, two methods that generate reverse code can be found: (a) static reverse-code generation, which pre-generates reverse code through static analysis before starting a debugging session, and (b) dynamic reverse-code generation, which generates reverse code by applying dynamic analysis on the fly during a debugging session. In particular, we espoused the latter in our previous work to accommodate the non-determinism of a program caused by, e.g., multi-threading. To demonstrate the usefulness of our dynamic reverse-code generation, this article presents a case study of various backtracking methods including ours. We compare the memory usage of various backtracking methods in a simple but nontrivial example, a bounded-buffer program. In the case of non-deterministic programs such as this bounded-buffer program, our dynamic reverse-code generation outperforms the existing backtracking methods in terms of memory efficiency.
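
    The core idea of dynamic reverse-code generation can be sketched in a toy form: while each destructive update executes, emit an inverse statement that restores the overwritten value, so backtracking runs the inverses instead of saving full states. The snippet below is purely illustrative and is not the authors' debugger.

      # Toy sketch of dynamic reverse-code generation.
      state = {"x": 0, "buf": []}
      reverse_code = []                      # inverse statements, generated on the fly

      def run(var, new_value):
          old = state[var]
          reverse_code.append(f"state[{var!r}] = {old!r}")   # inverse of this assignment
          state[var] = new_value

      run("x", 5)
      run("x", 7)
      run("buf", [1, 2])

      # Backtrack two steps by executing the generated reverse code in reverse order.
      for stmt in reversed(reverse_code[-2:]):
          exec(stmt)
      del reverse_code[-2:]
      print(state)    # x restored to 5, buf restored to []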

  14. Evaluation of Material Models within LS-DYNA® for a Kevlar/Epoxy Composite Honeycomb

    Science.gov (United States)

    Polanco, Michael A.; Kellas, Sotiris; Jackson, Karen

    2009-01-01

The performance of material models to simulate a novel composite honeycomb Deployable Energy Absorber (DEA) was evaluated using the nonlinear explicit dynamic finite element code LS-DYNA®. Prototypes of the DEA concept were manufactured using a Kevlar/Epoxy composite material in which the fibers are oriented at ±45° with respect to the loading axis. The development of the DEA has included laboratory tests at subcomponent and component levels such as three-point bend testing of single hexagonal cells, dynamic crush testing of single multi-cell components, and impact testing of a full-scale fuselage section fitted with a system of DEA components onto multi-terrain environments. Due to the thin nature of the cell walls, the DEA was modeled using shell elements. In an attempt to simulate the dynamic response of the DEA, it was first represented using *MAT_LAMINATED_COMPOSITE_FABRIC, or *MAT_58, in LS-DYNA. Values for each parameter within the material model were generated such that an in-plane isotropic configuration for the DEA material was assumed. Analytical predictions showed that the load-deflection behavior of a single-cell during three-point bending was within the range of test data, but predicted the DEA crush response to be very stiff. In addition, a *MAT_PIECEWISE_LINEAR_PLASTICITY, or *MAT_24, material model in LS-DYNA was developed, which represented the Kevlar/Epoxy composite as an isotropic elastic-plastic material with input from ±45° tensile coupon data. The predicted crush response matched that of the test and localized folding patterns of the DEA were captured under compression, but the model failed to predict the single-cell three-point bending response.

  15. Perspectives on the development of next generation reactor systems safety analysis codes

    International Nuclear Information System (INIS)

    Zhang, H.

    2015-01-01

'Full text:' Existing reactor system analysis codes, such as RELAP5-3D and TRAC, have gained worldwide success in supporting reactor safety analyses, as well as design and licensing of new reactors. These codes are important assets to the nuclear engineering research community, as well as to the nuclear industry. However, most of these codes were originally developed during the 1970s, and it becomes necessary to develop next-generation reactor system analysis codes for several reasons. Firstly, as new reactor designs emerge, there are new challenges emerging in numerical simulations of reactor systems such as long-lasting transients and multi-physics phenomena. These new requirements are beyond the range of applicability of the existing system analysis codes. Advanced modeling and numerical methods must be taken into consideration to improve the existing capabilities. Secondly, by developing next-generation reactor system analysis codes, the knowledge (know-how) in two-phase flow modeling and the highly complex constitutive models will be transferred to the young generation of nuclear engineers. And thirdly, all computer codes have limited shelf life. It becomes less and less cost-effective to maintain a legacy code, due to the fast change of computer hardware and software environment. There are several critical perspectives in terms of developing next-generation reactor system analysis codes: 1) The success of the next-generation codes must be built upon the success of the existing codes. The knowledge of the existing codes, not just simply the manuals and codes, but knowing why and how, must be transferred to the next-generation codes. The next-generation codes should encompass the capability of the existing codes. The shortcomings of existing codes should be identified, understood, and properly categorized, for example into model deficiencies or numerical method deficiencies. 2) State-of-the-art models and numerical methods must be considered to

  16. Perspectives on the development of next generation reactor systems safety analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, H., E-mail: Hongbin.Zhang@inl.gov [Idaho National Laboratory, Idaho Falls, ID (United States)

    2015-07-01

'Full text:' Existing reactor system analysis codes, such as RELAP5-3D and TRAC, have gained worldwide success in supporting reactor safety analyses, as well as design and licensing of new reactors. These codes are important assets to the nuclear engineering research community, as well as to the nuclear industry. However, most of these codes were originally developed during the 1970s, and it becomes necessary to develop next-generation reactor system analysis codes for several reasons. Firstly, as new reactor designs emerge, there are new challenges emerging in numerical simulations of reactor systems such as long-lasting transients and multi-physics phenomena. These new requirements are beyond the range of applicability of the existing system analysis codes. Advanced modeling and numerical methods must be taken into consideration to improve the existing capabilities. Secondly, by developing next-generation reactor system analysis codes, the knowledge (know-how) in two-phase flow modeling and the highly complex constitutive models will be transferred to the young generation of nuclear engineers. And thirdly, all computer codes have limited shelf life. It becomes less and less cost-effective to maintain a legacy code, due to the fast change of computer hardware and software environment. There are several critical perspectives in terms of developing next-generation reactor system analysis codes: 1) The success of the next-generation codes must be built upon the success of the existing codes. The knowledge of the existing codes, not just simply the manuals and codes, but knowing why and how, must be transferred to the next-generation codes. The next-generation codes should encompass the capability of the existing codes. The shortcomings of existing codes should be identified, understood, and properly categorized, for example into model deficiencies or numerical method deficiencies. 2) State-of-the-art models and numerical methods must be considered to

  17. pix2code: Generating Code from a Graphical User Interface Screenshot

    OpenAIRE

    Beltramelli, Tony

    2017-01-01

Transforming a graphical user interface screenshot created by a designer into computer code is a typical task conducted by a developer in order to build customized software, websites, and mobile applications. In this paper, we show that deep learning methods can be leveraged to train a model end-to-end to automatically generate code from a single input image with over 77% accuracy for three different platforms (i.e. iOS, Android and web-based technologies).

  18. Gene-Auto: Automatic Software Code Generation for Real-Time Embedded Systems

    Science.gov (United States)

    Rugina, A.-E.; Thomas, D.; Olive, X.; Veran, G.

    2008-08-01

This paper gives an overview of the Gene-Auto ITEA European project, which aims at building a qualified C code generator from mathematical models under Matlab-Simulink and Scilab-Scicos. The project is driven by major European industry partners, active in the real-time embedded systems domains. The Gene-Auto code generator will significantly improve the current development processes in such domains by shortening the time to market and by guaranteeing the quality of the generated code through the use of formal methods. The first version of the Gene-Auto code generator has already been released and has gone through a validation phase on real-life case studies defined by each project partner. The validation results are taken into account in the implementation of the second version of the code generator. The partners aim at introducing the Gene-Auto results into industrial development by 2010.

  19. MUXS: a code to generate multigroup cross sections for sputtering calculations

    International Nuclear Information System (INIS)

    Hoffman, T.J.; Robinson, M.T.; Dodds, H.L. Jr.

    1982-10-01

This report documents MUXS, a computer code to generate multigroup cross sections for charged particle transport problems. Cross sections generated by MUXS can be used in many multigroup transport codes, with minor modifications to these codes, to calculate sputtering yields, reflection coefficients, penetration distances, etc.

  20. (U) Ristra Next Generation Code Report

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Daniel, David John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-22

    LANL’s Weapons Physics management (ADX) and ASC program office have defined a strategy for exascale-class application codes that follows two supportive, and mutually risk-mitigating paths: evolution for established codes (with a strong pedigree within the user community) based upon existing programming paradigms (MPI+X); and Ristra (formerly known as NGC), a high-risk/high-reward push for a next-generation multi-physics, multi-scale simulation toolkit based on emerging advanced programming systems (with an initial focus on data-flow task-based models exemplified by Legion [5]). Development along these paths is supported by the ATDM, IC, and CSSE elements of the ASC program, with the resulting codes forming a common ecosystem, and with algorithm and code exchange between them anticipated. Furthermore, solution of some of the more challenging problems of the future will require a federation of codes working together, using established-pedigree codes in partnership with new capabilities as they come on line. The role of Ristra as the high-risk/high-reward path for LANL’s codes is fully consistent with its role in the Advanced Technology Development and Mitigation (ATDM) sub-program of ASC (see Appendix C), in particular its emphasis on evolving ASC capabilities through novel programming models and data management technologies.

  1. Grid code requirements for wind power generation

    International Nuclear Information System (INIS)

    Djagarov, N.; Filchev, S.; Grozdev, Z.; Bonev, M.

    2011-01-01

In this paper, wind power production data for Europe and Bulgaria and the plans for their development up to 2030 are reviewed. The main characteristics of the wind generators used in Bulgaria are listed. A review of the grid codes in different European countries, which regulate the requirements for renewable sources, is made. European recommendations for the harmonization of requirements are analyzed. Suggestions for the Bulgarian grid code are made

  2. INGEN: a general-purpose mesh generator for finite element codes

    International Nuclear Information System (INIS)

    Cook, W.A.

    1979-05-01

INGEN is a general-purpose mesh generator for two- and three-dimensional finite element codes. The basic parts of the code are surface and three-dimensional region generators that use linear-blending interpolation formulas. These generators are based on an i, j, k index scheme that is used to number nodal points, construct elements, and develop displacement and traction boundary conditions. This code can generate truss elements (2 nodal points); plane stress, plane strain, and axisymmetric two-dimensional continuum elements (4 to 8 nodal points); plate elements (4 to 8 nodal points); and three-dimensional continuum elements (8 to 21 nodal points). The traction loads generated are consistent with the elements generated. The expansion-contraction option is of special interest. This option makes it possible to change an existing mesh such that some regions are refined and others are made coarser than the original mesh. 9 figures
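
    Linear-blending interpolation of the kind named above corresponds to transfinite (Coons) interpolation from boundary curves. The sketch below generates the interior nodes of a small 2-D structured mesh from its four boundary curves; it is an illustrative implementation, not the INGEN code.

      # Sketch of 2-D linear-blending (transfinite / Coons) interpolation.
      import numpy as np

      def coons_mesh(bottom, top, left, right):
          """Boundary curves are arrays of shape (n,2) or (m,2); corners must match."""
          n, m = len(bottom), len(left)
          mesh = np.zeros((m, n, 2))
          for j in range(m):
              v = j / (m - 1)
              for i in range(n):
                  u = i / (n - 1)
                  ruled = ((1 - v) * bottom[i] + v * top[i]
                           + (1 - u) * left[j] + u * right[j])
                  corners = ((1 - u) * (1 - v) * bottom[0] + u * (1 - v) * bottom[-1]
                             + (1 - u) * v * top[0] + u * v * top[-1])
                  mesh[j, i] = ruled - corners
          return mesh

      # Example: a unit square with a curved (sine-bump) top edge, 5x4 nodal points.
      u = np.linspace(0, 1, 5)
      v = np.linspace(0, 1, 4)
      bottom = np.stack([u, np.zeros_like(u)], axis=1)
      top = np.stack([u, 1.0 + 0.1 * np.sin(np.pi * u)], axis=1)
      left = np.stack([np.zeros_like(v), v], axis=1)
      right = np.stack([np.ones_like(v), v], axis=1)
      print(coons_mesh(bottom, top, left, right)[2, 2])   # an interior nodal point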

  3. Generation of Efficient High-Level Hardware Code from Dataflow Programs

    OpenAIRE

    Siret , Nicolas; Wipliez , Matthieu; Nezan , Jean François; Palumbo , Francesca

    2012-01-01

High-level synthesis (HLS) aims at reducing the time-to-market by providing an automated design process that interprets and compiles high-level abstraction programs into hardware. However, HLS tools still face limitations regarding the performance of the generated code, due to the difficulties of compiling input imperative languages into efficient hardware code. Moreover, the hardware code generated by HLS tools is usually target-dependent and at a low level of abstraction (i.e. gate-level...

  4. gCSP occam Code Generation for RMoX

    NARCIS (Netherlands)

    Groothuis, M.A.; Liet, Geert K.; Broenink, Johannes F.; Roebbers, H.W.; Sunter, J.P.E.; Welch, P.H.; Wood, D.C.

    2005-01-01

    gCSP is a graphical tool for creating and editing CSP diagrams. gCSP is used in our labs to generate the embedded software framework for our control systems. As a further extension to our gCSP tool, an occam code generator has been constructed. Generating occam from CSP diagrams gives opportunities

  5. Improved diffusion coefficients generated from Monte Carlo codes

    International Nuclear Information System (INIS)

    Herman, B. R.; Forget, B.; Smith, K.; Aviles, B. N.

    2013-01-01

Monte Carlo codes are becoming more widely used for reactor analysis. Some of these applications involve the generation of diffusion theory parameters, including macroscopic cross sections and diffusion coefficients. Two approximations used to generate diffusion coefficients are assessed using the Monte Carlo code MC21. The first is the method of homogenization: whether to weight either fine-group transport cross sections or fine-group diffusion coefficients when collapsing to few-group diffusion coefficients. The second is a fundamental approximation made to the energy-dependent P1 equations to derive the energy-dependent diffusion equations. Standard Monte Carlo codes usually generate a flux-weighted transport cross section with no correction to the diffusion approximation. Results indicate that this causes noticeable tilting in reconstructed pin powers in simple test lattices, with an L2 norm error of 3.6%. This error is reduced significantly, to 0.27%, when weighting fine-group diffusion coefficients by the flux and applying a correction to the diffusion approximation. Noticeable tilting in reconstructed fluxes and pin powers was reduced when applying these corrections. (authors)
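
    The two homogenization choices contrasted above can be written out explicitly: (a) flux-weight the fine-group diffusion coefficients directly, or (b) flux-weight the transport cross section and then take D = 1/(3*Sigma_tr). The sketch below uses made-up fine-group data purely to show that the two collapses give different few-group values; it is not the MC21 procedure.

      # Sketch of the two collapse options compared in the abstract (made-up data).
      flux = [1.0, 0.6, 0.3]          # fine-group scalar fluxes in one coarse group
      sig_tr = [0.25, 0.35, 0.60]     # fine-group transport cross sections (1/cm)
      d_fine = [1.0 / (3.0 * s) for s in sig_tr]

      phi_sum = sum(flux)
      d_weighted = sum(p * d for p, d in zip(flux, d_fine)) / phi_sum          # option (a)
      sig_tr_weighted = sum(p * s for p, s in zip(flux, sig_tr)) / phi_sum
      d_from_sigtr = 1.0 / (3.0 * sig_tr_weighted)                             # option (b)
      print(round(d_weighted, 4), round(d_from_sigtr, 4))   # the two options differ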

  6. Forced circulation type steam generator simulation code: HT4

    International Nuclear Information System (INIS)

    Okamoto, Masaharu; Tadokoro, Yoshihiro

    1982-08-01

The purpose of this code is to obtain an understanding of the dynamic characteristics of the steam generator, which is a component of the High-Temperature Heat Transfer Components Test Unit. This unit is the fourth test section of the Helium Engineering Demonstration Loop (HENDEL). The features of this report are as follows: modeling of the steam generator, the basic relationships for the continuity equation, numerical analysis techniques for a non-linear simultaneous equation system, and computer graphics output techniques. The forced-circulation steam generator with straight tubes and horizontal cut baffles treated in this code was designed in the overall system design of the VHTRex. The code is for use with JAERI's digital computer FACOM M200. About 1.5 sec of CPU time is required for each time-step iteration, so about 40 sec of CPU time is required for a standard problem. (author)

  7. Summary: after LS1

    International Nuclear Information System (INIS)

    Pojer, M.; Schmidt, R.

    2012-01-01

After LS1 the energy will be about 6.5 TeV. The physics potential of the LHC is determined by the integrated luminosity useful for the experiments and not only by the peak luminosity. The integrated luminosity is determined by the peak luminosity, the luminosity decay and the efficiency of operation (availability). In this session two of these parameters are addressed: the peak luminosity and the availability. In this paper the peak luminosity is discussed through the performance potential of the injectors and the global performance reach of the LHC after LS1. LHC availability is addressed through the reliability of the magnet powering and of the beam systems and the occurrence of quenches

  8. Analysis of visual coding variables on CRT generated displays

    International Nuclear Information System (INIS)

    Blackman, H.S.; Gilmore, W.E.

    1985-01-01

    Cathode ray tube generated safety parameter display systems in a nuclear power plant control room situation have been found to be improved in effectiveness when color coding is employed. Research has indicated strong support for graphic coding techniques particularly in redundant coding schemes. In addition, findings on pictographs, as applied in coding schemes, indicate the need for careful application and for further research in the development of a standardized set of symbols

  9. Code Generation with Templates

    CERN Document Server

    Arnoldus, Jeroen; Serebrenik, A

    2012-01-01

Templates are used to generate all kinds of text, including computer code. Over the last decade, the use of templates has gained a lot of popularity due to the increase of dynamic web applications. Templates are a tool for programmers, and implementations of template engines are most often based on practical experience rather than on a theoretical background. This book reveals the mathematical background of templates and shows interesting findings for improving the practical use of templates. First, a framework to determine the necessary computational power for the template metalanguage is presented

  10. Automatic ID heat load generation in ANSYS code

    International Nuclear Information System (INIS)

    Wang, Zhibi.

    1992-01-01

Detailed power density profiles are critical in the execution of a thermal analysis using a finite element (FE) code such as ANSYS. Unfortunately, as yet there is no easy way to directly input the precise power profiles into ANSYS. A straightforward way to do this is to hand-calculate the power of each node or element and then type the data into the code. Every time a change is made to the FE model, the data must be recalculated and reentered. One way to solve this problem is to generate a set of discrete data, using another code such as PHOTON2, and curve-fit the data. Using curve-fitted formulae has several disadvantages. It is time consuming because of the need to run a second code for generation of the data, curve-fitting, and doing the data check, etc. Additionally, because there is no generality for different beamlines or different parameters, the above work must be repeated for each case. And, errors in the power profiles due to curve-fitting result in errors in the analysis. To solve the problem once and for all, and with the capability to apply to any insertion device (ID), a program for the ID power profile was written in ANSYS Parametric Design Language (APDL). This program is implemented as an ANSYS command with input parameters of peak magnetic field, deflection parameter, length of ID, and distance from the source. Once the command is issued, all the heat load will be automatically generated by the code

  11. Code generation of RHIC accelerator device objects

    International Nuclear Information System (INIS)

    Olsen, R.H.; Hoff, L.; Clifford, T.

    1995-01-01

    A RHIC Accelerator Device Object is an abstraction which provides a software view of a collection of collider control points known as parameters. A grammar has been defined which allows these parameters, along with code describing methods for acquiring and modifying them, to be specified efficiently in compact definition files. These definition files are processed to produce C++ source code. This source code is compiled to produce an object file which can be loaded into a front end computer. Each loaded object serves as an Accelerator Device Object class definition. The collider will be controlled by applications which set and get the parameters in instances of these classes using a suite of interface routines. Significant features of the grammar are described with details about the generated C++ code

  12. Effects of surface roughness and vortex generators on the LS(1)-0417MOD airfoil

    Energy Technology Data Exchange (ETDEWEB)

    Reuss, R.L.; Hoffman, M.J.; Gregorek, G.M. [Ohio State Univ., Columbus, OH (United States)

    1995-12-01

    An 18-inch constant-chord model of the LS(l)-0417MOD airfoil section was tested under two dimensional steady state conditions ate University 7{times}10 Subsonic Wind Tunnel. The objective was to document section lift and moment characteristics model and air flow conditions. Surface pressure data was acquired at {minus}60{degrees} through + 230{degrees} geometric angles of attack, at a nominal 1 million Reynolds number. Cases with and without leading edge grit roughness were investigated. The leading edge mulated blade conditions in the field. Additionally, surface pressure data were acquired for Reynolds numbers of 1.5 and 2.0 million, with and without leading edge grit roughness; the angle of attack was limited to a {minus}20{degrees} to 40{degrees} range. In general, results showed lift curve slope sensitivities to Reynolds number and roughness. The maximum lift coefficient was reduced as much as 29% by leading edge roughness. Moment coefficient showed little sensitivity to roughness beyond 50{degrees} angle of attack, but the expected decambering effect of a thicker boundary layer with roughness did show at lower angles. Tests were also conducted with vortex generators located at the 30% chord location on the upper surface only, at 1 and 1.5 million Reynolds numbers, with and without leading edge grit roughness. In general, with leading edge grit roughness applied, the vortex generators restored 85 percent of the baseline level of maximum lift coefficient but with a more sudden stall break and at a higher angle of attack than the baseline.

  13. Influence of Terraced area DEM Resolution on RUSLE LS Factor

    Science.gov (United States)

    Zhang, Hongming; Baartman, Jantiene E. M.; Yang, Xiaomei; Gai, Lingtong; Geissen, Viollette

    2017-04-01

    Topography has a large impact on the erosion of soil by water. Slope steepness and slope length are combined (the LS factor) in the universal soil-loss equation (USLE) and its revised version (RUSLE) for predicting soil erosion. The LS factor is usually extracted from a digital elevation model (DEM). The grid size of the DEM will thus influence the LS factor and the subsequent calculation of soil loss. Terracing is considered as a support practice factor (P) in the USLE/RUSLE equations, which is multiplied with the other USLE/RUSLE factors. However, as terraces change the slope length and steepness, they also affect the LS factor. The effect of DEM grid size on the LS factor has not been investigated for a terraced area. We obtained a high-resolution DEM by unmanned aerial vehicle (UAV) photogrammetry, from which the slope steepness, slope length, and LS factor were extracted. The changes in these parameters at various DEM resolutions were then analysed. The DEM produced detailed LS-factor maps, particularly for low LS factors. High (small valleys, gullies, and terrace ridges) and low (flats and terrace fields) spatial frequencies were both sensitive to changes in resolution, so the areas of higher and lower slope steepness both decreased with increasing grid size. Average slope steepness decreased and average slope length increased with grid size. Slope length, however, had a larger effect than slope steepness on the LS factor as the grid size varied. The LS factor increased when the grid size increased from 0.5 to 30 m and increased significantly at grid sizes >5 m. The LS factor was increasingly overestimated as grid size decreased. The LS factor decreased from grid sizes of 30 to 100 m, because the details of the terraced terrain were gradually lost, but the factor was still overestimated.
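
    For readers unfamiliar with the LS factor discussed above, the sketch below evaluates it for a single grid cell from slope steepness and slope length using one commonly cited RUSLE formulation (McCool-style slope factor with a variable length exponent). The study itself may use a different variant, and the extraction of slope length from a DEM via flow routing is not shown.

```python
import math

def rusle_ls(slope_deg: float, slope_length_m: float) -> float:
    """LS factor for one cell, using a commonly cited RUSLE formulation.

    S: slope-steepness factor with different fits below/above a 9% slope.
    L: (lambda / 22.13)^m with the variable exponent m = beta / (1 + beta).
    """
    theta = math.radians(slope_deg)
    s = math.sin(theta)
    S = 10.8 * s + 0.03 if math.tan(theta) < 0.09 else 16.8 * s - 0.50
    beta = (s / 0.0896) / (3.0 * s**0.8 + 0.56)
    m = beta / (1.0 + beta)
    L = (slope_length_m / 22.13) ** m
    return L * S

# Coarser grids tend to smooth slopes but lengthen flow paths, which is why
# the abstract reports slope length dominating the LS response to grid size.
for length in (5, 30, 100):
    print(length, round(rusle_ls(8.0, length), 3))
```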

  14. MEMOPS: data modelling and automatic code generation.

    Science.gov (United States)

    Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D

    2010-03-25

    In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources, and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability whereas science tends to constantly move, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces--APIs--currently in Python, C and Java) and data storage (in XML files or databases). For the individual project these libraries obviate the need for writing code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code will be presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology.
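
    The core idea of the record, deriving data-access code with built-in validity checking from an abstract model instead of hand-writing it, can be sketched in a few lines. The tiny model description and the generated classes below are hypothetical; Memops itself starts from UML and produces Python, C and Java APIs plus XML or database storage.

```python
# Minimal sketch of model-driven API generation (not the Memops framework):
# a tiny model description is turned into classes whose constructors validate
# types, so no per-project parsing/validation code has to be written by hand.
MODEL = {
    "Spectrum": {"name": str, "dimension_count": int},
    "Peak": {"position": float, "height": float},
}

def make_class(class_name, fields):
    def __init__(self, **kwargs):
        for field, ftype in fields.items():
            value = kwargs[field]
            if not isinstance(value, ftype):          # generated validity check
                raise TypeError(f"{class_name}.{field} must be {ftype.__name__}")
            setattr(self, field, value)
    return type(class_name, (object,), {"__init__": __init__, "FIELDS": fields})

api = {name: make_class(name, fields) for name, fields in MODEL.items()}

peak = api["Peak"](position=7.21, height=1.5e6)           # passes validation
try:
    api["Spectrum"](name="hsqc", dimension_count="two")   # fails validation
except TypeError as err:
    print(err)
```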

  15. LS1: exciting times ahead

    CERN Multimedia

    Caroline Duc

    2013-01-01

    As the first and last proton-lead run of 2013 draws to a close, the extensive upgrade and maintenance programme of the LHC's first long shutdown (LS1) is about to get under way.   The LHC has provided physicists with a huge quantity of data to analyse since the first physics run in 2009. Now it's time for the machine, along with CERN's other accelerators, to get a facelift. LS1 will start on 13 February 2013, but this doesn’t mean that life at the Laboratory will be any less rich and exciting. Although there will be no collisions for a period of almost two years, the whole CERN site will be a hive of activity, with large-scale work under way to modernise the infrastructure and prepare the LHC for operation at higher energy. "A whole series of renovation work will be carried out around the LHC during LS1,” explains Simon Baird, deputy head of the EN Department. "The key driver is of course the consolidation of the 10,170 high-curren...

  16. Texture side information generation for distributed coding of video-plus-depth

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Raket, Lars Lau; Zamarin, Marco

    2013-01-01

    We consider distributed video coding in a monoview video-plus-depth scenario, aiming at coding textures jointly with their corresponding depth stream. Distributed Video Coding (DVC) is a video coding paradigm in which the complexity is shifted from the encoder to the decoder. The Side Information...... components) is strongly correlated, so the additional depth information may be used to generate more accurate SI for the texture stream, increasing the efficiency of the system. In this paper we propose various methods for accurate texture SI generation, comparing them with other state-of-the-art solutions...

  17. Wavelet-Coded OFDM for Next Generation Mobile Communications

    DEFF Research Database (Denmark)

    Cavalcante, Lucas Costa Pereira; Vegas Olmos, Juan José; Tafur Monroy, Idelfonso

    2016-01-01

    In this work, we evaluate the performance of Wavelet-Coding into offering robustness for OFDM signals against the combined effects of varying fading and noise bursts. Wavelet-Code enables high diversity gains with a low complex receiver, and, most notably, without compromising the system’s spectr......-wave frequencies in future generation mobile communication due to its robustness against multipath fading....

  18. C code generation applied to nonlinear model predictive control for an artificial pancreas

    DEFF Research Database (Denmark)

    Boiroux, Dimitri; Jørgensen, John Bagterp

    2017-01-01

    This paper presents a method to generate C code from MATLAB code applied to a nonlinear model predictive control (NMPC) algorithm. The C code generation uses the MATLAB Coder Toolbox. It can drastically reduce the time required for development compared to a manual porting of code from MATLAB to C...

  19. Novel power saving architecture for FBG based OCDMA code generation

    Science.gov (United States)

    Osadola, Tolulope B.; Idris, Siti K.; Glesk, Ivan

    2013-10-01

    A novel architecture for generating incoherent, 2-dimensional wavelength hopping-time spreading optical CDMA codes is presented. The architecture is designed to facilitate the reuse of optical source signal that is unused after an OCDMA code has been generated using fiber Bragg grating based encoders. Effective utilization of available optical power is therefore achieved by cascading several OCDMA encoders thereby enabling 3dB savings in optical power.

  20. A UML profile for code generation of component based distributed systems

    International Nuclear Information System (INIS)

    Chiozzi, G.; Karban, R.; Andolfato, L.; Tejeda, A.

    2012-01-01

    A consistent and unambiguous implementation of code generation (model to text transformation) from UML must rely on a well-defined UML (Unified Modelling Language) profile, customizing UML for a particular application domain. Such a profile must have a solid foundation in a formally correct ontology, formalizing the concepts and their relations in the specific domain, in order to avoid a maze or set of wildly created stereotypes. The paper describes a generic profile for the code generation of component based distributed systems for control applications, the process to distill the ontology and define the profile, and the strategy followed to implement the code generator. The main steps, which take place iteratively, include: defining the terms and relations with an ontology, mapping the ontology to the appropriate UML meta-classes, testing the profile by creating modelling examples, and generating the code. This has allowed us to work on the modelling of the E-ELT (European Extremely Large Telescope) control system and instrumentation without knowing what infrastructure will finally be used

  1. Automatic Structure-Based Code Generation from Coloured Petri Nets

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Westergaard, Michael

    2010-01-01

    Automatic code generation based on Coloured Petri Net (CPN) models is challenging because CPNs allow for the construction of abstract models that intermix control flow and data processing, making translation into conventional programming constructs difficult. We introduce Process-Partitioned CPNs...... (PP-CPNs) which is a subclass of CPNs equipped with an explicit separation of process control flow, message passing, and access to shared and local data. We show how PP-CPNs caters for a four phase structure-based automatic code generation process directed by the control flow of processes....... The viability of our approach is demonstrated by applying it to automatically generate an Erlang implementation of the Dynamic MANET On-demand (DYMO) routing protocol specified by the Internet Engineering Task Force (IETF)....

  2. A mid-term report for LS1

    CERN Multimedia

    2014-01-01

    As the LHC’s first long shutdown, LS1, enters its second calendar year, it’s a good time for a mid-term report on how things are progressing.    Towards the end of last year, I had the pleasure to go down to the LHC tunnel to witness the closure of the first of the machine’s sectors to be completed. As I write, three sectors are now closed up, with a fourth not far behind. These are important milestones, and you can follow progress in detail in the regular LS1 reports in the Bulletin. They show that we’re on schedule for physics to resume in about a year from now, but more than that, they are an important reminder of the LS1 motto: safety, quality, schedule. It is fantastic news that we are on schedule, and testimony to the rigour that went into the detailed and complex planning of all the work that had to be undertaken in LS1. But more important than the schedule is the fact that we’ve carried out the work safely and that the qualit...

  3. Architectural and Algorithmic Requirements for a Next-Generation System Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    V.A. Mousseau

    2010-05-01

    This document presents high-level architectural and system requirements for a next-generation system analysis code (NGSAC) to support reactor safety decision-making by plant operators and others, especially in the context of light water reactor plant life extension. The capabilities of NGSAC will be different from those of current-generation codes, not only because computers have evolved significantly in the generations since the current paradigm was first implemented, but because the decision-making processes that need the support of next-generation codes are very different from the decision-making processes that drove the licensing and design of the current fleet of commercial nuclear power reactors. The implications of these newer decision-making processes for NGSAC requirements are discussed, and resulting top-level goals for the NGSAC are formulated. From these goals, the general architectural and system requirements for the NGSAC are derived.

  4. Data on genome analysis of Bacillus velezensis LS69.

    Science.gov (United States)

    Liu, Guoqiang; Kong, Yingying; Fan, Yajing; Geng, Ce; Peng, Donghai; Sun, Ming

    2017-08-01

    The data presented in this article are related to the published article entitled "Whole-genome sequencing of Bacillus velezensis LS69, a strain with a broad inhibitory spectrum against pathogenic bacteria" (Liu et al., 2017) [1]. Genome analysis revealed that B. velezensis LS69 has good potential for biocontrol and plant growth promotion. This article provides an extended analysis of the genetic islands, core genes and amylolysin loci of B. velezensis LS69.

  5. Data on genome analysis of Bacillus velezensis LS69

    OpenAIRE

    Liu, Guoqiang; Kong, Yingying; Fan, Yajing; Geng, Ce; Peng, Donghai; Sun, Ming

    2017-01-01

    The data presented in this article are related to the published article entitled “Whole-genome sequencing of Bacillus velezensis LS69, a strain with a broad inhibitory spectrum against pathogenic bacteria” (Liu et al., 2017) [1]. Genome analysis revealed that B. velezensis LS69 has good potential for biocontrol and plant growth promotion. This article provides an extended analysis of the genetic islands, core genes and amylolysin loci of B. velezensis LS69.

  6. Modeling Guidelines for Code Generation in the Railway Signaling Context

    Science.gov (United States)

    Ferrari, Alessio; Bacherini, Stefano; Fantechi, Alessandro; Zingoni, Niccolo

    2009-01-01

    Modeling guidelines constitute one of the fundamental cornerstones for Model Based Development. Their relevance is essential when dealing with code generation in the safety-critical domain. This article presents the experience of a railway signaling systems manufacturer on this issue. Introduction of Model-Based Development (MBD) and code generation in the industrial safety-critical sector created a crucial paradigm shift in the development process of dependable systems. While traditional software development focuses on the code, with MBD practices the focus shifts to model abstractions. The change has fundamental implications for safety-critical systems, which still need to guarantee a high degree of confidence also at code level. Usage of the Simulink/Stateflow platform for modeling, which is a de facto standard in control software development, does not ensure by itself production of high-quality dependable code. This issue has been addressed by companies through the definition of modeling rules imposing restrictions on the usage of design tool components, in order to enable production of qualified code. The MAAB Control Algorithm Modeling Guidelines (MathWorks Automotive Advisory Board) [3] is a well-established set of publicly available rules for modeling with Simulink/Stateflow. This set of recommendations has been developed by a group of OEMs and suppliers of the automotive sector with the objective of enforcing and easing the usage of the MathWorks tools within the automotive industry. The guidelines were published in 2001 and afterwards revisited in 2007 in order to integrate some additional rules developed by the Japanese division of MAAB [5]. The scope of the current edition of the guidelines ranges from model maintainability and readability to code generation issues. The rules are conceived as a reference baseline and therefore they need to be tailored to comply with the characteristics of each industrial context. Customization of these

  7. AMZ, multigroup constant library for EXPANDA code, generated by NJOY code from ENDF/B-IV

    International Nuclear Information System (INIS)

    Chalhoub, E.S.; Moraes, Marisa de

    1985-01-01

    A library of multigroup constants with 70 energy groups and 37 isotopes for fast reactor calculations is described. The cross sections, scattering matrices and self-shielding factors were generated by the NJOY code and the RGENDF interface program, from ENDF/B-IV evaluated data. The library is edited in a format suitable for use by the EXPANDA code. (M.C.K.) [pt

  8. Data on genome analysis of Bacillus velezensis LS69

    Directory of Open Access Journals (Sweden)

    Guoqiang Liu

    2017-08-01

    Full Text Available The data presented in this article are related to the published article entitled “Whole-genome sequencing of Bacillus velezensis LS69, a strain with a broad inhibitory spectrum against pathogenic bacteria” (Liu et al., 2017) [1]. Genome analysis revealed that B. velezensis LS69 has good potential for biocontrol and plant growth promotion. This article provides an extended analysis of the genetic islands, core genes and amylolysin loci of B. velezensis LS69.

  9. Developments in the Generation and Interpretation of Wire Codes (invited paper)

    International Nuclear Information System (INIS)

    Ebi, K.L.

    1999-01-01

    Three new developments in the generation and interpretation of wire codes are discussed. First, a method was developed to computer generate wire codes using data gathered from a utility database of the local distribution system and from tax assessor records. This method was used to wire code more than 250,000 residences in the greater Denver metropolitan area. There was an approximate 75% agreement with field wire coding. Other research in Denver suggests that wire codes predict some characteristics of a residence and its neighbourhood, including age, assessed value, street layout and traffic density. A third new development is the case-specular method to study the association between wire codes and childhood cancers. Recent results from applying the method to the Savitz et al and London et al studies suggest that the associations between childhood cancer and VHCC residences were strongest for residences with a backyard rather than street service drop, and for VHCC residences with LCC speculars. (author)

  10. Verification of 3-D generation code package for neutronic calculations of WWERs

    International Nuclear Information System (INIS)

    Sidorenko, V.D.; Aleshin, S.S.; Bolobov, P.A.; Bolshagin, S.N.; Lazarenko, A.P.; Markov, A.V.; Morozov, V.V.; Syslov, A.A.; Tsvetkov, V.M.

    2000-01-01

    Materials on verification of the 3rd-generation code package for WWER neutronic calculations are presented. The package includes: - the spectral code TVS-M; - the 2-D fine mesh diffusion code PERMAK-A for 4- or 6-group calculation of WWER core burnup; - the 3-D coarse mesh diffusion code BIPR-7A for 2-group calculations of quasi-stationary WWER regimes. The materials include both TVS-M verification data and verification data on the PERMAK-A and BIPR-7A codes using constant libraries generated with TVS-M. All materials relate to fuel without Gd. TVS-M verification materials include results of comparison both with benchmark calculations obtained by other codes and with experiments carried out at the ZR-6 critical facility. PERMAK-A verification materials contain results of comparison with TVS-M calculations and with ZR-6 experiments. BIPR-7A materials include comparison with operational data for the Dukovany-2 and Loviisa-1 NPPs (WWER-440) and for Balakovo NPP Unit 4 (WWER-1000). The verification materials demonstrate rather good accuracy of the calculations obtained with the 3rd-generation code package. (Authors)

  11. An Infrastructure for UML-Based Code Generation Tools

    Science.gov (United States)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance in order to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach, which uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects, which have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  12. Strong normalization by type-directed partial evaluation and run-time code generation

    DEFF Research Database (Denmark)

    Balat, Vincent; Danvy, Olivier

    1998-01-01

    We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long βη-normal form. Caml uses a virtual machine and has the capability...... to load byte code at run time. Representing the long βη-normal forms as byte code gives us the ability to strongly normalize higher-order values (i.e., weak head normal forms in ML), to compile the resulting strong normal forms into byte code, and to load this byte code all in one go, at run time. We...... conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language....

  13. Strong Normalization by Type-Directed Partial Evaluation and Run-Time Code Generation

    DEFF Research Database (Denmark)

    Balat, Vincent; Danvy, Olivier

    1997-01-01

    We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long βη-normal form. Caml uses a virtual machine and has the capability...... to load byte code at run time. Representing the long βη-normal forms as byte code gives us the ability to strongly normalize higher-order values (i.e., weak head normal forms in ML), to compile the resulting strong normal forms into byte code, and to load this byte code all in one go, at run time. We...... conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language....

  14. A method for generating subgroup parameters from resonance tables and the SPART code

    International Nuclear Information System (INIS)

    Devan, K.; Mohanakrishnan, P.

    1995-01-01

    A method for generating subgroup or band parameters from resonance tables is described. A computer code, SPART, was written using this method. This code generates the subgroup parameters for any number of bands within the specified broad groups at different temperatures by reading the required input data from the binary cross section library in the Cadarache format. The results obtained with the SPART code for two bands were compared with those obtained from the GROUPIE code, and good agreement was found. Results of the generation of subgroup parameters in four bands for a sample case of 239 Pu from the resonance tables of the Cadarache Ver.2 library are also presented. 6 refs, 2 tabs

  15. LS1 Report: the clouds are lifting

    CERN Multimedia

    Anaïs Schaeffer

    2014-01-01

    To combat the problem of electron clouds, which perturb the environment of the particle beams in our accelerators, the Vacuum team have turned to amorphous carbon. This material is being applied to the interior of 16 magnets in the SPS during LS1 and will help prevent the formation of the secondary particles which are responsible for these clouds.   This photo shows the familiar coils of an SPS dipole magnet in brown. The vacuum chamber is the metallic rectangular part in the centre. The small wheeled device you can see in the vacuum chamber carries the hollow cathodes along the length of the chamber. When a particle beam circulates at high energy in a vacuum chamber, it unavoidably generates secondary particles. These include electrons produced by the ionisation of residual molecules in the vacuum or indirectly generated by synchrotron radiation. When these electrons hit the surface of the vacuum chamber, they produce other electrons which, through an avalanche-like process, re...

  16. Generating Importance Map for Geometry Splitting using Discrete Ordinates Code in Deep Shielding Problem

    International Nuclear Information System (INIS)

    Kim, Jong Woon; Lee, Young Ouk

    2016-01-01

    When we use the MCNP code for a deep shielding problem, we prefer to use variance reduction techniques such as geometry splitting, weight windows, or source biasing to keep the relative error within a reliable confidence interval. To generate an importance map for geometry splitting in an MCNP calculation, we should know the number of tracks entering each cell and the previous importance of each cell, since a new importance is calculated based on this information. In a deep shielding problem where zero tracks enter a cell, we cannot generate a new importance map. In this case, a discrete ordinates code can provide the information needed to generate the importance map easily. In this paper, we use the AETIUS code as the discrete ordinates code. The importance map for MCNP is generated based on the zone-averaged flux of the AETIUS calculation. The discretization of space, angle, and energy is not necessary for the MCNP calculation. This is the big merit of the MCNP code compared to a deterministic code. However, a deterministic code (i.e., AETIUS) can provide a rough estimate of the flux throughout a problem relatively quickly. This can help MCNP by providing variance reduction parameters. Recently, the ADVANTG code was released. This is an automated tool for generating variance reduction parameters for fixed-source continuous-energy Monte Carlo simulations with MCNP5 v1.60
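
    As a purely illustrative recipe (not the AETIUS/ADVANTG algorithms), the sketch below turns zone-averaged fluxes from a deterministic solution into splitting importances, using the common heuristic that importance grows roughly as the inverse of the expected flux, normalized to the source zone, with successive jumps capped to modest ratios.

```python
# Hedged sketch: derive geometry-splitting importances from zone-averaged
# fluxes produced by a deterministic code. Heuristic only.
def importances_from_flux(zone_flux, source_zone, cap_ratio=4.0):
    """zone_flux: {zone_id: average flux}; returns {zone_id: importance}.

    cap_ratio limits the jump between successive zones, mimicking the common
    advice that splitting ratios between adjacent cells stay modest (~2-4).
    """
    ref = zone_flux[source_zone]
    raw = {z: ref / phi for z, phi in zone_flux.items()}
    capped, prev = {}, None
    # Walk zones from highest to lowest flux and cap successive jumps.
    for z in sorted(raw, key=raw.get):
        capped[z] = raw[z] if prev is None else min(raw[z], capped[prev] * cap_ratio)
        prev = z
    return capped

# Five shield zones with flux dropping by decades away from the source.
flux = {"src": 1.0, "z1": 1e-1, "z2": 1e-2, "z3": 1e-4, "z4": 1e-7}
print(importances_from_flux(flux, "src"))
```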

  17. Safety, Quality, Schedule: the motto of LS1

    CERN Multimedia

    2013-01-01

    The LHC’s first long shutdown, LS1, is a marathon that began on 16 February and will take us through to the beginning of 2015. Just as Olympic marathon runners have a motto, Citius, Altius, Fortius, so the athletes of LS1 work to the mantra of Safety, Quality, Schedule. Four months into LS1, they have settled into their rhythm, and things are going to plan.   The first task of LS1 was to bring the LHC up to room temperature - this was achieved in just 10 weeks. In parallel, preliminary tests for electrical quality assurance and leaks revealed essentially the level of wear and tear we’d expect after three years of running. One slightly anxious moment came when we looked at the RF fingers – the devices that ensure electrical contact in the beam pipes as they pass from one magnet to the next. Those of you with long memories will recall that before start-up, some of these got damaged at warm-up. The good news today is that with all eight sectors test...

  18. The LS-STAG immersed boundary/cut-cell method for non-Newtonian flows in 3D extruded geometries

    Science.gov (United States)

    Nikfarjam, F.; Cheny, Y.; Botella, O.

    2018-05-01

    The LS-STAG method is an immersed boundary/cut-cell method for viscous incompressible flows based on the staggered MAC arrangement for Cartesian grids, in which the irregular boundary is sharply represented by its level-set function, resulting in a significant gain in computer resources (wall time, memory usage) compared to commercial body-fitted CFD codes. The 2D version of the LS-STAG method is now well established (Cheny and Botella, 2010), and this paper presents its extension to 3D geometries with translational symmetry in the z direction (hereinafter called 3D extruded configurations). This intermediate step towards the fully 3D implementation can be applied to a wide variety of canonical flows and will be regarded as the keystone for the full 3D solver, since both discretization and implementation issues on distributed memory machines are tackled at this stage of development. The LS-STAG method is then applied to various Newtonian and non-Newtonian flows in 3D extruded geometries (axisymmetric pipe, circular cylinder, duct with an abrupt expansion) for which benchmark results and experimental data are available. The purposes of these investigations are (a) to investigate the formal order of accuracy of the LS-STAG method, (b) to assess the versatility of the method for flow applications at various regimes (Newtonian and shear-thinning fluids, steady and unsteady laminar to turbulent flows), and (c) to compare its performance with well-established numerical methods (body-fitted and immersed boundary methods).

  19. Steam generator and circulator model for the HELAP code

    International Nuclear Information System (INIS)

    Ludewig, H.

    1975-07-01

    An outline is presented of the work carried out in the 1974 fiscal year on the GCFBR safety research project, consisting of the development of improved steam generator and circulator (steam turbine driven helium compressor) models which will eventually be inserted in the HELAP (1) code. Furthermore, a code was developed which will be used to generate steady state input for the primary and secondary sides of the steam generator. The following conclusions and suggestions for further work are made: (1) the steam generator and circulator models are consistent with the volume and junction layout used in HELAP; (2) with minor changes these models, when incorporated in HELAP, could be used to simulate a direct cycle plant; (3) an explicit control valve model is still to be developed and would be very desirable to control the flow to the turbine during a transient (initially this flow will be controlled by using the existing check valve model); (4) the friction factor in the laminar flow region is computed inaccurately, which might cause significant errors in loss-of-flow accidents; and (5) it is felt that HELAP will still use a large amount of computer time and will thus be limited to design basis accidents without scram or loss-of-flow transients with and without scram. Finally, it may also be used as a test bed for the development of prototype component models which would be incorporated in a more sophisticated system code developed specifically for GCFBRs

  20. Interpretation and code generation based on intermediate languages

    DEFF Research Database (Denmark)

    Kornerup, Peter; Kristensen, Bent Bruun; Madsen, Ole Lehrmann

    1980-01-01

    The possibility of supporting high level languages through intermediate languages to be used for direct interpretation and as intermediate forms in compilers is investigated. An accomplished project in the construction of an interpreter and a code generator using one common intermediate form...

  1. LS1: electrical engineering upgrades and consolidation

    International Nuclear Information System (INIS)

    Duval, F.

    2012-01-01

    Three different types of activities are planned by the Engineering Department Electrical Engineering (EN-EL) Group for the first long shutdown (LS1). First, the consolidation of EN-EL's ageing infrastructure elements, which is part of a 15-year programme aiming at increasing the reliability and availability of the power distribution network. Secondly, the maintenance of the accelerator infrastructure. In addition to the usual periodic operations and those delayed until LS1, the group will address more demanding activities like the replacement campaigns for irradiated cables and non-radiation-resistant fibres, as well as the removal of unused cables in particularly overcrowded areas. Thirdly, a vast amount of user copper and optical fibre cabling requests: EN-EL estimates that only 50% of LS1 requests are currently known. The main activities will be EN-EL's contributions to the R2E project, the BE-BI upgrade projects, and the RF upgrade project in SPS BA3

  2. Demonstration of Automatically-Generated Adjoint Code for Use in Aerodynamic Shape Optimization

    Science.gov (United States)

    Green, Lawrence; Carle, Alan; Fagan, Mike

    1999-01-01

    Gradient-based optimization requires accurate derivatives of the objective function and constraints. These gradients may have previously been obtained by manual differentiation of analysis codes, symbolic manipulators, finite-difference approximations, or existing automatic differentiation (AD) tools such as ADIFOR (Automatic Differentiation in FORTRAN). Each of these methods has certain deficiencies, particularly when applied to complex, coupled analyses with many design variables. Recently, a new AD tool called ADJIFOR (Automatic Adjoint Generation in FORTRAN), based upon ADIFOR, was developed and demonstrated. Whereas ADIFOR implements forward-mode (direct) differentiation throughout an analysis program to obtain exact derivatives via the chain rule of calculus, ADJIFOR implements the reverse-mode counterpart of the chain rule to obtain exact adjoint form derivatives from FORTRAN code. Automatically-generated adjoint versions of the widely-used CFL3D computational fluid dynamics (CFD) code and an algebraic wing grid generation code were obtained with just a few hours processing time using the ADJIFOR tool. The codes were verified for accuracy and were shown to compute the exact gradient of the wing lift-to-drag ratio, with respect to any number of shape parameters, in about the time required for 7 to 20 function evaluations. The codes have now been executed on various computers with typical memory and disk space for problems with up to 129 x 65 x 33 grid points, and for hundreds to thousands of independent variables. These adjoint codes are now used in a gradient-based aerodynamic shape optimization problem for a swept, tapered wing. For each design iteration, the optimization package constructs an approximate, linear optimization problem, based upon the current objective function, constraints, and gradient values. The optimizer subroutines are called within a design loop employing the approximate linear problem until an optimum shape is found, the design loop
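
    The forward-mode/reverse-mode distinction that separates ADIFOR from ADJIFOR can be seen on a tiny function. The sketch below is generic automatic differentiation in Python, not the FORTRAN source transformation those tools perform: forward mode propagates one directional derivative per pass, while reverse mode recovers the whole gradient in a single backward sweep, which is why adjoints pay off with hundreds of design variables.

```python
# Forward mode: dual numbers carry (value, derivative) through the computation.
class Dual:
    def __init__(self, v, d=0.0): self.v, self.d = v, d
    def __mul__(self, o): return Dual(self.v * o.v, self.v * o.d + self.d * o.v)
    def __add__(self, o): return Dual(self.v + o.v, self.d + o.d)

def f(x, y):                                # f(x, y) = x*y + x*x
    return x * y + x * x

x, y = Dual(3.0, 1.0), Dual(2.0, 0.0)       # seed the d/dx direction
print("forward df/dx =", f(x, y).d)         # expect y + 2x = 8

# Reverse mode: evaluate once, then sweep backwards through the recorded
# operations to get all partial derivatives (the "adjoints" of the inputs).
def f_reverse(xv, yv):
    a = xv * yv                             # intermediate values
    b = xv * xv
    z = a + b
    a_bar = b_bar = 1.0                     # dz/da, dz/db
    x_bar = a_bar * yv + b_bar * 2.0 * xv   # chain rule, reversed
    y_bar = a_bar * xv
    return z, {"x": x_bar, "y": y_bar}

print("reverse gradient =", f_reverse(3.0, 2.0)[1])   # {'x': 8.0, 'y': 3.0}
```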

  3. Scalable-to-lossless transform domain distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Ukhanova, Ann; Veselov, Anton

    2010-01-01

    Distributed video coding (DVC) is a novel approach providing new features such as low complexity encoding by mainly exploiting the source statistics at the decoder based on the availability of decoder side information. In this paper, scalable-to-lossless DVC is presented based on extending a lossy Tran...... codec provides frame by frame encoding. Comparing the lossless coding efficiency, the proposed scalable-to-lossless TDWZ video codec can save up to 5%-13% of bits compared to JPEG-LS and H.264 Intra frame lossless coding and do so as a scalable-to-lossless coding....

  4. ADGEN: An automated adjoint code generator for large-scale sensitivity analysis

    International Nuclear Information System (INIS)

    Pin, F.G.; Oblow, E.M.; Horwedel, J.E.; Lucius, J.L.

    1987-01-01

    This paper describes a new automated system, named ADGEN, which makes use of the strengths of computer calculus to automate the costly and time-consuming calculation of derivatives in FORTRAN computer codes, and automatically generate adjoint solutions of computer codes

  5. BBU code development for high-power microwave generators

    International Nuclear Information System (INIS)

    Houck, T.L.; Westenskow, G.A.; Yu, S.S.

    1992-01-01

    We are developing a two-dimensional, time-dependent computer code for the simulation of transverse instabilities in support of relativistic klystron-two beam accelerator research at LLNL. The code addresses transient effects as well as both cumulative and regenerative beam breakup modes. Although designed specifically for the transport of high current (kA) beams through traveling-wave structures, it is applicable to devices consisting of multiple combinations of standing-wave, traveling-wave, and induction accelerator structures. In this paper we compare code simulations to analytical solutions for the case where there is no rf coupling between cavities, to theoretical scaling parameters for coupled cavity structures, and to experimental data involving beam breakup in the two traveling-wave output structure of our microwave generator. (Author) 4 figs., tab., 5 refs

  6. Towards Qualifiable Code Generation from a Clocked Synchronous Subset of Modelica

    Directory of Open Access Journals (Sweden)

    Bernhard Thiele

    2015-01-01

    Full Text Available So far no qualifiable automatic code generators (ACGs) are available for Modelica. Hence, digital control applications can be modeled and simulated in Modelica, but require tedious additional efforts (e.g., manual reprogramming) to produce qualifiable target system production code. In order to more fully leverage the potential of a model-based development (MBD) process in Modelica, a qualifiable automatic code generator is needed. Typical Modelica code generation is a fairly complex process, which imposes a huge development burden on any effort at tool qualification. This work aims at mapping a Modelica subset for digital control function development to a well-understood synchronous data-flow kernel language. This kernel language allows us to resort to established compilation techniques for data-flow languages, which are understood well enough to be accepted by certification authorities. The mapping is established by providing a translational semantics from the Modelica subset to the synchronous data-flow kernel language. However, this translation turned out to be more intricate than initially expected and has given rise to several interesting issues that require suitable design decisions regarding the mapping and the language subset.

  7. LS1 general planning and strategy for the LHC, LHC injectors

    International Nuclear Information System (INIS)

    Foraz, K.

    2012-01-01

    The goal of Long Shutdown 1 (LS1) is to perform the full maintenance of equipment and the necessary consolidation and upgrade activities in order to ensure reliable LHC operation at nominal performance from mid-2014. LS1 is scheduled to last 20 months. LS1 not only concerns the LHC but also its injectors. To ensure resources will be available an analysis is in progress to detect conflict/overload and decide what is compulsory, what we can afford, and what can be postponed until LS2. The strategy, time key drivers, constraints, and draft schedule are presented here. (author)

  8. Generation of cross-sections and reference solutions using the code Serpent

    International Nuclear Information System (INIS)

    Gomez T, A. M.; Delfin L, A.; Del Valle G, E.

    2012-10-01

    Serpent is a code that solves the neutron transport equation using the Monte Carlo method. Besides generating steady-state reference solutions for complex geometry problems, it has been specially designed for cell physics applications, which include the generation of homogenized cross-sections for several energy groups. In this work, a calculation methodology is described that uses the code Serpent to generate the cross-sections necessary to carry out calculations with the code TNXY, developed in 1993 in the Nuclear Engineering Department of the Instituto Politecnico Nacional (Mexico), by means of an interface programmed in Octave. The computer program TNXY solves the neutron transport equations for several energy groups in steady state and X-Y geometry using the discrete ordinates (SN) method. To verify and validate the methodology, the results of TNXY were compared with those calculated by Serpent, giving differences smaller than 0.55% in the value of the multiplication factor. (Author)

  9. On the use of SERPENT Monte Carlo code to generate few group diffusion constants

    Energy Technology Data Exchange (ETDEWEB)

    Piovezan, Pamela, E-mail: pamela.piovezan@ctmsp.mar.mil.b [Centro Tecnologico da Marinha em Sao Paulo (CTMSP), Sao Paulo, SP (Brazil); Carluccio, Thiago; Domingos, Douglas Borges; Rossi, Pedro Russo; Mura, Luiz Felipe, E-mail: fermium@cietec.org.b, E-mail: thiagoc@ipen.b [Fermium Tecnologia Nuclear, Sao Paulo, SP (Brazil); Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The accuracy of diffusion reactor codes strongly depends on the quality of the group constants processing. For many years, the generation of such constants was based on 1-D infinite-cell transport calculations. Some developments using collision probability or the method of characteristics nowadays allow 2-D assembly group constant calculations. However, these 1-D and 2-D codes show some limitations, for example, for complex geometries and in the neighborhood of heavy absorbers. On the other hand, since Monte Carlo (MC) codes provide accurate neutron flux distributions, the possibility of using these solutions to provide group constants to full-core reactor diffusion simulators has been recently investigated, especially for cases in which the geometry and reactor types are beyond the capability of the conventional deterministic lattice codes. The two greatest difficulties in the use of MC codes for group constant generation are the computational costs and the methodological incompatibility between analog MC particle transport simulation and deterministic transport methods based on several approximations. The SERPENT code is a 3-D continuous energy MC transport code with built-in burnup capability that was specially optimized to generate these group constants. In this work, we present the preliminary results of using the SERPENT MC code to generate 3-D two-group diffusion constants for a PWR-like assembly. These constants were used in the CITATION diffusion code to investigate the effects of the MC group constants determination on the neutron multiplication factor diffusion estimate. (author)

  10. Generating Code with Polymorphic let: A Ballad of Value Restriction, Copying and Sharing

    Directory of Open Access Journals (Sweden)

    Oleg Kiselyov

    2017-02-01

    Full Text Available Getting polymorphism and effects such as mutation to live together in the same language is a tale worth telling, under the recurring refrain of copying vs. sharing. We add new stanzas to the tale, about the ordeal to generate code with polymorphism and effects, and be sure it type-checks. Generating well-typed-by-construction polymorphic let-expressions is impossible in the Hindley-Milner type system: even the author believed that. The polymorphic-let generator turns out to exist. We present its derivation and the application for the lightweight implementation of quotation via a novel and unexpectedly simple source-to-source transformation to code-generating combinators. However, generating let-expressions with polymorphic functions demands more than even the relaxed value restriction can deliver. We need a new deal for let-polymorphism in ML. We conjecture the weaker restriction and implement it in a practically-useful code-generation library. Its formal justification is formulated as the research program.

  11. Study on random number generator in Monte Carlo code

    International Nuclear Information System (INIS)

    Oya, Kentaro; Kitada, Takanori; Tanaka, Shinichi

    2011-01-01

    The Monte Carlo code uses a sequence of pseudo-random numbers produced by a random number generator (RNG) to simulate particle histories. A pseudo-random number sequence has a period that depends on its generation method, and the period should be long enough that it is not exhausted during one Monte Carlo calculation, to ensure correctness, especially of the standard deviation of the results. The linear congruential generator (LCG) is widely used as a Monte Carlo RNG, but the period of an LCG is not very long considering the increasing number of simulation histories in a Monte Carlo calculation that follows from the remarkable enhancement of computer performance. Recently, many kinds of RNG have been developed, and some of their features are better than those of the LCG. In this study, we investigate the appropriate RNG for a Monte Carlo code as an alternative to the LCG, especially for the case of enormous numbers of histories. It is found that xorshift has desirable features compared with the LCG: xorshift has a larger period, comparable speed in generating random numbers, better randomness, and good applicability to parallel calculation. (author)
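
    For reference, minimal Python versions of the two generator families compared in the record are sketched below: a 32-bit LCG and one common 64-bit xorshift variant. The multiplier, increment and shift constants are textbook choices and may differ from those used in any particular Monte Carlo code.

```python
# Minimal illustrations of the two RNG families discussed above.

def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m.
    The period is at most m, which a long history count can exhaust."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

def xorshift64(seed):
    """One common 64-bit xorshift variant (shifts 13, 7, 17); much longer
    period (2^64 - 1 for a nonzero seed) at comparable cost."""
    x = seed & 0xFFFFFFFFFFFFFFFF
    while True:
        x ^= (x << 13) & 0xFFFFFFFFFFFFFFFF
        x ^= x >> 7
        x ^= (x << 17) & 0xFFFFFFFFFFFFFFFF
        yield x / 2**64

g1, g2 = lcg(12345), xorshift64(12345)
print([round(next(g1), 4) for _ in range(3)])
print([round(next(g2), 4) for _ in range(3)])
```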

  12. The little-studied cluster Berkeley 90. I. LS III +46 11: a very massive O3.5 If* + O3.5 If* binary

    Science.gov (United States)

    Maíz Apellániz, J.; Negueruela, I.; Barbá, R. H.; Walborn, N. R.; Pellerin, A.; Simón-Díaz, S.; Sota, A.; Marco, A.; Alonso-Santiago, J.; Sanchez Bermudez, J.; Gamen, R. C.; Lorenzo, J.

    2015-07-01

    Context. It appears that most (if not all) massive stars are born in multiple systems. At the same time, the most massive binaries are hard to find owing to their low numbers throughout the Galaxy and the implied large distances and extinctions. Aims: We want to study LS III +46 11, identified in this paper as a very massive binary; another nearby massive system, LS III +46 12; and the surrounding stellar cluster, Berkeley 90. Methods: Most of the data used in this paper are multi-epoch high S/N optical spectra, although we also use Lucky Imaging and archival photometry. The spectra are reduced with dedicated pipelines and processed with our own software, such as a spectroscopic-orbit code, CHORIZOS, and MGB. Results: LS III +46 11 is identified as a new very early O-type spectroscopic binary [O3.5 If* + O3.5 If*] and LS III +46 12 as another early O-type system [O4.5 V((f))]. We measure a 97.2-day period for LS III +46 11 and derive minimum masses of 38.80 ± 0.83 M⊙ and 35.60 ± 0.77 M⊙ for its two stars. We measure the extinction to both stars, estimate the distance, search for optical companions, and study the surrounding cluster. In doing so, a variable extinction is found as well as discrepant results for the distance. We discuss possible explanations and suggest that LS III +46 12 may be a hidden binary system where the companion is currently undetected.

  13. Revitalising Mathematics Classroom Teaching through Lesson Study (LS): A Malaysian Case Study

    Science.gov (United States)

    Lim, Chap Sam; Kor, Liew Kee; Chia, Hui Min

    2016-01-01

    This paper discusses how implementation of Lesson Study (LS) has brought about evolving changes in the quality of mathematics classroom teaching in one Chinese primary school. The Japanese model of LS was adapted as a teacher professional development to improve mathematics teachers' teaching practices. The LS group consisted of five mathematics…

  14. Biodegradation test of SPS-LS blends as polymer electrolyte membrane fuel cells

    International Nuclear Information System (INIS)

    Putri, Zufira; Arcana, I Made

    2014-01-01

    Sulfonated polystyrene (SPS) can be applied as a proton exchange membrane for fuel cells due to its fairly good chemical stability. In order to be applied in polymer electrolyte membrane fuel cells (PEMFCs), the membrane polymer should have good ionic conductivity, high proton conductivity, and high mechanical strength. Lignosulfonate (LS) is a complex biopolymer which has crosslinks and sulfonate groups. SPS-LS blends with the addition of SiO2 are used to increase the proton conductivity and to improve the mechanical properties and thermal stability. However, a biodegradation test of SPS-LS blends is required to determine whether these membranes can be applied as environmentally friendly membranes. In this study, the synthesis of SPS and biodegradability tests of SPS-LS blends with variations of LS and SiO2 compositions were carried out. The biodegradation test was carried out in a solid Luria Bertani (LB) medium with activated sludge used as a source of microorganisms at an incubation temperature of 37°C. The results obtained indicated that SPS-LS-SiO2 blends are more readily decomposed by microorganisms than SPS-LS blends. This result is supported by analysis of the weight reduction percentage, of functional groups with Fourier Transform Infrared (FTIR) Spectroscopy, and of the morphological surface with Scanning Electron Microscopy (SEM)

  15. SPS completes LS1 activities

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    On 27 June, the SPS closed its doors to the LS1 engineers, bringing to an end almost 17 months of activities. The machine now enters the hardware-testing phase in preparation for an October restart.   Photo 1: The SPS transfer tunnel, TT10, reinforced with steel beams. Having completed their LS1 activities right on schedule (to the day!), the SPS team is now preparing the machine for its restart. Over the next eight weeks, hardware tests of the SPS dipole and quadrupole power converters will be underway, led by the TE-EPC (Electrical Power Converters) team. "OP start-up test activities will also be running in parallel, utilising the off hours when EPC is not using the machine," says David McFarlane, the SPS technical coordinator from the Engineering Department. "The primary beam testing phase will start at the beginning of September, once hardware tests and DSO safety tests have been completed." It has been a long journey to this point, with several major...

  16. Improvements to the COBRA-TF (EPRI) computer code for steam generator analysis. Final report

    International Nuclear Information System (INIS)

    Stewart, C.W.; Barnhart, J.S.; Koontz, A.S.

    1980-09-01

    The COBRA-TF (EPRI) code has been improved and extended for pressurized water reactor steam generator analysis. New features and models have been added in the areas of subcooled boiling and heat transfer, turbulence, numerics, and global steam generator modeling. The code's new capabilities are qualified against selected experimental data and demonstrated for typical global and microscale steam generator analysis

  17. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    with pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol software implementing the IETF WebSocket protocol. This demonstrates the scalability of our approach to real...... protocols. Furthermore, we perform formal verification of the CPN model prior to code generation, and test the implementation for interoperability against the Autobahn WebSocket test-suite resulting in 97% and 99% success rate for the client and server implementation, respectively. The tests show...

  18. Code Generation for Protocols from CPN models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael; Kindler, Ekkart

    software implementation satisfies the properties verified for the model. Coloured Petri Nets (CPNs) have been widely used to model and verify protocol software, but limited work exists on using CPN models of protocol software as a basis for automated code generation. In this report, we present an approach...... modelling languages, MDE further has the advantage that models are amenable to model checking which allows key behavioural properties of the software design to be verified. The combination of formally verified models and automated code generation contributes to a high degree of assurance that the resulting...... for generating protocol software from a restricted class of CPN models. The class of CPN models considered aims at being descriptive in that the models are intended to be helpful in understanding and conveying the operation of the protocol. At the same time, a descriptive model is close to a verifiable version...

  19. Approximation generation for correlations in thermal-hydraulic analysis codes

    International Nuclear Information System (INIS)

    Pereira, Luiz C.M.; Carmo, Eduardo G.D. do

    1997-01-01

    A fast and precise evaluation of fluid thermodynamic and transport properties is needed for the efficient simulation of the mass, energy and momentum transport phenomena related to nuclear plant power generation. A fully automatic code capable of generating suitable approximations for correlations with one or two independent variables is presented. Comparison with the original correlations currently used, in terms of access speed and precision, shows the adequacy of the approximations obtained. (author). 4 refs., 8 figs., 1 tab.
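
    The idea of replacing an expensive property correlation with a cheap fitted approximation can be sketched as follows. The two-variable polynomial least-squares fit over a sampling grid is only a stand-in, since the record does not state which approximation family the automatic code selects; the correlation, grid and polynomial degree below are arbitrary choices.

```python
import numpy as np

# Expensive "correlation" stand-in: some property of (pressure, temperature).
def correlation(p, t):
    return 1.0 / (0.1 + 0.002 * t) + 0.05 * np.log1p(p) * np.sqrt(t)

# Build a cheap two-variable quadratic approximation by least squares
# over a sampling grid (degree and grid density are illustrative only).
P, T = np.meshgrid(np.linspace(1, 16, 25), np.linspace(300, 600, 25))
p, t, z = P.ravel(), T.ravel(), correlation(P, T).ravel()
basis = np.column_stack([np.ones_like(p), p, t, p * t, p**2, t**2])
coeffs, *_ = np.linalg.lstsq(basis, z, rcond=None)

def approx(p, t):
    return (coeffs[0] + coeffs[1] * p + coeffs[2] * t
            + coeffs[3] * p * t + coeffs[4] * p**2 + coeffs[5] * t**2)

# Check precision at a point that is not on the fitting grid.
print(correlation(8.3, 455.0), approx(8.3, 455.0))
```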

  20. Salt Selection for the LS-VHTR

    International Nuclear Information System (INIS)

    Williams, D.F.; Clarno, K.T.

    2006-01-01

    Molten fluorides were initially developed for use in the nuclear industry as the high temperature fluid fuel for a Molten Salt Reactor (MSR). The Office of Nuclear Energy is exploring the use of molten fluorides as a primary coolant (rather than helium) in an Advanced High Temperature Reactor (AHTR) design, also known as the Liquid-Salt-cooled Very High Temperature Reactor (LS-VHTR). This paper provides a review of relevant properties for use in the evaluation and ranking of candidate coolants for the LS-VHTR. Nuclear, physical, and chemical properties were reviewed, and metrics for evaluation are recommended. Chemical properties of the salts were examined for the purpose of identifying factors that affect materials compatibility (i.e., corrosion). Some preliminary consideration of economic factors for the candidate salts is also presented. (authors)

  1. Biodegradation test of SPS-LS blends as polymer electrolyte membrane fuel cells

    Energy Technology Data Exchange (ETDEWEB)

    Putri, Zufira, E-mail: zufira.putri@gmail.com, E-mail: arcana@chem.itb.ac.id; Arcana, I Made, E-mail: zufira.putri@gmail.com, E-mail: arcana@chem.itb.ac.id [Inorganic and Physical Chemistry Research Groups, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung, Bandung (Indonesia)

    2014-03-24

    Sulfonated polystyrene (SPS) can be applied as a proton exchange membrane for fuel cells due to its fairly good chemical stability. In order to be applied in polymer electrolyte membrane fuel cells (PEMFCs), the membrane polymer should have good ionic conductivity, high proton conductivity, and high mechanical strength. Lignosulfonate (LS) is a complex biopolymer which has crosslinks and sulfonate groups. SPS-LS blends with the addition of SiO2 are used to increase the proton conductivity and to improve the mechanical properties and thermal stability. However, a biodegradation test of SPS-LS blends is required to determine whether these membranes can be applied as environmentally friendly membranes. In this study, the synthesis of SPS and biodegradability tests of SPS-LS blends with variations of LS and SiO2 compositions were carried out. The biodegradation test was carried out in a solid Luria Bertani (LB) medium with activated sludge used as a source of microorganisms at an incubation temperature of 37°C. The results obtained indicated that SPS-LS-SiO2 blends are more readily decomposed by microorganisms than SPS-LS blends. This result is supported by analysis of the weight reduction percentage, of functional groups with Fourier Transform Infrared (FTIR) Spectroscopy, and of the morphological surface with Scanning Electron Microscopy (SEM)

  2. Machine-Checked Sequencer for Critical Embedded Code Generator

    Science.gov (United States)

    Izerrouken, Nassima; Pantel, Marc; Thirioux, Xavier

    This paper presents the development of a correct-by-construction block sequencer for GeneAuto, a qualifiable (according to the DO-178B/ED-12B recommendation) automatic code generator. It transforms Simulink models to MISRA C code for safety-critical systems. Our approach, which combines a classical development process with formal specification and verification using proof assistants, led to preliminary fruitful exchanges with certification authorities. We present parts of the classical user and tool requirements and the derived formal specifications, implementation and verification for the correctness and termination of the block sequencer. This sequencer has been successfully applied to real-size industrial use cases from various transportation-domain partners and led to the detection of requirement errors and a correct-by-construction implementation.

  3. Validation of the WIMSD4M cross-section generation code with benchmark results

    International Nuclear Information System (INIS)

    Deen, J.R.; Woodruff, W.L.; Leal, L.E.

    1995-01-01

    The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D2O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.

  4. Validation of the WIMSD4M cross-section generation code with benchmark results

    Energy Technology Data Exchange (ETDEWEB)

    Deen, J.R.; Woodruff, W.L. [Argonne National Lab., IL (United States); Leal, L.E. [Oak Ridge National Lab., TN (United States)

    1995-01-01

    The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D{sub 2}O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.

  5. Multi-scale Material Parameter Identification Using LS-DYNA® and LS-OPT®

    Energy Technology Data Exchange (ETDEWEB)

    Stander, Nielen [Livermore Software Technology Corporation, CA (United States); Basudhar, Anirban [Livermore Software Technology Corporation, CA (United States); Basu, Ushnish [Livermore Software Technology Corporation, CA (United States); Gandikota, Imtiaz [Livermore Software Technology Corporation, CA (United States); Savic, Vesna [General Motors, Flint, MI (United States); Sun, Xin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hu, XiaoHua [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pourboghrat, Farhang [The Ohio State Univ., Columbus, OH (United States); Park, Taejoon [The Ohio State Univ., Columbus, OH (United States); Mapar, Aboozar [Michigan State Univ., East Lansing, MI (United States); Kumar, Sharvan [Brown Univ., Providence, RI (United States); Ghassemi-Armaki, Hassan [Brown Univ., Providence, RI (United States); Abu-Farha, Fadi [Clemson Univ., SC (United States)

    2015-06-15

    Ever-tightening regulations on fuel economy and carbon emissions demand continual innovation in finding ways for reducing vehicle mass. Classical methods for computational mass reduction include sizing, shape and topology optimization. One of the few remaining options for weight reduction can be found in materials engineering and material design optimization. Apart from considering different types of materials by adding material diversity, an appealing option in automotive design is to engineer steel alloys for the purpose of reducing thickness while retaining sufficient strength and ductility required for durability and safety. Such a project was proposed and is currently being executed under the auspices of the United States Automotive Materials Partnership (USAMP) funded by the Department of Energy. Under this program, new steel alloys (Third Generation Advanced High Strength Steel or 3GAHSS) are being designed, tested and integrated with the remaining design variables of a benchmark vehicle Finite Element model. In this project the principal phases identified are (i) material identification, (ii) formability optimization and (iii) multi-disciplinary vehicle optimization. This paper serves as an introduction to the LS-OPT methodology and therefore mainly focuses on the first phase, namely an approach to integrate material identification using material models of different length scales. For this purpose, a multi-scale material identification strategy, consisting of a Crystal Plasticity (CP) material model and a Homogenized State Variable (SV) model, is discussed and demonstrated. The paper concludes with proposals for integrating the multi-scale methodology into the overall vehicle design.

  6. New coding technique for computer generated holograms.

    Science.gov (United States)

    Haskell, R. E.; Culver, B. C.

    1972-01-01

    A coding technique is developed for recording computer generated holograms on a computer controlled CRT in which each resolution cell contains two beam spots of equal size and equal intensity. This provides a binary hologram in which only the position of the two dots is varied from cell to cell. The amplitude associated with each resolution cell is controlled by selectively diffracting unwanted light into a higher diffraction order. The recording of the holograms is fast and simple.

  7. On the use of the Serpent Monte Carlo code for few-group cross section generation

    International Nuclear Information System (INIS)

    Fridman, E.; Leppaenen, J.

    2011-01-01

    Research highlights: → B1 methodology was used for generation of leakage-corrected few-group cross sections in the Serpent Monte-Carlo code. → Few-group constants generated by Serpent were compared with those calculated by Helios deterministic lattice transport code. → 3D analysis of a PWR core was performed by a nodal diffusion code DYN3D employing two-group cross section sets generated by Serpent and Helios. → An excellent agreement in the results of 3D core calculations obtained with Helios and Serpent generated cross-section libraries was observed. - Abstract: Serpent is a recently developed 3D continuous-energy Monte Carlo (MC) reactor physics burnup calculation code. Serpent is specifically designed for lattice physics applications including generation of homogenized few-group constants for full-core simulators. Currently in Serpent, the few-group constants are obtained from the infinite-lattice calculations with zero neutron current at the outer boundary. In this study, in order to account for the non-physical infinite-lattice approximation, B1 methodology, routinely used by deterministic lattice transport codes, was considered for generation of leakage-corrected few-group cross sections in the Serpent code. A preliminary assessment of the applicability of the B1 methodology for generation of few-group constants in the Serpent code was carried out according to the following steps. Initially, the two-group constants generated by Serpent were compared with those calculated by Helios deterministic lattice transport code. Then, a 3D analysis of a Pressurized Water Reactor (PWR) core was performed by the nodal diffusion code DYN3D employing two-group cross section sets generated by Serpent and Helios. At this stage thermal-hydraulic (T-H) feedback was neglected. The DYN3D results were compared with those obtained from the 3D full core Serpent MC calculations. Finally, the full core DYN3D calculations were repeated taking into account T-H feedback and

  8. Modeling Vortex Generators in a Navier-Stokes Code

    Science.gov (United States)

    Dudek, Julianne C.

    2011-01-01

    A source-term model that simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force that would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single vane vortex generator, subsonic flow in an S-duct with 22 corotating vortex generators, and supersonic flow in a rectangular duct with a counter-rotating vortex-generator pair. The model was also used to successfully simulate microramps in supersonic flow by treating each microramp as a pair of vanes with opposite angles of incidence. The validation results indicate that the source-term vortex-generator model provides a useful tool for screening vortex-generator configurations and gives comparable results to solutions computed using gridded vanes.

  9. Multiplicative Structure and Hecke Rings of Generator Matrices for Codes over Quotient Rings of Euclidean Domains

    Directory of Open Access Journals (Sweden)

    Hajime Matsui

    2017-12-01

    Full Text Available In this study, we consider codes over Euclidean domains modulo their ideals. In the first half of the study, we deal with arbitrary Euclidean domains. We show that the product of generator matrices of codes over the rings mod a and mod b produces generator matrices of all codes over the ring mod ab, i.e., this correspondence is onto. Moreover, we show that if a and b are coprime, then this correspondence is one-to-one, i.e., there exist unique codes over the rings mod a and mod b that produce any given code over the ring mod ab through the product of their generator matrices. In the second half of the study, we focus on the typical Euclidean domains such as the rational integer ring, one-variable polynomial rings, rings of Gaussian and Eisenstein integers, p-adic integer rings and rings of one-variable formal power series. We define the reduced generator matrices of codes over Euclidean domains modulo their ideals and show their uniqueness. Finally, we apply our theory of reduced generator matrices to the Hecke rings of matrices over these Euclidean domains.

  10. LS1 general planning and strategy for the LHC, LHC injectors

    CERN Document Server

    Foraz, K

    2012-01-01

    The goal of Long Shutdown 1 (LS1) is to perform the full maintenance of equipment, and the necessary consolidation and upgrade activities, in order to ensure reliable LHC operation at nominal performance from mid-2014. LS1 concerns not only the LHC but also its injectors. To ensure that resources will be available, an analysis is in progress to detect conflicts and overloads and to decide what is compulsory, what we can afford, and what can be postponed to LS2. The strategy, key time drivers, constraints, and draft schedule are presented here.

  11. Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder

    Science.gov (United States)

    Staats, Matt

    2009-01-01

    We present work on a prototype tool based on the JavaPathfinder (JPF) model checker for automatically generating tests satisfying the MC/DC code coverage criterion. Using the Eclipse IDE, developers and testers can quickly instrument Java source code with JPF annotations covering all MC/DC coverage obligations, and JPF can then be used to automatically generate tests that satisfy these obligations. The prototype extension to JPF enables various tasks useful in automatic test generation to be performed, such as test suite reduction and execution of generated tests.

  12. Regional-scale calculation of the LS factor using parallel processing

    Science.gov (United States)

    Liu, Kai; Tang, Guoan; Jiang, Ling; Zhu, A.-Xing; Yang, Jianyi; Song, Xiaodong

    2015-05-01

    With the increase of data resolution and the increasing application of the USLE over large areas, the existing serial implementations of algorithms for computing the LS factor are becoming a bottleneck. In this paper, a parallel processing model based on the message passing interface (MPI) is presented for the calculation of the LS factor, so that massive datasets at a regional scale can be processed efficiently. The parallel model contains algorithms for calculating flow direction, flow accumulation, drainage network, slope, slope length and the LS factor. According to the existence of data dependence, the algorithms are divided into local algorithms and global algorithms. Parallel strategies are designed according to the characteristics of the algorithms, including a decomposition method that maintains the integrity of the results, an optimized workflow that reduces the time spent exporting unnecessary intermediate data, and a buffer-communication-computation strategy that improves communication efficiency. Experiments on a multi-node system show that the proposed parallel model allows efficient calculation of the LS factor at a regional scale with a massive dataset.
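
    The record names the algorithms but not their formulas; as a rough illustration of the per-cell part of the computation (before any MPI decomposition), the sketch below evaluates one commonly used USLE LS formulation from slope-length and slope rasters. The constants and the piecewise exponent are one published variant and are not necessarily those used by the authors.

      # Sketch: per-cell LS factor from slope-length and slope grids (one common
      # USLE formulation; constants are illustrative, not necessarily the authors').
      import numpy as np

      def ls_factor(slope_length_m, slope_deg):
          theta = np.radians(slope_deg)
          s = np.sin(theta)
          pct = np.tan(theta)            # slope as a fraction
          # slope-length exponent m (Wischmeier-type piecewise choice)
          m = np.where(pct >= 0.05, 0.5,
              np.where(pct >= 0.03, 0.4,
              np.where(pct >= 0.01, 0.3, 0.2)))
          L = (slope_length_m / 22.13) ** m
          # slope-steepness factor S (McCool et al. form)
          S = np.where(pct < 0.09, 10.8 * s + 0.03, 16.8 * s - 0.5)
          return L * S

      # toy 3x3 grids standing in for rasters derived from a DEM
      length = np.array([[20.0, 35.0, 50.0]] * 3)
      slope = np.array([[2.0, 8.0, 15.0]] * 3)
      print(ls_factor(length, slope))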

  13. Bacillus amyloliquefaciens L-S60 Reforms the Rhizosphere Bacterial Community and Improves Growth Conditions in Cucumber Plug Seedling

    Directory of Open Access Journals (Sweden)

    Yuxuan Qin

    2017-12-01

    Full Text Available Vegetable plug seedling has become the most important way to produce vegetable seedlings in China. This seedling method can significantly improve the quality and yield of vegetables compared to conventional methods. In the process of plug seedling, chemical fertilizers or pesticides are often used to improve the yield of the seedlings albeit with increasing concerns. Meanwhile, little is known about the impact of beneficial bacteria on the rhizosphere microbiota and the growth conditions of vegetables during plug seedling. In this study, we applied a culture-independent next-generation sequencing-based approach and investigated the impact of a plant beneficial bacterium, Bacillus amyloliquefaciens L-S60, on the composition and dynamics of rhizosphere microbiota and the growth conditions of cucumbers during plug seedling. Our results showed that application of L-S60 significantly altered the structure of the bacterial community associated with the cucumber seedling; presence of beneficial rhizosphere species such as Bacillus, Rhodanobacter, Paenibacillus, Pseudomonas, Nonomuraea, and Agrobacterium was higher upon L-S60 treatment than in the control group. We also measured the impact of L-S60 application on the physiological properties of the cucumber seedlings as well as the availability of main mineral elements in the seedling at different time points during the plug seedling. Results from those measurements indicated that L-S60 application promoted growth conditions of cucumber seedlings and that more available mineral elements were detected in the cucumber seedlings from the L-S60 treated group than from the control group. The findings in this study provided evidence for the beneficial effects of plant growth-promoting rhizosphere bacteria on the bacterial community composition and growth conditions of the vegetables during plug seedling.

  14. Prediction of the strength of concrete radiation shielding based on LS-SVM

    International Nuclear Information System (INIS)

    Juncai, Xu; Qingwen, Ren; Zhenzhong, Shen

    2015-01-01

    Highlights: • LS-SVM was introduced for prediction of the strength of RSC. • A model for prediction of the strength of RSC was implemented. • The grid search algorithm was used to optimize the parameters of the LS-SVM. • The performance of LS-SVM in predicting the strength of RSC was evaluated. - Abstract: Radiation-shielding concrete (RSC) and conventional concrete differ in strength because of their distinct constituents. Predicting the strength of RSC with different constituents plays a vital role in radiation shielding (RS) engineering design. In this study, a model to predict the strength of RSC is established using a least squares support vector machine (LS-SVM) with a grid search algorithm. The algorithm is used to optimize the parameters of the LS-SVM on the basis of traditional prediction methods for conventional concrete. The predicted results of the LS-SVM model are compared with the experimental data. The predictions are stable and consistent with the experimental results. In addition, the studied parameters exhibit significant effects on the simulation results. Therefore, the proposed method can be applied in predicting the strength of RSC, and the predicted results can be adopted as an important reference for RS engineering design
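
    No implementation details are given in the record; the minimal sketch below shows the usual core of LS-SVM regression with an RBF kernel (the dual linear system) together with a simple grid search over the regularization and kernel-width parameters on a held-out split. The data are synthetic placeholders, not the RSC mix/strength data of the study.

      # Sketch: LS-SVM regression (RBF kernel) with a simple grid search over
      # (gamma, sigma2); synthetic data stand in for the concrete mix/strength data.
      import numpy as np

      def rbf(X1, X2, sigma2):
          d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
          return np.exp(-d2 / sigma2)

      def lssvm_fit(X, y, gamma, sigma2):
          n = len(y)
          A = np.zeros((n + 1, n + 1))
          A[0, 1:] = 1.0
          A[1:, 0] = 1.0
          A[1:, 1:] = rbf(X, X, sigma2) + np.eye(n) / gamma
          sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
          return sol[0], sol[1:]                    # bias b, dual coefficients alpha

      def lssvm_predict(X_train, b, alpha, X_new, sigma2):
          return rbf(X_new, X_train, sigma2) @ alpha + b

      rng = np.random.default_rng(0)
      X = rng.uniform(0, 1, (80, 4))                                    # e.g. mix proportions
      y = 30 + 20 * X[:, 0] - 10 * X[:, 1] ** 2 + rng.normal(0, 1, 80)  # "strength"
      Xtr, ytr, Xte, yte = X[:60], y[:60], X[60:], y[60:]

      best = None
      for gamma in [1, 10, 100, 1000]:
          for sigma2 in [0.1, 0.5, 1.0, 5.0]:
              b, a = lssvm_fit(Xtr, ytr, gamma, sigma2)
              rmse = np.sqrt(np.mean((lssvm_predict(Xtr, b, a, Xte, sigma2) - yte) ** 2))
              if best is None or rmse < best[0]:
                  best = (rmse, gamma, sigma2)
      print("best RMSE %.3f at gamma=%g, sigma2=%g" % best)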

  15. Hypertension Knowledge-Level Scale (HK-LS): A Study on Development, Validity and Reliability

    OpenAIRE

    Erkoc, Sultan Baliz; Isikli, Burhanettin; Metintas, Selma; Kalyoncu, Cemalettin

    2012-01-01

    This study was conducted to develop a scale to measure knowledge about hypertension among Turkish adults. The Hypertension Knowledge-Level Scale (HK-LS) was generated based on content, face, and construct validity, internal consistency, test re-test reliability, and discriminative validity procedures. The final scale had 22 items with six sub-dimensions. The scale was applied to 457 individuals aged ≥18 years, and 414 of them were re-evaluated for test-retest reliability. The six sub-dimensio...

  16. ANL/CANTIA code for steam generator tube integrity assessment

    International Nuclear Information System (INIS)

    Revankar, S.T.; Wolf, B.; Majumdar, S.; Riznic, J.R.

    2009-01-01

    Steam generator (SG) tubes have an important safety role in CANDU-type reactors and Pressurized Water Reactors (PWR) because they constitute one of the primary barriers between the radioactive and non-radioactive sides of the nuclear plant. The SG tubes are susceptible to corrosion and damage. A failure of a single steam generator tube, or even a few tubes, would not be a serious safety-related event in a CANDU reactor. The leakage from a ruptured tube is within the makeup capacity of the primary heat transport system, so that as long as the operator takes the correct actions, the off-site consequences will be negligible. A sufficient safety margin against tube rupture used to be the basis for a variety of maintenance strategies developed to maintain a suitable level of plant safety and reliability. Several through-wall flaws may remain in operation and potentially contribute to the total primary-to-secondary leak rate. Assessment of the conditional probabilities of tube failures, leak rates, and ultimately risk of exceeding licensing dose limits has been used for steam generator tube fitness-for-service assessment. The advantage of this type of analysis is that it avoids the excessive conservatism typically present in deterministic methodologies. However, it requires considerable effort and expense to develop all of the failure, leakage, probability of detection, and flaw growth distributions and models necessary to obtain meaningful results from a probabilistic model. The Canadian Nuclear Safety Commission (CNSC) recently developed the CANTIA methodology for probabilistic assessment of inspection strategies for steam generator tubes and their direct effect on the probability of tube failure and the primary-to-secondary leak rate. Recently, Argonne National Laboratory has developed tube integrity and leak rate models under the Integrated Steam Generator Tube Integrity Program (ISGTIP-2). These models have been incorporated in the ANL/CANTIA code. This paper presents the ANL/CANTIA code.

  17. Modular code supervisor. Automatic generation of command language

    International Nuclear Information System (INIS)

    Dumas, M.; Thomas, J.B.

    1988-01-01

    It is shown how, starting from a problem formulated by the user, to generate an adequate calculation procedure in the command language, and to acquire the data necessary for the calculation while verifying their validity. Modular codes are used because of their flexibility and wide utilisation. The modules are written in Fortran, and calculations are run in batches according to an algorithm written in the GIBIANE command language. The action plans are based on the STRIPS and WARPLAN families. The elementary representation of a module and special instructions are illustrated, as are the dynamic construction of macro-actions and the acquisition of the specification (which allows users to express the goal of a program without indicating which algorithm is used to reach the goal). The final phase consists in translating the algorithm into the command language [fr

  18. People typically experience extended periods of relative happiness or unhappiness due to positive feedback loops between LS and variables which are both causes and consequences of LS

    NARCIS (Netherlands)

    Headey, Bruce; Muffels, R.J.A.

    2015-01-01

    Long term panel data enable researchers to construct trajectories of LS for individuals over time. Bar charts of trajectories, and subsequent statistical analysis, show that respondents typically spend multiple consecutive years above and below their own long-term mean level of LS. We attempt to

  19. Effect of Lactobacillus salivarius Ls-33 on fecal microbiota in obese adolescents.

    Science.gov (United States)

    Larsen, Nadja; Vogensen, Finn K; Gøbel, Rikke Juul; Michaelsen, Kim F; Forssten, Sofia D; Lahtinen, Sampo J; Jakobsen, Mogens

    2013-12-01

    This study is a part of the clinical trials with probiotic bacterium Lactobacillus salivarius Ls-33 conducted in obese adolescents. Previously reported clinical studies showed no effect of Ls-33 consumption on the metabolic syndrome in the subject group. The aim of the study was to investigate the impact of L. salivarius Ls-33 on fecal microbiota in obese adolescents. The study was a double-blinded intervention with 50 subjects randomized to intake of L. salivarius Ls-33 or placebo for 12 weeks. The fecal microbiota was assessed by real-time quantitative PCR before and after intervention. Concentrations of fecal short chain fatty acids were determined using gas chromatography. Ratios of Bacteroides-Prevotella-Porphyromonas group to Firmicutes belonging bacteria, including Clostridium cluster XIV, Blautia coccoides_Eubacteria rectale group and Roseburia intestinalis, were significantly increased (p ≤ 0.05) after administration of Ls-33. The cell numbers of fecal bacteria, including the groups above as well as Clostridium cluster I, Clostridium cluster IV, Faecalibacterium prausnitzii, Enterobacteriaceae, Enterococcus, the Lactobacillus group and Bifidobacterium were not significantly altered by intervention. Similarly, short chain fatty acids remained unaffected. L. salivarius Ls-33 might modify the fecal microbiota in obese adolescents in a way not related to metabolic syndrome. NCT 01020617. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  20. Code Generation for a Simple First-Order Prover

    DEFF Research Database (Denmark)

    Villadsen, Jørgen; Schlichtkrull, Anders; Halkjær From, Andreas

    2016-01-01

    We present Standard ML code generation in Isabelle/HOL of a sound and complete prover for first-order logic, taking formalizations by Tom Ridge and others as the starting point. We also define a set of so-called unfolding rules and show how to use these as a simple prover, with the aim of using the approach for teaching logic and verification to computer science students at the bachelor level.

  1. Horizontal transfer generates genetic variation in an asexual pathogen

    Directory of Open Access Journals (Sweden)

    Xiaoqiu Huang

    2014-10-01

    Full Text Available There are major gaps in the understanding of how genetic variation is generated in the asexual pathogen Verticillium dahliae. On the one hand, V. dahliae is a haploid organism that reproduces clonally. On the other hand, single-nucleotide polymorphisms and chromosomal rearrangements were found between V. dahliae strains. Lineage-specific (LS regions comprising about 5% of the genome are highly variable between V. dahliae strains. Nonetheless, it is unknown whether horizontal gene transfer plays a major role in generating genetic variation in V. dahliae. Here, we analyzed a previously sequenced V. dahliae population of nine strains from various geographical locations and hosts. We found highly homologous elements in LS regions of each strain; LS regions of V. dahliae strain JR2 are much richer in highly homologous elements than the core genome. In addition, we discovered, in LS regions of JR2, several structural forms of nonhomologous recombination, and two or three homologous sequence types of each form, with almost each sequence type present in an LS region of another strain. A large section of one of the forms is known to be horizontally transferred between V. dahliae strains. We unexpectedly found that 350 kilobases of dynamic LS regions were much more conserved than the core genome between V. dahliae and a closely related species (V. albo-atrum, suggesting that these LS regions were horizontally transferred recently. Our results support the view that genetic variation in LS regions is generated by horizontal transfer between strains, and by chromosomal reshuffling reported previously.

  2. Generation of the WIMS code library from the ENDF/B-VI basic library

    International Nuclear Information System (INIS)

    Aboustta, Mohamed Ali Bashir.

    1994-01-01

    The WIMS code is presently used in many research centers and educational institutions around the world. It has proven to be versatile, reliable and diverse, as it is used to calculate different reactor systems. Its data library is rich in useful information that can even be condensed to serve other codes, but the copy distributed with the code is not updated. Some of its data have never been changed, others have been changed many times to accommodate certain experimental setups, and some data are simply not included. This work is an attempt to master the techniques used in generating a multigroup library, as applied to the WIMS data library. The new library is called UFMGLIB. A new set of consistent data was generated from the basic ENDF/B-VI library, including complete data for the fission product nuclides and more elaborate burnup chains. The performance of the library is comparable to that of the standard library accompanying the code and to a later library, WIMKAL-88, generated by a group of the Korean Research Institute of Atomic Energy. (author). 38 refs., 40 figs., 30 tabs

  3. AMZ, library of multigroup constants for EXPANDA computer codes, generated by NJOY computer code from ENDF/B-IV

    International Nuclear Information System (INIS)

    Chalhoub, E.S.; Moraes, M. de.

    1984-01-01

    A 70-group, 37-isotope library of multigroup constants for fast reactor nuclear design calculations is described. Nuclear cross sections, transfer matrices, and self-shielding factors were generated with the NJOY code and an auxiliary program, RGENDF, using evaluated data from ENDF/B-IV. The output is issued in a format suitable for the EXPANDA code. Comparisons with the JFS-2 library, as well as test results for 14 CSEWG benchmark critical assemblies, are presented. (Author) [pt

  4. LS-SNP/PDB: annotated non-synonymous SNPs mapped to Protein Data Bank structures.

    Science.gov (United States)

    Ryan, Michael; Diekhans, Mark; Lien, Stephanie; Liu, Yun; Karchin, Rachel

    2009-06-01

    LS-SNP/PDB is a new WWW resource for genome-wide annotation of human non-synonymous (amino acid changing) SNPs. It serves high-quality protein graphics rendered with UCSF Chimera molecular visualization software. The system is kept up-to-date by an automated, high-throughput build pipeline that systematically maps human nsSNPs onto Protein Data Bank structures and annotates several biologically relevant features. LS-SNP/PDB is available at (http://ls-snp.icm.jhu.edu/ls-snp-pdb) and via links from protein data bank (PDB) biology and chemistry tabs, UCSC Genome Browser Gene Details and SNP Details pages and PharmGKB Gene Variants Downloads/Cross-References pages.

  5. PCS a code system for generating production cross section libraries

    International Nuclear Information System (INIS)

    Cox, L.J.

    1997-01-01

    This document outlines the use of the PCS Code System. It summarizes the execution process for generating FORMAT2000 production cross section files from FORMAT2000 reaction cross section files. It also describes the process of assembling the ASCII versions of the high energy production files made from ENDL and Mark Chadwick's calculations. Descriptions of the function of each code along with its input and output and use are given. This document is under construction. Please submit entries, suggestions, questions, and corrections to ljc@llnl.gov. 3 tabs

  6. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    Following the applicable technical standards (e.g. IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost and effort, a tool should be used which is developed independently from the development of the code generator. For this purpose ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  7. LS-SVM: a new chemometric tool for multivariate regression. Comparison of LS-SVM and PLS regression for determination of common adulterants in powdered milk by NIR spectroscopy

    Directory of Open Access Journals (Sweden)

    Marco F. Ferrão

    2007-08-01

    Full Text Available Least-squares support vector machines (LS-SVM) were used as an alternative multivariate calibration method for the simultaneous quantification of some common adulterants found in powdered milk samples, using near-infrared spectroscopy. Excellent models were built using LS-SVM, as indicated by the R², RMSECV and RMSEP values. The LS-SVM models show superior performance for quantifying starch, whey and sucrose in powdered milk samples relative to PLSR. This study shows that it is possible to precisely determine the amount of one or two common adulterants simultaneously in powdered milk samples using LS-SVM and NIR spectra.

  8. Performance potential of the injectors after LS1

    International Nuclear Information System (INIS)

    Bartosik, H.; Carli, C.; Damerau, H.; Garoby, R.; Gilardoni, S.; Goddard, B.; Hancock, S.; Hanke, K.; Lombardi, A.; Mikulec, B.; Raginel, V.; Rumolo, G.; Shaposhnikova, E.; Vretenar, M.

    2012-01-01

    The main upgrades of the injector chain in the framework of the LIU Project will only be implemented in the second long shutdown (LS2), in particular the increase of the PSB-PS transfer energy to 2 GeV or the implementation of cures/solutions against instabilities/e-cloud effects etc. in the SPS. On the other hand, Linac4 will become available by the end of 2014. Until the end of 2015 it may replace Linac2 at short notice, taking 50 MeV protons into the PSB via the existing injection system but with reduced performance. Afterwards, the H⁻ injection equipment will be ready and Linac4 could be connected for 160 MeV H⁻ injection into the PSB during a prolonged winter shutdown before LS2. The anticipated beam performance of the LHC injectors after LS1 in these different cases is presented. Space charge on the PS flat-bottom will remain a limitation because the PSB-PS transfer energy will stay at 1.4 GeV. As a mitigation measure, new RF manipulations are presented which can improve brightness for 25 ns bunch spacing, allowing for more than nominal luminosity in the LHC. (authors)

  9. Post LS1 schedule

    CERN Document Server

    Lamont, M

    2014-01-01

    The scheduling limits for a typical long year, taking into account technical stops, machine development, and special physics runs, are presented. An attempt is then made to outline a ten-year post-LS1 schedule taking into account the disparate requirements outlined in the previous talks in this session. The demands on the planned long shutdowns and the impact of these demands on their proposed length will be discussed. The option of using ion running as a pre-shutdown cool-down period will be addressed.

  10. Development of the next generation reactor analysis code system, MARBLE

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hazama, Taira; Nagaya, Yasunobu; Chiba, Go; Kugo, Teruhiko; Ishikawa, Makoto; Tatsumi, Masahiro; Hirai, Yasushi; Hyoudou, Hideaki; Numata, Kazuyuki; Iwai, Takehiko; Jin, Tomoyuki

    2011-03-01

    A next generation reactor analysis code system, MARBLE, has been developed. MARBLE is a successor of the fast reactor neutronics analysis code systems JOINT-FR and SAGEP-FR (the conventional systems), which were developed for the so-called JUPITER standard analysis methods. MARBLE has analysis capability equivalent to the conventional systems because it can utilize the sub-codes included in the conventional systems without any change. On the other hand, the burnup analysis functionality for power reactors is improved compared with the conventional systems by introducing models of fuel exchange treatment, control rod operation and so on. In addition, MARBLE has newly developed solvers and some new features, such as burnup calculation by the Krylov subspace method and nuclear design accuracy evaluation by the extended bias factor method. In the development of MARBLE, object-oriented technology was adopted from the viewpoint of improving software quality, such as flexibility, extensibility, ease of verification through modularization, and support for co-development. In addition, a software structure called the two-layer system, consisting of a scripting language and a system development language, was applied. As a result, MARBLE is not an independent analysis code system which simply receives input and returns output, but an assembly of components for building analysis code systems (i.e. a framework). Furthermore, MARBLE provides some pre-built analysis code systems, such as the fast reactor neutronics analysis code system SCHEME, which corresponds to the conventional code, and the fast reactor burnup analysis code system ORPHEUS. (author)

  11. CITADEL: a computer code for the analysis of iodine behavior in steam generator tube rupture accidents

    International Nuclear Information System (INIS)

    1982-04-01

    The computer code CITADEL was written to analyze iodine behavior during steam generator tube rupture accidents. The code models the transport and deposition of iodine from its point of escape at the steam generator primary break until its release to the environment. This report provides a brief description of the code including its input requirements and the nature and form of its output. A user's guide describing the manner in which the input data are required to be set up to run the code is also provided

  12. Code Assessment of SPACE 2.19 using LSTF Steam Generator Tube Rupture Test

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Minhee; Kim, Seyun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    SPACE is a best-estimate two-phase, three-field thermal-hydraulic analysis code used to analyze the safety and performance of pressurized water reactors. As a result of the development, version 2.19 of the code was released after successive verification and validation work. The present work extends the work by Kim et al. In this study, results produced by the SPACE 2.19 code were compared with the experimental data from JAERI's LSTF Test Run SB-SG-06 experiment simulating a Steam Generator Tube Rupture (SGTR) transient. In order to identify the predictability of SPACE 2.19, the LSTF steam generator tube rupture test was simulated. To evaluate the computed results, the LSTF SB-SG-06 test data simulating the SGTR and RELAP5/MOD3.1 results are used. The calculation results indicate that the SPACE 2.19 code predicted well the sequence of events and the major phenomena during the transient, such as the asymmetric loop behavior, reactor coolant system cooldown and heat transfer by natural circulation, and the primary and secondary system depressurization by the pressurizer auxiliary spray and the steam dump using the intact loop steam generator relief valve.

  13. A threshold-based fixed predictor for JPEG-LS image compression

    Science.gov (United States)

    Deng, Lihua; Huang, Zhenghua; Yao, Shoukui

    2018-03-01

    In JPEG-LS, the fixed predictor based on the median edge detector (MED) detects only horizontal and vertical edges, and thus produces large prediction errors in the locality of diagonal edges. In this paper, we propose a threshold-based edge detection scheme for the fixed predictor. The proposed scheme can detect not only horizontal and vertical edges, but also diagonal edges. For certain thresholds, the proposed scheme can be simplified to other existing schemes, so it can also be regarded as an integration of these existing schemes. For a suitable threshold, the accuracy of horizontal and vertical edge detection is higher than that of the existing median edge detection in JPEG-LS. Thus, the proposed fixed predictor outperforms the existing JPEG-LS predictors for all images tested, while the complexity of the overall algorithm is maintained at a similar level.
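
    The record describes the standard MED predictor but not the exact threshold rule; the sketch below implements the standard JPEG-LS MED prediction for a single pixel plus an illustrative, hypothetical diagonal-aware variant that switches prediction when a threshold test on the upper-right neighbour suggests a diagonal edge.

      # Sketch: JPEG-LS fixed (MED) prediction for one pixel, plus an illustrative
      # threshold-based variant that also reacts to diagonal edges.  The threshold
      # rule here is a hypothetical example, not the scheme proposed in the paper.
      def med_predict(a, b, c):
          # a = left, b = above, c = upper-left neighbour
          if c >= max(a, b):
              return min(a, b)
          if c <= min(a, b):
              return max(a, b)
          return a + b - c

      def thresholded_predict(a, b, c, d, T=8):
          # d = upper-right neighbour; if the diagonal gradient dominates the
          # horizontal/vertical ones by more than T, bias the prediction along it
          if abs(d - c) - max(abs(a - c), abs(b - c)) > T:
              return (a + d) // 2
          return med_predict(a, b, c)

      print(med_predict(100, 120, 95), thresholded_predict(100, 120, 95, 160))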

  14. Development of the next generation code system as an engineering modeling language. (2). Study with prototyping

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Uto, Nariaki; Kasahara, Naoto; Ishikawa, Makoto

    2003-04-01

    In fast reactor development, numerical simulation using analysis codes plays an important role in complementing theory and experiment. It is necessary that the engineering models and analysis methods can be changed flexibly, because the phenomena to be investigated become more complicated due to the diversity of research needs. There are also large problems in combining physical properties and engineering models from many different fields. Aiming at the realization of a next generation code system which can solve those problems, the authors adopted three methods, (1) multi-language programming (SoftWIRE.NET, Visual Basic.NET and Fortran), (2) Fortran 90 and (3) Python, to make a prototype of the next generation code system. As a result, the following was confirmed. (1) It is possible to reuse a function of the existing codes written in Fortran as an object of the next generation code system by using Visual Basic.NET. (2) The maintainability of the existing code written in Fortran 77 can be improved by using the new features of Fortran 90. (3) A toolbox-type code system can be built by using Python. (author)

  15. Interoperable domain-specific languages families for code generation

    Czech Academy of Sciences Publication Activity Database

    Malohlava, M.; Plášil, F.; Bureš, Tomáš; Hnětynka, P.

    2013-01-01

    Roč. 43, č. 5 (2013), s. 479-499 ISSN 0038-0644 R&D Projects: GA ČR GD201/09/H057 EU Projects: European Commission(XE) ASCENS 257414 Grant - others:GA AV ČR(CZ) GAP103/11/1489 Program:FP7 Institutional research plan: CEZ:AV0Z10300504 Keywords : code generation * domain specific languages * models reuse * extensible languages * specification * program synthesis Subject RIV: JC - Computer Hardware ; Software Impact factor: 1.148, year: 2013

  16. A Modified LS+AR Model to Improve the Accuracy of the Short-term Polar Motion Prediction

    Science.gov (United States)

    Wang, Z. W.; Wang, Q. X.; Ding, Y. Q.; Zhang, J. J.; Liu, S. S.

    2017-03-01

    There are two problems with the LS (Least Squares)+AR (AutoRegressive) model in polar motion forecasting: the residuals of the LS fit are reasonable within the fitting interval, but the residuals of the LS extrapolation are poor; and the LS fitting residual sequence is non-linear, so it is unsuitable to establish an AR model for the residual sequence to be forecasted based only on the residual sequence before the forecast epoch. In this paper, we address these two problems in two steps. First, restrictions are added to the two endpoints of the LS fitting data to fix them on the LS fitting curve, so that the fitted values next to the two endpoints are very close to the observed values. Secondly, we select the interpolation residual sequence of an inward LS fitting curve, which has a variation trend similar to that of the LS extrapolation residual sequence, as the modeling object of the AR residual forecast. Calculation examples show that this solution can effectively improve the short-term polar motion prediction accuracy of the LS+AR model. In addition, the comparison with the RLS (Robustified Least Squares)+AR, RLS+ARIMA (AutoRegressive Integrated Moving Average), and LS+ANN (Artificial Neural Network) forecast models confirms the feasibility and effectiveness of the solution for polar motion forecasting. The results, especially for the polar motion forecast at 1-10 days, show that the forecast accuracy of the proposed model can reach the world level.
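
    By way of illustration only (the endpoint constraints and residual-selection scheme of the paper are not reproduced), the sketch below shows the basic LS+AR structure: a least-squares fit of a linear trend plus annual and Chandler periodic terms to a synthetic pole-coordinate series, followed by an AR model fitted to the LS residuals and used to extend a short-term forecast.

      # Sketch: basic LS+AR polar-motion forecast (trend + annual + Chandler terms,
      # then an AR model on the LS residuals).  Synthetic data; the paper's
      # modifications are not reproduced.
      import numpy as np

      P_AN, P_CH = 365.25, 432.0          # annual and Chandler periods (days)

      def design(t):
          w1, w2 = 2 * np.pi / P_AN, 2 * np.pi / P_CH
          return np.column_stack([np.ones_like(t), t,
                                  np.cos(w1 * t), np.sin(w1 * t),
                                  np.cos(w2 * t), np.sin(w2 * t)])

      rng = np.random.default_rng(1)
      t = np.arange(3000.0)
      x = 0.05 + 1e-5 * t + 0.1 * np.sin(2 * np.pi * t / P_AN) \
          + 0.15 * np.cos(2 * np.pi * t / P_CH) + rng.normal(0, 0.005, t.size)

      coef, *_ = np.linalg.lstsq(design(t), x, rcond=None)
      resid = x - design(t) @ coef

      # fit an AR(p) model to the residuals by ordinary least squares
      p = 20
      Z = np.column_stack([resid[p - k - 1:-k - 1] for k in range(p)])
      phi, *_ = np.linalg.lstsq(Z, resid[p:], rcond=None)

      # forecast 10 days ahead: LS extrapolation + AR-propagated residuals
      h = 10
      r = list(resid[-p:])
      for _ in range(h):
          r.append(np.dot(phi, r[-1:-p - 1:-1]))
      t_f = np.arange(t[-1] + 1, t[-1] + 1 + h)
      forecast = design(t_f) @ coef + np.array(r[p:])
      print(forecast)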

  17. Diagnosis of Elevator Faults with LS-SVM Based on Optimization by K-CV

    Directory of Open Access Journals (Sweden)

    Zhou Wan

    2015-01-01

    Full Text Available Several common elevator malfunctions were diagnosed with a least squares support vector machine (LS-SVM). After acquiring vibration signals for various elevator conditions, their energy characteristics and time-domain indicators were extracted by optimal wavelet packet analysis, in order to construct a malfunction feature vector, used as the input of the LS-SVM for identifying the causes of the malfunctions. Meanwhile, the parameters of the LS-SVM were optimized by K-fold cross validation (K-CV). After diagnosing a deviated elevator guide rail, a deviated guide shoe, abnormal running of the tractor, an erroneous rope groove of the traction sheave, a deviated guide wheel, and the tension of the wire rope, the results suggested that the LS-SVM based on K-CV optimization is an effective method for diagnosing elevator malfunctions.

  18. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    Directory of Open Access Journals (Sweden)

    Fuqiang Sun

    2017-06-01

    Full Text Available Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM and a genetic algorithm (GA. Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.
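
    The paper's GA settings and fitness evaluation are not given in the record; the sketch below shows a generic real-coded GA of the kind used to tune the two LS-SVM hyper-parameters (log10 of the regularization and kernel-width parameters). The fitness function here is a stand-in surface; in the study it would be the LS-SVM validation error on the three Lamb-wave features.

      # Sketch: a small real-coded GA for tuning two LS-SVM hyper-parameters
      # (log10 gamma, log10 sigma2).  The fitness used here is a stand-in surface;
      # in the paper it would be the LS-SVM validation error on the Lamb-wave features.
      import numpy as np

      rng = np.random.default_rng(2)
      LO, HI = np.array([-2.0, -2.0]), np.array([4.0, 2.0])   # search bounds (log10)

      def fitness(p):
          # placeholder objective: pretend the best setting is gamma=1e2, sigma2=1e0
          return float(np.sum((p - np.array([2.0, 0.0])) ** 2))

      def ga(pop_size=20, gens=40, pc=0.8, pm=0.2):
          pop = rng.uniform(LO, HI, (pop_size, 2))
          for _ in range(gens):
              fit = np.array([fitness(p) for p in pop])
              parents = pop[np.argsort(fit)[: pop_size // 2]]      # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  i, j = rng.integers(len(parents), size=2)
                  a = rng.uniform() if rng.uniform() < pc else 0.0
                  child = a * parents[i] + (1 - a) * parents[j]    # arithmetic crossover
                  if rng.uniform() < pm:
                      child = child + rng.normal(0, 0.3, 2)        # Gaussian mutation
                  children.append(np.clip(child, LO, HI))
              pop = np.vstack([parents, children])
          best = pop[np.argmin([fitness(p) for p in pop])]
          return 10.0 ** best                                      # (gamma, sigma2)

      print("selected (gamma, sigma2):", ga())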

  19. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    Science.gov (United States)

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.

  20. Non-stop training during LS1!

    CERN Multimedia

    HSE Unit

    2013-01-01

    The year 2013 is a busy year for the Safety Training team, who are seeing a dramatic increase in their activities during LS1. The Safety Training Service within the HSE Unit offers training courses all year round to people working on the CERN site who are exposed to a variety of potential hazards (e.g. chemical hazards, fire hazards, etc.) either because of the activities they perform (e.g. work in confined spaces or on machines) and/or their place of work (e.g. workshops, laboratories, underground areas, etc.).   LS1 has triggered an increase in the number of requests for training, mainly from people required to carry out work on the LHC. Indeed, in order to access the underground areas, it is obligatory to have taken certain safety courses such as the self-rescue mask or radiation protection training courses. Consequently, the number of training sessions and the number of people trained are currently twice what they were during the same period in 2012, with almost 4,600 people trained in 530 s...

  1. Phase-coded microwave signal generation based on a single electro-optical modulator and its application in accurate distance measurement.

    Science.gov (United States)

    Zhang, Fangzheng; Ge, Xiaozhong; Gao, Bindong; Pan, Shilong

    2015-08-24

    A novel scheme for photonic generation of a phase-coded microwave signal is proposed and its application in one-dimension distance measurement is demonstrated. The proposed signal generator has a simple and compact structure based on a single dual-polarization modulator. Besides, the generated phase-coded signal is stable and free from the DC and low-frequency backgrounds. An experiment is carried out. A 2 Gb/s phase-coded signal at 20 GHz is successfully generated, and the recovered phase information agrees well with the input 13-bit Barker code. To further investigate the performance of the proposed signal generator, its application in one-dimension distance measurement is demonstrated. The measurement accuracy is less than 1.7 centimeters within a measurement range of ~2 meters. The experimental results can verify the feasibility of the proposed phase-coded microwave signal generator and also provide strong evidence to support its practical applications.
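
    As a purely numerical illustration of how such a signal supports ranging (the photonic generation itself is not captured here), the sketch below phase-codes a reference with the 13-bit Barker sequence, delays it to emulate a round trip, and recovers the delay, and hence the distance, from the peak of the correlation with the reference; the sample rate and test distance are arbitrary choices, not values from the experiment.

      # Sketch: ranging with a 13-bit-Barker phase-coded signal.  The sequence and
      # the matched-filter idea follow the record; rates and delays are arbitrary.
      import numpy as np

      barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

      fs = 20e9                      # sample rate (arbitrary for the sketch)
      chip_rate = 2e9                # 2 Gb/s coding rate, as in the record
      sps = int(fs / chip_rate)      # samples per chip
      ref = np.repeat(barker13, sps) # baseband reference (phase 0 / pi -> +1 / -1)

      c = 3e8
      true_distance = 1.87           # metres (one-way), arbitrary test value
      delay_samples = int(round(2 * true_distance / c * fs))

      rx = np.concatenate([np.zeros(delay_samples), ref])    # delayed echo
      corr = np.correlate(rx, ref, mode="full")
      lag = np.argmax(corr) - (len(ref) - 1)                  # delay in samples
      estimated = lag / fs * c / 2
      print(f"estimated distance: {estimated:.3f} m")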

  2. Linda Leen's self-image, her image in the print media and among fan publics

    OpenAIRE

    Komarovskis, Jānis

    2011-01-01

    The topic of this bachelor's thesis is "Linda Leen's self-image, her image in the print media and among fan publics". The aim of the thesis is to study Linda Leen's self-image, her image in the print media and among fan publics. The theoretical part of the thesis is based on Klaus Merten's theory of image formation, which holds that the formation of an image in the audience's perception is closely linked to the information about the given object published in the media. Four research methods were used in the study: the semi-structured interview, media content analysis...

  3. SLACINPT - a FORTRAN program that generates boundary data for the SLAC gun code

    International Nuclear Information System (INIS)

    Michel, W.L.; Hepburn, J.D.

    1982-03-01

    The FORTRAN program SLACINPT was written to simplify the preparation of boundary data for the SLAC gun code. In SLACINPT, the boundary is described by a sequence of straight line or arc segments. From these, the program generates the individual boundary mesh point data, required as input by the SLAC gun code
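
    The record names only the two segment types; a minimal sketch of the idea is shown below, with a hypothetical segment description format (not SLACINPT's actual input syntax), expanding each straight-line or arc segment into individual boundary mesh points.

      # Sketch: expanding straight-line and arc segments into boundary mesh points,
      # in the spirit of SLACINPT.  The segment description format is hypothetical.
      import numpy as np

      def expand(segment):
          if segment["type"] == "line":
              (r0, z0), (r1, z1), n = segment["start"], segment["end"], segment["npts"]
              t = np.linspace(0.0, 1.0, n)
              return np.column_stack([r0 + t * (r1 - r0), z0 + t * (z1 - z0)])
          if segment["type"] == "arc":
              (rc, zc), rad = segment["center"], segment["radius"]
              a = np.linspace(segment["a0"], segment["a1"], segment["npts"])
              return np.column_stack([rc + rad * np.cos(a), zc + rad * np.sin(a)])
          raise ValueError("unknown segment type")

      boundary = [
          {"type": "line", "start": (0.0, 0.0), "end": (2.0, 0.0), "npts": 5},
          {"type": "arc", "center": (2.0, 1.0), "radius": 1.0,
           "a0": -np.pi / 2, "a1": 0.0, "npts": 5},
      ]
      points = np.vstack([expand(s) for s in boundary])
      print(points)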

  4. Comparison of central corneal thickness measured by Lenstar LS900, Orbscan II and ultrasonic pachymetry

    Directory of Open Access Journals (Sweden)

    Hong-Tao Zhang

    2013-09-01

    Full Text Available AIM: To investigate the difference of central corneal thickness(CCTmeasured by Lenstar LS900, OrbscahⅡ system and ultrasonic pachmetry, and to evaluate the correlation and consistency of the results for providing a theoretical basis for clinical application.METHODS: The mean value of CCT in 70 eyes of 35 patients measured three times by Lenstar LS900, OrbscahⅡ system and ultrasonic pachmetry underwent statistical analysis. The difference of CCT was compared, and the correlation and consistency of three measurements were analyzed to provide theoretical basis for clinical application. CCT values measured by different methods were analyzed with randomized block variance analysis. LSD-t test was used for pairwise comparison between groups. The correlation of three measurement methods were analyzed by linear correlation analysis, and Bland-Altman was used to analyze the consistency.RESULTS: The mean CCT values measured by Lenstar LS900, OrbscanⅡ and ultrasonic pachmetry were 542.75±40.06, 528.74±39.59, 538.54±40.93μm, respectively. The mean difference of CCT measurement was 4.21±8.78μm between Lenstar LS900 and ultrasonic pachmetry, 14.01±13.39μm between Lenstar LS900 and Orbscan Ⅱ, 9.8±10.57μm between ultrasonic pachmetry and Orbscan Ⅱ. The difference was statistically significant(PP>0.05: There was positive correlation between CCT with Lenstar LS900 and ultrasonic pachmetry(r=0.977, 0.944; PCONCLUSION: There are excellent correlation among Lenstar LS900, Orbscan Ⅱ and ultrasonic pachmetry. Lenstar LS900 can be used as CCT non-contact measurement tool.

  5. Consistency and accuracy of diagnostic cancer codes generated by automated registration: comparison with manual registration

    Directory of Open Access Journals (Sweden)

    Codazzi Tiziana

    2006-09-01

    Full Text Available Background: Automated procedures are increasingly used in cancer registration, and it is important that the data produced are systematically checked for consistency and accuracy. We evaluated an automated procedure for cancer registration adopted by the Lombardy Cancer Registry in 1997, comparing automatically-generated diagnostic codes with those produced manually over one year (1997). Methods: The automatically generated cancer cases were produced by Open Registry algorithms. For manual registration, trained staff consulted clinical records, pathology reports and death certificates. The social security code, present and checked in both databases in all cases, was used to match the files in the automatic and manual databases. The cancer cases generated by the two methods were compared by manual revision. Results: The automated procedure generated 5027 cases: 2959 (59%) were accepted automatically and 2068 (41%) were flagged for manual checking. Among the cases accepted automatically, discrepancies in data items (surname, first name, sex and date of birth) constituted 8.5% of cases, and discrepancies in the first three digits of the ICD-9 code constituted 1.6%. Among flagged cases, cancers of the female genital tract, hematopoietic system, metastatic and ill-defined sites, and oropharynx predominated. The usual reasons were use of specific vs. generic codes, presence of multiple primaries, and use of extranodal vs. nodal codes for lymphomas. The percentage of automatically accepted cases ranged from 83% for breast and thyroid cancers to 13% for metastatic and ill-defined cancer sites. Conclusion: Since 59% of cases were accepted automatically and contained relatively few, mostly trivial discrepancies, the automatic procedure is efficient for routine case generation, effectively cutting the workload required for routine case checking by this amount. Among cases not accepted automatically, discrepancies were mainly due to variations in coding practice.

  6. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  7. Development of a 1D thermal-hydraulic analysis code for once-through steam generator in SMRs using straight tubes

    Energy Technology Data Exchange (ETDEWEB)

    Park, Youngjae; Kim, Iljin; Kim, Hyungdae [Kyung Hee University, Yongin (Korea, Republic of)

    2015-10-15

    Diverse integral/small modular reactors (SMRs) have been developed. The once-through steam generator (OTSG), which generates superheated steam without a steam separator and dryer, is used in SMRs to reduce the volume of the steam generator. It would be possible to design a new steam generator with best-estimate thermal-hydraulic codes such as RELAP and MARS. However, it is not convenient to use a general-purpose thermal-hydraulic analysis code to design a specific component of a nuclear power plant. A widely used simulation tool for thermal-hydraulic analysis of drum-type steam generators is ATHOS, which allows 3D analysis. On the other hand, a simple 1D thermal-hydraulic analysis code might be accurate enough for the conceptual design of an OTSG. In this study, a thermal-hydraulic analysis code for the conceptual design of OTSGs was developed using a 1D homogeneous equilibrium model (HEM). A benchmark calculation was also conducted to verify and validate the prediction accuracy of the developed code by comparing the analysis results with MARS. Finally, the conceptual design of an OTSG was carried out with the developed code. In summary, a simple 1D thermal-hydraulic analysis code was developed for the conceptual design of OTSGs for SMRs; a set of benchmark calculations was conducted to verify and validate its accuracy against a best-estimate thermal-hydraulic analysis code, MARS; and the analysis of two different OTSG design concepts, with superheating and with recirculation, was demonstrated using the developed code.
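
    The record does not include the model equations; the toy sketch below only illustrates the kind of 1D nodal energy balance such a code performs on the secondary side of a once-through tube, marching along the tube with a prescribed primary-side temperature and an assumed overall heat-transfer coefficient. All numbers are placeholders and the HEM momentum equation is omitted.

      # Toy sketch of a 1D nodal secondary-side energy balance along a once-through
      # steam-generator tube.  Properties, geometry and the primary temperature
      # profile are placeholders; the HEM momentum equation is omitted.
      import numpy as np

      L, N = 10.0, 100                 # heated length [m], number of nodes
      dz = L / N
      D = 0.012                        # tube inner diameter [m]
      U = 4000.0                       # overall heat-transfer coefficient [W/m2K]
      mdot = 0.05                      # secondary mass flow per tube [kg/s]

      h_in = 1.0e6                     # inlet enthalpy [J/kg]
      h_f, h_g = 1.3e6, 2.7e6          # saturation enthalpies at secondary pressure [J/kg]
      cp_liq, cp_vap = 5.5e3, 3.5e3    # rough specific heats [J/kgK]
      T_sat = 300.0                    # secondary saturation temperature [C]
      T_prim = np.linspace(330.0, 310.0, N)   # prescribed primary-side temperature [C]

      def secondary_T(h):
          # crude temperature from enthalpy: subcooled / two-phase / superheated
          if h < h_f:
              return T_sat - (h_f - h) / cp_liq
          if h <= h_g:
              return T_sat
          return T_sat + (h - h_g) / cp_vap

      h = h_in
      for i in range(N):
          q = U * np.pi * D * dz * (T_prim[i] - secondary_T(h))   # node heat input [W]
          h += q / mdot
      quality = (h - h_f) / (h_g - h_f)
      print(f"outlet enthalpy {h/1e6:.2f} MJ/kg, equilibrium quality {quality:.2f}")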

  8. [Measurement of soil organic matter and available K based on SPA-LS-SVM].

    Science.gov (United States)

    Zhang, Hai-Liang; Liu, Xue-Mei; He, Yong

    2014-05-01

    Visible and short-wave near-infrared spectroscopy (Vis/SW-NIRS) was investigated in the present study for measurement of soil organic matter (OM) and available potassium (K). Four types of pretreatments, including smoothing, SNV, MSC and SG smoothing + first derivative, were adopted to eliminate system noise and external disturbances. Then partial least squares regression (PLSR) and least squares-support vector machine (LS-SVM) models were implemented as calibration models. The LS-SVM model was built using characteristic wavelengths selected by the successive projections algorithm (SPA). Simultaneously, the performance of the LS-SVM models was compared with that of the PLSR models. The results indicated that LS-SVM models using SPA-selected characteristic wavelengths as inputs outperformed the PLSR models. The optimal SPA-LS-SVM models were achieved, with correlation coefficient (r) and RMSEP of 0.8602 and 2.98 for OM and 0.7305 and 15.78 for K, respectively. The results indicated that visible and short-wave near-infrared spectroscopy (Vis/SW-NIRS) (325-1075 nm) combined with LS-SVM based on SPA could be utilized as a precision method for the determination of soil properties.
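
    For readers unfamiliar with LS-SVM regression, the sketch below shows its core computation: with an RBF kernel, training reduces to solving one linear system for the bias b and the support values alpha. This is the standard LS-SVM dual formulation written in Python/numpy, not code from the paper; the toy data, the kernel width sigma and the regularization gamma are arbitrary stand-ins.

        import numpy as np

        def rbf_kernel(X1, X2, sigma):
            d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2.0 * sigma ** 2))

        def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
            # LS-SVM regression in the dual: solve
            # [ 0   1^T         ] [b    ]   [0]
            # [ 1   K + I/gamma ] [alpha] = [y]
            n = len(y)
            K = rbf_kernel(X, X, sigma)
            A = np.zeros((n + 1, n + 1))
            A[0, 1:] = 1.0
            A[1:, 0] = 1.0
            A[1:, 1:] = K + np.eye(n) / gamma
            rhs = np.concatenate(([0.0], y))
            sol = np.linalg.solve(A, rhs)
            return sol[0], sol[1:]                      # bias, support values

        def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
            return rbf_kernel(X_new, X_train, sigma) @ alpha + b

        # Toy data standing in for samples described by SPA-selected wavelengths.
        X = np.random.rand(30, 5)
        y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.05 * np.random.randn(30)
        b, alpha = lssvm_fit(X, y)
        print(lssvm_predict(X, b, alpha, X[:3]))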

  9. TRISO fuel thermal simulations in the LS-VHTR

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Mario C.; Scari, Maria E.; Costa, Antonella L.; Pereira, Claubia; Veloso, Maria A.F., E-mail: marc5663@gmail.com, E-mail: melizabethscari@yahoo.com, E-mail: antonella@nuclear.ufmg.br, E-mail: claubia@nuclear.ufmg.br, E-mail: dora@nuclear.ufmg.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear; Instituto Nacional de Ciência e Tecnologia de Reatores Nucleares Inovadores/CNPq (Brazil)

    2017-07-01

    The liquid-salt-cooled very high-temperature reactor (LS-VHTR) is a reactor that presents very good characteristics in terms of energy production and safety. Its fuel consists of TRISO particles immersed in a graphite matrix with a cylindrical shape, called the fuel compact; the moderator is graphite and the coolant is the liquid salt Li{sub 2}BeF{sub 4}, known as Flibe. This work evaluates the thermal-hydraulic performance of the heat removal system and the reactor core by applying different simplifications to represent the reactor core and the fuel compact under steady-state conditions, starting the modeling from a single fuel element and completing the studies with an entire core model developed in the RELAP5-3D code. Two models were considered for representation of the fuel compact, a homogeneous and a non-homogeneous model, and different geometries of the heat structures were considered as well. The aim of developing several models was to compare the thermal-hydraulic characteristics of a more economical, less discretized model with those of much more refined models, which lead to more complex analyses representing the effect of the TRISO particles in the fuel compact. The different results found, mainly for the core temperature distributions, are presented and discussed. (author)

  10. GIS-based Analysis of LS Factor under Coal Mining Subsidence Impacts in Sandy Region

    Directory of Open Access Journals (Sweden)

    W. Xiao

    2014-09-01

    Full Text Available Coal deposits in the adjacent regions of Shanxi, Shaanxi, and Inner Mongolia provinces (SSI) account for approximately two-thirds of the coal in China; therefore, the SSI region has become the frontier of coal mining and its westward movement. Numerous adverse impacts on land and environment have arisen in these sandy, arid, and ecologically fragile areas. Underground coal mining activities cause land to subside and subsequent soil erosion, with slope length and slope steepness (LS) as the key influential factor. In this investigation, an SSI mining site was chosen as a case study area, and 1) the pre-mining LS factor was obtained using a digital elevation model (DEM) dataset; 2) a mining subsidence prediction was implemented with revised subsidence prediction factors; and 3) the post-mining LS factor was calculated by integrating the pre-mining DEM dataset and the coal mining subsidence prediction data. The results revealed that the LS factor changes somewhat at the bottom of the subsidence basin and considerably at the basin's edges. Moreover, the LS factor became larger in steeper terrain under subsidence impacts. This integrated method can quantitatively analyse LS changes and their spatial distribution under mining impacts, which will benefit and provide references for soil erosion evaluations in this region.
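
    The post-mining step described above can be sketched as a simple raster operation: subtract the predicted subsidence from the pre-mining DEM and recompute slope (and, from it, the LS factor) on the new surface. The snippet below is a schematic Python/numpy illustration with hypothetical grids and file names, not the workflow of the cited study.

        import numpy as np

        cell = 30.0                                  # raster cell size in metres (assumed)
        pre_dem = np.loadtxt("pre_mining_dem.txt")   # hypothetical pre-mining elevations [m]
        subsidence = np.loadtxt("subsidence.txt")    # hypothetical predicted subsidence [m]

        # Post-mining surface: pre-mining elevation lowered by the predicted subsidence.
        post_dem = pre_dem - subsidence

        def slope_percent(dem, cell_size):
            # Slope from finite differences of the elevation grid.
            dz_dy, dz_dx = np.gradient(dem, cell_size)
            return 100.0 * np.sqrt(dz_dx ** 2 + dz_dy ** 2)

        pre_slope = slope_percent(pre_dem, cell)
        post_slope = slope_percent(post_dem, cell)
        print("mean slope change [%]:", float(np.mean(post_slope - pre_slope)))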

  11. Computer codes for simulation of Angra 1 reactor steam generator

    International Nuclear Information System (INIS)

    Pinto, A.C.

    1978-01-01

    A digital computer code is developed for the simulation of the steady-state operation of a U-tube steam generator with natural recirculation used in pressurized water reactors. The steam generator is simulated as two flow channels separated by a metallic wall, with a preheating section in counter flow and a vaporizing section in parallel flow. The program permits changes in flow patterns and heat transfer correlations in accordance with the local conditions along the vaporizing section. Various subroutines are developed for the determination of steam and water properties, and a mathematical model is established for the simulation of transients in the same steam generator. The steady-state operating conditions in one of the steam generators of the ANGRA 1 reactor are determined using this program. The global results obtained agree with published values.

  12. ANNarchy: a code generation approach to neural simulations on parallel hardware

    Science.gov (United States)

    Vitay, Julien; Dinkelbach, Helge Ü.; Hamker, Fred H.

    2015-01-01

    Many modern neural simulators focus on the simulation of networks of spiking neurons on parallel hardware. Another important framework in computational neuroscience, rate-coded neural networks, is mostly difficult or impossible to implement using these simulators. We present here the ANNarchy (Artificial Neural Networks architect) neural simulator, which allows one to easily define and simulate rate-coded and spiking networks, as well as combinations of both. The interface in Python has been designed to be close to the PyNN interface, while the definition of neuron and synapse models can be specified using an equation-oriented mathematical description similar to the Brian neural simulator. This information is used to generate C++ code that will efficiently perform the simulation on the chosen parallel hardware (multi-core system or graphical processing unit). Several numerical methods are available to transform ordinary differential equations into efficient C++ code. We compare the parallel performance of the simulator to existing solutions. PMID:26283957
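
    The equation-to-code idea can be illustrated independently of ANNarchy's actual API (which is not reproduced here): take a textual ODE for a rate-coded unit and emit a C++ update loop for it. The sketch below is a deliberately naive Python generator operating on a hypothetical equation string, meant only to show the flavour of this kind of code generation.

        # Naive illustration of equation-oriented code generation (not ANNarchy itself).
        equation = "tau * dr/dt = -r + sum(exc)"      # hypothetical rate-coded neuron ODE

        def generate_cpp(eq, dt=1.0):
            # Very simple pattern: "tau * dr/dt = RHS" becomes an explicit Euler update of r.
            lhs, rhs = [s.strip() for s in eq.split("=", 1)]
            rhs_cpp = rhs.replace("sum(exc)", "weighted_sum_exc[i]").replace("r", "r[i]")
            return (
                "for (int i = 0; i < N; ++i) {\n"
                f"    double drdt = ({rhs_cpp}) / tau;\n"
                f"    r[i] += {dt} * drdt;\n"
                "}\n"
            )

        print(generate_cpp(equation))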

  13. Multi-scale Material Parameter Identification Using LS-DYNA® and LS-OPT®

    Energy Technology Data Exchange (ETDEWEB)

    Stander, Nielen; Basudhar, Anirban; Basu, Ushnish; Gandikota, Imtiaz; Savic, Vesna; Sun, Xin; Choi, Kyoo Sil; Hu, Xiaohua; Pourboghrat, F.; Park, Taejoon; Mapar, Aboozar; Kumar, Shavan; Ghassemi-Armaki, Hassan; Abu-Farha, Fadi

    2015-09-14

    Ever-tightening regulations on fuel economy, and the likely future regulation of carbon emissions, demand persistent innovation in vehicle design to reduce vehicle mass. Classical methods for computational mass reduction include sizing, shape and topology optimization. One of the few remaining options for weight reduction can be found in materials engineering and material design optimization. Apart from considering different types of materials, by adding material diversity and composite materials, an appealing option in automotive design is to engineer steel alloys for the purpose of reducing plate thickness while retaining sufficient strength and ductility required for durability and safety. A project to develop computational material models for advanced high strength steel is currently being executed under the auspices of the United States Automotive Materials Partnership (USAMP) funded by the US Department of Energy. Under this program, new Third Generation Advanced High Strength Steels (i.e., 3GAHSS) are being designed, tested and integrated with the remaining design variables of a benchmark vehicle Finite Element model. The objectives of the project are to integrate atomistic, microstructural, forming and performance models to create an integrated computational materials engineering (ICME) toolkit for 3GAHSS. The mechanical properties of Advanced High Strength Steels (AHSS) are controlled by many factors, including phase composition and distribution in the overall microstructure, volume fraction, size and morphology of phase constituents as well as stability of the metastable retained austenite phase. The complex phase transformation and deformation mechanisms in these steels make the well-established traditional techniques obsolete, and a multi-scale microstructure-based modeling approach following the ICME strategy was therefore chosen in this project. Multi-scale modeling as a major area of research and development is an outgrowth of the Comprehensive

  14. Development of Multi-Scale Finite Element Analysis Codes for High Formability Sheet Metal Generation

    International Nuclear Information System (INIS)

    Nakamachi, Eiji; Kuramae, Hiroyuki; Ngoc Tam, Nguyen; Nakamura, Yasunori; Sakamoto, Hidetoshi; Morimoto, Hideo

    2007-01-01

    In this study, the dynamic- and static-explicit multi-scale finite element (F.E.) codes are developed by employing the homogenization method, the crystal plasticity constitutive equation and an SEM-EBSD-measurement-based polycrystal model. These can predict the crystal morphological change and the hardening evolution at the micro level, and the evolution of macroscopic plastic anisotropy. The codes are applied to analyze the asymmetric rolling process, which is introduced to control the crystal texture of the sheet metal in order to generate a high-formability sheet metal. They can predict the yield surface and the sheet formability by analyzing the strain-path-dependent yield and simple sheet forming processes, such as the limit dome height test and cylindrical deep drawing problems. It is shown that a shear-dominant rolling process, such as asymmetric rolling, generates 'high formability' textures and eventually a high-formability sheet. The texture evolution and the high formability of the newly generated sheet metal were confirmed experimentally by SEM-EBSD measurement and the LDH test. It is concluded that these explicit-type crystallographic homogenized multi-scale F.E. codes could be a comprehensive tool to predict the plasticity-induced texture evolution, anisotropy and formability through analyses of the rolling process and the limit dome height test.

  15. DOG -II input generator program for DOT3.5 code

    International Nuclear Information System (INIS)

    Hayashi, Katsumi; Handa, Hiroyuki; Yamada, Koubun; Kamogawa, Susumu; Takatsu, Hideyuki; Koizumi, Kouichi; Seki, Yasushi

    1992-01-01

    DOT3.5 is widely used for radiation transport analysis of fission reactors, fusion experimental facilities and particle accelerators. We developed an input generator program for the DOT3.5 code with the aim of preparing input data efficiently. The former program, DOG, was developed and used internally at Hitachi Engineering Company. In this new version, DOG-II, the limitation regarding R-Θ geometry was removed. All input data are created interactively in front of a color display, without using the DOT3.5 manual. The geometry-related input is also easily created without calculation of precise curved mesh points. By using DOG-II, reliable input data for the DOT3.5 code are obtained easily and quickly.

  16. Characteristics of CdLS (Cornelia de Lange Syndrome)

    Science.gov (United States)

    ... 25 percent of individuals with CdLS. Behavioral and communication issues and developmental delays often exist. Major Characteristics ...

  17. Code Generation by Model Transformation : A Case Study in Transformation Modularity

    NARCIS (Netherlands)

    Hemel, Z.; Kats, L.C.L.; Visser, E.

    2008-01-01

    Preprint of paper published in: Theory and Practice of Model Transformations (ICMT 2008), Lecture Notes in Computer Science 5063; doi:10.1007/978-3-540-69927-9_13 The realization of model-driven software development requires effective techniques for implementing code generators for domain-specific languages.

  18. Calculating the LS factor of Universal Soil Loss Equation (USLE) for the watershed of River Silver, Castelo-ES = Cálculo do fator LS da Equação Universal de Perdas de Solos (EUPS) para a bacia do Rio da Prata, Castelo-ES

    Directory of Open Access Journals (Sweden)

    Luciano Melo Coutinho

    2014-01-01

    Full Text Available Erosion is considered the main cause of the depletion of agricultural land, generating annual losses of approximately billions of dollars in Brazil. Water erosion is the most common form, caused by the effective precipitation over the basin, whose erosive potential makes it the main agent reshaping the terrain. The Universal Soil Loss Equation (USLE) shows great applicability for estimating erosion in watersheds from their physical and geographical elements (PS = R*K*L*S*C*P). The intensity of erosion can be influenced by the profile of the slope, measured by the slope length (L) and slope steepness (S). The topographic factor (LS) of the USLE is the most difficult to obtain for large and/or diverse relief areas. We calculated the spatial map of the LS factor for the Silver watershed (Castelo-ES) by processing cartographic data in a Geographic Information System (GIS) environment. The relief of the study area was represented by interpolation of contour lines supported by the mapped hydrography, yielding a hydrologically consistent digital elevation model (MDEHC) and a slope map. For generation of the LS factor, the equation developed by Bertoni and Lombardi Neto (2005), suitable for slopes of different lengths and steepnesses, was used. The Silver watershed has diversified relief, marked by flat areas and areas of steep slope, which indicates erosive vulnerability. The minimum (0), mean (8.2) and maximum (80.5) values of LS were identified.
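
    For illustration, the Bertoni and Lombardi Neto relation for the topographic factor is commonly written as LS = 0.00984 · L^0.63 · S^1.18, with L the slope length in metres and S the slope steepness in percent. The snippet below evaluates it on hypothetical grids; it is a sketch of that published relation as commonly cited, not code or data from the study.

        import numpy as np

        def ls_bertoni_lombardi(slope_length_m, slope_percent):
            # LS = 0.00984 * L^0.63 * S^1.18 (Bertoni & Lombardi Neto relation, as commonly cited).
            return 0.00984 * slope_length_m ** 0.63 * slope_percent ** 1.18

        # Hypothetical per-cell slope length [m] and slope steepness [%] rasters.
        L = np.array([[20.0, 50.0], [100.0, 150.0]])
        S = np.array([[2.0, 8.0], [15.0, 45.0]])
        print(ls_bertoni_lombardi(L, S))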

  19. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    Energy Technology Data Exchange (ETDEWEB)

    Arndt, S.A.

    1997-07-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to have included in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities.

  20. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    International Nuclear Information System (INIS)

    Arndt, S.A.

    1997-01-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to have included in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities

  1. Summary of sessions 5 and 6: long shutdown 1 (LS1) 2013-2014

    International Nuclear Information System (INIS)

    Bordry, F.; Foraz, K.

    2012-01-01

    The minimal duration of LS1 is 20 months, meaning the time from physics to physics will be about 2 years. The actual start of LS1 is set for 17 November 2012, which will allow the liquid helium to be emptied from the machine before Christmas. However, delivery dates for certain components are on the critical path of the experiments, so the first beams for beam commissioning cannot be expected before September 2014. Depending on the results of the mid-2012 physics, the start date of LS1 will be reviewed. The current plan for the injectors is in line with the LHC plan, but the risk of running the injectors for two years has to be assessed. The analysis of resources is progressing well throughout the complex (collaborations and internal mobility) and is being done according to priorities. Certain activities have already been postponed to LS2, and new requests will be carefully analyzed, as well as open issues.

  2. Mr.CAS-A minimalistic (pure) Ruby CAS for fast prototyping and code generation

    Science.gov (United States)

    Ragni, Matteo

    There are Computer Algebra System (CAS) systems on the market with complete solutions for the manipulation of analytical models. But exporting a model that implements specific algorithms on specific platforms, for target languages or for a particular numerical library, is often a rigid procedure that requires manual post-processing. This work presents a Ruby library that exposes core CAS capabilities, i.e. simplification, substitution, evaluation, etc. The library aims at programmers who need to rapidly prototype and generate numerical code for different target languages, while keeping the mathematical expressions separated from the code generation rules, where best practices for numerical conditioning are implemented. The library is written in pure Ruby and is compatible with most Ruby interpreters.

  3. Mr.CAS—A minimalistic (pure) Ruby CAS for fast prototyping and code generation

    Directory of Open Access Journals (Sweden)

    Matteo Ragni

    2017-01-01

    Full Text Available There are Computer Algebra System (CAS) systems on the market with complete solutions for the manipulation of analytical models. But exporting a model that implements specific algorithms on specific platforms, for target languages or for a particular numerical library, is often a rigid procedure that requires manual post-processing. This work presents a Ruby library that exposes core CAS capabilities, i.e. simplification, substitution, evaluation, etc. The library aims at programmers who need to rapidly prototype and generate numerical code for different target languages, while keeping the mathematical expressions separated from the code generation rules, where best practices for numerical conditioning are implemented. The library is written in pure Ruby and is compatible with most Ruby interpreters.

  4. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    Science.gov (United States)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is

  5. Analysis of steam generator loss-of-feedwater experiments with APROS and RELAP5/MOD3.1 computer codes

    Energy Technology Data Exchange (ETDEWEB)

    Virtanen, E.; Haapalehto, T. [Lappeenranta Univ. of Technology, Lappeenranta (Finland); Kouhia, J. [VTT Energy, Nuclear Energy, Lappeenranta (Finland)

    1995-09-01

    Three experiments were conducted to study the behavior of the new horizontal steam generator construction of the PACTEL test facility. In the experiments the secondary-side coolant level was reduced stepwise. The experiments were calculated with two computer codes, RELAP5/MOD3.1 and APROS version 2.11. A similar nodalization scheme was used for both codes so that the results may be compared. Only the steam generator was modelled and the rest of the facility was given as a boundary condition. The results show that both codes calculate well the behaviour of the primary side of the steam generator. On the secondary side both codes calculate lower steam temperatures in the upper part of the heat exchange tube bundle than were measured in the experiments.

  6. RELAP5/MOD2 code modifications to obtain better predictions for the once-through steam generator

    International Nuclear Information System (INIS)

    Blanchat, T.; Hassan, Y.

    1989-01-01

    The steam generator is a major component in pressurized water reactors. Predicting the response of a steam generator during both steady-state and transient conditions is essential in studying the thermal-hydraulic behavior of a nuclear reactor coolant system. Therefore, many analytical and experimental efforts have been performed to investigate the thermal-hydraulic behavior of the steam generators during operational and accident transients. The objective of this study is to predict the behavior of the secondary side of the once-through steam generator (OTSG) using the RELAP5/MOD2 computer code. Steady-state conditions were predicted with the current version of the RELAP5/MOD2 code and compared with experimental plant data. The code predictions consistently underpredict the degree of superheat. A new interface friction model has been implemented in a modified version of RELAP5/MOD2. This modification, along with changes to the flow regime transition criteria and the heat transfer correlations, correctly predicts the degree of superheat and matches plant data

  7. ANT: Software for Generating and Evaluating Degenerate Codons for Natural and Expanded Genetic Codes.

    Science.gov (United States)

    Engqvist, Martin K M; Nielsen, Jens

    2015-08-21

    The Ambiguous Nucleotide Tool (ANT) is a desktop application that generates and evaluates degenerate codons. Degenerate codons are used to represent DNA positions that have multiple possible nucleotide alternatives. This is useful for protein engineering and directed evolution, where primers specified with degenerate codons are used as a basis for generating libraries of protein sequences. ANT is intuitive and can be used in a graphical user interface or by interacting with the code through a defined application programming interface. ANT comes with full support for nonstandard, user-defined, or expanded genetic codes (translation tables), which is important because synthetic biology is being applied to an ever widening range of natural and engineered organisms. The Python source code for ANT is freely distributed so that it may be used without restriction, modified, and incorporated in other software or custom data pipelines.
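
    To illustrate what a degenerate codon encodes, the sketch below expands one into its concrete codons using the standard IUPAC ambiguity codes. It is a generic Python illustration, not ANT's own implementation or API.

        from itertools import product

        # Standard IUPAC nucleotide ambiguity codes.
        IUPAC = {
            "A": "A", "C": "C", "G": "G", "T": "T",
            "R": "AG", "Y": "CT", "S": "GC", "W": "AT", "K": "GT", "M": "AC",
            "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT",
        }

        def expand_degenerate_codon(codon):
            """Return every concrete codon represented by a 3-letter degenerate codon."""
            return ["".join(bases) for bases in product(*(IUPAC[b] for b in codon.upper()))]

        # Example: NNK, a codon scheme often used in protein libraries (32 concrete codons).
        print(expand_degenerate_codon("NNK"))
        print(len(expand_degenerate_codon("NNK")))   # -> 32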

  8. Cellular prion protein and γ-synuclein overexpression in LS 174T colorectal cancer cell drives endothelial proliferation-to-differentiation switch

    Directory of Open Access Journals (Sweden)

    Sing-Hui Ong

    2018-03-01

    Full Text Available Background Tumor-induced angiogenesis is an imperative event in pledging new vasculature for tumor metastasis. Since overexpression of the neuronal proteins gamma-synuclein (γ-Syn) and cellular prion protein (PrPC) is frequently detected in advanced stages of cancers that involve metastasis, this study aimed to investigate whether γ-Syn or PrPC overexpression in the colorectal adenocarcinoma line LS 174T affects angiogenesis of the endothelial cells EA.hy 926 (EA). Methods EA cells were treated with conditioned media (CM) of LS 174T-γ-Syn or LS 174T-PrP, and their proliferation, invasion, migration, adhesion and ability to form angiogenic tubes were assessed using a range of biological assays. To investigate plausible background mechanisms conferring the properties of EA cells above, nitric oxide (NO) levels were measured and the expression of angiogenesis-related factors was assessed using a human angiogenesis antibody array. Results EA proliferation was significantly inhibited by LS 174T-PrP CM, whereas its telomerase activity was reduced by CM of LS 174T-γ-Syn or LS 174T-PrP, as compared to EA incubated with LS 174T CM. Besides, LS 174T-γ-Syn CM or LS 174T-PrP CM inhibited EA invasion and migration in the Boyden chamber assay. Furthermore, LS 174T-γ-Syn CM significantly inhibited EA migration in the scratch wound assay. Gelatin zymography revealed reduced secretion of MMP-2 and MMP-9 by EA treated with LS 174T-γ-Syn CM or LS 174T-PrP CM. In addition, the cell adhesion assay showed that fewer LS 174T-γ-Syn or LS 174T-PrP cells adhered onto EA, as compared to LS 174T. In the tube formation assay, LS 174T-γ-Syn CM or LS 174T-PrP CM induced EA tube formation. Increased NO secretion by EA treated with LS 174T-γ-Syn CM or LS 174T-PrP CM was also detected. Lastly, decreased expression of pro-angiogenic factors such as CXCL16, IGFBP-2 and amphiregulin in LS 174T-γ-Syn CM or LS 174T-PrP CM was detected using the angiogenesis antibody array. Discussion These results

  9. Experimental and Analytical Studies on Improved Feedforward ML Estimation Based on LS-SVR

    Directory of Open Access Journals (Sweden)

    Xueqian Liu

    2013-01-01

    Full Text Available The maximum likelihood (ML) algorithm is the most common and effective parameter estimation method. However, when dealing with small samples and low signal-to-noise ratio (SNR), threshold effects result and estimation performance degrades greatly. It has been shown that the support vector machine (SVM) is suitable for small samples. Consequently, we employ the linear relationship between the inputs and outputs of least squares support vector regression (LS-SVR) and regard the LS-SVR process as a time-varying linear filter to increase the input SNR of received signals and decrease the threshold value of the mean square error (MSE) curve. Furthermore, taking single-tone sinusoidal frequency estimation as an example and integrating data analysis and experimental validation, it is verified that, if the LS-SVR's parameters are set appropriately, the LS-SVR process not only preserves well the single-tone sinusoid and additive white Gaussian noise (AWGN) channel characteristics of the original signals, but also improves the frequency estimation performance. During experimental simulations, the LS-SVR process is applied to two common and representative single-tone sinusoidal ML frequency estimation algorithms, the DFT-based frequency-domain periodogram (FDP) and the phase-based Kay estimator. The threshold values of their MSE curves are decreased by 0.3 dB and 1.2 dB, respectively, which clearly exhibits the advantage of the proposed algorithm.
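
    The DFT-based frequency-domain periodogram (FDP) estimator mentioned above picks the frequency that maximizes the periodogram of the received samples. The sketch below implements that baseline on a synthetic noisy single tone; the signal parameters are arbitrary, and the LS-SVR pre-filtering step proposed in the paper is not reproduced here.

        import numpy as np

        # Synthetic single-tone sinusoid in AWGN (arbitrary parameters).
        fs, n = 1000.0, 256                 # sampling rate [Hz] and number of samples
        f_true, snr_db = 123.4, 0.0
        t = np.arange(n) / fs
        noise_std = 10 ** (-snr_db / 20) / np.sqrt(2)   # unit-amplitude cosine has power 1/2
        x = np.cos(2 * np.pi * f_true * t) + noise_std * np.random.randn(n)

        # FDP-style estimate: maximize the zero-padded periodogram over the frequency grid.
        nfft = 16 * n                       # zero padding refines the frequency grid
        spectrum = np.abs(np.fft.rfft(x, nfft)) ** 2
        freqs = np.fft.rfftfreq(nfft, d=1 / fs)
        f_hat = freqs[np.argmax(spectrum)]
        print(f"true {f_true:.2f} Hz, estimated {f_hat:.2f} Hz")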

  10. LS1 Report: nearing the finish line

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    The LS1 team will be popping the champagne next week, on Tuesday 27 May, celebrating the completion of the consolidation of the splices in the framework of the SMACC project.   A technician works on one of the final shunts during LS1. "It has been a long journey into the heart of the LHC, tackling over 27,000 shunts*," says Luca Bottura, TE-MSC Group leader. "We are happy that the final train has, at last, reached its rest station, and look forward to sending it on many new adventures," confirm Frédéric Savary, TE-MSC Large Magnet Facility Section leader, and Jean-Philippe Tock, SMACC Project leader. Also in the LHC, pressure tests in Sector 1-2 - the third sector to be tackled - are almost complete. The temperature in Sector 6-7 is around 100K and it will be accessible again from next week. As for the SPS, all the LSS1 beam elements excluding one monitor are back in position. Vacuum teams are now ...

  11. CMS DAQ current and future hardware upgrades up to post Long Shutdown 3 (LS3) times

    CERN Document Server

    Racz, Attila; Behrens, Ulf; Branson, James; Chaze, Olivier; Cittolin, Sergio; Contescu, Cristian; da Silva Gomes, Diego; Darlea, Georgiana-Lavinia; Deldicque, Christian; Demiragli, Zeynep; Dobson, Marc; Doualot, Nicolas; Erhan, Samim; Fulcher, Jonathan Richard; Gigi, Dominique; Gladki, Maciej; Glege, Frank; Gomez-Ceballos, Guillelmo; Hegeman, Jeroen; Holzner, Andre; Janulis, Mindaugas; Lettrich, Michael; Meijers, Frans; Meschi, Emilio; Mommsen, Remigius K; Morovic, Srecko; O'Dell, Vivian; Orn, Samuel Johan; Orsini, Luciano; Papakrivopoulos, Ioannis; Paus, Christoph; Petrova, Petia; Petrucci, Andrea; Pieri, Marco; Rabady, Dinyar; Reis, Thomas; Sakulin, Hannes; Schwick, Christoph; Simelevicius, Dainius; Vazquez Velez, Cristina; Vougioukas, Michail; Zejdl, Petr

    2017-01-01

    Following the first LHC collisions seen and recorded by CMS in 2009, the DAQ hardware went through a major upgrade during LS1 (2013- 2014) and new detectors have been connected during 2015-2016 and 2016-2017 winter shutdowns. Now, LS2 (2019-2020) and LS3 (2024-mid 2026) are actively being prepared. This paper shows how CMS DAQ hardware has evolved from the beginning and will continue to evolve in order to meet the future challenges posed by High Luminosity LHC (HL-LHC) and the CMS detector evolution. In particular, post LS3 DAQ architectures are focused upon.

  12. Can the proton injectors meet the HL-LHC requirements after LS2?

    International Nuclear Information System (INIS)

    Goddard, B.; Bartosik, H.; Bracco, C.; Bruening, O.; Carli, C.; Cornelis, K.; Damerau, H.; Garoby, R.; Gilardoni, S.; Hancock, S.; Hanke, K.; Kain, V.; Meddahi, M.; Mikulec, B.; Papaphilippou, Y.; Rumolo, G.; Shaposhnikova, E.; Steerenberg, R.; Vretenar, M.

    2012-01-01

    The LIU project has as mandate the upgrade of the LHC injector chain to match the requirements of HL-LHC. The present planning assumes that the upgrade work will be completed in LS2, for commissioning in the following operational year. The known limitations in the different injectors are described, together with the various upgrades planned to improve the performance. The expected performance reach after the upgrade with 25 and 50 ns beams is examined. The project planning is discussed in view of the present LS1 and LS2 planning. The main unresolved questions and associated decision points are presented, and the key issues to be addressed by the end of 2012 are detailed in the context of the machine development programs and hardware construction activities. (authors)

  13. Simplification of coding of NRU loop experiment software with dimensional generator

    International Nuclear Information System (INIS)

    Davis, R. S.

    2006-01-01

    The following are specific topics of this paper: 1.There is much creativity in the manner in which Dimensional Generator can be applied to a specific programming task [2]. This paper tells how Dimensional Generator was applied to a reactor-physics task. 2. In this first practical use, Dimensional Generator itself proved not to need change, but a better user interface was found necessary, essentially because the relevance of Dimensional Generator to reactor physics was initially underestimated. It is briefly described. 3. The use of Dimensional Generator helps make reactor-physics source code somewhat simpler. That is explained here with brief examples from BURFEL-PC and WIMSBURF. 4. Most importantly, with the help of Dimensional Generator, all erroneous physical expressions were automatically detected. The errors are detailed here (in spite of the author's embarrassment) because they show clearly, both in theory and in practice, how Dimensional Generator offers quality enhancement of reactor-physics programming. (authors)

  14. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  15. Analysis of the VVER-440 reactor steam generator secondary side with the RELAP5/MOD3 code

    International Nuclear Information System (INIS)

    Tuunanen, J.

    1993-01-01

    The Nuclear Engineering Laboratory of the Technical Research Centre of Finland has widely used the RELAP5/MOD2 and -MOD3 codes to simulate horizontal steam generators. Several models have been developed and successfully used in VVER safety analysis. Nevertheless, the models developed have included only rather few nodes on the steam generator secondary side. The secondary side has normally been divided into about 10 to 15 nodes. Since the secondary side of the steam generators of VVER-440 type reactors consists of a rather large water pool, these models were only roughly capable of predicting secondary-side flows. The paper describes an attempt to use the RELAP5/MOD3 code to predict secondary-side flows in a steam generator of a VVER-440 reactor. A 2D/3D model has been developed using the RELAP5/MOD3 code's cross-flow junctions. The model includes 90 volumes on the steam generator secondary side. The model has been used to calculate steady-state flow conditions on the secondary side of a VVER-440 reactor steam generator. (orig.) (1 ref., 9 figs., 2 tabs.)

  16. Analysis of steam generator loss-of-feedwater experiments with APROS and RELAP5/MOD3.1 computer codes

    International Nuclear Information System (INIS)

    Virtanen, E.; Haapalehto, T.; Kouhia, J.

    1997-01-01

    Three experiments were conducted to study the behaviour of the new horizontal steam generator construction of the PACTEL test facility. In the experiments the secondary side coolant level was reduced stepwise. The experiments were calculated with two computer codes RELAP5/MOD3.1 and APROS version 2.11. A similar nodalization scheme was used for both codes so that the results may be compared. Only the steam generator was modeled and the rest of the facility was given as a boundary condition. The results show that both codes calculate well the behaviour of the primary side of the steam generator. On the secondary side both codes calculate lower steam temperatures in the upper part of the heat exchange tube bundle than was measured in the experiments. (orig.)

  17. Development and Validation of an Instrument to Measure the Impact of Genetic Testing on Self-Concept in Lynch Syndrome (LS)

    Science.gov (United States)

    Esplen, Mary Jane; Stuckless, Noreen; Wong, Jiahui; Gallinger, Steve; Aronson, Melyssa; Rothenmund, Heidi; Semotiuk, Kara; Stokes, Jackie; Way, Chris; Green, Jane; Butler, Kate; Petersen, Helle Vendel

    2011-01-01

    Background A positive genetic test result may impact on a person's self-concept and affect quality of life. Purpose The purpose of the study was to develop a self-concept scale to measure such impact for individuals carrying mutations for a heritable colorectal cancer, Lynch Syndrome (LS). Methods Two distinct phases were involved: Phase I generated specific colorectal self-concept candidate scale items from interviews with eight LS carriers and five genetic counselors, which were added to a previously developed self-concept scale for BRCA1/2 mutation carriers. Phase II had 115 LS carriers complete the candidate scale and a battery of validating measures. Results A 20-item scale was developed with two dimensions identified through factor analysis: stigma/vulnerability and bowel-symptom-related anxiety. The scale demonstrated excellent reliability (Cronbach's α = .93) and good convergent validity, shown by a high correlation with the Impact of Event Scale (r(102) = .55) and a negative correlation with the Rosenberg Self-Esteem Scale (r(108) = −.59); the scale's performance was stable across participant characteristics. Conclusions This new scale for measuring self-concept has potential to be used as a clinical tool and as a measure for future studies. PMID:21883167
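
    Cronbach's α, the reliability statistic reported above, can be computed directly from the item score matrix as α = k/(k−1) · (1 − Σ item variances / variance of the total score). The snippet below demonstrates this on a simulated response matrix; it is a generic illustration, not the authors' analysis code or data.

        import numpy as np

        def cronbach_alpha(scores):
            """scores: (n_respondents, k_items) matrix of item scores."""
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()   # sum of the item variances
            total_var = scores.sum(axis=1).var(ddof=1)    # variance of the total score
            return (k / (k - 1)) * (1 - item_var / total_var)

        # Simulated Likert-type responses: 115 respondents, 20 items (hypothetical data).
        rng = np.random.default_rng(0)
        latent = rng.normal(size=(115, 1))
        items = np.clip(np.round(3 + latent + 0.8 * rng.normal(size=(115, 20))), 1, 5)
        print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")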

  18. Nonlinear Time Series Prediction Using LS-SVM with Chaotic Mutation Evolutionary Programming for Parameter Optimization

    International Nuclear Information System (INIS)

    Xu Ruirui; Chen Tianlun; Gao Chengfeng

    2006-01-01

    Nonlinear time series prediction is studied by using an improved least squares support vector machine (LS-SVM) regression based on chaotic mutation evolutionary programming (CMEP) approach for parameter optimization. We analyze how the prediction error varies with different parameters (σ, γ) in LS-SVM. In order to select appropriate parameters for the prediction model, we employ CMEP algorithm. Finally, Nasdaq stock data are predicted by using this LS-SVM regression based on CMEP, and satisfactory results are obtained.

  19. UNICOS CPC6: Automated Code Generation for Process Control Applications

    CERN Document Server

    Fernandez Adiego, B; Prieto Barreiro, I

    2011-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS) [1]. As a part of this framework, UNICOS-CPC provides a well defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) [2] to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software: it discovers and calls dynamically the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA g...

  20. Multiple optical code-label processing using multi-wavelength frequency comb generator and multi-port optical spectrum synthesizer.

    Science.gov (United States)

    Moritsuka, Fumi; Wada, Naoya; Sakamoto, Takahide; Kawanishi, Tetsuya; Komai, Yuki; Anzai, Shimako; Izutsu, Masayuki; Kodate, Kashiko

    2007-06-11

    In optical packet switching (OPS) and optical code division multiple access (OCDMA) systems, label generation and processing are key technologies. Recently, several label processors have been proposed and demonstrated. However, in order to recognize N different labels, N separate devices are required. Here, we propose and experimentally demonstrate a large-scale, multiple optical code (OC)-label generation and processing technology based on multi-port, a fully tunable optical spectrum synthesizer (OSS) and a multi-wavelength electro-optic frequency comb generator. The OSS can generate 80 different OC-labels simultaneously and can perform 80-parallel matched filtering. We also demonstrated its application to OCDMA.

  1. Towards provably correct code generation for a hard real-time programming language

    DEFF Research Database (Denmark)

    Fränzle, Martin; Müller-Olm, Markus

    1994-01-01

    This paper sketches a hard real-time programming language featuring operators for expressing timeliness requirements in an abstract, implementation-independent way and presents parts of the design and verification of a provably correct code generator for that language. The notion of implementation...

  2. Intermediate coupling collision strengths from LS coupled R-matrix elements

    International Nuclear Information System (INIS)

    Clark, R.E.H.

    1978-01-01

    Fine structure collision strengths for transitions between two groups of states in intermediate coupling and with inclusion of configuration mixing are obtained from LS-coupled reactance matrix elements (R-matrix elements) and a set of mixing coefficients. The LS-coupled R-matrix elements are transformed to pair coupling using Wigner 6-j coefficients. From these pair-coupled R-matrix elements, together with a set of mixing coefficients, R-matrix elements are obtained which include the intermediate coupling and configuration mixing effects. Finally, from the latter R-matrix elements, collision strengths for fine structure transitions are computed (with inclusion of both intermediate coupling and configuration mixing). (Auth.)

  3. An improved steam generator model for the SASSYS code

    International Nuclear Information System (INIS)

    Pizzica, P.A.

    1989-01-01

    A new steam generator model has been developed for the SASSYS computer code, which analyzes accident conditions in a liquid-metal-cooled fast reactor. It has been incorporated into the new SASSYS balance-of-plant model, but it can also function as a stand-alone model. The model provides a full solution of the steady-state condition before the transient calculation begins for given sodium and water flow rates, inlet and outlet sodium temperatures, and inlet enthalpy and region lengths on the water side

  4. MC2-3: Multigroup Cross Section Generation Code for Fast Reactor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C. H. [Argonne National Lab. (ANL), Argonne, IL (United States); Yang, W. S. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-11-08

    The MC2-3 code is a Multigroup Cross section generation Code for fast reactor analysis, developed by improving the resonance self-shielding and spectrum calculation methods of MC2-2 and integrating the one-dimensional cell calculation capabilities of SDX. The code solves the consistent P1 multigroup transport equation using basic neutron data from ENDF/B data files to determine the fundamental mode spectra for use in generating multigroup neutron cross sections. A homogeneous medium or a heterogeneous slab or cylindrical unit cell problem is solved in ultrafine (~2000) or hyperfine (~400,000) group levels. In the resolved resonance range, pointwise cross sections are reconstructed with Doppler broadening at specified isotopic temperatures. The pointwise cross sections are directly used in the hyperfine group calculation, whereas for the ultrafine group calculation, self-shielded cross sections are prepared by numerical integration of the pointwise cross sections based upon the narrow resonance approximation. For both the hyperfine and ultrafine group calculations, unresolved resonances are self-shielded using the analytic resonance integral method. The ultrafine group calculation can also be performed for two-dimensional whole-core problems to generate region-dependent broad-group cross sections. Multigroup cross sections are written in the ISOTXS format for a user-specified group structure. The code is executable on UNIX, Linux, and PC Windows systems, and its library includes all isotopes of the ENDF/B-VII.0 data.

  5. PREP-PWR-1.0: a WIMS-D/4 pre-processor code for the generation of data for PWR fuel assemblies

    International Nuclear Information System (INIS)

    Ball, G.

    1991-06-01

    The PREP-PWR-1.0 computer code is a substantially modified version of the PREWIM code which formed part of the original MARIA System (Report J.E.N. 543). PREP-PWR-1.0 is a comprehensive pre-processor code which generates input data for the WIMS-D/4.1 code (Report PEL 294) for PWR fuel assemblies, with or without control and burnable poison rods. This data is generated at various base and off-base conditions. The overall cross section generation methodology is described, followed by a brief overview of the model. Aspects of the base/off-base calculational scheme are outlined. Additional features of the code are described while the input data format of PREP-PWR-1.0 is listed. The sample problems and suggestions for further improvements to the code are also described. 2 figs., 2 tabs., 12 refs

  6. LS1 Report: Handing in the ATLAS keys

    CERN Multimedia

    Antonella Del Rosso, Katarina Anthony

    2014-01-01

    After completing more than 250 work packages concerning the whole detector and experimental site, the ATLAS and CERN teams involved with LS1 operations are now wrapping things up before starting the commissioning phase in preparation for the LHC restart. The giant detector is now more efficient, safer and even greener than ever thanks to the huge amount of work carried out over the past two years.   Cleaning up the ATLAS cavern and detector in preparation for Run 2. Hundreds of people, more than 3000 certified interventions, huge and delicate parts of the detector completely refurbished: the ATLAS detector that will take data during Run 2 is a brand new machine, which will soon be back in the hands of the thousands of scientists who are preparing for the high-energy run of the LHC accelerator. “During LS1, we have upgraded the detector’s basic infrastructure and a few of its sub-detectors,” explains Beniamino Di Girolamo, ATLAS Technical Coordinator. &...

  7. Inflorescence Development and the Role of LsFT in Regulating Bolting in Lettuce (Lactuca sativa L.)

    Science.gov (United States)

    Chen, Zijing; Han, Yingyan; Ning, Kang; Ding, Yunyu; Zhao, Wensheng; Yan, Shuangshuang; Luo, Chen; Jiang, Xiaotang; Ge, Danfeng; Liu, Renyi; Wang, Qian; Zhang, Xiaolan

    2018-01-01

    Lettuce (Lactuca sativa L.) is one of the most important leafy vegetables and is consumed during its vegetative growth. The transition from vegetative to reproductive growth is induced by high temperature, which has a significant economic effect on lettuce production. However, the progression of the floral transition and the molecular regulation of bolting are largely unknown. Here we morphologically characterized inflorescence development and functionally analyzed the FLOWERING LOCUS T (LsFT) gene during bolting regulation in lettuce. We described the eight developmental stages of the floral transition process. The expression of LsFT was negatively correlated with bolting in different lettuce varieties, and was promoted by heat treatment. Overexpression of LsFT could rescue the late-flowering phenotype of the ft-2 mutant. Knockdown of LsFT by RNA interference dramatically delayed bolting in lettuce, which then failed to respond to high temperature. Therefore, this study dissects the process of inflorescence development and characterizes the role of LsFT in bolting regulation in lettuce. PMID:29403510

  8. Inflorescence Development and the Role of LsFT in Regulating Bolting in Lettuce (Lactuca sativa L.).

    Science.gov (United States)

    Chen, Zijing; Han, Yingyan; Ning, Kang; Ding, Yunyu; Zhao, Wensheng; Yan, Shuangshuang; Luo, Chen; Jiang, Xiaotang; Ge, Danfeng; Liu, Renyi; Wang, Qian; Zhang, Xiaolan

    2017-01-01

    Lettuce (Lactuca sativa L.) is one of the most important leafy vegetables and is consumed during its vegetative growth. The transition from vegetative to reproductive growth is induced by high temperature, which has a significant economic effect on lettuce production. However, the progression of the floral transition and the molecular regulation of bolting are largely unknown. Here we morphologically characterized inflorescence development and functionally analyzed the FLOWERING LOCUS T (LsFT) gene during bolting regulation in lettuce. We described the eight developmental stages of the floral transition process. The expression of LsFT was negatively correlated with bolting in different lettuce varieties, and was promoted by heat treatment. Overexpression of LsFT could rescue the late-flowering phenotype of the ft-2 mutant. Knockdown of LsFT by RNA interference dramatically delayed bolting in lettuce, which then failed to respond to high temperature. Therefore, this study dissects the process of inflorescence development and characterizes the role of LsFT in bolting regulation in lettuce.

  9. Calculation of “LS-curves” for coincidence summing corrections in gamma ray spectrometry

    Science.gov (United States)

    Vidmar, Tim; Korun, Matjaž

    2006-01-01

    When coincidence summing correction factors for extended samples are calculated in gamma-ray spectrometry from full-energy-peak and total efficiencies, their variation over the sample volume needs to be considered. In other words, the correction factors cannot be computed as if the sample were a point source. A method developed by Blaauw and Gelsema takes the variation of the efficiencies over the sample volume into account. It introduces the so-called LS-curve in the calibration procedure and only requires the preparation of a single standard for each sample geometry. We propose to replace the standard preparation by calculation and we show that the LS-curves resulting from our method yield coincidence summing correction factors that are consistent with the LS values obtained from experimental data.
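
    For orientation, in the simplest textbook case of a two-step cascade (γ1 promptly followed by γ2, full branching, no angular correlation), summing-out attenuates the full-energy peak of γ1 by a factor (1 − εt2), where εt2 is the total efficiency for γ2, so the corresponding correction factor is 1/(1 − εt2). The snippet below evaluates this idealized point-source relation with made-up efficiencies; the LS-curve method discussed above is what extends such corrections to extended samples.

        # Idealized summing-out correction for a two-step cascade (point-source textbook case).
        eps_peak_1 = 0.05    # full-energy-peak efficiency for gamma-1 (assumed value)
        eps_total_2 = 0.12   # total efficiency for the coincident gamma-2 (assumed value)

        apparent_peak_1 = eps_peak_1 * (1.0 - eps_total_2)   # efficiency after summing-out losses
        correction_factor = 1.0 / (1.0 - eps_total_2)        # multiply the measured peak area by this
        print(f"apparent efficiency {apparent_peak_1:.4f}, correction factor {correction_factor:.3f}")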

  10. Steady Modeling for an Ammonia Synthesis Reactor Based on a Novel CDEAS-LS-SVM Model

    Directory of Open Access Journals (Sweden)

    Zhuoqian Liu

    2014-01-01

    Full Text Available A steady-state mathematical model is built in order to represent plant behavior under stationary operating conditions. A novel modeling approach, LS-SVM based on Cultural Differential Evolution with Ant Search (CDEAS), is proposed, and LS-SVM is adopted to establish the model of the net value of ammonia. The modeling method has fast convergence speed and good global adaptability for identification of the ammonia synthesis process. The LS-SVM model was established using the above-mentioned method, and simulation results verify the validity of the method.

  11. Development and validation of gui based input file generation code for relap

    International Nuclear Information System (INIS)

    Anwar, M.M.; Khan, A.A.; Chughati, I.R.; Chaudri, K.S.; Inyat, M.H.; Hayat, T.

    2009-01-01

    The Reactor Excursion and Leak Analysis Program (RELAP) is a widely accepted computer code for thermal-hydraulic modeling of nuclear power plants. It calculates thermal-hydraulic transients in water-cooled nuclear reactors by solving approximations to the one-dimensional, two-phase equations of hydraulics in an arbitrarily connected system of nodes. However, the preparation of the input file and the subsequent analysis of results in this code are tedious tasks. A Graphical User Interface (GUI) for preparation of the RELAP5 input file was therefore developed, together with validation of the GUI-generated input file. The GUI is developed in Microsoft Visual Studio using Visual C Sharp (C#) as the programming language. The nodalization diagram is drawn graphically, and the program contains various component forms along with the starting data form, which are launched for property assignment to generate the input file cards, serving as a GUI for the user. The GUI is provided with an Open/Save function to store and recall the nodalization diagram along with the components' properties. The GUI-generated input file was validated for several case studies, and individual component cards were compared with the originally required format. The generated input file was found consistent with the requirements of RELAP. The GUI provides a useful platform for simulating complex hydrodynamic problems efficiently with RELAP. (author)

  12. Mesh generation and energy group condensation studies for the jaguar deterministic transport code

    International Nuclear Information System (INIS)

    Kennedy, R. A.; Watson, A. M.; Iwueke, C. I.; Edwards, E. J.

    2012-01-01

    The deterministic transport code Jaguar is introduced, and the modeling process for Jaguar is demonstrated using a two-dimensional assembly model of the Hoogenboom-Martin Performance Benchmark Problem. This single assembly model is being used to test and analyze optimal modeling methodologies and techniques for Jaguar. This paper focuses on spatial mesh generation and energy condensation techniques. In this summary, the models and processes are defined as well as thermal flux solution comparisons with the Monte Carlo code MC21. (authors)

  13. Mesh generation and energy group condensation studies for the jaguar deterministic transport code

    Energy Technology Data Exchange (ETDEWEB)

    Kennedy, R. A.; Watson, A. M.; Iwueke, C. I.; Edwards, E. J. [Knolls Atomic Power Laboratory, Bechtel Marine Propulsion Corporation, P.O. Box 1072, Schenectady, NY 12301-1072 (United States)

    2012-07-01

    The deterministic transport code Jaguar is introduced, and the modeling process for Jaguar is demonstrated using a two-dimensional assembly model of the Hoogenboom-Martin Performance Benchmark Problem. This single assembly model is being used to test and analyze optimal modeling methodologies and techniques for Jaguar. This paper focuses on spatial mesh generation and energy condensation techniques. In this summary, the models and processes are defined as well as thermal flux solution comparisons with the Monte Carlo code MC21. (authors)

  14. ISOGEN: Interactive isotope generation and depletion code

    International Nuclear Information System (INIS)

    Venkata Subbaiah, Kamatam

    2016-01-01

    ISOGEN is an interactive code for solving first-order coupled linear differential equations with constant coefficients for a large number of isotopes, which are produced or depleted by the processes of radioactive decay or through neutron transmutation or fission. These coupled equations can be written in matrix notation involving radioactive decay constants and transmutation coefficients, and the eigenvalues of the resulting matrix vary widely (over several tens of orders of magnitude); hence no single method of solution is suitable for obtaining precise estimates of the concentrations of all isotopes. Therefore, different methods of solution are followed, namely the matrix exponential method, the Bateman series method, and the Gauss-Seidel iteration method, as in the ORIGEN-2 code. The ISOGEN code is written in a modern computer language, VB.NET version 2013 for the Windows 7 operating system, which enables many interactive features between the user and the program. The output results depend on the input neutron database employed and the time step involved in the calculations. The program can display information about the database files, and the user has to select the one which suits the current need. The program prints 'WARNING' information if the time step is too large, which is decided based on the built-in convergence criterion. Other salient interactive features provided are (i) inspection of the input data that go into the calculation, (ii) viewing of the radioactive decay sequence of isotopes (daughters, precursors, photons emitted) in a graphical format, (iii) solution for parent and daughter products by the direct Bateman series solution method, (iv) a quick input method and context-sensitive prompts for guiding the novice user, (v) viewing of output tables for any parameter of interest, and (vi) reading of the output file to generate new information that can be viewed or printed, since the program stores basic nuclide concentrations, unlike other batch jobs. The sample
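
    The matrix exponential method mentioned above advances the nuclide vector as N(t) = exp(A·t)·N(0), where the matrix A collects the decay constants and transmutation coefficients. The snippet below applies it to a hypothetical three-nuclide decay chain; the rates are illustrative only, and this is not ISOGEN's data or source code.

        import numpy as np
        from scipy.linalg import expm

        # Hypothetical chain N1 -> N2 -> N3 (stable); decay constants in 1/s.
        lam1, lam2 = 1.0e-3, 5.0e-4
        A = np.array([
            [-lam1,   0.0, 0.0],   # dN1/dt = -lam1*N1
            [ lam1, -lam2, 0.0],   # dN2/dt =  lam1*N1 - lam2*N2
            [  0.0,  lam2, 0.0],   # dN3/dt =  lam2*N2
        ])

        N0 = np.array([1.0e20, 0.0, 0.0])   # initial atom densities (arbitrary)
        t = 3600.0                          # time step [s]
        Nt = expm(A * t) @ N0               # N(t) = exp(A t) N(0)
        print(Nt, "atoms conserved:", np.isclose(Nt.sum(), N0.sum()))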

  15. Fortran code for generating random probability vectors, unitaries, and quantum states

    Directory of Open Access Journals (Sweden)

    Jonas eMaziero

    2016-03-01

    Full Text Available The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area since then. And several ongoing projects targeting towards its betterment indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
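
    Purely as a hedged illustration of two of the objects listed (the article's Fortran routines are not reproduced here), the NumPy sketch below draws a probability vector uniformly from the simplex and an approximately Haar-distributed random unitary via the QR decomposition of a complex Gaussian matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_probability_vector(d):
    """Probability vector uniform on the simplex (normalized i.i.d. exponentials)."""
    x = rng.exponential(scale=1.0, size=d)
    return x / x.sum()

def random_unitary(d):
    """Haar-distributed unitary via QR of a complex Ginibre matrix."""
    z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2.0)
    q, r = np.linalg.qr(z)
    # Fix the column phases so the distribution is exactly Haar.
    phases = np.diagonal(r) / np.abs(np.diagonal(r))
    return q * phases

p = random_probability_vector(4)
U = random_unitary(3)
print(p.sum(), np.allclose(U.conj().T @ U, np.eye(3)))
```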

  16. NULIF: neutron spectrum generator, few-group constant calculator, and fuel depletion code

    International Nuclear Information System (INIS)

    Wittkopf, W.A.; Tilford, J.M.; Andrews, J.B. II; Kirschner, G.; Hassan, N.M.; Colpo, P.N.

    1977-02-01

    The NULIF code generates a microgroup neutron spectrum and calculates spectrum-weighted few-group parameters for use in a spatial diffusion code. A wide variety of fuel cells, non-fuel cells, and fuel lattices, typical of PWR (or BWR) lattices, are treated. A fuel depletion routine and change card capability allow a broad range of problems to be studied. Coefficient variation with fuel burnup, fuel temperature change, moderator temperature change, soluble boron concentration change, burnable poison variation, and control rod insertion are readily obtained. Heterogeneous effects, including resonance shielding and thermal flux depressions, are treated. Coefficients are obtained for one thermal group and up to three epithermal groups. A special output routine writes the few-group coefficient data in specified format on an output tape for automated fitting in the PDQ07-HARMONY system of spatial diffusion-depletion codes

  17. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach that simultaneously generates, from a high-level specification, both the code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
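
    What such annotations look like is easiest to see on a toy example. The hand-written sketch below is not AUTOBAYES output; it attaches a loop invariant and a postcondition to a trivial loop as executable assertions, the kind of annotations a verification-condition generator turns into proof obligations:

```python
# A toy, hand-written illustration (not AUTOBAYES output) of code carrying
# the annotations a certifier needs: a loop invariant and a postcondition.
def sum_first_n(n: int) -> int:
    """Return 0 + 1 + ... + (n - 1)."""
    total, i = 0, 0
    while i < n:
        # Loop invariant (the annotation a verification-condition generator
        # would turn into proof obligations): total == i*(i-1)//2 and 0 <= i <= n
        assert total == i * (i - 1) // 2 and 0 <= i <= n
        total += i
        i += 1
    # Postcondition
    assert total == n * (n - 1) // 2
    return total

print(sum_first_n(10))  # 45
```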

  18. Detector Plans for LS1

    Energy Technology Data Exchange (ETDEWEB)

    Nessi, M [European Organization for Nuclear Research, Geneva (Switzerland)

    2012-07-01

    All experiments plan an effective usage of the LS1 shutdown period. After three years of running they will go through a consolidation phase, mostly to fix problems that have emerged over time, like single points of failure in the infrastructure, failures of low voltage power supplies and optical links. Upgrades of some detector components will start, mainly related to the beam pipe, the innermost tracker elements and the trigger system. Detector components, which had to be staged for cost reasons in 2003, will then enter into the detector setup. The goal is to be fully ready for the new energy regime at nominal luminosity.

  19. Launch of new e-learning course “Safety during LS1”

    CERN Multimedia

    HSE Unit

    2013-01-01

    After 3 years of activity, the LHC and the rest of the accelerator chain have been shut down for about 2 years (from February 2013 to December 2014) for maintenance and upgrade work, on the surface as well as underground. CERN has developed a new e-learning course related to this Long Shutdown period (LS1) so as to provide all the collaborators working in the LS1 area with accurate safety-oriented information. The objectives of this new course are to: Present LS1 and its context, Identify CERN facilities’ major risks, Identify the main risks associated with our co-activities, Explain how safety issues are being handled at CERN, Present all the basic safety measures to be respected, Present all emergency and rescue instructions. The course is available via the e-learning SIR application. It is compulsory for all newcomers at CERN, along with the “CERN Safety Introduction” training. It is also highly recommended that people who were already w...

  20. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master's thesis was performed at CERN, more specifically in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms, both based on the CODESYS development tool, into the CERN-defined industrial framework UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the Siemens SCADA system WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and a software tool for automatic generation of the PLC code based on this library, called UAB. The integration aimed to provide a solution shared by both PLC platforms and was based on the PLCopen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and by testing the behavior of the library code.

  1. The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics

    Science.gov (United States)

    Ganander, Hans

    2003-10-01

    For many reasons, the size of wind turbines on the rapidly growing wind energy market is increasing. Relations between the aeroelastic properties of these new large turbines change. Modifications of turbine designs and control concepts are also influenced by the growing size. All these trends require the development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite element codes, normally allow such modifications and improvements of existing wind turbine models relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. The use of an automatic code-generating system is an alternative that addresses both key issues, the code and the design optimization. This technique can be used for rapid generation of codes for particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific, efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as the need has arisen, and the interest in design optimization is growing.
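
    As a rough, hedged analogue of this workflow (the VIDYN/Mathematica® toolchain itself is not reproduced here), the Python/SymPy sketch below derives the equation of motion of a simple pendulum from its Lagrangian and then "generates code" by turning the symbolic result into a callable numeric function; the system and symbol names are illustrative only:

```python
import sympy as sp

t = sp.symbols('t')
m, g, l = sp.symbols('m g l', positive=True)
theta = sp.Function('theta')(t)

# Lagrangian of a simple pendulum: L = T - V
T = sp.Rational(1, 2) * m * (l * theta.diff(t))**2
V = -m * g * l * sp.cos(theta)
L = T - V

# Lagrange's equation: d/dt(dL/d(theta_dot)) - dL/d(theta) = 0
eq = sp.diff(sp.diff(L, theta.diff(t)), t) - sp.diff(L, theta)
theta_ddot = sp.solve(sp.Eq(eq, 0), theta.diff(t, 2))[0]
print(sp.simplify(theta_ddot))            # -> -g*sin(theta(t))/l

# "Code generation": turn the symbolic result into a fast numeric function.
th = sp.symbols('th')
f = sp.lambdify((th, g, l), theta_ddot.subs(theta, th), 'numpy')
print(f(0.1, 9.81, 1.0))
```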

  2. Generating performance portable geoscientific simulation code with Firedrake (Invited)

    Science.gov (United States)

    Ham, D. A.; Bercea, G.; Cotter, C. J.; Kelly, P. H.; Loriant, N.; Luporini, F.; McRae, A. T.; Mitchell, L.; Rathgeber, F.

    2013-12-01

    , can be written as short C kernels operating locally on the underlying mesh, with no explicit parallelism. The executable code is then generated in C, CUDA or OpenCL and executed in parallel on the target architecture. The system also offers features of special relevance to the geosciences. In particular, the large scale separation between the vertical and horizontal directions in many geoscientific processes can be exploited to offer the flexibility of unstructured meshes in the horizontal direction, without the performance penalty usually associated with those methods.
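
    Firedrake's workflow, writing a high-level specification that is lowered to generated C, CUDA, or OpenCL kernels, can be hinted at with its standard Poisson demo. The sketch below follows that publicly documented demo in spirit; exact API details vary between Firedrake versions, so treat it as an illustrative assumption rather than code from the record:

```python
# A minimal sketch in the spirit of the standard Firedrake Poisson demo
# (exact API details may differ between Firedrake versions).
from math import pi
from firedrake import (UnitSquareMesh, FunctionSpace, TrialFunction, TestFunction,
                       Function, DirichletBC, SpatialCoordinate, dot, grad, dx,
                       sin, solve)

mesh = UnitSquareMesh(16, 16)
V = FunctionSpace(mesh, "CG", 1)

u = TrialFunction(V)
v = TestFunction(V)

x, y = SpatialCoordinate(mesh)
f = Function(V).interpolate(sin(pi * x) * sin(pi * y))   # right-hand side

a = dot(grad(u), grad(v)) * dx          # bilinear form
rhs = f * v * dx                         # linear form
bc = DirichletBC(V, 0.0, "on_boundary")

uh = Function(V)
solve(a == rhs, uh, bcs=[bc])           # Firedrake generates and executes the kernels
```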

  3. An improved steam generator model for the SASSYS code

    International Nuclear Information System (INIS)

    Pizzica, P.A.

    1989-01-01

    A new steam generator model has been developed for the SASSYS computer code, which analyzes accident conditions in a liquid-metal-cooled fast reactor. It has been incorporated into the new SASSYS balance-of-plant model, but it can also function on a stand-alone basis. The steam generator can be used in a once-through mode, or a variant of the model can be used as a separate evaporator and a superheater with a recirculation loop. The new model provides an exact steady-state solution as well as the transient calculation. There was a need for a faster and more flexible model than the old steam generator model. The new model provides more detail with its multi-node treatment, as opposed to the previous model's one-node-per-region approach. Numerical instability problems, which were the result of cell-centered spatial differencing, fully explicit time differencing, and the moving-boundary treatment of the boiling crisis point in the boiling region, have been reduced. This leads to an increase in speed, as larger time steps can now be taken. The new model is an improvement in many respects. 2 refs., 3 figs

  4. BCM-2.0 - The new version of computer code 'Basic Channeling with Mathematica©'

    Science.gov (United States)

    Abdrashitov, S. V.; Bogdanov, O. V.; Korotchenko, K. B.; Pivovarov, Yu. L.; Rozhkova, E. I.; Tukhfatullin, T. A.; Eikhorn, Yu. L.

    2017-07-01

    A new symbolic-numerical code devoted to the investigation of channeling phenomena in the periodic potential of a crystal has been developed. The code has been written in the Wolfram Language, taking advantage of analytical programming methods. The newly developed packages were successfully applied to simulate scattering, radiation, electron-positron pair production and other effects connected with the channeling of relativistic particles in aligned crystals. The results of the simulations have been validated against data from channeling experiments carried out at SAGA LS.

  5. ASSOCIATING LONG-TERM γ-RAY VARIABILITY WITH THE SUPERORBITAL PERIOD OF LS I +61°303

    International Nuclear Information System (INIS)

    Ackermann, M.; Buehler, R.; Ajello, M.; Ballet, J.; Casandjian, J. M.; Barbiellini, G.; Bastieri, D.; Buson, S.; Bellazzini, R.; Bregeon, J.; Bonamente, E.; Cecchi, C.; Brandt, T. J.; Brigida, M.; Bruel, P.; Caliandro, G. A.; Cameron, R. A.; Caraveo, P. A.; Cavazzuti, E.; Chekhtman, A.

    2013-01-01

    Gamma-ray binaries are stellar systems for which the spectral energy distribution (discounting the thermal stellar emission) peaks at high energies. Detected from radio to TeV gamma rays, the γ-ray binary LS I +61°303 is highly variable across all frequencies. One aspect of this system's variability is the modulation of its emission with the timescale set by the ∼26.4960 day orbital period. Here we show that, during the time of our observations, the γ-ray emission of LS I +61°303 also presents a sinusoidal variability consistent with the previously known superorbital period of 1667 days. This modulation is more prominently seen at orbital phases around apastron, whereas it does not introduce a visible change close to periastron. It is also found in the appearance and disappearance of variability at the orbital period in the power spectrum of the data. This behavior could be explained by a quasi-cyclical evolution of the equatorial outflow of the Be companion star, whose features influence the conditions for generating gamma rays. These findings open the possibility to use γ-ray observations to study the outflows of massive stars in eccentric binary systems

  6. NSLINK, Coupling of NJOY Cross-Sections Generator Code to SCALE-3 System

    International Nuclear Information System (INIS)

    De Leege, P.F.A

    1991-01-01

    1 - Description of program or function: NSLINK (NJOY - SCALE - LINK) is a set of computer codes to couple the NJOY cross-section generation code to the SCALE-3 code system (using AMPX-2 master library format) retaining the Nordheim resolved resonance treatment option. 2 - Method of solution: The following module and codes are included in NSLINK: XLACSR: This module is a stripped-down version of the XLACS-2 code. The module passes all l=0 resonance parameters as well as the contribution from all other resonances to the group cross-sections, the contribution from the wings of the l=0 resonances, the background cross-section and possible interference for multilevel Breit-Wigner resonance parameters. The group cross-sections are stored in the appropriate 1-D cross-section arrays. The output file has AMPX-2 master format. The original NJOY code is used to calculate all other data. The XLACSR module is included in the NJOY code. MILER: This code converts NJOY output (GENDF format) to AMPX-2 master format. The code is an extensively revised version of the original MILER code. In addition, the treatment of thermal scattering matrices at different temperatures is included. UNITABR: This code is a revised version of the UNITAB code. It merges the output of XLACSR and MILER in such a way that contributions from the bodies of the l=0 resonances in the resolved energy range, calculated by XLACSR, are subtracted from the 1-D group cross-section arrays for fission (MT=18) and neutron capture (MT=102). The l=0 resonance parameters and the contributions from the bodies of these resonances are added separately (MT=1023, 1022 and 1021). The total cross-section (MT=1), the absorption cross- section (MT=27) and the neutron removal cross-section (MT=101) values are adjusted. In the case of Bondarenko data, infinite dilution values of the cross-sections (MT=1, 18 and 102) are changed in the same way as the 1-D cross-section. The output file of UNITABR is in AMPX-2 master format and

  7. An Empirical Model for Vane-Type Vortex Generators in a Navier-Stokes Code

    Science.gov (United States)

    Dudek, Julianne C.

    2005-01-01

    An empirical model which simulates the effects of vane-type vortex generators in ducts was incorporated into the Wind-US Navier-Stokes computational fluid dynamics code. The model enables the effects of the vortex generators to be simulated without defining the details of the geometry within the grid, and makes it practical for researchers to evaluate multiple combinations of vortex generator arrangements. The model determines the strength of each vortex based on the generator geometry and the local flow conditions. Validation results are presented for flow in a straight pipe with a counter-rotating vortex generator arrangement, and the results are compared with experimental data and computational simulations using a gridded vane generator. Results are also presented for vortex generator arrays in two S-duct diffusers, along with accompanying experimental data. The effects of grid resolution and turbulence model are also examined.

  8. Aspects of the design of the automated system for code generation of electrical items of technological equipment

    Directory of Open Access Journals (Sweden)

    Erokhin V.V.

    2017-09-01

    Full Text Available The article presents aspects of designing an automated system for generating codes for the electrical elements of process equipment using CASE tools. We propose our own technology for the iterative development of such systems. The proposed methodology uses Computer Associates' database development tool ERwin Data Modeler together with the author's tool ERwin Class Builder for automatic code generation. The implemented design tool is a superstructure over ERwin Data Modeler from Computer Associates, which extends its functionality. ERwin Data Modeler works with logical and physical data models and allows a description of the database and DDL scripts to be generated.

  9. The Bacillus subtilis Conjugative Plasmid pLS20 Encodes Two Ribbon-Helix-Helix Type Auxiliary Relaxosome Proteins That Are Essential for Conjugation.

    Science.gov (United States)

    Miguel-Arribas, Andrés; Hao, Jian-An; Luque-Ortega, Juan R; Ramachandran, Gayetri; Val-Calvo, Jorge; Gago-Córdoba, César; González-Álvarez, Daniel; Abia, David; Alfonso, Carlos; Wu, Ling J; Meijer, Wilfried J J

    2017-01-01

    Bacterial conjugation is the process by which a conjugative element (CE) is transferred horizontally from a donor to a recipient cell via a connecting pore. One of the first steps in the conjugation process is the formation of a nucleoprotein complex at the origin of transfer (oriT), where one of the components of the nucleoprotein complex, the relaxase, introduces a site- and strand-specific nick to initiate the transfer of a single DNA strand into the recipient cell. In most cases, the nucleoprotein complex involves, besides the relaxase, one or more additional proteins, named auxiliary proteins, which are encoded by the CE and/or the host. The conjugative plasmid pLS20 replicates in the Gram-positive Firmicute bacterium Bacillus subtilis. We have recently identified the relaxase gene and the oriT of pLS20, which are separated by a region of almost 1 kb. Here we show that this region contains two auxiliary genes that we name aux1LS20 and aux2LS20, and which we show are essential for conjugation. Both Aux1LS20 and Aux2LS20 are predicted to contain a Ribbon-Helix-Helix DNA binding motif near their N-terminus. Analyses of the purified proteins show that Aux1LS20 and Aux2LS20 form tetramers and hexamers in solution, respectively, and that they both bind preferentially to oriTLS20, although with different characteristics and specificities. In silico analyses revealed that genes encoding homologs of Aux1LS20 and/or Aux2LS20 are located upstream of almost 400 relaxase genes of the RelLS20 family (MOBL) of relaxases. Thus, Aux1LS20 and Aux2LS20 of pLS20 constitute the founding members of the first two families of auxiliary proteins described for CEs of Gram-positive origin.

  10. Investigation and Applications of In-Source Oxidation in Liquid Sampling-Atmospheric Pressure Afterglow Microplasma Ionization (LS-APAG) Source.

    Science.gov (United States)

    Xie, Xiaobo; Wang, Zhenpeng; Li, Yafeng; Zhan, Lingpeng; Nie, Zongxiu

    2017-06-01

    A liquid sampling-atmospheric pressure afterglow microplasma ionization (LS-APAG) source is presented for the first time, which is embedded with both electrospray ionization (ESI) and atmospheric pressure afterglow microplasma ionization (APAG) techniques. This ion source is capable of analyzing compounds with diverse molecular weights and polarities. An unseparated mixture sample was detected as a proof of concept, giving complementary information (both polar and nonpolar species) with the two ionization modes. It should also be noted that the molecular mass can be quickly identified by ESI with clean and simple spectra, while the structure can be directly studied using APAG with in-source oxidation. The ionization/oxidation mechanism and applications of the LS-APAG source have been further explored in the analysis of nonpolar alkanes and unsaturated fatty acids/esters. A unique [M + O - 3H]+ ion was observed in the case of individual alkanes (C5-C19) and complex hydrocarbon mixtures under optimized conditions. Moreover, branched alkanes generated significant in-source fragments, which could be further applied to the discrimination of isomeric alkanes. The technique also facilitates facile determination of double-bond positions in unsaturated fatty acids/esters due to diagnostic fragments (the acid/ester-containing aldehyde and acid oxidation products) generated by on-line ozonolysis in APAG mode. Finally, some examples of in situ APAG analysis by gas sampling and surface sampling are given as well.

  11. Generation of neutron cross sections library for the Thermos code of the Fuel management System (FMS)

    International Nuclear Information System (INIS)

    Alonso V, G.; Viais J, J.

    1990-10-01

    A method is developed to generate the neutron cross-section library for the Thermos code from the ENDF-B/IV database using the NJOY code. The results obtained are compared with the previous version of the neutron cross-section library, which was processed using ENDF-B/III. (Author)

  12. LCM-seq reveals the crucial role of LsSOC1 in heat-promoted bolting of lettuce (Lactuca sativa L.).

    Science.gov (United States)

    Chen, Zijing; Zhao, Wensheng; Ge, Danfeng; Han, Yingyan; Ning, Kang; Luo, Chen; Wang, Shenglin; Liu, Renyi; Zhang, Xiaolan; Wang, Qian

    2018-05-17

    Lettuce (Lactuca sativa L.) is one of the most economically important vegetables. The floral transition in lettuce is accelerated under high temperatures, which can significantly decrease yields. However, the molecular mechanism underlying the floral transition in lettuce is poorly known. Using laser capture microdissection coupled with RNA sequencing, we isolated shoot apical meristem cells from the bolting-sensitive lettuce line S39 at four critical stages of development. Subsequently, we screened specifically for the flowering-related gene LsSOC1 during the floral transition through comparative transcriptomic analysis. Molecular biology, developmental biology, and biochemical tools were combined to investigate the biological function of LsSOC1 in lettuce. LsSOC1 knockdown by RNA interference resulted in a significant delay in the timing of bolting and insensitivity to high temperature, which indicated that LsSOC1 functions as an activator during heat-promoted bolting in lettuce. We determined that two heat-shock transcription factors, HsfA1e and HsfA4c, bound to the promoter of LsSOC1, confirming that LsSOC1 plays an important role in heat-promoted bolting. This study indicates that LsSOC1 plays a crucial role in the heat-promoted bolting process in lettuce. Further investigation of LsSOC1 may be useful for clarification of the bolting mechanism in lettuce. This article is protected by copyright. All rights reserved.

  13. Development of a 3D FEL code for the simulation of a high-gain harmonic generation experiment

    International Nuclear Information System (INIS)

    Biedron, S. G.

    1999-01-01

    Over the last few years, there has been a growing interest in self-amplified spontaneous emission (SASE) free-electron lasers (FELs) as a means for achieving a fourth-generation light source. In order to correctly and easily simulate the many configurations that have been suggested, such as multi-segmented wigglers and the method of high-gain harmonic generation, we have developed a robust three-dimensional code. The specifics of the code, the comparison to the linear theory as well as future plans will be presented

  14. The Bacillus subtilis Conjugative Plasmid pLS20 Encodes Two Ribbon-Helix-Helix Type Auxiliary Relaxosome Proteins That Are Essential for Conjugation

    Directory of Open Access Journals (Sweden)

    Andrés Miguel-Arribas

    2017-11-01

    Full Text Available Bacterial conjugation is the process by which a conjugative element (CE) is transferred horizontally from a donor to a recipient cell via a connecting pore. One of the first steps in the conjugation process is the formation of a nucleoprotein complex at the origin of transfer (oriT), where one of the components of the nucleoprotein complex, the relaxase, introduces a site- and strand-specific nick to initiate the transfer of a single DNA strand into the recipient cell. In most cases, the nucleoprotein complex involves, besides the relaxase, one or more additional proteins, named auxiliary proteins, which are encoded by the CE and/or the host. The conjugative plasmid pLS20 replicates in the Gram-positive Firmicute bacterium Bacillus subtilis. We have recently identified the relaxase gene and the oriT of pLS20, which are separated by a region of almost 1 kb. Here we show that this region contains two auxiliary genes that we name aux1LS20 and aux2LS20, and which we show are essential for conjugation. Both Aux1LS20 and Aux2LS20 are predicted to contain a Ribbon-Helix-Helix DNA binding motif near their N-terminus. Analyses of the purified proteins show that Aux1LS20 and Aux2LS20 form tetramers and hexamers in solution, respectively, and that they both bind preferentially to oriTLS20, although with different characteristics and specificities. In silico analyses revealed that genes encoding homologs of Aux1LS20 and/or Aux2LS20 are located upstream of almost 400 relaxase genes of the RelLS20 family (MOBL) of relaxases. Thus, Aux1LS20 and Aux2LS20 of pLS20 constitute the founding members of the first two families of auxiliary proteins described for CEs of Gram-positive origin.

  15. Steam generator transient studies using a simplified two-fluid computer code

    International Nuclear Information System (INIS)

    Munshi, P.; Bhatnagar, R.; Ram, K.S.

    1985-01-01

    A simplified two-fluid computer code has been used to simulate reactor-side (or primary-side) transients in a PWR steam generator. The disturbances are modelled as ramp inputs for pressure, internal energy and mass flow-rate for the primary fluid. The CPU time for a transient duration of 4 s is approx. 10 min on a DEC-1090 computer system. The results are thermodynamically consistent and encouraging for further studies. (author)

  16. Detectors plans for LS1

    International Nuclear Information System (INIS)

    Nessi, M.

    2012-01-01

    All experiments plan an effective usage of the LS1 shutdown period. After three years of running they will go through a consolidation phase, mostly to fix problems that have emerged over time, like single points of failure in the infrastructure, failures of low-voltage power supplies and optical links. Upgrades of some detector components will start, mainly related to the beam pipe, the innermost tracker elements and the trigger system. Detector components, which had to be staged for cost reasons in 2003, will then enter into the detector setup. The goal is to be fully ready for the new energy regime at nominal luminosity. This article reviews the planned maintenance and modification works for ATLAS, CMS, LHCb and ALICE experiments. (author)

  17. UNICOS CPC6: automated code generation for process control applications

    International Nuclear Information System (INIS)

    Fernandez Adiego, B.; Blanco Vinuela, E.; Prieto Barreiro, I.

    2012-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS). As a part of this framework, UNICOS-CPC provides a well-defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software; it discovers and calls the different plug-ins dynamically and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA generator (based on PVSS), and the CPC wizard, a dedicated plug-in created to provide the user with a friendly GUI (Graphical User Interface). A tool called UAB Bootstrap will manage the different UAB components, like CPC, and their dependencies on the resource packages. This tool guides the control system developer during the installation, update and execution of the UAB components. (authors)

  18. Error-prone PCR mutation of Ls-EPSPS gene from Liriope spicata conferring to its enhanced glyphosate-resistance.

    Science.gov (United States)

    Mao, Chanjuan; Xie, Hongjie; Chen, Shiguo; Valverde, Bernal E; Qiang, Sheng

    2017-09-01

    Liriope spicata (Thunb.) Lour has a unique LsEPSPS structure contributing to the highest-ever-recognized natural glyphosate tolerance. The transformed LsEPSPS confers increased glyphosate resistance to E. coli and A. thaliana. However, the increased glyphosate-resistance level is not high enough to be of commercial value. Therefore, LsEPSPS was subjected to error-prone PCR to screen for mutant EPSPS genes capable of endowing higher resistance levels. A mutant designated as ELs-EPSPS, having five mutated amino acids (37Val, 67Asn, 277Ser, 351Gly and 422Gly), was selected for its ability to confer improved resistance to glyphosate. Expression of ELs-EPSPS in recombinant E. coli BL21 (DE3) strains enhanced resistance to glyphosate in comparison to both the LsEPSPS-transformed and untransformed controls. Furthermore, transgenic ELs-EPSPS A. thaliana showed about 5.4-fold and 2-fold resistance to glyphosate compared with the wild-type and the Ls-EPSPS-transgenic plants, respectively. Therefore, the mutated ELs-EPSPS gene has potential value for the development of glyphosate-resistant crops. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Introducing instrumental variables in the LS-SVM based identification framework

    NARCIS (Netherlands)

    Laurain, V.; Zheng, W-X.; Toth, R.

    2011-01-01

    Least-Squares Support Vector Machines (LS-SVM) represent a promising approach to identify nonlinear systems via nonparametric estimation of the nonlinearities in a computationally and stochastically attractive way. All the methods dedicated to the solution of this problem rely on the minimization of

  20. The effect and contribution of wind generated rotation on outlet temperature and heat gain of LS-2 parabolic trough solar collector

    Directory of Open Access Journals (Sweden)

    Sadaghiyani Omid Karimi

    2013-01-01

    Full Text Available The Monte Carlo ray tracing (MCRT) method is applied and coupled with finite volume numerical methods to study the effect of rotation on the outlet temperature and heat gain of the LS-2 parabolic trough concentrator (PTC). Based on the effect of sunshape, the curvature of the mirror, and the use of MCRT, the heat flux distribution around the inner wall of the evacuated tube is calculated. After calculation of the heat flux, the geometry of the LS-2 Luz collector is created and the finite volume method is applied for the simulation. The results are compared with the test results of Dudley et al. for the irrotational cases in order to validate the numerical models; for the rotational models, the solution method is validated separately against K.S. Ball's results. In this work, according to the structure of the collector, a plug is used as a flow restriction. In the rotational case studies, the inner wall rotates at different angular speeds, and the results of the rotational collector are compared with the irrotational case. For these two main states, the location of the plug is also changed, and the outlet temperature and heat gain of the collector are studied. The results show that rotation plays a positive role in the heat transfer process and that a rotational plug in the bottom half of the tube is more effective than one in the upper half. The contribution of rotation is calculated in all case studies. The working fluid in this study is Syltherm-800, an oil derivative. The power of the wind can be used to rotate the tube of the collector.

  1. Application and evaluation of LS-PIV technique for the monitoring of river surface velocities in high flow conditions

    OpenAIRE

    Jodeau , M.; Hauet , A.; Paquier , A.; Le Coz , J.; Dramais , G.

    2008-01-01

    Large Scale Particle Image Velocimetry (LS-PIV) is used to measure the surface flow velocities in a mountain stream during high flow conditions due to a reservoir release. A complete installation including video acquisition from a mobile elevated viewpoint and artificial flow seeding has been developed and implemented. The LS-PIV method was adapted in order to take into account the specific constraints of these high flow conditions. Using a usual LS-PIV data processing, significant variations...

  2. Shock Transmission Analyses of a Simplified Frigate Compartment Using LS-DYNA

    National Research Council Canada - National Science Library

    Trouwborst, W

    1999-01-01

    This report gives results as obtained with finite element analyses using the explicit finite element program LS-DYNA for a longitudinal slice of a frigate's compartment loaded with a shock pulse based...

  3. EN-CV during LS1: upgrade, consolidation, maintenance, operation

    International Nuclear Information System (INIS)

    Nonis, M.

    2012-01-01

    The Cooling and Ventilation (CV) Group in the Engineering Department (EN) will be heavily involved in several projects and activities during the long shutdown in 2013 and 2014 (LS1) within a time-frame limited to around twelve months. According to the requests received so far, most projects are related to the upgrade of users' equipment, consolidation work, and the construction of new plants. However, through the experience gained from the first years of the LHC run, some projects are also needed to adapt the existing installations to the new operating parameters. Some of these projects are presented hereafter, outlining the impact that they will have on operational working conditions or risks of breakdown. Among these projects we find: the PM32 raising pumps, the cooling of the CERN Control Center, R2E, the backup cooling towers for ATLAS and cryogenics, a thermosyphon for ATLAS, or new pumps in UWs. Finally, EN-CV activities during LS1 for maintenance, operation, and commissioning will be mentioned since they represent a major workload for the Group

  4. Process of cross section generation for radiation shielding calculations, using the NJOY code

    International Nuclear Information System (INIS)

    Ono, S.; Corcuera, R.P.

    1986-10-01

    The process of multigroup cross-section generation for radiation shielding calculations, using the NJOY code, is explained. Photon production cross sections, processed by the GROUPR module, and photon interaction cross sections, processed by the GAMINR module, are given. These data are compared with the data produced by the AMPX system and with published data. (author)

  5. [Rapid determination of COD in aquaculture water based on LS-SVM with ultraviolet/visible spectroscopy].

    Science.gov (United States)

    Liu, Xue-Mei; Zhang, Hai-Liang

    2014-10-01

    Ultraviolet/visible (UV/Vis) spectroscopy was studied for the rapid determination of chemical oxygen demand (COD), an indicator of the concentration of organic matter in aquaculture water. In order to reduce the influence of the absolute noise of the spectra, the 135 extracted absorbance spectra were preprocessed by Savitzky-Golay smoothing (SG), EMD, and wavelet transform (WT) methods. The preprocessed spectra were then used to select latent variables (LVs) by the partial least squares (PLS) method. PLS was used to build models with the full spectra, and back-propagation neural network (BPNN) and least squares support vector machine (LS-SVM) models were built with the selected LVs. The overall results showed that the BPNN and LS-SVM models performed better than the PLS models, and the LS-SVM models with LVs based on WT-preprocessed spectra obtained the best results, with the determination coefficient (r2) and RMSE being 0.83 and 14.78 mg·L(-1) for the calibration set, and 0.82 and 14.82 mg·L(-1) for the prediction set, respectively. The results indicated that it is feasible to use UV/Vis spectra with LVs obtained by the PLS method, combined with LS-SVM calibration, for the rapid and accurate determination of COD in aquaculture water. Moreover, this study lays the foundation for further implementation of online analysis of aquaculture water and rapid determination of other water quality parameters.
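
    The LVs-plus-LS-SVM pipeline described above can be sketched generically. The Python example below uses synthetic data in place of the real spectra and substitutes kernel ridge regression for LS-SVM (the two solve closely related regularized least-squares problems in a kernel space; scikit-learn has no LS-SVM implementation), so it illustrates the workflow rather than the paper's calibration:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.kernel_ridge import KernelRidge
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(1)

# Synthetic stand-in for 135 absorbance spectra (600 wavelengths) and COD values.
X = rng.normal(size=(135, 600))
y = 3.0 * X[:, 50] - 2.0 * X[:, 200] + rng.normal(scale=0.3, size=135)

X_cal, X_pred = X[:100], X[100:]
y_cal, y_pred = y[:100], y[100:]

# Step 1: extract a small number of latent variables (LVs) with PLS.
pls = PLSRegression(n_components=8).fit(X_cal, y_cal)
T_cal = pls.transform(X_cal)
T_pred = pls.transform(X_pred)

# Step 2: nonlinear model on the LVs. Kernel ridge regression stands in for
# LS-SVM here (both are kernel-space regularized least-squares formulations).
model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1).fit(T_cal, y_cal)

y_hat = model.predict(T_pred)
print("r2 =", r2_score(y_pred, y_hat),
      "RMSE =", mean_squared_error(y_pred, y_hat) ** 0.5)
```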

  6. Experimental benchmark and code validation for airfoils equipped with passive vortex generators

    International Nuclear Information System (INIS)

    Baldacchino, D; Ferreira, C; Florentie, L; Timmer, N; Van Zuijlen, A; Manolesos, M; Chaviaropoulos, T; Diakakis, K; Papadakis, G; Voutsinas, S; González Salcedo, Á; Aparicio, M; García, N R.; Sørensen, N N.; Troldborg, N

    2016-01-01

    Experimental results and complementary computations for airfoils with vortex generators are compared in this paper, as part of an effort within the AVATAR project to develop tools for wind turbine blade control devices. Measurements from two airfoils equipped with passive vortex generators, a 30% thick DU97W300 and an 18% thick NTUA T18, have been used for benchmarking several simulation tools. These tools span low-to-high complexity, ranging from engineering-level integral boundary layer tools to fully-resolved computational fluid dynamics codes. Results indicate that with appropriate calibration, engineering-type tools can capture the effects of vortex generators and outperform more complex tools. Fully resolved CFD comes at a much higher computational cost and does not necessarily capture the increased lift due to the VGs. However, in lieu of the limited experimental data available for calibration, high fidelity tools are still required for assessing the effect of vortex generators on airfoil performance. (paper)

  7. LS-DYNA Analysis of a Full-Scale Helicopter Crash Test

    Science.gov (United States)

    Annett, Martin S.

    2010-01-01

    A full-scale crash test of an MD-500 helicopter was conducted in December 2009 at NASA Langley's Landing and Impact Research facility (LandIR). The MD-500 helicopter was fitted with a composite honeycomb Deployable Energy Absorber (DEA) and tested under vertical and horizontal impact velocities of 26 ft/sec and 40 ft/sec, respectively. The objectives of the test were to evaluate the performance of the DEA concept under realistic crash conditions and to generate test data for validation of a system-integrated LS-DYNA finite element model. In preparation for the full-scale crash test, a series of sub-scale and MD-500 mass simulator tests was conducted to evaluate the impact performance of various components, including a new crush tube and the DEA blocks. Parameters defined within the system-integrated finite element model were determined from these tests. The objective of this paper is to summarize the finite element models developed and analyses performed, beginning with pre-test work and continuing through post-test validation.

  8. ARTEMIS: The core simulator of AREVA NP's next generation coupled neutronics/thermal-hydraulics code system ARCADIA®

    International Nuclear Information System (INIS)

    Hobson, Greg; Merk, Stephan; Bolloni, Hans-Wilhelm; Breith, Karl-Albert; Curca-Tivig, Florin; Van Geemert, Rene; Heinecke, Jochen; Hartmann, Bettina; Porsch, Dieter; Tiles, Viatcheslav; Dall'Osso, Aldo; Pothet, Baptiste

    2008-01-01

    AREVA NP has developed a next-generation coupled neutronics/thermal-hydraulics code system, ARCADIA®, to fulfil customers' current demands and even anticipate their future demands in terms of accuracy and performance. The new code system will be implemented world-wide and will replace several code systems currently used in various global regions. An extensive phase of verification and validation of the new code system is currently in progress. One of the principal components of this new system is the core simulator, ARTEMIS. Besides the stand-alone tests on the individual computational modules, integrated tests on the overall code are being performed in order to check for non-regression as well as for verification of the code. Several benchmark problems have been successfully calculated. Full-core depletion cycles of different plant types from AREVA's French, American and German regions (e.g. N4 and KONVOI types) have been performed with ARTEMIS (using APOLLO2-A cross sections) and compared directly with current production codes, e.g. with SCIENCE and CASCADE-3D, and additionally with measurements. (authors)

  9. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for code theory than codes over classical finite fields.
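
    For orientation only, the sketch below shows what generator and parity-check matrices characterize in the familiar case of a code over the classical finite field GF(2); the paper's point is precisely that its codes live over Krasner hyperfields instead, so this is background, not the paper's construction:

```python
import numpy as np

# Generator and parity-check matrices of the classical [7,4] Hamming code over
# GF(2), shown only to recall what G and H characterize.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=int)

H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]], dtype=int)

# Every codeword c = m G (mod 2) satisfies H c^T = 0 (mod 2).
print((G @ H.T) % 2)                      # all-zero matrix

m = np.array([1, 0, 1, 1])                # an information word
c = (m @ G) % 2                           # its codeword
print(c, (H @ c) % 2)                     # syndrome is zero
```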

  10. Nanoscale characterization and local piezoelectric properties of lead-free KNN-LT-LS thin films

    Energy Technology Data Exchange (ETDEWEB)

    Abazari, M; Safari, A [Glenn Howatt Electroceramics Laboratories, Department of Materials Science and Engineering, Rutgers-The state University of New Jersey, Piscataway, NJ 08854 (United States); Choi, T; Cheong, S-W [Rutgers Center for Emergent Materials, Department of Physics and Astronomy, Rutgers-The state University of New Jersey, Piscataway, NJ 08854 (United States)

    2010-01-20

    We report the observation of the domain structure and piezoelectric properties of pure and Mn-doped (K{sub 0.44},Na{sub 0.52},Li{sub 0.04})(Nb{sub 0.84},Ta{sub 0.1},Sb{sub 0.06})O{sub 3} (KNN-LT-LS) thin films on SrTiO{sub 3} substrates. Piezoresponse force microscopy revealed that the ferroelectric domain structure in such 500 nm thin films is comprised primarily of 180° domains. This was in accordance with the tetragonal structure of the films, confirmed by relative permittivity measurements and x-ray diffraction patterns. The effective piezoelectric coefficient (d{sub 33}) of the films was calculated using piezoelectric displacement curves and shown to be {approx}53 pm V{sup -1} for pure KNN-LT-LS thin films. This value is among the highest values reported for an epitaxial lead-free thin film and shows a great potential for KNN-LT-LS to serve as an alternative to PZT thin films in future applications.

  11. Nanoscale characterization and local piezoelectric properties of lead-free KNN-LT-LS thin films

    Science.gov (United States)

    Abazari, M.; Choi, T.; Cheong, S.-W.; Safari, A.

    2010-01-01

    We report the observation of the domain structure and piezoelectric properties of pure and Mn-doped (K0.44,Na0.52,Li0.04)(Nb0.84,Ta0.1,Sb0.06)O3 (KNN-LT-LS) thin films on SrTiO3 substrates. Piezoresponse force microscopy revealed that the ferroelectric domain structure in such 500 nm thin films is comprised primarily of 180° domains. This was in accordance with the tetragonal structure of the films, confirmed by relative permittivity measurements and x-ray diffraction patterns. The effective piezoelectric coefficient (d33) of the films was calculated using piezoelectric displacement curves and shown to be ~53 pm V-1 for pure KNN-LT-LS thin films. This value is among the highest values reported for an epitaxial lead-free thin film and shows a great potential for KNN-LT-LS to serve as an alternative to PZT thin films in future applications.

  12. Inverse European Latitudinal Cline at the timeless Locus of Drosophila melanogaster Reveals Selection on a Clock Gene: Population Genetics of ls-tim.

    Science.gov (United States)

    Zonato, Valeria; Vanin, Stefano; Costa, Rodolfo; Tauber, Eran; Kyriacou, Charalambos P

    2018-02-01

    The spread of adaptive genetic variants in populations is a cornerstone of evolutionary theory, but there are relatively few biologically well-understood examples. Previous work on the ls-tim variant of timeless, which encodes the light-sensitive circadian regulator in Drosophila melanogaster, suggests that it may have originated in southeastern Italy. Flies characterized by the new allele show photoperiod-related phenotypes likely to be adaptive in seasonal environments. ls-tim may be spreading from its point of origin in Italy by directional selection, but there are alternative explanations for its observed clinal geographical distribution, including balancing selection and demography. From population analyses of ls-tim frequencies collected on the eastern side of the Iberian Peninsula, we show that ls-tim frequencies are inverted compared with those in Italy. This pattern is consistent with a scenario of directional selection rather than latitude-associated balancing selection. Neutrality tests further reveal the signature of directional selection at the ls-tim site, which is reduced a few kilobase pairs either side of ls-tim. A reanalysis of allele frequencies from a large number of microsatellite loci does not demonstrate any frequent ls-tim-like spatial patterns, so a general demographic effect or population expansion from southeastern Italy cannot readily explain current ls-tim frequencies. Finally, a revised estimate of the age of the ls-tim allele using linkage disequilibrium and coalescent-based approaches reveals that it may be only 300 to 3000 years old, perhaps explaining why it has not yet gone to fixation. ls-tim thus provides a rare temporal snapshot of a new allele that has come under selection before it reaches equilibrium.

  13. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
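
    A toy illustration of the systematic LDGM idea is sketched below (made-up dimensions and row weight, over GF(2), without the paper's concatenation or decoder): the generator matrix is [I_k | P] with a sparse P, so encoding costs only a handful of parity operations per row:

```python
import numpy as np

rng = np.random.default_rng(7)

k, n = 8, 16                     # toy dimensions; real LDGM codes are much longer
row_weight = 3                   # "low density": few ones per row of P

# Sparse part P of a systematic generator matrix G = [I_k | P] over GF(2).
P = np.zeros((k, n - k), dtype=int)
for i in range(k):
    cols = rng.choice(n - k, size=row_weight, replace=False)
    P[i, cols] = 1

G = np.hstack([np.eye(k, dtype=int), P])

def encode(message):
    """Systematic LDGM encoding: codeword = [message | message @ P] (mod 2)."""
    message = np.asarray(message, dtype=int)
    return np.concatenate([message, (message @ P) % 2])

msg = rng.integers(0, 2, size=k)
print(msg, encode(msg))
```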

  14. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    Directory of Open Access Journals (Sweden)

    CARVALHO, J. S. C.

    2008-12-01

    Full Text Available During software development, one of the most visible risks, and perhaps the biggest implementation obstacle, relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which will give rise to a development tool for the automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.

  15. GeNN: a code generation framework for accelerated brain simulations

    Science.gov (United States)

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-01

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance based Hodgkin-Huxley neurons but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/.

  16. GENII [Generation II]: The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    International Nuclear Information System (INIS)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs

  17. GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

  18. A dynamic, dependent type system for nuclear fuel cycle code generation

    Energy Technology Data Exchange (ETDEWEB)

    Scopatz, A. [The University of Chicago 5754 S. Ellis Ave, Chicago, IL 60637 (United States)

    2013-07-01

    The nuclear fuel cycle may be interpreted as a network or graph, thus allowing methods from formal graph theory to be used. Nodes are often idealized as nuclear fuel cycle facilities (reactors, enrichment cascades, deep geologic repositories). With the advent of modern object-oriented programming languages - and fuel cycle simulators implemented in these languages - it is natural to define a class hierarchy of facility types. Bright is a quasi-static simulator, meaning that the number of material passes through a facility is tracked rather than natural time. Bright is implemented as a C++ library that models many canonical components such as reactors, storage facilities, and more. Cyclus is a discrete-time simulator, meaning that natural time is tracked throughout the simulation. Therefore, a robust, dependent type system was developed to enable inter-operability between Bright and Cyclus. This system is capable of representing any fuel cycle facility. Types declared in this system can then be used to automatically generate code which binds a facility implementation to a simulator front end. Facility model wrappers may be used either internally to a fuel cycle simulator or as a mechanism for inter-operating multiple simulators. While such a tool has many potential use cases, it has two main purposes: enabling code-to-code comparisons to be performed easily, and the verification and validation of user input.

  19. DESIGN AND EVALUATION OF AN LS-SVM-BASED TEXTURE CLASSIFIER

    Directory of Open Access Journals (Sweden)

    Beitmantt Cárdenas Quintero

    2013-07-01

    Full Text Available Objective: to evaluate the performance and computational cost of different Least Squares Support Vector Machine (LS-SVM) architectures and methodologies for texture-based image segmentation and, from those results, to propose a model of an LS-SVM texture classifier. Methodology: for a binary classification problem represented by the segmentation of 32 images, organized into 4 groups and formed by pairs of typical textures (granite/bark, brick/upholstery, wood/marble, fabric/fur), the performance and computational cost of two kernel types (radial/polynomial), two optimization functions (local minimum/exhaustive search) and two cost functions (random cross-validation/leave-one-out cross-validation) are measured and compared in an LS-SVM that takes as input the pixels forming the cross-shaped neighborhood of the pixel under evaluation (no feature extraction is performed). Results: as a texture classifier, the LS-SVM shows better performance and requires a lower computational cost when it uses a radial basis kernel and an optimization function based on a local-minimum search algorithm, together with a cost function that uses random cross-validation.

  20. ASSOCIATING LONG-TERM γ-RAY VARIABILITY WITH THE SUPERORBITAL PERIOD OF LS I +61°303

    Energy Technology Data Exchange (ETDEWEB)

    Ackermann, M.; Buehler, R. [Deutsches Elektronen Synchrotron DESY, D-15738 Zeuthen (Germany); Ajello, M. [Space Sciences Laboratory, 7 Gauss Way, University of California, Berkeley, CA 94720-7450 (United States); Ballet, J.; Casandjian, J. M. [Laboratoire AIM, CEA-IRFU/CNRS/Université Paris Diderot, Service d'Astrophysique, CEA Saclay, F-91191 Gif sur Yvette (France); Barbiellini, G. [Istituto Nazionale di Fisica Nucleare, Sezione di Trieste, I-34127 Trieste (Italy); Bastieri, D.; Buson, S. [Istituto Nazionale di Fisica Nucleare, Sezione di Padova, I-35131 Padova (Italy); Bellazzini, R.; Bregeon, J. [Istituto Nazionale di Fisica Nucleare, Sezione di Pisa, I-56127 Pisa (Italy); Bonamente, E.; Cecchi, C. [Istituto Nazionale di Fisica Nucleare, Sezione di Perugia, I-06123 Perugia (Italy); Brandt, T. J. [NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Brigida, M. [Dipartimento di Fisica 'M. Merlin' dell'Università e del Politecnico di Bari, I-70126 Bari (Italy); Bruel, P. [Laboratoire Leprince-Ringuet, École polytechnique, CNRS/IN2P3, F-91128 Palaiseau (France); Caliandro, G. A. [Institute of Space Sciences (IEEE-CSIC), Campus UAB, E-08193 Barcelona (Spain); Cameron, R. A. [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States); Caraveo, P. A. [INAF-Istituto di Astrofisica Spaziale e Fisica Cosmica, I-20133 Milano (Italy); Cavazzuti, E. [Agenzia Spaziale Italiana (ASI) Science Data Center, I-00044 Frascati (Roma) (Italy); Chekhtman, A., E-mail: andrea.caliandro@ieec.uab.es, E-mail: hadasch@ieec.uab.es, E-mail: dtorres@ieec.uab.es [Center for Earth Observing and Space Research, College of Science, George Mason University, Fairfax, VA 22030 (United States); and others

    2013-08-20

    Gamma-ray binaries are stellar systems for which the spectral energy distribution (discounting the thermal stellar emission) peaks at high energies. Detected from radio to TeV gamma rays, the γ-ray binary LS I +61°303 is highly variable across all frequencies. One aspect of this system's variability is the modulation of its emission with the timescale set by the ~26.4960 day orbital period. Here we show that, during the time of our observations, the γ-ray emission of LS I +61°303 also presents a sinusoidal variability consistent with the previously known superorbital period of 1667 days. This modulation is more prominently seen at orbital phases around apastron, whereas it does not introduce a visible change close to periastron. It is also found in the appearance and disappearance of variability at the orbital period in the power spectrum of the data. This behavior could be explained by a quasi-cyclical evolution of the equatorial outflow of the Be companion star, whose features influence the conditions for generating gamma rays. These findings open the possibility to use γ-ray observations to study the outflows of massive stars in eccentric binary systems.

  1. Improved Side Information Generation for Distributed Video Coding by Exploiting Spatial and Temporal Correlations

    Directory of Open Access Journals (Sweden)

    Ye Shuiming

    2009-01-01

    Full Text Available Distributed video coding (DVC) is a video coding paradigm allowing low-complexity encoding for emerging applications such as wireless video surveillance. Side information (SI) generation is a key function in the DVC decoder and plays a key role in determining the performance of the codec. This paper proposes an improved SI generation for DVC, which exploits both spatial and temporal correlations in the sequences. Partially decoded Wyner-Ziv (WZ) frames, based on initial SI by motion-compensated temporal interpolation, are exploited to improve the performance of the whole SI generation. More specifically, an enhanced temporal frame interpolation is proposed, including motion vector refinement and smoothing, optimal compensation mode selection, and a new matching criterion for motion estimation. The improved SI technique is also applied to a new hybrid spatial and temporal error concealment scheme to conceal errors in WZ frames. Simulation results show that the proposed scheme can achieve up to 1.0 dB improvement in rate-distortion performance in WZ frames for video with high motion, when compared to state-of-the-art DVC. In addition, both the objective and perceptual qualities of the corrupted sequences are significantly improved by the proposed hybrid error concealment scheme, outperforming both spatial and temporal concealments alone.
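
    As a rough sketch of the baseline step on which the proposed refinements build, the following numpy code performs a naive motion-compensated temporal interpolation between two key frames; the block size, search range and SAD matching criterion are illustrative choices, not the paper's exact parameters.

      import numpy as np

      def mcti_side_info(prev, nxt, block=8, search=4):
          """Naive side-information frame between two key frames.

          For each block of `nxt`, the best-matching block in `prev` is found by a
          full search with a sum-of-absolute-differences (SAD) criterion, and the
          side information is the average of the matched pair.  (A real DVC decoder
          splits the motion vector to the temporal midpoint and refines/smooths it.)
          """
          h, w = prev.shape
          si = np.zeros((h, w))
          for by in range(0, h - block + 1, block):
              for bx in range(0, w - block + 1, block):
                  cur = nxt[by:by+block, bx:bx+block].astype(float)
                  best_sad, best_ref = np.inf, cur
                  for dy in range(-search, search + 1):
                      for dx in range(-search, search + 1):
                          y, x = by + dy, bx + dx
                          if 0 <= y <= h - block and 0 <= x <= w - block:
                              ref = prev[y:y+block, x:x+block].astype(float)
                              sad = np.abs(cur - ref).sum()
                              if sad < best_sad:
                                  best_sad, best_ref = sad, ref
                  si[by:by+block, bx:bx+block] = 0.5 * (cur + best_ref)
          return si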

  2. Immunomodulatory Effects of Lactobacillus salivarius LS01 and Bifidobacterium breve BR03, Alone and in Combination, on Peripheral Blood Mononuclear Cells of Allergic Asthmatics.

    Science.gov (United States)

    Drago, Lorenzo; De Vecchi, Elena; Gabrieli, Arianna; De Grandi, Roberta; Toscano, Marco

    2015-07-01

    The aim of this study was to evaluate probiotic characteristics of Lactobacillus salivarius LS01 and Bifidobacterium breve BR03, alone and in combination, and their immunomodulatory activity in asthmatic subjects. Subjects affected by allergic asthma were recruited. Initially, LS01 and BR03 were analyzed for their growth compatibility by a broth compatibility assay. To study the antimicrobial activity of the probiotic strains, an agar diffusion assay was performed. Finally, cytokine production by peripheral blood mononuclear cells (PBMCs) stimulated with LS01 and BR03 was determined by means of a specific quantitative enzyme-linked immunosorbent assay (ELISA). The growth of some clinical pathogens was slightly inhibited by LS01 and LS01-BR03 co-culture supernatant not neutralized to pH 6.5, while only the growth of E. coli and S. aureus was inhibited by the supernatant of LS01 and LS01-BR03 neutralized to pH 6.5. Furthermore, the LS01 and BR03 combination was able to decrease the secretion of proinflammatory cytokines by PBMCs, leading to a marked increase in IL-10 production. L. salivarius LS01 and B. breve BR03 showed promising probiotic properties and beneficial immunomodulatory activity that are increased when the 2 strains are used in combination in the same formulation.

  3. Generation of the library of neutron cross sections for the Record code of the Fuel Management System (FMS)

    International Nuclear Information System (INIS)

    Alonso V, G.; Hernandez L, H.

    1991-11-01

    On the basis of the library structure of the RECORD code, a method to generate the neutron cross sections from the ENDF/B-IV database by means of the NJOY code has been developed. The obtained cross sections are compared with those of the current library, which was processed using the ENDF/B-III version. (Author)

  4. The CAIN computer code for the generation of MABEL input data sets: a user's manual

    International Nuclear Information System (INIS)

    Tilley, D.R.

    1983-03-01

    CAIN is an interactive FORTRAN computer code designed to overcome the substantial effort involved in manually creating the thermal-hydraulics input data required by MABEL-2. CAIN achieves this by processing output from either of the whole-core codes, RELAP or TRAC, interpolating where necessary, and by scanning RELAP/TRAC output in order to generate additional information. This user's manual describes the actions required in order to create RELAP/TRAC data sets from magnetic tape, to create the other input data sets required by CAIN, and to operate the interactive command procedure for the execution of CAIN. In addition, the CAIN code is described in detail. This programme of work is part of the Nuclear Installations Inspectorate (NII)'s contribution to the United Kingdom Atomic Energy Authority's independent safety assessment of pressurized water reactors. (author)

  5. Hypertension Knowledge-Level Scale (HK-LS): A Study on Development, Validity and Reliability

    Directory of Open Access Journals (Sweden)

    Cemalettin Kalyoncu

    2012-03-01

    Full Text Available This study was conducted to develop a scale to measure knowledge about hypertension among Turkish adults. The Hypertension Knowledge-Level Scale (HK-LS) was generated based on content, face, and construct validity, internal consistency, test re-test reliability, and discriminative validity procedures. The final scale had 22 items with six sub-dimensions. The scale was applied to 457 individuals aged ≥18 years, and 414 of them were re-evaluated for test-retest reliability. The six sub-dimensions encompassed 60.3% of the total variance. Cronbach alpha coefficients were 0.82 for the entire scale and 0.92, 0.59, 0.67, 0.77, 0.72, and 0.76 for the sub-dimensions of definition, medical treatment, drug compliance, lifestyle, diet, and complications, respectively. The scale ensured internal consistency in reliability and construct validity, as well as stability over time. Significant relationships were found between knowledge score and age, gender, educational level, and history of hypertension of the participants. No correlation was found between knowledge score and working at an income-generating job. The present scale, developed to measure the knowledge level of hypertension among Turkish adults, was found to be valid and reliable.
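
    The internal-consistency statistic reported above, Cronbach's alpha, can be computed directly from an item-score matrix; the following minimal sketch uses toy data and is not tied to this study's dataset.

      import numpy as np

      def cronbach_alpha(scores):
          """Cronbach's alpha for an (n_respondents x k_items) score matrix.

          alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
          """
          scores = np.asarray(scores, dtype=float)
          k = scores.shape[1]
          item_var = scores.var(axis=0, ddof=1).sum()
          total_var = scores.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_var / total_var)

      # Toy example: 5 respondents answering 4 items (0 = wrong, 1 = correct).
      answers = [[1, 1, 1, 0],
                 [1, 0, 1, 0],
                 [0, 0, 0, 0],
                 [1, 1, 1, 1],
                 [1, 1, 0, 1]]
      print(round(cronbach_alpha(answers), 3))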

  6. Hypertension Knowledge-Level Scale (HK-LS): a study on development, validity and reliability.

    Science.gov (United States)

    Erkoc, Sultan Baliz; Isikli, Burhanettin; Metintas, Selma; Kalyoncu, Cemalettin

    2012-03-01

    This study was conducted to develop a scale to measure knowledge about hypertension among Turkish adults. The Hypertension Knowledge-Level Scale (HK-LS) was generated based on content, face, and construct validity, internal consistency, test re-test reliability, and discriminative validity procedures. The final scale had 22 items with six sub-dimensions. The scale was applied to 457 individuals aged ≥ 18 years, and 414 of them were re-evaluated for test-retest reliability. The six sub-dimensions encompassed 60.3% of the total variance. Cronbach alpha coefficients were 0.82 for the entire scale and 0.92, 0.59, 0.67, 0.77, 0.72, and 0.76 for the sub-dimensions of definition, medical treatment, drug compliance, lifestyle, diet, and complications, respectively. The scale ensured internal consistency in reliability and construct validity, as well as stability over time. Significant relationships were found between knowledge score and age, gender, educational level, and history of hypertension of the participants. No correlation was found between knowledge score and working at an income-generating job. The present scale, developed to measure the knowledge level of hypertension among Turkish adults, was found to be valid and reliable.

  7. Using MathWorks' Simulink® and Real-Time Workshop® Code Generator to Produce Attitude Control Test and Flight Code

    OpenAIRE

    Salada, Mark; Dellinger, Wayne

    1998-01-01

    This paper describes the use of a commercial product, MathWorks' Real-Time Workshop® (RTW), to generate actual flight code for NASA's Thermosphere, Ionosphere, Mesosphere Energetics and Dynamics (TIMED) mission. The Johns Hopkins University Applied Physics Laboratory is handling the design and construction of this satellite for NASA. As TIMED is scheduled to launch in May of the year 2000, software development for both ground and flight systems is well under way. However, based on experien...

  8. A computer code for calculation of radioactive nuclide generation and depletion, decay heat and γ ray spectrum. FPGS90

    International Nuclear Information System (INIS)

    Ihara, Hitoshi; Katakura, Jun-ichi; Nakagawa, Tsuneo

    1995-11-01

    In a nuclear reactor, radioactive nuclides are generated and depleted as the nuclear fuel burns up. These radioactive nuclides, emitting γ rays and β rays, act as sources of decay heat in the reactor and of radiation exposure. In the safety evaluation of nuclear reactors and the nuclear fuel cycle, it is necessary to estimate the numbers of nuclides generated under the various burn-up conditions of the many kinds of nuclear fuel used in a reactor. FPGS90 is a code that calculates the number of nuclides, the decay heat and the spectrum of γ rays emitted by the fission products produced in nuclear fuel under various burn-up conditions. The nuclear data library used in the FPGS90 code is the 'JNDC Nuclear Data Library of Fission Products - second version -', compiled by a working group of the Japanese Nuclear Data Committee for evaluating decay heat in a reactor. The code can also process evaluated nuclear data files such as ENDF/B, JENDL and ENSDF, and can produce figures of the calculated results. Using the FPGS90 code it is possible to carry out the whole workflow, from building the library and calculating nuclide generation and decay heat through to plotting the calculated results. (author)
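
    Codes of this kind integrate the coupled generation-and-depletion (Bateman) equations for the nuclide inventory and sum the resulting activities to obtain decay heat. The sketch below illustrates the idea for a two-member decay chain with arbitrary, illustrative constants and no neutron-induced reactions; it is not the FPGS90 algorithm.

      import numpy as np

      # Minimal depletion sketch: dN/dt = A N for a linear decay chain A -> B -> C,
      # integrated with a simple first-order explicit scheme.  The decay constants
      # and energies below are arbitrary illustrative values, not evaluated data.
      lam = {"A": 1.0e-3, "B": 5.0e-4}                 # decay constants [1/s]
      A = np.array([[-lam["A"],       0.0, 0.0],
                    [ lam["A"], -lam["B"], 0.0],
                    [      0.0,  lam["B"], 0.0]])

      def deplete(n0, t, steps=1000):
          """Integrate dN/dt = A N from 0 to t."""
          n = np.array(n0, dtype=float)
          dt = t / steps
          for _ in range(steps):
              n = n + dt * (A @ n)
          return n

      n_end = deplete([1.0e20, 0.0, 0.0], t=3600.0)    # one hour of decay
      q = {"A": 1.2e-13, "B": 0.8e-13}                 # energy per decay [J], illustrative
      heat = lam["A"] * n_end[0] * q["A"] + lam["B"] * n_end[1] * q["B"]
      print(n_end, heat)                               # nuclide inventory and decay heat [W]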

  9. A computer code for calculation of radioactive nuclide generation and depletion, decay heat and {gamma} ray spectrum. FPGS90

    Energy Technology Data Exchange (ETDEWEB)

    Ihara, Hitoshi; Katakura, Jun-ichi; Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1995-11-01

    In a nuclear reactor, radioactive nuclides are generated and depleted as the nuclear fuel burns up. These radioactive nuclides, emitting γ rays and β rays, act as sources of decay heat in the reactor and of radiation exposure. In the safety evaluation of nuclear reactors and the nuclear fuel cycle, it is necessary to estimate the numbers of nuclides generated under the various burn-up conditions of the many kinds of nuclear fuel used in a reactor. FPGS90 is a code that calculates the number of nuclides, the decay heat and the spectrum of γ rays emitted by the fission products produced in nuclear fuel under various burn-up conditions. The nuclear data library used in the FPGS90 code is the 'JNDC Nuclear Data Library of Fission Products - second version -', compiled by a working group of the Japanese Nuclear Data Committee for evaluating decay heat in a reactor. The code can also process evaluated nuclear data files such as ENDF/B, JENDL and ENSDF, and can produce figures of the calculated results. Using the FPGS90 code it is possible to carry out the whole workflow, from building the library and calculating nuclide generation and decay heat through to plotting the calculated results. (author).

  10. High-energy emissions from the gamma-ray binary LS 5039

    Energy Technology Data Exchange (ETDEWEB)

    Takata, J.; Leung, Gene C. K.; Cheng, K. S. [Department of Physics, University of Hong Kong, Pokfulam Road (Hong Kong); Tam, P. H. T.; Kong, A. K. H. [Institute of Astronomy and Department of Physics, National Tsing Hua University, Hsinchu, Taiwan (China); Hui, C. Y., E-mail: takata@hku.hk, E-mail: gene930@connect.hku.hk, E-mail: hrspksc@hku.hk [Department of Astronomy and Space Science, Chungnam National University, Daejeon (Korea, Republic of)

    2014-07-20

    We study mechanisms of multi-wavelength emissions (X-ray, GeV, and TeV gamma-rays) from the gamma-ray binary LS 5039. This paper is composed of two parts. In the first part, we report on results of observational analysis using 4 yr data of the Fermi Large Area Telescope. Due to the improvement of instrumental response function and increase of the statistics, the observational uncertainties of the spectrum in the ∼100-300 MeV bands and >10 GeV bands are significantly improved. The present data analysis suggests that the 0.1-100 GeV emissions from LS 5039 contain three different components: (1) the first component contributes to <1 GeV emissions around superior conjunction, (2) the second component dominates in the 1-10 GeV energy bands, and (3) the third component is compatible with the lower-energy tail of the TeV emissions. In the second part, we develop an emission model to explain the properties of the phase-resolved emissions in multi-wavelength observations. Assuming that LS 5039 includes a pulsar, we argue that emissions from both the magnetospheric outer gap and the inverse-Compton scattering process of cold-relativistic pulsar wind contribute to the observed GeV emissions. We assume that the pulsar is wrapped by two kinds of termination shock: Shock-I due to the interaction between the pulsar wind and the stellar wind and Shock-II due to the effect of the orbital motion. We propose that the X-rays are produced by the synchrotron radiation at the Shock-I region and the TeV gamma-rays are produced by the inverse-Compton scattering process at the Shock-II region.

  11. Remediation of Learning Disable Children Following L.S. Vygotsky's Approach

    Directory of Open Access Journals (Sweden)

    Janna M. Glozman

    2011-01-01

    Full Text Available The paper defines remediating education, its peculiarities with respect to traditional education, and its main tasks and principles, based upon the cultural-historical theory of L.S. Vygotsky. Base functional systems formed during remediation are discussed. Peculiarities of individual, group and dyadic methods of remediation are described with regard to their potential for mediating the child's activity.

  12. A new Em-like protein from Lactuca sativa, LsEm1, enhances drought and salt stress tolerance in Escherichia coli and rice.

    Science.gov (United States)

    Xiang, Dian-Jun; Man, Li-Li; Zhang, Chun-Lan; Peng-Liu; Li, Zhi-Gang; Zheng, Gen-Chang

    2018-02-07

    Late embryogenesis abundant (LEA) proteins are closely related to the abiotic stress tolerance of plants. In the present study, we identified a novel Em-like gene from lettuce, termed LsEm1, which could be classified into group 1 LEA proteins and shared high homology with the Cynara cardunculus Em protein. The LsEm1 protein contained three different 20-mer conserved elements (C-element, N-element, and M-element) in the C-termini, N-termini, and middle-region, respectively. The LsEm1 mRNAs accumulated in all examined tissues during the flowering and mature stages, with little accumulation in the roots and leaves during the seedling stage. Furthermore, the LsEm1 gene was also expressed in response to salt, dehydration, abscisic acid (ABA), and cold stresses in young seedlings. The LsEm1 protein could effectively reduce damage to lactate dehydrogenase (LDH) and protect LDH activity under desiccation and salt treatments. Escherichia coli cells overexpressing the LsEm1 gene showed a growth advantage over the control under drought and salt stresses. Moreover, LsEm1-overexpressing rice seeds were relatively sensitive to exogenously applied ABA, suggesting that the LsEm1 gene might depend on an ABA signaling pathway in response to environmental stresses. The transgenic rice plants overexpressing the LsEm1 gene showed higher tolerance to drought and salt stresses than did wild-type (WT) plants, on the basis of germination performance, higher survival rates, higher chlorophyll content, more accumulation of soluble sugar, lower relative electrolyte leakage, and higher superoxide dismutase activity under stress conditions. The LsEm1-overexpressing rice lines also showed less yield loss compared with WT rice under stress conditions. Furthermore, the LsEm1 gene had a positive effect on the expression of the OsCDPK9, OsCDPK13, OsCDPK15, OsCDPK25, and rab21 (rab16a) genes in transgenic rice under drought and salt stress conditions, implying that overexpression of these

  13. Novel Hybrid of LS-SVM and Kalman Filter for GPS/INS Integration

    Science.gov (United States)

    Xu, Zhenkai; Li, Yong; Rizos, Chris; Xu, Xiaosu

    Integration of Global Positioning System (GPS) and Inertial Navigation System (INS) technologies can overcome the drawbacks of the individual systems. One of the advantages is that the integrated solution can provide continuous navigation capability even during GPS outages. However, bridging the GPS outages is still a challenge when Micro-Electro-Mechanical System (MEMS) inertial sensors are used. Methods currently being explored by the research community include applying vehicle motion constraints, optimal smoothers, and artificial intelligence (AI) techniques. In the research area of AI, the neural network (NN) approach has been extensively utilised up to the present. In an NN-based integrated system, a Kalman filter (KF) estimates position, velocity and attitude errors, as well as the inertial sensor errors, to output navigation solutions while GPS signals are available. At the same time, an NN is trained to map the vehicle dynamics with corresponding KF states, and to correct INS measurements when GPS measurements are unavailable. To achieve good performance it is critical to select suitable quality and an optimal number of samples for the NN. This is sometimes too rigorous a requirement, which limits real-world application of NN-based methods. The support vector machine (SVM) approach is based on the structural risk minimisation principle, instead of the minimised empirical error principle that is commonly implemented in an NN. The SVM can avoid the local minima and over-fitting problems of an NN, and therefore potentially can achieve a higher level of global performance. This paper focuses on the least squares support vector machine (LS-SVM), which can solve highly nonlinear and noisy black-box modelling problems. This paper explores the application of the LS-SVM to aid the GPS/INS integrated system, especially during GPS outages. The paper describes the principles of the LS-SVM and of the KF hybrid method, and introduces the LS-SVM regression algorithm. Field
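
    Training an LS-SVM regressor reduces to solving a single linear system, which is the property exploited for aiding the KF during outages. The sketch below is a generic LS-SVM regression with an RBF kernel on toy data; the hyperparameters and the toy "position error versus time" example are assumptions, not the authors' setup.

      import numpy as np

      def rbf_kernel(X1, X2, sigma=1.0):
          d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
          return np.exp(-d2 / (2.0 * sigma ** 2))

      def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
          """LS-SVM regression: one linear system instead of a QP.

          [[0   1^T         ]  [b    ]   [0]
           [1   K + I/gamma ]] [alpha] = [y]
          """
          n = len(y)
          K = rbf_kernel(X, X, sigma)
          A = np.zeros((n + 1, n + 1))
          A[0, 1:] = 1.0
          A[1:, 0] = 1.0
          A[1:, 1:] = K + np.eye(n) / gamma
          rhs = np.concatenate(([0.0], y))
          sol = np.linalg.solve(A, rhs)
          return sol[0], sol[1:]                 # bias b, coefficients alpha

      def lssvm_predict(Xtr, b, alpha, Xte, sigma=1.0):
          return rbf_kernel(Xte, Xtr, sigma) @ alpha + b

      # Toy use: learn a growing INS position error as a function of outage time.
      t = np.linspace(0, 10, 50)[:, None]
      err = 0.05 * t[:, 0] ** 2 + 0.1 * np.random.default_rng(0).normal(size=50)
      b, alpha = lssvm_fit(t, err)
      print(lssvm_predict(t, b, alpha, np.array([[5.0]])))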

  14. LS1 Report: The cryogenic line goes through the scanner

    CERN Multimedia

    CERN Bulletin

    2013-01-01

    In spite of the complexity of LS1, with many different activities taking place in parallel and sometimes overlapping, the dashboard shows that work is progressing on schedule. This week, teams have started X-raying the cryogenic line to examine its condition in minute detail.   The LS1 schedule is pretty unfathomable for those who don't work in the tunnels or installations, but if you look down all the columns and stop at the line indicating today’s date, you can see that all of the priority and critical items are bang on time, like a Swiss watch. More specifically: the SMACC project in the LHC is on schedule, with a new testing phase for the interconnections which have already been consolidated; preparations are under way for the cable replacement campaign at Point 1 of the SPS (about 20% of the cables will not be replaced as they are completely unused); and the demineralised water distribution line is back in service, as are the electrical substations for the 400 and 66 kV line...

  15. DISEÑO Y EVALUACIÓN DE UN CLASIFICADOR DE TEXTURAS BASADO EN LS-SVM

    OpenAIRE

    Beitmantt Cárdenas Quintero; Nelson Enrique Vera Parra; Pablo Emilio Rozo García

    2013-01-01

    To evaluate the performance and computational cost of different Least Square Support Vector Machine (LS-SVM) architectures and methodologies for texture-based image segmentation and, from these results, to propose a model of an LS-SVM texture classifier. Methodology: For a binary classification problem represented by the segmentation of 32 images, organized into 4 groups and formed by pairs of typical textures (granite/bark, brick/upholstery, wood/marble, ...

  16. Mobility of the native Bacillus subtilis conjugative plasmid pLS20 is regulated by intercellular signaling.

    Science.gov (United States)

    Singh, Praveen K; Ramachandran, Gayetri; Ramos-Ruiz, Ricardo; Peiró-Pastor, Ramón; Abia, David; Wu, Ling J; Meijer, Wilfried J J

    2013-10-01

    Horizontal gene transfer mediated by plasmid conjugation plays a significant role in the evolution of bacterial species, as well as in the dissemination of antibiotic resistance and pathogenicity determinants. Characterization of their regulation is important for gaining insights into these features. Relatively little is known about how conjugation of Gram-positive plasmids is regulated. We have characterized conjugation of the native Bacillus subtilis plasmid pLS20. Contrary to the enterococcal plasmids, conjugation of pLS20 is not activated by recipient-produced pheromones but by pLS20-encoded proteins that regulate expression of the conjugation genes. We show that conjugation is kept in the default "OFF" state and identified the master repressor responsible for this. Activation of the conjugation genes requires relief of repression, which is mediated by an anti-repressor that belongs to the Rap family of proteins. Using both RNA sequencing methodology and genetic approaches, we have determined the regulatory effects of the repressor and anti-repressor on expression of the pLS20 genes. We also show that the activity of the anti-repressor is in turn regulated by an intercellular signaling peptide. Ultimately, this peptide dictates the timing of conjugation. The implications of this regulatory mechanism and comparison with other mobile systems are discussed.

  17. A genetic code alteration is a phenotype diversity generator in the human pathogen Candida albicans.

    Directory of Open Access Journals (Sweden)

    Isabel Miranda

    Full Text Available BACKGROUND: The discovery of genetic code alterations and expansions in both prokaryotes and eukaryotes abolished the hypothesis of a frozen and universal genetic code and exposed unanticipated flexibility in codon and amino acid assignments. It is now clear that codon identity alterations involve sense and non-sense codons and can occur in organisms with complex genomes and proteomes. However, the biological functions, the molecular mechanisms of evolution and the diversity of genetic code alterations remain largely unknown. In various species of the genus Candida, the leucine CUG codon is decoded as serine by a unique serine tRNA that contains a leucine 5'-CAG-3' anticodon (tRNA(Ser)CAG). We are using this codon identity redefinition as a model system to elucidate the evolution of genetic code alterations. METHODOLOGY/PRINCIPAL FINDINGS: We have reconstructed the early stages of the Candida genetic code alteration by engineering tRNAs that partially reverted the identity of serine CUG codons back to their standard leucine meaning. Such genetic code manipulation had profound cellular consequences as it exposed important morphological variation, altered gene expression, re-arranged the karyotype, increased cell-cell adhesion and secretion of hydrolytic enzymes. CONCLUSION/SIGNIFICANCE: Our study provides the first experimental evidence for an important role of genetic code alterations as generators of phenotypic diversity of high selective potential and supports the hypothesis that they speed up evolution of new phenotypes.
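
    The CUG reassignment can be illustrated with a toy translation table: the sketch below contrasts a few codons of the standard code with a Candida-like table in which CUG is read as serine (the codon list is deliberately minimal).

      # Toy illustration of a genetic code alteration: in the Candida-like table the
      # leucine codon CUG is read as serine.  Only a handful of codons are listed.
      STANDARD = {"AUG": "M", "CUG": "L", "CUU": "L", "UCU": "S", "GCU": "A", "UAA": "*"}
      CANDIDA_LIKE = dict(STANDARD, CUG="S")

      def translate(rna, table):
          protein = []
          for i in range(0, len(rna) - 2, 3):
              aa = table[rna[i:i+3]]
              if aa == "*":
                  break
              protein.append(aa)
          return "".join(protein)

      orf = "AUGCUGGCUUCUUAA"
      print(translate(orf, STANDARD))      # MLAS  (CUG -> Leu)
      print(translate(orf, CANDIDA_LIKE))  # MSAS  (CUG -> Ser)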

  18. LHC Experimental Beam Pipe Upgrade during LS1

    CERN Document Server

    Lanza, G; Baglin, V; Chiggiato, P

    2014-01-01

    The LHC experimental beam pipes are being improved during the ongoing Long Shutdown 1 (LS1). Several vacuum chambers have been tested and validated before their installation inside the detectors. The validation tests include: leak tightness, ultimate vacuum pressure, material outgassing rate, and residual gas composition. NEG coatings are assessed by sticking probability measurement with the help of Monte Carlo simulations. In this paper the motivation for the beam pipe upgrade, the validation tests of the components and the results are presented and discussed.

  19. Environmental codes of practice for steam electric power generation

    International Nuclear Information System (INIS)

    1985-03-01

    The Design Phase Code is one of a series of documents being developed for the steam electric power generation industry. This industry includes fossil-fuelled stations (gas, oil and coal-fired boilers) and nuclear-powered stations (CANDU heavy water reactors). In this document, environmental concerns associated with water-related and solid waste activities of steam electric plants are discussed. Design recommendations are presented that will minimize the detrimental environmental effects of once-through cooling water systems, of wastewaters discharged to surface waters and groundwaters, and of solid waste disposal sites. Recommendations are also presented for the design of water-related monitoring systems and programs. Cost estimates associated with the implementation of these recommendations are included. These technical guides for new or modified steam electric stations are the result of consultation with a federal-provincial-industry task force

  20. Status report on multigroup cross section generation code development for high-fidelity deterministic neutronics simulation system

    International Nuclear Information System (INIS)

    Yang, W.S.; Lee, C.H.

    2008-01-01

    Under the fast reactor simulation program launched in April 2007, development of an advanced multigroup cross section generation code was initiated in July 2007, in conjunction with the development of the high-fidelity deterministic neutron transport code UNIC. The general objectives are to simplify the existing multi-step schemes and to improve the resolved and unresolved resonance treatments. Based on the review results of current methods and the fact that they have been applied successfully to fast critical experiment analyses and fast reactor designs for the last three decades, the methodologies of the ETOE-2/MC²-2/SDX code system were selected as the starting set of methodologies for multigroup cross section generation for fast reactor analysis. As the first step for coupling with the UNIC code and use in a parallel computing environment, the MC²-2 code was updated by modernizing the memory structure and replacing old data management package subroutines and functions with FORTRAN 90 based routines. Various modifications were also made in the ETOE-2 and MC²-2 codes to process the ENDF/B-VII.0 data properly. Using the updated ETOE-2/MC²-2 code system, the ENDF/B-VII.0 data was successfully processed for major heavy and intermediate nuclides employed in sodium-cooled fast reactors. Initial verification tests of the MC²-2 libraries generated from ENDF/B-VII.0 data were performed by inter-comparison of twenty-one group infinite dilute total cross sections obtained from MC²-2, VIM, and NJOY. For almost all nuclides considered, MC²-2 cross sections agreed very well with those from VIM and NJOY. Preliminary validation tests of the ENDF/B-VII.0 libraries of MC²-2 were also performed using a set of sixteen fast critical benchmark problems. The deterministic results based on MC²-2/TWODANT calculations were in good agreement with MCNP solutions within ∼0.25% Δρ, except a few small LANL fast assemblies. Relative to the MCNP solution, the MC²-2/TWODANT
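
    The core operation behind any multigroup library generation is flux-weighted group collapsing, sigma_g = ∫ sigma(E) phi(E) dE / ∫ phi(E) dE over each group. The sketch below applies this to a made-up pointwise cross section with a 1/E weighting spectrum; it is only a schematic of the collapse step, not of the MC²-2 resonance treatment.

      import numpy as np

      def _trapz(y, x):
          """Trapezoidal integral of y(x) on an irregular grid."""
          return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

      def collapse(energy, sigma, flux, group_bounds):
          """Flux-weighted group collapse: sigma_g = int sigma*phi dE / int phi dE."""
          sig_g = []
          for lo, hi in zip(group_bounds[:-1], group_bounds[1:]):
              m = (energy >= lo) & (energy <= hi)
              sig_g.append(_trapz(sigma[m] * flux[m], energy[m]) /
                           _trapz(flux[m], energy[m]))
          return np.array(sig_g)

      E = np.logspace(0, 7, 5000)          # energy grid [eV]
      sigma = 10.0 / np.sqrt(E)            # made-up smooth cross section [barn]
      phi = 1.0 / E                        # 1/E weighting spectrum
      bounds = np.logspace(0, 7, 22)       # 21 groups, echoing the comparison above
      print(collapse(E, sigma, phi, bounds))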

  1. Status report on multigroup cross section generation code development for high-fidelity deterministic neutronics simulation system.

    Energy Technology Data Exchange (ETDEWEB)

    Yang, W. S.; Lee, C. H. (Nuclear Engineering Division)

    2008-05-16

    Under the fast reactor simulation program launched in April 2007, development of an advanced multigroup cross section generation code was initiated in July 2007, in conjunction with the development of the high-fidelity deterministic neutron transport code UNIC. The general objectives are to simplify the existing multi-step schemes and to improve the resolved and unresolved resonance treatments. Based on the review results of current methods and the fact that they have been applied successfully to fast critical experiment analyses and fast reactor designs for the last three decades, the methodologies of the ETOE-2/MC²-2/SDX code system were selected as the starting set of methodologies for multigroup cross section generation for fast reactor analysis. As the first step for coupling with the UNIC code and use in a parallel computing environment, the MC²-2 code was updated by modernizing the memory structure and replacing old data management package subroutines and functions with FORTRAN 90 based routines. Various modifications were also made in the ETOE-2 and MC²-2 codes to process the ENDF/B-VII.0 data properly. Using the updated ETOE-2/MC²-2 code system, the ENDF/B-VII.0 data was successfully processed for major heavy and intermediate nuclides employed in sodium-cooled fast reactors. Initial verification tests of the MC²-2 libraries generated from ENDF/B-VII.0 data were performed by inter-comparison of twenty-one group infinite dilute total cross sections obtained from MC²-2, VIM, and NJOY. For almost all nuclides considered, MC²-2 cross sections agreed very well with those from VIM and NJOY. Preliminary validation tests of the ENDF/B-VII.0 libraries of MC²-2 were also performed using a set of sixteen fast critical benchmark problems. The deterministic results based on MC²-2/TWODANT calculations were in good agreement with MCNP solutions within ~0.25% Δρ, except a few small LANL fast assemblies

  2. Investigation of an Alternative Fuel Form for the Liquid Salt Cooled Very High Temperature Reactor (LS-VHTR)

    International Nuclear Information System (INIS)

    Casino, William A. Jr.

    2006-01-01

    Many of the recent studies investigating the use of liquid salts as reactor coolants have utilized a core configuration of graphite prismatic fuel block assemblies with TRISO particles embedded into cylindrical fuel compacts arranged in a triangular pitch lattice. Although many calculations have been performed for this fuel form in gas cooled reactors, it would be instructive to investigate whether an alternative fuel form may yield improved performance for the liquid salt-cooled Very High Temperature Reactor (LS-VHTR). This study investigates how variations in the fuel form will impact the performance of the LS-VHTR during normal and accident conditions and compares the results with a similar analysis that was recently completed for an LS-VHTR core made up of prismatic block fuel. (author)

  3. Computing LS factor by runoff paths on TIN

    Science.gov (United States)

    Kavka, Petr; Krasa, Josef; Bek, Stanislav

    2013-04-01

    The article shows results of an enhancement of topographic factor (the LS factor in USLE) derivation, focused on detailed Airborne Laser Scanning (ALS) based DEMs. It describes a flow-path generation technique using a triangulated irregular network (TIN) for terrain morphology description, which is not yet established in soil loss computations. This technique was compared with other procedures of flow direction and flow path generation based on the commonly used raster model (DEM). These overland flow characteristics, together with the flow accumulation derived from them, are significant inputs for many scientific models. In particular, they are used in all USLE-based soil erosion models, of which USLE2D, RUSLE3D, Watem/Sedem and USPED can be named as the most acknowledged. Flow routing characteristics are also essential parameters in physically based hydrological and soil erosion models like HEC-HMS, Wepp, Erosion3D, LISEM, SMODERP, etc. The mentioned models are based on regular raster grids, where the identification of runoff direction is problematic. The most common method is steepest descent (one-directional flow), which corresponds well with the concentration of surface runoff into concentrated flow. The steepest descent algorithm for flow routing does not provide satisfying results: it often creates parallel and narrow flow lines while not respecting real morphological conditions. To overcome this problem, other methods (such as Flux Decomposition, Multiple Flow, the Deterministic Infinity algorithm, etc.) separate the outflow into several components. This approach leads to unrealistic diffusion propagation of the runoff and makes it unsuitable for simulating dominant morphological features, such as artificial rills, hedges, sediment traps, etc. Modern methods of mapping ground elevations, especially ALS, provide very detailed models even for large river basins, including morphological details. New algorithms for the derivation of runoff direction have been developed as
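
    For comparison with the TIN approach, the raster single-flow-direction (steepest descent, D8) routing discussed above can be sketched in a few lines; the toy DEM and unit cell size are arbitrary.

      import numpy as np

      def steepest_descent_d8(dem, cell=1.0):
          """Single flow direction (D8): each cell drains to its lowest neighbour.

          Returns an array of (dy, dx) offsets, or (0, 0) for pits and flat cells.
          """
          rows, cols = dem.shape
          out = np.zeros((rows, cols, 2), dtype=int)
          diag = np.sqrt(2.0) * cell
          for r in range(rows):
              for c in range(cols):
                  best_slope, best = 0.0, (0, 0)
                  for dy in (-1, 0, 1):
                      for dx in (-1, 0, 1):
                          if dy == 0 and dx == 0:
                              continue
                          rr, cc = r + dy, c + dx
                          if 0 <= rr < rows and 0 <= cc < cols:
                              dist = diag if dy and dx else cell
                              slope = (dem[r, c] - dem[rr, cc]) / dist
                              if slope > best_slope:
                                  best_slope, best = slope, (dy, dx)
                  out[r, c] = best
          return out

      dem = np.array([[5., 4., 3.],
                      [4., 3., 2.],
                      [3., 2., 1.]])
      print(steepest_descent_d8(dem)[0, 0])   # drains towards the lowest corner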

  4. LHC-GCS a model-driven approach for automatic PLC and SCADA code generation

    CERN Document Server

    Thomas, Geraldine; Barillère, Renaud; Cabaret, Sebastien; Kulman, Nikolay; Pons, Xavier; Rochez, Jacques

    2005-01-01

    The LHC experiments’ Gas Control System (LHC GCS) project [1] aims to provide the four LHC experiments (ALICE, ATLAS, CMS and LHCb) with control for their 23 gas systems. To ease the production and maintenance of 23 control systems, a model-driven approach has been adopted to automatically generate the code for the Programmable Logic Controllers (PLCs) and for the Supervision Control And Data Acquisition (SCADA) systems. The first milestones of the project have been achieved. The LHC GCS framework [4] and the generation tools have been produced. A first control application has been generated and is in production, and a second is in preparation. This paper describes the principle and the architecture of the model-driven solution. It details in particular how the model-driven solution fits with the LHC GCS framework and with the UNICOS [5] data-driven tools.
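
    The model-driven idea of generating control code from a declarative description can be caricatured as follows; the object layout and the generated structured-text skeleton are purely illustrative and unrelated to the actual UNICOS/LHC GCS templates.

      # Toy model-driven generator: a declarative description of a gas-system object
      # is turned into a PLC structured-text skeleton.  Entirely illustrative; the
      # real LHC GCS uses UNICOS templates and device types, not this layout.
      model = {
          "name": "MixerModule",
          "analog_inputs": ["PT_501", "FT_502"],       # pressure, flow transmitters
          "digital_outputs": ["VALVE_503"],            # on/off valve
      }

      def generate_st(obj):
          lines = [f"FUNCTION_BLOCK FB_{obj['name']}"]
          lines.append("  VAR_INPUT")
          lines += [f"    {sig} : REAL;" for sig in obj["analog_inputs"]]
          lines.append("  END_VAR")
          lines.append("  VAR_OUTPUT")
          lines += [f"    {sig} : BOOL;" for sig in obj["digital_outputs"]]
          lines.append("  END_VAR")
          lines.append("  (* control logic inserted here by hand or by a template *)")
          lines.append("END_FUNCTION_BLOCK")
          return "\n".join(lines)

      print(generate_st(model))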

  5. Signal Simulation and Experimental Research on Acoustic Emission using LS-DYNA

    Directory of Open Access Journals (Sweden)

    Zhang Jianchao

    2015-09-01

    Full Text Available To calculate sound wave velocity, we simulated the Hsu-Nielsen lead break experiment using the ANSYS/LS-DYNA finite element software. First, we identified the key problems in the finite element analysis, such as selecting the excitation force, choosing the mesh density, and setting the calculation steps. Second, we established the finite element model of sound wave transmission in a plate under the lead break simulation. The results revealed the transmission characteristics of the sound wave and allowed the transmission velocities of the longitudinal and transverse waves to be simulated and calculated from the time-travel curves of the vibration velocity at various nodes. Finally, the Hsu-Nielsen lead break experiment was implemented. The results of the theoretical calculation and simulation analysis were consistent with the experimental results, thus demonstrating that the research method of using the ANSYS/LS-DYNA software to simulate sound wave transmission in acoustic emission experiments is feasible and effective.

  6. Generating multi-photon W-like states for perfect quantum teleportation and superdense coding

    Science.gov (United States)

    Li, Ke; Kong, Fan-Zhen; Yang, Ming; Ozaydin, Fatih; Yang, Qing; Cao, Zhuo-Liang

    2016-08-01

    An interesting aspect of multipartite entanglement is that for perfect teleportation and superdense coding, not the maximally entangled W states but a special class of non-maximally entangled W-like states are required. Therefore, efficient preparation of such W-like states is of great importance in quantum communications, which has not been studied as much as the preparation of W states. In this paper, we propose a simple optical scheme for efficient preparation of large-scale polarization-based entangled W-like states by fusing two W-like states or expanding a W-like state with an ancilla photon. Our scheme can also generate large-scale W states by fusing or expanding W or even W-like states. The cost analysis shows that in generating large-scale W states, the fusion mechanism achieves a higher efficiency with non-maximally entangled W-like states than with maximally entangled W states. Our scheme can also start fusion or expansion with Bell states, and it is composed of a polarization-dependent beam splitter, two polarizing beam splitters and photon detectors. Requiring no ancilla photon or controlled gate to operate, our scheme can be realized with current photonics technology, and we believe it can enable advances in quantum teleportation and superdense coding in multipartite settings.

  7. Impact of Distributed Generation Grid Code Requirements on Islanding Detection in LV Networks

    Directory of Open Access Journals (Sweden)

    Fabio Bignucolo

    2017-01-01

    Full Text Available The recent growing diffusion of dispersed generation in low voltage (LV) distribution networks is entailing new rules to make local generators participate in network stability. Consequently, national and international grid codes, which define the connection rules for stability and safety of electrical power systems, have been updated, requiring distributed generators and electrical storage systems to supply stabilizing contributions. In this scenario, specific attention to the uncontrolled islanding issue has to be addressed, since the currently required anti-islanding protection systems, based on relays locally measuring voltage and frequency, could no longer be suitable. In this paper, the effects of different LV generators' stabilizing functions on the interface protection performance are analysed. The study takes into account existing requirements, such as the generators' active power regulation (according to the measured frequency) and reactive power regulation (depending on the local measured voltage). In addition, the paper focuses on other stabilizing features under discussion, derived from the medium voltage (MV) distribution network grid codes or proposed in the literature, such as fast voltage support (FVS) and inertia emulation. The stabilizing functions have been reproduced in the DIgSILENT PowerFactory 2016 software environment, making use of its native programming language. They are then tested both alone and together, aiming at a comprehensive analysis of their impact on the anti-islanding protection effectiveness. Through dynamic simulations in several network scenarios, the paper demonstrates the detrimental impact that such stabilizing regulations may have on loss-of-mains protection effectiveness, leading to an increased risk of unintentional islanding.
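
    One of the stabilizing functions mentioned above, active power regulation as a function of the measured frequency, is essentially a droop curve. The sketch below uses a threshold and droop value inspired by common European LV grid codes purely as illustrative assumptions, not as the values prescribed by any specific standard.

      def active_power_vs_frequency(f_hz, p_avail, f_start=50.2, droop=0.4):
          """Illustrative P(f) curve: above `f_start` the generator sheds active power
          linearly; `droop` is the fraction of available power shed per Hz of
          over-frequency.  Thresholds are assumptions, not a specific grid code."""
          if f_hz <= f_start:
              return p_avail
          reduction = droop * (f_hz - f_start) * p_avail
          return max(p_avail - reduction, 0.0)

      for f in (50.0, 50.3, 50.7, 51.5):
          print(f, active_power_vs_frequency(f, p_avail=10.0))  # frequency [Hz], power [kW]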

  8. A program code generator for multiphysics biological simulation using markup languages.

    Science.gov (United States)

    Amano, Akira; Kawabata, Masanari; Yamashita, Yoshiharu; Rusty Punzalan, Florencio; Shimayoshi, Takao; Kuwabara, Hiroaki; Kunieda, Yoshitoshi

    2012-01-01

    To cope with the complexity of biological function simulation models, model representation with a description language is becoming popular. However, the simulation software itself becomes complex in these environments, and thus it is difficult to modify the simulation conditions, target computation resources or calculation methods. In complex biological function simulation software there are 1) model equations, 2) boundary conditions and 3) calculation schemes. Use of a model description file is useful for the first point and partly for the second, but the third is difficult to handle, because a variety of calculation schemes is required for simulation models constructed from two or more elementary models. We introduce a simulation software generation system which uses a description-language-based specification of the coupling calculation scheme together with the cell model description file. Using this software, we can easily generate biological simulation code with a variety of coupling calculation schemes. To show the efficiency of our system, an example of a coupling calculation scheme with three elementary models is shown.
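
    A toy version of such a generator, turning a markup-style model description plus a named calculation scheme into runnable solver source, might look as follows; the dictionary format and scheme name are invented for illustration and do not reflect the authors' system.

      # Toy generator: a model description (equations as strings) plus a chosen
      # calculation scheme produce Python solver source.  Format and scheme names
      # are invented; real systems use CellML/SBML-like markup, not this dict.
      model = {
          "name": "decay",
          "states": {"x": 1.0},
          "equations": {"x": "-0.5 * x"},     # dx/dt
      }

      def generate_solver(m, scheme="explicit_euler"):
          assert scheme == "explicit_euler", "only one scheme in this sketch"
          body = "\n".join(f"    new_{s} = {s} + dt * ({rhs})"
                           for s, rhs in m["equations"].items())
          returns = ", ".join(f"new_{s}" for s in m["equations"])
          args = ", ".join(m["states"])
          return f"def step_{m['name']}({args}, dt):\n{body}\n    return {returns}\n"

      src = generate_solver(model)
      namespace = {}
      exec(src, namespace)                     # compile the generated solver code
      x = model["states"]["x"]
      for _ in range(10):
          x = namespace["step_decay"](x, dt=0.1)
      print(src)
      print(x)                                 # state decays towards zero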

  9. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  10. Whole-genome sequencing of Bacillus velezensis LS69, a strain with a broad inhibitory spectrum against pathogenic bacteria.

    Science.gov (United States)

    Liu, Guoqiang; Kong, Yingying; Fan, Yajing; Geng, Ce; Peng, Donghai; Sun, Ming

    2017-05-10

    Bacillus velezensis LS69 was found to exhibit antagonistic activity against a diverse spectrum of pathogenic bacteria. It has one circular chromosome of 3,917,761 bp with 3,643 open reading frames. Genome analysis identified ten gene clusters involved in the nonribosomal synthesis of polyketides (macrolactin, bacillaene and difficidin), lipopeptides (surfactin, fengycin, bacilysin and iturin A) and bacteriocins (amylolysin and amylocyclicin). In addition, B. velezensis LS69 was found to contain a series of genes involved in enhancing plant growth and triggering plant immunity. Whole-genome sequencing of Bacillus velezensis LS69 will provide a basis for elucidation of its biocontrol mechanisms and facilitate its applications in the future. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. The Ne3LS Network, Quebec's initiative to evaluate the impact and promote a responsible and sustainable development of nanotechnology

    International Nuclear Information System (INIS)

    Endo, Charles-Anica; Emond, Claude; Battista, Renaldo; Parizeau, Marie-Helene; Beaudry, Catherine

    2011-01-01

    The spectacular progress made by nanosciences and nanotechnologies elicits as much hope as fear. Consequently, a great number of research and training initiatives on the ethical, environmental, economic, legal and social issues regarding nanotechnology development (Ne3LS) are emerging worldwide. In Quebec, Canada, a Task Force was mandated by NanoQuebec to conceive an Ne3LS research and training strategy to assess those issues. This Task Force brought together experts from universities, governments or industry working in nanosciences and nanotechnologies or in Ne3LS. Their resulting action plan, made public in November 2006, contained several recommendations, including the creation of a knowledge network (the Ne3LS Network). In the following years, after consulting with numerous key players concerned with the possible impacts of nanosciences and nanotechnologies in Quebec, the Ne3LS Network was launched in January 2010 in partnership with the Fonds quebecois de la recherche sur la nature et les technologies, the Fonds quebecois de la recherche sur la societe et la culture and the Fonds de la recherche en sante du Quebec, NanoQuebec, the Institut de recherche Robert-Sauve en sante et en securite du travail as well as the University of Montreal. Its objectives are to 1) Foster the development of Ne3LS research activities (grants and fellowships); 2) Spearhead the Canadian and international Ne3LS network; 3) Take part in the training of researchers and experts; 4) Encourage the creation of interactive tools for the general public; 5) Facilitate collaboration between decision-makers and experts; 6) Involve the scientific community through a host of activities (symposium, conferences, thematic events); 7) Build multidisciplinary research teams to evaluate the impact of nanotechnology.

  12. A New European Slope Length and Steepness Factor (LS-Factor for Modeling Soil Erosion by Water

    Directory of Open Access Journals (Sweden)

    Panos Panagos

    2015-04-01

    Full Text Available The Universal Soil Loss Equation (USLE) model is the most frequently used model for soil erosion risk estimation. Among the six input layers, the combined slope length and slope angle (LS-factor) has the greatest influence on soil loss at the European scale. The S-factor measures the effect of slope steepness, and the L-factor defines the impact of slope length. The combined LS-factor describes the effect of topography on soil erosion. The European Soil Data Centre (ESDAC) developed a new pan-European high-resolution soil erosion assessment to achieve a better understanding of the spatial and temporal patterns of soil erosion in Europe. The LS calculation was performed using the original equation proposed by Desmet and Govers (1996) and implemented using the System for Automated Geoscientific Analyses (SAGA), which incorporates a multiple flow algorithm and contributes to a precise estimation of flow accumulation. The LS-factor dataset was calculated using a high-resolution (25 m) Digital Elevation Model (DEM) for the whole European Union, resulting in an improved delineation of areas at risk of soil erosion as compared to lower-resolution datasets. This combined approach of using GIS software tools with high-resolution DEMs has been successfully applied in regional assessments in the past, and is now being applied for the first time at the European scale.
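
    As a hedged illustration of the per-cell calculation (following the general form of the Desmet and Govers approach with the aspect term set to 1, and RUSLE-style S-factor constants that should be checked against the original papers), the LS-factor of a single grid cell can be sketched as:

      import numpy as np

      def ls_factor(flow_acc, slope_deg, cell=25.0):
          """Per-cell LS-factor in the spirit of Desmet and Govers (1996).

          `flow_acc` is the upslope contributing area in number of cells and
          `slope_deg` the local slope; the aspect correction term is omitted
          (set to 1) and the constants follow common RUSLE implementations,
          given here only as illustrative assumptions.
          """
          theta = np.radians(slope_deg)
          # slope-length exponent m from the rill/interrill ratio beta
          beta = (np.sin(theta) / 0.0896) / (3.0 * np.sin(theta) ** 0.8 + 0.56)
          m = beta / (1.0 + beta)
          area_in = flow_acc * cell * cell                 # upslope area [m^2]
          L = ((area_in + cell * cell) ** (m + 1) - area_in ** (m + 1)) / \
              (cell ** (m + 2) * 22.13 ** m)
          # RUSLE-style S-factor with two slope regimes (threshold at 9% slope)
          S = np.where(np.tan(theta) < 0.09,
                       10.8 * np.sin(theta) + 0.03,
                       16.8 * np.sin(theta) - 0.50)
          return L * S

      print(ls_factor(flow_acc=40, slope_deg=8.0))   # a fairly long, gentle slope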

  13. Hermitian self-dual quasi-abelian codes

    Directory of Open Access Journals (Sweden)

    Herbert S. Palines

    2017-12-01

    Full Text Available Quasi-abelian codes constitute an important class of linear codes containing theoretically and practically interesting codes such as quasi-cyclic codes, abelian codes, and cyclic codes. In particular, the sub-class consisting of 1-generator quasi-abelian codes contains large families of good codes. Based on the well-known decomposition of quasi-abelian codes, the characterization and enumeration of Hermitian self-dual quasi-abelian codes are given. In the case of 1-generator quasi-abelian codes, we offer necessary and sufficient conditions for such codes to be Hermitian self-dual and give a formula for the number of these codes. In the case where the underlying groups are some $p$-groups, the actual number of resulting Hermitian self-dual quasi-abelian codes is determined.

  14. Simple Detection of Large InDeLS by DHPLC: The ACE Gene as a Model

    Directory of Open Access Journals (Sweden)

    Renata Guedes Koyama

    2008-01-01

    Full Text Available Insertion-deletion polymorphism (InDeL) is the second most frequent type of genetic variation in the human genome. For the detection of large InDeLs, researchers usually resort to either PCR gel analysis or RFLP, but these are time consuming and dependent on human interpretation. Therefore, a more efficient method for genotyping this kind of genetic variation is needed. In this report, we describe a method that can detect large InDeLs by DHPLC (denaturing high-performance liquid chromatography), using the angiotensin-converting enzyme (ACE) gene I/D polymorphism as a model. The InDeL targeted in this study is characterized by a 288 bp Alu element insertion (I). We used DHPLC under non-denaturing conditions to analyze the PCR product, with a flow through the chromatographic column under two different gradients based on the differences between the D and I sequences. The analysis described is quick and easy, making this technique a suitable and efficient means for DHPLC users to screen InDeLs in genetic epidemiological studies.

  15. Xerostomia Quality of Life Scale (XeQoLS) questionnaire: validation of Italian version in head and neck cancer patients.

    Science.gov (United States)

    Lastrucci, Luciana; Bertocci, Silvia; Bini, Vittorio; Borghesi, Simona; De Majo, Roberta; Rampini, Andrea; Gennari, Pietro Giovanni; Pernici, Paola

    2018-01-01

    To translate the Xerostomia Quality-of-Life Scale (XeQoLS) into the Italian language (XeQoLS-IT). Xerostomia is the most relevant acute and late toxicity in patients with head and neck cancer treated with radiotherapy (RT). Patient-reported outcome (PRO) instruments are subjective reports of the patient's perception of health status. The XeQoLS consists of 15 items and measures the impact of salivary gland dysfunction and xerostomia on the four major domains of oral health-related QoL. The XeQoLS-IT was created through a multi-step linguistic validation process: forward translation (TF), backward translation (TB) and administration of the questionnaire to 35 Italian patients with head and neck cancer. Translation was independently carried out by two radiation oncologists who were Italian native speakers. The two versions were compared and adapted to obtain a reconciled version, version 1 (V1). V1 was translated back into English by an Italian professional skilled in teaching English. After review of discrepancies and choice of the most appropriate wording for clarity and similarity to the original, version 2 (V2) was reached by consensus. To evaluate version 2, patients completed the XeQoLS-IT questionnaire and also underwent cognitive debriefing. The questionnaire was considered simple by the patients. The clarity of the instructions and the ease of answering the questions had a mean value of 4.5 (± 0.71) on a scale from 1 to 5. A valid multi-step process led to the creation of the final version of the XeQoLS-IT, a suitable instrument for assessing the perception of xerostomia in patients treated with RT.

  16. The Control System of CERN Accelerators Vacuum (LS1 Activities and New Developments)

    CERN Document Server

    Gomes, P; Bellorini, F; Blanchard, S; Boivin, J P; Gama, J; Girardot, G; Pigny, G; Rio, B; Vestergard, H; Kopylov, L; Merker, S; Mikheev, M

    2014-01-01

    After 3 years of operation, the LHC entered its first Long Shutdown period (LS1) in February 2013 [1]. Major consolidation and maintenance works are being performed across the whole of CERN’s accelerator chain, in order to prepare the LHC to restart at higher energy in 2015. The injector chain shall resume earlier, in mid-2014. We report on the on-going vacuum-controls projects. Some of them concern the renovation of the controls of certain machines; others are associated with the consolidation of the vacuum systems of the LHC and its injectors; and a few are completely new installations. Due to the wide age-span of the existing vacuum installations, there is a mix of design philosophies and of control-equipment generations. The renovations and the novel projects offer an opportunity to improve the uniformity and efficiency of vacuum controls by: reducing the number of equipment versions with similar functionality; identifying, naming, labelling, and documenting all pieces of equipment; homogenizing the contr...

  17. UFOs in the LHC after LS1

    International Nuclear Information System (INIS)

    Baer, T.; Barnes, M.J.; Carlier, E.; Cerutti, F.; Dehning, B.; Ducimetiere, L.; Ferrari, A.; Garrel, N.; Gerardin, A.; Goddard, B.; Holzer, E.B.; Jackson, S.; Jimenez, J.M.; Kain, V.; Lechner, A.; Mertens, V.; Misiowiec, M.; Moron Ballester, R.; Nebot del Busto, E.; Norderhaug Drosdal, L.; Nordt, A.; Uythoven, J.; Velghe, B.; Vlachoudis, V.; Wenninger, J.; Zamantzas, C.; Zimmermann, F.; Fuster Martinez, N.

    2012-01-01

    UFOs (Unidentified Falling Objects) are potentially a major luminosity limitation for nominal LHC operation. With large-scale increases of the BLM thresholds, their impact on LHC availability was mitigated in the second half of 2011. For higher beam energy and lower magnet quench limits, the problem is expected to be considerably worse, though. Therefore, in 2011, the diagnostics for UFO events were significantly improved, dedicated experiments and measurements in the LHC and in the laboratory were made and complemented by FLUKA simulations and theoretical studies. In this paper, the state of knowledge is summarized and extrapolations for LHC operation after LS1 are presented. Mitigation strategies are proposed and related tests and measures for 2012 are specified. (authors)

  18. UFOs in the LHC after LS1

    CERN Document Server

    Baer, T; Carlier, E; Cerutti, F; Dehning, B; Ducimetière, L; Ferrari, A; Garrel, N; Gérardin, A; Goddard, B; Holzer, E B; Jackson, S; Jimenez, J M; Kain, V; Lechner, A; Mertens, V; Misiowiec, M; Morón Ballester, R; Nebot del Busto, E; Norderhaug Drosdal, L; Nordt, A; Uythoven, J; Velghe, B; Vlachoudis, V; Wenninger, J; Zamantzas, C; Zimmermann, F; Fuster Martinez, N

    2012-01-01

    UFOs (Unidentified Falling Objects) are potentially a major luminosity limitation for nominal LHC operation. With large-scale increases of the BLM thresholds, their impact on LHC availability was mitigated in the second half of 2011. For higher beam energy and lower magnet quench limits, the problem is expected to be considerably worse, though. Therefore, in 2011, the diagnostics for UFO events were significantly improved, dedicated experiments and measurements in the LHC and in the laboratory were made and complemented by FLUKA simulations and theoretical studies. In this paper, the state of knowledge is summarized and extrapolations for LHC operation after LS1 are presented. Mitigation strategies are proposed and related tests and measures for 2012 are specified.

  19. ABCB1 (P-glycoprotein) reduces bacterial attachment to human gastrointestinal LS174T epithelial cells.

    Science.gov (United States)

    Crowe, Andrew; Bebawy, Mary

    2012-08-15

    The aim of this project was to show that elevated P-glycoprotein (P-gp) expression decreases bacterial association with LS174T human gastrointestinal cells, and that this effect can be reversed upon blocking functional P-gp efflux. Staphylococcus aureus, Klebsiella pneumoniae, Pseudomonas aeruginosa, Lactobacillus acidophilus and numerous strains of Escherichia coli, from commensal to enteropathogenic and enterohaemorrhagic strains (O157:H7), were fluorescently labelled and incubated on LS174T cultures either with or without P-gp amplification using rifampicin. PSC-833 was used as a potent functional P-gp blocking agent. Staphylococcus and Pseudomonas displayed the greatest association with the LS174T cells. Surprisingly, lactobacilli retained more fluorescence than enteropathogenic E. coli in this system. Irrespective of attachment differences between the bacterial species, the increase in P-gp protein expression decreased bacterial fluorescence by 25-30%. This included the GFP-labelled E. coli and enterohaemorrhagic E. coli (O157:H7). Blocking P-gp function through the co-administration of PSC-833 increased the amount of bacteria associated with P-gp-expressing LS174T cells back to control levels. As most bacteria were affected to the same degree, irrespective of pathogenicity, it is unlikely that P-gp has a direct influence on the adhesion of bacteria; instead, P-gp may be playing an indirect role by secreting a bank of endogenous factors or changing the local environment to one less suited to bacterial growth in general. Crown Copyright © 2012. Published by Elsevier B.V. All rights reserved.

  20. LS1 “First Long Shutdown of LHC and its Injector Chains”

    CERN Multimedia

    Foraz, K; Barberan, M; Bernardini, M; Coupard, J; Gilbert, N; Hay, D; Mataguez, S; McFarlane, D

    2014-01-01

    The LHC and its Injectors were stopped in February 2013, in order to maintain, consolidate and upgrade the different equipment of the accelerator chain, with the goal of achieving LHC operation at the design energy of 14 TeV in the centre-of-mass. Prior to the start of this First Long Shutdown (LS1), a major effort of preparation was performed in order to optimize the schedule and the use of resources across the different machines, with the aim of resuming LHC physics in early 2015. The rest of the CERN complex will restart beam operation in the second half of 2014. This paper presents the schedule of the LS1, describes the organizational set-up for the coordination of the works, the main activities, the different main milestones, which have been achieved so far, and the decisions taken in order to mitigate the issues encountered.

  1. Evaluation of four-dimensional nonbinary LDPC-coded modulation for next-generation long-haul optical transport networks.

    Science.gov (United States)

    Zhang, Yequn; Arabaci, Murat; Djordjevic, Ivan B

    2012-04-09

    Leveraging the advanced coherent optical communication technologies, this paper explores the feasibility of using four-dimensional (4D) nonbinary LDPC-coded modulation (4D-NB-LDPC-CM) schemes for long-haul transmission in future optical transport networks. In contrast to our previous works on 4D-NB-LDPC-CM which considered amplified spontaneous emission (ASE) noise as the dominant impairment, this paper undertakes transmission in a more realistic optical fiber transmission environment, taking into account impairments due to dispersion effects, nonlinear phase noise, Kerr nonlinearities, and stimulated Raman scattering in addition to ASE noise. We first reveal the advantages of using 4D modulation formats in LDPC-coded modulation instead of conventional two-dimensional (2D) modulation formats used with polarization-division multiplexing (PDM). Then we demonstrate that 4D LDPC-coded modulation schemes with nonbinary LDPC component codes significantly outperform not only their conventional PDM-2D counterparts but also the corresponding 4D bit-interleaved LDPC-coded modulation (4D-BI-LDPC-CM) schemes, which employ binary LDPC codes as component codes. We also show that the transmission reach improvement offered by the 4D-NB-LDPC-CM over 4D-BI-LDPC-CM increases as the underlying constellation size and hence the spectral efficiency of transmission increases. Our results suggest that 4D-NB-LDPC-CM can be an excellent candidate for long-haul transmission in next-generation optical networks.
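
    A minimal sketch of the four-dimensional modulation idea referred to above: the in-phase/quadrature components of the two polarizations are treated as one 4D symbol rather than two independent 2D (PDM) symbols. The QPSK building block and the GF(2^4) remark are illustrative assumptions; the paper's actual constellations and LDPC mapping are not reproduced here.

        # Sketch: build a 4D constellation from the two polarization components.
        import itertools
        import numpy as np

        qpsk = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]) / np.sqrt(2)

        # Each 4D symbol = (x-polarization point, y-polarization point),
        # i.e. a point in R^4 = (Re_x, Im_x, Re_y, Im_y).
        constellation_4d = np.array([
            [px.real, px.imag, py.real, py.imag]
            for px, py in itertools.product(qpsk, repeat=2)
        ])

        bits_per_symbol = int(np.log2(len(constellation_4d)))
        print("4D points:", len(constellation_4d), "bits per 4D symbol:", bits_per_symbol)
        # A nonbinary LDPC code over GF(16) could then address one 4D symbol per
        # code symbol, which is the kind of coupling a 4D-NB-LDPC-CM scheme exploits.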

  2. Theoretical Atomic Physics code development II: ACE: Another collisional excitation code

    International Nuclear Information System (INIS)

    Clark, R.E.H.; Abdallah, J. Jr.; Csanak, G.; Mann, J.B.; Cowan, R.D.

    1988-12-01

    A new computer code for calculating collisional excitation data (collision strengths or cross sections) using a variety of models is described. The code uses data generated by the Cowan Atomic Structure code or CATS for the atomic structure. Collisional data are placed on a random access file and can be displayed in a variety of formats using the Theoretical Atomic Physics Code or TAPS. All of these codes are part of the Theoretical Atomic Physics code development effort at Los Alamos. 15 refs., 10 figs., 1 tab

  3. LS1 to LHC Report: LHC key handed back to Operations

    CERN Multimedia

    CERN Bulletin

    2015-01-01

    This week, after 23 months of hard work involving about 1000 people every day, the key to the LHC was symbolically handed back to the Operations team. The first long shutdown is over and the machine is getting ready for a restart that will bring its beam to full energy in early spring.   Katy Foraz, LS1 activities coordinator, symbolically hands the LHC key to the operations team, represented, left to right, by Jorg Wenninger, Mike Lamont and Mirko Pojer. All the departments, all the machines and all the experimental areas were involved in the first long shutdown of the LHC that began in February 2013. Over the last two years, the Bulletin has closely followed  all the work and achievements that had been carefully included in the complex general schedule drawn up and managed by the team led by Katy Foraz from the Engineering Department. “The work on the schedule began two years before the start of LS1 and one of the first things we realised was that there was no commercial...

  4. Joint source-channel coding using variable length codes

    NARCIS (Netherlands)

    Balakirsky, V.B.

    2001-01-01

    We address the problem of joint source-channel coding when variable-length codes are used for information transmission over a discrete memoryless channel. Data transmitted over the channel are interpreted as pairs (m_k, t_k), where m_k is a message generated by the source and t_k is a time instant

  5. Range and railgun development results at LS and PA ''Soyuz''

    International Nuclear Information System (INIS)

    Babakov, Y.P.; Plekhanov, A.V.; Zheleznyi, V.B.

    1995-01-01

    A rail electromagnetic accelerator is one of the most reliable and simple devices for accelerating macroparticles up to high velocity. These accelerators allow scientists to carry out fundamental and applied investigations to study both the equation of state of materials at high pressure due to high-velocity encounters and the creation of conditions for shock thermonuclear fusion. A range was created in the ''Energophyzika'' department of LS and PA ''Soyuz''. It was equipped with an inductor with storage capacity up to 12.5 MJ energized by a solid propellant MHD generator and a capacitor bank (energy capacity up to 6 MJ). These systems deliver currents of 1 MA and 2 MA, respectively. Diagnostic, recording, and autocalculation systems allow use of as many as 120 data channels with acquisition frequency up to 10 MHz. Recent technical successes in railgun construction, using special methods to compact the plasma armature and produce a high-velocity trailing contact, the creation of hybrid armatures, and the optimization of acceleration have made it possible to attain velocities in the range of 6.2 to 6.8 km/s (masses of 3.8 to 10 g) and velocities of 2.7 to 3.8 km/s (masses of 50 to 100 g) on railguns with lengths of 2 to 4 m

  6. Improved numerical grid generation techniques for the B2 edge plasma code

    International Nuclear Information System (INIS)

    Stotler, D.P.; Coster, D.P.

    1992-06-01

    Techniques used to generate grids for edge fluid codes such as B2 from numerically computed equilibria are discussed. Fully orthogonal, numerically derived grids closely resembling analytically prescribed meshes can be obtained. But, the details of the poloidal field can vary, yielding significantly different plasma parameters in the simulations. The magnitude of these differences is consistent with the predictions of an analytic model of the scrape-off layer. Both numerical and analytic grids are insensitive to changes in their defining parameters. Methods for implementing nonorthogonal boundaries in these meshes are also presented; they differ slightly from those required for fully orthogonal grids

  7. Extension of a GIS procedure for calculating the RUSLE equation LS factor

    NARCIS (Netherlands)

    Zhang, H.; Yang, Q.; Li, R.; Liu, Q.; Moore, D.; He, P.; Ritsema, C.J.; Geissen, V.

    2013-01-01

    The Universal Soil Loss Equation (USLE) and revised USLE (RUSLE) are often used to estimate soil erosion at regional landscape scales, however a major limitation is the difficulty in extracting the LS factor. The geographic information system-based (GIS-based) methods which have been developed for
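
    As a point of reference for the LS-factor extraction discussed above, the sketch below uses one widely cited grid-based approximation (the Moore and Burch form), not necessarily the GIS procedure extended in this paper; the rasters and cell size are synthetic.

        # Sketch: LS ≈ (A_s / 22.13)**0.4 * (sin(beta) / 0.0896)**1.3,
        # where A_s is flow accumulation x cell size and beta is the slope angle.
        import numpy as np

        cell_size = 30.0                                   # m, assumed DEM resolution
        flow_accumulation = np.array([[1, 2, 4],           # cells draining into each cell
                                      [2, 6, 12],
                                      [3, 10, 25]], dtype=float)
        slope_deg = np.array([[2.0, 3.0, 5.0],
                              [4.0, 6.0, 8.0],
                              [5.0, 9.0, 12.0]])

        specific_area = flow_accumulation * cell_size      # A_s, contributing area per unit width
        beta = np.radians(slope_deg)
        ls = (specific_area / 22.13) ** 0.4 * (np.sin(beta) / 0.0896) ** 1.3
        print(np.round(ls, 2))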

  8. The successful completion of LS1, consolidation and preparations for the future

    CERN Multimedia

    Antonella Del Rosso

    2014-01-01

    For CERN’s Technology (TE) Department, success in LS1 is more important than finishing. In other words, the aim is to reach the finish line having maintained the highest standards of safety, quality and performance. Other challenges need to be faced too, before, during and after LS1, and the Department always approaches them with optimism. The new Department Head tells us how his 750 colleagues work to keep the Laboratory at the cutting edge of high-energy physics technology.   José Miguel Jiménez. “We can only grow once we’ve stabilised our base.” The message presented by José Miguel Jiménez, who stepped into the role of TE Department Head in January 2014, is clear, as are his priorities going forward: “The Technology Department needs to be consolidated in terms of both its personnel and its assembly and test infrastructures, some of which are unique.” The TE Department is tasked with pr...

  9. ASTROPHYSICAL PARAMETERS OF LS 2883 AND IMPLICATIONS FOR THE PSR B1259-63 GAMMA-RAY BINARY

    International Nuclear Information System (INIS)

    Negueruela, Ignacio; Lorenzo, Javier; Ribo, Marc; Herrero, Artemio; Khangulyan, Dmitry; Aharonian, Felix A.

    2011-01-01

    Only a few binary systems with compact objects display TeV emission. The physical properties of the companion stars represent basic input for understanding the physical mechanisms behind the particle acceleration, emission, and absorption processes in these so-called gamma-ray binaries. Here we present high-resolution and high signal-to-noise optical spectra of LS 2883, the Be star forming a gamma-ray binary with the young non-accreting pulsar PSR B1259-63, showing it to rotate faster and be significantly earlier and more luminous than previously thought. Analysis of the interstellar lines suggests that the system is located at the same distance as (and thus is likely a member of) Cen OB1. Taking the distance to the association, d = 2.3 kpc, and a color excess of E(B - V) = 0.85 for LS 2883 results in M_V ∼ -4.4. Because of fast rotation, LS 2883 is oblate (R_eq ≅ 9.7 R_sun and R_pole ≅ 8.1 R_sun) and presents a temperature gradient (T_eq ∼ 27,500 K, log g_eq = 3.7; T_pole ∼ 34,000 K, log g_pole = 4.1). If the star did not rotate, it would have parameters corresponding to a late O-type star. We estimate its luminosity at log(L_*/L_sun) ≅ 4.79 and its mass at M_* ∼ 30 M_sun. The mass function then implies an inclination of the binary system i_orb ∼ 23°, slightly smaller than previous estimates. We discuss the implications of these new astrophysical parameters of LS 2883 for the production of high-energy and very high-energy gamma rays in the PSR B1259-63/LS 2883 gamma-ray binary system. In particular, the stellar properties are very important for prediction of the line-like bulk Comptonization component from the unshocked ultrarelativistic pulsar wind.

  10. RADHEAT-V3, a code system for generating coupled neutron and gamma-ray group constants and analyzing radiation transport

    International Nuclear Information System (INIS)

    Koyama, Kinji; Taji, Yukichi; Miyasaka, Shun-ichi; Minami, Kazuyoshi.

    1977-07-01

    The modular code system RADHEAT is for producing coupled multigroup neutron and gamma-ray cross section sets, analyzing the neutron and gamma-ray transport, and calculating the energy deposition and atomic displacements due to these radiations in a nuclear reactor or shield. The basic neutron cross sections and secondary gamma-ray production data are taken from ENDF/B and POPOP4 libraries respectively. The system (1) generates multigroup neutron cross sections, energy deposition coefficients and atomic displacement factors due to neutron reactions, (2) generates multigroup gamma-ray cross sections and energy transfer coefficients, (3) generates secondary gamma-ray production cross sections, (4) combines these cross sections into the coupled set, (5) outputs and updates the multigroup cross section libraries in convenient formats for other transport codes, (6) analyzes the neutron and gamma-ray transport and calculates the energy deposition and the number density of atomic displacements in a medium, (7) collapses the cross sections to a broad-group structure, by option, using the weighting functions obtained by one-dimensional transport calculation, and (8) plots, by option, multigroup cross sections, and neutron and gamma-ray distributions. Definitions of the input data required in various options of the code system are also given. (auth.)

  11. Electrodialytic extraction of heavy metals from Greenlandic MSWI fly ash as a function of remediation time and L/S ratio

    DEFF Research Database (Denmark)

    Kirkelund, Gunvor Marie; Jensen, Pernille Erland; Ottosen, Lisbeth M.

    2013-01-01

    , where the fly ash was suspended in distilled water in different liquid to solid (L/S) ratios. Remediation times of 7 and 14 days were tested and the current strength was 50 mA in all experiments. The highest removal was seen when an acidic pH in the fly ash suspension was obtained. In an experiment...... lasting 14 days with L/S 10, up to 60 % Cd, 45 % Zn, 20 % Ni and Ba was removed. Regardless of the remediation time and L/S ratio, the fraction of soluble Ba, Cr and Pb decreased due to the electrodialytic remediation. The electrodialytic remediation method showed potential as a treatment method...

  12. Synthesis and characterization of sulfonate polystyrene-lignosulfonate-alumina (SPS-LS-Al{sub 2}O{sub 3}) polyblends as electrolyte membranes for fuel cell

    Energy Technology Data Exchange (ETDEWEB)

    Gonggo, Siang Tandi, E-mail: standigonggo@yahoo.com [Chemistry Research Groups, Faculty of Teacher Training and Educational Sciences, Tadulako University (Indonesia)

    2015-09-30

    A new type of electrolyte membrane material has been prepared by blending sulfonated polystyrene (SPS), lignosulfonate (LS), and alumina (SPS-LS-Al{sub 2}O{sub 3}) and casting the polymer solution. The resulting polymer electrolyte membranes were then characterized by functional group analysis, mechanical properties, water uptake, ion exchange capacity, and proton conductivity. SPS-LS-Al{sub 2}O{sub 3} membranes with various alumina compositions were confirmed qualitatively by analysis of functional groups. Increasing the Al{sub 2}O{sub 3} ratio resulted in higher ion exchange capacity (IEC), mechanical strength and proton conductivity, but water uptake decreased. The SPS-LS-Al{sub 2}O{sub 3} blend showed higher proton conductivity than Nafion 117.

  13. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated...... code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code...... avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  14. SPRINT: A Tool to Generate Concurrent Transaction-Level Models from Sequential Code

    Directory of Open Access Journals (Sweden)

    Richard Stahl

    2007-01-01

    Full Text Available A high-level concurrent model such as a SystemC transaction-level model can provide early feedback during the exploration of implementation alternatives for state-of-the-art signal processing applications like video codecs on a multiprocessor platform. However, the creation of such a model starting from sequential code is a time-consuming and error-prone task. It is typically done only once, if at all, for a given design. This lack of exploration of the design space often leads to a suboptimal implementation. To support our systematic C-based design flow, we have developed a tool to generate a concurrent SystemC transaction-level model for user-selected task boundaries. Using this tool, different parallelization alternatives have been evaluated during the design of an MPEG-4 simple profile encoder and an embedded zero-tree coder. Generation plus evaluation of an alternative was possible in less than six minutes. This is fast enough to allow extensive exploration of the design space.

  15. Electromagnetic transitions of heavy quarkonia in the boosted LS-coupling scheme

    International Nuclear Information System (INIS)

    Ishida, Shin; Morikawa, Akiyoshi; Oda, Masuho

    1998-01-01

    Radiative transitions among heavy quarkonium systems are investigated in a general framework of the boosted LS-coupling (BLS) scheme, where mesons are treated in a manifestly covariant way and conserved effective currents are explicitly given. As a result it is shown that our theory reproduces the qualitative features of experiments remarkably well, giving evidence for the validity of the BLS scheme. (author)

  16. Study on the performance of the Particle Identification Detectors at LHCb after the LHC First Long Shutdown (LS1)

    CERN Document Server

    Fontana, Marianna

    2016-01-01

    During the First Long Shutdown (LS1), the LHCb experiment introduced major modifications to the data-processing procedure and modified part of the detector to deal with the increased energy and the increased heavy-hadron production cross-section. In this contribution we review the performance of the particle identification detectors at LHCb (RICH, calorimeters and muon system) after LS1

  17. Implementation of LT codes based on chaos

    International Nuclear Information System (INIS)

    Zhou Qian; Li Liang; Chen Zengqiang; Zhao Jiaxiang

    2008-01-01

    Fountain codes provide an efficient way to transfer information over erasure channels like the Internet. LT codes are the first codes fully realizing the digital fountain concept. They are asymptotically optimal rateless erasure codes with highly efficient encoding and decoding algorithms. In theory, for each encoding symbol of LT codes, its degree is randomly chosen according to a predetermined degree distribution, and the neighbours used to generate that encoding symbol are chosen uniformly at random. Practical implementations of LT codes usually realize the randomness through a pseudo-random number generator such as the linear congruential method. This paper applies the pseudo-randomness of chaotic sequences in the implementation of LT codes. Two Kent chaotic maps are used to determine the degree and neighbour(s) of each encoding symbol. It is shown that the implemented LT codes based on chaos perform better than LT codes implemented with a traditional pseudo-random number generator. (general)
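
    A minimal sketch of the scheme described above, assuming a toy degree table and a single Kent-map stream for both choices (the paper uses two separate Kent maps, one for the degree and one for the neighbours); the map parameter, seed and packet layout are illustrative.

        # Sketch: LT encoding where a Kent (skew tent) chaotic map replaces the usual PRNG.

        def kent_map(x, a=0.7):
            """One iteration of the Kent map on (0, 1)."""
            return x / a if x < a else (1.0 - x) / (1.0 - a)

        def chaotic_stream(seed=0.123456, a=0.7):
            """Yield an endless stream of values in (0, 1) from the Kent map."""
            x = seed
            while True:
                x = kent_map(x, a)
                yield x

        def lt_encode_symbol(source_blocks, rng, degree_table):
            """Build one LT encoding symbol: draw a degree from the table,
            then XOR that many (chaotically chosen) source blocks."""
            u = next(rng)
            degree, cumulative = 1, 0.0
            for d, p in degree_table:          # (degree, probability) pairs
                cumulative += p
                if u <= cumulative:
                    degree = d
                    break
            neighbours = set()
            while len(neighbours) < degree:    # neighbour choice via the chaotic stream
                neighbours.add(int(next(rng) * len(source_blocks)) % len(source_blocks))
            symbol = 0
            for i in neighbours:
                symbol ^= source_blocks[i]
            return sorted(neighbours), symbol

        # Example: 8 one-byte source blocks and a toy degree distribution.
        blocks = [0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77, 0x88]
        table = [(1, 0.1), (2, 0.5), (3, 0.2), (4, 0.2)]
        rng = chaotic_stream()
        for _ in range(3):
            print(lt_encode_symbol(blocks, rng, table))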

  18. Diagnosis of leg osseous infection in diabetics: place of scintigraphy with polymorphonuclear leukocytes labelled with HMPAO-{sup 99m}Tc (HMPAO-LS); Diagnostic de l'infection osseuse du pied chez le diabetique: place de la scintigraphie aux polynucleaires marques a l'HMPAO-Tc 99m (HMPAO-LS)

    Energy Technology Data Exchange (ETDEWEB)

    Devillers, A.; Moisan, A.; Hennion, F.; Poirier, J.Y.; Bourguet, P. [CRLCC Eugene Marquis, Medecine Nucleaire, CHRU Endocrinologie, Rennes (France)

    1997-12-31

    In case of leg infection in diabetics it is difficult to differentiate between a chronic osteopathy, an infection of the soft tissues and an osteomyelitis (OM). We considered it of interest to evaluate, in a prospective HMPAO-LS study, the diagnosis of osteitis in diabetic legs. Twenty-seven diabetic patients (DID type 1 = 11, DNID type 2 = 16; 19 M and 8 F, average age 64 years), clinically suspected of a leg osteo-articular infection (malum perforans or cellulitis, uni- or bilateral, single or multiple lesions), were included in the study. For all of them a standard radiographic examination centered on the legs, a (three-phase) osseous scintigraphy and an HMPAO-LS were performed over 3 days. The HMPAO-LS was considered in favour of an OM when there was an abnormal accumulation of granulocytes concordant with a hyper-fixation on osseous scintigraphy. Of the 37 retained lesions, 20 OM were proven: 8 sites on radiological arguments and histological and/or microbiological criteria after osseous biopsy; 8 by the radiological examination only; and 4 on biopsy arguments only. Seventeen lesions were not OM, of which 7 were cellulitis. The sensitivity of the HMPAO-LS was 90% with a specificity of 94%. 8/20 OM had normal initial radiographs, confirming the early detection of OM by HMPAO-LS. Twelve HMPAO-LS were performed for the follow-up of OM evolution. All of them became negative at the initial site upon healing of the OM. In conclusion, the HMPAO-LS appears to be the alternative of choice for the diagnosis and follow-up of leg OM in diabetics, particularly when the healing of the soft tissues of the malum perforans is incomplete and the question of osteitis healing or of a sequela of antibiotic therapy is raised

  19. Quenches after LS1

    International Nuclear Information System (INIS)

    Verweij, A.P.

    2012-01-01

    In this paper I will give an overview of the different types of quenches that occur in the LHC, followed by an estimate of the number of quenches that we can expect after LS1. Beam-induced quenches and false triggering of the QPS will be the main cause of those quenches that cause a beam dump, possibly up to 10-20 per year in total. After consolidation of the 13 kA joints, the approach for the BLM settings can be less conservative than in 2010-2012 in order to maximize beam time. This will cause some quenches but, in any case, a beam-induced quench is no more risky than a quench provoked by false triggering. It is not easy to predict the number of BLM-triggered beam dumps needed to avoid magnet quenches, because it is not certain how to scale beam losses and UFOs from 3.5 TeV to 6.5 TeV, nor whether the thresholds at 3.5 TeV are correct. Quench events will be much more massive (e.g. an RB quench at 6 kA → 2 MJ, an RB quench at 11 kA → 6-20 MJ), and as a result cryogenic recuperation will take much longer. There will also be more ramp-induced quenches after an FPA in other circuits due to higher ramp rates and smaller temperature margins (mutual coupling)

  20. Fast-Solving Quasi-Optimal LS-S3VM Based on an Extended Candidate Set.

    Science.gov (United States)

    Ma, Yuefeng; Liang, Xun; Kwok, James T; Li, Jianping; Zhou, Xiaoping; Zhang, Haiyan

    2018-04-01

    The semisupervised least squares support vector machine (LS-S3VM) is an important enhancement of least squares support vector machines in semisupervised learning. Given that most data collected from the real world are without labels, semisupervised approaches are more applicable than standard supervised approaches. Although a few training methods for LS-S3VM exist, the problem of deriving the optimal decision hyperplane efficiently and effectually has not been solved. In this paper, a fully weighted model of LS-S3VM is proposed, and a simple integer programming (IP) model is introduced through an equivalent transformation to solve the model. Based on the distances between the unlabeled data and the decision hyperplane, a new indicator is designed to represent the possibility that the label of an unlabeled datum should be reversed in each iteration during training. Using the indicator, we construct an extended candidate set consisting of the indices of unlabeled data with high possibilities, which integrates more information from unlabeled data. Our algorithm is degenerated into a special scenario of the previous algorithm when the extended candidate set is reduced into a set with only one element. Two strategies are utilized to determine the descent directions based on the extended candidate set. Furthermore, we developed a novel method for locating a good starting point based on the properties of the equivalent IP model. Combined with the extended candidate set and the carefully computed starting point, a fast algorithm to solve LS-S3VM quasi-optimally is proposed. The choice of quasi-optimal solutions results in low computational cost and avoidance of overfitting. Experiments show that our algorithm equipped with the two designed strategies is more effective than other algorithms in at least one of the following three aspects: 1) computational complexity; 2) generalization ability; and 3) flexibility. However, our algorithm and other algorithms have

  1. Diagnosis of leg osseous infection in diabetics: place of scintigraphy with polymorphonuclear leukocytes labelled with HMPAO-99mTc (HMPAO-LS)

    International Nuclear Information System (INIS)

    Devillers, A.; Moisan, A.; Hennion, F.; Poirier, J.Y.; Bourguet, P.

    1997-01-01

    In case of leg infection in diabetics it is difficult to differentiate between a chronic osteopathy, an infection of the soft tissues and an osteomyelitis (OM). We considered it of interest to evaluate, in a prospective HMPAO-LS study, the diagnosis of osteitis in diabetic legs. Twenty-seven diabetic patients (DID type 1 = 11, DNID type 2 = 16; 19 M and 8 F, average age 64 years), clinically suspected of a leg osteo-articular infection (malum perforans or cellulitis, uni- or bilateral, single or multiple lesions), were included in the study. For all of them a standard radiographic examination centered on the legs, a (three-phase) osseous scintigraphy and an HMPAO-LS were performed over 3 days. The HMPAO-LS was considered in favour of an OM when there was an abnormal accumulation of granulocytes concordant with a hyper-fixation on osseous scintigraphy. Of the 37 retained lesions, 20 OM were proven: 8 sites on radiological arguments and histological and/or microbiological criteria after osseous biopsy; 8 by the radiological examination only; and 4 on biopsy arguments only. Seventeen lesions were not OM, of which 7 were cellulitis. The sensitivity of the HMPAO-LS was 90% with a specificity of 94%. 8/20 OM had normal initial radiographs, confirming the early detection of OM by HMPAO-LS. Twelve HMPAO-LS were performed for the follow-up of OM evolution. All of them became negative at the initial site upon healing of the OM. In conclusion, the HMPAO-LS appears to be the alternative of choice for the diagnosis and follow-up of leg OM in diabetics, particularly when the healing of the soft tissues of the malum perforans is incomplete and the question of osteitis healing or of a sequela of antibiotic therapy is raised

  2. The Ne3LS Network, Québec's initiative to evaluate the impact and promote a responsible and sustainable development of nanotechnology

    Science.gov (United States)

    Endo, Charles-Anica; Emond, Claude; Battista, Renaldo; Parizeau, Marie-Hélène; Beaudry, Catherine

    2011-07-01

    The spectacular progress made by nanosciences and nanotechnologies elicits as much hope and fear. Consequently, a great number of research and training initiatives on the ethical, environmental, economic, legal and social issues regarding nanotechnology development (Ne3LS) are emerging worldwide. In Québec, Canada, a Task Force was mandated by NanoQuébec to conceive a Ne3LS research and training strategy to assess those issues. This Task Force brought together experts from universities, governments or industry working in nanosciences and nanotechnologies or in Ne3LS. Their resulting action plan, made public in November 2006, contained several recommendations, including the creation of a knowledge network (Ne3LS Network). In the following years, after consulting with numerous key players concerned with the possible impacts of nanosciences and nanotechnologies in Québec, the Ne3LS Network was launched in January 2010 in partnership with the Fonds québécois de la recherche sur la nature et les technologies, the Fonds québécois de la recherche sur la société et la culture and the Fonds de la recherche en santé du Québec, NanoQuébec, the Institut de recherche Robert-Sauvé en santé et en sécurité du travail as well as the University of Montreal. Its objectives are to 1) Foster the development of Ne3LS research activities (grants and fellowships); 2) Spearhead the Canadian and international Ne3LS network; 3) Take part in the training of researchers and experts; 4) Encourage the creation of interactive tools for the general public; 5) Facilitate collaboration between decision-makers and experts; 6) Involve the scientific community through a host of activities (symposium, conferences, thematic events); 7) Build multidisciplinary research teams to evaluate the impact of nanotechnology.

  3. Semantic Web applications and tools for the life sciences: SWAT4LS 2010.

    Science.gov (United States)

    Burger, Albert; Paschke, Adrian; Romano, Paolo; Marshall, M Scott; Splendiani, Andrea

    2012-01-25

    As Semantic Web technologies mature and new releases of key elements, such as SPARQL 1.1 and OWL 2.0, become available, the Life Sciences continue to push the boundaries of these technologies with ever more sophisticated tools and applications. Unsurprisingly, therefore, interest in the SWAT4LS (Semantic Web Applications and Tools for the Life Sciences) activities has remained high, as was evident during the third international SWAT4LS workshop held in Berlin in December 2010. Contributors to this workshop were invited to submit extended versions of their papers, the best of which are now made available in the special supplement of BMC Bioinformatics. The papers reflect the wide range of work in this area, covering the storage and querying of Life Sciences data in RDF triple stores, tools for the development of biomedical ontologies and the semantics-based integration of Life Sciences as well as clinical data.

  4. State-of-the-art of wind turbine design codes: main features overview for cost-effective generation

    Energy Technology Data Exchange (ETDEWEB)

    Molenaar, D-P.; Dijkstra, S. [Delft University of Technology (Netherlands). Mechanical Engineering Systems and Control Group

    1999-07-01

    For successful large-scale application of wind energy, the price of electricity generated by wind turbines should decrease. Model-based control can be important since it has the potential to reduce fatigue loads, while simultaneously maintaining a desired amount of energy production. The controller synthesis, however, requires a mathematical model describing the most important dynamics of the complete wind turbine. In the wind energy community there is a wide variety in codes used to model a wind turbine's dynamic behaviour or to carry out design calculations. In this paper, the main features of the state-of-the-art wind turbine design codes have been investigated in order to judge the appropriateness of using one of these for the modeling, identification and control of flexible, variable speed wind turbines. It can be concluded that, although the sophistication of the design codes has increased enormously over the last two decades, they are, in general, not suitable for the design, and easy implementation of optimal operating strategies.

  5. Bursts generate a non-reducible spike-pattern code

    Directory of Open Access Journals (Sweden)

    Hugo G Eyherabide

    2009-05-01

    Full Text Available On the single-neuron level, precisely timed spikes can either constitute firing-rate codes or spike-pattern codes that utilize the relative timing between consecutive spikes. There has been little experimental support for the hypothesis that such temporal patterns contribute substantially to information transmission. Using grasshopper auditory receptors as a model system, we show that correlations between spikes can be used to represent behaviorally relevant stimuli. The correlations reflect the inner structure of the spike train: a succession of burst-like patterns. We demonstrate that bursts with different spike counts encode different stimulus features, such that about 20% of the transmitted information corresponds to discriminating between different features, and the remaining 80% is used to allocate these features in time. In this spike-pattern code, the "what" and the "when" of the stimuli are encoded in the duration of each burst and the time of burst onset, respectively. Given the ubiquity of burst firing, we expect similar findings also for other neural systems.

  6. Support vector machine regression (LS-SVM)--an alternative to artificial neural networks (ANNs) for the analysis of quantum chemistry data?

    Science.gov (United States)

    Balabin, Roman M; Lomakina, Ekaterina I

    2011-06-28

    A multilayer feed-forward artificial neural network (MLP-ANN) with a single, hidden layer that contains a finite number of neurons can be regarded as a universal non-linear approximator. Today, the ANN method and linear regression (MLR) model are widely used for quantum chemistry (QC) data analysis (e.g., thermochemistry) to improve their accuracy (e.g., Gaussian G2-G4, B3LYP/B3-LYP, X1, or W1 theoretical methods). In this study, an alternative approach based on support vector machines (SVMs) is used, the least squares support vector machine (LS-SVM) regression. It has been applied to ab initio (first principle) and density functional theory (DFT) quantum chemistry data. So, QC + SVM methodology is an alternative to QC + ANN one. The task of the study was to estimate the Møller-Plesset (MPn) or DFT (B3LYP, BLYP, BMK) energies calculated with large basis sets (e.g., 6-311G(3df,3pd)) using smaller ones (6-311G, 6-311G*, 6-311G**) plus molecular descriptors. A molecular set (BRM-208) containing a total of 208 organic molecules was constructed and used for the LS-SVM training, cross-validation, and testing. MP2, MP3, MP4(DQ), MP4(SDQ), and MP4/MP4(SDTQ) ab initio methods were tested. Hartree-Fock (HF/SCF) results were also reported for comparison. Furthermore, constitutional (CD: total number of atoms and mole fractions of different atoms) and quantum-chemical (QD: HOMO-LUMO gap, dipole moment, average polarizability, and quadrupole moment) molecular descriptors were used for the building of the LS-SVM calibration model. Prediction accuracies (MADs) of 1.62 ± 0.51 and 0.85 ± 0.24 kcal mol(-1) (1 kcal mol(-1) = 4.184 kJ mol(-1)) were reached for SVM-based approximations of ab initio and DFT energies, respectively. The LS-SVM model was more accurate than the MLR model. A comparison with the artificial neural network approach shows that the accuracy of the LS-SVM method is similar to the accuracy of ANN. The extrapolation and interpolation results show that LS-SVM is
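
    A minimal sketch of the QC + SVM idea described above: predict a large-basis-set energy from a small-basis-set energy plus simple molecular descriptors. Kernel ridge regression is used here as a stand-in for LS-SVM (both solve a regularized least-squares problem in a kernel feature space); the data, descriptors and hyperparameters are synthetic placeholders, not taken from the paper.

        # Sketch: LS-SVM-style regression of high-level QC energies via scikit-learn.
        import numpy as np
        from sklearn.kernel_ridge import KernelRidge
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_molecules = 200

        # Features: E(small basis), atom count, HOMO-LUMO gap, dipole moment.
        X = np.column_stack([
            rng.normal(-150.0, 40.0, n_molecules),   # small-basis energy (hartree)
            rng.integers(2, 30, n_molecules),        # number of atoms
            rng.uniform(0.1, 0.5, n_molecules),      # HOMO-LUMO gap (hartree)
            rng.uniform(0.0, 5.0, n_molecules),      # dipole moment (debye)
        ])
        # Target: large-basis energy = small-basis energy + a descriptor-dependent shift.
        y = X[:, 0] - 0.05 * X[:, 1] - 0.2 * X[:, 2] + rng.normal(0, 0.01, n_molecules)

        model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=1e-3)
        scores = cross_val_score(model, X, y, cv=5,
                                 scoring="neg_mean_absolute_error")
        print("cross-validated MAD:", -scores.mean())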

  7. Z₂-double cyclic codes

    OpenAIRE

    Borges, J.

    2014-01-01

    A binary linear code C is a Z2-double cyclic code if the set of coordinates can be partitioned into two subsets such that any cyclic shift of the coordinates of both subsets leaves invariant the code. These codes can be identified as submodules of the Z2[x]-module Z2[x]/(x^r − 1) × Z2[x]/(x^s − 1). We determine the structure of Z2-double cyclic codes giving the generator polynomials of these codes. The related polynomial representation of Z2-double cyclic codes and its duals, and the relation...

  8. Automated importance generation and biasing techniques for Monte Carlo shielding techniques by the TRIPOLI-3 code

    International Nuclear Information System (INIS)

    Both, J.P.; Nimal, J.C.; Vergnaud, T.

    1990-01-01

    We discuss an automated biasing procedure for generating the parameters necessary to achieve efficient biased Monte Carlo shielding calculations. The biasing techniques considered here are the exponential transform and collision biasing, deriving from the concept of the biased game based on the importance function. We use a simple model of the importance function with exponential attenuation as the distance to the detector increases. This importance function is generated on a three-dimensional mesh that includes the geometry, using graph theory algorithms. This scheme is currently being implemented in the third version of the neutron and gamma ray transport code TRIPOLI-3. (author)
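
    A minimal sketch of such an exponentially attenuated importance map on a 3-D mesh, of the kind used to drive exponential-transform and collision biasing. The attenuation coefficient, mesh and straight-line distance are illustrative assumptions; the code described above derives path lengths through the geometry with graph-theory algorithms rather than Euclidean distance.

        # Sketch: importance I(r) = exp(-k * |r - detector|) on a regular mesh.
        import numpy as np

        nx, ny, nz = 20, 20, 20
        cell_size = 5.0                          # cm, assumed mesh pitch
        detector = np.array([95.0, 50.0, 50.0])  # cm, assumed detector location
        k = 0.05                                 # 1/cm, assumed effective attenuation

        # Cell-centre coordinates of the mesh.
        xs = (np.arange(nx) + 0.5) * cell_size
        ys = (np.arange(ny) + 0.5) * cell_size
        zs = (np.arange(nz) + 0.5) * cell_size
        X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")

        dist = np.sqrt((X - detector[0])**2 + (Y - detector[1])**2 + (Z - detector[2])**2)
        importance = np.exp(-k * dist)           # grows toward the detector

        # Splitting/roulette ratio when a particle moves from cell A to cell B.
        a, b = (2, 10, 10), (3, 10, 10)
        print("importance ratio A->B:", importance[b] / importance[a])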

  9. Dielectronic recombination of P5+ and Cl7+ in configuration-average, LS-coupling, and intermediate-coupling approximations

    International Nuclear Information System (INIS)

    Badnell, N.R.; Pindzola, M.S.

    1989-01-01

    We have calculated dielectronic recombination cross sections and rate coefficients for the Ne-like ions P5+ and Cl7+ in configuration-average, LS-coupling, and intermediate-coupling approximations. Autoionization into excited states reduces the cross sections and rate coefficients by substantial amounts in all three methods. There is only rough agreement between the configuration-average cross-section results and the corresponding intermediate-coupling results. There is good agreement, however, between the LS-coupling cross-section results and the corresponding intermediate-coupling results. The LS-coupling and intermediate-coupling rate coefficients agree to better than 5%, while the configuration-average rate coefficients are about 30% higher than the other two coupling methods. External electric field effects, as calculated in the configuration-average approximation, are found to be relatively small for the cross sections and completely negligible for the rate coefficients. Finally, the general formula of Burgess was found to overestimate the rate coefficients by roughly a factor of 5, mainly due to the neglect of autoionization into excited states

  10. One-way quantum repeaters with quantum Reed-Solomon codes

    Science.gov (United States)

    Muralidharan, Sreraman; Zou, Chang-Ling; Li, Linshu; Jiang, Liang

    2018-05-01

    We show that quantum Reed-Solomon codes constructed from classical Reed-Solomon codes can approach the capacity on the quantum erasure channel of d-level systems for large dimension d. We study the performance of one-way quantum repeaters with these codes and obtain a significant improvement in key generation rate compared to previously investigated encoding schemes with quantum parity codes and quantum polynomial codes. We also compare the three generations of quantum repeaters using quantum Reed-Solomon codes and identify parameter regimes where each generation performs the best.

  11. Computer code to simulate transients in a steam generator of PWR nuclear power plants

    International Nuclear Information System (INIS)

    Silva, J.M. da.

    1979-01-01

    A digital computer code, KIBE, was developed to simulate the transient behavior of a steam generator used in Pressurized Water Reactor power plants. The equations of conservation of mass, energy and momentum were numerically integrated by an implicit method, progressively in the several axial sections into which the steam generator was divided. Forced convection heat transfer was assumed on the primary side, while on the secondary side all the different modes of heat transfer were permitted and determined from the various correlations. The stability of the stationary state was verified by its reproducibility during the integration of the conservation equations without any perturbation. Transient behavior resulting from perturbations in the flow and the internal energy (temperature) at the inlet of the primary side was simulated. The results obtained exhibited satisfactory behaviour. (author) [pt

  12. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged

  13. Modelling of WWER-1000 steam generators by REALP5/MOD3.2 code

    Energy Technology Data Exchange (ETDEWEB)

    D'Auria, F.; Galassi, G.M. [Univ. of Pisa (Italy); Frogheri, M. [Univ. of Genova (Italy)

    1997-12-31

    The presentation summarises the results of best estimate calculations carried out with reference to the WWER-1000 Nuclear Power Plant, utilizing a qualified nodalization set-up for the Relap5/Mod3.2 code. The nodalization development has been based on the data of the Kozloduy Bulgarian Plant. The geometry of the steam generator imposed drastic changes in noding philosophy with respect to what is suitable for the U-tubes steam generators. For the secondary side a symmetry axis was chosen to separate (in the nodalization) the hot and the cold sides of the tubes. In this way the secondary side of the steam generators was divided into three zones: (a) the hot zone including the hot collector and the hot 1/2 parts of the tubes; (b) the cold zone including the cold collector and the cold 1/2 parts of the tubes; (c) the downcomer region, where down flow is assumed. As a consequence of the above, more nodes are placed on the hot side of the tubes in the primary side. Steady state and transient qualification has been achieved, considering the criteria proposed at the University of Pisa, utilizing plant transient data from the Kozloduy and the Ukrainian Zaporosche Plants. The results of the application of the qualified WWER-1000 Relap5/Mod3.2 nodalization to various transients including large break LOCA, small break LOCA and steam generator tube rupture, together with a sensitivity analysis on the steam generators, are reported in the presentation. Emphasis is given to the prediction of the steam generators performances. 23 refs.

  14. Modelling of WWER-1000 steam generators by REALP5/MOD3.2 code

    Energy Technology Data Exchange (ETDEWEB)

    D'Auria, F; Galassi, G M [Univ. of Pisa (Italy); Frogheri, M [Univ. of Genova (Italy)

    1998-12-31

    The presentation summarises the results of best estimate calculations carried out with reference to the WWER-1000 Nuclear Power Plant, utilizing a qualified nodalization set-up for the Relap5/Mod3.2 code. The nodalization development has been based on the data of the Kozloduy Bulgarian Plant. The geometry of the steam generator imposed drastic changes in noding philosophy with respect to what is suitable for the U-tubes steam generators. For the secondary side a symmetry axis was chosen to separate (in the nodalization) the hot and the cold sides of the tubes. In this way the secondary side of the steam generators was divided into three zones: (a) the hot zone including the hot collector and the hot 1/2 parts of the tubes; (b) the cold zone including the cold collector and the cold 1/2 parts of the tubes; (c) the downcomer region, where down flow is assumed. As a consequence of the above, more nodes are placed on the hot side of the tubes in the primary side. Steady state and transient qualification has been achieved, considering the criteria proposed at the University of Pisa, utilizing plant transient data from the Kozloduy and the Ukrainian Zaporosche Plants. The results of the application of the qualified WWER-1000 Relap5/Mod3.2 nodalization to various transients including large break LOCA, small break LOCA and steam generator tube rupture, together with a sensitivity analysis on the steam generators, are reported in the presentation. Emphasis is given to the prediction of the steam generators performances. 23 refs.

  15. Evaluation of angular integrals in the generation of transfer matrices for multigroup transport codes

    International Nuclear Information System (INIS)

    Garcia, R.D.M.

    1985-01-01

    The generalization of a semi-analytical technique for the evaluation of angular integrals that appear in the generation of elastic and discrete inelastic transfer matrices for transport codes is carried out. In contrast to the generalized series expansions, which are found to be too complex and thus of little practical value when compared to the Gaussian quadrature technique, the recursion relations developed in this work are superior to the quadrature scheme for those cases where round-off error propagation is not significant. (Author) [pt

  16. Measuring Trace Gas Emission from Multi-Distributed Sources Using Vertical Radial Plume Mapping (VRPM and Backward Lagrangian Stochastic (bLS Techniques

    Directory of Open Access Journals (Sweden)

    Thomas K. Flesch

    2011-09-01

    Full Text Available Two micrometeorological techniques for measuring trace gas emission rates from distributed area sources were evaluated using a variety of synthetic area sources. The vertical radial plume mapping (VRPM and the backward Lagrangian stochastic (bLS techniques with an open-path optical spectroscopic sensor were evaluated for relative accuracy for multiple emission-source and sensor configurations. The relative accuracy was calculated by dividing the measured emission rate by the actual emission rate; thus, a relative accuracy of 1.0 represents a perfect measure. For a single area emission source, the VRPM technique yielded a somewhat high relative accuracy of 1.38 ± 0.28. The bLS technique resulted in a relative accuracy close to unity, 0.98 ± 0.24. Relative accuracies for dual source emissions for the VRPM and bLS techniques were somewhat similar to single source emissions, 1.23 ± 0.17 and 0.94 ± 0.24, respectively. When the bLS technique was used with vertical point concentrations, the relative accuracy was unacceptably low,
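
    A trivial sketch of the relative-accuracy metric defined above (measured emission rate divided by the actual release rate), summarised as mean plus or minus one standard deviation over repeated trials; the trial values are made up for illustration, not taken from the study.

        # Sketch: relative accuracy of recovered emission rates.
        import numpy as np

        actual_rate = 2.0  # g/s, assumed known release rate of the synthetic source
        measured_rates = np.array([2.1, 1.8, 2.4, 2.6, 1.9, 2.2])  # g/s, hypothetical

        relative_accuracy = measured_rates / actual_rate
        print(f"relative accuracy: {relative_accuracy.mean():.2f} "
              f"+/- {relative_accuracy.std(ddof=1):.2f}")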

  17. Finite Element Simulation of Medium-Range Blast Loading Using LS-DYNA

    Directory of Open Access Journals (Sweden)

    Yuzhen Han

    2015-01-01

    Full Text Available This study investigated the Finite Element simulation of blast loading using LS-DYNA. The objective is to identify approaches to reduce the requirement of computation effort while maintaining reasonable accuracy, focusing on blast loading scheme, element size, and its relationship with scale of explosion. The study made use of the recently developed blast loading scheme in LS-DYNA, which removes the necessity to model the explosive in the numerical models but still maintains the advantages of nonlinear fluid-structure interaction. It was found that the blast loading technique could significantly reduce the computation effort. It was also found that the initial density of air in the numerical model could be purposely increased to partially compensate the error induced by the use of relatively large air elements. Using the numerical approach, free air blast above a scaled distance of 0.4 m/kg1/3 was properly simulated, and the fluid-structure interaction at the same location could be properly duplicated using proper Arbitrary Lagrangian Eulerian (ALE coupling scheme. The study also showed that centrifuge technique, which has been successfully employed in model tests to investigate the blast effects, may be used when simulating the effect of medium- to large-scale explosion at small scaled distance.
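
    A minimal sketch of the Hopkinson-Cranz scaled distance Z = R / W^(1/3), the quantity the study uses to state its validity range (free-air blast properly simulated above 0.4 m/kg^(1/3)); the charge masses and stand-offs below are example values only.

        # Sketch: scaled distance for a few hypothetical charge/stand-off combinations.

        def scaled_distance(standoff_m, charge_kg_tnt):
            """Return the scaled distance in m/kg^(1/3)."""
            return standoff_m / charge_kg_tnt ** (1.0 / 3.0)

        cases = [(5.0, 100.0), (2.0, 500.0), (1.0, 1000.0)]   # (stand-off m, TNT kg)
        for r, w in cases:
            z = scaled_distance(r, w)
            ok = "within" if z >= 0.4 else "below"
            print(f"R={r} m, W={w} kg TNT -> Z={z:.2f} m/kg^(1/3) ({ok} the reported range)")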

  18. Remote-Handled Transuranic Content Codes

    International Nuclear Information System (INIS)

    2001-01-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document represents the development of a uniform content code system for RH-TRU waste to be transported in the 72-B cask. It will be used to convert existing waste form numbers, content codes, and site-specific identification codes into a system that is uniform across the U.S. Department of Energy (DOE) sites. The existing waste codes at the sites can be grouped under uniform content codes without any loss of waste characterization information. The RH-TRUCON document provides an all-encompassing description for each content code and compiles this information for all DOE sites. Compliance with waste generation, processing, and certification procedures at the sites (outlined in this document for each content code) ensures that prohibited waste forms are not present in the waste. The content code gives an overall description of the RH-TRU waste material in terms of processes and packaging, as well as the generation location. This helps to provide cradle-to-grave traceability of the waste material so that the various actions required to assess its qualification as payload for the 72-B cask can be performed. The content codes also impose restrictions and requirements on the manner in which a payload can be assembled. The RH-TRU Waste Authorized Methods for Payload Control (RH-TRAMPAC), Appendix 1.3.7 of the 72-B Cask Safety Analysis Report (SAR), describes the current governing procedures applicable for the qualification of waste as payload for the 72-B cask. The logic for this classification is presented in the 72-B Cask SAR. Together, these documents (RH-TRUCON, RH-TRAMPAC, and relevant sections of the 72-B Cask SAR) present the foundation and justification for classifying RH-TRU waste into content codes. Only content codes described in this document can be considered for transport in the 72-B cask. Revisions to this document will be made as additional waste qualifies for transport. Each content code uniquely

  19. Estimation of reactor core calculation by HELIOS/MASTER at power generating condition through DeCART, whole-core transport code

    International Nuclear Information System (INIS)

    Kim, H. Y.; Joo, H. G.; Kim, K. S.; Kim, G. Y.; Jang, M. H.

    2003-01-01

    The reactivity and power distribution errors of the HELIOS/MASTER core calculation under power generating conditions are assessed using the whole-core transport code DeCART. For this work, the cross section tablesets were generated for a medium sized PWR following the standard procedure and two-group nodal core calculations were performed. The test cases include the HELIOS calculations for 2-D assemblies at constant thermal conditions, MASTER 3D assembly calculations at power generating conditions, and the core calculations at HZP, HFP, and an abnormal power condition. In all these cases, the results of the DeCART code, in which pinwise thermal feedback effects are incorporated, are used as the reference. The core reactivity, assemblywise power distribution, axial power distribution, peaking factor, and thermal feedback effects are then compared. The comparison shows that the errors of the HELIOS/MASTER system in core reactivity, assemblywise power distribution, and pin peaking factor are only 100∼300 pcm, 3%, and 2%, respectively. As far as the detailed pinwise power distribution is concerned, however, errors greater than 15% are observed

  20. Calculation of neutron spectra produced in neutron generator target: Code testing.

    Science.gov (United States)

    Gaganov, V V

    2018-03-01

    DT-neutron spectra calculated using the SRIANG code were benchmarked against the results obtained by widely used Monte Carlo codes: PROFIL, SHORIN, TARGET, ENEA-JSI, MCUNED, DDT and NEUSDESC. The comparison of the spectra obtained by different codes confirmed the correctness of the SRIANG calculations. The cross-checking of the compared spectra revealed some systematic features and possible errors of the analysed codes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. LS1 Report: short-circuit tests

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    As the LS1 draws to an end, teams move from installation projects to a phase of intense testing. Among these are the so-called 'short-circuit tests'. Currently under way at Point 7, these tests verify the cables, the interlocks, the energy extraction systems, the power converters that provide current to the superconducting magnets and the cooling system.   Thermal camera images taken during tests at point 4 (IP4). Before putting beam into the LHC, all of the machine's hardware components need to be put to the test. Out of these, the most complicated are the superconducting circuits, which have a myriad of different failure modes with interlock and control systems. While these will be tested at cold - during powering tests to be done in August - work can still be done beforehand. "While the circuits in the magnets themselves cannot be tested at warm, what we can do is verify the power converter and the circuits right up to the place the cables go into the magn...

  2. LS1 Report: Summer cool down

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    As the final LS1 activities are carried out in the machine, teams have been cooling down the accelerator sector by sector in preparation for beams.   The third sector of the LHC to be cooled down - sector 1-2 - has seen the process begin this week. During the cool-down phase, survey teams are measuring and smoothing (or realigning) the magnets at cold. By the end of August, five sectors of the machine will be in the process of cooling down, with one (sector 6-7) at cold. The LHC Access Safety System (LASS) is now being commissioned, and will be validated during the DSO tests at the beginning of October. As teams consolidate the modifications made to LASS during the shutdown, many points were closed for testing purposes. The CSCM (copper stabiliser continuity measurement) tests have been completed in the first sector (6-7) and no defect has been found. These results will be presented to the LHC Machine Committee next week. CSCM tests will start in the second sector in mid-August. Following many...

  3. LS1 Report: onwards and upwards

    CERN Multimedia

    Katarina Anthony

    2013-01-01

    For the first time since 2008, engineers have taken most of the LHC’s electromagnetic circuits up to the current needed for magnets to guide beams around the machine at the design energy of 7 TeV. This first phase of intensive tests has been instrumental for the planning of upcoming machine interventions.   All of the circuits in Sector 67 were powered to a 7 TeV equivalent current, with the main circuits (to be consolidated during LS1) powered at 4 TeV. Around 1700 magnet circuits are needed to circulate beams in the LHC. Come 2015, each and every one of these circuits will have to be able to accept their 7 TeV equivalent current. For the LHC’s 24 main dipole and quadrupole circuits, this will mean the consolidation of all their interconnections. But what about the rest of the LHC’s circuits that had been mostly operating at around 60% of the nominal value? How will they handle the ramp-up to design energy? Those questions were asked and answered during the rec...

  4. LS1 Report: LHCb's early Christmas

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    Accelerator chain up and running... CCC Operators back at their desks... all telltale signs of the start of Run 2! For the experiments, that means there are just a few short weeks left for them to prepare for beams. Over at LHCb, teams have kept ahead of the curve by focusing on new installations and improvements.   A delicate task: re-connecting the beam pipe in LHCb. From the primary detector services to the DAQ system to the high level trigger, November's injector test beams saw their way through a well-prepared LHCb experiment. “We set the transfer line tests as our deadline for the restart - the entire experiment had to be at nominal position and conditions,” says Eric Thomas, LHCb deputy Technical Coordinator and LHCb LS1 Project Coordinator. “Achieving this was a major milestone for the collaboration. If beam were to come tomorrow, we would be ready.” The injector tests gave the LHCb team a chance to synchronise their detectors, and to al...

  5. Balanced and sparse Tamo-Barg codes

    KAUST Repository

    Halbawi, Wael; Duursma, Iwan; Dau, Hoang; Hassibi, Babak

    2017-01-01

    We construct balanced and sparse generator matrices for Tamo and Barg's Locally Recoverable Codes (LRCs). More specifically, for a cyclic Tamo-Barg code of length n, dimension k and locality r, we show how to deterministically construct a generator matrix where the number of nonzeros in any two columns differs by at most one, and where the weight of every row is d + r - 1, where d is the minimum distance of the code. Since LRCs are designed mainly for distributed storage systems, the results presented in this work provide a computationally balanced and efficient encoding scheme for these codes. The balanced property ensures that the computational effort exerted by any storage node is essentially the same, whilst the sparse property ensures that this effort is minimal. The work presented in this paper extends a similar result previously established for Reed-Solomon (RS) codes, where it is now known that any cyclic RS code possesses a generator matrix that is balanced as described, but is sparsest, meaning that each row has d nonzeros.
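
    A minimal sketch that verifies the "balanced" and "sparse" properties described above for a candidate generator matrix. The toy binary matrix below is illustrative only; the paper's construction applies to cyclic Tamo-Barg (and Reed-Solomon) codes over larger fields, with target row weight d + r - 1.

        # Sketch: checking the balanced and uniform-row-weight properties of G.
        import numpy as np

        def is_balanced(G):
            """Column nonzero counts differ by at most one."""
            col_weights = (G != 0).sum(axis=0)
            return col_weights.max() - col_weights.min() <= 1

        def has_row_weight(G, w):
            """Every row has exactly w nonzero entries."""
            return bool(((G != 0).sum(axis=1) == w).all())

        # Toy k x n matrix (k = 3, n = 6): every row has weight 4,
        # and every column has weight 2, so the matrix is balanced.
        G = np.array([[1, 1, 1, 1, 0, 0],
                      [0, 0, 1, 1, 1, 1],
                      [1, 1, 0, 0, 1, 1]])

        print("balanced:", is_balanced(G))            # True
        print("row weight 4:", has_row_weight(G, 4))  # True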

  6. Balanced and sparse Tamo-Barg codes

    KAUST Repository

    Halbawi, Wael

    2017-08-29

    We construct balanced and sparse generator matrices for Tamo and Barg\\'s Locally Recoverable Codes (LRCs). More specifically, for a cyclic Tamo-Barg code of length n, dimension k and locality r, we show how to deterministically construct a generator matrix where the number of nonzeros in any two columns differs by at most one, and where the weight of every row is d + r - 1, where d is the minimum distance of the code. Since LRCs are designed mainly for distributed storage systems, the results presented in this work provide a computationally balanced and efficient encoding scheme for these codes. The balanced property ensures that the computational effort exerted by any storage node is essentially the same, whilst the sparse property ensures that this effort is minimal. The work presented in this paper extends a similar result previously established for Reed-Solomon (RS) codes, where it is now known that any cyclic RS code possesses a generator matrix that is balanced as described, but is sparsest, meaning that each row has d nonzeros.

  7. Lymphoscintigraphy (LS) in infants and children: techniques and scintigraphic patterns

    International Nuclear Information System (INIS)

    Somerville, J.; Parsons, G.; Howman-Giles, R.; Lewis, G.; Uren, R.; Mansberg, R.

    1999-01-01

    Full text: Radionuclide imaging of the lymphatic system with intradermal injection of radiopharmaceutical is a rapid, safe and simple technique for the evaluation of lymphatic abnormalities in infants and children. 99Tcm-antimony sulphide colloid is the agent of choice. The radiopharmaceutical (dose 5 MBq in 0.1 ml) is injected intradermally into both limbs being investigated. Emla cream is useful to reduce the initial discomfort of the injection. Imaging is performed immediately for approximately 30-60 min to assess the flow rate and lymph channels to the draining node fields. Further imaging at 2 and 4 h may be necessary. Normal LS in the lower limbs shows the tracer to pass into lymphatics almost immediately and channels are usually visualized within 5 min. In the lower limbs, symmetrical lymph flow to nodes is seen in the groin, iliac and paravertebral region with activity later seen in the liver. In 31 patients studied over the last 3 years, 17 studies were normal and 14 were abnormal: Klippel-Trenaunay-Weber syndrome (n = 4) with delayed flow and dermal backflow; congenital lymph/vascular malformations (n = 6) with various delayed flow patterns and focal accumulations; congenital lymphoedema (n = 3) and pulmonary lymphangiectasia (n = 1). Aplasia and hypoplasia of lymph systems are readily identified. In conclusion, LS is a valuable diagnostic technique to assess lymph flow and diagnose lymphatic malformations and the causes of lymphoedema in children

  8. Characterization of open-cycle coal-fired MHD generators. Quarterly technical summary report No. 6, October 1--December 31, 1977. [PACKAGE code

    Energy Technology Data Exchange (ETDEWEB)

    Kolb, C.E.; Yousefian, V.; Wormhoudt, J.; Haimes, R.; Martinez-Sanchez, M.; Kerrebrock, J.L.

    1978-01-30

    Research has included theoretical modeling of important plasma chemical effects such as: conductivity reductions due to condensed slag/electron interactions; conductivity and generator efficiency reductions due to the formation of slag-related negative ion species; and the loss of alkali seed due to chemical combination with condensed slag. A summary of the major conclusions in each of these areas is presented. A major output of the modeling effort has been the development of an MHD plasma chemistry core flow model. This model has been formulated into a computer program designated the PACKAGE code (Plasma Analysis, Chemical Kinetics, And Generator Efficiency). The PACKAGE code is designed to calculate the effect of coal rank, ash percentage, ash composition, air preheat temperatures, equivalence ratio, and various generator channel parameters on the overall efficiency of open-cycle, coal-fired MHD generators. A complete description of the PACKAGE code and a preliminary version of the PACKAGE user's manual are included. A laboratory measurements program involving direct, mass spectrometric sampling of the positive and negative ions formed in a one atmosphere coal combustion plasma was also completed during the contract's initial phase. The relative ion concentrations formed in a plasma due to the methane augmented combustion of pulverized Montana Rosebud coal with potassium carbonate seed and preheated air are summarized. Positive ions measured include K+, KO+, Na+, Rb+, Cs+, and CsO+, while negative ions identified include PO3-, PO2-, BO2-, OH-, SH-, and probably HCrO3, HMoO4-, and HWO3-. Comparison of the measurements with PACKAGE code predictions is presented. Preliminary design considerations for a mass spectrometric sampling probe capable of characterizing coal combustion plasmas from full scale combustors and flow trains are presented

  9. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs

  10. Preliminary results on clinical effects of probiotic Lactobacillus salivarius LS01 in children affected by atopic dermatitis.

    Science.gov (United States)

    Niccoli, Antonio A; Artesi, Anna L; Candio, Francesco; Ceccarelli, Sara; Cozzali, Rita; Ferraro, Luigi; Fiumana, Donatella; Mencacci, Manuela; Morlupo, Maurizio; Pazzelli, Paola; Rossi, Laura; Toscano, Marco; Drago, Lorenzo

    2014-01-01

    The goal of this study was to evaluate the clinical efficacy of an intake of Lactobacillus salivarius LS01 (DSM 22775) for the treatment of atopic dermatitis (AD) in children. AD is an inflammatory and pruritic chronic relapsing skin disorder with multifactorial etiopathology. Some evidence suggests that probiotics may improve AD by modulating the immune system and the composition of intestinal microbiota. A total of 43 patients aged from 0 to 11 years were enrolled in the study (M/F ratio=1:1) and treated with the probiotic strain L. salivarius LS01. Clinical efficacy of probiotic treatment was assessed from baseline by changes in itch index and in the objective SCORAD/SCORAD index. Patients being given probiotic treatment showed a significant improvement in clinical parameters (SCORAD and itch values) from baseline. The reduction in SCORAD and itch index observed after 4 weeks of treatment also persisted after the cessation of probiotic supplementation. L. salivarius LS01 seems to be able to improve the quality of life of children affected by AD and, as a consequence, it may have promising clinical and research implications.

  11. A novel method of generating and remembering international morse codes

    Digital Repository Service at National Institute of Oceanography (India)

    Charyulu, R.J.K.

    Although untethered communications have advanced considerably, the S.O.S. International Morse Code remains a rescue tool in emergencies, when all other modes fail. The details of the method and the actual codes have been enumerated....

  12. Loft CIS analysis 2''-LS-118-AB outside containment penetration S5-D

    International Nuclear Information System (INIS)

    Morton, D.K.

    1978-01-01

    A stress analysis was performed on the 2''-LS-118-AB pipe system outside containment penetration S5-D. Deadweight, thermal expansion, and seismic loads were considered. The results indicate that this piping will meet ASME Section III, Class 2 requirements provided a U-bolt (S4) is installed as indicated in this report

  13. Simulation of sludge deposit onto a 900 MW steam generator tubesheet with the 3D code GENEPI

    International Nuclear Information System (INIS)

    Pascal-Ribot, S.; Debec-Mathet, E.; Soussan, D.; Grandotto, M.

    1998-01-01

    Heat transfer processes use fluids which are generally not pure and can react with transfer surfaces. These surfaces are subject to deposits which can be sediments harmful to heat transfer and to integrity of materials. For nuclear plant steam generators, sludge build-up accelerates secondary side corrosion by concentrating chemical species. A major safety problem involved with such a corrosion is the growing of circumferential cracks which are very difficult to detect and size with eddy current probes. With a view to understand and control this problem, it is necessary to develop a mathematical model for the prediction of sludge behavior in PWR steam generators. Based on fundamental principles, this work intends to use different models available in literature for the prediction of the phenomenon leading to the accumulation of sludge particles at the bottom (the tubesheet) of a PWR. For that, a three-dimensional simulation of magnetite particulate fouling with the finite elements code GENEPI is performed on a 900 MWe steam generator. The use of GENEPI code, originally designed and qualified for the analysis of steam generators thermalhydraulics is done in two steps. First, the local thermalhydraulic conditions of the carrier phase are calculated with the classical conservation equations of mass, momentum and enthalpy for the steam/water mixture (homogeneous model). Then, they are used for the solving of a particle transport equation. The mass transfer processes, which have been taken into account, are gravitational settling, sticking probability and reentrainment describing respectively the transport of sludge particles to the tubesheet, the particle attachment to this surface and the re-suspension of deposited particles from the tubesheet. A sink term characterizing the blowdown effect is also considered in the calculations. Deposition on the tube bundle surface area is not modelled. For this first approach, the simulation is made with a single particle size and

  14. ARCADIA(R) - A New Generation of Coupled Neutronics/Core Thermal-Hydraulics Code System at AREVA NP

    International Nuclear Information System (INIS)

    Curca-Tivig, Florin; Merk, Stephan; Pautz, Andreas; Thareau, Sebastien

    2007-01-01

    Anticipating future needs of our customers and wishing to concentrate the synergies and competences existing in the company for the benefit of our customers, AREVA NP decided in 2002 to develop the next generation of coupled neutronics/core thermal-hydraulic (TH) code systems for fuel assembly and core design calculations for both PWR and BWR applications. The global CONVERGENCE project was born: after a feasibility study of one year (2002) and a conceptual phase of another year (2003), development was started at the beginning of 2004. The present paper introduces the CONVERGENCE project, presents the main features of the new code system ARCADIA(R) and concludes on the customer benefits. ARCADIA(R) is designed to meet AREVA NP market and customers' requirements worldwide. Besides state-of-the-art physical modeling, numerical performance and industrial functionality, the ARCADIA(R) system features state-of-the-art software engineering. The new code system will bring a series of benefits for our customers: e.g. improved accuracy for heterogeneous cores (MOX/UOX, Gd...), better description of nuclide chains, and access to local neutronics/thermal-hydraulics and possibly thermal-mechanical information (3D pin-by-pin full core modeling). ARCADIA is a registered trademark of AREVA NP. (authors)

  15. Dual Coding, Reasoning and Fallacies.

    Science.gov (United States)

    Hample, Dale

    1982-01-01

    Develops the theory that a fallacy is not a comparison of a rhetorical text to a set of definitions but a comparison of one person's cognition with another's. Reviews Paivio's dual coding theory, relates nonverbal coding to reasoning processes, and generates a limited fallacy theory based on dual coding theory. (PD)

  16. Essential idempotents and simplex codes

    Directory of Open Access Journals (Sweden)

    Gladys Chalom

    2017-01-01

    Full Text Available We define essential idempotents in group algebras and use them to prove that every minimal abelian non-cyclic code is a repetition code. Also we use them to prove that every minimal abelian code is equivalent to a minimal cyclic code of the same length. Finally, we show that a binary cyclic code is simplex if and only if it is of length of the form $n=2^k-1$ and is generated by an essential idempotent.

  17. Numerical Simulation of Wind Turbine Rotors Autorotation by Using the Modified LS-STAG Immersed Boundary Method

    Directory of Open Access Journals (Sweden)

    Ilia K. Marchevsky

    2017-01-01

    Full Text Available A software package is developed for numerical simulation of wind turbine rotors autorotation by using the modified LS-STAG level-set/cut-cell immersed boundary method. The level-set function is used for immersed boundaries description. An algorithm for level-set function construction for complex-shaped airfoils, based on the use of Bézier curves, is proposed. An algorithm for recalculating the level-set function at any time, without reconstructing the Bézier curve for each new rotor position, is also described. The designed second-order Butterworth low-pass filter for aerodynamic torque filtration for simulations using coarse grids is presented. To verify the modified LS-STAG method, the flow past an autorotating Savonius rotor with two blades was simulated at Re = 1.96·10^5.

  18. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  19. Biodistribution, pharmacokinetic, and imaging studies with 186Re-labeled NR-LU-10 whole antibody in LS174T colonic tumor-bearing mice

    International Nuclear Information System (INIS)

    Goldrosen, M.H.; Biddle, W.C.; Pancook, J.; Bakshi, S.; Vanderheyden, J.L.; Fritzberg, A.R.; Morgan, A.C. Jr.; Foon, K.A.

    1990-01-01

    Biodistribution, pharmacokinetic, and radioimaging studies were performed with 186Re-labeled NR-LU-10 whole antibody in athymic nude mice bearing the LS174T tumor growing either s.c. or in an experimental hepatic metastasis model. NR-LU-10 is an IgG2b murine monoclonal antibody (MAb) that reacts with virtually all human tumors of epithelial origin. NR-BC-1, an IgG2b murine MAb that reacts with normal human B cells and B-cell malignancies, was used as an isotype-matched control. These MAbs were radiolabeled with 186Re by a preformed chelate approach by using the triamide thiolate ligand system. 186Re-labeled NR-LU-10 (50 microCi) was injected into nude mice bearing LS174T tumors growing s.c. Biodistribution studies revealed that the LS174T tumor retained the highest concentration of 186Re-labeled NR-LU-10 at day 6. The tumor:blood ratio ranged from 0.1:1 to 10.8:1 by day 6, the last day of analysis. In contrast, the tumor:blood ratio of 186Re-labeled NR-BC-1, the isotype-matched MAb control, was 1:1 on day 6. Pharmacokinetic analysis indicated that the t1/2 beta of NR-LU-10 for blood and other tissues ranged from 21 to 25 h, while the t1/2 beta for the LS174T tumor averaged 52 h. The area under the curve for tumor compared to blood was 2.8- to 5.7-fold higher than the area under the curve for all other tissues and organs. The mean residence time for NR-LU-10 in blood and all other organs ranged from 23 to 26 h, while the mean residence time for NR-LU-10 in the LS174T tumor was 72 h. Scintigraphic images revealed selective uptake of the 186Re-labeled NR-LU-10, but not of the 186Re-labeled NR-BC-1, at the LS174T tumor site. Studies in an experimental model of hepatic metastasis revealed a similar selective pattern of 186Re-labeled NR-LU-10 accumulation. Scintigraphic images of the LS174T tumor growing within the athymic nude mouse liver were obtained

  20. Improving temporal resolution in fMRI using a 3D spiral acquisition and low rank plus sparse (L+S) reconstruction.

    Science.gov (United States)

    Petrov, Andrii Y; Herbst, Michael; Andrew Stenger, V

    2017-08-15

    Rapid whole-brain dynamic Magnetic Resonance Imaging (MRI) is of particular interest in Blood Oxygen Level Dependent (BOLD) functional MRI (fMRI). Faster acquisitions with higher temporal sampling of the BOLD time-course provide several advantages including increased sensitivity in detecting functional activation, the possibility of filtering out physiological noise for improving temporal SNR, and freezing out head motion. Generally, faster acquisitions require undersampling of the data which results in aliasing artifacts in the object domain. A recently developed low-rank (L) plus sparse (S) matrix decomposition model (L+S) is one of the methods that has been introduced to reconstruct images from undersampled dynamic MRI data. The L+S approach assumes that the dynamic MRI data, represented as a space-time matrix M, is a linear superposition of L and S components, where L represents highly spatially and temporally correlated elements, such as the image background, while S captures dynamic information that is sparse in an appropriate transform domain. This suggests that L+S might be suited for undersampled task or slow event-related fMRI acquisitions because the periodic nature of the BOLD signal is sparse in the temporal Fourier transform domain and slowly varying low-rank brain background signals, such as physiological noise and drift, will be predominantly low-rank. In this work, as a proof of concept, we exploit the L+S method for accelerating block-design fMRI using a 3D stack of spirals (SoS) acquisition where undersampling is performed in the kz-t domain. We examined the feasibility of the L+S method to accurately separate temporally correlated brain background information in the L component while capturing periodic BOLD signals in the S component. We present results acquired in control human volunteers at 3T for both retrospective and prospectively acquired fMRI data for a visual activation block-design task. We show that a SoS fMRI acquisition with an
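
    The L+S split itself can be illustrated in a few lines of NumPy. The sketch below is a simplified, fully sampled version of the decomposition described above (alternating singular-value thresholding for L and soft thresholding of the temporal Fourier transform for S); the threshold values are arbitrary and the k-space data-consistency step of the actual undersampled reconstruction is omitted.

      import numpy as np

      def soft(x, t):
          """Complex soft thresholding."""
          mag = np.abs(x)
          return x * np.maximum(mag - t, 0) / np.maximum(mag, 1e-12)

      def svt(x, t):
          """Singular value thresholding."""
          U, s, Vh = np.linalg.svd(x, full_matrices=False)
          return (U * np.maximum(s - t, 0)) @ Vh

      def l_plus_s(M, lam_l=1.0, lam_s=0.05, n_iter=50):
          """Split a voxels x time matrix M into low-rank L and sparse S parts."""
          L = np.zeros_like(M, dtype=complex)
          S = np.zeros_like(M, dtype=complex)
          for _ in range(n_iter):
              L = svt(M - S, lam_l)                      # slowly varying background
              F = np.fft.fft(M - L, axis=1)              # temporal Fourier transform
              S = np.fft.ifft(soft(F, lam_s), axis=1)    # periodic (BOLD-like) part
          return L, S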

  1. CMS outreach event to close LS1

    CERN Multimedia

    Achintya Rao

    2015-01-01

    CMS opened its doors to about 700 students from schools near CERN, who visited the detector on 16 and 17 February during the last major CMS outreach event of LS1.   Enthusiastic CMS guides spent a day and a half showing the equally enthusiastic visitors, aged 10 to 18, the beauty of CMS and particle physics. The recently installed wheelchair lift was called into action and enabled a visitor who arrived on crutches to access the detector cavern unimpeded.  The CMS collaboration had previously devoted a day to school visits after the successful “Neighbourhood Days” in May 2014 and, encouraged by the turnout, decided to extend an invitation to local schools once again. The complement of nearly 40 guides and crowd marshals was aided by a support team that coordinated the transportation of the young guests and received them at Point 5, where a dedicated safety team including first-aiders, security...

  2. Purification and genetic characterisation of the novel bacteriocin LS2 produced by the human oral strain Lactobacillus salivarius BGHO1.

    Science.gov (United States)

    Busarcevic, Milos; Dalgalarrondo, Michèle

    2012-08-01

    The aim of this study was to investigate the antimicrobial potential of Lactobacillus salivarius BGHO1, a human oral strain with probiotic characteristics and a broad inhibitory spectrum both against Gram-positive and Gram-negative pathogens. Here we present the bacteriocin LS2, an extremely pH- and heat-stable peptide with antilisterial activity. LS2 is a novel member of the class IId bacteriocins, unique among all currently characterised bacteriocins. It is somewhat similar to putative bacteriocins from several oral streptococci, including the cariogenic Streptococcus mutans. LS2 is a 41-amino-acid, highly hydrophobic cationic peptide of 4115.1Da that is sensitive to proteolytic enzymes. LS2 was purified from cells of strain BGHO1 by solvent extraction and reverse-phase chromatography. Mass spectrometry was used to determine the molecular mass of the purified peptide. N-terminal amino acid sequencing enabled identification of the LS2 structural gene bacls2 by a reverse genetics approach. Downstream of the bacls2 gene, two bacteriocin-like genes were found, named blp1a and blp1b, and one putative bacteriocin immunity gene named bimlp. We also present the identification of the 242-kb megaplasmid pMPHO1 by pulsed-field gel electrophoresis, which harbours the genes bacls2, blp1a, blp1b and bimlp. Two peptides with antimicrobial activity, whose approximate sizes corresponded to those of blp1a and blp1b, were identified only after culturing strain BGHO1 in a chemically defined medium. This study demonstrated the capacity of Lactobacillus salivarius BGHO1 to produce multiple bacteriocins and further established this strain as a promising probiotic candidate. Copyright © 2012 Elsevier B.V. and the International Society of Chemotherapy. All rights reserved.

  3. Audit of Clinical Coding of Major Head and Neck Operations

    Science.gov (United States)

    Mitra, Indu; Malik, Tass; Homer, Jarrod J; Loughran, Sean

    2009-01-01

    INTRODUCTION Within the NHS, operations are coded using the Office of Population Censuses and Surveys (OPCS) classification system. These codes, together with diagnostic codes, are used to generate Healthcare Resource Group (HRG) codes, which correlate to a payment bracket. The aim of this study was to determine whether allocated procedure codes for major head and neck operations were correct and reflective of the work undertaken. HRG codes generated were assessed to determine accuracy of remuneration. PATIENTS AND METHODS The coding of consecutive major head and neck operations undertaken in a tertiary referral centre over a retrospective 3-month period was assessed. Procedure codes were initially ascribed by professional hospital coders. Operations were then recoded by the surgical trainee in liaison with the head of clinical coding. The initial and revised procedure codes were compared and used to generate HRG codes, to determine whether the payment banding had altered. RESULTS A total of 34 cases were reviewed. The number of procedure codes generated initially by the clinical coders was 99, whereas the revised coding generated 146. Of the original codes, 47 of 99 (47.4%) were incorrect. In 19 of the 34 cases reviewed (55.9%), the HRG code remained unchanged, thus resulting in the correct payment. Six cases were never coded, equating to a £15,300 loss of payment. CONCLUSIONS These results highlight the inadequacy of this system to reward hospitals for the work carried out within the NHS in a fair and consistent manner. The current coding system was found to be complicated, ambiguous and inaccurate, resulting in loss of remuneration. PMID:19220944

  4. Effect of Co-overexpression of Nisin Key Genes on Nisin Production Improvement in Lactococcus lactis LS01.

    Science.gov (United States)

    Ni, Zhi-Jian; Zhang, Xiao-Yuan; Liu, Fei; Wang, Miao; Hao, Rong-Hua; Ling, Pei-Xue; Zhu, Xi-Qiang

    2017-06-01

    Nisin is a small antimicrobial peptide produced by several subset strains of Lactococcus lactis. To improve nisin yield in the producer L. lactis LS01, we proposed a successive fusion of nisA with nisRK and nisFEG into a single shuttle expression vector pMG36e under the control of the native strong constitutive promoter p32. Subsequently, the recombinant vectors were transplanted into the producer cell through electroporation. Nisin productivity was determined through sodium dodecyl sulfate-polyacrylamide gel electrophoresis and bioactivity assays. Expression of nisin peptide was detected by agar diffusion bioassay, and the transcriptional levels of the target genes involved in nisin biosynthesis were investigated via semi-quantitative reverse transcription PCR expression analysis using 16S ribosomal RNA (rRNA) as an internal control. Results suggested that the introduction of empty plasmid did not affect nisin production of L. lactis LS01, whereas by our rational construction and screening, the engineered strain co-overexpressing nisA, nisRK, and nisFEG achieved a maximum increment in bioactive nisin production with a yield of 2470 IU/ml in shake flasks and 4857 IU/ml in 1.0-l fermenters, which increased by approximately 66.3 and 52.6% (P < 0.05), respectively, compared with that of the original strain under the given fermentation conditions. Meanwhile, the transcriptional analysis revealed that the expression of most of these multicopy genes except nisE at transcriptional level were upregulated in the two recombinant strains (LS01/pAR and LS01/pARF), possibly contributing to the improved nisin production. Therefore, this study would provide a potential strategy to improve the economic benefits of nisin manufacture for large-scale industrial production.

  5. Development of the next generation code system as an engineering modelling language. 3. Study with prototyping. 2

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Chiba, Go; Kasahara, Naoto; Ishikawa, Makoto

    2004-04-01

    In fast reactor development, numerical simulations using analysis codes play an important role in complementing theory and experiment. In order to efficiently advance the research and development of fast reactors, JNC promotes the development of the next generation simulation code (NGSC). This report describes the results of an investigation by prototyping carried out for the conceptual design of the NGSC. In the framework of the cooperative research with CEA (Commissariat a l'Energie Atomique) in France, a survey of several platforms for numerical analysis and an evaluation of the applicability of CEA's SALOME platform to the NGSC were carried out. The evaluation confirmed that SALOME satisfies the requirements of efficiency, openness, universality, expansibility and completeness set by the NGSC. In addition, it was confirmed that SALOME provides the concept of the control layer required by the NGSC and is therefore one of the important candidate platforms for the NGSC. In the field of structural analysis, the prototype of the PRTS.NET code was re-examined from the viewpoint of class structure and input/output specification in order to improve data processing efficiency and maintainability. In the field of reactor physics analysis, a development test of a new code written in C++ and a reuse test of an existing code written in Fortran were carried out with a view to using SALOME for the NGSC. (author)

  6. Generation of initial geometries for the simulation of the physical system in the DualPHYsics code

    International Nuclear Information System (INIS)

    Segura Q, E.

    2013-01-01

    The Instituto Nacional de Investigaciones Nucleares (ININ) carries out diverse research activities related to science and technology; one of great interest is the study and treatment of the collection and storage of radioactive waste. At ININ, the project on the simulation of pollutant diffusion in soil through a porous medium (third stage) addresses aspects inherent to this problem, and a prerequisite for such a simulation is to generate the initial geometry of the physical system. The simulation is carried out with the smoothed particle hydrodynamics (SPH) method, implemented in the DualSPHysics code, which has great versatility and the ability to simulate phenomena of any physical system in which hydrodynamic aspects are combined. In order to simulate a physical system with the DualSPHysics code, the initial geometry of the system of interest must be preset and then included in the input file of the code. In this work the initial geometry is set up from regular geometric bodies positioned at different points in space, generated through a programming language (Fortran, C++, Java, etc.). This methodology will provide the basis for simulating more complex geometries, positions and shapes in the future. (Author)
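
    As a purely illustrative sketch of the kind of pre-processing involved (this is not the input format actually used by DualSPHysics, and the file layout is an assumption made only for the example), the following Python snippet fills a box with particle positions on a regular lattice and writes them to a plain text file.

      import numpy as np

      def box_particles(lx, ly, lz, dp):
          """Particle positions on a regular lattice of spacing dp inside an
          lx x ly x lz box -- one simple way to preset an initial geometry."""
          xs = np.arange(0.0, lx + 1e-9, dp)
          ys = np.arange(0.0, ly + 1e-9, dp)
          zs = np.arange(0.0, lz + 1e-9, dp)
          X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")
          return np.column_stack([X.ravel(), Y.ravel(), Z.ravel()])

      pts = box_particles(1.0, 0.5, 0.5, dp=0.05)
      np.savetxt("initial_geometry.txt", pts, header="x y z", comments="")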

  7. XML-Based Generator of C++ Code for Integration With GUIs

    Science.gov (United States)

    Hua, Hook; Oyafuso, Fabiano; Klimeck, Gerhard

    2003-01-01

    An open source computer program has been developed to satisfy a need for simplified organization of structured input data for scientific simulation programs. Typically, such input data are parsed in from a flat American Standard Code for Information Interchange (ASCII) text file into computational data structures. Also typically, when a graphical user interface (GUI) is used, there is a need to completely duplicate the input information while providing it to a user in a more structured form. Heretofore, the duplication of the input information has entailed duplication of software efforts and increases in susceptibility to software errors because of the concomitant need to maintain two independent input-handling mechanisms. The present program implements a method in which the input data for a simulation program are completely specified in an Extensible Markup Language (XML)-based text file. The key benefit for XML is storing input data in a structured manner. More importantly, XML allows not just storing of data but also describing what each of the data items are. That XML file contains information useful for rendering the data by other applications. It also then generates data structures in the C++ language that are to be used in the simulation program. In this method, all input data are specified in one place only, and it is easy to integrate the data structures into both the simulation program and the GUI. XML-to-C is useful in two ways: 1. As an executable, it generates the corresponding C++ classes and 2. As a library, it automatically fills the objects with the input data values.
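
    The idea can be sketched in a few lines. The snippet below is not the actual XML-to-C tool and uses a made-up schema; it only illustrates how a single XML description of the input parameters can be turned into a C++ data structure, so that the same specification could also drive a GUI.

      import xml.etree.ElementTree as ET

      SPEC = """<inputs name="SolverInput">
        <param name="max_iterations" type="int"    default="100"/>
        <param name="tolerance"      type="double" default="1e-6"/>
      </inputs>"""

      def emit_cpp(xml_text):
          """Emit a C++ struct declaration from the hypothetical XML spec above."""
          root = ET.fromstring(xml_text)
          lines = ["struct %s {" % root.get("name")]
          for p in root.findall("param"):
              lines.append("    %s %s = %s;" % (p.get("type"), p.get("name"), p.get("default")))
          lines.append("};")
          return "\n".join(lines)

      print(emit_cpp(SPEC))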

  8. LS1 Report: achieving the unachievable

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    The dismantling and extraction of a defective DFBA module from LHC Point 6, announced a few weeks ago, has been completed without a hitch. The DFBAs in the LHC are unique and irreplaceable components that must be handled with care.   The Transport team extract the defective module in one of the two DFBAs at Point 6. This module was brought to the surface, where it is currently being repared. Dismantling and extracting part of an electrical feed box (DFBA) had not been planned and could not have been foreseen. Nonetheless, that is what had to be done. When the LS1 teams discovered that the bellows of one of the DFBAs in Sector 5-6 were damaged - and completely inaccessible - they were not exactly overwhelmed with solutions. In fact, they had only one option: to dismantle them and take them up to the surface. Step 1: measure the alignment of the module to be taken out in relation to the beam lines to ensure that when the DFBA is put back in, it is in the right position for the beam to pass thr...

  9. FERMI/LAT OBSERVATIONS OF LS 5039

    International Nuclear Information System (INIS)

    Abdo, A. A.; Ackermann, M.; Ajello, M.; Bechtol, K.; Berenji, B.; Blandford, R. D.; Bloom, E. D.; Borgland, A. W.; Atwood, W. B.; Axelsson, M.; Baldini, L.; Bellazzini, R.; Bregeon, J.; Brez, A.; Ballet, J.; Barbiellini, G.; Bastieri, D.; Baughman, B. M.; Bonamente, E.; Brigida, M.

    2009-01-01

    The first results from observations of the high-mass X-ray binary LS 5039 using the Fermi Gamma-ray Space Telescope data between 2008 August and 2009 June are presented. Our results indicate variability that is consistent with the binary period, with the emission being modulated with a period of 3.903 ± 0.005 days; the first detection of this modulation at GeV energies. The light curve is characterized by a broad peak around superior conjunction in agreement with inverse Compton scattering models. The spectrum is represented by a power law with an exponential cutoff, yielding an overall flux (100 MeV-300 GeV) of 4.9 ± 0.5(stat) ± 1.8(syst) x 10^-7 photon cm^-2 s^-1, with a cutoff at 2.1 ± 0.3(stat) ± 1.1(syst) GeV and photon index Γ = 1.9 ± 0.1(stat) ± 0.3(syst). The spectrum is observed to vary with orbital phase, specifically between inferior and superior conjunction. We suggest that the presence of a cutoff in the spectrum may be indicative of magnetospheric emission similar to the emission seen in many pulsars by Fermi.

  10. Input data required for specific performance assessment codes

    International Nuclear Information System (INIS)

    Seitz, R.R.; Garcia, R.S.; Starmer, R.J.; Dicke, C.A.; Leonard, P.R.; Maheras, S.J.; Rood, A.S.; Smith, R.W.

    1992-02-01

    The Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory generated this report on input data requirements for computer codes to assist States and compacts in their performance assessments. This report gives generators, developers, operators, and users some guidelines on what input data is required to satisfy 22 common performance assessment codes. Each of the codes is summarized and a matrix table is provided to allow comparison of the various input required by the codes. This report does not determine or recommend which codes are preferable

  11. Distributed Video Coding for Multiview and Video-plus-depth Coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo

    The interest in Distributed Video Coding (DVC) systems has grown considerably in the academic world in recent years. With DVC the correlation between frames is exploited at the decoder (joint decoding). The encoder codes the frame independently, performing relatively simple operations. Therefore......, with DVC the complexity is shifted from encoder to decoder, making the coding architecture a viable solution for encoders with limited resources. DVC may empower new applications which can benefit from this reversed coding architecture. Multiview Distributed Video Coding (M-DVC) is the application...... of the to-be-decoded frame. Another key element is the Residual estimation, indicating the reliability of the SI, which is used to calculate the parameters of the correlation noise model between SI and original frame. In this thesis new methods for Inter-camera SI generation are analyzed in the Stereo...

  12. Coded ultrasonic remote control without batteries

    International Nuclear Information System (INIS)

    Gerhardy, C; Burlage, K; Schomburg, W K

    2009-01-01

    A concept for battery-less remote controls has been developed based on mechanically actuated beams and micro whistles generating ultrasound signals. These signals need to be frequency or time coded to increase the number of signals which can be distinguished from each other and environmental ultrasound. Several designs for generating coded ultrasonic signals have been investigated

  13. Spread-spectrum communication using binary spatiotemporal chaotic codes

    International Nuclear Information System (INIS)

    Wang Xingang; Zhan Meng; Gong Xiaofeng; Lai, C.H.; Lai, Y.-C.

    2005-01-01

    We propose a scheme to generate binary code for baseband spread-spectrum communication by using a chain of coupled chaotic maps. We compare the performances of this type of spatiotemporal chaotic code with those of a conventional code used frequently in digital communication, the Gold code, and demonstrate that our code is comparable or even superior to the Gold code in several key aspects: security, bit error rate, code generation speed, and the number of possible code sequences. As the field of communicating with chaos faces doubts in terms of performance comparison with conventional digital communication schemes, our work gives a clear message that communicating with chaos can be advantageous and it deserves further attention from the nonlinear science community
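
    For readers unfamiliar with the idea, the sketch below shows one possible (hypothetical) way to turn a chain of coupled chaotic maps into a binary spreading sequence: locally chaotic logistic dynamics, diffusive coupling along the chain, and a threshold applied to one site. The map, coupling form and parameters are illustrative assumptions, not the exact system studied in the paper.

      import numpy as np

      def chaotic_code(n_sites=16, n_bits=127, eps=0.1, r=4.0, seed=1):
          rng = np.random.default_rng(seed)
          x = rng.random(n_sites)                        # random initial conditions
          bits = []
          for _ in range(200 + n_bits):                  # first 200 iterates: transient
              fx = r * x * (1.0 - x)                     # local logistic map
              x = (1 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
              bits.append(1 if x[0] > 0.5 else 0)        # threshold a single site
          return np.array(bits[200:])

      print(chaotic_code(n_bits=32))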

  14. Evaluation of medium-chain-length polyhydroxyalkanoate production by Pseudomonas putida LS46 using biodiesel by-product streams.

    Science.gov (United States)

    Fu, Jilagamazhi; Sharma, Umesh; Sparling, Richard; Cicek, Nazim; Levin, David B

    2014-07-01

    Medium-chain-length polyhydroxyalkanoate (mcl-PHA) production by Pseudomonas putida LS46 was analyzed in shake-flask-based batch reactions, using pure chemical-grade glycerol (PG), biodiesel-derived "waste" glycerol (WG), and biodiesel-derived "waste" free fatty acids (WFA). Cell growth, substrate consumption, mcl-PHA accumulation within the cells, and the monomer composition of the synthesized biopolymers were monitored. The patterns of mcl-PHA synthesis in P. putida LS46 cells grown on PG and WG were similar but differed from that of cells grown with WFA. Polymer accumulation in glycerol-based cultures was stimulated by nitrogen limitation and plateaued after 48 h in both PG and WG cultures, with a total accumulation of 17.9% cell dry mass and 16.3% cell dry mass, respectively. In contrast, mcl-PHA synthesis was independent of nitrogen concentration in P. putida LS46 cells cultured with WFA, which accumulated to 29% cell dry mass. In all cases, the mcl-PHAs synthesized consisted primarily of 3-hydroxyoctanoate (C(8)) and 3-hydroxydecanoate (C(10)). WG and WFA supported similar or greater cell growth and mcl-PHA accumulation than PG under the experimental conditions used. These results suggest that biodiesel by-product streams could be used as low-cost carbon sources for sustainable mcl-PHA production.

  15. Frequency domain based LS channel estimation in OFDM based Power line communications

    OpenAIRE

    Bogdanović, Mario

    2015-01-01

    This paper is focused on low voltage power line communication (PLC) realization with an emphasis on channel estimation techniques. The Orthogonal Frequency Division Multiplexing (OFDM) scheme is preferred technology in PLC systems because of its effective combat with frequency selective fading properties of PLC channel. As the channel estimation is one of the crucial problems in OFDM based PLC system because of a problematic area of PLC signal attenuation and interference, the improved LS est...
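
    For context, a generic frequency-domain LS estimator (the textbook form, not the improved estimator proposed in the paper) divides the received pilot subcarriers by the known pilot symbols and interpolates over the remaining subcarriers, as in the following sketch.

      import numpy as np

      def ls_channel_estimate(Y, X_pilot, pilot_idx, n_subcarriers):
          """Y: received frequency-domain symbols; X_pilot: known pilot symbols."""
          H_pilot = Y[pilot_idx] / X_pilot               # LS estimate at the pilots
          k = np.arange(n_subcarriers)
          H_real = np.interp(k, pilot_idx, H_pilot.real) # interpolate real and
          H_imag = np.interp(k, pilot_idx, H_pilot.imag) # imaginary parts separately
          return H_real + 1j * H_imag

      # Example with 64 subcarriers and a pilot on every 8th subcarrier (made-up values):
      pilots = np.arange(0, 64, 8)
      Y = np.exp(2j * np.pi * np.arange(64) / 64)        # pretend received symbols
      H_hat = ls_channel_estimate(Y, np.ones(len(pilots)), pilots, 64)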

  16. Random linear codes in steganography

    Directory of Open Access Journals (Sweden)

    Kamil Kaczyński

    2016-12-01

    Full Text Available Syndrome coding using linear codes is a technique that allows improvement of the parameters of steganographic algorithms. The use of random linear codes gives great flexibility in choosing the parameters of the linear code. In parallel, it offers easy generation of the parity-check matrix. In this paper, a modification of the LSB algorithm is presented. A random linear code [8, 2] was used as the basis for the algorithm modification. The proposed algorithm was implemented, together with a practical evaluation of its parameters based on test images. Keywords: steganography, random linear codes, RLC, LSB
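
    As an aside for readers new to syndrome coding, the sketch below illustrates matrix embedding with a random binary [8, 2] linear code: six message bits are hidden in eight cover LSBs by flipping as few bits as possible. The parity-check matrix and the brute-force coset-leader search are simplifications made up for this example; they are not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(0)
      n, k = 8, 2
      # Full-rank (n-k) x n parity-check matrix in systematic form [I | A].
      H = np.hstack([np.eye(n - k, dtype=int), rng.integers(0, 2, size=(n - k, k))])

      def embed(cover_bits, message_bits):
          """Flip as few cover LSBs as possible so that H @ stego = message (mod 2)."""
          target = (message_bits - H @ cover_bits) % 2
          best = None
          for v in range(2 ** n):                        # fine for n = 8, not for large n
              e = np.array([(v >> i) & 1 for i in range(n)])
              if np.array_equal((H @ e) % 2, target) and (best is None or e.sum() < best.sum()):
                  best = e
          return (cover_bits + best) % 2

      def extract(stego_bits):
          return (H @ stego_bits) % 2

      cover = rng.integers(0, 2, size=n)
      message = rng.integers(0, 2, size=n - k)
      stego = embed(cover, message)
      assert np.array_equal(extract(stego), message)
      print("LSBs changed:", int(((stego + cover) % 2).sum()))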

  17. One way quantum repeaters with quantum Reed-Solomon codes

    OpenAIRE

    Muralidharan, Sreraman; Zou, Chang-Ling; Li, Linshu; Jiang, Liang

    2018-01-01

    We show that quantum Reed-Solomon codes constructed from classical Reed-Solomon codes can approach the capacity on the quantum erasure channel of $d$-level systems for large dimension $d$. We study the performance of one-way quantum repeaters with these codes and obtain a significant improvement in key generation rate compared to previously investigated encoding schemes with quantum parity codes and quantum polynomial codes. We also compare the three generations of quantum repeaters using quan...

  18. Low Complexity List Decoding for Polar Codes with Multiple CRC Codes

    Directory of Open Access Journals (Sweden)

    Jong-Hwan Kim

    2017-04-01

    Full Text Available Polar codes are the first family of error correcting codes that provably achieve the capacity of symmetric binary-input discrete memoryless channels with low complexity. Since the development of polar codes, there have been many studies to improve their finite-length performance. As a result, polar codes are now adopted as a channel code for the control channel of 5G new radio of the 3rd generation partnership project. However, the decoder implementation is one of the big practical problems and low complexity decoding has been studied. This paper addresses a low complexity successive cancellation list decoding for polar codes utilizing multiple cyclic redundancy check (CRC codes. While some research uses multiple CRC codes to reduce memory and time complexity, we consider the operational complexity of decoding, and reduce it by optimizing CRC positions in combination with a modified decoding operation. Resultingly, the proposed scheme obtains not only complexity reduction from early stopping of decoding, but also additional reduction from the reduced number of decoding paths.

  19. LIBVERSIONINGCOMPILER: An easy-to-use library for dynamic generation and invocation of multiple code versions

    Science.gov (United States)

    Cherubin, S.; Agosta, G.

    2018-01-01

    We present LIBVERSIONINGCOMPILER, a C++ library designed to support the dynamic generation of multiple versions of the same compute kernel in a HPC scenario. It can be used to provide continuous optimization, code specialization based on the input data or on workload changes, or otherwise to dynamically adjust the application, without the burden of a full dynamic compiler. The library supports multiple underlying compilers but specifically targets the LLVM framework. We also provide examples of use, showing the overhead of the library, and providing guidelines for its efficient use.

  20. A Comparison of Nuclear Power Plant Simulator with RELAP5/MOD3 code about Steam Generator Tube Rupture

    International Nuclear Information System (INIS)

    Kim, Sung Hyun; Moon, Chan Ki; Park, Sung Baek; Na, Man Gyun

    2013-01-01

    The RELAP5/MOD3 code, introduced in cooperation with the U.S. NRC, has been used mainly for validation calculations of the accident analyses submitted by licensees in Korea. The Korea Institute of Nuclear Safety has built a verification system for LWR accident analysis around the RELAP5/MOD3 code engine. The simulator replicates design basis accidents, and comparing its results with RELAP5/MOD3 results will have important implications for the verification of the simulator in the future. In this study, SGTR simulations were performed with the simulator and the results were compared with those obtained with the RELAP5/MOD3 code; the results can therefore be used as material for building the verification system of the nuclear power plant simulator. The comparison with the RELAP5/MOD3 verification code was made by replicating the major parameters of a steam generator tube rupture using the OPR-1000 simulator at the Yonggwang training center. By comparing the changes in temperature, pressure and inventory of the reactor coolant system and the main steam system during the SGTR, it was confirmed that the main SGTR behaviors predicted by the simulator and by the RELAP5/MOD3 code are similar. However, the behavior of the SG pressure and level, which are important parameters for diagnosing the accident, was somewhat different. We estimate that the RELAP5/MOD3 model did not reflect the major control systems, such as the FWCS, SBCS and PPCS, in sufficient detail, so the different SG level and pressure behaviors require additional review. Overall, the behavior of the major simulation parameters given by the RELAP5/MOD3 code agreed well with that given by the simulator, so it is expected that the RELAP5/MOD3 code can be used as a tool for the validation of NPP simulators in the near future

  1. Balanced Reed-Solomon codes for all parameters

    KAUST Repository

    Halbawi, Wael; Liu, Zihan; Hassibi, Babak

    2016-01-01

    We construct balanced and sparsest generator matrices for cyclic Reed-Solomon codes with any length n and dimension k. By sparsest, we mean that each row has the least possible number of nonzeros, while balanced means that the number of nonzeros in any two columns differs by at most one. Codes allowing such encoding schemes are useful in distributed settings where computational load-balancing is critical. The problem was first studied by Dau et al. who showed, using probabilistic arguments, that there always exists an MDS code over a sufficiently large field such that its generator matrix is both sparsest and balanced. Motivated by the need for an explicit construction with efficient decoding, the authors of the current paper showed that the generator matrix of a cyclic Reed-Solomon code of length n and dimension k can always be transformed to one that is both sparsest and balanced, when n and k are such that k/n (n-k+1) is an integer. In this paper, we lift this condition and construct balanced and sparsest generator matrices for cyclic Reed-Solomon codes for any set of parameters.

  2. Balanced Reed-Solomon codes for all parameters

    KAUST Repository

    Halbawi, Wael

    2016-10-27

    We construct balanced and sparsest generator matrices for cyclic Reed-Solomon codes with any length n and dimension k. By sparsest, we mean that each row has the least possible number of nonzeros, while balanced means that the number of nonzeros in any two columns differs by at most one. Codes allowing such encoding schemes are useful in distributed settings where computational load-balancing is critical. The problem was first studied by Dau et al. who showed, using probabilistic arguments, that there always exists an MDS code over a sufficiently large field such that its generator matrix is both sparsest and balanced. Motivated by the need for an explicit construction with efficient decoding, the authors of the current paper showed that the generator matrix of a cyclic Reed-Solomon code of length n and dimension k can always be transformed to one that is both sparsest and balanced, when n and k are such that k/n (n-k+1) is an integer. In this paper, we lift this condition and construct balanced and sparsest generator matrices for cyclic Reed-Solomon codes for any set of parameters.

  3. A code system to generate multigroup cross-sections using basic data

    International Nuclear Information System (INIS)

    Garg, S.B.; Kumar, Ashok

    1978-01-01

    For the neutronic studies of nuclear reactors, multigroup cross-sections derived from the basic energy point data are needed. In order to carry out the design based studies, these cross-sections should also incorporate the temperature and fuel concentration effects. To meet these requirements, a code system comprising of RESRES, UNRES, FIGERO, INSCAT, FUNMO, AVER1 and BGPONE codes has been adopted. The function of each of these codes is discussed. (author)

  4. Prediction of fetal lung maturity using the lecithin/sphingomyelin (L/S) ratio analysis with a simplified sample preparation, using a commercial microtip-column combined with mass spectrometric analysis.

    Science.gov (United States)

    Kwak, Ho-Seok; Chung, Hee-Jung; Choi, Young Sik; Min, Won-Ki; Jung, So Young

    2015-07-01

    Fetal lung maturity is estimated using the lecithin/sphingomyelin ratio (L/S ratio) in amniotic fluid and it is commonly measured with thin-layer chromatography (TLC). The TLC method is time consuming and technically difficult; however, it is widely used because there is no alternative. We evaluated a novel method for measuring the L/S ratio, which involves a tip-column with a cation-exchange resin and mass spectrometry. Phospholipids in the amniotic fluid were extracted using methanol and chloroform. Choline-containing phospholipids such as lecithin and sphingomyelin were purified by passing them through the tip-column. LC-MS/MS and MALDI-TOF were used to directly analyze the purified samples. The L/S ratio by mass spectrometry was calculated from the sum peak intensity of the six lecithin, and that of sphingomyelin 34:1. In 20 samples, the L/S ratio determined with TLC was significantly correlated with that obtained by LC-MS/MS and MALDI-TOF. There was a 100% concordance between the L/S ratio by TLC and that by LC-MS/MS (kappa value=1.0). The concordance between the L/S ratio by TLC and that by MALDI-TOF was also 100% (kappa value=1.0). Our method provides a faster, simpler, and more reliable assessment of fetal lung maturity. The L/S ratio measured by LC-MS/MS and MALDI-TOF offers a compelling alternative method to traditional TLC. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Quasi-cyclic unit memory convolutional codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Paaske, Erik; Ballan, Mark

    1990-01-01

    Unit memory convolutional codes with generator matrices, which are composed of circulant submatrices, are introduced. This structure facilitates the analysis of efficient search for good codes. Equivalences among such codes and some of the basic structural properties are discussed. In particular......, catastrophic encoders and minimal encoders are characterized and dual codes treated. Further, various distance measures are discussed, and a number of good codes, some of which result from efficient computer search and some of which result from known block codes, are presented...

  6. FORIG: a computer code for calculating radionuclide generation and depletion in fusion and fission reactors. User's manual

    International Nuclear Information System (INIS)

    Blink, J.A.

    1985-03-01

    In this manual we describe the use of the FORIG computer code to solve isotope-generation and depletion problems in fusion and fission reactors. FORIG runs on a Cray-1 computer and accepts more extensive activation cross sections than ORIGEN2 from which it was adapted. This report is an updated and a combined version of the previous ORIGEN2 and FORIG manuals. 7 refs., 15 figs., 13 tabs

  7. Comparison of Two Commercial FE-Codes for Sheet Metal Forming

    International Nuclear Information System (INIS)

    Revuelta, A.; Larkiola, J.; Kanervo, K.; Korhonen, A. S.; Myllykoski, P.

    2007-01-01

    There is an urgent need to develop new advanced, fast and cost-effective mass-production methods for small sheet metal components. Traditionally, progressive dies have been designed by using various CAD techniques. Recent results in mass production of small sheet metal parts using progressive dies and a transfer press showed that the tool design time may be cut by up to a half by using 3D finite element simulation of forming. In the numerical simulation of sheet metal forming, better constitutive models are required to obtain more accurate results, reduce the time for tool design and cut the production costs further. Accurate models are needed to describe the initial yielding, subsequent work hardening and to predict the formability. In this work two commercially available finite element simulation codes, PAM-STAMP and LS-DYNA, were compared in the forming of a small austenitic stainless steel sheet part for the electronics industry. Several constitutive models were used in both codes and the results were compared. Comparisons were made between the same models in each of the codes and also between different models in the same code. Material models ranged from very simple to advanced ones, which took into account anisotropy and both isotropic and kinematic hardening behavior. In order to make a valid comparison we employed similar finite element meshes. The effects of the material model parameters were studied and the results were compared with experiments. The effects on the computational time were also studied

  8. Parallel Calculations in LS-DYNA

    Science.gov (United States)

    Vartanovich Mkrtychev, Oleg; Aleksandrovich Reshetov, Andrey

    2017-11-01

    Nowadays, structural mechanics exhibits a trend towards numeric solutions being found for increasingly extensive and detailed tasks, which requires that capacities of computing systems be enhanced. Such enhancement can be achieved by different means. E.g., in case a computing system is represented by a workstation, its components can be replaced and/or extended (CPU, memory etc.). In essence, such modification eventually entails replacement of the entire workstation, i.e. replacement of certain components necessitates exchange of others (faster CPUs and memory devices require buses with higher throughput etc.). Special consideration must be given to the capabilities of modern video cards. They constitute powerful computing systems capable of running data processing in parallel. Interestingly, the tools originally designed to render high-performance graphics can be applied for solving problems not immediately related to graphics (CUDA, OpenCL, Shaders etc.). However, not all software suites utilize video cards’ capacities. Another way to increase capacity of a computing system is to implement a cluster architecture: to add cluster nodes (workstations) and to increase the network communication speed between the nodes. The advantage of this approach is extensive growth due to which a quite powerful system can be obtained by combining not particularly powerful nodes. Moreover, separate nodes may possess different capacities. This paper considers the use of a clustered computing system for solving problems of structural mechanics with LS-DYNA software. To establish a range of dependencies a mere 2-node cluster has proven sufficient.

  9. Fast code for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Oliveira, P.M.C. de; Penna, T.J.P.

    1988-01-01

    A computer code to generate the dynamic evolution of the Ising model on a square lattice, following the Metropolis algorithm is presented. The computer time consumption is reduced by a factor of 8 when one compares our code with traditional multiple spin codes. The memory allocation size is also reduced by a factor of 4. The code is easily generalizable for other lattices and models. (author) [pt
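
    For reference, a plain (unoptimized) Metropolis simulation of the square-lattice Ising model looks like the sketch below; the multiple-spin-coding and memory tricks that give the speed-up reported above are deliberately not shown, and the parameter values are illustrative only.

      import numpy as np

      def metropolis_sweep(spins, beta, rng):
          """One Monte Carlo sweep (L*L single-spin Metropolis updates)."""
          L = spins.shape[0]
          for _ in range(L * L):
              i, j = rng.integers(0, L, size=2)
              nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
                    spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
              dE = 2.0 * spins[i, j] * nn               # cost of flipping spin (i, j)
              if dE <= 0 or rng.random() < np.exp(-beta * dE):
                  spins[i, j] = -spins[i, j]

      rng = np.random.default_rng(0)
      L, beta = 32, 0.44                                # beta close to the critical value
      spins = rng.choice([-1, 1], size=(L, L))
      for _ in range(100):
          metropolis_sweep(spins, beta, rng)
      print("magnetisation per spin:", spins.mean())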

  10. Computer-assisted Particle-in-Cell code development

    International Nuclear Information System (INIS)

    Kawata, S.; Boonmee, C.; Teramoto, T.; Drska, L.; Limpouch, J.; Liska, R.; Sinor, M.

    1997-12-01

    This report presents a new approach for an electromagnetic Particle-in-Cell (PIC) code development by a computer: in general PIC codes have a common structure, and consist of a particle pusher, a field solver, charge and current density collections, and a field interpolation. Because of the common feature, the main part of the PIC code can be mechanically developed on a computer. In this report we use the packages FIDE and GENTRAN of the REDUCE computer algebra system for discretizations of field equations and a particle equation, and for an automatic generation of Fortran codes. The approach proposed is successfully applied to the development of 1.5-dimensional PIC code. By using the generated PIC code the Weibel instability in a plasma is simulated. The obtained growth rate agrees well with the theoretical value. (author)
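
    The record relies on the REDUCE packages FIDE and GENTRAN, which are not shown here. As a rough modern analogue, the Python sketch below uses SymPy to discretize a one-dimensional particle equation of motion and emit the corresponding Fortran assignment, mirroring the idea of generating the particle-pusher code automatically.

    ```python
    import sympy as sp

    dt, q, m = sp.symbols("dt q m", positive=True)
    v, E = sp.symbols("v E")                 # 1-D velocity and electric field at the particle

    # Explicit discretisation of dv/dt = (q/m)*E  ->  v_new = v + (q/m)*E*dt
    v_new = v + (q / m) * E * dt

    # Emit a Fortran assignment, in the spirit of GENTRAN's automatic code output
    print(sp.fcode(v_new, assign_to="vnew", source_format="free"))
    ```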

  11. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is formed by mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and with the Aztheca model. The results show that both RELAP-5 and the Aztheca code are able to adequately predict the behavior of the reactor. (Author)

  12. LS1 Report: Setting the bar high

    CERN Multimedia

    Anaïs Schaeffer

    2014-01-01

    This week LS1 successfully passed an important milestone: the first pressure test of a complete sector, sector 6-7.  The objective of this test was to check the mechanical integrity and overall leak-tightness of this section of the LHC by injecting it with pressurised helium.   The team in charge of the preparation and of the realisation of the pressure tests in sector 6-7. “Given the scale of the work and of the operations carried out during 2013, particularly in the framework of the SMACC project and of the repair of the compensators of the cryogenic distribution line (QRL), we need to revalidate the integrity of the systems before the accelerator starts up again,” explains Olivier Pirotte, who is in charge of the pressure tests (TE-CRG). The pressure tests are performed over a single day after two weeks of intensive activity to prepare and specially configure the cryogenic instrumentation in the tunnel, and the pressure within a sector is increased in stages,...

  13. LS1 Report: Thank you magnetic horn!

    CERN Multimedia

    Antonella Del Rosso & Katarina Anthony

    2014-01-01

    Experiments at the Antimatter Decelerator (AD) have been receiving beams since the beginning of this week. There is a crucial element at the heart of the chain that prepares the antiproton beam: the so-called magnetic horn, a delicate piece of equipment that had to be refurbished during LS1 and that is now showing just how well it can perform.   View from the top of the target and horn trolley, along the direction of the beam. Antiprotons for the AD are produced by smashing a beam of protons from the PS onto an iridium target. However, the particles produced by the nuclear interactions are emitted at very wide angles; without a focussing element, all these precious particles would be lost. “A magnetic horn is placed at the exit of the target to focus back a large fraction of the negative particles, including antiprotons, parallel to the beam line and with the right momentum,” explains Marco Calviani, physicist in the EN Department and the expert in charge of the AD targe...

  14. LS1 Report: ALICE ups the ante

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    SPS up and running... LHC almost cold... CCC Operators back at their desks... all telltale signs of the start of Run 2! For the experiments, that means there are just a few short months left for them to prepare for beams. The CERN Bulletin will be checking in with each of the Big Four to see how they are getting on during these closing months...   It has been a long road for the ALICE LS1 team. From major improvements to the 19 sub-detectors to a full re-cabling and replacement of LEP-era electrical infrastructure, no part of the ALICE cavern has gone untouched.* With the experiment set to close in early December, the teams are making finishing touches before turning their focus towards re-commissioning and calibration. "Earlier this week, we installed the last two modules of the di-jet calorimeter," explains Werner Riegler, ALICE technical coordinator. "These are the final parts of a 60 degree calorimeter extension that is installed opposite the present calorimeter, c...

  15. Ultrafast all-optical code-division multiple-access networks

    Science.gov (United States)

    Kwong, Wing C.; Prucnal, Paul R.; Liu, Yanming

    1992-12-01

    In optical code-division multiple access (CDMA), the architecture of the optical encoders/decoders is another important factor that needs to be considered, besides the correlation properties of the optical codes themselves, which have already been studied extensively. The architecture of the optical encoders/decoders affects, for example, the amount of power loss and the length of the optical delays associated with code sequence generation and correlation, which, in turn, affect the power budget, size, and cost of an optical CDMA system. Various CDMA coding architectures are studied in the paper. In contrast to the encoders/decoders used in prime networks (i.e., prime encoders/decoders), which generate, select, and correlate code sequences with a parallel combination of fiber-optic delay lines, and those in 2^n networks (i.e., 2^n encoders/decoders), which generate and correlate code sequences with a serial combination of 2 x 2 passive couplers and fiber delays, with sequence selection performed in parallel, the modified 2^n encoders/decoders generate, select, and correlate code sequences with a serial combination of directional couplers and delays. The power and delay-length requirements of the modified 2^n encoders/decoders are compared with those of the prime and 2^n encoders/decoders. A 100 Mbit/s optical CDMA experiment in free space demonstrating the feasibility of the all-serial coding architecture using a serial combination of 50/50 beam splitters and retroreflectors at 10 Tchip/s (i.e., 100,000 chips/bit) with 100 fs laser pulses is reported.
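
    For orientation, the prime codes referred to above are built from the multiplication table of a prime field: codeword i, of length p^2, carries one pulse in each of its p blocks, at chip position (i*j mod p) within block j. The Python sketch below generates such a code set and checks one cross-correlation value; it illustrates the codes only, not the modified 2^n encoder architecture itself.

    ```python
    import numpy as np

    def prime_code(p):
        """Prime sequence codes: p codewords of length p*p over GF(p), p prime."""
        codes = np.zeros((p, p * p), dtype=int)
        for i in range(p):
            for j in range(p):
                codes[i, j * p + (i * j) % p] = 1   # one pulse per block of p chips
        return codes

    C = prime_code(5)
    # Zero-shift cross-correlation between two different codewords
    print("weight:", C[1].sum(), " cross-correlation:", int(C[1] @ C[2]))
    ```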

  16. ATLAS TDAQ application gateway upgrade during LS1

    CERN Document Server

    KOROL, A; The ATLAS collaboration; BOGDANCHIKOV, A; BRASOLIN, F; CONTESCU, A C; DUBROV, S; HAFEEZ, M; LEE, C J; SCANNICCHIO, D A; TWOMEY, M; VORONKOV, A; ZAYTSEV, A

    2014-01-01

    The ATLAS Gateway service is implemented with a set of dedicated computer nodes to provide fine-grained access control between the CERN General Public Network (GPN) and the ATLAS Technical Control Network (ATCN). ATCN connects the ATLAS online farm used for ATLAS operations and data taking, including the ATLAS TDAQ (Trigger and Data Acquisition) and DCS (Detector Control System) nodes. In particular, it provides restricted access to the web services (proxy), general login sessions (via SSH and RDP protocols), NAT and mail relay from ATCN. At the operating system level the implementation is based on virtualization technologies. Here we report on the Gateway upgrade during the Long Shutdown 1 (LS1) period: it includes the transition to the latest production release of the CERN Linux distribution (SLC6), the migration to the centralized configuration management system (based on Puppet) and the redesign of the internal system architecture.

  17. Effects of doping on ferroelectric properties and leakage current behavior of KNN-LT-LS thin films on SrTiO3 substrate

    Science.gov (United States)

    Abazari, M.; Safari, A.

    2009-05-01

    We report the effects of Ba, Ti, and Mn dopants on ferroelectric polarization and leakage current of (K0.44Na0.52Li0.04)(Nb0.84Ta0.1Sb0.06)O3 (KNN-LT-LS) thin films deposited by pulsed laser deposition. It is shown that donor dopants such as Ba2+, which increased the resistivity in bulk KNN-LT-LS, had an opposite effect in the thin film. Ti4+ as an acceptor B-site dopant reduces the leakage current by an order of magnitude, while the polarization values showed a slight degradation. Mn4+, however, was found to effectively suppress the leakage current by over two orders of magnitude while enhancing the polarization, with 15 and 23 μC/cm2 remanent and saturated polarization, whose values are ˜70% and 82% of the reported values for bulk composition. This phenomenon has been associated with the dual effect of Mn4+ in KNN-LT-LS thin film, by substituting both A- and B-site cations. A detailed description on how each dopant affects the concentrations of vacancies in the lattice is presented. Mn-doped KNN-LT-LS thin films are shown to be a promising candidate for lead-free thin films and applications.

  18. Automatic generation of data merging program codes.

    OpenAIRE

    Hyensook, Kim; Oussena, Samia; Zhang, Ying; Clark, Tony

    2010-01-01

    Data merging is an essential part of ETL (Extract-Transform-Load) processes used to build a data warehouse system. To avoid reinventing merging techniques, we propose a Data Merging Meta-model (DMM) and its transformation into executable program code in the manner of model-driven engineering. DMM allows relationships between different model entities and their merging types to be defined at the conceptual level. Our formalized transformation, described using ATL (ATLAS Transformation Language), enables automatic g...

  19. TASS code topical report. V.1 TASS code technical manual

    International Nuclear Information System (INIS)

    Sim, Suk K.; Chang, W. P.; Kim, K. D.; Kim, H. C.; Yoon, H. Y.

    1997-02-01

    TASS 1.0 code has been developed at KAERI for the initial and reload non-LOCA safety analysis of the operating PWRs as well as the PWRs under construction in Korea. The TASS code will replace the various vendors' non-LOCA safety analysis codes currently used for the Westinghouse and ABB-CE type PWRs in Korea. This can be achieved through TASS code input modifications specific to each reactor type. The TASS code can be run interactively through keyboard operation. A semi-modular configuration used in developing the TASS code enables the user to easily implement new models. The TASS code has been programmed in FORTRAN77, which makes it easy to install and port to different computer environments. The TASS code can be utilized for steady state simulation as well as non-LOCA transient simulations such as power excursions, reactor coolant pump trips, load rejections, loss of feedwater, steam line breaks, steam generator tube ruptures, rod withdrawal and drop, and anticipated transients without scram (ATWS). Malfunctions of the control systems, components and operator actions, and the transients caused by these malfunctions, can be easily simulated using the TASS code. This technical report describes the TASS 1.0 code models, including the reactor thermal-hydraulic, reactor core and control models. This TASS code technical manual has been prepared as a part of the TASS code manual, which includes the TASS code user's manual and the TASS code validation report, and will be submitted to the regulatory body as a TASS code topical report for licensing non-LOCA safety analysis of the Westinghouse and ABB-CE type PWRs operating and under construction in Korea. (author). 42 refs., 29 tabs., 32 figs

  20. Automatic Generation of Agents using Reusable Soft Computing Code Libraries to develop Multi Agent System for Healthcare

    OpenAIRE

    Priti Srinivas Sajja

    2015-01-01

    This paper illustrates an architecture for a multi-agent system in the healthcare domain. The architecture is generic and designed in the form of multiple layers. One of the layers of the architecture contains many proactive, co-operative and intelligent agents such as a resource management agent, query agent, pattern detection agent and patient management agent. Another layer of the architecture is a collection of libraries to auto-generate code for agents using soft computing techni...

  1. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2012-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gascooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  2. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2011-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gascooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  3. Design and construction of a graphical interface for automatic generation of simulation code GEANT4

    International Nuclear Information System (INIS)

    Driss, Mozher; Bouzaine Ismail

    2007-01-01

    This work is set in the context of an engineering studies final project; it was carried out at the centre of nuclear sciences and technologies in Sidi Thabet. The project is about designing and developing a system based on a graphical user interface which allows automatic code generation for simulation with the GEANT4 engine. This system aims to facilitate the use of GEANT4 by scientists who are not necessarily experts in this engine, and is intended for use in different areas: research, industry and education. The implementation of this project uses the ROOT library and several languages such as XML and XSL. (Author). 5 refs

  4. SASSYS LMFBR systems analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.

    1982-01-01

    The SASSYS code provides detailed steady-state and transient thermal-hydraulic analyses of the reactor core, inlet and outlet coolant plenums, primary and intermediate heat-removal systems, steam generators, and emergency shut-down heat removal systems in liquid-metal-cooled fast-breeder reactors (LMFBRs). The main purpose of the code is to analyze the consequences of failures in the shut-down heat-removal system and to determine whether this system can perform its mission adequately even with some of its components inoperable. The code is not plant-specific. It is intended for use with any LMFBR, using either a loop or a pool design, a once-through steam generator or an evaporator-superheater combination, and either a homogeneous core or a heterogeneous core with internal-blanket assemblies

  5. Requests from use experience of ORIGEN code. Activity of the working group on evaluation of nuclide generation and depletion

    International Nuclear Information System (INIS)

    Matsumura, Tetsuo

    2005-01-01

    A questionnaire survey was carried out through the committee members of the working group on evaluation of nuclide generation and depletion regarding the accuracy demanded of the ORIGEN code, which is used widely in various fields of design analysis and evaluation. The WG committee asked each organization's ORIGEN users and obtained replies from the various fields. (author)

  6. Interface of RETRAN/MASTER Code System for APR1400

    International Nuclear Information System (INIS)

    Ku, Keuk Jong; Kang, Sang Hee; Kim, Han Gon

    2008-01-01

    MASTER (Multi-purpose Analyzer for Static and Transient Effects of Reactors), which was developed by KAERI, is a nuclear analysis and design code which can simulate a pressurized water reactor core or boiling water reactor core in 3-dimensional geometry. RETRAN is a best-estimate code for the transient analysis of non-LOCA events. The RETRAN code generates the neutron number density in the core using a point kinetics model which includes feedback reactivities, and converts the neutron number density into reactor power. Conventionally, RETRAN roughly extrapolates feedback reactivities that are provided by the MASTER code only once, before the transient analysis. The purpose of this paper is to interface the RETRAN code with the MASTER code through real-time processing and to supply adequate feedback reactivities to the RETRAN code. To this end, we developed an interface code called MATRAN for real-time feedback reactivity processing. For the application of the MATRAN code, we compare the results of the real-time MATRAN coupling with those of the conventional RETRAN/MASTER code.
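
    The essence of such a coupling is that the feedback reactivity is re-evaluated at every time step of the point kinetics solution rather than being fixed beforehand. The Python sketch below shows this idea with one delayed-neutron group and a crude lumped fuel-temperature feedback; all constants are illustrative assumptions, not RETRAN, MASTER or MATRAN data.

    ```python
    # One-group point kinetics with a simple fuel-temperature feedback
    # (illustrative constants only; not RETRAN, MASTER or MATRAN data).
    beta, Lam, lam = 0.0065, 1e-4, 0.08      # delayed fraction, generation time, decay const
    alpha_f = -2.0e-5                        # fuel temperature reactivity coefficient (1/K)
    dt, T0 = 1e-3, 600.0
    import math

    P, C, Tf = 1.0, beta / (Lam * lam), T0   # power, precursor concentration, fuel temperature
    rho_ext = 0.001                          # external (e.g. rod withdrawal) reactivity step

    for step in range(int(5.0 / dt)):        # 5 s of transient
        rho = rho_ext + alpha_f * (Tf - T0)  # feedback reactivity updated every time step
        dP = ((rho - beta) / Lam) * P + lam * C
        dC = (beta / Lam) * P - lam * C
        P, C = P + dt * dP, C + dt * dC
        Tf += dt * (5.0 * P - 0.1 * (Tf - T0))   # crude lumped fuel heat balance

    print(f"relative power after 5 s: {P:.3f}")
    ```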

  7. Microbial Carbonate Precipitation by Synechococcus PCC8806, LS0519 and Synechocystis PCC6803 on Concrete Surfaces and in Low Saturation Solution

    Science.gov (United States)

    Zhu, T.; Lin, Y.; Dittrich, M.

    2015-12-01

    Microbial carbonate precipitation (MCP) by cyanobacteria has been recognized in a variety of environments such as freshwater, marine, cave, and even desert settings. Recently, their calcification potential has been tested in an emerging technology, bioconcrete. This study explores calcification by three cyanobacteria strains under different environmental conditions. Experiment A was carried out in 2 mM NaHCO3 and 5 mM CaCl2, with a cell concentration of 10⁷ cells L⁻¹. In experiment B, one side of the concrete surface was treated with bacteria and then immersed in a solution containing 0.4 mM NaHCO3 and 300 mM CaCl2. In experiment A, the pH of the abiotic condition remained constant around 8.55, while that of the biotic conditions increased within 8 hours by 0.15 units in the presence of LS0519 and by 0.3 units in the presence of PCC8806 or PCC6803. Over a period of 30 hours, PCC8806, LS0519 and PCC6803 removed 0.1, 0.12 and 0.2 mM calcium from the solution, respectively. After 30 hours, the alkalinity of the solution had decreased by 30 mg/L, 10 mg/L and 5 mg/L in the presence of PCC6803, LS0519 and PCC8806, respectively. Under scanning electron microscopy (SEM), no precipitate was found in the abiotic condition, while calcium carbonate was associated with all three strains. Among them, PCC6803 precipitated more carbonates. In experiment B, LS0519 and PCC8806 increased the pH by 0.25 units, while PCC6803 increased the pH by 0.33 units. SEM shows that LS0519 was less likely to attach to the concrete surface, and the precipitates on the concrete surface did not differ from those in the abiotic condition. In comparison, PCC8806 and PCC6803 were closely associated with 8-μm porous precipitates. Cells were found either enclosed in precipitates or connecting two precipitates. In conclusion, all three strains triggered calcium carbonate precipitation. LS0519 had little impact on carbonate precipitation in the solution and a negligible influence on the concrete surface.

  8. Short-term hydro generation scheduling of Xiluodu and Xiangjiaba cascade hydropower stations using improved binary-real coded bee colony optimization algorithm

    International Nuclear Information System (INIS)

    Lu, Peng; Zhou, Jianzhong; Wang, Chao; Qiao, Qi; Mo, Li

    2015-01-01

    Highlights: • STHGS problem is decomposed into two parallel sub-problems of UC and ELD. • Binary coded BCO is used to solve UC sub-problem with 0–1 discrete variables. • Real coded BCO is used to solve ELD sub-problem with continuous variables. • Some heuristic repairing strategies are designed to handle various constraints. • The STHGS of Xiluodu and Xiangjiaba cascade stations is solved by IB-RBCO. - Abstract: Short-term hydro generation scheduling (STHGS) of cascade hydropower stations is a typical nonlinear mixed integer optimization problem to minimize the total water consumption while simultaneously meeting the grid requirements and other hydraulic and electrical constraints. In this paper, STHGS problem is decomposed into two parallel sub-problems of unit commitment (UC) and economic load dispatch (ELD), and the methodology of improved binary-real coded bee colony optimization (IB-RBCO) algorithm is proposed to solve them. Firstly, the improved binary coded BCO is used to solve the UC sub-problem with 0–1 discrete variables, and the heuristic repairing strategy for unit state constrains is applied to generate the feasible unit commitment schedule. Then, the improved real coded BCO is used to solve the ELD sub-problem with continuous variables, and an effective method is introduced to handle various unit operation constraints. Especially, the new updating strategy of DE/best/2/bin method with dynamic parameter control mechanism is applied to real coded BCO to improve the search ability of IB-RBCO. Finally, to verify the feasibility and effectiveness of the proposed IB-RBCO method, it is applied to solve the STHGS problem of Xiluodu and Xiangjiaba cascaded hydropower stations, and the simulating results are compared with other intelligence algorithms. The simulation results demonstrate that the proposed IB-RBCO method can get higher-quality solutions with less water consumption and shorter calculating time when facing the complex STHGS problem
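
    The DE/best/2/bin update mentioned for the real-coded part combines the current best individual with two scaled difference vectors and then applies binomial crossover. The Python sketch below shows that operator in isolation with fixed control parameters; the dynamic parameter control and the bee-colony framework of the paper are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def de_best_2_bin(pop, fitness, F=0.5, CR=0.9):
        """One DE/best/2/bin generation (illustrative only, fixed F and CR)."""
        n, dim = pop.shape
        best = pop[np.argmin(fitness)]
        new_pop = pop.copy()
        for i in range(n):
            others = [k for k in range(n) if k != i]
            r1, r2, r3, r4 = rng.choice(others, size=4, replace=False)
            mutant = best + F * (pop[r1] - pop[r2]) + F * (pop[r3] - pop[r4])
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True      # guarantee at least one mutated gene
            new_pop[i] = np.where(cross, mutant, pop[i])
        return new_pop

    pop = rng.uniform(-5, 5, size=(20, 4))
    pop = de_best_2_bin(pop, np.sum(pop ** 2, axis=1))
    print(pop.shape)
    ```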

  9. Code Modernization of VPIC

    Science.gov (United States)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

    The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite the breakthroughs in the areas of mini-app development, portable performance, and cache-oblivious algorithms the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorisation with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learnt. Work performed under the auspices of the U.S. Dept. of Energy by Los Alamos National Security, LLC, Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  10. Performance Analysis of CRC Codes for Systematic and Nonsystematic Polar Codes with List Decoding

    Directory of Open Access Journals (Sweden)

    Takumi Murata

    2018-01-01

    Successive cancellation list (SCL) decoding of polar codes is an effective approach that can significantly outperform the original successive cancellation (SC) decoding, provided that proper cyclic redundancy-check (CRC) codes are employed at the stage of candidate selection. Previous studies on CRC-assisted polar codes mostly focus on improvement of the decoding algorithms as well as their implementation, and little attention has been paid to the CRC code structure itself. For CRC-concatenated polar codes with a CRC code as their outer code, the use of a longer CRC code leads to a reduction of the information rate, whereas the use of a shorter CRC code may reduce the error detection probability, thus degrading the frame error rate (FER) performance. Therefore, CRC codes of proper length should be employed in order to optimize the FER performance for a given signal-to-noise ratio (SNR) per information bit. In this paper, we investigate the effect of CRC codes on the FER performance of polar codes with list decoding in terms of the CRC code length as well as its generator polynomials. Both the original nonsystematic and systematic polar codes are considered, and we also demonstrate that different behaviors of CRC codes should be observed depending on whether the inner polar code is systematic or not.
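
    A CRC outer code simply appends the remainder of a polynomial division over GF(2) to the information bits before polar encoding, and the list decoder later keeps only candidates whose CRC checks. The Python sketch below computes such a remainder for a common CRC-8 generator polynomial, chosen here only as an example; the paper evaluates several lengths and polynomials.

    ```python
    def crc_remainder(bits, poly):
        """Bitwise CRC: append len(poly)-1 zeros and divide modulo 2 by `poly`."""
        bits = list(bits) + [0] * (len(poly) - 1)
        for i in range(len(bits) - len(poly) + 1):
            if bits[i]:
                for j, p in enumerate(poly):
                    bits[i + j] ^= p
        return bits[-(len(poly) - 1):]

    # CRC-8 generator polynomial x^8 + x^2 + x + 1 (a common choice, not necessarily
    # one of the polynomials evaluated in the paper).
    poly = [1, 0, 0, 0, 0, 0, 1, 1, 1]
    info = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]
    print("CRC bits:", crc_remainder(info, poly))
    ```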

  11. Development of FEMAG. Calculation code of magnetic field generated by ferritic plates in the tokamak devices

    Energy Technology Data Exchange (ETDEWEB)

    Urata, Kazuhiro [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2003-03-01

    In the design of future fusion devices in which low-activation ferritic steel is planned to be used as the plasma-facing material and/or as inserts for ripple reduction, both the assessment of the error-field effect on the plasma and the optimization of the ferritic plate arrangement to reduce the toroidal field ripple require calculation of the magnetic field generated by the ferritic steel. However, the iterative calculations needed to treat the non-linearity of the B-H curve of ferritic steel prevent the high-speed calculation required of a design tool. In the strong toroidal magnetic field that is characteristic of tokamak fusion devices, the ferritic steel becomes fully magnetically saturated. Hence the distribution of magnetic charges acting as the magnetic field source is determined directly and no iterative calculations are necessary. Additionally, the ferritic steel geometry considered is limited to thin plates, and the ferritic plates are installed along the toroidal magnetic field. Taking these special conditions into account, the high-speed calculation code ''FEMAG'' has been developed. In this report, the formulation of the 'FEMAG' code, how to use 'FEMAG', and the validation of 'FEMAG' against a 3D FEM code and against measurements of the magnetic field in JFT-2M are described. The presented examples are numerical results of design studies for the JT-60 modification. (author)

  12. Application of the thermal-hydraulic codes in VVER-440 steam generators modelling

    Energy Technology Data Exchange (ETDEWEB)

    Matejovic, P.; Vranca, L.; Vaclav, E. [Nuclear Power Plant Research Inst. VUJE (Slovakia)

    1995-12-31

    The performance of the CATHARE2 V1.3U and RELAP5/MOD3.0 codes applied to VVER-440 SG modelling during normal conditions and during a transient with secondary water level lowering is described. A similar recirculation model was chosen for both codes. In the CATHARE calculation, no special measures were taken with the aim of artificially optimizing the flow rate distribution coefficients for the junction between the SG riser and the steam dome. Contrary to the RELAP code, the CATHARE code is able to predict the secondary swell level reasonably well in nominal conditions. Both codes are able to model properly the natural phase separation at the SG water level. 6 refs.

  13. Application of the thermal-hydraulic codes in VVER-440 steam generators modelling

    Energy Technology Data Exchange (ETDEWEB)

    Matejovic, P; Vranca, L; Vaclav, E [Nuclear Power Plant Research Inst. VUJE (Slovakia)

    1996-12-31

    The performance of the CATHARE2 V1.3U and RELAP5/MOD3.0 codes applied to VVER-440 SG modelling during normal conditions and during a transient with secondary water level lowering is described. A similar recirculation model was chosen for both codes. In the CATHARE calculation, no special measures were taken with the aim of artificially optimizing the flow rate distribution coefficients for the junction between the SG riser and the steam dome. Contrary to the RELAP code, the CATHARE code is able to predict the secondary swell level reasonably well in nominal conditions. Both codes are able to model properly the natural phase separation at the SG water level. 6 refs.

  14. QR CODE IN LIBRARY PRACTICE SOME EXAMPLES

    OpenAIRE

    Ajay Shanker Mishra*, Sachin Kumar Umre, Pavan Kumar Gupta

    2017-01-01

    Quick Response (QR) code is one such technology which can cater to the user demand of providing access to resources through mobile devices. The main objective of this article is to review the concept of the Quick Response code (QR code) and describe the practice of reading and generating QR codes. The paper covers the basic concept, structure, and technological pros and cons of the QR code. The literature is filled with potential uses for Quick Response (QR) codes in library practices like e-resour...
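
    As a minimal illustration of generating such a code (the article itself does not prescribe a specific tool), the Python sketch below assumes the third-party qrcode package and a hypothetical library OPAC URL.

    ```python
    # Assumes the third-party "qrcode" package is installed (pip install qrcode[pil]);
    # the article does not name a specific tool, so this is only one possible choice.
    import qrcode

    # Hypothetical OPAC record URL used purely for illustration
    img = qrcode.make("https://opac.example-library.org/record/12345")
    img.save("record_12345_qr.png")
    print("QR code written to record_12345_qr.png")
    ```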

  15. Fast GPU-based Monte Carlo code for SPECT/CT reconstructions generates improved 177Lu images.

    Science.gov (United States)

    Rydén, T; Heydorn Lagerlöf, J; Hemmingsson, J; Marin, I; Svensson, J; Båth, M; Gjertsson, P; Bernhardt, P

    2018-01-04

    clearly improved with MC-based OSEM reconstruction, e.g., the activity recovery was 88% for the largest sphere, while it was 66% for AC-OSEM and 79% for RRC-OSEM. The GPU-based MC code generated an MC-based SPECT/CT reconstruction within a few minutes, and reconstructed patient images of 177Lu-DOTATATE treatments revealed clearly improved resolution and contrast.

  16. Procedures of grasp92 code to calculate accurate Dirac-Coulomb energy for the ground sate of helium atom

    International Nuclear Information System (INIS)

    Utsumi, Takayuki; Sasaki, Akira

    2000-02-01

    The procedures of the grasp92 code to calculate an accurate (relative error ≈ 10⁻⁷) eigenvalue of the Dirac-Coulomb Hamiltonian for the ground state of the helium atom are presented. The grasp92 code, based on the multi-configuration Dirac-Fock method, is widely used to calculate atomic properties. However, the main part of the accurate calculations, the extended optimal level (EOL) calculation, frequently suffers from numerical instabilities due to the lack of reliable procedures. The purpose of this report is to illustrate a guideline for stable EOL calculations by calculating the most fundamental atomic system, i.e. the ground state of the helium atom, 1s² ¹S₀. This procedure could be extended to high-precision eigenfunction calculations of more complex atomic systems, for example highly ionized atoms and high-Z atoms. (author)

  17. AMPX: a modular code system for generating coupled multigroup neutron-gamma libraries from ENDF/B

    Energy Technology Data Exchange (ETDEWEB)

    Greene, N.M.; Lucius, J.L.; Petrie, L.M.; Ford, W.E. III; White, J.E.; Wright, R.Q.

    1976-03-01

    AMPX is a modular system for producing coupled multigroup neutron-gamma cross section sets. Basic neutron and gamma cross-section data for AMPX are obtained from ENDF/B libraries. Most commonly used operations required to generate and collapse multigroup cross-section sets are provided in the system. AMPX is flexibly dimensioned; neutron group structures, and gamma group structures, and expansion orders to represent anisotropic processes are all arbitrary and limited only by available computer core and budget. The basic processes provided will (1) generate multigroup neutron cross sections; (2) generate multigroup gamma cross sections; (3) generate gamma yields for gamma-producing neutron interactions; (4) combine neutron cross sections, gamma cross sections, and gamma yields into final ''coupled sets''; (5) perform one-dimensional discrete ordinates transport or diffusion theory calculations for neutrons and gammas and, on option, collapse the cross sections to a broad-group structure, using the one-dimensional results as weighting functions; (6) plot cross sections, on option, to facilitate the ''evaluation'' of a particular multigroup set of data; (7) update and maintain multigroup cross section libraries in such a manner as to make it not only easy to combine new data with previously processed data but also to do it in a single pass on the computer; and (8) output multigroup cross sections in convenient formats for other codes. (auth)

  18. Analysis of flow-induced vibration of heat exchanger and steam generator tube bundles using the AECL computer code PIPEAU-2

    International Nuclear Information System (INIS)

    Gorman, D.J.

    1983-12-01

    PIPEAU-2 is a computer code developed at the Chalk River Nuclear Laboratories for the flow-induced vibration analysis of heat exchanger and steam generator tube bundles. It can perform this analysis for straight and 'U' tubes. All the theoretical work underlying the code is analytical rather than numerical in nature. Highly accurate evaluation of the free vibration frequencies and mode shapes is therefore obtained. Using the latest experimentally determined parameters available, the free vibration analysis is followed by a forced vibration analysis. Tube response due to fluid turbulence and vortex shedding is determined, as well as critical fluid velocity associated with fluid-elastic instability

  19. Computer code ANISN multiplying media and shielding calculation 2. Code description (input/output)

    International Nuclear Information System (INIS)

    Maiorino, J.R.

    1991-01-01

    The new code CCC-0514-ANISN/PC is described, as well as a ''GENERAL DESCRIPTION OF ANISN/PC code''. In addition to the ANISN/PC code, the transmittal package includes an interactive input generation programme called APE (ANISN Processor and Evaluator), which facilitates the work of the user in giving input. Also, a 21 group photon cross section master library FLUNGP.LIB in ISOTX format, which can be edited by an executable file LMOD.EXE, is included in the package. The input and output subroutines are reviewed. 6 refs, 1 fig., 1 tab

  20. SASSYS LMFBR systems code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time

  1. Quenching of the Gamow-Teller matrix element in closed LS-shell-plus-one nuclei

    International Nuclear Information System (INIS)

    Towner, I.S.

    1989-06-01

    It is evident that nuclear Gamow-Teller matrix elements determined from β-decay and charge-exchange reactions are significantly quenched compared to simple shell-model estimates based on one-body operators and free-nucleon coupling constants. Here we discuss the theoretical origins of this quenching, giving examples from light nuclei near LS-closed shells, such as ¹⁶O and ⁴⁰Ca. (Author) 12 refs., 2 tabs.

  2. Verification of SIGACE code for generating ACE format cross-section files with continuous energy at high temperature

    International Nuclear Information System (INIS)

    Li Zhifeng; Yu Tao; Xie Jinsen; Qin Mian

    2012-01-01

    Based on the recently released ENDF/B-VII.1 library, high temperature neutron cross-section files are generated with the SIGACE code using low temperature ACE format files. To verify the ACE files processed by SIGACE, benchmark calculations are performed in this paper. The calculated results of selected ICT, standard CANDU assembly, LWR Doppler coefficient and SEFOR benchmarks agree well with the reference values, which indicates that high temperature ACE files processed by SIGACE can be used in related neutronics calculations. (authors)

  3. Performance of asynchronous fiber-optic code division multiple access system based on three-dimensional wavelength/time/space codes and its link analysis.

    Science.gov (United States)

    Singh, Jaswinder

    2010-03-10

    A novel family of three-dimensional (3-D) wavelength/time/space codes for asynchronous optical code-division multiple-access (CDMA) systems with "zero" off-peak autocorrelation and "unity" cross-correlation is reported. Antipodal signaling and differential detection are employed in the system. A maximum of [(W x T+1) x W] codes are generated for unity cross-correlation, where W and T are the number of wavelengths and time chips used in the code and are prime. The conditions for violation of the cross-correlation constraint are discussed. Expressions for the number of generated codes are determined for various code dimensions. It is found that the maximum number of codes is generated for S systems. The codes have a code-set-size to code-size ratio greater than W/S. For instance, with a code size of 2065 (59 x 7 x 5), a total of 12,213 users can be supported, and 130 simultaneous users at a bit-error rate (BER) of 10⁻⁹. An arrayed-waveguide-grating-based reconfigurable encoder/decoder design for 2-D implementation of the 3-D codes is presented so that the need for multiple star couplers and fiber ribbons is eliminated. The hardware requirements of the coders used for various modulation/detection schemes are given. The effect of insertion loss in the coders is shown to be significantly reduced with loss compensation by using an amplifier after encoding. An optical CDMA system for four users is simulated and the results presented show the improvement in performance with the use of loss compensation.

  4. TIGER: Turbomachinery interactive grid generation

    Science.gov (United States)

    Soni, Bharat K.; Shih, Ming-Hsin; Janus, J. Mark

    1992-01-01

    A three dimensional, interactive grid generation code, TIGER, is being developed for analysis of flows around ducted or unducted propellers. TIGER is a customized grid generator that combines new technology with methods from general grid generation codes. The code generates multiple block, structured grids around multiple blade rows with a hub and shroud for either C grid or H grid topologies. The code is intended for use with a Euler/Navier-Stokes solver also being developed, but is general enough for use with other flow solvers. TIGER features a silicon graphics interactive graphics environment that displays a pop-up window, graphics window, and text window. The geometry is read as a discrete set of points with options for several industrial standard formats and NASA standard formats. Various splines are available for defining the surface geometries. Grid generation is done either interactively or through a batch mode operation using history files from a previously generated grid. The batch mode operation can be done either with a graphical display of the interactive session or with no graphics so that the code can be run on another computer system. Run time can be significantly reduced by running on a Cray-YMP.

  5. Development status of TUF code

    International Nuclear Information System (INIS)

    Liu, W.S.; Tahir, A.; Zaltsgendler

    1996-01-01

    An overview of the important development of the TUF code in 1995 is presented. The development in the following areas is presented: control of round-off error propagation, gas resolution and release models, and condensation induced water hammer. This development is mainly generated from station requests for operational support and code improvement. (author)

  6. Loft CIS analysis 2''-LS-118-AB outside containment penetration S5-D

    Energy Technology Data Exchange (ETDEWEB)

    Morton, D.K.

    1978-09-28

    A stress analysis was performed on the 2''-LS-118-AB pipe system outside containment penetration S5-D. Deadweight, thermal expansion, and seismic loads were considered. The results indicate that this piping will meet ASME Section III, Class 2 requirements provided a U-bolt (S4) is installed as indicated in this report.

  7. International Symposium: “Scientific School of L.S. Vygotsky: Traditions and Innovations” and International ISCAR Summer University for PhD Students

    Directory of Open Access Journals (Sweden)

    Baykovskaya N.A.,

    2016-12-01

    The article presents a brief report on the work of the International Symposium "Scientific School of L.S. Vygotsky: Traditions and Innovations" and the VIth International ISCAR Summer University for PhD Students and young scholars, which were held at the Moscow State University of Psychology & Education from June 28 to July 3, 2016, in commemoration of the 120th anniversary of the great Russian psychologist L.S. Vygotsky. The main goals of the events organised by MSUPE include: analysis of the basic principles and the system of concepts of L.S. Vygotsky's scientific school, discussion of the current state and the prospects for the development of the cultural-historical theory in Russia and abroad, integration of the ideas of cultural-historical psychology and the activity approach into various kinds of social and educational practices, and conducting research in the international scientific space. The Symposium gathered the world's leading experts and young scholars in the field of cultural-historical theory and the activity approach from 19 countries, including the United Kingdom, Australia, Switzerland, Greece, Brazil and the USA.

  8. On Code Parameters and Coding Vector Representation for Practical RLNC

    DEFF Research Database (Denmark)

    Heide, Janus; Pedersen, Morten Videbæk; Fitzek, Frank

    2011-01-01

    RLNC provides a theoretically efficient method for coding. The drawbacks associated with it are the complexity of the decoding and the overhead resulting from the encoding vector. Increasing the field size and generation size presents a fundamental trade-off between packet-based throughput...... to higher energy consumption. Therefore, the optimal trade-off is system and topology dependent, as it depends on the cost in energy of performing coding operations versus transmitting data. We show that moderate field sizes are the correct choice when trade-offs are considered. The results show that sparse...
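
    In RLNC, each coded packet is a random linear combination of the packets in a generation, and the random coefficients (the coding vector) travel with the packet as overhead. The Python sketch below shows the smallest possible variant, a GF(2) combination over a tiny generation; it only illustrates the trade-off discussed above, with parameter values chosen arbitrarily.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    generation_size, packet_len = 8, 16                  # small values for illustration
    packets = rng.integers(0, 2, size=(generation_size, packet_len), dtype=np.uint8)

    # One coded packet: a random GF(2) coding vector plus the XOR of the selected packets.
    coding_vector = rng.integers(0, 2, size=generation_size, dtype=np.uint8)
    coded_payload = (coding_vector @ packets) % 2        # XOR-combination over GF(2)

    print("coding vector:", coding_vector)               # this is the per-packet overhead
    print("coded payload:", coded_payload)
    ```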

  9. Discrete Sparse Coding.

    Science.gov (United States)

    Exarchakis, Georgios; Lücke, Jörg

    2017-11-01

    Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) apply the algorithm on the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.

  10. Using Peephole Optimization on Intermediate Code

    NARCIS (Netherlands)

    Tanenbaum, A.S.; van Staveren, H.; Stevenson, J.W.

    1982-01-01

    Many portable compilers generate an intermediate code that is subsequently translated into the target machine's assembly language. In this paper a stack-machine-based intermediate code suitable for algebraic languages (e.g., PASCAL, C, FORTRAN) and most byte-addressed mini- and microcomputers is
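
    A peephole optimizer of the kind described scans the emitted instruction stream through a small window and replaces known wasteful patterns with cheaper ones. The Python sketch below applies two classic rewrites to a hypothetical stack-machine instruction list; the actual intermediate code and patterns of the paper are not reproduced.

    ```python
    def peephole(code):
        """Remove a couple of classic stack-code redundancies in one linear pass."""
        out = []
        for instr in code:
            if out and out[-1].startswith("PUSH") and instr == "POP":
                out.pop()                          # PUSH x; POP  ->  (nothing)
            elif instr == "ADD 0":
                continue                           # adding zero is a no-op
            else:
                out.append(instr)
        return out

    print(peephole(["PUSH a", "POP", "PUSH b", "ADD 0", "STORE b"]))
    # -> ['PUSH b', 'STORE b']
    ```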

  11. Running codes through the web

    International Nuclear Information System (INIS)

    Clark, R.E.H.

    2001-01-01

    Dr. Clark presented a report and demonstration on running atomic physics codes through the WWW. The atomic physics data are generated by Los Alamos National Laboratory (LANL) codes that calculate electron impact excitation, ionization, photoionization, and autoionization, and the inverse processes through detailed balance. Samples of web interfaces, input and output are given in the report.

  12. A method for modeling co-occurrence propensity of clinical codes with application to ICD-10-PCS auto-coding.

    Science.gov (United States)

    Subotin, Michael; Davis, Anthony R

    2016-09-01

    Natural language processing methods for medical auto-coding, or automatic generation of medical billing codes from electronic health records, generally assign each code independently of the others. They may thus assign codes for closely related procedures or diagnoses to the same document, even when they do not tend to occur together in practice, simply because the right choice can be difficult to infer from the clinical narrative. We propose a method that injects awareness of the propensities for code co-occurrence into this process. First, a model is trained to estimate the conditional probability that one code is assigned by a human coder, given than another code is known to have been assigned to the same document. Then, at runtime, an iterative algorithm is used to apply this model to the output of an existing statistical auto-coder to modify the confidence scores of the codes. We tested this method in combination with a primary auto-coder for International Statistical Classification of Diseases-10 procedure codes, achieving a 12% relative improvement in F-score over the primary auto-coder baseline. The proposed method can be used, with appropriate features, in combination with any auto-coder that generates codes with different levels of confidence. The promising results obtained for International Statistical Classification of Diseases-10 procedure codes suggest that the proposed method may have wider applications in auto-coding. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
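
    The idea can be pictured as an iterative re-scoring loop: starting from the primary auto-coder's confidences, each code's score is nudged toward the co-occurrence evidence given the codes currently believed to be assigned. The Python sketch below is a toy version of that loop with hypothetical ICD-10-PCS codes and made-up probabilities; it is not the published algorithm or its trained model.

    ```python
    import numpy as np

    def rescore(scores, cooccur, threshold=0.5, iterations=3, weight=0.3):
        """Toy re-scorer: blend each code's confidence with the co-occurrence
        evidence of the codes currently above threshold (illustrative only)."""
        scores = dict(scores)
        for _ in range(iterations):
            assigned = {c for c, s in scores.items() if s >= threshold}
            for code in scores:
                others = assigned - {code}
                if not others:
                    continue
                support = np.mean([cooccur.get((o, code), 0.5) for o in others])
                scores[code] = (1 - weight) * scores[code] + weight * support
        return scores

    base = {"0SR90JZ": 0.62, "0SRC0J9": 0.58, "0QS734Z": 0.41}   # hypothetical codes/scores
    cooc = {("0SR90JZ", "0SRC0J9"): 0.05, ("0SRC0J9", "0SR90JZ"): 0.05}
    print(rescore(base, cooc))   # codes that rarely co-occur pull each other's scores down
    ```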

  13. Fundamentals, current state of the development of, and prospects for further improvement of the new-generation thermal-hydraulic computational HYDRA-IBRAE/LM code for simulation of fast reactor systems

    Science.gov (United States)

    Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.

    2016-02-01

    The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational HYDRA-IBRAE/LM code are presented. The code is intended to simulate the thermalhydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal cooled fast reactor systems under normal operation and anticipated operational occurrences and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and gives grounds for the necessity of development of a new-generation HYDRA-IBRAE/LM code. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and the phenomena are singled out that require a detailed analysis and development of the models to be correctly described by the system thermal-hydraulic code in question. Information on the functionality of the computational code is provided, viz., the thermalhydraulic two-phase model, the properties of the sodium and the lead coolants, the closing equations for simulation of the heat-mass exchange processes, the models to describe the processes that take place during the steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as possibilities of taking advantages of the modern computer technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code; it also presents information on the principles of constructing of and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, introduction of new models into it, and enhancement of its usability. It is shown that the program of development and

  14. Construction of self-dual codes in the Rosenbloom-Tsfasman metric

    Science.gov (United States)

    Krisnawati, Vira Hari; Nisa, Anzi Lina Ukhtin

    2017-12-01

    A linear code is a very basic kind of code and very useful in coding theory. Generally, a linear code is a code over a finite field equipped with the Hamming metric. Among the most interesting families of codes, the family of self-dual codes is a very important one, because it contains some of the best known error-correcting codes. The Hamming metric has been generalized to the Rosenbloom-Tsfasman metric (RT-metric). The inner product in the RT-metric is different from the Euclidean inner product that is used to define duality in the Hamming metric. Most of the codes which are self-dual in the Hamming metric are not so in the RT-metric. Moreover, the generator matrix is very important for constructing a code because its rows form a basis of the code. Therefore, in this paper we give some theorems and methods to construct self-dual codes in the RT-metric by considering properties of the inner product and the generator matrix. We also illustrate each kind of construction with examples.

  15. User's manual for computer code RIBD-II, a fission product inventory code

    International Nuclear Information System (INIS)

    Marr, D.R.

    1975-01-01

    The computer code RIBD-II is used to calculate inventories, activities, decay powers, and energy releases for the fission products generated in a fuel irradiation. Changes from the earlier RIBD code are: the expansion to include up to 850 fission product isotopes, input in the user-oriented NAMELIST format, and run-time choice of fuels from an extensively enlarged library of nuclear data. The library that is included in the code package contains yield data for 818 fission product isotopes for each of fourteen different fissionable isotopes, together with fission product transmutation cross sections for fast and thermal systems. Calculational algorithms are little changed from those in RIBD. (U.S.)

  16. Generating Protocol Software from CPN Models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars M.; Kindler, Ekkart

    2013-01-01

    and verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software from a class of CPN models annotated with code generation pragmatics. Our code generation method...... consists of three main steps: automatically adding so-called derived pragmatics to the CPN model, computing an abstract template tree, which associates pragmatics with code templates, and applying the templates to generate code which can then be compiled. We illustrate our method using a unidirectional...

  17. On the Toyota LS400 Car Electronic Air Suspension System (TEMS)

    Institute of Scientific and Technical Information of China (English)

    池杜旺

    2014-01-01

    With the continuous development of automotive electronic technology and the ever increasing demands placed on ride comfort, conventional suspension systems can hardly satisfy today's demanding car owners. High-end sedans such as the Audi A8, the Mercedes-Benz S600 and the Lexus LS400 therefore adopt electronically controlled suspension systems that offer greater comfort and more powerful control functions.

  18. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  19. Model tests of a once-through steam generator for land-blocker assessment and THEDA code verification. Final report

    International Nuclear Information System (INIS)

    Carter, H.R.; Childerson, M.T.; Moskal, T.E.

    1983-06-01

    The Babcock and Wilcox Company (B and W) operating Once-Through Steam Generators (OTSGs) have experienced leaking tubes in a region adjacent to the untubed inspection lane. The tube leaks have been attributed to an environmentally-assisted fatigue mechanism with moisture transported up the inspection lane being a major factor in the tube-failure process. B and W has developed a hardware modification (lane blockers) to mitigate the detrimental effects of inspection lane moisture. A 30-tube Laboratory Once-through Steam Generator (Designated OTSGC) was designed, fabricated, and tested. Tests were performed with and without five flat-plate lane blockers installed on tube-support plates (TSPs) 10, 11, 12, 13, and 14. The test results were utilized to determine the effectiveness of lane blockers for eliminating moisture transport to the upper tubesheet in the inspection lanes and to benchmark the predictive capabilities of a three-dimensional steam-generator computer code, THEDA

  20. The arbitrary order design code Tlie 1.0

    International Nuclear Information System (INIS)

    Zeijts, J. van; Neri, Filippo

    1993-01-01

    We describe the arbitrary order charged particle transfer map code TLIE. This code is a general 6D relativistic design code with a MAD-compatible input language that, among other features, implements user-defined functions and subroutines and nested fitting and optimization. First we describe the mathematics and physics in the code. Aside from generating maps for all the standard accelerator elements, we describe an efficient method for generating nonlinear transfer maps for realistic magnet models. We have implemented the method to arbitrary order in our accelerator design code for cylindrical current sheet magnets. We have also implemented a self-consistent space-charge approach as in CHARLIE. Subsequently we give a description of the input language and, finally, several examples from production runs, such as cases with stacked multipoles with overlapping fringe fields. (Author)

  1. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence is generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  2. Prospective study of toric IOL outcomes based on the Lenstar LS 900® dual zone automated keratometer

    Directory of Open Access Journals (Sweden)

    Gundersen Kjell

    2012-07-01

    Background: To establish clinical expectations when using the Lenstar LS 900® dual-zone automated keratometer for surgery planning of toric intraocular lenses. Methods: Fifty eyes were measured with the Lenstar LS 900® dual-zone automated keratometer. Surgical planning was performed with the data from this device and the known surgically induced astigmatism of the surgeon. Post-operative refractions and visual acuity were measured at 1 month and 3 months. Results: Clinical outcomes from 43 uncomplicated surgeries showed an average post-operative refractive astigmatism of 0.44D ±0.25D. Over 70% of eyes had 0.50D or less of refractive astigmatism and no eye had more than 1.0D of refractive astigmatism. Uncorrected visual acuity was 20/32 or better in all eyes at 3 months, with 70% of eyes 20/20 or better. A significantly higher number of eyes had 0.75D or more of post-operative refractive astigmatism when the standard deviation of the pre-operative calculated corneal astigmatism angle, reported by the keratometer, was > 5 degrees. Conclusions: In this single-site study investigating the use of the keratometry from the Lenstar LS 900® for toric IOL surgical planning, clinical outcomes appear equivalent to those reported in the literature for manual keratometry and somewhat better than has been reported for some previous automated instruments. A high standard deviation in the pre-operative calculated astigmatism angle, as reported by the keratometer, appears to increase the likelihood of higher post-operative refractive astigmatism.

  3. The LIONS code (version 1.0)

    International Nuclear Information System (INIS)

    Bertrand, P.

    1993-01-01

    The new LIONS code (Lancement d'IONS, or Ion Launching), a dynamical code implemented in the SPIRaL project for the CIME cyclotron studies, is presented. The associated software comprises a 3D magnetostatic code, 2D and 3D electrostatic codes for the generation of realistic field maps, and several dynamical codes for studying the behaviour of the reference particle from the cyclotron center up to ejection and for launching particle packets complying with given correlations. Its interactions with the other codes are described. The LIONS code, written in Fortran 90, is already used in studying the CIME cyclotron, from the center to the ejection. It is designed to be used, with minor modifications, in other contexts such as the simulation of mass spectrometer facilities

  4. FlexibleSUSY-A spectrum generator generator for supersymmetric models

    Science.gov (United States)

    Athron, Peter; Park, Jae-hyeon; Stöckinger, Dominik; Voigt, Alexander

    2015-05-01

    We introduce FlexibleSUSY, a Mathematica and C++ package, which generates a fast, precise C++ spectrum generator for any SUSY model specified by the user. The generated code is designed with both speed and modularity in mind, making it easy to adapt and extend with new features. The model is specified by supplying the superpotential, gauge structure and particle content in a SARAH model file; specific boundary conditions e.g. at the GUT, weak or intermediate scales are defined in a separate FlexibleSUSY model file. From these model files, FlexibleSUSY generates C++ code for self-energies, tadpole corrections, renormalization group equations (RGEs) and electroweak symmetry breaking (EWSB) conditions and combines them with numerical routines for solving the RGEs and EWSB conditions simultaneously. The resulting spectrum generator is then able to solve for the spectrum of the model, including loop-corrected pole masses, consistent with user specified boundary conditions. The modular structure of the generated code allows for individual components to be replaced with an alternative if available. FlexibleSUSY has been carefully designed to grow as alternative solvers and calculators are added. Predefined models include the MSSM, NMSSM, E6SSM, USSM, R-symmetric models and models with right-handed neutrinos.

  5. PUFF-IV, Code System to Generate Multigroup Covariance Matrices from ENDF/B-VI Uncertainty Files

    International Nuclear Information System (INIS)

    2007-01-01

    1 - Description of program or function: The PUFF-IV code system processes ENDF/B-VI formatted nuclear cross section covariance data into multigroup covariance matrices. PUFF-IV is the newest release in this series of codes used to process ENDF uncertainty information and to generate the desired multi-group correlation matrix for the evaluation of interest. This version includes corrections and enhancements over previous versions. It is written in Fortran 90 and allows for a more modular design, thus facilitating future upgrades. PUFF-IV enhances support for resonance parameter covariance formats described in the ENDF standard and now handles almost all resonance parameter covariance information in the resolved region, with the exception of the long range covariance sub-subsections. PUFF-IV is normally used in conjunction with an AMPX master library containing group averaged cross section data. Two utility modules are included in this package to facilitate the data interface. The module SMILER allows one to use NJOY generated GENDF files containing group averaged cross section data in conjunction with PUFF-IV. The module COVCOMP allows one to compare two files written in COVERX format. 2 - Methods: Cross section and flux values on a 'super energy grid,' consisting of the union of the required energy group structure and the energy data points in the ENDF/B-V file, are interpolated from the input cross sections and fluxes. Covariance matrices are calculated for this grid and then collapsed to the required group structure. 3 - Restrictions on the complexity of the problem: PUFF-IV cannot process covariance information for energy and angular distributions of secondary particles. PUFF-IV does not process covariance information in Files 34 and 35; nor does it process covariance information in File 40. These new formats will be addressed in a future version of PUFF

  6. Zebra: An advanced PWR lattice code

    Energy Technology Data Exchange (ETDEWEB)

    Cao, L.; Wu, H.; Zheng, Y. [School of Nuclear Science and Technology, Xi'an Jiaotong Univ., No. 28, Xianning West Road, Xi'an, Shaanxi, 710049 (China)]

    2012-07-01

    This paper presents an overview of ZEBRA, an advanced PWR lattice code developed at the NECP laboratory of Xi'an Jiaotong University. The multi-group cross-section library is generated from the ENDF/B-VII library by NJOY and the 361-group SHEM structure is employed. The resonance calculation module is based on the sub-group method. The transport solver is the Auto-MOC code, a self-developed code based on the Method of Characteristics and the customization of AutoCAD software. The whole code is organized in a modular software structure. Numerical results obtained during the validation of the code demonstrate that it has good precision and high efficiency. (authors)

  7. Zebra: An advanced PWR lattice code

    International Nuclear Information System (INIS)

    Cao, L.; Wu, H.; Zheng, Y.

    2012-01-01

    This paper presents an overview of ZEBRA, an advanced PWR lattice code developed at the NECP laboratory of Xi'an Jiaotong University. The multi-group cross-section library is generated from the ENDF/B-VII library by NJOY and the 361-group SHEM structure is employed. The resonance calculation module is based on the sub-group method. The transport solver is the Auto-MOC code, a self-developed code based on the Method of Characteristics and the customization of AutoCAD software. The whole code is organized in a modular software structure. Numerical results obtained during the validation of the code demonstrate that it has good precision and high efficiency. (authors)

  8. Measurements of HCl and HNO3 with the new research aircraft HALO - Quantification of the stratospheric contribution to the O3 and HNO3 budget in the UT/LS

    Science.gov (United States)

    Jurkat, Tina; Kaufmann, Stefan; Voigt, Christiane; Zahn, Andreas; Schlager, Hans; Engel, Andreas; Bönisch, Harald; Dörnbrack, Andreas

    2013-04-01

    Dynamic and chemical processes modify the ozone (O3) budget of the upper troposphere/lower stratosphere, leading to locally variable O3 trends. In this region, O3 acts as a strong greenhouse gas with a net positive radiative forcing. It has been suggested, that the correlation of the stratospheric tracer hydrochloric acid (HCl) with O3 can be used to quantify stratospheric O3 in the UT/LS region (Marcy et al., 2004). The question is, whether the stratospheric contribution to the nitric acid (HNO3) budget in the UT/LS can be determined by a similar approach in order to differentiate between tropospheric and stratospheric sources of HNO3. To this end, we performed in situ measurements of HCl and HNO3 with a newly developed Atmospheric chemical Ionization Mass Spectrometer (AIMS) during the TACTS (Transport and Composition in the UTLS) / ESMVal (Earth System Model Validation) mission in August/September 2012. The linear quadrupole mass spectrometer deployed aboard the new German research aircraft HALO was equipped with a new discharge source generating SF5- reagent ions and an in-flight calibration allowing for accurate, spatially highly resolved trace gas measurements. In addition, sulfur dioxide (SO2), nitrous acid (HONO) and chlorine nitrate (ClONO2) have been simultaneously detected with the AIMS instrument. Here, we show trace gas distributions of HCl and HNO3 measured during a North-South transect from Northern Europe to Antarctica (68° N to 65° S) at 8 to 15 km altitude and discuss their latitude dependence. In particular, we investigate the stratospheric ozone contribution to the ozone budget in the mid-latitude UT/LS using correlations of HCl with O3. Differences in these correlations in the subtropical and Polar regions are discussed. A similar approach is used to quantify the HNO3 budget of the UT/LS. We identify unpolluted atmospheric background distributions and various tropospheric HNO3 sources in specific regions. Our observations can be compared to

  9. Nonterminals and codings in defining variations of OL-systems

    DEFF Research Database (Denmark)

    Skyum, Sven

    1974-01-01

    The use of nonterminals versus the use of codings in variations of OL-systems is studied. It is shown that the use of nonterminals produces a comparatively low generative capacity in deterministic systems while it produces a comparatively high generative capacity in nondeterministic systems. Finally it is proved that the family of context-free languages is contained in the family generated by codings on propagating OL-systems with a finite set of axioms, which was one of the open problems in [10]. All the results in this paper can be found in [71] and [72].
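
    As a toy illustration of the operation discussed in this record (not part of the original abstract; the production rules and the coding below are made up), a propagating OL-system in its deterministic form (D0L) followed by a coding, i.e. a letter-to-letter renaming, can be sketched in a few lines of Python:

        # Minimal sketch: a propagating D0L-system followed by a coding
        # (letter-to-letter morphism). Rules and coding are illustrative only.

        def d0l_step(word, rules):
            """Apply the deterministic production rules to every letter in parallel."""
            return "".join(rules[ch] for ch in word)

        def generate(axiom, rules, steps):
            """Yield the successive words of the D0L-system."""
            word = axiom
            for _ in range(steps):
                yield word
                word = d0l_step(word, rules)

        rules  = {"a": "ab", "b": "a"}   # propagating: no letter maps to the empty word
        coding = {"a": "0", "b": "1"}    # a coding renames letters; it filters nothing out

        for w in generate("a", rules, 5):
            print(w, "->", "".join(coding[ch] for ch in w))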

  10. Round-Robin Streaming with Generations

    DEFF Research Database (Denmark)

    Li, Yao; Vingelmann, Peter; Pedersen, Morten Videbæk

    2012-01-01

    We consider three types of application layer coding for streaming over lossy links: random linear coding, systematic random linear coding, and structured coding. The file being streamed is divided into sub-blocks (generations). Code symbols are formed by combining data belonging to the same generation...

  11. Evaluation Codes from Order Domain Theory

    DEFF Research Database (Denmark)

    Andersen, Henning Ejnar; Geil, Hans Olav

    2008-01-01

    The celebrated Feng-Rao bound estimates the minimum distance of codes defined by means of their parity check matrices. From the Feng-Rao bound it is clear how to improve a large family of codes by leaving out certain rows in their parity check matrices. In this paper we derive a simple lower bound on the minimum distance of codes defined by means of their generator matrices. From our bound it is clear how to improve a large family of codes by adding certain rows to their generator matrices. The new bound is very much related to the Feng-Rao bound as well as to Shibuya and Sakaniwa's bound in [28]. Our bound is easily extended to deal with any generalized Hamming weights. We interpret our methods into the setting of order domain theory. In this way we fill in an obvious gap in the theory of order domains. [28] T. Shibuya and K. Sakaniwa, A Dual of Well-Behaving Type Designed Minimum Distance, IEICE...

  12. QPS upgrade and machine protection during LS1

    International Nuclear Information System (INIS)

    Denz, R.

    2012-01-01

    The presentation will explain all the proposed changes and discuss the impact on other shutdown activities. The upgrade of the LHC Quench Protection System QPS during LS1 with respect to radiation to electronics will concern the re-location of equipment and installation of new radiation tolerant hardware. The mid-term plan for further R2E upgrades will be addressed. The protection systems for insertion region magnets and inner triplets will be equipped with a dedicated bus-bar splice supervision including some additional modifications in order to improve the EMC immunity. The extension of the supervision capabilities of the QPS will concern the quench heater circuits, the earth voltage feelers and some tools to ease the system maintenance. The protection of the undulators will be revised in order to allow more transparent operation. The installation of snubber capacitors and arc chambers for the main quad circuits will complete the upgrade of the energy extraction systems. Finally the re-commissioning of the protection systems prior to the powering tests will be addressed. (author)

  13. LS1 Report: PS Booster prepares for beam

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    With Linac2 already up and running, the countdown to beam in the LHC has begun! The next in line is the PS Booster, which will close up shop to engineers early next week. The injector will be handed over to the Operations Group who are tasked with getting it ready for active duty.   Taken as we approach the end of LS1 activities, this image shows where protons will soon be injected from Linac2 into the four PS Booster rings. Over the coming two months, the Operations Group will be putting the Booster's new elements through their paces. "Because of the wide range of upgrades and repairs carried out in the Booster, we have a very full schedule of tests planned for the machine," says Bettina Mikulec, PS Booster Engineer in Charge. "We will begin with cold checks; these are a wide range of tests carried out without beam, including system tests with power on/off and with varying settings, as well as verification of the controls system and timings." Amon...

  14. Status of SPACE Safety Analysis Code Development

    International Nuclear Information System (INIS)

    Lee, Dong Hyuk; Yang, Chang Keun; Kim, Se Yun; Ha, Sang Jun

    2009-01-01

    In 2006, the Korean nuclear industry started developing a thermal-hydraulic analysis code for safety analysis of PWRs (Pressurized Water Reactors). The new code is named SPACE (Safety and Performance Analysis Code for Nuclear Power Plant). The SPACE code can solve two-fluid, three-field governing equations in one-dimensional or three-dimensional geometry. The SPACE code has many component models required for modeling a PWR, such as reactor coolant pump, safety injection tank, etc. The programming language used in the new code is C++, for a new generation of engineers who are more comfortable with C/C++ than the old FORTRAN language. This paper describes general characteristics of the SPACE code and the current status of SPACE code development.

  15. Abstraction carrying code and resource-awareness

    OpenAIRE

    Hermenegildo, Manuel V.; Albert Albiol, Elvira; López García, Pedro; Puebla Sánchez, Alvaro Germán

    2005-01-01

    Proof-Carrying Code (PCC) is a general approach to mobile code safety in which the code supplier augments the program with a certificate (or proof). The intended benefit is that the program consumer can locally validate the certificate w.r.t. the "untrusted" program by means of a certificate checker, a process which should be much simpler, efficient, and automatic than generating the original proof. Abstraction Carrying Code (ACC) is an enabling technology for PCC in which an abstract mod...

  16. Safety and pharmacokinetics of the Fc-modified HIV-1 human monoclonal antibody VRC01LS: A Phase 1 open-label clinical trial in healthy adults.

    Directory of Open Access Journals (Sweden)

    Martin R Gaudinski

    2018-01-01

    VRC01 is a human broadly neutralizing monoclonal antibody (bnMAb) against the CD4-binding site of the HIV-1 envelope glycoprotein (Env) that is currently being evaluated in a Phase IIb adult HIV-1 prevention efficacy trial. VRC01LS is a modified version of VRC01, designed for extended serum half-life by increased binding affinity to the neonatal Fc receptor. This Phase I dose-escalation study of VRC01LS in HIV-negative healthy adults was conducted by the Vaccine Research Center (VRC) at the National Institutes of Health (NIH) Clinical Center (Bethesda, MD). The age range of the study volunteers was 21-50 years; 51% of study volunteers were male and 49% were female. Primary objectives were safety and tolerability of VRC01LS intravenous (IV) infusions at 5, 20, and 40 mg/kg infused once, 20 mg/kg given three times at 12-week intervals, and subcutaneous (SC) delivery at 5 mg/kg delivered once, or three times at 12-week intervals. Secondary objectives were pharmacokinetics (PK), serum neutralization activity, and development of antidrug antibodies. Enrollment began on November 16, 2015, and concluded on August 23, 2017. This report describes the safety data for the first 37 volunteers who received administrations of VRC01LS. There were no serious adverse events (SAEs) or dose-limiting toxicities. Mild malaise and myalgia were the most common adverse events (AEs). There were six AEs assessed as possibly related to VRC01LS administration, and all were mild in severity and resolved during the study. PK data were modeled based on the first dose of VRC01LS in the first 25 volunteers to complete their schedule of evaluations. The mean (±SD) serum concentrations 12 weeks after one IV administration of 20 mg/kg or 40 mg/kg were 180 ± 43 μg/mL (n = 7) and 326 ± 35 μg/mL (n = 5), respectively. The mean (±SD) serum concentrations 12 weeks after one IV and SC administration of 5 mg/kg were 40 ± 3 μg/mL (n = 2) and 25 ± 5 μg/mL (n = 9), respectively. Over the 5-40 mg...

  17. Duals of Affine Grassmann Codes and Their Relatives

    DEFF Research Database (Denmark)

    Beelen, P.; Ghorpade, S. R.; Hoholdt, T.

    2012-01-01

    Affine Grassmann codes are a variant of generalized Reed-Muller codes and are closely related to Grassmann codes. These codes were introduced in a recent work by Beelen et al. Here, we consider, more generally, affine Grassmann codes of a given level. We explicitly determine the dual of an affine Grassmann code of any level and compute its minimum distance. Further, we ameliorate the results by Beelen et al. concerning the automorphism group of affine Grassmann codes. Finally, we prove that affine Grassmann codes and their duals have the property that they are linear codes generated by their minimum-weight codewords. This provides a clean analogue of a corresponding result for generalized Reed-Muller codes.

  18. Steam generators of Phenix: Measurement of the hydrogen concentration in sodium for detecting water leaks in the steam generator tubes

    International Nuclear Information System (INIS)

    Cambillard, E.; Lacroix, A.; Langlois, J.; Viala, J.

    1975-01-01

    The Phenix secondary circuits are provided with systems for measuring the hydrogen concentration in sodium, which allow for the detection of possible water leaks in the steam generators and the location of a faulty module. A measurement device consists of: a detector with nickel membranes of 0.3 mm wall thickness, an ion pump with a 200 l/s flow rate, a quadrupole mass spectrometer and a calibrated hydrogen leak. The temperature correction is made automatically. The main tests carried out on the leak detection systems are reported. Since the first system operation (October 24, 1973), the measurements have allowed us to obtain the hydrogen diffusion rates through the steam generator tube walls. (author)

  19. Investigating the Simulink Auto-Coding Process

    Science.gov (United States)

    Gualdoni, Matthew J.

    2016-01-01

    Model based program design is the most clear and direct way to develop algorithms and programs for interfacing with hardware. While coding "by hand" results in a more tailored product, the ever-growing size and complexity of modern-day applications can cause the project work load to quickly become unreasonable for one programmer. This has generally been addressed by splitting the product into separate modules to allow multiple developers to work in parallel on the same project, however this introduces new potentials for errors in the process. The fluidity, reliability and robustness of the code relies on the abilities of the programmers to communicate their methods to one another; furthermore, multiple programmers invites multiple potentially differing coding styles into the same product, which can cause a loss of readability or even module incompatibility. Fortunately, Mathworks has implemented an auto-coding feature that allows programmers to design their algorithms through the use of models and diagrams in the graphical programming environment Simulink, allowing the designer to visually determine what the hardware is to do. From here, the auto-coding feature handles converting the project into another programming language. This type of approach allows the designer to clearly see how the software will be directing the hardware without the need to try and interpret large amounts of code. In addition, it speeds up the programming process, minimizing the amount of man-hours spent on a single project, thus reducing the chance of human error as well as project turnover time. One such project that has benefited from the auto-coding procedure is Ramses, a portion of the GNC flight software on-board Orion that has been implemented primarily in Simulink. Currently, however, auto-coding Ramses into C++ requires 5 hours of code generation time. This causes issues if the tool ever needs to be debugged, as this code generation will need to occur with each edit to any part of

  20. Development of the ion-acoustic turbulence in a magnetoactive plasma following induced ls-scattering near the lower hybrid resonance

    International Nuclear Information System (INIS)

    Batanov, G.M.; Kolik, L.V.; Sapozhnikov, A.V.; Sarksyan, K.A.; Skvortsova, N.N.

    1984-01-01

    The development and nonlinear saturation of ion-acoustic turbulent oscillations excited in a plasma by a high frequency pumping wave have been experimentally investigated. As a result of investigations into the interaction between oblique Langmuir waves and a magnetoactive plasma near the lower hybrid resonance, performed under the regime of HF-pumping wave pulse generation, the following conclusions are drawn: 1) dynamic characteristics of the development of ion-acoustic turbulent oscillations point to the induced ls-scattering process and the dependence of the rate of this process on the level of initial superthermal ion-acoustic noises; 2) a nonlinear process limiting the growth of the ion-acoustic turbulence intensity is probably the process of induced sound wave scattering on ions followed by the unstable wave energy transfer over the spectrum into the lower frequency region. Various mechanisms are responsible for excitation of ion-acoustic waves and HF-waves near the pumping wave frequency (red satellite)

  1. Field programmable gate array (FPGA implementation of novel complex PN-code-generator- based data scrambler and descrambler

    Directory of Open Access Journals (Sweden)

    Shabir A. Parah

    2010-04-01

    A novel technique for the generation of complex and lengthy code sequences using low-length linear feedback shift registers (LFSRs) for data scrambling and descrambling is proposed. The scheme has been implemented using the VHSIC hardware description language (VHDL) approach, which allows the reconfigurability of the proposed system such that the length of the generated sequences can be changed as per the security requirements. In the present design, the power consumption and chip area requirements are small and the operating speed is high compared to conventional discrete I.C. design, which is a pre-requisite for any system designer. The design has been synthesised on device EP2S15F484C3 of the Stratix II FPGA family, using Altera Quartus version 8.1. The simulation results have been found satisfactory and are in conformity with the theoretical observations.
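
    To make the scrambling idea concrete, the following sketch (Python rather than VHDL, with an arbitrary seed and tap set that are not taken from the paper) shows an additive LFSR-based scrambler whose descrambler is the same XOR operation with an identically seeded generator:

        # Minimal sketch of an additive (synchronous) LFSR scrambler/descrambler.
        # The 8-bit seed and the tap positions are illustrative only.

        def lfsr_keystream(seed_bits, taps, length):
            """Generate a pseudo-noise bit sequence from a Fibonacci LFSR."""
            state = list(seed_bits)
            out = []
            for _ in range(length):
                fb = 0
                for t in taps:
                    fb ^= state[t]
                out.append(state[-1])
                state = [fb] + state[:-1]
            return out

        def xor_stream(data_bits, key_bits):
            """Scrambling and descrambling are the same XOR operation."""
            return [d ^ k for d, k in zip(data_bits, key_bits)]

        data = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
        seed = [1, 0, 0, 1, 0, 1, 1, 0]      # must be non-zero
        taps = (0, 2, 3, 5)                  # illustrative tap set

        key       = lfsr_keystream(seed, taps, len(data))
        scrambled = xor_stream(data, key)
        recovered = xor_stream(scrambled, lfsr_keystream(seed, taps, len(data)))
        assert recovered == data
        print(scrambled)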

  2. Hour time-scale QPOs in the X-ray and radio emission of LS I +61°303

    Science.gov (United States)

    Nösel, S.; Sharma, R.; Massi, M.; Cimò, G.; Chernyakova, M.

    2018-05-01

    LS I +61°303 is an X-ray binary with a radio outburst every ˜27 d. Previous studies of the stellar system revealed radio microflares superimposed on the large radio outburst. We present here new radio observations of LS I +61°303 at 2.2 GHz with the Westerbork Synthesis Radio Telescope (WSRT). Using various timing analysis methods, we find significant quasi-periodic oscillations (QPOs) of 55 min stable over the duration of 4 d. We also use archival data obtained from the Suzaku satellite at X-ray wavelengths. We report here for the first time significant X-ray QPOs of about 2 h present over the time span of 21 h. We compare our results with the previously reported QPO observations and we conclude that the QPOs seem to be associated with the radio outburst, independent of the amplitude of the outburst. Finally, the different QPO time-scales are discussed in the context of magnetic reconnection.

  3. The spin-orbit interaction and SU(3) generators in superdeformation

    Energy Technology Data Exchange (ETDEWEB)

    Sugawara-Tanabe, K [School of Social Information, Otsuma Women's University, Tokyo (Japan)]; Arima, A [Tokyo Univ. (Japan). Dept. of Physics]

    1992-08-01

    The authors found that the effect of spin-orbit coupling becomes smaller for the parity doublet level and for some other levels around superdeformation. This is because of the strongly deformed quadrupole field, which indicates that the L-S coupling scheme is recovered for these levels. These levels can be described by an SU(3) group with eight generators and a Casimir operator. 6 refs., 3 figs.

  4. Fuzzy Pruning Based LS-SVM Modeling Development for a Fermentation Process

    Directory of Open Access Journals (Sweden)

    Weili Xiong

    2014-01-01

    Due to the complexity and uncertainty of microbial fermentation processes, data coming from the plants often contain outliers. However, these data may be treated as normal support vectors, which always deteriorates the performance of soft sensor modeling. Since the outliers also contaminate the correlation structure of the least squares support vector machine (LS-SVM), a fuzzy pruning method is provided to deal with the problem. Furthermore, by assigning different fuzzy membership scores to data samples, the sensitivity of the model to the outliers can be reduced greatly. The effectiveness and efficiency of the proposed approach are demonstrated through two numerical examples as well as a simulated case of a penicillin fermentation process.
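
    A hedged sketch of the underlying mechanism: in a weighted LS-SVM the dual solution comes from a single linear system, and per-sample fuzzy membership scores scale the regularisation term so that suspected outliers contribute less. The kernel, weights and data below are illustrative and not taken from the paper.

        # Minimal sketch of weighted LS-SVM regression: fuzzy membership scores v_i
        # down-weight suspected outliers by enlarging their regularisation term.
        import numpy as np

        def rbf_kernel(X, Y, sigma=1.0):
            d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2 * sigma**2))

        def weighted_lssvm_fit(X, y, v, gamma=10.0, sigma=1.0):
            n = len(y)
            K = rbf_kernel(X, X, sigma)
            A = np.zeros((n + 1, n + 1))
            A[0, 1:] = 1.0
            A[1:, 0] = 1.0
            A[1:, 1:] = K + np.diag(1.0 / (gamma * v))
            rhs = np.concatenate(([0.0], y))
            sol = np.linalg.solve(A, rhs)
            return sol[0], sol[1:]               # bias b, dual coefficients alpha

        def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
            return rbf_kernel(X_new, X_train, sigma) @ alpha + b

        rng = np.random.default_rng(0)
        X = rng.uniform(0, 6, size=(40, 1))
        y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=40)
        y[5] += 3.0                              # inject an outlier
        v = np.ones(40)
        v[5] = 0.01                              # fuzzy membership: down-weight the outlier
        b, alpha = weighted_lssvm_fit(X, y, v)
        print(lssvm_predict(X, alpha, b, np.array([[1.0], [3.0]])))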

  5. Migros-3: a code for the generation of group constants for reactor calculations from neutron nuclear data in KEDAK format

    International Nuclear Information System (INIS)

    Broeders, I.; Krieg, B.

    1977-01-01

    The code MIGROS-3 was developed from MIGROS-2. The main advantage of MIGROS-3 is its compatibility with the new conventions of the latest version of the Karlsruhe nuclear data library, KEDAK-3. Moreover, to some extent refined physical models were used and numerical methods were improved. MIGROS-3 allows the calculation of microscopic group cross sections of the ABBN type from isotopic neutron data given in KEDAK format. All group constants necessary for diffusion, consistent P1 and SN calculations can be generated. Anisotropy of elastic scattering can be taken into account up to P5. A description of the code and the underlying theory is given. The input and output description, a sample problem and the program lists are provided. (orig.)

  6. Remote-Handled Transuranic Content Codes

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions

    2006-12-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document describes the inventory of RH-TRU waste within the transportation parameters specified by the Remote-Handled Transuranic Waste Authorized Methods for Payload Control (RH-TRAMPAC).1 The RH-TRAMPAC defines the allowable payload for the RH-TRU 72-B. This document is a catalog of RH-TRU 72-B authorized contents by site. A content code is defined by the following components: • A two-letter site abbreviation that designates the physical location of the generated/stored waste (e.g., ID for Idaho National Laboratory [INL]). The site-specific letter designations for each of the sites are provided in Table 1. • A three-digit code that designates the physical and chemical form of the waste (e.g., content code 317 denotes TRU Metal Waste). For RH-TRU waste to be transported in the RH-TRU 72-B, the first number of this three-digit code is “3.” The second and third numbers of the three-digit code describe the physical and chemical form of the waste. Table 2 provides a brief description of each generic code. Content codes are further defined as subcodes by an alpha trailer after the three-digit code to allow segregation of wastes that differ in one or more parameter(s). For example, the alpha trailers of the subcodes ID 322A and ID 322B may be used to differentiate between waste packaging configurations. As detailed in the RH-TRAMPAC, compliance with flammable gas limits may be demonstrated through the evaluation of compliance with either a decay heat limit or flammable gas generation rate (FGGR) limit per container specified in approved content codes. As applicable, if a container meets the watt*year criteria specified by the RH-TRAMPAC, the decay heat limits based on the dose-dependent G value may be used as specified in an approved content code. If a site implements the administrative controls outlined in the RH-TRAMPAC and Appendix 2.4 of the RH-TRU Payload Appendices, the decay heat or FGGR

  7. Lifting scheme-based method for joint coding 3D stereo digital cinema with luminance correction and optimized prediction

    Science.gov (United States)

    Darazi, R.; Gouze, A.; Macq, B.

    2009-01-01

    Reproducing natural and real scenes as we see them in the real world every day is becoming more and more popular. Stereoscopic and multi-view techniques are used to this end. However, because more information is displayed, supporting technologies such as digital compression are required to ensure the storage and transmission of the sequences. In this paper, a new scheme for stereo image coding is proposed. The original left and right images are jointly coded. The main idea is to optimally exploit the existing correlation between the two images. This is done by the design of an efficient transform that reduces the existing redundancy in the stereo image pair. This approach was inspired by the Lifting Scheme (LS). The novelty of our work is that the prediction step is replaced by a hybrid step consisting of disparity compensation followed by luminance correction and an optimized prediction step. The proposed scheme can be used for lossless and for lossy coding. Experimental results show improvement in terms of performance and complexity compared to recently proposed methods.
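
    The split/predict/update structure being modified in this work can be illustrated with a one-level, one-dimensional integer lifting step; the sketch below is generic and does not include the disparity compensation or luminance correction of the proposed hybrid step.

        # Minimal sketch of a one-level 1-D lifting scheme (split, predict, update).
        # The paper replaces the predict step with disparity compensation plus
        # luminance correction; here a plain Haar-like predictor is used instead.

        def lifting_forward(signal):
            even = signal[0::2]
            odd  = signal[1::2]
            # Predict: each odd sample is predicted from its even neighbour.
            detail = [o - e for o, e in zip(odd, even)]
            # Update: even samples are adjusted to preserve the running average.
            approx = [e + d // 2 for e, d in zip(even, detail)]
            return approx, detail

        def lifting_inverse(approx, detail):
            even = [a - d // 2 for a, d in zip(approx, detail)]
            odd  = [d + e for d, e in zip(detail, even)]
            out = []
            for e, o in zip(even, odd):
                out.extend([e, o])
            return out

        x = [10, 12, 14, 13, 9, 8, 11, 15]       # even-length toy signal
        a, d = lifting_forward(x)
        assert lifting_inverse(a, d) == x        # the lifting steps are exactly invertible
        print(a, d)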

  8. R2E strategy and activities during LS1

    International Nuclear Information System (INIS)

    Perrot, A.L.

    2012-01-01

    The level of the flux of hadrons with energy in the multi MeV range expected from the collimation system at Point 7 and from the collisions at the interaction Points 1, 5 and 8 will induce Single Event Errors (SEEs) of the standard electronics present in the equipment located around these Points. Such events would perturb LHC operation. As a consequence, within the framework of the R2E (Radiation to Electronics) Mitigation Project, the sensitive equipment will be shielded or relocated to safer areas. These mitigation activities will be performed mainly during Long Shutdown 1 (LS1). About 15 groups (including equipment owners) will be involved in these activities with work periods from a few days to several months. Some of them will have to work in parallel in several LHC points. This document presents these mitigation activities with their associated planning, organization process, and main concerns as identified today. (author)

  9. ORIGEN-ARP 2.00, Isotope Generation and Depletion Code System-Matrix Exponential Method with GUI and Graphics Capability

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: ORIGEN-ARP was developed for the Nuclear Regulatory Commission and the Department of Energy to satisfy a need for an easy-to-use standardized method of isotope depletion/decay analysis for spent fuel, fissile material, and radioactive material. It can be used to solve for spent fuel characterization, isotopic inventory, radiation source terms, and decay heat. This release of ORIGEN-ARP is a standalone code package that contains an updated version of the SCALE-4.4a ORIGEN-S code. It contains a subset of the modules, data libraries, and miscellaneous utilities in SCALE-4.4a. This package is intended for users who do not need the entire SCALE package. ORIGEN-ARP 2.00 (2-12-2002) differs from the previous release ORIGEN-ARP 1.0 (July 2001) in the following ways: 1.The neutron source and energy spectrum routines were replaced with computational algorithms and data from the SOURCES-4B code (RSICC package CCC-661) to provide more accurate spontaneous fission and (alpha,n) neutron sources, and a delayed neutron source capability was added. 2.The printout of the fixed energy group structure photon tables was removed. Gamma sources and spectra are now printed for calculations using the Master Photon Library only. 2 - Methods: ORIGEN-ARP is an automated sequence to perform isotopic depletion / decay calculations using the ARP and ORIGEN-S codes of the SCALE system. The sequence includes the OrigenArp for Windows graphical user interface (GUI) that prepares input for ARP (Automated Rapid Processing) and ORIGEN-S. ARP automatically interpolates cross sections for the ORIGEN-S depletion/decay analysis using enrichment, burnup, and, optionally moderator density, from a set of libraries generated with the SCALE SAS2 depletion sequence. Library sets for four LWR fuel assembly designs (BWR 8 x 8, PWR 14 x 14, 15 x 15, 17 x 17) are included. The libraries span enrichments from 1.5 to 5 wt% U-235 and burnups of 0 to 60,000 MWD/MTU. Other
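
    For orientation, the matrix exponential method named in the package title amounts to solving dN/dt = A N as N(t) = exp(A t) N(0). The sketch below applies it to a made-up two-member decay chain; the half-lives and inventory are illustrative and have nothing to do with the ORIGEN-S data libraries.

        # Minimal sketch of the matrix exponential method for a two-member decay chain
        # A -> B -> (stable). Half-lives and initial inventory are illustrative only.
        import numpy as np
        from scipy.linalg import expm

        t_half_a, t_half_b = 5.0, 2.0          # days (illustrative)
        lam_a = np.log(2.0) / t_half_a
        lam_b = np.log(2.0) / t_half_b

        # dN/dt = A N, with N = [N_A, N_B]
        A = np.array([[-lam_a, 0.0],
                      [ lam_a, -lam_b]])

        N0 = np.array([1.0e6, 0.0])            # initial atoms
        for t in (0.0, 1.0, 5.0, 10.0):        # days
            N = expm(A * t) @ N0
            print(f"t = {t:5.1f} d   N_A = {N[0]:.3e}   N_B = {N[1]:.3e}")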

  10. Iterative optimization of performance libraries by hierarchical division of codes

    International Nuclear Information System (INIS)

    Donadio, S.

    2007-09-01

    The increasing complexity of hardware features incorporated in modern processors makes high performance code generation very challenging. Library generators such as ATLAS, FFTW and SPIRAL overcome this issue by empirically searching the space of possible program versions for the one that performs best. This thesis explores a fully automatic solution to adapt a compute-intensive application to the target architecture. By mimicking complex sequences of transformations useful to optimize real codes, we show that generative programming is a practical tool to implement a new hierarchical compilation approach for the generation of high performance code relying on the use of state-of-the-art compilers. As opposed to ATLAS, this approach is not application-dependent but can be applied to fairly generic loop structures. Our approach relies on the decomposition of the original loop nest into simpler kernels. These kernels are much simpler to optimize, and furthermore, using such codes makes the performance trade-off problem much simpler to express and to solve. Finally, we propose a new approach for the generation of performance libraries based on this decomposition method. We show that our method generates high-performance libraries, in particular for BLAS. (author)

  11. Performance Tuning of x86 OpenMP Codes with MAQAO

    Science.gov (United States)

    Barthou, Denis; Charif Rubial, Andres; Jalby, William; Koliai, Souad; Valensi, Cédric

    Failing to find the best optimization sequence for a given application code can lead to compiler-generated code with poor performance or inappropriate code. It is necessary to analyze performance from the generated assembly code to improve on the compilation process. This paper presents a tool for the performance analysis of multithreaded codes (OpenMP programs are supported at the moment). MAQAO relies on static performance evaluation to identify compiler optimizations and assess the performance of loops. It exploits static binary rewriting for reading and instrumenting object files or executables. Static binary instrumentation allows the insertion of probes at instruction level. Memory accesses can be captured to help tune the code, but such traces need to be compressed. MAQAO can analyze the results and provide hints for tuning the code. We show on some examples how this can help users improve their OpenMP applications.

  12. Development of a new generation solid rocket motor ignition computer code

    Science.gov (United States)

    Foster, Winfred A., Jr.; Jenkins, Rhonald M.; Ciucci, Alessandro; Johnson, Shelby D.

    1994-01-01

    This report presents the results of experimental and numerical investigations of the flow field in the head-end star grain slots of the Space Shuttle Solid Rocket Motor. This work provided the basis for the development of an improved solid rocket motor ignition transient code which is also described in this report. The correlation between the experimental and numerical results is excellent and provides a firm basis for the development of a fully three-dimensional solid rocket motor ignition transient computer code.

  13. CCFL in hot legs and steam generators and its prediction with the CATHARE code

    International Nuclear Information System (INIS)

    Geffraye, G.; Bazin, P.; Pichon, P.

    1995-01-01

    This paper presents a study about the Counter-Current Flow Limitation (CCFL) prediction in hot legs and steam generators (SG) in both system test facilities and pressurized water reactors. Experimental data are analyzed, particularly the recent MHYRESA test data. Geometrical and scale effects on the flooding behavior are shown. The CATHARE code modelling problems concerning the CCFL prediction are discussed. A method which gives the user the possibility of controlling the flooding limit at a given location is developed. In order to minimize the user effect, a methodology is proposed to the user in case of a calculation with a counter-current flow between the upper plenum and the SG U-tubes. The following questions have to be made clear for the user: when to use the CATHARE CCFL option, which correlation to use, and where to locate the flooding limit

  14. CCFL in hot legs and steam generators and its prediction with the CATHARE code

    Energy Technology Data Exchange (ETDEWEB)

    Geffraye, G.; Bazin, P.; Pichon, P. [CEA/DRN/STR, Grenoble (France)

    1995-09-01

    This paper presents a study about the Counter-Current Flow Limitation (CCFL) prediction in hot legs and steam generators (SG) in both system test facilities and pressurized water reactors. Experimental data are analyzed, particularly the recent MHYRESA test data. Geometrical and scale effects on the flooding behavior are shown. The CATHARE code modelling problems concerning the CCFL prediction are discussed. A method which gives the user the possibility of controlling the flooding limit at a given location is developed. In order to minimize the user effect, a methodology is proposed to the user in case of a calculation with a counter-current flow between the upper plenum and the SG U-tubes. The following questions have to be made clear for the user: when to use the CATHARE CCFL option, which correlation to use, and where to locate the flooding limit.

  15. RH-TRU Waste Content Codes

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions

    2007-07-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document describes the inventory of RH-TRU waste within the transportation parameters specified by the Remote-Handled Transuranic Waste Authorized Methods for Payload Control (RH-TRAMPAC).1 The RH-TRAMPAC defines the allowable payload for the RH-TRU 72-B. This document is a catalog of RH-TRU 72-B authorized contents by site. A content code is defined by the following components: • A two-letter site abbreviation that designates the physical location of the generated/stored waste (e.g., ID for Idaho National Laboratory [INL]). The site-specific letter designations for each of the sites are provided in Table 1. • A three-digit code that designates the physical and chemical form of the waste (e.g., content code 317 denotes TRU Metal Waste). For RH-TRU waste to be transported in the RH-TRU 72-B, the first number of this three-digit code is “3.” The second and third numbers of the three-digit code describe the physical and chemical form of the waste. Table 2 provides a brief description of each generic code. Content codes are further defined as subcodes by an alpha trailer after the three-digit code to allow segregation of wastes that differ in one or more parameter(s). For example, the alpha trailers of the subcodes ID 322A and ID 322B may be used to differentiate between waste packaging configurations. As detailed in the RH-TRAMPAC, compliance with flammable gas limits may be demonstrated through the evaluation of compliance with either a decay heat limit or flammable gas generation rate (FGGR) limit per container specified in approved content codes. As applicable, if a container meets the watt*year criteria specified by the RH-TRAMPAC, the decay heat limits based on the dose-dependent G value may be used as specified in an approved content code. If a site implements the administrative controls outlined in the RH-TRAMPAC and Appendix 2.4 of the RH-TRU Payload Appendices, the decay heat or FGGR

  16. Detailed study of spontaneous rotation generation in diverted H-mode plasma using the full-f gyrokinetic code XGC1

    Science.gov (United States)

    Seo, Janghoon; Chang, C. S.; Ku, S.; Kwon, J. M.; Yoon, E. S.

    2013-10-01

    The Full-f gyrokinetic code XGC1 is used to study the details of toroidal momentum generation in H-mode plasma. Diverted DIII-D geometry is used, with Monte Carlo neutral particles that are recycled at the limiter wall. Nonlinear Coulomb collisions conserve particle, momentum, and energy. Gyrokinetic ions and adiabatic electrons are used in the present simulation to include the effects from ion gyrokinetic turbulence and neoclassical physics, under self-consistent radial electric field generation. Ion orbit loss physics is automatically included. Simulations show a strong co-Ip flow in the H-mode layer at outside midplane, similarly to the experimental observation from DIII-D and ASDEX-U. The co-Ip flow in the edge propagates inward into core. It is found that the strong co-Ip flow generation is mostly from neoclassical physics. On the other hand, the inward momentum transport is from turbulence physics, consistently with the theory of residual stress from symmetry breaking. Therefore, interaction between the neoclassical and turbulence physics is a key factor in the spontaneous momentum generation.

  17. A gene encoding an abscisic acid biosynthetic enzyme (LsNCED4) collocates with the high temperature germination locus Htg6.1 in lettuce (Lactuca sp.).

    Science.gov (United States)

    Argyris, Jason; Truco, María José; Ochoa, Oswaldo; McHale, Leah; Dahal, Peetambar; Van Deynze, Allen; Michelmore, Richard W; Bradford, Kent J

    2011-01-01

    Thermoinhibition, or failure of seeds to germinate when imbibed at warm temperatures, can be a significant problem in lettuce (Lactuca sativa L.) production. The reliability of stand establishment would be improved by increasing the ability of lettuce seeds to germinate at high temperatures. Genes encoding germination- or dormancy-related proteins were mapped in a recombinant inbred line population derived from a cross between L. sativa cv. Salinas and L. serriola accession UC96US23. This revealed several candidate genes that are located in the genomic regions containing quantitative trait loci (QTLs) associated with temperature and light requirements for germination. In particular, LsNCED4, a temperature-regulated gene in the biosynthetic pathway for abscisic acid (ABA), a germination inhibitor, mapped to the center of a previously detected QTL for high temperature germination (Htg6.1) from UC96US23. Three sets of sister BC(3)S(2) near-isogenic lines (NILs) that were homozygous for the UC96US23 allele of LsNCED4 at Htg6.1 were developed by backcrossing to cv. Salinas and marker-assisted selection followed by selfing. The maximum temperature for germination of NIL seed lots with the UC96US23 allele at LsNCED4 was increased by 2-3°C when compared with sister NIL seed lots lacking the introgression. In addition, the expression of LsNCED4 was two- to threefold lower in the former NIL lines as compared to expression in the latter. Together, these data strongly implicate LsNCED4 as the candidate gene responsible for the Htg6.1 phenotype and indicate that decreased ABA biosynthesis at high imbibition temperatures is a major factor responsible for the increased germination thermotolerance of UC96US23 seeds.

  18. Improving system modeling accuracy with Monte Carlo codes

    International Nuclear Information System (INIS)

    Johnson, A.S.

    1996-01-01

    The use of computer codes based on Monte Carlo methods to perform criticality calculations has become common-place. Although results frequently published in the literature report calculated k-eff values to four decimal places, people who use the codes in their everyday work say that they only believe the first two decimal places of any result. The lack of confidence in the computed k-eff values may be due to the tendency of the reported standard deviation to underestimate errors associated with the Monte Carlo process. The standard deviation as reported by the codes is the standard deviation of the mean of the k-eff values for individual generations in the computer simulation, not the standard deviation of the computed k-eff value compared with the physical system. A more subtle problem with the standard deviation of the mean as reported by the codes is that all the k-eff values from the separate generations are not statistically independent, since the k-eff of a given generation is a function of the k-eff of the previous generation, which is ultimately based on the starting source. To produce a standard deviation that is more representative of the physical system, statistically independent values of k-eff are needed.
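
    The point about correlated generations can be illustrated numerically; in the hedged sketch below an AR(1) series stands in for per-generation k-eff estimates, and the naive standard deviation of the mean computed from a single run understates the spread of the mean observed over many independent replicas. All numbers are illustrative.

        # Illustrative sketch: why the per-generation standard deviation of the mean
        # can understate the true uncertainty when successive generations are correlated.
        import numpy as np

        rng = np.random.default_rng(1)
        n_gen, rho, n_rep = 500, 0.8, 2000

        def correlated_run(n, rho):
            """AR(1) series standing in for per-generation k-eff estimates around 1.0."""
            x = np.empty(n)
            x[0] = rng.normal()
            for i in range(1, n):
                x[i] = rho * x[i - 1] + np.sqrt(1 - rho**2) * rng.normal()
            return 1.0 + 0.005 * x

        means = np.array([correlated_run(n_gen, rho).mean() for _ in range(n_rep)])

        one_run = correlated_run(n_gen, rho)
        naive_sigma = one_run.std(ddof=1) / np.sqrt(n_gen)   # what a code would report
        true_sigma  = means.std(ddof=1)                      # spread over independent replicas

        print(f"naive sigma of the mean:    {naive_sigma:.2e}")
        print(f"observed sigma of the mean: {true_sigma:.2e}")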

  19. The generation of absorbed dose profiles of proton beam in water using Geant4 code

    International Nuclear Information System (INIS)

    Christovao, Marilia T.; Campos, Tarcisio Passos R. de

    2007-01-01

    The present article describes simulations of proton beam radiation therapy, using an application based on the GEANT4 code, with OpenGL as a visualization driver and JAS3 (Java Analysis Studio) as the data analysis tool, implementing the AIDA interfaces. Proton radiotherapy is suited to treating cancerous or benign tumors that are close to sensitive structures, since it allows precise irradiation of the target with high doses while the healthy tissues adjacent to vital organs and tissues are preserved, due to the physical properties of the dose profile. GEANT4 is a toolkit for simulating the transport of particles through matter in complex geometries. Taking advantage of the object-oriented design, the user can adapt or extend the tool in every domain, due to the flexibility of the code, which provides groups of subroutines for the definition of materials, geometries and particle properties in agreement with the user's needs for the Monte Carlo simulation. In this paper, the beam line parameters used in the simulation include adjustable elements such as the range shifter (composition and dimensions) and the beam (energy, intensity, length), according to the physics processes applied. The simulation results are depth-dose profiles in water for various incident beam energies. Starting from those profiles, one can define appropriate conditions for proton radiotherapy of the ocular region. (author)

  20. Investigation on the MOC with a linear source approximation scheme in three-dimensional assembly

    International Nuclear Information System (INIS)

    Zhu, Chenglin; Cao, Xinrong

    2014-01-01

    The method of characteristics (MOC) for solving the neutron transport equation has become one of the fundamental methods for lattice calculations in nuclear design code systems. At present, MOC has three schemes to deal with the neutron source of the transport equation: the flat source approximation of the step characteristics (SC) scheme, the diamond difference (DD) scheme and the linear source (LS) characteristics scheme. The SC and DD schemes need large storage space and long computing times when they are used to calculate large-scale three-dimensional neutron transport problems. In this paper, an LS scheme and its correction for negative source distributions were developed and added to the DRAGON code. This new scheme was compared with the SC scheme and the DD scheme, which had already been applied in this code. As an open source code, DRAGON can solve three-dimensional assemblies with the MOC method. Detailed calculations are conducted for a two-dimensional VVER-1000 assembly under the three MOC schemes. The numerical results indicate that a coarser mesh can be used with the LS scheme at the same accuracy, and that the LS scheme applied in DRAGON is effective and achieves the expected results. Then a three-dimensional cell problem and the VVER-1000 assembly are calculated with the LS scheme and the SC scheme. The results show that the LS scheme requires less memory and shorter computational times than the SC scheme. It is concluded that by using the LS scheme, DRAGON is able to calculate large-scale three-dimensional problems with less storage space and shorter computing times.
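
    As a single-segment illustration of the difference between the flat source (SC) and linear source (LS) treatments along one characteristic track (not the DRAGON implementation; the cross section, segment length and source values below are made up):

        # Illustrative one-segment comparison of the flat (step characteristics, SC) and
        # linear source (LS) treatments along a single MOC track.
        import numpy as np
        from scipy.integrate import quad

        sigma_t, L, psi_in = 0.6, 2.0, 1.0   # total XS [1/cm], segment length [cm], incoming flux
        q0, q1 = 0.8, 0.3                    # source value and slope along the track (illustrative)
        tau = sigma_t * L

        def outgoing_flux_numerical(source):
            """psi(L) = psi_in * exp(-tau) + integral_0^L q(s) exp(-sigma_t (L - s)) ds."""
            integral, _ = quad(lambda s: source(s) * np.exp(-sigma_t * (L - s)), 0.0, L)
            return psi_in * np.exp(-tau) + integral

        # SC scheme: the source is flat over the segment.
        psi_sc = psi_in * np.exp(-tau) + (q0 / sigma_t) * (1.0 - np.exp(-tau))

        # LS scheme: q(s) = q0 + q1*s, integrated analytically over the segment.
        psi_ls = (psi_in * np.exp(-tau)
                  + (q0 + q1 * L) * (1.0 - np.exp(-tau)) / sigma_t
                  - q1 * (1.0 - (1.0 + tau) * np.exp(-tau)) / sigma_t**2)

        print("SC:", psi_sc, "check:", outgoing_flux_numerical(lambda s: q0))
        print("LS:", psi_ls, "check:", outgoing_flux_numerical(lambda s: q0 + q1 * s))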

  1. Benchmark problems for radiological assessment codes. Final report

    International Nuclear Information System (INIS)

    Mills, M.; Vogt, D.; Mann, B.

    1983-09-01

    This report describes benchmark problems to test computer codes used in the radiological assessment of high-level waste repositories. The problems presented in this report will test two types of codes. The first type of code calculates the time-dependent heat generation and radionuclide inventory associated with a high-level waste package. Five problems have been specified for this code type. The second code type addressed in this report involves the calculation of radionuclide transport and dose-to-man. For these codes, a comprehensive problem and two subproblems have been designed to test the relevant capabilities of these codes for assessing a high-level waste repository setting

  2. QuBiLS-MIDAS: a parallel free-software for molecular descriptors computation based on multilinear algebraic maps.

    Science.gov (United States)

    García-Jacas, César R; Marrero-Ponce, Yovani; Acevedo-Martínez, Liesner; Barigye, Stephen J; Valdés-Martiní, José R; Contreras-Torres, Ernesto

    2014-07-05

    The present report introduces the QuBiLS-MIDAS software, belonging to the ToMoCoMD-CARDD suite, for the calculation of three-dimensional molecular descriptors (MDs) based on two-linear (bilinear), three-linear, and four-linear (multilinear or N-linear) algebraic forms. Thus, it is unique software that computes these tensor-based indices. These descriptors establish relations for two, three, and four atoms by using several (dis-)similarity metrics or multimetrics, matrix transformations, cutoffs, local calculations and aggregation operators. The theoretical background of these N-linear indices is also presented. The QuBiLS-MIDAS software was developed in the Java programming language and employs the Chemistry Development Kit library for the manipulation of chemical structures and the calculation of atomic properties. The software comprises a user-friendly desktop interface and an Abstract Programming Interface library. The former was created to simplify the configuration of the different options of the MDs, whereas the library was designed to allow its easy integration into other software for chemoinformatics applications. This program provides functionalities for data cleaning tasks and for batch processing of the molecular indices. In addition, it offers parallel calculation of the MDs through the use of all available processors in current computers. The complexity studies of the main algorithms demonstrate that they were implemented efficiently with respect to a trivial implementation. Lastly, the performance tests reveal that the software behaves well when the number of processors is increased. Therefore, the QuBiLS-MIDAS software constitutes a useful application for the computation of molecular indices based on N-linear algebraic maps, and it can be used freely to perform chemoinformatics studies. Copyright © 2014 Wiley Periodicals, Inc.
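
    To give a feel for what a two-linear (bilinear) form over atomic properties looks like, the sketch below evaluates x^T M y for a three-atom toy molecule; the matrix and property values are invented for illustration and are not QuBiLS-MIDAS output.

        # Illustrative bilinear (two-linear) algebraic form over atomic properties:
        # index = x^T M y, where M is a molecular matrix (here a toy topological
        # distance matrix) and x, y are vectors of atomic properties.
        import numpy as np

        M = np.array([[0, 1, 2],
                      [1, 0, 1],
                      [2, 1, 0]], dtype=float)   # toy topological distance matrix
        x = np.array([2.55, 3.44, 2.20])          # property 1 per atom (illustrative values)
        y = np.array([12.0, 16.0, 1.0])           # property 2 per atom (illustrative values)

        bilinear_index = x @ M @ y
        print(bilinear_index)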

  3. Performance of JPEG Image Transmission Using Proposed Asymmetric Turbo Code

    Directory of Open Access Journals (Sweden)

    Siddiqi Mohammad Umar

    2007-01-01

    Full Text Available This paper gives the results of a simulation study on the performance of JPEG image transmission over AWGN and Rayleigh fading channels using typical and proposed asymmetric turbo codes for error control coding. The baseline JPEG algorithm is used to compress a QCIF ("Suzie") image. A recursive systematic convolutional (RSC) encoder with generator polynomials, that is, (13/11) in decimal, and a 3G interleaver are used for the typical WCDMA and CDMA2000 turbo codes. The proposed asymmetric turbo code uses generator polynomials, that is, (13/11; 13/9) in decimal, and a code-matched interleaver. The effect of the interleaver in the proposed asymmetric turbo code is studied using weight distribution and simulation. The simulation results and the performance bound for the proposed asymmetric turbo code, for the chosen frame length and code rate with a Log-MAP decoder over an AWGN channel, are compared with the typical system. From the simulation results, it is observed that image transmission using the proposed asymmetric turbo code performs better than that with the typical system.
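
    To make the generator-polynomial notation above concrete, the following sketch (Python, illustrative only; the MSB-first bit convention and the absence of trellis termination are assumptions of this example, not details from the paper) encodes a bit stream with a rate-1/2 recursive systematic convolutional encoder whose feedforward/feedback polynomials are 13/11 in decimal.

        def poly_bits(p, memory):
            """MSB-first coefficient list of an integer polynomial, length memory+1."""
            return [(p >> (memory - k)) & 1 for k in range(memory + 1)]

        def rsc_encode(bits, g_ff=13, g_fb=11, memory=3):
            """Rate-1/2 recursive systematic convolutional (RSC) encoder."""
            ff = poly_bits(g_ff, memory)   # 13 -> 1 + D + D^3 (feedforward)
            fb = poly_bits(g_fb, memory)   # 11 -> 1 + D^2 + D^3 (feedback)
            state = [0] * memory           # shift register: a(k-1), ..., a(k-m)
            systematic, parity = [], []
            for u in bits:
                a = u                      # feedback sum defines the register input
                for j in range(1, memory + 1):
                    a ^= fb[j] & state[j - 1]
                p = ff[0] & a              # parity from the feedforward taps
                for j in range(1, memory + 1):
                    p ^= ff[j] & state[j - 1]
                systematic.append(u)
                parity.append(p)
                state = [a] + state[:-1]   # shift the register
            return systematic, parity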

  4. Impact of Convection and Long Range Transport on Short-Lived Trace Gases in the UT/LS

    Science.gov (United States)

    Atlas, E. L.; Schauffler, S.; Navarro, M. A.; Lueb, R.; Hendershot, R.; Ueyama, R.

    2017-12-01

    The chemical composition of air in the upper troposphere/lower stratosphere is controlled by a balance of transport, photochemistry, and physical processes such as interactions with clouds, ice, and aerosol. The chemistry of the air masses that reach the upper troposphere can have profound impacts on the chemistry of the near-tropopause region. For example, the transport of reactive organic halogens and their transformation to inorganic halogen species, e.g., Br, BrO, etc., can have a significant impact on ozone budgets in this region and even deeper into the stratosphere. Trace gas measurements in the region near the tropopause can also indicate potential sources of surface emissions that are transported to high altitudes. Measurements of trace gases, including compounds such as non-methane hydrocarbons, hydrochlorofluorocarbons, halogenated solvents, methyl halides, etc., can be used to characterize source emissions of industrial, urban, biomass burning, or marine origin. Recent airborne research campaigns have been conducted to better characterize the chemical composition and its variations in the UT/LS region. This presentation discusses these measurements, with special emphasis on the role of convection and transport in modifying the chemical composition of the UT/LS.

  5. Bi-level image compression with tree coding

    DEFF Research Database (Denmark)

    Martins, Bo; Forchhammer, Søren

    1996-01-01

    Presently, tree coders are the best bi-level image coders. The current ISO standard, JBIG, is a good example. By organising code length calculations properly a vast number of possible models (trees) can be investigated within reasonable time prior to generating code. Three general-purpose coders...... are constructed by this principle. A multi-pass free tree coding scheme produces superior compression results for all test images. A multi-pass fast free template coding scheme produces much better results than JBIG for difficult images, such as halftonings. Rissanen's algorithm `Context' is presented in a new...
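
    The code-length bookkeeping that makes such a model search feasible can be illustrated with a hedged sketch: the ideal adaptive (Krichevsky-Trofimov) code length of a binary pixel stream is accumulated per context, so many candidate context models can be scored without producing any actual code. This is only an illustration of the principle, not the JBIG or free-tree algorithm itself.

        import math
        from collections import defaultdict

        def kt_code_length(pixels, contexts):
            """Ideal adaptive code length (in bits) of a 0/1 pixel stream under a
            given context mapping, using the Krichevsky-Trofimov (add-1/2) estimator.
            contexts[i] is the model context assigned to pixel i."""
            counts = defaultdict(lambda: [0, 0])      # per-context (zeros, ones)
            bits = 0.0
            for x, c in zip(pixels, contexts):
                n0, n1 = counts[c]
                p = (counts[c][x] + 0.5) / (n0 + n1 + 1.0)
                bits += -math.log2(p)
                counts[c][x] += 1
            return bits

    Comparing this total for different candidate context templates (trees) lets a multi-pass coder select the best model before the final coding pass.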

  6. Effects of grit roughness and pitch oscillations on the LS(1)-0417MOD airfoil

    Energy Technology Data Exchange (ETDEWEB)

    Janiszewska, J.M.; Ramsay, R.R.; Hoffman, M.J.; Gregorek, G.M. [Ohio State Univ., Columbus, OH (United States)

    1996-01-01

    Horizontal axis wind turbine rotors experience unsteady aerodynamics due to wind shear when the rotor is yawed, when rotor blades pass through the support tower wake, and when the wind is gusting. An understanding of this unsteady behavior is necessary to assist in the calculation of rotor performance and loads. The rotors also experience performance degradation caused by surface roughness. These surface irregularities are due to the accumulation of insect debris, ice, and/or the aging process. Wind tunnel studies that examine both the steady and unsteady behavior of airfoils can help define pertinent flow phenomena, and the resultant data can be used to validate analytical computer codes. An LS(1)-0417MOD airfoil model was tested in The Ohio State University Aeronautical and Astronautical Research Laboratory (OSU/AARL) 3×5 subsonic wind tunnel (3×5) under steady flow and stationary model conditions, as well as with the model undergoing pitch oscillations. To study the possible extent of performance loss due to surface roughness, a standard grit pattern (LEGR) was used to simulate leading edge contamination. After baseline cases were completed, the LEGR was applied for both steady-state and model pitch oscillation cases. The Reynolds numbers for steady-state conditions were 0.75, 1, 1.25, and 1.5 million, while the angle of attack ranged from −20° to +40°. With the model undergoing pitch oscillations, data were acquired at Reynolds numbers of 0.75, 1, 1.25, and 1.5 million, at frequencies of 0.6, 1.2, and 1.8 Hz. Two sine wave forcing functions were used, ±5.5° and ±10°, at mean angles of attack of 8°, 14°, and 20°. For purposes herein, any reference to unsteady conditions means the airfoil model was in pitch oscillation about the quarter chord.

  7. All you need is shape: Predicting shear banding in sand with LS-DEM

    Science.gov (United States)

    Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.

    2018-02-01

    This paper presents discrete element method (DEM) simulations with experimental comparisons at multiple length scales, underscoring the crucial role of particle shape. The simulations build on technological advances in the DEM furnished by level sets (LS-DEM), which enable the mathematical representation of the surface of arbitrarily shaped particles such as grains of sand. We show that this ability to model shape enables unprecedented capture of the mechanics of granular materials across scales ranging from macroscopic behavior to local behavior to particle behavior. Specifically, the model is able to predict the onset and evolution of shear banding in sands, replicating the most advanced high-fidelity experiments in triaxial compression equipped with sequential X-ray tomography imaging. We present comparisons of the model and experiment at an unprecedented level of quantitative agreement, building a one-to-one model where every particle in the more than 53,000-particle array has its own avatar or numerical twin. Furthermore, the boundary conditions of the experiment are faithfully captured by modeling the membrane effect as well as the platen displacement and tilting. The results show a computational tool that can give insight into the physics and mechanics of granular materials undergoing shear deformation and failure, with computational times comparable to those of the experiment. One quantitative measure extracted from the LS-DEM simulations that is currently not available experimentally is the evolution of three-dimensional force chains inside and outside of the shear band. We show that the rotations of the force chains are correlated with the rotations of the stress principal directions.
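
    The geometric kernel of LS-DEM, checking the boundary nodes of one particle against the level-set (signed distance) grid of another, can be sketched as below. This is a simplified two-dimensional illustration with hypothetical array conventions, not the authors' implementation.

        import numpy as np

        def sdf_value(sdf, h, point):
            """Bilinearly interpolate a 2-D signed distance field at a local point.
            sdf[i, j] holds the distance at grid node (i*h, j*h); negative values
            lie inside the particle surface."""
            x, y = point[0] / h, point[1] / h
            i, j = int(np.floor(x)), int(np.floor(y))
            fx, fy = x - i, y - j
            return ((1 - fx) * (1 - fy) * sdf[i, j] + fx * (1 - fy) * sdf[i + 1, j] +
                    (1 - fx) * fy * sdf[i, j + 1] + fx * fy * sdf[i + 1, j + 1])

        def contacts(sdf, h, boundary_nodes):
            """Return (node index, penetration depth) for every boundary node of a
            neighbouring particle that penetrates this particle's level set."""
            hits = []
            for k, p in enumerate(boundary_nodes):
                d = sdf_value(sdf, h, p)
                if d < 0.0:
                    hits.append((k, -d))
            return hits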

  8. SWAT3.1 - the integrated burnup code system driving continuous energy Monte Carlo codes MVP and MCNP

    International Nuclear Information System (INIS)

    Suyama, Kenya; Mochizuki, Hiroki; Takada, Tomoyuki; Ryufuku, Susumu; Okuno, Hiroshi; Murazaki, Minoru; Ohkubo, Kiyoshi

    2009-05-01

    The integrated burnup calculation code system SWAT combines the neutronics calculation code SRAC, which is widely used in Japan, with the point burnup calculation code ORIGEN2. It has been used to evaluate the composition of uranium, plutonium, minor actinides and fission products in spent nuclear fuel. Based on this idea, the integrated burnup calculation code system SWAT3.1 was developed by combining the continuous-energy Monte Carlo codes MVP and MCNP with ORIGEN2. This enables arbitrary fuel geometries to be treated and the effective cross section data used in the burnup calculation to be generated with few approximations. This report describes the outline, input data instructions and several calculation examples. (author)

  9. Aztheca Code; Codigo Aztheca

    Energy Technology Data Exchange (ETDEWEB)

    Quezada G, S.; Espinosa P, G. [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Ciudad de Mexico (Mexico); Centeno P, J.; Sanchez M, H., E-mail: sequga@gmail.com [UNAM, Facultad de Ingenieria, Ciudad Universitaria, Circuito Exterior s/n, 04510 Ciudad de Mexico (Mexico)

    2017-09-15

    This paper presents the Aztheca code, which is formed by mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated against plant data, as well as against predictions from the manufacturer when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and with the Aztheca model. The results show that both RELAP-5 and the Aztheca code are able to adequately predict the behavior of the reactor. (Author)

  10. Spatial-Aided Low-Delay Wyner-Ziv Video Coding

    Directory of Open Access Journals (Sweden)

    Bo Wu

    2009-01-01

    Full Text Available In distributed video coding, the side information (SI) quality plays an important role in Wyner-Ziv (WZ) frame coding. Usually, SI is generated at the decoder by motion-compensated interpolation (MCI) from the past and future key frames, under the assumption that the motion trajectory between adjacent frames is translational with constant velocity. However, this assumption is not always true, and thus the coding efficiency of WZ coding is often unsatisfactory in video with high and/or irregular motion. This situation becomes more serious in low-delay applications, since only motion-compensated extrapolation (MCE) can be applied to yield SI. In this paper, a spatial-aided Wyner-Ziv video coding (SA-WZVC) scheme for low-delay applications is proposed. In SA-WZVC, at the encoder, each WZ frame is coded as in the existing common Wyner-Ziv video coding scheme, and meanwhile the auxiliary information is also coded with low-complexity DPCM. At the decoder, for WZ frame decoding, the auxiliary information is decoded first and then SI is generated with the help of this auxiliary information by spatial-aided motion-compensated extrapolation (SA-MCE). Theoretical analysis proves that when a good tradeoff between auxiliary information coding and WZ frame coding is achieved, SA-WZVC achieves better rate-distortion performance than conventional MCE-based WZVC without auxiliary information. Experimental results also demonstrate that SA-WZVC can efficiently improve the coding performance of WZVC in low-delay applications.

  11. Library of neutron cross sections of the Thermos code

    International Nuclear Information System (INIS)

    Alonso V, G.; Hernandez L, H.

    1991-10-01

    The present work is a complement to report IT.SN/DFR-017, in which the structure and generation of the library of the Thermos code are described. This report presents a comparison between the cross-section values in the current library of the Thermos code and those generated by means of ENDF-B/NJOY. (Author)

  12. Higher-Order Program Generation

    DEFF Research Database (Denmark)

    Rhiger, Morten

    This dissertation addresses the challenges of embedding programming languages, specializing generic programs to specific parameters, and generating specialized instances of programs directly as executable code. Our main tools are higher-order programming techniques and automatic program generation. It is our thesis that they synergize well in the development of customizable software. Recent research on domain-specific languages proposes to embed them into existing general-purpose languages. Typed higher-order languages have proven especially useful as meta languages because they provide a rich...... for OCaml, a dialect of ML, that provides run-time code generation for OCaml programs. We apply these byte-code combinators in semantics-directed compilation for an imperative language and in run-time specialization using type-directed partial evaluation. Finally, we present an approach to compiling goal......

  13. Tokamak plasma power balance calculation code (TPC code) outline and operation manual

    International Nuclear Information System (INIS)

    Fujieda, Hirobumi; Murakami, Yoshiki; Sugihara, Masayoshi.

    1992-11-01

    This report is a detailed description of the TPC code, which calculates the power balance of a tokamak plasma according to the ITER guidelines. The TPC code runs on a personal computer (Macintosh or J-3100/IBM-PC). Using input data such as the plasma shape, toroidal magnetic field, plasma current, electron temperature, electron density, impurities and heating power, the TPC code can determine the operation point of the fusion reactor (the ion temperature is assumed to be equal to the electron temperature). Supplied flux (volt·seconds) and burn time are also estimated from the coil design parameters. The calculated energy confinement time is compared with various L-mode scaling laws and the confinement enhancement factor (H-factor) is evaluated. The divertor heat load is predicted using simple scaling models (constant-χ, Bohm-type-χ and JT-60U empirical scaling models). Frequently used data can be stored in a 'device file' and used as default values. The TPC code can generate 2-D mesh data, and the POPCON plot is drawn by a contour-line plotting program (CONPLT). The operation manual for the CONPLT code is also included. (author)
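
    The confinement bookkeeping described above can be illustrated with a short sketch: the experimental confinement time (stored energy over loss power) is compared with an L-mode scaling law to give the H-factor. The ITER89-P expression below is quoted from memory and the numerical inputs are purely illustrative, so neither should be read as the TPC implementation.

        def tau_iter89p(Ip_MA, R_m, a_m, kappa, n20, Bt_T, A_ion, P_MW):
            """ITER89-P L-mode energy confinement time in seconds (units: MA, m,
            10^20 m^-3, T, amu, MW)."""
            return (0.048 * Ip_MA**0.85 * R_m**1.2 * a_m**0.3 * kappa**0.5
                    * n20**0.1 * Bt_T**0.2 * A_ion**0.5 * P_MW**-0.5)

        def h_factor(W_MJ, P_loss_MW, tau_scaling_s):
            """Confinement enhancement factor: experimental tau_E over the scaling value."""
            tau_exp = W_MJ / P_loss_MW            # tau_E = stored energy / loss power
            return tau_exp / tau_scaling_s

        # Illustrative operating point (numbers are not from the report)
        tau_L = tau_iter89p(Ip_MA=15.0, R_m=6.2, a_m=2.0, kappa=1.7,
                            n20=1.0, Bt_T=5.3, A_ion=2.5, P_MW=100.0)
        print(f"tau_E(ITER89-P) = {tau_L:.2f} s, H = {h_factor(350.0, 100.0, tau_L):.2f}")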

  14. Information rates of next-generation long-haul optical fiber systems using coded modulation

    NARCIS (Netherlands)

    Liga, G.; Alvarado, A.; Agrell, E.; Bayvel, P.

    2017-01-01

    A comprehensive study of the coded performance of long-haul spectrally-efficient WDM optical fiber transmission systems with different coded modulation decoding structures is presented. Achievable information rates are derived for three different square QAM formats and the optimal format is

  15. Gene expression profiling upon 212Pb-TCMC-trastuzumab treatment in the LS-174T i.p. xenograft model

    International Nuclear Information System (INIS)

    Yong, Kwon J; Milenic, Diane E; Baidoo, Kwamena E; Kim, Young-Seung; Brechbiel, Martin W

    2013-01-01

    Recent studies have demonstrated that therapy with 212Pb-TCMC-trastuzumab resulted in (1) induction of apoptosis, (2) G2/M arrest, and (3) blockage of double-strand DNA damage repair in LS-174T i.p. (intraperitoneal) xenografts. To further understand the molecular basis of the cell killing efficacy of 212Pb-TCMC-trastuzumab, gene expression profiling was performed with LS-174T xenografts 24 h after exposure to 212Pb-TCMC-trastuzumab. DNA damage response genes (84) were screened using a quantitative real-time polymerase chain reaction array (qRT-PCR array). Differentially regulated genes were identified following exposure to 212Pb-TCMC-trastuzumab. These included genes involved in apoptosis (ABL, GADD45α, GADD45γ, PCBP4, and p73), cell cycle (ATM, DDIT3, GADD45α, GTSE1, MKK6, PCBP4, and SESN1), and damaged DNA binding (DDB) and repair (ATM and BTG2). The stressful growth arrest conditions provoked by 212Pb-TCMC-trastuzumab were found to induce genes involved in apoptosis and cell cycle arrest in the G2/M phase. The expression of genes involved in DDB and single-strand DNA breaks was also enhanced by 212Pb-TCMC-trastuzumab, while no modulation of genes involved in double-strand break repair was apparent. Furthermore, the p73/GADD45 signaling pathway mediated by p38 kinase signaling may be involved in the cellular response, as evidenced by the enhanced expression of genes and proteins of this pathway. These results further support the previously described cell killing mechanism of 212Pb-TCMC-trastuzumab in the same LS-174T i.p. xenograft. Insight into these mechanisms could lead to improved strategies for rational application of radioimmunotherapy using α-particle emitters. The apoptotic response and associated gene modulations have not been clearly defined following exposure of cells to α-particle radioimmunotherapy (RIT). Gene expression profiling was performed with LS-174T i.p. (intraperitoneal) xenografts after exposure to 212Pb

  16. An effective parameter optimization technique for vibration flow field characterization of PP melts via LS-SVM combined with SALS in an electromagnetism dynamic extruder

    Science.gov (United States)

    Xian, Guangming

    2018-03-01

    A method for predicting the optimal vibration field parameters by least squares support vector machine (LS-SVM) is presented in this paper. One convenient and commonly used technique for characterizing the vibration flow field of polymer melt films is small angle light scattering (SALS) in a visualized slit die of the electromagnetism dynamic extruder. The optimal values of the vibration frequency, the vibration amplitude, and the maximum light intensity projection area can be obtained by using LS-SVM for prediction. To illustrate this method and show its validity, polypropylene (PP) is used as the flowing material, and fifteen samples are tested at a screw rotation speed of 36 rpm. This paper first describes the SALS apparatus used to perform the experiments, then gives the theoretical basis of the new method, and finally details the experimental results for parameter prediction of the vibration flow field. It is demonstrated that it is possible to use the SALS method and obtain detailed information on the optimal parameters of the vibration flow field of PP melts by LS-SVM.
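
    A minimal sketch of the LS-SVM regression used for the parameter prediction is given below (Python/NumPy): training reduces to solving one linear system built from a kernel matrix, after which new vibration-field conditions can be evaluated. The RBF kernel and the hyperparameter values are assumptions of this sketch, not settings reported in the paper.

        import numpy as np

        def lssvm_train(X, y, gamma=10.0, sigma=1.0):
            """Solve the LS-SVM regression KKT system for weights alpha and bias b."""
            n = X.shape[0]
            d2 = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
            K = np.exp(-d2 / (2.0 * sigma**2))          # RBF kernel matrix
            A = np.zeros((n + 1, n + 1))
            A[0, 1:] = 1.0
            A[1:, 0] = 1.0
            A[1:, 1:] = K + np.eye(n) / gamma           # regularised kernel block
            rhs = np.concatenate(([0.0], y))
            sol = np.linalg.solve(A, rhs)
            return sol[1:], sol[0], X, sigma            # alpha, b, training data, width

        def lssvm_predict(model, Xnew):
            alpha, b, Xtr, sigma = model
            d2 = np.sum((Xnew[:, None, :] - Xtr[None, :, :])**2, axis=-1)
            return np.exp(-d2 / (2.0 * sigma**2)) @ alpha + b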

  17. LS1 Report: the electric atmosphere of the LHC

    CERN Multimedia

    Simon Baird

    2013-01-01

    In the LHC, testing of the main magnet (dipole and quadrupole) circuits has been completed. At the same time, the extensive tests of all the other circuits up to current levels corresponding to 7 TeV beam operation have been performed, and now the final ElQA (Electrical Quality Assurance) tests of the electrical circuits are proceeding.   In Sectors 4-5 and 5-6, where the ElQA checks have been finished, the process of removing and storing the helium has started (see the article Heatwave warning for the LHC, in this issue). This is the first step in warming up the whole machine to room temperature so that the main LS1 activities, SMACC (Super Conducting Magnet and Circuit Consolidation) and the R2E (Radiation Two Electronics) programmes, which are scheduled to start on 19 April and 22 March respectively, can get under way. As far as the LHC injectors are concerned, LINAC2 and the PS Booster are in shutdown mode, having completed their preparatory hardware test programmes, and shutdown work has alr...

  18. LS1 Report: antimatter research on the starting blocks

    CERN Multimedia

    Antonella Del Rosso

    2014-01-01

    The consolidation work at the Antiproton Decelerator (AD) has been very intensive and the operators now have a basically new machine to “drive”. Thanks to the accurate preparation work still ongoing, the machine will soon deliver its first beam of antiprotons to the experiments. The renewed efficiency of the whole complex will ensure the best performance of the whole of CERN’s antimatter research programme in the long term.   The test bench for the new Magnetic Horn stripline. On the left, high voltage cables are connected to the stripline, which then feeds a 6 kV 400 kA pulse to the Horn. The Horn itself (the cylindrical object on the right) can be seen mounted on its chariot. The consolidation programme at the AD planned during LS1 has involved some of the most vital parts of the decelerator such as the target area, the ring magnets, the stochastic cooling system, vacuum system, control system and various aspects of the instrumentation. In addit...

  19. LS1 Report: on the home straight in 2014

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    At 7.24 a.m. on 14 February 2013 the last beams for physics were absorbed into the LHC, marking the end of Run 1. The achievements since then over the first ten months of LS1 have been remarkable. The excellent progress of the maintenance work on CERN's accelerators, which is overwhelmingly on schedule – and even ahead of schedule in some cases! – was praised by the CERN Council last week.   That being said, there is still a long way to go before the LHC re-start, with many challenges and potential pitfalls to be overcome. An overview of what still lies ahead: For the injectors (Linac 2, PS booster, LEIR, PS and AD), 2014 will begin with the recommissioning of all the access systems (scheduled for mid-February). The first power tests (to check the magnets and the power converters) will follow hot on its heels, starting in early April in the case of the PS booster. The final power tests of the injectors will be carried out at the Antiproton Decelerator in June. Th...

  20. The RETRAN-03 computer code

    International Nuclear Information System (INIS)

    Paulsen, M.P.; McFadden, J.H.; Peterson, C.E.; McClure, J.A.; Gose, G.C.; Jensen, P.J.

    1991-01-01

    The RETRAN-03 code development effort is designed to overcome the major theoretical and practical limitations associated with the RETRAN-02 computer code. The major objectives of the development program are to extend the range of analyses that can be performed with RETRAN, to make the code more dependable and faster running, and to make the code more transportable. The first two objectives are accomplished by developing new models and adding other models to the RETRAN-02 base code. The major model additions for RETRAN-03 are as follows: implicit solution methods for the steady-state and transient forms of the field equations; additional options for the velocity difference equation; a new steady-state initialization option for computing low-power steam generator initial conditions; models for nonequilibrium thermodynamic conditions; and several special-purpose models. The source code and the environmental library for RETRAN-03 are written in standard FORTRAN 77, which allows the last objective to be fulfilled. Some models in RETRAN-02 have been deleted in RETRAN-03. In this paper the changes between RETRAN-02 and RETRAN-03 are reviewed

  1. Computer codes for designing proton linear accelerators

    International Nuclear Information System (INIS)

    Kato, Takao

    1992-01-01

    Computer codes for designing proton linear accelerators are discussed from the viewpoint of not only designing but also construction and operation of the linac. The codes are divided into three categories according to their purposes: 1) design code, 2) generation and simulation code, and 3) electric and magnetic fields calculation code. The role of each category is discussed on the basis of experience at KEK (the design of the 40-MeV proton linac and its construction and operation, and the design of the 1-GeV proton linac). We introduce our recent work relevant to three-dimensional calculation and supercomputer calculation: 1) tuning of MAFIA (three-dimensional electric and magnetic fields calculation code) for supercomputer, 2) examples of three-dimensional calculation of accelerating structures by MAFIA, 3) development of a beam transport code including space charge effects. (author)

  2. Progress on China nuclear data processing code system

    Science.gov (United States)

    Liu, Ping; Wu, Xiaofei; Ge, Zhigang; Li, Songyang; Wu, Haicheng; Wen, Lili; Wang, Wenming; Zhang, Huanyu

    2017-09-01

    China is developing the nuclear data processing code Ruler, which can be used for producing multi-group cross sections and related quantities from evaluated nuclear data in the ENDF format [1]. The Ruler includes modules for reconstructing cross sections over the whole energy range, generating Doppler-broadened cross sections for a given temperature, producing effective self-shielded cross sections in the unresolved energy range, calculating scattering cross sections in the thermal energy range, generating group cross sections and matrices, and preparing WIMS-D format data files for the reactor physics code WIMS-D [2]. The Ruler is written in Fortran-90 and has been tested on 32-bit computers under the Windows-XP and Linux operating systems. The verification of Ruler has been performed by comparison with calculation results obtained with the NJOY99 [3] processing code. The validation of Ruler has been performed using the WIMSD5B code.

  3. Ballistic Simulation Method for Lithium Ion Batteries (BASIMLIB) Using Thick Shell Composites (TSC) in LS-DYNA

    Science.gov (United States)

    2016-08-04

    BAllistic SImulation Method for Lithium Ion Batteries (BASIMLIB) using Thick Shell Composites (TSC) in LS-DYNA. Venkatesh Babu, Dr. Matt Castanier, Dr... ...and behavior of the cells through experiments and modeling at their crashworthiness laboratory. Most of the simulation work on batteries is at the single-cell level, and a gap exists in simulating batteries at their full pack capacity; firstly, this requires an enormous amount of computational

  4. Status of reactor physics activities on cross section generation and functionalization for the prismatic very high temperature reactor, and development of spatially-heterogeneous codes

    International Nuclear Information System (INIS)

    Lee, C. H.; Zhong, Z.; Taiwo, T. A.; Yang, W. S.; Smith, M. A.; Palmiotti, G.

    2006-01-01

    The cross section generation methodology and procedure for design and analysis of the prismatic Very High Temperature Gas-cooled Reactor (VHTR) core have been addressed for the DRAGON and REBUS-3/DIF3D code suite. Approaches for tabulation and functionalization of cross sections have been investigated and implemented. The cross sections are provided at different burnup and fuel and moderator temperature states. In the tabulation approach, the multigroup cross sections are tabulated as a function of the state variables so that a cross section file is able to cover the range of core operating conditions. Cross sections for points between tabulated data points are fitted simply by linear interpolation. For the functionalization approach, an investigation of the applicability of quadratic polynomials and linear coupling for fuel and moderator temperature changes has been conducted, based on the observation that cross sections are monotonically changing with fuel or moderator temperatures. Preliminary results show that the functionalization makes it possible to cover a wide range of operating temperature conditions with only six sets of data per burnup, while maintaining a good accuracy and significantly reducing the size of the cross section file. In these approaches, the number of fission products has been minimized to a few nuclides (I/Xe/Pm/Sm and a lumped fission product) to reduce the overall computation time without sacrificing solution accuracy. Discontinuity factors (DFs) based on nodal equivalence theory have been introduced to accurately represent the significant change in neutron spectrum at the interface of the fuel and reflector regions as well as between different fuel blocks (e.g., fuel elements with burnable poisons or control rods). Using the DRAGON code, procedures have been established for generating cross sections for fuel and reflector blocks with and without control absorbers. The preliminary results indicate that the solution accuracy is improved
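
    One possible reading of the functionalization described above is a least-squares fit of each group cross section to quadratics in the fuel and moderator temperatures plus a bilinear coupling term; the sketch below shows such a fit with NumPy. The exact functional form and state variables used with DRAGON and REBUS-3/DIF3D may differ, so this is only an illustration.

        import numpy as np

        def fit_xs(Tf, Tm, sigma):
            """Fit sigma(Tf, Tm) ~ c0 + c1*Tf + c2*Tf^2 + c3*Tm + c4*Tm^2 + c5*Tf*Tm
            to tabulated cross-section samples by linear least squares."""
            A = np.column_stack([np.ones_like(Tf), Tf, Tf**2, Tm, Tm**2, Tf * Tm])
            coeffs, *_ = np.linalg.lstsq(A, sigma, rcond=None)
            return coeffs

        def eval_xs(c, Tf, Tm):
            """Evaluate the fitted cross section at an arbitrary state point."""
            return c[0] + c[1]*Tf + c[2]*Tf**2 + c[3]*Tm + c[4]*Tm**2 + c[5]*Tf*Tm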

  5. Aerosols in the tropical and subtropical UT/LS: in-situ measurements of submicron particle abundance and volatility

    Directory of Open Access Journals (Sweden)

    S. Borrmann

    2010-06-01

    Full Text Available Processes occurring in the tropical upper troposphere (UT), the Tropical Transition Layer (TTL), and the lower stratosphere (LS) are of importance for the global climate, for stratospheric dynamics and air chemistry, and for their influence on the global distribution of water vapour, trace gases and aerosols. In this contribution we present in-situ aerosol and trace gas measurements from the tropical UT/LS over Southern Brazil, Northern Australia, and West Africa. The instruments were operated on board the Russian high-altitude research aircraft M-55 "Geophysica" and the DLR Falcon-20 during the campaigns TROCCINOX (Araçatuba, Brazil, February 2005), SCOUT-O3 (Darwin, Australia, December 2005), and SCOUT-AMMA (Ouagadougou, Burkina Faso, August 2006). The data cover submicron particle number densities and volatility from the COndensation PArticle counting System (COPAS), as well as relevant trace gases like N2O, ozone, and CO. We use these trace gas measurements to place the aerosol data into a broader atmospheric context. A juxtaposition of the submicron particle data with previous measurements over Costa Rica and other tropical locations between 1999 and 2007 (NASA DC-8 and NASA WB-57F) is also provided. The submicron particle number densities, as a function of altitude, were found to be remarkably constant in the tropical UT/LS altitude band for the two decades after 1987. Thus, a parameterisation suitable for models can be extracted from these measurements. Compared to the average levels in the period between 1987 and 2007, a slight increase of particle abundances was found for 2005/2006 at altitudes with potential temperatures, Θ, above 430 K. The origins of this increase are unknown, except for increases measured during SCOUT-AMMA, where the eruption of the Soufrière Hills volcano in the Caribbean caused elevated particle mixing ratios. The vertical profiles from Northern hemispheric mid-latitudes between 1999 and 2006 also are

  6. The Redox Code.

    Science.gov (United States)

    Jones, Dean P; Sies, Helmut

    2015-09-20

    The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O₂ and H₂O₂ contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine.

  7. Automatic mesh generation with QMESH program

    International Nuclear Information System (INIS)

    Ise, Takeharu; Tsutsui, Tsuneo

    1977-05-01

    Usage of the two-dimensional self-organizing mesh generation program, QMESH, is presented together with a description of the program and experience with it, as it has recently been converted and reconstructed from the NEACPL version to the FACOM. The program package consists of the QMESH code to generate quadrilateral meshes with smoothing techniques, the QPLOT code to plot the data obtained from QMESH on the graphic COM, and the RENUM code to renumber the meshes by using a bandwidth minimization procedure. The technique of mesh restructuring coupled with smoothing techniques is especially useful when one generates meshes for computer codes based on the finite element method. Several typical examples are given for easy access to the QMESH program, which is registered in the R.B-disks of JAERI for users. (auth.)
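
    The smoothing step mentioned above can be illustrated with a generic Laplacian smoother that relaxes interior node positions toward the centroid of their neighbours while keeping boundary nodes fixed. This is a textbook sketch, not the QMESH algorithm itself.

        import numpy as np

        def laplacian_smooth(nodes, neighbours, fixed, iterations=20, relax=0.5):
            """Laplacian smoothing of mesh node positions.

            nodes      : (n, 2) array of node coordinates
            neighbours : neighbours[i] is the list of node indices connected to node i
            fixed      : set of node indices (e.g. boundary nodes) that must not move
            """
            pts = nodes.astype(float).copy()
            for _ in range(iterations):
                new = pts.copy()
                for i, nbrs in enumerate(neighbours):
                    if i in fixed or not nbrs:
                        continue
                    centroid = pts[nbrs].mean(axis=0)
                    new[i] = pts[i] + relax * (centroid - pts[i])
                pts = new
            return pts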

  8. Safety and pharmacokinetics of the Fc-modified HIV-1 human monoclonal antibody VRC01LS: A Phase 1 open-label clinical trial in healthy adults

    OpenAIRE

    Gaudinski, Martin R.; Coates, Emily E.; Houser, Katherine V.; Chen, Grace L.; Yamshchikov, Galina; Saunders, Jamie G.; Holman, LaSonji A.; Gordon, Ingelise; Plummer, Sarah; Hendel, Cynthia S.; Conan-Cibotti, Michelle; Lorenzo, Margarita Gomez; Sitar, Sandra; Carlton, Kevin; Laurencot, Carolyn

    2018-01-01

    Background VRC01 is a human broadly neutralizing monoclonal antibody (bnMAb) against the CD4-binding site of the HIV-1 envelope glycoprotein (Env) that is currently being evaluated in a Phase IIb adult HIV-1 prevention efficacy trial. VRC01LS is a modified version of VRC01, designed for extended serum half-life by increased binding affinity to the neonatal Fc receptor. Methods and findings This Phase I dose-escalation study of VRC01LS in HIV-negative healthy adults was conducted by the Vaccin...

  9. Effects of PKM2 Gene Silencing on the Proliferation and Apoptosis of Colorectal Cancer LS-147T and SW620 Cells

    Directory of Open Access Journals (Sweden)

    Ran Ao

    2017-07-01

    Full Text Available Background/Aims: This paper aims to explore the effects of pyruvate kinase M2 (PKM2) gene silencing on the proliferation and apoptosis of colorectal cancer (CRC) LS-147T and SW620 cells. Methods: CRC LS-147T and SW620 cells highly expressing PKM2 were randomly selected by quantitative real-time polymerase chain reaction (qRT-PCR) and then assigned into the blank (no transfection), PKM2-shRNA (transfection with shRNA) and empty plasmid (transfection with empty plasmid) groups. Immunofluorescence was applied to detect PKM2 protein expression. qRT-PCR and Western blotting were conducted to assess the mRNA and protein expression of PKM2, p53 and p21. The cell counting kit-8 (CCK-8) assay was used to assess cell proliferation. Flow cytometry was used to assess the cell cycle and apoptosis rate, and a senescence-associated β-galactosidase staining kit was used to assess cell senescence. Results: PKM2 exhibited high mRNA expression in CRC LS-147T and SW620 cells, with remarkable protein expression noted in the cytoplasm and nucleus. The PKM2-shRNA group exhibited reduced PKM2 mRNA and protein expression, whereas p53 and p21 expression was increased compared with the blank and empty plasmid groups. Cell proliferation in PKM2-shRNA cells decreased significantly compared with the blank and empty plasmid groups. The PKM2-shRNA group exhibited more cells in the G1 phase and fewer cells in the G2/M phase compared with the blank and empty plasmid groups. In addition, the PKM2-shRNA group exhibited significantly increased apoptosis rates and β-galactosidase activity compared with the blank and empty plasmid groups. Conclusion: Our study demonstrates that PKM2 gene silencing suppresses proliferation and promotes apoptosis in LS-147T and SW620 cells.

  10. Radioiodinated VEGF to image tumor angiogenesis in a LS180 tumor xenograft model

    International Nuclear Information System (INIS)

    Yoshimoto, Mitsuyoshi; Kinuya, Seigo; Kawashima, Atsuhiro; Nishii, Ryuichi; Yokoyama, Kunihiko; Kawai, Keiichi

    2006-01-01

    Introduction: Angiogenesis is essential for tumor growth or metastasis. A method involving noninvasive detection of angiogenic activity in vivo would provide diagnostic information regarding antiangiogenic therapy targeting vascular endothelial cells as well as important insight into the role of vascular endothelial growth factor (VEGF) and its receptor (flt-1 and KDR) system in tumor biology. We evaluated radioiodinated VEGF121, which displays high binding affinity for KDR, and VEGF165, which possesses high binding affinity for flt-1 and low affinity for KDR, as angiogenesis imaging agents using the LS180 tumor xenograft model. Methods: VEGF121 and VEGF165 were labeled with 125I by the chloramine-T method. Biodistribution was observed in an LS180 human colon cancer xenograft model. Additionally, autoradiographic imaging and immunohistochemical staining of tumors were performed with 125I-VEGF121. Results: 125I-VEGF121 and 125I-VEGF165 exhibited strong, continuous uptake by tumors and the uterus, an organ characterized by angiogenesis. 125I-VEGF121 uptake in tumors was twofold higher than that of 125I-VEGF165 (9.12±98 and 4.79±1.08 %ID/g at 2 h, respectively). 125I-VEGF121 displayed higher tumor-to-nontumor (T/N) ratios in most normal organs in comparison with 125I-VEGF165. 125I-VEGF121 accumulation in tumors decreased with increasing tumor volume. Autoradiographic and immunohistochemical analyses confirmed that the difference in 125I-VEGF121 tumor accumulation correlated with the degree of tumor vascularity. Conclusion: Radioiodinated VEGF121 is a promising tracer for noninvasive delineation of angiogenesis in vivo

  11. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  12. Convolutional cylinder-type block-circulant cycle codes

    Directory of Open Access Journals (Sweden)

    Mohammad Gholami

    2013-06-01

    Full Text Available In this paper, we consider a class of column-weight-two quasi-cyclic low-density parity-check codes in which the girth can be made large, equal to an arbitrary multiple of 8. We then give these codes a convolutional form, such that their generator matrix can be obtained by elementary row and column operations on the parity-check matrix. Finally, we show that the free distance of the convolutional codes is equal to the minimum distance of their block counterparts.

  13. VLSI architectures for modern error-correcting codes

    CERN Document Server

    Zhang, Xinmiao

    2015-01-01

    Error-correcting codes are ubiquitous. They are adopted in almost every modern digital communication and storage system, such as wireless communications, optical communications, Flash memories, computer hard drives, sensor networks, and deep-space probing. New-generation and emerging applications demand codes with better error-correcting capability. On the other hand, the design and implementation of those high-gain error-correcting codes pose many challenges. They usually involve complex mathematical computations, and mapping them directly to hardware often leads to very high complexity. VLSI

  14. An information theoretic approach to use high-fidelity codes to calibrate low-fidelity codes

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Allison, E-mail: lewis.allison10@gmail.com [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Smith, Ralph [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Williams, Brian [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Figueroa, Victor [Sandia National Laboratories, Albuquerque, NM 87185 (United States)

    2016-11-01

    For many simulation models, it can be prohibitively expensive or physically infeasible to obtain a complete set of experimental data to calibrate model parameters. In such cases, one can alternatively employ validated higher-fidelity codes to generate simulated data, which can be used to calibrate the lower-fidelity code. In this paper, we employ an information-theoretic framework to determine the reduction in parameter uncertainty that is obtained by evaluating the high-fidelity code at a specific set of design conditions. These conditions are chosen sequentially, based on the amount of information that they contribute to the low-fidelity model parameters. The goal is to employ Bayesian experimental design techniques to minimize the number of high-fidelity code evaluations required to accurately calibrate the low-fidelity model. We illustrate the performance of this framework using heat and diffusion examples, a 1-D kinetic neutron diffusion equation, and a particle transport model, and include initial results from the integration of the high-fidelity thermal-hydraulics code Hydra-TH with a low-fidelity exponential model for the friction correlation factor.
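
    Under the simplifying assumption of a Gaussian linear surrogate for the low-fidelity model, the sequential selection of high-fidelity design conditions by information content can be sketched as follows: each step picks the candidate with the largest expected reduction in posterior entropy and then updates the parameter covariance. The paper's framework is more general (Bayesian experimental design with nonlinear simulation codes); this is only a minimal illustration with hypothetical names.

        import numpy as np

        def sequential_design(candidates, features, noise_var, prior_cov, n_runs):
            """Greedy information-based selection of high-fidelity evaluation points.

            candidates : (m, d) array of candidate design conditions
            features   : function mapping a design point to its regressor vector
            Each step maximises the mutual information 0.5*log(1 + phi' C phi / s^2)
            of a Gaussian linear model, then applies the rank-1 posterior update."""
            cov = prior_cov.copy()
            chosen = []
            for _ in range(n_runs):
                gains = [0.5 * np.log1p(features(x) @ cov @ features(x) / noise_var)
                         for x in candidates]
                best = int(np.argmax(gains))
                chosen.append(best)
                phi = features(candidates[best])
                # rank-1 posterior covariance update for a Gaussian likelihood
                cov = cov - np.outer(cov @ phi, phi @ cov) / (noise_var + phi @ cov @ phi)
            return chosen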

  15. Workflow Generation from the Two-Hemisphere Model

    Directory of Open Access Journals (Sweden)

    Gusarovs Konstantīns

    2017-12-01

    Full Text Available Model-Driven Software Development (MDSD) is a trend in software development that focuses on code generation from various kinds of models. To perform such a task, it is necessary to develop an algorithm that transforms the source model into the target model, which ideally is actual software code written in some kind of programming language. However, at present many methods focus on Unified Modelling Language (UML) diagram generation. The present paper describes a result of the authors' research on Two-Hemisphere Model (2HM) processing for easier code generation.

  16. Survey Of Lossless Image Coding Techniques

    Science.gov (United States)

    Melnychuck, Paul W.; Rabbani, Majid

    1989-04-01

    Many image transmission/storage applications requiring some form of data compression additionally require that the decoded image be an exact replica of the original. Lossless image coding algorithms meet this requirement by generating a decoded image that is numerically identical to the original. Several lossless coding techniques are modifications of well-known lossy schemes, whereas others are new. Traditional Markov-based models and newer arithmetic coding techniques are applied to predictive coding, bit-plane processing, and lossy-plus-residual coding. Generally speaking, the compression ratios offered by these techniques are in the range of 1.6:1 to 3:1 for 8-bit pictorial images. Compression ratios for 12-bit radiological images approach 3:1, as these images have less detailed structure, and hence their higher pel correlation leads to a greater removal of image redundancy.
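
    The benefit of predictive coding for lossless compression can be estimated with a small sketch: replace each pixel by its prediction residual and compare zeroth-order entropies. The previous-pixel predictor and the entropy estimate below are deliberately simple and are not one of the surveyed algorithms.

        import numpy as np

        def entropy_bits_per_pixel(values):
            """Zeroth-order entropy (bits/symbol) of an integer array."""
            _, counts = np.unique(values, return_counts=True)
            p = counts / counts.sum()
            return float(-(p * np.log2(p)).sum())

        def predictive_coding_gain(image):
            """Ideal compression-ratio gain from coding previous-pixel (DPCM)
            residuals instead of raw 8-bit pixels."""
            img = image.astype(np.int16)
            residual = img.copy()
            residual[:, 1:] = img[:, 1:] - img[:, :-1]   # e = x - x_left
            return entropy_bits_per_pixel(img) / entropy_bits_per_pixel(residual)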

  17. SERPENT Monte Carlo reactor physics code

    International Nuclear Information System (INIS)

    Leppaenen, J.

    2010-01-01

    SERPENT is a three-dimensional continuous-energy Monte Carlo reactor physics burnup calculation code, developed at VTT Technical Research Centre of Finland since 2004. The code is specialized in lattice physics applications, but the universe-based geometry description allows transport simulation to be carried out in complicated three-dimensional geometries as well. The suggested applications of SERPENT include generation of homogenized multi-group constants for deterministic reactor simulator calculations, fuel cycle studies involving detailed assembly-level burnup calculations, validation of deterministic lattice transport codes, research reactor applications, educational purposes and demonstration of reactor physics phenomena. The Serpent code has been publicly distributed by the OECD/NEA Data Bank since May 2009 and RSICC in the U. S. since March 2010. The code is being used in some 35 organizations in 20 countries around the world. This paper presents an overview of the methods and capabilities of the Serpent code, with examples in the modelling of WWER-440 reactor physics. (Author)

  18. L.S. Vygotsky's Principle "One Step in Learning — A Hundred Steps in Development": In Search of Evidence

    Directory of Open Access Journals (Sweden)

    V.K. Zaretsky

    2015-10-01

    Full Text Available On the basis of L.S. Vygotsky's published works, the paper attempts to trace the dynamics of his concepts of child development and to provide evidence supporting Vygotsky's statement that one step in learning equals a hundred in development, which is one of the key principles of cultural-historical theory in its application to child development. This statement is placed alongside two other major principles: one arguing that learning precedes development and the other referring to the zone of proximal development. The paper outlines a multivector model of the zone of proximal development as one of the conceptual tools of the reflective and activity approach to helping children overcome learning difficulties and promoting their development. The paper also describes a case study in which an orphan child with a disability received psychological and educational support that clearly contributed to the child's development. It is argued that L.S. Vygotsky's idea of the specific relationship between learning and development has fundamental theoretical and practical implications, in particular for working with children with special needs

  19. AUTOMATIC CALCULATION OF THE TOPOGRAPHIC FACTOR (LS) OF THE USLE, IN THE PARACATU RIVER BASIN

    Directory of Open Access Journals (Sweden)

    Valtercides Cavalcante da Silva

    2007-09-01

    Full Text Available

    Although the universal soil loss equation (USLE) is widely used for soil loss prediction, certain factors of the equation are difficult to obtain in watersheds, as is the case of the slope-length factor (L factor). Therefore, this work aimed to determine, in a computerized (automatic) way, the topographic factor (LS) of the USLE, using the Desmet & Govers (1996) algorithm for the calculation of the L factor, with the support of a Geographic Information System (GIS). The feasibility of calculating the slope-length factor at the 1:100,000 scale was verified by means of these authors' methodology, which takes accumulated flow into account.

    Although the Universal Soil Loss Equation (USLE) is widely used all over the world in the prediction of soil loss, a few factors of the equation are difficult to obtain, such as the slope length factor (L factor), particularly in watersheds. For this reason, the purpose of the present research was to apply a methodology for automatic calculation of the topographic factor (LS factor) using the algorithm developed by Desmet and Govers (1996), which defines the slope length factor (L factor) through a Geographic Information System (GIS). It was verified that the slope length factor (L), as developed by Desmet and Govers (1996), which accounts for accumulated flow, showed feasible results; a computational sketch of this formula is given after the keywords below.

    KEY-WORDS: USLE; L factor; topographic factor; GIS.
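
    A computational sketch of the per-cell L factor following the Desmet & Govers (1996) unit-contributing-area formulation is given below; the constants and exponents are written from memory and should be checked against the original paper before any use.

        import numpy as np

        def l_factor(flow_acc, slope_rad, aspect_rad, cell_size):
            """Per-cell USLE slope-length factor L (Desmet & Govers 1996 style).

            flow_acc   : upslope contributing area at the cell inlet, in cells
            slope_rad  : slope angle in radians
            aspect_rad : aspect angle in radians
            cell_size  : grid resolution D in metres
            """
            sin_s = np.sin(slope_rad)
            beta = (sin_s / 0.0896) / (3.0 * sin_s**0.8 + 0.56)  # rill/interrill ratio
            m = beta / (1.0 + beta)                              # slope-length exponent
            x = np.abs(np.sin(aspect_rad)) + np.abs(np.cos(aspect_rad))
            A_in = flow_acc * cell_size**2                       # contributing area, m^2
            D = cell_size
            return ((A_in + D**2)**(m + 1.0) - A_in**(m + 1.0)) / \
                   (D**(m + 2.0) * x**m * 22.13**m)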

  20. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-14

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.