WorldWideScience

Sample records for public domain code

  1. Fast resolution of the neutron diffusion equation through public domain ODE codes

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, V.M.; Vidal, V.; Garayoa, J. [Universidad Politecnica de Valencia, Departamento de Sistemas Informaticos, Valencia (Spain); Verdu, G. [Universidad Politecnica de Valencia, Departamento de Ingenieria Quimica y Nuclear, Valencia (Spain); Gomez, R. [I.E.S. de Tavernes Blanques, Valencia (Spain)]

    2003-07-01

    The time-dependent neutron diffusion equation is a partial differential equation with source terms. The usual resolution method involves discretizing the spatial domain, which yields a large system of linear, stiff ordinary differential equations (ODEs) whose resolution is computationally very expensive. Some standard techniques use a fixed time step to solve the ODE system. This can result in errors (if the time step is too large) or in long computing times (if the time step is too small). To speed up the resolution method, two well-known public domain codes have been selected: DASPK and FCVODE, which are powerful codes for the resolution of large systems of stiff ODEs. These codes can estimate the error after each time step and, depending on this estimate, can decide the new time step and, possibly, the integration method to be used in the next step. With these mechanisms it is possible to keep the overall error below the chosen tolerances and, when the system behaves smoothly, to take large time steps, increasing the execution speed. In this paper we address the use of the public domain codes DASPK and FCVODE for the resolution of the time-dependent neutron diffusion equation. The efficiency of these codes depends largely on the preconditioning of the large systems of linear equations that must be solved. Several preconditioners have been programmed and tested; the multigrid method proved the best of those tested. We also found that DASPK performed better than FCVODE, being more robust for our problem. We can conclude that the use of specialized codes for solving large systems of ODEs can drastically reduce the computational work needed for the solution, and that combining them with appropriate preconditioners makes the reduction still greater.
    The adaptive approach has another crucial advantage: it allows the user to specify the allowed error, which cannot be done in fixed-step implementations; this, of course...
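    The error-controlled, adaptive time stepping described in this abstract can be sketched with SciPy's BDF integrator, used here as a minimal stand-in for DASPK/FCVODE (the test system and tolerances are illustrative, not from the paper):

    ```python
    # Stiff test system with one fast and one slow mode; the integrator chooses
    # its own step sizes to keep the estimated local error below the tolerances.
    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, y):
        return np.array([-1000.0 * y[0], -y[1]])  # stiff mode, slow mode

    sol = solve_ivp(rhs, (0.0, 1.0), [1.0, 1.0],
                    method="BDF",        # implicit multistep method for stiff ODEs
                    rtol=1e-6, atol=1e-9)

    # Steps grow automatically once the fast transient has decayed.
    steps = np.diff(sol.t)
    print(len(sol.t), steps.min(), steps.max())
    ```

    A fixed step small enough to resolve the initial fast transient would have to be used over the whole interval, costing far more steps for the same accuracy; the adaptive integrator also reports whether the requested tolerances were met.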

  2. Time-domain modeling of electromagnetic diffusion with a frequency-domain code

    NARCIS (Netherlands)

    Mulder, W.A.; Wirianto, M.; Slob, E.C.

    2007-01-01

    We modeled time-domain EM measurements of induction currents for marine and land applications with a frequency-domain code. An analysis of the computational complexity of a number of numerical methods shows that frequency-domain modeling followed by a Fourier transform is an attractive choice if a

  3. Evaluation Codes from Order Domain Theory

    DEFF Research Database (Denmark)

    Andersen, Henning Ejnar; Geil, Hans Olav

    2008-01-01

    The celebrated Feng-Rao bound estimates the minimum distance of codes defined by means of their parity check matrices. From the Feng-Rao bound it is clear how to improve a large family of codes by leaving out certain rows in their parity check matrices. In this paper we derive a simple lower bound on the minimum distance of codes defined by means of their generator matrices. From our bound it is clear how to improve a large family of codes by adding certain rows to their generator matrices. The new bound is very much related to the Feng-Rao bound as well as to Shibuya and Sakaniwa's bound in [28]. Our bound is easily extended to deal with any generalized Hamming weights. We interpret our methods into the setting of order domain theory. In this way we fill in an obvious gap in the theory of order domains. [28] T. Shibuya and K. Sakaniwa, A Dual of Well-Behaving Type Designed Minimum Distance, IEICE...

  4. The missing evaluation codes from order domain theory

    DEFF Research Database (Denmark)

    Andersen, Henning Ejnar; Geil, Olav

    The Feng-Rao bound gives a lower bound on the minimum distance of codes defined by means of their parity check matrices. From the Feng-Rao bound it is clear how to improve a large family of codes by leaving out certain rows in their parity check matrices. In this paper we derive a simple lower ... generalized Hamming weight. We interpret our methods into the setting of order domain theory. In this way we fill in an obvious gap in the theory of order domains. The improved codes from the present paper are not in general equal to the Feng-Rao improved codes, but the constructions are very much related...

  5. PUBLIC DOMAIN PROTECTION. USES AND REUSES OF PUBLIC DOMAIN WORKS

    Directory of Open Access Journals (Sweden)

    Monica Adriana LUPAȘCU

    2015-07-01

    This study tries to highlight the necessity of an awareness of the right of access to the public domain, particularly using the example of works whose protection period has expired, as well as the ones which the law considers to be excluded from protection. Such works are used not only by large libraries from around the world, but also by rights holders, via different means of use, including incorporations into original works or adaptations. However, the reuse that follows these uses often only remains at the level of concept, as the notion of the public’s right of access to public domain works is not substantiated, nor is the notion of the correct or legal use of such works.

  6. Towards development of a high quality public domain global roads database

    Directory of Open Access Journals (Sweden)

    Andrew Nelson

    2006-12-01

    There is clear demand for a global spatial public domain roads data set with improved geographic and temporal coverage, consistent coding of road types, and clear documentation of sources. The currently best available global public domain product covers only one-quarter to one-third of the existing road networks, and this varies considerably by region. Applications for such a data set span multiple sectors and would be particularly valuable for the international economic development, disaster relief, and biodiversity conservation communities, not to mention national and regional agencies and organizations around the world. The building blocks for such a global product are available for many countries and regions, yet thus far there has been neither strategy nor leadership for developing it. This paper evaluates the best available public domain and commercial data sets, assesses the gaps in global coverage, and proposes a number of strategies for filling them. It also identifies stakeholder organizations with an interest in such a data set that might either provide leadership or funding for its development. It closes with a proposed set of actions to begin the process.

  7. The International River Interface Cooperative: Public Domain Software for River Flow and Morphodynamics (Invited)

    Science.gov (United States)

    Nelson, J. M.; Shimizu, Y.; McDonald, R.; Takebayashi, H.

    2009-12-01

    The International River Interface Cooperative is an informal organization made up of academic faculty and government scientists with the goal of developing, distributing and providing education for a public-domain software interface for modeling river flow and morphodynamics. Formed in late 2007, the group released the first version of this interface (iRIC) in late 2009. iRIC includes models for two- and three-dimensional flow, sediment transport, bed evolution, groundwater-surface water interaction, topographic data processing, and habitat assessment, as well as comprehensive data and model output visualization, mapping, and editing tools. All the tools in iRIC are specifically designed for use in river reaches and utilize common river data sets. The models are couched within a single graphical user interface, so that a broad spectrum of models is available to users without their having to learn new pre- and post-processing tools. The first version of iRIC was developed by combining the USGS public-domain Multi-Dimensional Surface Water Modeling System (MD_SWMS), developed at the USGS Geomorphology and Sediment Transport Laboratory in Golden, Colorado, with the public-domain river modeling code NAYS, developed by the Universities of Hokkaido and Kyoto, Mizuho Corporation, and the Foundation of the River Disaster Prevention Research Institute in Sapporo, Japan. Since this initial effort, other universities and agencies have joined the group, and the interface has been expanded to allow users to integrate their own modeling code using Extensible Markup Language (XML), which provides easy access and expandability to the iRIC software interface. In this presentation, the current components of iRIC are described and results from several practical modeling applications are presented to illustrate the capabilities and flexibility of the software. In addition, some future extensions to iRIC are demonstrated, including software for Lagrangian particle tracking and the prediction of...

  8. Central Decoding for Multiple Description Codes based on Domain Partitioning

    Directory of Open Access Journals (Sweden)

    M. Spiertz

    2006-01-01

    Multiple Description Codes (MDC) can be used to trade redundancy against packet-loss resistance for transmitting data over lossy diversity networks. In this work we focus on MD transform coding based on domain partitioning. Compared to Vaishampayan's quantizer-based MDC, domain-based MD coding is a simple approach for generating different descriptions, using a different quantizer for each description. Commonly, only the highest-rate quantizer is used for reconstruction. In this paper we investigate the benefit of using the lower-rate quantizers to enhance the reconstruction quality at the decoder side. The comparison is done on artificial source data and on image data.
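    The central-decoding idea, using the lower-rate description to refine the reconstruction instead of discarding it, can be illustrated with a toy pair of uniform scalar quantizers (the step sizes and source below are illustrative, not from the paper):

    ```python
    # Two descriptions = indices from two uniform quantizers of different rates.
    # Decoder A uses only the finer cell; the central decoder intersects both
    # cells, which on average shrinks the uncertainty interval around the sample.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 30.0, size=50_000)

    d1, d2 = 3.0, 5.0            # fine and coarse quantizer step sizes
    i1 = np.floor(x / d1)        # description 1
    i2 = np.floor(x / d2)        # description 2

    rec_fine = (i1 + 0.5) * d1   # midpoint of the fine cell only

    lo = np.maximum(i1 * d1, i2 * d2)             # intersection of both cells
    hi = np.minimum((i1 + 1) * d1, (i2 + 1) * d2)
    rec_central = (lo + hi) / 2.0                 # midpoint of the intersection

    mse_fine = np.mean((x - rec_fine) ** 2)
    mse_central = np.mean((x - rec_central) ** 2)
    print(mse_central < mse_fine)  # expected True
    ```

    For individual samples the intersection midpoint can be worse, but averaged over the source the narrower interval wins, which is the benefit the abstract investigates.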

  9. Computing the Feng-Rao distances for codes from order domains

    DEFF Research Database (Denmark)

    Ruano Benito, Diego

    2007-01-01

    We compute the Feng–Rao distance of a code coming from an order domain with a simplicial value semigroup. The main tool is the Apéry set of a semigroup, which can be computed using a Gröbner basis.
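    For a numerical semigroup, the Apéry set can also be found by brute-force enumeration, which makes the object concrete even though the paper computes it via Gröbner bases. A sketch for the semigroup generated by 3 and 5 (an illustrative example, not from the paper):

    ```python
    # The Apery set of a numerical semigroup S with respect to m is the set of
    # smallest elements of S in each residue class modulo m.

    def semigroup_elements(gens, bound):
        """All elements of the numerical semigroup generated by gens, up to bound."""
        elems = {0}
        changed = True
        while changed:
            changed = False
            for e in list(elems):
                for g in gens:
                    if e + g <= bound and e + g not in elems:
                        elems.add(e + g)
                        changed = True
        return elems

    def apery_set(gens, m, bound=200):
        """Smallest semigroup element in each residue class mod m."""
        S = semigroup_elements(gens, bound)
        return {r: min(s for s in S if s % m == r) for r in range(m)}

    ap = apery_set([3, 5], 3)
    print(ap)  # expected {0: 0, 1: 10, 2: 5}
    ```

    As a sanity check, the Frobenius number of the semigroup is max(Apéry set) minus m, here 10 - 3 = 7, the familiar value for the semigroup generated by 3 and 5.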

  10. Generalized Sudan's List Decoding for Order Domain Codes

    DEFF Research Database (Denmark)

    Geil, Hans Olav; Matsumoto, Ryutaroh

    2007-01-01

    We generalize Sudan's list decoding algorithm without multiplicity to evaluation codes coming from arbitrary order domains. The number of errors correctable by the proposed method is larger than with the original list decoding without multiplicity...

  11. Agents unleashed a public domain look at agent technology

    CERN Document Server

    Wayner, Peter

    1995-01-01

    Agents Unleashed: A Public Domain Look at Agent Technology covers details of building a secure agent realm. The book discusses the technology for creating seamlessly integrated networks that allow programs to move from machine to machine without leaving a trail of havoc; as well as the technical details of how an agent will move through the network, prove its identity, and execute its code without endangering the host. The text also describes the organization of the host's work processing an agent; error messages, bad agent expulsion, and errors in XLISP-agents; and the simulators of errors, f

  12. Linear dispersion codes in space-frequency domain for SCFDE

    DEFF Research Database (Denmark)

    Marchetti, Nicola; Cianca, Ernestina; Prasad, Ramjee

    2007-01-01

    This paper presents a general framework for applying Linear Dispersion Codes (LDC) in the space and frequency domains to Single Carrier - Frequency Domain Equalization (SCFDE) systems. Space-Frequency (SF) LDC are more suitable than Space-Time (ST) LDC in high-mobility environments. However, the application of LDC in the space-frequency domain in SCFDE systems is not as straightforward as in Orthogonal Frequency Division Multiplexing (OFDM), since there is no direct access to the subcarriers at the transmitter. This paper describes how to build the space-time dispersion matrices to be used...

  13. Rascal: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    P. Klint (Paul); T. van der Storm (Tijs); J.J. Vinju (Jurgen); A. Walenstein; S. Schuppe

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This...

  14. RASCAL: a domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    Klint, P.; Storm, van der T.; Vinju, J.J.

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This impedance

  15. Public Domain; Public Interest; Public Funding: Focussing on the ‘three Ps’ in Scientific Research

    Directory of Open Access Journals (Sweden)

    Mags McGinley

    2005-03-01

    The purpose of this paper is to discuss the ‘three Ps’ of scientific research: Public Domain; Public Interest; Public Funding. This is done by examining some of the difficulties faced by scientists engaged in scientific research who may have problems working within the constraints of current copyright and database legislation, where property claims can place obstacles in the way of research, in other words, the public domain. The article then looks at perceptions of the public interest and asks whether copyright and the database right reflect understandings of how this concept should operate. Thirdly, it considers the relevance of public funding for scientific research in the context of both the public domain and of the public interest. Finally, some recent initiatives seeking to change the contours of the legal framework are examined.

  16. Multiplicative Structure and Hecke Rings of Generator Matrices for Codes over Quotient Rings of Euclidean Domains

    Directory of Open Access Journals (Sweden)

    Hajime Matsui

    2017-12-01

    In this study, we consider codes over Euclidean domains modulo their ideals. In the first half of the study, we deal with arbitrary Euclidean domains. We show that the product of generator matrices of codes over the rings mod a and mod b produces generator matrices of all codes over the ring mod ab, i.e., this correspondence is onto. Moreover, we show that if a and b are coprime, then this correspondence is one-to-one, i.e., there exist unique codes over the rings mod a and mod b that produce any given code over the ring mod ab through the product of their generator matrices. In the second half of the study, we focus on typical Euclidean domains such as the rational integer ring, one-variable polynomial rings, rings of Gaussian and Eisenstein integers, p-adic integer rings, and rings of one-variable formal power series. We define the reduced generator matrices of codes over Euclidean domains modulo their ideals and show their uniqueness. Finally, we apply our theory of reduced generator matrices to the Hecke rings of matrices over these Euclidean domains.
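    The one-to-one correspondence in the coprime case rests on the Chinese remainder theorem: the ring mod ab decomposes as the product of the rings mod a and mod b. A minimal sketch of that decomposition (not the full generator-matrix construction of the paper):

    ```python
    # Map a pair (residue mod a, residue mod b) to the unique residue mod a*b,
    # using modular inverses (Python 3.8+ three-argument pow with exponent -1).
    from math import gcd

    def crt(r_a, a, r_b, b):
        assert gcd(a, b) == 1
        u = pow(a, -1, b)  # inverse of a modulo b
        v = pow(b, -1, a)  # inverse of b modulo a
        return (r_a * v * b + r_b * u * a) % (a * b)

    # Every pair of residues mod 3 and mod 5 yields a distinct residue mod 15.
    table = {(ra, rb): crt(ra, 3, rb, 5) for ra in range(3) for rb in range(5)}
    print(sorted(table.values()))  # expected: each of the 15 residues 0..14 once
    ```

    The bijection on residues is exactly why, for coprime a and b, a code over the ring mod ab determines unique component codes mod a and mod b.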

  17. Digital coherent detection research on Brillouin optical time domain reflectometry with simplex pulse codes

    International Nuclear Information System (INIS)

    Hao Yun-Qi; Ye Qing; Pan Zheng-Qing; Cai Hai-Wen; Qu Rong-Hui

    2014-01-01

    The digital coherent detection technique has been investigated without any frequency-scanning device in Brillouin optical time domain reflectometry (BOTDR), where simplex pulse codes are applied in the sensing system. The time-domain signal of every code sequence is collected by a data acquisition card (DAQ). A shift-averaging technique is applied in the frequency domain because the local oscillator (LO) in the coherent detection is offset by a fixed frequency from the primary source. With the 31-bit simplex code, the signal-to-noise ratio (SNR) shows a 3.5-dB enhancement over the same number of single-pulse traces, in accordance with the theoretical analysis. The frequency fluctuation for simplex codes is 14.01 MHz less than that for a single pulse at 4-m spatial resolution. The results are believed to be beneficial for improving BOTDR performance.
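    The SNR benefit of pulse coding can be made concrete by constructing the 31-bit simplex code matrix from a Hadamard matrix and evaluating the theoretical coding gain (L+1)/(2*sqrt(L)) quoted in the coded-OTDR literature, which gives about 4.6 dB for L = 31 (measured gains, such as the 3.5 dB here, are typically somewhat lower). The recipe below is the usual Sylvester/S-matrix construction, not code from the paper:

    ```python
    # Sylvester-Hadamard construction and the S-matrix: drop the first row and
    # column of H_{L+1} and map -1 -> 1, +1 -> 0, giving unipolar code words.
    import numpy as np

    def hadamard(n):
        H = np.array([[1]])
        while H.shape[0] < n:          # n must be a power of two
            H = np.block([[H, H], [H, -H]])
        return H

    def simplex_matrix(L):
        H = hadamard(L + 1)
        return (1 - H[1:, 1:]) // 2    # each row contains (L + 1) / 2 ones

    L = 31
    S = simplex_matrix(L)
    gain_db = 10 * np.log10((L + 1) / (2 * np.sqrt(L)))  # theoretical coding gain
    print(S.shape, round(gain_db, 2))  # (31, 31), about 4.58 dB
    ```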

  18. Violence defied? : A review of prevention of violence in public and semi-public domain

    NARCIS (Netherlands)

    Knaap, L.M. van der; Nijssen, L.T.J.; Bogaerts, S.

    2006-01-01

    This report provides a synthesis of 48 studies of the effects of the prevention of violence in the public and semi-public domain. The following research questions were stated for this study: What measures for the prevention of violence in the public and semi-public domain are known and have been...

  19. Public licenses and public domain as alternatives to copyright

    OpenAIRE

    Köppel, Petr

    2012-01-01

    The work first introduces the area of public licenses as a space between the copyright law and public domain. After that, consecutively for proprietary software, free and open source software, open hardware and open content, it maps particular types of public licenses and the accompanying social and cultural movements, puts them in mutual as well as historical context, examines their characteristics and compares them to each other, shows how the public licenses are defined by various accompan...

  20. Preserving the positive functions of the public domain in science

    Directory of Open Access Journals (Sweden)

    Pamela Samuelson

    2003-11-01

    Science has advanced in part because data and scientific methodologies have traditionally not been subject to intellectual property protection. In recent years, intellectual property has played a greater role in scientific work. While intellectual property rights may have a positive role to play in some fields of science, so does the public domain. This paper will discuss some of the positive functions of the public domain and ways in which certain legal developments may negatively impact the public domain. It suggests some steps that scientists can take to preserve the positive functions of the public domain for science.

  1. Optical image encryption using QR code and multilevel fingerprints in gyrator transform domains

    Science.gov (United States)

    Wei, Yang; Yan, Aimin; Dong, Jiabin; Hu, Zhijuan; Zhang, Jingtao

    2017-11-01

    This paper proposes a new optical image encryption scheme in gyrator transform (GT) domains, using a quick response (QR) code and multilevel fingerprint keys. In this method, an original image is first transformed into a QR code, which is placed in the input plane of cascaded GTs. Subsequently, the QR code is encrypted into the cipher-text by using multilevel fingerprint keys. The original image can be obtained easily by reading the high-quality retrieved QR code with hand-held devices. The main parameters used as private keys are the GTs' rotation angles and the multilevel fingerprints. Biometrics and cryptography are integrated with each other to improve data security. Numerical simulations are performed to demonstrate the validity and feasibility of the proposed encryption scheme. In the future, the method of applying QR codes and fingerprints in GT domains possesses much potential for information security.

  2. Architecture for time or transform domain decoding of reed-solomon codes

    Science.gov (United States)

    Shao, Howard M. (Inventor); Truong, Trieu-Kie (Inventor); Hsu, In-Shek (Inventor); Deutsch, Leslie J. (Inventor)

    1989-01-01

    Two pipelined (255,223) RS decoders, one a time domain decoder and the other a transform domain decoder, use the same first part to develop an errata locator polynomial τ(x) and an errata evaluator polynomial A(x). Both the time domain decoder and the transform domain decoder have a modified GCD that uses an input multiplexer and an output demultiplexer to reduce the number of GCD cells required. The time domain decoder uses a Chien search and a polynomial evaluator on the GCD outputs τ(x) and A(x) for the final decoding steps, while the transform domain decoder uses a transform error pattern algorithm operating on τ(x) and the initial syndrome computation S(x), followed by an inverse transform algorithm, for the final decoding steps prior to adding the received RS coded message to produce a decoded output message.
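    The Chien search used in the time domain decoder is, conceptually, an exhaustive root search over the field: evaluate the errata locator polynomial at every field element and note where it vanishes. A toy sketch over GF(7) (the patent works over a much larger field; the polynomial below is illustrative):

    ```python
    # Evaluate the errata locator polynomial at every element of a small prime
    # field and collect its roots; their inverses are the error locations.
    P = 7  # toy field GF(7)

    def poly_eval(coeffs, x, p=P):
        """Horner evaluation of a polynomial (lowest-degree coefficient first)."""
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % p
        return acc

    # tau(x) = (1 - 2x)(1 - 3x) = 1 + 2x + 6x^2 over GF(7) (since -5 = 2 mod 7);
    # its roots are the inverses of the error locators 2 and 3.
    tau = [1, 2, 6]
    roots = [x for x in range(P) if poly_eval(tau, x) == 0]
    print(roots)  # expected [4, 5], the inverses of 2 and 3 in GF(7)
    ```

    Hardware implementations avoid the per-element polynomial evaluation by updating each term multiplicatively as x steps through the field, but the set of roots found is the same.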

  3. Interoperable domain-specific languages families for code generation

    Czech Academy of Sciences Publication Activity Database

    Malohlava, M.; Plášil, F.; Bureš, Tomáš; Hnětynka, P.

    2013-01-01

    Roč. 43, č. 5 (2013), s. 479-499 ISSN 0038-0644 R&D Projects: GA ČR GD201/09/H057 EU Projects: European Commission(XE) ASCENS 257414 Grant - others:GA AV ČR(CZ) GAP103/11/1489 Program:FP7 Institutional research plan: CEZ:AV0Z10300504 Keywords : code generation * domain specific languages * models reuse * extensible languages * specification * program synthesis Subject RIV: JC - Computer Hardware ; Software Impact factor: 1.148, year: 2013

  4. 48 CFR 1.105-1 - Publication and code arrangement.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Publication and code arrangement. 1.105-1 Section 1.105-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1.105-1 Publication and code...

  5. 48 CFR 501.105-1 - Publication and code arrangement.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Publication and code arrangement. 501.105-1 Section 501.105-1 Federal Acquisition Regulations System GENERAL SERVICES... 501.105-1 Publication and code arrangement. The GSAR is published in the following sources: (a) Daily...

  6. 48 CFR 3001.105-1 - Publication and code arrangement.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Publication and code arrangement. 3001.105-1 Section 3001.105-1 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND... Purpose, Authority, Issuance 3001.105-1 Publication and code arrangement. (a) The HSAR is published in: (1...

  7. Improved virtual channel noise model for transform domain Wyner-Ziv video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2009-01-01

    Distributed video coding (DVC) has been proposed as a new video coding paradigm to deal with lossy source coding using side information, exploiting the statistics at the decoder to reduce the computational demands at the encoder. A virtual channel noise model is utilized at the decoder to estimate the noise distribution between the side information frame and the original frame. This is one of the most important aspects influencing the coding performance of DVC. Noise models with different granularity have been proposed. In this paper, an improved noise model for transform domain Wyner-Ziv video coding is proposed, which utilizes cross-band correlation to estimate the Laplacian parameters more accurately. Experimental results show that the proposed noise model can improve the rate-distortion (RD) performance.
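    The Laplacian modeling step can be sketched in isolation: given a residual band between the side information and the original, the Laplacian parameter follows directly from the residual statistics. The synthetic residual below is illustrative, not data from the paper:

    ```python
    # Fit the Laplacian noise parameter from a residual band. The residual here
    # is synthetic; in TDWZ coding it would be the difference between a side
    # information transform band and the corresponding original band.
    import numpy as np

    rng = np.random.default_rng(0)
    residual = rng.laplace(loc=0.0, scale=4.0, size=100_000)

    sigma = residual.std()
    alpha = np.sqrt(2.0) / sigma               # parameter from the variance
    alpha_mle = 1.0 / np.abs(residual).mean()  # maximum-likelihood estimate

    print(round(alpha, 2), round(alpha_mle, 2))  # both close to 1/scale = 0.25
    ```

    Cross-band refinements like the paper's adjust this per-coefficient rather than fitting one parameter per band, but the fitting primitive is the same.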

  8. Efficient Dual Domain Decoding of Linear Block Codes Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Ahmed Azouaoui

    2012-01-01

    A computationally efficient algorithm for decoding block codes is developed using a genetic algorithm (GA). The proposed algorithm uses the dual code, in contrast to the existing genetic decoders in the literature, which use the code itself; this new approach reduces the complexity of decoding high-rate codes. We simulated our algorithm over various transmission channels. The performance of this algorithm is investigated and compared with competing decoding algorithms, including those of Maini and Shakeel. The results show that the proposed algorithm gives large gains over the Chase-2 decoding algorithm and reaches the performance of OSD-3 for some quadratic residue (QR) codes. Further, we define a new crossover operator that exploits domain-specific information and compare it with uniform and two-point crossover. The complexity of this algorithm is also discussed and compared to other algorithms.

  9. The Definition, Dimensions, and Domain of Public Relations.

    Science.gov (United States)

    Hutton, James G.

    1999-01-01

    Discusses how the field of public relations has left itself vulnerable to other fields that are making inroads into public relations' traditional domain, and to critics who are filling in their own definitions of public relations. Proposes a definition and a three-dimensional framework to compare competing philosophies of public relations and to…

  10. Multiple LDPC Decoding using Bitplane Correlation for Transform Domain Wyner-Ziv Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Huang, Xin; Forchhammer, Søren

    2011-01-01

    Distributed video coding (DVC) is an emerging video coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. This paper considers a Low Density Parity Check (LDPC) based Transform Domain Wyner-Ziv (TDWZ) video codec. To improve the LDPC coding performance in the context of TDWZ, this paper proposes a Wyner-Ziv video codec using bitplane correlation through multiple parallel LDPC decoding. The proposed scheme utilizes inter-bitplane correlation to enhance the bitplane decoding performance. Experimental results...

  11. Fast Convolutional Sparse Coding in the Dual Domain

    KAUST Repository

    Affara, Lama Ahmed; Ghanem, Bernard; Wonka, Peter

    2017-01-01

    Convolutional sparse coding (CSC) is an important building block of many computer vision applications ranging from image and video compression to deep learning. We present two contributions to the state of the art in CSC. First, we significantly speed up the computation by proposing a new optimization framework that tackles the problem in the dual domain. Second, we extend the original formulation to higher dimensions in order to process a wider range of inputs, such as color inputs, or HOG features. Our results show a significant speedup compared to the current state of the art in CSC.

  12. Fast Convolutional Sparse Coding in the Dual Domain

    KAUST Repository

    Affara, Lama Ahmed

    2017-09-27

    Convolutional sparse coding (CSC) is an important building block of many computer vision applications ranging from image and video compression to deep learning. We present two contributions to the state of the art in CSC. First, we significantly speed up the computation by proposing a new optimization framework that tackles the problem in the dual domain. Second, we extend the original formulation to higher dimensions in order to process a wider range of inputs, such as color inputs, or HOG features. Our results show a significant speedup compared to the current state of the art in CSC.

  13. Domain Decomposition strategy for pin-wise full-core Monte Carlo depletion calculation with the reactor Monte Carlo Code

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Jingang; Wang, Kan; Qiu, Yishu [Dept. of Engineering Physics, LiuQing Building, Tsinghua University, Beijing (China); Chai, Xiao Ming; Qiang, Sheng Long [Science and Technology on Reactor System Design Technology Laboratory, Nuclear Power Institute of China, Chengdu (China)

    2016-06-15

    Because of prohibitive data storage requirements in large-scale simulations, memory is an obstacle for Monte Carlo (MC) codes in accomplishing pin-wise three-dimensional (3D) full-core calculations, particularly for whole-core depletion analyses. Various kinds of data are evaluated and the total memory requirements are quantified based on the Reactor Monte Carlo (RMC) code, showing that tally data, material data, and isotope densities in depletion are the three major parts of memory storage. The domain decomposition method is investigated as a means of saving memory, by dividing the spatial geometry into domains that are simulated separately by parallel processors. For the validity of particle tracking during transport simulations, particles need to be communicated between domains. For efficiency, an asynchronous particle communication algorithm is designed and implemented. Furthermore, we couple the domain decomposition method with the MC burnup process, under a strategy of utilizing a consistent domain partition in both the transport and depletion modules. A numerical test of 3D full-core burnup calculations is carried out, indicating that the RMC code, with the domain decomposition method, is capable of pin-wise full-core burnup calculations with millions of depletion regions.
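    The hand-off of particles between spatial domains can be illustrated with a deliberately tiny serial toy (two 1D domains with per-domain buffers); RMC's actual scheme is parallel and asynchronous, and nothing below is from the RMC code itself:

    ```python
    # Toy domain-decomposed particle tracking: a 1D slab split into two domains.
    # A particle leaving its domain is handed to the neighbour's buffer instead
    # of being tracked further locally; histories end by absorption or leakage.
    import random

    random.seed(1)
    BOUNDS = [(0.0, 5.0), (5.0, 10.0)]   # two spatial domains
    buffers = {0: [2.0] * 100, 1: []}    # 100 source particles start in domain 0
    finished = 0

    while any(buffers.values()):
        for dom, (lo, hi) in enumerate(BOUNDS):
            local, buffers[dom] = buffers[dom], []
            for x in local:
                x += random.uniform(-1.0, 1.0)      # one flight step
                if x < 0.0 or x >= 10.0:
                    finished += 1                   # leaked out of the geometry
                elif lo <= x < hi:
                    if random.random() < 0.3:
                        finished += 1               # absorbed locally
                    else:
                        buffers[dom].append(x)      # keep tracking in this domain
                else:
                    buffers[1 - dom].append(x)      # hand off to the other domain

    print(finished)  # expected 100: every history terminates exactly once
    ```

    In the parallel setting each domain's loop runs on its own processors and the buffers become asynchronous messages, which is the communication problem the paper addresses.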

  14. Cultural Heritage and the Public Domain

    Directory of Open Access Journals (Sweden)

    Bas Savenije

    2012-09-01

    by providing their resources on the Internet” (Berlin Declaration 2003). Therefore, in the spirit of the Berlin Declaration, the ARL encourages its member libraries to grant all non-commercial users “a free, irrevocable, worldwide, right of access to, and a license to copy, use, distribute, transmit and display the work publicly and to make and distribute derivative works, in any digital medium for any responsible purpose, subject to proper attribution of authorship”. And: “If fees are to be assessed for the use of digitised public domain works, those fees should only apply to commercial uses” (ARL Principles, July 2010). In our view, cultural heritage institutions should make public domain material digitised with public funding as widely available as possible for access and reuse. The public sector has the primary responsibility to fund digitisation. The involvement of private partners, however, is encouraged by the ARL as well as the Comité des Sages. Private funding for digitisation is a complement to the necessary public investment, especially in times of economic crisis, but should not be seen as a substitute for public funding. As we can see from these reports, there are a number of arguments in favour of digitisation and also of providing maximum accessibility to the digitised cultural heritage. In this paper we will investigate the legal aspects of digitisation of cultural heritage, especially public domain material. On the basis of these we will make an inventory of policy considerations regarding reuse. Furthermore, we will describe the conclusions the National Library of the Netherlands (hereafter: KB) has formulated and the arguments that support them. In this context we will review public-private partnerships and also the policy of the KB. We will conclude with recommendations for cultural heritage institutions concerning a reuse policy for digitised public domain material.

  15. Computationally Efficient Amplitude Modulated Sinusoidal Audio Coding using Frequency-Domain Linear Prediction

    DEFF Research Database (Denmark)

    Christensen, M. G.; Jensen, Søren Holdt

    2006-01-01

    A method for amplitude modulated sinusoidal audio coding is presented that has low complexity and low delay. This is based on a subband processing system, where, in each subband, the signal is modeled as an amplitude modulated sum of sinusoids. The envelopes are estimated using frequency-domain linear prediction and the prediction coefficients are quantized. As a proof of concept, we evaluate different configurations in a subjective listening test, and this shows that the proposed method offers significant improvements in sinusoidal coding. Furthermore, the properties of the frequency...
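As a rough illustration of the frequency-domain linear prediction step described above, the sketch below fits an all-pole (LPC) model to the spectral coefficients of a toy amplitude-modulated signal using plain numpy. The signal, model order, and single-band setup are invented for illustration; the paper's subband split and coefficient quantization are omitted.

```python
import numpy as np

def lpc(x, order):
    """Autocorrelation-method linear prediction coefficients."""
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) for k in range(order + 1)])
    # Solve the normal equations R a = r[1:] (symmetric Toeplitz system)
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:])

# Toy AM signal: a 1 kHz tone with a slow 3 Hz amplitude envelope
fs = 8000
t = np.arange(fs) / fs
x = (0.5 + 0.5 * np.sin(2 * np.pi * 3 * t)) * np.cos(2 * np.pi * 1000 * t)

# Frequency-domain linear prediction: the all-pole model fitted to the
# (real) spectrum describes the temporal envelope of the signal.
X = np.fft.rfft(x).real
a = lpc(X, order=8)
print(a.shape)  # -> (8,)
```

In the paper this per-subband predictor is quantized and transmitted; here we only show the estimation step.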

  16. Characterization of high-power RF structures using time-domain field codes

    International Nuclear Information System (INIS)

    Shang, C.C.; DeFord, J.F.; Swatloski, T.L.

    1992-01-01

    We have modeled gyrotron windows and gyrotron amplifier sever structures for TE modes in the 100--150 GHz range and have computed the reflection and transmission characteristics from the field data. Good agreement with frequency domain codes and analytic analysis has been obtained for some simple geometries. We present results for realistic structures with lossy coatings and describe implementation of microwave diagnostics.


  18. Female-biased expression of long non-coding RNAs in domains that escape X-inactivation in mouse

    Directory of Open Access Journals (Sweden)

    Lu Lu

    2010-11-01

    Background: Sexual dimorphism in brain gene expression has been recognized in several animal species. However, the relevant regulatory mechanisms remain poorly understood. To investigate whether sex-biased gene expression in the mammalian brain is globally or locally regulated in diverse brain structures, and to study the genomic organisation of brain-expressed sex-biased genes, we performed a large-scale gene expression analysis of distinct brain regions in adult male and female mice. Results: This study revealed spatial specificity in sex-biased transcription in the mouse brain, and identified 173 sex-biased genes in the striatum, 19 in the neocortex, 12 in the hippocampus and 31 in the eye. Genes located on the sex chromosomes were consistently over-represented in all brain regions. Analysis of a subset of genes with sex bias in more than one tissue revealed Y-encoded male-biased transcripts and X-encoded female-biased transcripts known to escape X-inactivation. In addition, we identified novel coding and non-coding X-linked genes with female-biased expression in multiple tissues. Interestingly, the chromosomal positions of all of the female-biased non-coding genes are in close proximity to protein-coding genes that escape X-inactivation. This defines X-chromosome domains, each of which contains a coding and a non-coding female-biased gene. Lack of repressive chromatin marks in non-coding transcribed loci supports the possibility that they escape X-inactivation. Moreover, combined RNA-DNA FISH experiments confirmed the biallelic expression of one such novel domain. Conclusion: This study demonstrated that the number of genes with sex-biased expression varies between individual brain regions in the mouse. The sex-biased genes identified are localized on many chromosomes. At the same time, sexually dimorphic gene expression that is common to several parts of the brain is mostly restricted to the sex chromosomes. Moreover, the study uncovered...

  19. Implementation of domain decomposition and data decomposition algorithms in RMC code

    International Nuclear Information System (INIS)

    Liang, J.G.; Cai, Y.; Wang, K.; She, D.

    2013-01-01

    The application of the Monte Carlo method in reactor physics analysis is somewhat restricted by the excessive memory demand of large-scale problems. Memory demand in MC simulation is analyzed first; it comprises geometry data, nuclear cross-section data, particle data, and tally data. Tally data dominates the memory cost and should be the focus in solving the memory problem. Domain decomposition and tally data decomposition algorithms are separately designed and implemented in the reactor Monte Carlo code RMC. Basically, the domain decomposition algorithm is a strategy of 'divide and rule': problems are divided into sub-domains that are dealt with separately, and rules are established to ensure the overall results are correct. Tally data decomposition consists of two parts: data partition and data communication. Two algorithms with different communication synchronization mechanisms are proposed. Numerical tests have been executed to evaluate the performance of the new algorithms. The domain decomposition algorithm shows potential to speed up MC simulation as a spatially parallel method. As for the tally data decomposition algorithms, memory size is greatly reduced.
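The tally-partition idea (each rank owns a slice of the tally array, and scores are routed to the owner) can be sketched in a few lines. The array size, rank count, and `score` routing function below are illustrative stand-ins, not RMC's actual data structures or MPI calls.

```python
import numpy as np

N_TALLY = 1000          # total tally bins (illustrative)
N_RANKS = 4             # simulated MPI ranks
bounds = np.linspace(0, N_TALLY, N_RANKS + 1).astype(int)
# Each simulated rank stores only its own contiguous slice of the tally.
local = [np.zeros(bounds[r + 1] - bounds[r]) for r in range(N_RANKS)]

def owner(bin_id):
    """Rank owning a given tally bin."""
    return int(np.searchsorted(bounds, bin_id, side="right") - 1)

def score(bin_id, value):
    """Route a score to the owning rank's slice (stands in for the
    communication step of the data decomposition algorithm)."""
    r = owner(bin_id)
    local[r][bin_id - bounds[r]] += value

rng = np.random.default_rng(0)
bins = rng.integers(0, N_TALLY, 10_000)
vals = rng.random(10_000)
for b, v in zip(bins, vals):
    score(int(b), float(v))

total = sum(arr.sum() for arr in local)
print(f"scored {total:.3f} across {N_RANKS} ranks")
```

No single rank ever holds the full tally array, which is the point of the decomposition when tallies dominate memory.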

  20. Transform domain Wyner-Ziv video coding with refinement of noise residue and side information

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2010-01-01

    Distributed Video Coding (DVC) is a video coding paradigm which mainly exploits the source statistics at the decoder, based on the availability of side information at the decoder. This paper considers feedback channel based Transform Domain Wyner-Ziv (TDWZ) DVC. The coding efficiency of TDWZ video coding does not match that of conventional video coding yet, mainly due to the quality of the side information and inaccurate noise estimation. In this context, a novel TDWZ video decoder with noise residue refinement (NRR) and side information refinement (SIR) is proposed. The proposed refinement schemes successively update the estimated noise residue for noise modeling and the side information frame quality during decoding. Experimental results show that the proposed decoder can improve the Rate-Distortion (RD) performance of a state-of-the-art Wyner-Ziv video codec for the set of test sequences.

  1. 48 CFR 401.105-1 - Publication and code arrangement.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Publication and code arrangement. 401.105-1 Section 401.105-1 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE GENERAL AGRICULTURE ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance 401.105-1 Publication and...

  2. Evaluation Codes from an Affine Variety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particularly nice examples of affine variety codes. Our study includes a reformulation of the usual methods to estimate the minimum distances of evaluation codes into the setting of affine variety codes. Finally we describe the connection to the theory of one-point geometric Goppa codes.

  3. Error analysis of a public domain pronunciation dictionary

    CSIR Research Space (South Africa)

    Martirosian, O

    2007-11-01

    ..., a popular public domain resource that is widely used in English speech processing systems. The techniques being investigated are applied to the lexicon and the results of each step are illustrated using sample entries. The authors found that as many...

  4. Privacy with Public Access: Digital Memorials on QR Codes

    DEFF Research Database (Denmark)

    Gotved, Stine

    2015-01-01

    The study takes its departure in gravestones with QR codes; objects at once physical and digital, underhandedly putting presumably private content within public reach. A plethora of issues of privacy and publicness are at play within the study's two connected but rather different empirical spaces: the physical ... The borderland between private and public is exemplified, and with the presentation, we are ensuring a continued discussion on privacy as well as legacy in our digital society.

  5. Coded Statutory Data Sets for Evaluation of Public Health Law

    Science.gov (United States)

    Costich, Julia Field

    2012-01-01

    Background and objectives: The evaluation of public health law requires reliable accounts of underlying statutes and regulations. States often enact public health-related statutes with nonuniform provisions, and variation in the structure of state legal codes can foster inaccuracy in evaluating the impact of specific categories of law. The optimal…

  6. 3D scene reconstruction based on multi-view distributed video coding in the Zernike domain for mobile applications

    Science.gov (United States)

    Palma, V.; Carli, M.; Neri, A.

    2011-02-01

    In this paper a multi-view Distributed Video Coding scheme for mobile applications is presented. Specifically, a new fusion technique between temporal and spatial side information in the Zernike moments domain is proposed. Distributed video coding introduces a flexible architecture that enables the design of very low-complexity video encoders compared to their traditional counterparts. The main goal of our work is to generate at the decoder the side information that optimally blends temporal and inter-view data. Multi-view distributed coding performance strongly depends on the quality of the side information built at the decoder. To improve its quality, a spatial view compensation/prediction in the Zernike moments domain is applied. Spatial and temporal motion activity are fused together to obtain the overall side information. The proposed method has been evaluated by rate-distortion performance for different inter-view and temporal estimation quality conditions.

  7. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  8. A novel domain overlapping strategy for the multiscale coupling of CFD with 1D system codes with applications to transient flows

    International Nuclear Information System (INIS)

    Grunloh, T.P.; Manera, A.

    2016-01-01

    Highlights: • A novel domain overlapping coupling method is presented. • Method calculates closure coefficients for system codes based on CFD results. • Convergence and stability are compared with a domain decomposition implementation. • Proposed method is tested in several 1D cases. • Proposed method found to exhibit more favorable convergence and stability behavior. - Abstract: A novel multiscale coupling methodology based on a domain overlapping approach has been developed to couple a computational fluid dynamics code with a best-estimate thermal hydraulic code. The methodology has been implemented in the coupling infrastructure code Janus, developed at the University of Michigan, providing methods for the online data transfer between the commercial computational fluid dynamics code STAR-CCM+ and the US NRC best-estimate thermal hydraulic system code TRACE. Coupling between these two software packages is motivated by the desire to extend the range of applicability of TRACE to scenarios in which local momentum and energy transfer are important, such as three-dimensional mixing. These types of flows are relevant, for example, in the simulation of passive safety systems including large containment pools, or for flow mixing in the reactor pressure vessel downcomer of current light water reactors and integral small modular reactors. The intrafluid shear forces neglected by TRACE equations of motion are readily calculated from computational fluid dynamics solutions. Consequently, the coupling methods used in this study are built around correcting TRACE solutions with data from a corresponding STAR-CCM+ solution. Two coupling strategies are discussed in the paper: one based on a novel domain overlapping approach specifically designed for transient operation, and a second based on the well-known domain decomposition approach. In the present paper, we discuss the application of the two coupling methods to the simulation of open and closed loops in both steady

  9. Codes of conduct in public schools: a legal perspective

    African Journals Online (AJOL)

    Erna Kinsey

    ...cation change in South Africa, particularly the transformation of public schools ... been granted legal personality to act as "juristic persons" (i.e. legal persons) ... ...cess, a decision is made to amend, or repeal, the code of conduct, depending on ...

  10. The DarkStars code: a publicly available dark stellar evolution package

    CERN Document Server

    Scott, Pat; Fairbairn, Malcolm

    2009-01-01

    We announce the public release of the 'dark' stellar evolution code DarkStars. The code simultaneously solves the equations of WIMP capture and annihilation in a star with those of stellar evolution assuming approximate hydrostatic equilibrium. DarkStars includes the most extensive WIMP microphysics of any dark evolution code to date. The code employs detailed treatments of the capture process from a range of WIMP velocity distributions, as well as composite WIMP distribution and conductive energy transport schemes based on the WIMP mean-free path in the star. We give a brief description of the input physics and practical usage of the code, as well as examples of its application to dark stars at the Galactic centre.

  11. Domain decomposition with local refinement for flow simulation around a nuclear waste disposal site: direct computation versus simulation using code coupling with OCamlP3L

    Energy Technology Data Exchange (ETDEWEB)

    Clement, F.; Vodicka, A.; Weis, P. [Institut National de Recherches Agronomiques (INRA), 78 - Le Chesnay (France); Martin, V. [Institut National de Recherches Agronomiques (INRA), 92 - Chetenay Malabry (France); Di Cosmo, R. [Institut National de Recherches Agronomiques (INRA), 78 - Le Chesnay (France); Paris-7 Univ., 75 (France)

    2003-07-01

    We consider the application of a non-overlapping domain decomposition method with non-matching grids, based on Robin interface conditions, to the problem of flow surrounding an underground nuclear waste disposal site. We show with a simple example how one can refine the mesh locally around the storage with this technique. A second aspect is studied in this paper. The coupling between the sub-domains can be computed in two ways: either directly (i.e. the domain decomposition algorithm is included in the code that solves the problems on the sub-domains) or using code coupling. In the latter case, each sub-domain problem is solved separately and the coupling is performed by another program. We wrote a coupling program in the functional language OCaml, using the OCamlP3L environment, which is devoted to easing parallelism. Thus we test the code coupling and, at the same time, exploit the natural parallelism of domain decomposition methods. Some simple 2D numerical tests show promising results, and further studies are under way. (authors)
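A minimal sketch of a non-overlapping domain decomposition iteration with Robin interface conditions, in Python rather than the authors' OCaml and with invented discretization parameters: solve -u'' = 1 on (0,1) with u(0) = u(1) = 0, split at x = 0.5, exchanging Robin data between the two sub-domain solves.

```python
import numpy as np

n = 100                  # intervals per subdomain (illustrative)
h = 0.5 / n
p = 2.0                  # Robin parameter (illustrative choice)
lam1 = lam2 = 0.0        # Robin interface data, updated each sweep

def solve_left(lam):
    """Left half: Dirichlet u(0)=0, Robin u' + p*u = lam at x=0.5."""
    A = np.zeros((n, n)); b = np.full(n, 1.0)    # unknowns u_1..u_n
    for i in range(n - 1):                       # interior nodes
        A[i, i] = 2.0 / h**2
        if i > 0:
            A[i, i - 1] = -1.0 / h**2
        A[i, i + 1] = -1.0 / h**2
    A[n - 1, n - 2] = -1.0 / h                   # one-sided u' at interface
    A[n - 1, n - 1] = 1.0 / h + p
    b[n - 1] = lam
    return np.linalg.solve(A, b)

def solve_right(lam):
    """Right half: Robin -u' + p*u = lam at x=0.5, Dirichlet u(1)=0."""
    A = np.zeros((n, n)); b = np.full(n, 1.0)    # unknowns u_0..u_{n-1}
    A[0, 0] = 1.0 / h + p
    A[0, 1] = -1.0 / h
    b[0] = lam
    for i in range(1, n):
        A[i, i] = 2.0 / h**2
        A[i, i - 1] = -1.0 / h**2
        if i < n - 1:
            A[i, i + 1] = -1.0 / h**2
    return np.linalg.solve(A, b)

for _ in range(30):                              # Robin-Robin sweeps
    u1 = solve_left(lam1)
    u2 = solve_right(lam2)
    lam1 = (u2[1] - u2[0]) / h + p * u2[0]       # data from right domain
    lam2 = -(u1[-1] - u1[-2]) / h + p * u1[-1]   # data from left domain

err = abs(u1[-1] - 0.125)                        # exact u(0.5) = 1/8
print(f"interface value {u1[-1]:.4f}, error {err:.1e}")
```

In the paper the two sub-domain solves run as separate programs tied together by the OCamlP3L coupler; here both live in one process purely to show the interface update.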

  12. Domain decomposition with local refinement for flow simulation around a nuclear waste disposal site: direct computation versus simulation using code coupling with OCamlP3L

    International Nuclear Information System (INIS)

    Clement, F.; Vodicka, A.; Weis, P.; Martin, V.; Di Cosmo, R.

    2003-01-01

    We consider the application of a non-overlapping domain decomposition method with non-matching grids, based on Robin interface conditions, to the problem of flow surrounding an underground nuclear waste disposal site. We show with a simple example how one can refine the mesh locally around the storage with this technique. A second aspect is studied in this paper. The coupling between the sub-domains can be computed in two ways: either directly (i.e. the domain decomposition algorithm is included in the code that solves the problems on the sub-domains) or using code coupling. In the latter case, each sub-domain problem is solved separately and the coupling is performed by another program. We wrote a coupling program in the functional language OCaml, using the OCamlP3L environment, which is devoted to easing parallelism. Thus we test the code coupling and, at the same time, exploit the natural parallelism of domain decomposition methods. Some simple 2D numerical tests show promising results, and further studies are under way. (authors)

  13. A Public Domain Software Library for Reading and Language Arts.

    Science.gov (United States)

    Balajthy, Ernest

    A three-year project carried out by the Microcomputers and Reading Committee of the New Jersey Reading Association involved the collection, improvement, and distribution of free microcomputer software (public domain programs) designed to deal with reading and writing skills. Acknowledging that this free software is not without limitations (poor…

  14. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States available in the public domain. The present code is the direct descendant of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross-section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids.
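The flavor of such a Monte Carlo transport calculation can be shown with a deliberately tiny example; this is not MCNP and has none of its physics. It estimates the transmission of mono-energetic particles through a purely absorbing slab (illustrative cross-section and thickness) and compares with the analytic answer.

```python
import math
import random

random.seed(42)
sigma_t = 1.0        # total macroscopic cross-section [1/cm] (illustrative)
width = 3.0          # slab thickness [cm]
n_hist = 100_000     # particle histories

# Sample an exponential free-flight distance for each history; a particle
# is transmitted if it crosses the slab before its first collision.
transmitted = sum(
    1 for _ in range(n_hist)
    if -math.log(random.random()) / sigma_t > width
)
estimate = transmitted / n_hist
print(f"MC: {estimate:.4f}  analytic: {math.exp(-sigma_t * width):.4f}")
```

Real codes add scattering, energy dependence, geometry, and the variance-reduction and estimation machinery the abstract mentions.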

  15. Data and code for the exploratory data analysis of the electrical energy demand in the time domain in Greece.

    Science.gov (United States)

    Tyralis, Hristos; Karakatsanis, Georgios; Tzouka, Katerina; Mamassis, Nikos

    2017-08-01

    We present data and code for visualizing electrical energy data and weather-, climate-related and socioeconomic variables in the time domain in Greece. The electrical energy data include the hourly demand, week-ahead forecasts of the demand provided by the Greek Independent Power Transmission Operator, and pricing values in Greece. We also present the daily temperature in Athens and the Gross Domestic Product of Greece. The code combines the data into a single report, which includes visualizations of combinations of all variables at multiple time scales. The data and code were used in Tyralis et al. (2017) [1].
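A sketch of the kind of multi-time-scale combination the report performs, using pandas on synthetic stand-in series; the real data accompany the paper, and the numbers below are invented.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Synthetic hourly demand with a daily cycle (stand-in for the real series)
hours = pd.date_range("2016-01-01", periods=24 * 365, freq="h")
demand = pd.Series(5000 + 800 * np.sin(2 * np.pi * hours.hour / 24)
                   + rng.normal(0, 100, len(hours)),
                   index=hours, name="demand_MW")

# Synthetic daily temperature with a yearly cycle
days = pd.date_range("2016-01-01", periods=365, freq="D")
temp = pd.Series(18 + 8 * np.sin(2 * np.pi * np.arange(365) / 365),
                 index=days, name="temp_degC")

# Combine the variables and summarize at multiple time scales
daily = demand.resample("D").mean().to_frame().join(temp)
monthly = daily.resample("MS").mean()
print(monthly.shape)  # -> (12, 2)
```

The same resample-and-join pattern extends to the weekly and yearly scales used in the report.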

  16. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Science.gov (United States)

    2010-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... GENERAL PROVISIONS § 201.26 Recordation of documents pertaining to computer shareware and donation of public domain computer software. (a) General. This section prescribes the procedures for submission of...

  17. Code-To-Code Benchmarking Of The Porflow And GoldSim Contaminant Transport Models Using A Simple 1-D Domain - 11191

    International Nuclear Information System (INIS)

    Hiergesell, R.; Taylor, G.

    2010-01-01

    An investigation was conducted to compare and evaluate the contaminant transport results of two model codes, GoldSim and Porflow, using a simple 1-D string of elements in each code. Model domains were constructed to be identical with respect to cell numbers and dimensions, matrix material, flow boundary and saturation conditions. One of the codes, GoldSim, does not simulate advective movement of water; therefore the water flux term was specified as a boundary condition. In the other code, Porflow, a steady-state flow field was computed and contaminant transport was simulated within that flow field. The comparisons were made solely in terms of the ability of each code to perform contaminant transport. The purpose of the investigation was to establish a basis for, and to validate, follow-on work in which a 1-D GoldSim model was developed by abstracting information from Porflow 2-D and 3-D unsaturated and saturated zone models and then benchmarked to produce equivalent contaminant transport results. A handful of contaminants were selected for the code-to-code comparison simulations, including a non-sorbing tracer and several long- and short-lived radionuclides exhibiting non-sorbing to strongly-sorbing characteristics with respect to the matrix material, several of which required the simulation of in-growth of daughter radionuclides. The same diffusion and partitioning coefficients associated with each contaminant and the half-lives associated with each radionuclide were incorporated into each model. A string of 10 elements, having identical spatial dimensions and properties, was constructed within each code. GoldSim's basic contaminant transport elements, mixing cells, were utilized in this construction. Sand was established as the matrix material and was assigned identical properties (e.g. bulk density, porosity, saturated hydraulic conductivity) in both codes.
Boundary conditions applied included an influx of water at the rate of 40 cm/yr at one
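The 10-cell mixing-cell chain described above can be mimicked with a toy explicit time-stepper: each well-mixed cell receives its upstream neighbour's concentration and loses mass to flushing and first-order decay. All parameter values are invented for illustration and are not those of the benchmark.

```python
import numpy as np

n_cells = 10
q_over_v = 0.5            # flushing rate per cell [1/yr] (illustrative)
lam = np.log(2) / 30.0    # decay constant, 30-yr half-life [1/yr]
dt = 0.01                 # time step [yr]
c_in = 1.0                # constant inlet concentration
c = np.zeros(n_cells)

for _ in range(int(200 / dt)):   # run 200 years, long enough for steady state
    upstream = np.concatenate(([c_in], c[:-1]))
    c += dt * (q_over_v * (upstream - c) - lam * c)

# Analytic steady state: each cell attenuates by q_over_v / (q_over_v + lam)
atten = q_over_v / (q_over_v + lam)
expected = c_in * atten ** np.arange(1, n_cells + 1)
print(np.allclose(c, expected, rtol=1e-3))  # -> True
```

The geometric attenuation from cell to cell is the 1-D behavior the benchmark compares between the two codes.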

  18. Lessons Learned through the Development and Publication of AstroImageJ

    Science.gov (United States)

    Collins, Karen

    2018-01-01

    As lead author of the scientific image processing software package AstroImageJ (AIJ), I will discuss the reasoning behind why we decided to release AIJ to the public, and the lessons we learned related to the development, publication, distribution, and support of AIJ. I will also summarize the AIJ code language selection, code documentation and testing approaches, code distribution, update, and support facilities used, and the code citation and licensing decisions. Since AIJ was initially developed as part of my graduate research and was my first scientific open source software publication, many of my experiences and difficulties encountered may parallel those of others new to scientific software publication. Finally, I will discuss the benefits and disadvantages of releasing scientific software that I now recognize after having AIJ in the public domain for more than five years.

  19. Single-shot secure quantum network coding on butterfly network with free public communication

    Science.gov (United States)

    Owari, Masaki; Kato, Go; Hayashi, Masahito

    2018-01-01

    Quantum network coding on the butterfly network has been studied as a typical example of a quantum multiple-cast network. We propose a secure quantum network code for the butterfly network with free public classical communication in the multiple unicast setting under restricted eavesdropper power. This protocol certainly transmits quantum states when there is no attack. We also show the secrecy, with shared randomness as an additional resource, when the eavesdropper wiretaps one of the channels in the butterfly network and also obtains the information sent through public classical communication. Our protocol does not require a verification process, which ensures single-shot security.

  20. The Astrophysics Source Code Library: Supporting software publication and citation

    Science.gov (United States)

    Allen, Alice; Teuben, Peter

    2018-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net), established in 1999, is a free online registry for source codes used in research that has appeared in, or been submitted to, peer-reviewed publications. The ASCL is indexed by the SAO/NASA Astrophysics Data System (ADS) and Web of Science and is citable by using the unique ascl ID assigned to each code. In addition to registering codes, the ASCL can house archive files for download and assign them DOIs. The ASCL advocates for software citation on par with article citation, participates in multidisciplinary events such as Force11, OpenCon, and the annual Workshop on Sustainable Software for Science, works with journal publishers, and organizes Special Sessions and Birds of a Feather meetings at national and international conferences such as Astronomical Data Analysis Software and Systems (ADASS), the European Week of Astronomy and Space Science, and AAS meetings. In this presentation, I will discuss some of the challenges of gathering credit for publishing software, and ideas and efforts from other disciplines that may be useful to astronomy.

  1. Game-Coding Workshops in New Zealand Public Libraries: Evaluation of a Pilot Project

    Science.gov (United States)

    Bolstad, Rachel

    2016-01-01

    This report evaluates a game-coding workshop offered to young people and adults in seven public libraries around New Zealand. Participants were taken step by step through the process of creating their own simple 2D videogame, learning the basics of coding, computational thinking, and digital game design. The workshops were free and drew 426 people…

  2. An advanced frequency-domain code for boiling water reactor (BWR) stability analysis and design

    International Nuclear Information System (INIS)

    Behrooz, A.

    2008-01-01

    The two-phase flow instability is of interest for the design and operation of many industrial systems such as boiling water reactors (BWRs), chemical reactors, and steam generators. In BWRs, the flow instabilities are coupled to power instabilities via neutronic-thermal hydraulic feedbacks. Since these instabilities also produce local pressure oscillations, coolant flashing plays a very important role at low pressure. Many frequency-domain codes have been used for two-phase flow stability analysis of thermal hydraulic industrial systems, with particular emphasis on BWRs. Some ignored the effect of the local pressure or the effect of 3D power oscillations, and many were not able to deal with neutronics-thermal hydraulics problems considering the entire core and all its fuel assemblies. The new frequency-domain tool uses the best available nuclear, thermal hydraulic, algebraic and control theory methods for simulating BWRs and analyzing their stability in either off-line or on-line fashion. The novel code takes all necessary information from plant files via an interface; solves and integrates, for all reactor fuel assemblies divided into a number of segments, the thermal-hydraulic non-homogeneous non-equilibrium coupled linear differential equations; and solves the 3D, two-energy-group diffusion equations for the entire core (with spatial expansion of the neutron fluxes in Legendre polynomials). It is important to note that the neutronics equations written in terms of flux harmonics for a discretized system (nodal-modal equations) generate a set of large sparse matrices. The eigenvalue problem associated with the discretized core statics equations is solved by the implementation of the implicitly restarted Arnoldi method (IRAM) with an implicit shifted QR mechanism. The results of the steady state are then used for the calculation of the local transfer functions and system transfer matrices. The latter are large, dense, complex matrices (their size
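The implicitly restarted Arnoldi method named above is what the ARPACK library implements, exposed in SciPy as `scipy.sparse.linalg.eigs`. The sketch below applies it to a stand-in sparse operator (a random matrix, not the nodal-modal core equations) and checks the eigenpair residual.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigs   # ARPACK: implicitly restarted Arnoldi

n = 200
A = sp.random(n, n, density=0.05, random_state=0, format="csr")
A = A + 2.0 * sp.identity(n)           # shift so the spectrum is tame

# Dominant eigenpair via IRAM (largest magnitude)
vals, vecs = eigs(A, k=1, which="LM")
lam, v = vals[0], vecs[:, 0]

res = np.linalg.norm(A @ v - lam * v)  # residual ||A v - lambda v||
print(f"dominant eigenvalue {lam.real:.3f}, residual {res:.1e}")
```

For the core statics problem one would target the fundamental mode and its harmonics (k > 1) instead of a single dominant pair.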

  3. The Moral Reasoning of Public Accountants in the Development of a Code of Ethics: the Case of Indonesia

    Directory of Open Access Journals (Sweden)

    A. S. L. Lindawati

    2012-03-01

    The objective of this study is to explore users' perceptions of the role of moral reasoning in influencing the implementation of codes of ethics as standards and guidance for professional audit practice by Indonesian public accountants. The study focuses on two important aspects of influence: (i) the key factors influencing professional public accountants in implementing a code of ethics as a standard for audit practice, and (ii) the key activities performed by public accountants as moral agents for establishing awareness of professional values. Two theoretical approaches/models are used as guides for exploring the influence of moral reasoning of public accountants: first, Kohlberg's model of moral development (Kohlberg 1982) and, secondly, the American Institute of Certified Public Accountants (AICPA) Code of Conduct, especially the five principles of the code of ethics (1992, 2004). The study employs a multiple case study model to analyse the data collected from interviewing 15 financial managers of different company categories (as users). The findings indicate that (i) moral development is an important component in influencing the moral reasoning of individual public accountants, (ii) the degree of professionalism of public accountants is determined by the degree of the development of their moral reasoning, and (iii) the moral reasoning of individuals influences both Indonesian public accountants and company financial managers in building and improving the effectiveness of the implementation of codes of conduct. It is concluded that moral reasoning is an important influence on achieving ethical awareness in public accountants and financial managers. The development of a full code of ethics and an effective compliance monitoring system is essential for Indonesia if it is to play a role in the emerging global economy.

  4. Accumulation of Domain-Specific Physical Inactivity and Presence of Hypertension in Brazilian Public Healthcare System.

    Science.gov (United States)

    Turi, Bruna Camilo; Codogno, Jamile S; Fernandes, Romulo A; Sui, Xuemei; Lavie, Carl J; Blair, Steven N; Monteiro, Henrique Luiz

    2015-11-01

    Hypertension is one of the most common noncommunicable diseases worldwide, and physical inactivity is a risk factor predisposing to its occurrence and complications. However, the association between physical inactivity domains and hypertension remains unclear, especially in public healthcare systems. Thus, this study aimed to investigate the association between the aggregation of physical inactivity in different domains and the prevalence of hypertension among users of the Brazilian public health system. The sample comprised 963 participants. Subjects were divided into quartile groups according to 3 different domains of physical activity (occupational; physical exercises; and leisure-time and transportation). Hypertension was ascertained by physician diagnosis. Physical inactivity in the occupational domain was significantly associated with a higher prevalence of hypertension (OR = 1.52 [1.05 to 2.21]). The same pattern occurred for physical inactivity in leisure-time (OR = 1.63 [1.11 to 2.39]) and for the aggregation of physical inactivity in all 3 domains (OR = 2.46 [1.14 to 5.32]). However, the multivariate-adjusted model showed a significant association between hypertension and physical inactivity in all 3 domains (OR = 2.57 [1.14 to 5.79]). The results suggest an unequal prevalence of hypertension according to physical inactivity across different domains; increased promotion of physical activity in the healthcare system is needed.
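
The odds ratios quoted above follow the standard 2×2-table computation. As a hedged illustration only (the counts below are invented, not the study's data), the point estimate and a Wald 95 % confidence interval can be obtained as:

```python
import math

def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio with a 95% Wald confidence interval for a 2x2 table."""
    a, b, c, d = exposed_cases, exposed_controls, unexposed_cases, unexposed_controls
    odds = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    ci_low = math.exp(math.log(odds) - 1.96 * se)
    ci_high = math.exp(math.log(odds) + 1.96 * se)
    return odds, ci_low, ci_high

# Illustrative counts: inactive/active vs. hypertensive/normotensive.
odds, ci_low, ci_high = odds_ratio(120, 180, 90, 220)
```

An OR whose confidence interval excludes 1 (as in the three significant associations above) indicates the exposure is associated with the outcome.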

  5. Understanding Mixed Code and Classroom Code-Switching: Myths and Realities

    Science.gov (United States)

    Li, David C. S.

    2008-01-01

    Background: Cantonese-English mixed code is ubiquitous in Hong Kong society, and yet using mixed code is widely perceived as improper. This paper presents evidence of mixed code being socially constructed as bad language behavior. In the education domain, an EDB guideline bans mixed code in the classroom. Teachers are encouraged to stick to…

  6. Language Choice and Use of Malaysian Public University Lecturers in the Education Domain

    Directory of Open Access Journals (Sweden)

    Tam Lee Mei

    2016-02-01

    Full Text Available It is the norm for people from a multilingual and multicultural country such as Malaysia to speak two or more languages. Thus, the Malaysian multilingual situation results in speakers having to make decisions about which languages are to be used for different purposes in different domains. In order to explain the phenomenon of language choice, Fishman’s domain analysis (1964) was adapted for this research. According to Fishman’s domain analysis, language choice and use may depend on the speaker’s experiences situated in different settings, the different language repertoires that are available to the speaker, different interlocutors and different topics. Such situations inevitably cause barriers and difficulties for those professionals who work in the education domain. Therefore, the purpose of this research is to explore the language choice and use of Malaysian public university lecturers in the education domain and to investigate whether any significant differences exist between ethnicity and field of study and the English language choice and use of the lecturers. 200 survey questionnaires were distributed to examine the details of the lecturers’ language choice and use. The findings of this research reveal that all of the respondents generally preferred to choose and use the English language in both the formal and informal education domain. Besides, all of the respondents claimed that they chose and used more than one language. It was also found that the ethnicity and field of study of the respondents influence language choice and use in the education domain. In addition, this research suggests that the language and educational policy makers have been largely successful in raising the role and status of the English language as the medium of instruction in tertiary education while maintaining the Malay language as having an important role in communicative acts, thus characterizing the lecturers’ language choice and use. Keywords: Language

  7. Scalable-to-lossless transform domain distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Ukhanova, Ann; Veselov, Anton

    2010-01-01

    Distributed video coding (DVC) is a novel approach providing new features such as low-complexity encoding, mainly by exploiting the source statistics at the decoder based on the availability of decoder side information. In this paper, scalable-to-lossless DVC is presented based on extending a lossy Tran...... codec provides frame-by-frame encoding. Comparing lossless coding efficiency, the proposed scalable-to-lossless TDWZ video codec can save up to 5%-13% of the bits compared to JPEG-LS and H.264 Intra-frame lossless coding, and does so as a scalable-to-lossless coding.

  8. Regulatory context and evolutions - Public Health Code

    International Nuclear Information System (INIS)

    Rodde, S.

    2009-01-01

    After having recalled that numerous laws, decrees and orders were published between 2001 and 2007, owing to the transposition of EURATOM directives defining standards for the protection of the health of the population and of workers against the dangers of ionizing radiation, the author reviews the regulatory evolutions which occurred in 2008 and 2009, and those currently in progress. They concern the protection of people exposed to radon, the field of radiotherapy, and authorizations issued in application of the French Public Health Code. Some decisions are about to be finalized; they concern the activities subject to declaration, the modalities for extending the lifetime of sealed sources, and a list of apparatus categories whose handling requires a certificate of competence

  9. Blockchain-based Public Key Infrastructure for Inter-Domain Secure Routing

    OpenAIRE

    de la Rocha Gómez-Arevalillo , Alfonso; Papadimitratos , Panos

    2017-01-01

    A gamut of secure inter-domain routing protocols has been proposed in the literature. They use traditional PGP-like and centralized Public Key Infrastructures for trust management. In this paper, we propose an alternative approach for managing security associations, Secure Blockchain Trust Management (SBTM), a trust management system that instantiates a blockchain-based PKI for the operation of secure routing protocols. A main motivation for SBTM is to facilitate gradu...

  10. Remotely Piloted Aircraft and War in the Public Relations Domain

    Science.gov (United States)

    2014-10-01


  11. Combinatorial geometry domain decomposition strategies for Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Li, G.; Zhang, B.; Deng, L.; Mo, Z.; Liu, Z.; Shangguan, D.; Ma, Y.; Li, S.; Hu, Z. [Institute of Applied Physics and Computational Mathematics, Beijing, 100094 (China)

    2013-07-01

    Analysis and modeling of nuclear reactors can lead to memory overload on a single-core processor when refined modeling is required. A method to solve this problem is called 'domain decomposition'. In the current work, domain decomposition algorithms for a combinatorial geometry Monte Carlo transport code are developed on the JCOGIN (J Combinatorial Geometry Monte Carlo transport INfrastructure). Tree-based decomposition and asynchronous communication of particle information between domains are described in the paper. The combination of domain decomposition and domain replication (particle parallelism) is demonstrated and compared with that of the MERCURY code. A full-core reactor model is simulated to verify the domain decomposition algorithms using the Monte Carlo particle transport code JMCT (J Monte Carlo Transport Code), which is being developed on the JCOGIN infrastructure. The influence of the domain decomposition algorithms on tally variances is also discussed. (authors)
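
The particle hand-off between spatial subdomains described above can be sketched in miniature. The following toy Python example (all names and physics invented, vastly simpler than JMCT) shows only the core bookkeeping of domain decomposition: particles are tracked inside one subdomain of a 1-D slab and buffered for the neighbouring subdomain when they cross its boundary.

```python
import random

random.seed(1)

DOMAINS = [(0.0, 5.0), (5.0, 10.0)]  # two spatial subdomains of a 1-D slab

def track(particle, bounds):
    """Advance one particle by random flights until it is absorbed or leaves."""
    lo, hi = bounds
    x, alive = particle
    while alive:
        x += random.expovariate(1.0) * random.choice((-1, 1))
        if x < lo or x > hi:
            return (x, True), "crossed"    # hand off or leak
        if random.random() < 0.3:
            return (x, False), "absorbed"  # terminated inside this domain

def run(n_particles):
    """Process each domain's queue in turn, exchanging boundary crossers."""
    queues = {0: [(2.5, True)] * n_particles, 1: []}
    absorbed = leaked = 0
    while any(queues.values()):
        for dom, bounds in enumerate(DOMAINS):
            outgoing = []
            for p in queues[dom]:
                (x, alive), fate = track(p, bounds)
                if fate == "absorbed":
                    absorbed += 1
                elif 0.0 <= x <= 10.0:
                    outgoing.append((x, True))  # entered the other subdomain
                else:
                    leaked += 1                 # left the whole slab
            queues[dom] = []
            queues[1 - dom].extend(outgoing)
    return absorbed, leaked

absorbed, leaked = run(200)
```

A real domain-decomposed code would exchange these buffers asynchronously between processors (MPI ranks) rather than in a serial sweep, but the absorbed/leaked/handed-off bookkeeping is the same.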

  12. Combinatorial geometry domain decomposition strategies for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Li, G.; Zhang, B.; Deng, L.; Mo, Z.; Liu, Z.; Shangguan, D.; Ma, Y.; Li, S.; Hu, Z.

    2013-01-01

    Analysis and modeling of nuclear reactors can lead to memory overload on a single-core processor when refined modeling is required. A method to solve this problem is called 'domain decomposition'. In the current work, domain decomposition algorithms for a combinatorial geometry Monte Carlo transport code are developed on the JCOGIN (J Combinatorial Geometry Monte Carlo transport INfrastructure). Tree-based decomposition and asynchronous communication of particle information between domains are described in the paper. The combination of domain decomposition and domain replication (particle parallelism) is demonstrated and compared with that of the MERCURY code. A full-core reactor model is simulated to verify the domain decomposition algorithms using the Monte Carlo particle transport code JMCT (J Monte Carlo Transport Code), which is being developed on the JCOGIN infrastructure. The influence of the domain decomposition algorithms on tally variances is also discussed. (authors)

  13. The added value of international benchmarks for fuel performance codes: an illustration on the basis of TRANSURANUS

    International Nuclear Information System (INIS)

    Van Uffelen, P.; Schubert, A.; Gyeori, C.; Van De Laar, J.

    2009-01-01

    Safety authorities and fuel designers, as well as nuclear research centres, rely heavily on fuel performance codes for predicting the behaviour and lifetime of fuel rods. The simulation tools are developed and validated on the basis of experimental results, some of which are in the public domain, such as the International Fuel Performance Experiments database of the OECD/NEA and IAEA. Publicly available data constitute an excellent basis for assessing the codes themselves, but also for comparing codes that are being developed by independent teams. The present report summarises the advantages the TRANSURANUS code has gained by taking part in previous benchmarks organised by the IAEA, and outlines preliminary results along with the perspectives of our participation in the current coordinated research project FUMEX-III

  14. ADLIB: A simple database framework for beamline codes

    International Nuclear Information System (INIS)

    Mottershead, C.T.

    1993-01-01

    There are many well-developed codes available for beamline design and analysis. A significant fraction of each of these codes is devoted to processing its own unique input language for describing the problem. None of these large, complex, and powerful codes does everything. Adding a new bit of specialized physics can be a difficult task whose successful completion makes the code even larger and more complex. This paper describes an attempt to move in the opposite direction, toward a family of small, simple, single-purpose physics and utility modules, linked by an open, portable, public-domain database framework. These small specialized physics codes begin with the beamline parameters already loaded in the database, and accessible via the handful of subroutines that constitute ADLIB. Such codes are easier to write, and inherently organized in a manner suitable for incorporation in model-based control system algorithms. Examples include programs for analyzing beamline misalignment sensitivities, for simulating and fitting beam steering data, and for translating among MARYLIE, TRANSPORT, and TRACE3D formats
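
The "handful of subroutines" idea — small single-purpose codes that query a shared parameter store instead of each parsing its own input language — can be illustrated with a purely hypothetical sketch (ADLIB itself is not Python, and every name below is invented):

```python
class BeamlineDB:
    """Toy stand-in for an ADLIB-style shared beamline parameter store."""

    def __init__(self):
        self._elements = []  # ordered list of beamline elements

    def add_element(self, name, kind, **params):
        self._elements.append({"name": name, "kind": kind, **params})

    def get(self, name):
        for e in self._elements:
            if e["name"] == name:
                return e
        raise KeyError(name)

    def elements_of_kind(self, kind):
        return [e for e in self._elements if e["kind"] == kind]

# A small analysis module needs only these accessors, not an input parser:
db = BeamlineDB()
db.add_element("Q1", "quadrupole", length=0.3, gradient=12.5)
db.add_element("D1", "drift", length=1.0)
quads = db.elements_of_kind("quadrupole")
```

With the parameters preloaded, a misalignment-sensitivity or steering-fit module reduces to a loop over `elements_of_kind(...)` rather than a full input-language front end.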

  15. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate thr...... important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable....

  16. Suburban development – a search for public domains in Danish suburban neighbourhoods

    DEFF Research Database (Denmark)

    Melgaard, Bente; Bech-Danielsen, Claus

    In these years, some of the post-war Danish suburbs are facing great challenges: social segregation, demographic changes and challenges in building technology. In particular, segregation prevents social life from unfolding across social, economic and cultural borders. Therefore, in this paper......, potentials for bridge-building across the enclaves of the suburb are looked for through a combined architectural-anthropological mapping of public spaces in a specific suburb in Denmark, the analyses being carried out in the light of Hajer & Reijndorp’s definition of public domains and the term exchange...

  17. Beyond cross-domain learning: Multiple-domain nonnegative matrix factorization

    KAUST Repository

    Wang, Jim Jing-Yan; Gao, Xin

    2014-01-01

    Traditional cross-domain learning methods transfer learning from a source domain to a target domain. In this paper, we propose the multiple-domain learning problem for several equally treated domains. The multiple-domain learning problem assumes that samples from different domains have different distributions, but share the same feature and class label spaces. Each domain could be a target domain, while also be a source domain for other domains. A novel multiple-domain representation method is proposed for the multiple-domain learning problem. This method is based on nonnegative matrix factorization (NMF), and tries to learn a basis matrix and coding vectors for samples, so that the domain distribution mismatch among different domains will be reduced under an extended variation of the maximum mean discrepancy (MMD) criterion. The novel algorithm - multiple-domain NMF (MDNMF) - was evaluated on two challenging multiple-domain learning problems - multiple user spam email detection and multiple-domain glioma diagnosis. The effectiveness of the proposed algorithm is experimentally verified. © 2013 Elsevier Ltd. All rights reserved.
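
The "basis matrix and coding vectors" learned by NMF-based methods such as MDNMF rest on the standard multiplicative-update factorization. A minimal sketch of that core (plain Lee-Seung NMF only; the MMD domain-mismatch term that MDNMF adds on top is omitted here):

```python
import numpy as np

def nmf(X, k, iters=200, seed=0):
    """Factor a nonnegative matrix X (m x n) as W @ H, W >= 0, H >= 0,
    using Lee-Seung multiplicative updates for the Frobenius objective."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + 0.1   # basis matrix
    H = rng.random((k, n)) + 0.1   # coding vectors (one column per sample)
    eps = 1e-9                     # guards against division by zero
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

rng = np.random.default_rng(1)
X = rng.random((20, 30))           # 30 samples with 20 nonnegative features
W, H = nmf(X, k=5)
rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

In the multiple-domain setting, samples from all domains share the basis W while the coding vectors H are additionally regularized so that domain distributions match under the MMD criterion.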

  18. Beyond cross-domain learning: Multiple-domain nonnegative matrix factorization

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-02-01

    Traditional cross-domain learning methods transfer learning from a source domain to a target domain. In this paper, we propose the multiple-domain learning problem for several equally treated domains. The multiple-domain learning problem assumes that samples from different domains have different distributions, but share the same feature and class label spaces. Each domain could be a target domain, while also be a source domain for other domains. A novel multiple-domain representation method is proposed for the multiple-domain learning problem. This method is based on nonnegative matrix factorization (NMF), and tries to learn a basis matrix and coding vectors for samples, so that the domain distribution mismatch among different domains will be reduced under an extended variation of the maximum mean discrepancy (MMD) criterion. The novel algorithm - multiple-domain NMF (MDNMF) - was evaluated on two challenging multiple-domain learning problems - multiple user spam email detection and multiple-domain glioma diagnosis. The effectiveness of the proposed algorithm is experimentally verified. © 2013 Elsevier Ltd. All rights reserved.

  19. The Value of Privacy and Surveillance Drones in the Public Domain : Scrutinizing the Dutch Flexible Deployment of Mobile Cameras Act

    NARCIS (Netherlands)

    Gerdo Kuiper; Quirine Eijkman

    2017-01-01

    The flexible deployment of drones in the public domain is assessed in this article from a legal-philosophical perspective. On the basis of the theories of Dworkin and Moore, the distinction between individual rights and collective security policy goals is discussed. Mobile cameras in the public domain

  20. Joint Frequency-Domain Equalization and Despreading for Multi-Code DS-CDMA Using Cyclic Delay Transmit Diversity

    Science.gov (United States)

    Yamamoto, Tetsuya; Takeda, Kazuki; Adachi, Fumiyuki

    Frequency-domain equalization (FDE) based on the minimum mean square error (MMSE) criterion can provide a better bit error rate (BER) performance than rake combining. To further improve the BER performance, cyclic delay transmit diversity (CDTD) can be used. CDTD simultaneously transmits the same signal from different antennas after adding different cyclic delays to increase the number of equivalent propagation paths. Although a joint use of CDTD and MMSE-FDE for direct sequence code division multiple access (DS-CDMA) achieves larger frequency diversity gain, the BER performance improvement is limited by the residual inter-chip interference (ICI) after FDE. In this paper, we propose joint FDE and despreading for DS-CDMA using CDTD. Equalization and despreading are simultaneously performed in the frequency-domain to suppress the residual ICI after FDE. A theoretical conditional BER analysis is presented for the given channel condition. The BER analysis is confirmed by computer simulation.
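
The MMSE-FDE step this paper builds on can be shown in a few lines: per subcarrier k, the MMSE weight is W_k = H_k* / (|H_k|^2 + sigma^2). The sketch below illustrates plain MMSE equalization of one cyclic-prefix block over a multipath channel; spreading/despreading and CDTD, which the paper adds, are omitted, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                                    # block length (chips)
h = np.array([0.9, 0.5, 0.3])             # 3-path channel impulse response
H = np.fft.fft(h, N)                      # channel frequency response

s = rng.choice([-1.0, 1.0], size=N)       # transmitted BPSK chips
snr_db = 15.0
noise_var = 10 ** (-snr_db / 10)

# With a cyclic prefix the channel acts as circular convolution; add AWGN.
r = np.fft.ifft(np.fft.fft(s) * H) + np.sqrt(noise_var / 2) * (
    rng.standard_normal(N) + 1j * rng.standard_normal(N)
)

# Per-subcarrier MMSE weights: conjugate channel over |H|^2 + noise power.
W = np.conj(H) / (np.abs(H) ** 2 + noise_var)
s_hat = np.fft.ifft(np.fft.fft(r) * W).real

ber = np.mean(np.sign(s_hat) != s)
```

In the paper's scheme, the despreading (and the combining across CDTD-induced equivalent paths) is folded into this same frequency-domain multiplication to suppress the residual inter-chip interference that per-chip FDE leaves behind.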

  1. Domains of State-Owned, Privately Held, and Publicly Traded Firms in International Competition.

    Science.gov (United States)

    Mascarenhas, Briance

    1989-01-01

    Hypotheses relating ownership to domain differences among state-owned, publicly traded, and privately held firms in international competition were examined in a controlled field study of the offshore drilling industry. Ownership explained selected differences in domestic market dominance, international presence, and customer orientation, even…

  2. Public-domain software for root image analysis

    Directory of Open Access Journals (Sweden)

    Mirian Cristina Gomes Costa

    2014-10-01

    Full Text Available In the search for high efficiency in root studies, computational systems have been developed to analyze digital images. ImageJ and Safira are public-domain systems that may be used for image analysis of washed roots. However, differences between root properties measured using ImageJ and Safira are expected. This study compared values of root length and surface area obtained with the public-domain systems with values obtained by a reference method. Root samples were collected in a banana plantation in an area of a shallower Typic Carbonatic Haplic Cambisol (CXk) and an area of a deeper Typic Haplic Ta Eutrophic Cambisol (CXve), at six depths in five replications. Root images were digitized and the systems ImageJ and Safira were used to determine root length and surface area. The line-intersect method modified by Tennant was used as reference; values of root length and surface area measured with the different systems were analyzed by Pearson’s correlation coefficient and compared by confidence interval and t-test. Both ImageJ and Safira had positive correlation coefficients with the reference method for root length and surface area data in CXk and CXve. The correlation coefficient ranged from 0.54 to 0.80, with the lowest value observed for ImageJ in the measurement of the surface area of roots sampled in CXve. The 95 % confidence interval revealed that root length measurements with Safira did not differ from those of the reference method in CXk (-77.3 to 244.0 mm). Regarding surface area measurements, Safira did not differ from the reference method for samples collected in CXk (-530.6 to 565.8 mm²) as well as in CXve (-4231 to 612.1 mm²). However, measurements with ImageJ were different from those obtained by the reference method, underestimating length and surface area in samples collected in CXk and CXve. Both ImageJ and Safira allow an identification of increases or decreases in root length and surface area. However, Safira results for root length and surface area are

  3. DATABASES AND THE SUI-GENERIS RIGHT – PROTECTION OUTSIDE THE ORIGINALITY. THE DISREGARD OF THE PUBLIC DOMAIN

    Directory of Open Access Journals (Sweden)

    Monica LUPAȘCU

    2018-05-01

    Full Text Available This study focuses on databases as they are regulated by Directive no. 96/9/EC regarding the protection of databases. There are also several references to Romanian Law no. 8/1996 on copyright and neighbouring rights, which implements the mentioned European Directive. The study analyses certain effects that the sui-generis protection has on the public domain, and tries to demonstrate that the regulation specific to databases neglects the interests correlated with the public domain. The effect of such a regulation is the abusive creation of some databases in which the public domain (meaning information not protected by copyright, such as news, ideas, procedures, methods, systems, processes, concepts, principles and discoveries) ends up being encapsulated and made available only to some private interests, access to the public domain thereby being regulated indirectly. The study begins by explaining the sui-generis right and its origin. The first mention of databases can be found in the “Green Paper on Copyright (1998),” a document that clearly shows that database protection was thought to cover a sphere of non-protectable information from the scientific and industrial fields. Several arguments are made by the author, most of them based on the report of the Public Consultation held in 2014 regarding the necessity of the sui-generis right. There are also references to specific case law, namely British Horseracing Board v William Hill and Fixtures Marketing Ltd. The ECJ’s decisions in those cases are of great importance for the support of the public interest in accessing information corresponding to some restrictive fields that are derived as a result of the maker’s activities, because in the absence of the sui-generis right, all this information could be freely accessed and used.

  4. The complexity of changes in the domain of managing public expenditures

    Directory of Open Access Journals (Sweden)

    Dimitrijević Marina

    2016-01-01

    Full Text Available Public expenditures are a huge problem in contemporary states. In the conditions of a global economic crisis, and in circumstances involving a high level of citizen dissatisfaction with the former methods of funding and managing the public sector (reflected in the erosion of funding sources, irrational spending of public funds, and increases in the budget deficit and the level of public debt), changes in the domain of managing public expenditures have become a priority. By their nature, these changes are complex and long-lasting, and they should bring significant improvements in the field of public expenditure; they have to provide for the lawful and purposeful spending of public funds. They are expected to lower the public revenues needed for financing public expenditure, to improve production and competition in the market economy, and to increase personal consumption, living standards and the quality of life of the population. Regardless of the social, economic, legal or political environment of each state, the topical issue of reforming the management of public expenditures seems to imply a return to a somewhat neglected need for the public sector to function within its own financial possibilities. The state modernisation processes and advancement in the process of managing public expenditures call for a realistic evaluation of the existing conditions and circumstances in which these processes occur, as well as an assessment of the potential and actual risks that may hinder their effectiveness. Otherwise, the establishment of a significant level of responsibility in spending budget funds and a greater transparency of public expenditure may remain far-fetched goals.

  5. Combating Identity Fraud in the Public Domain: Information Strategies for Healthcare and Criminal Justice

    NARCIS (Netherlands)

    Plomp, M.G.A.; Grijpink, J.H.A.M.

    2011-01-01

    Two trends are present in both the private and the public domain: increasing interorganisational co-operation and increasing digitisation. Nowadays, more and more processes within and between organisations take place electronically. These developments are visible at the local, national and European levels.

  6. Application of multi-thread computing and domain decomposition to the 3-D neutronics Fem code Cronos

    International Nuclear Information System (INIS)

    Ragusa, J.C.

    2003-01-01

    The purpose of this paper is to present the parallelization of the flux solver and the isotopic depletion module of the code, either using Message Passing Interface (MPI) or OpenMP. Thread parallelism using OpenMP was used to parallelize the mixed dual FEM (finite element method) flux solver MINOS. Investigations regarding the opportunity of mixing parallelism paradigms will be discussed. The isotopic depletion module was parallelized using domain decomposition and MPI. An attempt at using OpenMP was unsuccessful and will be explained. This paper is organized as follows: the first section recalls the different types of parallelism. The mixed dual flux solver and its parallelization are then presented. In the third section, we describe the isotopic depletion solver and its parallelization; and finally conclude with some future perspectives. Parallel applications are mandatory for fine mesh 3-dimensional transport and simplified transport multigroup calculations. The MINOS solver of the FEM neutronics code CRONOS2 was parallelized using the directive based standard OpenMP. An efficiency of 80% (resp. 60%) was achieved with 2 (resp. 4) threads. Parallelization of the isotopic depletion solver was obtained using domain decomposition principles and MPI. Efficiencies greater than 90% were reached. These parallel implementations were tested on a shared memory symmetric multiprocessor (SMP) cluster machine. The OpenMP implementation in the solver MINOS is only the first step towards fully using the SMPs cluster potential with a mixed mode parallelism. Mixed mode parallelism can be achieved by combining message passing interface between clusters with OpenMP implicit parallelism within a cluster
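
The depletion parallelization rests on the fact that, once the flux is known, each spatial zone can be depleted independently, which is why domain decomposition maps onto it so cleanly. A toy sketch of that principle only (a Python thread pool standing in for MPI ranks, with invented one-nuclide decay physics; the actual work used OpenMP directives in MINOS and MPI for the depletion module):

```python
import math
from concurrent.futures import ThreadPoolExecutor

LAMBDA = 1.2e-5   # effective depletion constant (1/s), illustrative
DT = 3600.0       # one burn step of one hour

def deplete_zone(n0):
    """Deplete one spatial zone over DT. Zones are independent, so each
    can be assigned to its own worker (one subdomain per rank)."""
    return n0 * math.exp(-LAMBDA * DT)

densities = [1.0e20 + 1.0e18 * i for i in range(32)]  # 32 zones

# Domain decomposition: the pool partitions the zones among workers.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(deplete_zone, densities))

serial = [deplete_zone(n) for n in densities]  # reference sweep
```

Because the zones never communicate within a burn step, the parallel and serial sweeps agree exactly, which is also why efficiencies above 90% are achievable for this part of the calculation.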

  7. Application of multi-thread computing and domain decomposition to the 3-D neutronics Fem code Cronos

    Energy Technology Data Exchange (ETDEWEB)

    Ragusa, J.C. [CEA Saclay, Direction de l' Energie Nucleaire, Service d' Etudes des Reacteurs et de Modelisations Avancees (DEN/SERMA), 91 - Gif sur Yvette (France)

    2003-07-01

    The purpose of this paper is to present the parallelization of the flux solver and the isotopic depletion module of the code, either using Message Passing Interface (MPI) or OpenMP. Thread parallelism using OpenMP was used to parallelize the mixed dual FEM (finite element method) flux solver MINOS. Investigations regarding the opportunity of mixing parallelism paradigms will be discussed. The isotopic depletion module was parallelized using domain decomposition and MPI. An attempt at using OpenMP was unsuccessful and will be explained. This paper is organized as follows: the first section recalls the different types of parallelism. The mixed dual flux solver and its parallelization are then presented. In the third section, we describe the isotopic depletion solver and its parallelization; and finally conclude with some future perspectives. Parallel applications are mandatory for fine mesh 3-dimensional transport and simplified transport multigroup calculations. The MINOS solver of the FEM neutronics code CRONOS2 was parallelized using the directive based standard OpenMP. An efficiency of 80% (resp. 60%) was achieved with 2 (resp. 4) threads. Parallelization of the isotopic depletion solver was obtained using domain decomposition principles and MPI. Efficiencies greater than 90% were reached. These parallel implementations were tested on a shared memory symmetric multiprocessor (SMP) cluster machine. The OpenMP implementation in the solver MINOS is only the first step towards fully using the SMPs cluster potential with a mixed mode parallelism. Mixed mode parallelism can be achieved by combining message passing interface between clusters with OpenMP implicit parallelism within a cluster.

  8. Public health accreditation and metrics for ethics: a case study on environmental health and community engagement.

    Science.gov (United States)

    Bernheim, Ruth Gaare; Stefanak, Matthew; Brandenburg, Terry; Pannone, Aaron; Melnick, Alan

    2013-01-01

    As public health departments around the country undergo accreditation using the Public Health Accreditation Board standards, the process provides a new opportunity to integrate ethics metrics into day-to-day public health practice. While the accreditation standards do not explicitly address ethics, ethical tools and considerations can enrich the accreditation process by helping health departments and their communities understand what ethical principles underlie the accreditation standards and how to use metrics based on these ethical principles to support decision making in public health practice. We provide a crosswalk between a public health essential service, Public Health Accreditation Board community engagement domain standards, and the relevant ethical principles in the Public Health Code of Ethics (Code). A case study illustrates how the accreditation standards and the ethical principles in the Code together can enhance the practice of engaging the community in decision making in the local health department.

  9. Code domain steganography in video tracks

    Science.gov (United States)

    Rymaszewski, Sławomir

    2008-01-01

    This article deals with a practical method of hiding secret information in a video stream. The method is dedicated to MPEG-2 streams. The algorithm takes into consideration not only the MPEG video coding scheme described in the standard but also the bit-level PES-packet encapsulation in the MPEG-2 Program Stream (PS). This modification gives higher capacity and more effective bit-rate control for the output stream than previously proposed methods.
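
For readers unfamiliar with steganographic embedding, the general embed/extract contract can be illustrated with a much simpler technique than the article's: plain least-significant-bit embedding in an arbitrary byte buffer. This is a generic stand-in only; the article's method instead exploits PES-packet encapsulation in the MPEG-2 Program Stream.

```python
def embed(cover: bytes, message: bytes) -> bytes:
    """Hide message bits in the least-significant bit of each cover byte."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    out = bytearray(cover)
    for i, b in enumerate(bits):
        out[i] = (out[i] & 0xFE) | b   # overwrite only the LSB
    return bytes(out)

def extract(stego: bytes, n_bytes: int) -> bytes:
    """Recover n_bytes of hidden message from the LSBs."""
    bits = [byte & 1 for byte in stego[:8 * n_bytes]]
    return bytes(
        sum(bits[8 * i + j] << j for j in range(8)) for i in range(n_bytes)
    )

cover = bytes(range(256)) * 2
stego = embed(cover, b"key")
```

Any real scheme must additionally survive the carrier's own processing (here, MPEG-2 re-multiplexing), which is what motivates embedding at the PES-packet level rather than in raw sample LSBs.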

  10. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    Science.gov (United States)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open source code to model wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation, and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and was completed in Fall 2015. Phase 2 is focused on WEC performance and is scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable Power-Take-Off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be
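
The Cummins formulation mentioned above couples the rigid-body equation of motion to a radiation-memory convolution: (m + A_inf)·x''(t) + ∫ K(t−τ)·x'(τ)dτ + c·x(t) = F(t). A 1-DOF illustrative sketch of integrating that equation (all coefficients invented; this is not WEC-Sim's implementation, which works in 6 DOF inside SimMechanics):

```python
import numpy as np

# Illustrative 1-DOF Cummins-type parameters (not a real WEC).
m, A_inf, c = 1000.0, 200.0, 5000.0   # mass, added mass at infinity, stiffness
dt, steps = 0.01, 4000
t_k = np.arange(0.0, 2.0, dt)
K = 50.0 * np.exp(-2.0 * t_k)         # radiation impulse-response kernel

x, v = 0.0, 0.0
v_hist, xs = [], []
for n in range(steps):
    t = n * dt
    F = 800.0 * np.sin(0.8 * t)       # regular-wave excitation force
    nk = min(len(v_hist), len(K))
    # Discrete convolution of the kernel with the velocity history (memory term).
    conv = dt * sum(K[j] * v_hist[-1 - j] for j in range(nk))
    a = (F - conv - c * x) / (m + A_inf)
    v += a * dt                       # semi-implicit Euler step
    x += v * dt
    v_hist.append(v)
    xs.append(x)

amp = max(max(xs), -min(xs))          # response amplitude over the run
```

The convolution term is what makes time-domain WEC simulation expensive; production codes truncate or approximate the kernel (e.g., by state-space identification) rather than summing the full history.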

  11. Supervised Transfer Sparse Coding

    KAUST Repository

    Al-Shedivat, Maruan

    2014-07-27

    A combination of the sparse coding and transfer learning techniques was shown to be accurate and robust in classification tasks where training and testing objects have a shared feature space but are sampled from different underlying distributions, i.e., belong to different domains. The key assumption in such cases is that, in spite of the domain disparity, samples from different domains share some common hidden factors. Previous methods often assumed that all the objects in the target domain are unlabeled, and thus the training set solely comprised objects from the source domain. However, in real-world applications, the target domain often has some labeled objects, or one can always manually label a small number of them. In this paper, we explore this possibility and show how a small number of labeled data in the target domain can significantly improve the classification accuracy of state-of-the-art transfer sparse coding methods. We further propose a unified framework named supervised transfer sparse coding (STSC) which simultaneously optimizes sparse representation, domain transfer and classification. Experimental results on three applications demonstrate that a little manual labeling, followed by learning the model in a supervised fashion, can significantly improve classification accuracy.
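    The idea of shared hidden factors across domains can be sketched without the STSC objective itself. The following is a hedged illustration, not the paper's method: one dictionary is learned over pooled source and target data so both domains share sparse factors, and a few labeled target samples are added to the supervised training set. Data and parameters are synthetic.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.linear_model import LogisticRegression

# Hedged sketch of the general idea (NOT the STSC objective from the paper):
# a shared dictionary couples the domains; a handful of labeled target
# samples then steer the classifier. All data below is synthetic.
rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, (100, 20)); ys = (Xs[:, 0] > 0).astype(int)    # source
Xt = rng.normal(0.5, 1.0, (40, 20));  yt = (Xt[:, 0] > 0.5).astype(int)  # target

dico = DictionaryLearning(n_components=15, alpha=0.5, max_iter=200,
                          random_state=0)
codes_all = dico.fit_transform(np.vstack([Xs, Xt]))   # shared sparse codes
Cs, Ct = codes_all[:100], codes_all[100:]

n_lab = 5  # pretend only 5 target samples are labeled
clf = LogisticRegression(max_iter=1000).fit(
    np.vstack([Cs, Ct[:n_lab]]), np.concatenate([ys, yt[:n_lab]]))
acc = clf.score(Ct[n_lab:], yt[n_lab:])
```

    STSC proper optimizes the sparse representation, domain transfer and classifier jointly rather than in these separate stages.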

  12. Domain Specific Language Support for Exascale

    Energy Technology Data Exchange (ETDEWEB)

    Sadayappan, Ponnuswamy [The Ohio State Univ., Columbus, OH (United States)

    2017-02-24

    Domain-Specific Languages (DSLs) offer an attractive path to Exascale software since they provide expressive power through appropriate abstractions and enable domain-specific optimizations. But the advantages of a DSL compete with the difficulties of implementing a DSL, even for a narrowly defined domain. The DTEC project addresses how a variety of DSLs can be easily implemented to leverage existing compiler analysis and transformation capabilities within the ROSE open source compiler as part of a research program focusing on Exascale challenges. The OSU contributions to the DTEC project are in the area of code generation from high-level DSL descriptions, as well as verification of the automatically-generated code.

  13. Epitaxial Growth of Hetero-Ln-MOF Hierarchical Single Crystals for Domain- and Orientation-Controlled Multicolor Luminescence 3D Coding Capability.

    Science.gov (United States)

    Pan, Mei; Zhu, Yi-Xuan; Wu, Kai; Chen, Ling; Hou, Ya-Jun; Yin, Shao-Yun; Wang, Hai-Ping; Fan, Ya-Nan; Su, Cheng-Yong

    2017-11-13

    Core-shell or striped heteroatomic lanthanide metal-organic framework hierarchical single crystals were obtained by liquid-phase anisotropic epitaxial growth, maintaining identical periodic organization while simultaneously exhibiting spatially segregated structure. Different types of domain and orientation-controlled multicolor photophysical models are presented, which show either visually distinguishable or visible/near infrared (NIR) emissive colors. This provides a new bottom-up strategy toward the design of hierarchical molecular systems, offering high-throughput and multiplexed luminescence color tunability and readability. The unique capability of combining spectroscopic coding with 3D (three-dimensional) microscale spatial coding is established, providing potential applications in anti-counterfeiting, color barcoding, and other types of integrated and miniaturized optoelectronic materials and devices. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Epitaxial growth of hetero-Ln-MOF hierarchical single crystals for domain- and orientation-controlled multicolor luminescence 3D coding capability

    International Nuclear Information System (INIS)

    Pan, Mei; Zhu, Yi-Xuan; Wu, Kai; Chen, Ling; Hou, Ya-Jun; Yin, Shao-Yun; Wang, Hai-Ping; Fan, Ya-Nan; Su, Cheng-Yong

    2017-01-01

    Core-shell or striped heteroatomic lanthanide metal-organic framework hierarchical single crystals were obtained by liquid-phase anisotropic epitaxial growth, maintaining identical periodic organization while simultaneously exhibiting spatially segregated structure. Different types of domain and orientation-controlled multicolor photophysical models are presented, which show either visually distinguishable or visible/near infrared (NIR) emissive colors. This provides a new bottom-up strategy toward the design of hierarchical molecular systems, offering high-throughput and multiplexed luminescence color tunability and readability. The unique capability of combining spectroscopic coding with 3D (three-dimensional) microscale spatial coding is established, providing potential applications in anti-counterfeiting, color barcoding, and other types of integrated and miniaturized optoelectronic materials and devices. (copyright 2017 Wiley-VCH Verlag GmbH and Co. KGaA, Weinheim)

  15. Epitaxial growth of hetero-Ln-MOF hierarchical single crystals for domain- and orientation-controlled multicolor luminescence 3D coding capability

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Mei; Zhu, Yi-Xuan; Wu, Kai; Chen, Ling; Hou, Ya-Jun; Yin, Shao-Yun; Wang, Hai-Ping; Fan, Ya-Nan [MOE Laboratory of Bioinorganic and Synthetic Chemistry, Lehn Institute of Functional Materials, School of Chemistry, Sun Yat-Sen University, Guangzhou (China); Su, Cheng-Yong [MOE Laboratory of Bioinorganic and Synthetic Chemistry, Lehn Institute of Functional Materials, School of Chemistry, Sun Yat-Sen University, Guangzhou (China); State Key Laboratory of Applied Organic Chemistry, Lanzhou University, Lanzhou (China)

    2017-11-13

    Core-shell or striped heteroatomic lanthanide metal-organic framework hierarchical single crystals were obtained by liquid-phase anisotropic epitaxial growth, maintaining identical periodic organization while simultaneously exhibiting spatially segregated structure. Different types of domain and orientation-controlled multicolor photophysical models are presented, which show either visually distinguishable or visible/near infrared (NIR) emissive colors. This provides a new bottom-up strategy toward the design of hierarchical molecular systems, offering high-throughput and multiplexed luminescence color tunability and readability. The unique capability of combining spectroscopic coding with 3D (three-dimensional) microscale spatial coding is established, providing potential applications in anti-counterfeiting, color barcoding, and other types of integrated and miniaturized optoelectronic materials and devices. (copyright 2017 Wiley-VCH Verlag GmbH and Co. KGaA, Weinheim)

  16. Parallelization of 2-D lattice Boltzmann codes

    International Nuclear Information System (INIS)

    Suzuki, Soichiro; Kaburaki, Hideo; Yokokawa, Mitsuo.

    1996-03-01

    Lattice Boltzmann (LB) codes to simulate two-dimensional fluid flow are developed on the vector-parallel computer Fujitsu VPP500 and the scalar-parallel computer Intel Paragon XP/S. While a 2-D domain decomposition method is used for the scalar-parallel LB code, a 1-D domain decomposition method is used for the vector-parallel LB code so that it can be vectorized along the axis perpendicular to the direction of the decomposition. High parallel efficiencies of 95.1% for the vector-parallel calculation on 16 processors with a 1152x1152 grid and 88.6% for the scalar-parallel calculation on 100 processors with an 800x800 grid are obtained. Performance models are developed to analyze the performance of the LB codes. Our performance models show that the execution speed of the vector-parallel code is about one hundred times faster than that of the scalar-parallel code with the same number of processors, up to 100 processors. We also analyze the scalability while keeping the memory usage of each processor element at its maximum. Our performance model predicts that the execution time of the vector-parallel code increases by about 3% on 500 processors. Although the 1-D domain decomposition method in general has a drawback in interprocessor communication, the vector-parallel LB code is still suitable for large-scale and/or high-resolution simulations. (author)
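    The 1-D domain decomposition described in this record can be sketched as follows. A hedged serial emulation (a real LB code would exchange halos with MPI): the grid is sliced along one axis, each strip receives a one-row halo from its periodic neighbour before a toy "streaming" shift across the decomposition axis, and the result matches the global computation. Within each strip, the untouched axis is the one the vector code vectorizes along.

```python
import numpy as np

# Hedged serial emulation of 1-D domain decomposition with halo exchange.
# The toy "streaming" step shifts the field one cell across the
# decomposition axis (periodic), so each strip needs its neighbour's edge.

def step_global(f):
    return np.roll(f, 1, axis=0)   # reference: shift the whole field

def step_decomposed(f, nproc=4):
    strips = np.array_split(f, nproc, axis=0)       # one strip per "processor"
    out = []
    for r, s in enumerate(strips):
        upper = strips[(r - 1) % nproc][-1:]        # emulated halo exchange
        padded = np.vstack([upper, s])              # strip + one halo row
        out.append(padded[:-1])                     # shifted strip
    return np.vstack(out)

f = np.random.default_rng(1).random((64, 64))
assert np.allclose(step_decomposed(f), step_global(f))
```

    The drawback noted in the abstract is visible here: every strip must communicate one full row per step, and with a 1-D decomposition that row does not shrink as more processors are added.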

  17. Parallelization of 2-D lattice Boltzmann codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Soichiro; Kaburaki, Hideo; Yokokawa, Mitsuo

    1996-03-01

    Lattice Boltzmann (LB) codes to simulate two-dimensional fluid flow are developed on the vector-parallel computer Fujitsu VPP500 and the scalar-parallel computer Intel Paragon XP/S. While a 2-D domain decomposition method is used for the scalar-parallel LB code, a 1-D domain decomposition method is used for the vector-parallel LB code so that it can be vectorized along the axis perpendicular to the direction of the decomposition. High parallel efficiencies of 95.1% for the vector-parallel calculation on 16 processors with a 1152x1152 grid and 88.6% for the scalar-parallel calculation on 100 processors with an 800x800 grid are obtained. Performance models are developed to analyze the performance of the LB codes. Our performance models show that the execution speed of the vector-parallel code is about one hundred times faster than that of the scalar-parallel code with the same number of processors, up to 100 processors. We also analyze the scalability while keeping the memory usage of each processor element at its maximum. Our performance model predicts that the execution time of the vector-parallel code increases by about 3% on 500 processors. Although the 1-D domain decomposition method in general has a drawback in interprocessor communication, the vector-parallel LB code is still suitable for large-scale and/or high-resolution simulations. (author).

  18. YNOGK: A NEW PUBLIC CODE FOR CALCULATING NULL GEODESICS IN THE KERR SPACETIME

    Energy Technology Data Exchange (ETDEWEB)

    Yang Xiaolin; Wang Jiancheng, E-mail: yangxl@ynao.ac.cn [National Astronomical Observatories, Yunnan Observatory, Chinese Academy of Sciences, Kunming 650011 (China)

    2013-07-01

    Following the work of Dexter and Agol, we present a new public code for the fast calculation of null geodesics in the Kerr spacetime. Using Weierstrass's and Jacobi's elliptic functions, we express all coordinates and affine parameters as analytical and numerical functions of a parameter p, which is an integral value along the geodesic. This is the main difference between our code and previous similar ones. The advantage of this treatment is that the information about the turning points does not need to be specified in advance by the user, and many applications, such as imaging, the calculation of line profiles, and the observer-emitter problem, become root-finding problems. All elliptic integrations are computed by Carlson's elliptic integral method, as in Dexter and Agol, which guarantees the fast computational speed of our code. The formulae to compute the constants of motion given by Cunningham and Bardeen have been extended, which allows one to readily handle situations in which the emitter or the observer has an arbitrary distance from, and motion state with respect to, the central compact object. The code has been extensively validated through applications to toy problems from the literature. The source FORTRAN code is freely available for download on our Web site http://www1.ynao.ac.cn/~yangxl/yxl.html.

  19. YNOGK: A NEW PUBLIC CODE FOR CALCULATING NULL GEODESICS IN THE KERR SPACETIME

    International Nuclear Information System (INIS)

    Yang Xiaolin; Wang Jiancheng

    2013-01-01

    Following the work of Dexter and Agol, we present a new public code for the fast calculation of null geodesics in the Kerr spacetime. Using Weierstrass's and Jacobi's elliptic functions, we express all coordinates and affine parameters as analytical and numerical functions of a parameter p, which is an integral value along the geodesic. This is the main difference between our code and previous similar ones. The advantage of this treatment is that the information about the turning points does not need to be specified in advance by the user, and many applications, such as imaging, the calculation of line profiles, and the observer-emitter problem, become root-finding problems. All elliptic integrations are computed by Carlson's elliptic integral method, as in Dexter and Agol, which guarantees the fast computational speed of our code. The formulae to compute the constants of motion given by Cunningham and Bardeen have been extended, which allows one to readily handle situations in which the emitter or the observer has an arbitrary distance from, and motion state with respect to, the central compact object. The code has been extensively validated through applications to toy problems from the literature. The source FORTRAN code is freely available for download on our Web site http://www1.ynao.ac.cn/~yangxl/yxl.html.
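    The Carlson elliptic-integral machinery that YNOGK relies on is available in standard libraries, which makes the numerical building block easy to illustrate. A hedged sketch using SciPy's Carlson forms (not YNOGK itself, and assuming SciPy >= 1.8 for `elliprf`): the Legendre complete integral K(m), which appears in such geodesic calculations, reduces to the symmetric Carlson form R_F.

```python
import numpy as np
from scipy.special import elliprf, ellipk

# Hedged illustration of Carlson's symmetric elliptic integrals, the
# numerical workhorse behind codes like YNOGK. The identity used:
#   K(m) = R_F(0, 1 - m, 1)
# i.e. the Legendre complete integral of the first kind in Carlson form.
m = np.linspace(0.0, 0.9, 10)
K_carlson = elliprf(0.0, 1.0 - m, 1.0)
assert np.allclose(K_carlson, ellipk(m), rtol=1e-10)
```

    Carlson's duplication algorithm evaluates R_F to machine precision in a few iterations for any admissible arguments, which is why it is favored for the many elliptic integrals along each geodesic.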

  20. Compressed Domain Packet Loss Concealment of Sinusoidally Coded Speech

    DEFF Research Database (Denmark)

    Rødbro, Christoffer A.; Christensen, Mads Græsbøll; Andersen, Søren Vang

    2003-01-01

    We consider the problem of packet loss concealment for voice over IP (VoIP). The speech signal is compressed at the transmitter using a sinusoidal coding scheme working at 8 kbit/s. At the receiver, packet loss concealment is carried out working directly on the quantized sinusoidal parameters, based on time-scaling of the packets surrounding the missing ones. Subjective listening tests show promising results, indicating the potential of sinusoidal speech coding for VoIP.
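    Compressed-domain concealment can be illustrated at the parameter level. Note the hedge: the paper's method time-scales the surrounding packets, whereas this toy simply interpolates the quantized sinusoidal parameters (amplitude, frequency) of the neighbouring frames and advances the phase; all numbers are invented.

```python
import numpy as np

# Hedged toy of compressed-domain concealment: each frame is a sum of
# sinusoids with (amplitude, frequency, phase) per partial. A lost frame
# is rebuilt from its neighbours' parameters instead of inserting silence.
# (The paper time-scales surrounding packets; this sketch interpolates.)

def synth(amps, freqs, phases, n, fs=8000):
    t = np.arange(n) / fs
    return sum(a * np.cos(2 * np.pi * f * t + p)
               for a, f, p in zip(amps, freqs, phases))

def conceal(prev, nxt, n, fs=8000):
    """Rebuild a lost frame from the quantized parameters of its neighbours."""
    amps = 0.5 * (prev['amps'] + nxt['amps'])      # amplitude interpolation
    freqs = 0.5 * (prev['freqs'] + nxt['freqs'])   # frequency interpolation
    # continue each partial's phase from the end of the previous frame
    phases = prev['phases'] + 2 * np.pi * prev['freqs'] * n / fs
    return synth(amps, freqs, phases, n, fs)

prev = {'amps': np.array([1.0, 0.4]), 'freqs': np.array([200.0, 400.0]),
        'phases': np.array([0.0, 0.5])}
nxt = {'amps': np.array([0.8, 0.5]), 'freqs': np.array([210.0, 420.0]),
       'phases': np.array([0.0, 0.0])}
frame = conceal(prev, nxt, n=160)   # one 20 ms frame at 8 kHz
```

    Because everything happens on the decoded parameters rather than the waveform, the concealer needs no extra synthesis/analysis round trip, which is the appeal of compressed-domain concealment.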

  1. LPIC++. A parallel one-dimensional relativistic electromagnetic particle-in-cell code for simulating laser-plasma-interaction

    International Nuclear Information System (INIS)

    Lichters, R.; Pfund, R.E.W.; Meyer-ter-Vehn, J.

    1997-08-01

    The code LPIC++ presented here is based on a one-dimensional, electromagnetic, relativistic PIC code that was originally developed by one of the authors during a PhD thesis at the Max-Planck-Institut fuer Quantenoptik for kinetic simulations of high harmonic generation from overdense plasma surfaces. The code essentially uses the algorithms of Birdsall and Langdon and of Villasenor and Bunemann. It is written in C++ in order to be easily extendable and has been parallelized so that its power grows linearly with the size of the accessible hardware, e.g. massively parallel machines like the Cray T3E. The parallel LPIC++ version uses PVM for communication between processors; PVM is public-domain software and can be downloaded from the World Wide Web. A particular strength of LPIC++ lies in its clear program and data structure, which uses chained lists for the organization of grid cells and enables dynamic adjustment of spatial domain sizes in a very convenient way, and therefore easy balancing of processor loads. Particles belonging to one cell are also linked in a chained list and are immediately accessible from that cell. In addition to this convenient type of data organization in a PIC code, the code shows excellent performance in both its single-processor and parallel versions. (orig.)
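    The chained-list organization described above can be sketched in a few lines. A hedged illustration (class names are invented; LPIC++ itself is C++): cells form a chain, and each cell heads a chain of its own particles, so particles are immediately reachable from their cell and whole cells can migrate between processor sub-domains by relinking.

```python
# Hedged sketch of the chained-list data layout described in the record.
# All class and attribute names are invented for illustration.

class Particle:
    def __init__(self, x, v):
        self.x, self.v = x, v
        self.next = None            # next particle in the same cell

class Cell:
    def __init__(self, x_lo, x_hi):
        self.x_lo, self.x_hi = x_lo, x_hi
        self.next = None            # next cell in the domain's chain
        self.head = None            # head of this cell's particle chain

    def push(self, p):
        p.next, self.head = self.head, p

    def particles(self):
        p = self.head
        while p is not None:
            yield p
            p = p.next

# build a chain of 4 cells covering [0, 1) and deposit a few particles
cells = [Cell(i / 4, (i + 1) / 4) for i in range(4)]
for a, b in zip(cells, cells[1:]):
    a.next = b
for x in [0.05, 0.30, 0.32, 0.90]:
    cells[int(x * 4)].push(Particle(x, 0.0))

counts = [sum(1 for _ in c.particles()) for c in cells]
# counts == [1, 2, 0, 1]
```

    Rebalancing a sub-domain boundary then amounts to relinking a few cell pointers rather than copying particle arrays, which is the convenience the abstract highlights.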

  2. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    Science.gov (United States)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third

  3. THE PUBLIC OFFICER - THE ACTIVE SUBJECT OF A CRIME

    Directory of Open Access Journals (Sweden)

    NICOLETA-ELENA BUZATU

    2011-04-01

    Full Text Available The present study intends to analyze the active subject of the crime committed by an individual - the public officer, for example - during his daily duty program or with reference to the attributions he has versus the public office he holds, in the light of the regulations provided not only by the Penal Code in force but also by the future New Penal Code, since, among the important amendments it provides, the definition of the public officer is also mentioned. In the case of such a trespassing, the active subject shall hold the quality of a public officer as this quality is regulated by the Penal Code, even if the definition is much ampler than the one given by the Statute of the Public Officers. According to Art 147, paragraph 1 of the Penal Code, a public officer is any individual who permanently or temporarily exercises, irrespective of his/her rank or of the way the office was appointed, a paid or unpaid task of whatever nature or importance, in the service of a department to which Art 145 refers. In the regulation proposed, in agreement with the solutions offered by other international legislation and conventions in the domain, the definition of a public officer refers to the individual who - permanently or temporarily appointed, paid or unpaid - exercises attributions specific to the legislative, executive or judiciary powers, a function of public dignity or a function of any other type - alone or in a group - within a self-governing administration, another economic agent, or a legal person with wholly or majority state-owned capital, or a legal person legally declared to be of public utility - attributions connected with the object of the latter's activity.

  4. A general concurrent algorithm for plasma particle-in-cell simulation codes

    International Nuclear Information System (INIS)

    Liewer, P.C.; Decyk, V.K.

    1989-01-01

    We have developed a new algorithm for implementing plasma particle-in-cell (PIC) simulation codes on concurrent processors with distributed memory. This algorithm, named the general concurrent PIC algorithm (GCPIC), has been used to implement an electrostatic PIC code on the 33-node JPL Mark III Hypercube parallel computer. To decompose a PIC code using the GCPIC algorithm, the physical domain of the particle simulation is divided into sub-domains, equal in number to the number of processors, such that all sub-domains have roughly equal numbers of particles. For problems with non-uniform particle densities, these sub-domains will be of unequal physical size. Each processor is assigned a sub-domain and is responsible for updating the particles in its sub-domain. This algorithm has led to a very efficient parallel implementation of a well-benchmarked 1-dimensional PIC code. The dominant portion of the code, updating the particle positions and velocities, is nearly 100% efficient when the number of particles is increased linearly with the number of hypercube processors used, so that the number of particles per processor is constant. For example, the increase in time spent updating particles in going from a problem with 11,264 particles run on 1 processor to 360,448 particles on 32 processors was only 3% (a parallel efficiency of 97%). Although implemented on a hypercube concurrent computer, this algorithm should also be efficient for PIC codes on other parallel architectures and for large PIC codes on sequential computers where part of the data must reside on external disks. copyright 1989 Academic Press, Inc
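    The sub-domain rule described above (equal particle counts, unequal physical sizes) can be sketched with quantiles of the particle positions. A hedged illustration of the decomposition idea only, not the GCPIC implementation:

```python
import numpy as np

# Hedged sketch of GCPIC-style load balancing: cut the 1-D physical domain
# into one sub-domain per processor so each holds ~equal particle counts.
# With a non-uniform density the sub-domains get unequal physical sizes.

def load_balanced_cuts(x, nproc):
    """Return nproc+1 sub-domain boundaries with ~equal particle counts."""
    q = np.linspace(0.0, 1.0, nproc + 1)
    cuts = np.quantile(x, q)
    cuts[0], cuts[-1] = x.min(), x.max()
    return cuts

rng = np.random.default_rng(2)
x = rng.exponential(1.0, 100_000)        # strongly non-uniform density
cuts = load_balanced_cuts(x, nproc=8)
counts, _ = np.histogram(x, bins=cuts)
# each of the 8 sub-domains holds ~12500 particles, but the sparse tail
# sub-domain is physically far wider than the dense first one
```

    Since the particle update dominates the run time, equalizing particle counts rather than physical extents is what keeps the processors evenly loaded.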

  5. Project of decree relative to the licensing and statement system of nuclear activities and to their control and bearing various modifications of the public health code and working code

    International Nuclear Information System (INIS)

    2005-01-01

    This decree concerns the control of high-level sealed radioactive sources and orphan sources. Its objectives are to introduce administrative simplification, especially in the licensing and statement system for radiation sources, to reinforce the control measures planned by the public health code and by the employment code, and to bring precision and complements to the wording of several already existing arrangements. (N.C.)

  6. 45 CFR 162.1011 - Valid code sets.

    Science.gov (United States)

    2010-10-01

    45 CFR Part 162 (Public Welfare, Administrative Requirements, Code Sets), § 162.1011 Valid code sets: Each code set is valid within the dates specified by the organization responsible for maintaining that code set.

  7. Modeling Guidelines for Code Generation in the Railway Signaling Context

    Science.gov (United States)

    Ferrari, Alessio; Bacherini, Stefano; Fantechi, Alessandro; Zingoni, Niccolo

    2009-01-01

    Modeling guidelines constitute one of the fundamental cornerstones of Model-Based Development. Their relevance is essential when dealing with code generation in the safety-critical domain. This article presents the experience of a railway signaling systems manufacturer on this issue. The introduction of Model-Based Development (MBD) and code generation in the industrial safety-critical sector created a crucial paradigm shift in the development process of dependable systems. While traditional software development focuses on the code, with MBD practices the focus shifts to model abstractions. The change has fundamental implications for safety-critical systems, which still need to guarantee a high degree of confidence also at the code level. Usage of the Simulink/Stateflow platform for modeling, which is a de facto standard in control software development, does not by itself ensure production of high-quality, dependable code. This issue has been addressed by companies through the definition of modeling rules imposing restrictions on the usage of design tool components, in order to enable production of qualified code. The MAAB Control Algorithm Modeling Guidelines (MathWorks Automotive Advisory Board) [3] are a well-established set of publicly available rules for modeling with Simulink/Stateflow. This set of recommendations has been developed by a group of OEMs and suppliers of the automotive sector with the objective of enforcing and easing the usage of the MathWorks tools within the automotive industry. The guidelines were published in 2001 and afterwards revised in 2007 in order to integrate some additional rules developed by the Japanese division of MAAB [5]. The scope of the current edition of the guidelines ranges from model maintainability and readability to code generation issues. The rules are conceived as a reference baseline and therefore need to be tailored to comply with the characteristics of each industrial context. Customization of these

  8. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.
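    The two-part STRN structure (report code plus sequential number) can be sketched with a toy parser. Hedged: the split rule and the sample identifiers below are illustrative, not taken from ANSI Z39.23-1983.

```python
# Hedged sketch of splitting an STRN-like identifier into its two parts:
# the report code (issuing organization/program) and the sequential number.
# The double-hyphen convention and sample strings are illustrative only.

def split_strn(strn):
    code, _, seq = strn.rpartition('--')
    if code:                         # e.g. 'ANL--83-10' style separator
        return code, seq
    # fall back: treat a trailing all-digit field as the sequential number
    head, _, tail = strn.rpartition('-')
    return (head, tail) if tail.isdigit() else (strn, '')

# e.g. split_strn('ANL--83-10') -> ('ANL', '83-10')
```

    A compilation like this one supplies the report-code half; the sequential half is assigned by each issuing entity and is therefore not listed.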

  9. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  10. A repository of codes of ethics and technical standards in health informatics.

    Science.gov (United States)

    Samuel, Hamman W; Zaïane, Osmar R

    2014-01-01

    We present a searchable repository of codes of ethics and standards in health informatics. It is built using state-of-the-art search algorithms and technologies. The repository will be potentially beneficial for public health practitioners, researchers, and software developers in finding and comparing ethics topics of interest. Public health clinics, clinicians, and researchers can use the repository platform as a one-stop reference for various ethics codes and standards. In addition, the repository interface is built for easy navigation, fast search, and side-by-side comparative reading of documents. Our selection criteria for codes and standards are two-fold; firstly, to maintain intellectual property rights, we index only codes and standards freely available on the internet. Secondly, major international, regional, and national health informatics bodies across the globe are surveyed with the aim of understanding the landscape in this domain. We also look at prevalent technical standards in health informatics from major bodies such as the International Standards Organization (ISO) and the U. S. Food and Drug Administration (FDA). Our repository contains codes of ethics from the International Medical Informatics Association (IMIA), the iHealth Coalition (iHC), the American Health Information Management Association (AHIMA), the Australasian College of Health Informatics (ACHI), the British Computer Society (BCS), and the UK Council for Health Informatics Professions (UKCHIP), with room for adding more in the future. Our major contribution is enhancing the findability of codes and standards related to health informatics ethics by compilation and unified access through the health informatics ethics repository.

  11. The public understanding of nanotechnology in the food domain: the hidden role of views on science, technology, and nature.

    Science.gov (United States)

    Vandermoere, Frederic; Blanchemanche, Sandrine; Bieberstein, Andrea; Marette, Stephan; Roosen, Jutta

    2011-03-01

    In spite of great expectations about the potential of nanotechnology, this study shows that people are rather ambiguous and pessimistic about nanotechnology applications in the food domain. Our findings are drawn from a survey of public perceptions about nanotechnology food and nanotechnology food packaging (N = 752). Multinomial logistic regression analyses further reveal that knowledge about food risks and nanotechnology significantly influences people's views about nanotechnology food packaging. However, knowledge variables were unrelated to support for nanofood, suggesting that an increase in people's knowledge might not be sufficient to bridge the gap between the excitement some business leaders in the food sector have and the restraint of the public. Additionally, opposition to nanofood was not related to the use of heuristics but to trust in governmental agencies. Furthermore, the results indicate that public perceptions of nanoscience in the food domain significantly relate to views on science, technology, and nature.

  12. Improving developer productivity with C++ embedded domain specific languages

    Science.gov (United States)

    Kozacik, Stephen; Chao, Evenie; Paolini, Aaron; Bonnett, James; Kelmelis, Eric

    2017-05-01

    Domain-specific languages are a useful tool for productivity allowing domain experts to program using familiar concepts and vocabulary while benefiting from performance choices made by computing experts. Embedding the domain specific language into an existing language allows easy interoperability with non-domain-specific code and use of standard compilers and build systems. In C++, this is enabled through the template and preprocessor features. C++ embedded domain specific languages (EDSLs) allow the user to write simple, safe, performant, domain specific code that has access to all the low-level functionality that C and C++ offer as well as the diverse set of libraries available in the C/C++ ecosystem. In this paper, we will discuss several tools available for building EDSLs in C++ and show examples of projects successfully leveraging EDSLs. Modern C++ has added many useful new features to the language which we have leveraged to further extend the capability of EDSLs. At EM Photonics, we have used EDSLs to allow developers to transparently benefit from using high performance computing (HPC) hardware. We will show ways EDSLs combine with existing technologies and EM Photonics high performance tools and libraries to produce clean, short, high performance code in ways that were not previously possible.
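    The deferred-evaluation pattern behind such EDSLs can be illustrated outside C++. What follows is a hedged Python analogue of C++ expression templates (all names invented): operator overloading builds an expression tree, and evaluation then runs as one fused loop without intermediate temporaries, which is the effect expression templates achieve at compile time.

```python
# Hedged Python analogue of the expression-template technique used by
# C++ EDSLs: operators build a deferred expression tree; evaluation is a
# single fused loop with no temporaries for the sub-expressions.

class Expr:
    def __add__(self, other):  return BinOp('+', self, other)
    def __mul__(self, other):  return BinOp('*', self, other)

class Vec(Expr):
    def __init__(self, data): self.data = list(data)
    def ev(self, i): return self.data[i]
    def __len__(self): return len(self.data)

class BinOp(Expr):
    def __init__(self, op, a, b): self.op, self.a, self.b = op, a, b
    def ev(self, i):
        x, y = self.a.ev(i), self.b.ev(i)
        return x + y if self.op == '+' else x * y
    def __len__(self): return len(self.a)

def evaluate(expr):
    # one fused loop; no temporary vector for a + b
    return [expr.ev(i) for i in range(len(expr))]

a, b, c = Vec([1, 2]), Vec([3, 4]), Vec([10, 10])
result = evaluate((a + b) * c)   # -> [40, 60]
```

    In C++ the same tree is encoded in template types, so the fusion happens at compile time and the domain-level syntax costs nothing at run time.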

  13. Cross-band noise model refinement for transform domain Wyner–Ziv video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2012-01-01

    Transform domain Wyner–Ziv (TDWZ) video coding trails conventional video coding solutions, mainly due to the quality of side information, inaccurate noise modeling and loss in the final coding step. The major goal of this paper is to enhance the accuracy of the noise modeling, which is one of the most important aspects influencing the coding performance of DVC. A TDWZ video decoder with a novel cross-band based adaptive noise model is proposed, and a noise residue refinement scheme is introduced to successively update the estimated noise residue for noise modeling after each bit-plane. Experimental results show that the proposed noise model and noise residue refinement scheme can improve the rate-distortion (RD) performance of TDWZ video coding significantly. The quality of the side information modeling is also evaluated by a measure of the ideal code length.

  14. Expanding the landscape of chromatin modification (CM)-related functional domains and genes in human.

    Directory of Open Access Journals (Sweden)

    Shuye Pu

    2010-11-01

    Chromatin modification (CM) plays a key role in regulating transcription, DNA replication, repair and recombination. However, our knowledge of these processes in humans remains very limited. Here we use computational approaches to study proteins and functional domains involved in CM in humans. We analyze the abundance and the pair-wise domain-domain co-occurrences of 25 well-documented CM domains in 5 model organisms: yeast, worm, fly, mouse and human. Results show that domains involved in histone methylation, DNA methylation, and histone variants are remarkably expanded in metazoans, reflecting the increased demand for cell type-specific gene regulation. We find that CM domains tend to co-occur with a limited number of partner domains and are hence not promiscuous. This property is exploited to identify 47 potentially novel CM domains, including 24 DNA-binding domains, whose role in CM has received little attention so far. Lastly, we use a consensus machine-learning approach to predict 379 novel CM genes (coding for 329 proteins) in humans based on domain compositions. Several of these predictions are supported by very recent experimental studies and others are slated for experimental verification. Identification of novel CM genes and domains in humans will aid our understanding of fundamental epigenetic processes that are important for stem cell differentiation and cancer biology. Information on all the candidate CM domains and genes reported here is publicly available.
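    The pair-wise co-occurrence analysis described above amounts to counting unordered domain pairs per protein. A minimal sketch (the protein names are invented; SET, PHD and Bromo are real CM domains, used here only as examples):

```python
# Count how often each unordered pair of domains appears together
# within a single protein's domain composition.
from itertools import combinations
from collections import Counter

proteins = {
    "protA": ["SET", "PHD"],
    "protB": ["SET", "Bromo"],
    "protC": ["SET", "PHD"],
}

co_occur = Counter()
for domains in proteins.values():
    # sort so ("PHD", "SET") and ("SET", "PHD") count as one pair
    for pair in combinations(sorted(set(domains)), 2):
        co_occur[pair] += 1
# ("PHD", "SET") co-occurs in two proteins, ("Bromo", "SET") in one
```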

  15. Australasian code for reporting of mineral resources and ore reserves (the JORC code)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-06-01

    The latest revision of the Code first published in 1989 becomes effective in September 1999. It was prepared by the Joint Ores Reserves Committee of the Australasian Institute of Mining and Metallurgy, Australian Institute of Geoscientists and Minerals Council of Australia (JORC). It sets out minimum standards, recommendations and guidelines for public reporting of exploration results, mineral resources and ore reserves in Australasia. In this edition, the guidelines, which were previously separated from the Code, have been placed after the respective Code clauses. The Code is applicable to all solid minerals, including diamonds, other gemstones and coal for which public reporting is required by the Australian and New Zealand Stock Exchanges.

  16. On Domain Registries and Website Content

    DEFF Research Database (Denmark)

    Schwemer, Sebastian Felix

    2018-01-01

    The liability of online intermediaries for third-party content has traditionally been tested on actors such as Internet access service providers, hosting platforms, and websites that link to content. This article shows that in recent years, however, the (secondary) liability of domain registries and registrars, and more specifically country code top-level domain (ccTLD) registries, for website content has been tested in several EU Member States. The article investigates tendencies in the national lower-court jurisprudence and explores to what extent the liability exemption regime of the E-Commerce Directive applies to domain registries. The analysis concludes that whereas domain registries fall under…

  17. Parallelization characteristics of the DeCART code

    International Nuclear Information System (INIS)

    Cho, J. Y.; Joo, H. G.; Kim, H. Y.; Lee, C. C.; Chang, M. H.; Zee, S. Q.

    2003-12-01

    This report describes the parallelization characteristics of the DeCART code and examines its parallel performance. Parallel computing algorithms are implemented in DeCART to reduce the tremendous computational burden and memory requirement involved in the three-dimensional whole-core transport calculation. In the parallelization of the DeCART code, the axial domain decomposition is first realized by using MPI (Message Passing Interface), and then the azimuthal angle domain decomposition by using either MPI or OpenMP. When using MPI for both the axial and the angle domain decomposition, the concept of MPI grouping is employed for convenient communication in each communication world. For the parallel computation, all of the computing modules except the thermal-hydraulic module are parallelized. These parallelized computing modules include the MOC ray tracing, CMFD, NEM, region-wise cross-section preparation and cell homogenization modules. For the distributed allocation, most of the MOC and CMFD/NEM variables are allocated only for the assigned planes, which reduces the required memory by the ratio of the number of assigned planes to the number of all planes. The parallel performance of the DeCART code is evaluated by solving two problems, a rodded variation of the C5G7 MOX three-dimensional benchmark problem and a simplified three-dimensional SMART PWR core problem. In terms of parallel performance, the DeCART code shows a good speedup of about 40.1 and 22.4 in the ray tracing module and about 37.3 and 20.2 in the total computing time when using 48 CPUs on the IBM Regatta and 24 CPUs on the LINUX cluster, respectively. In the comparison between MPI and OpenMP, OpenMP shows a somewhat better performance than MPI. Therefore, it is concluded that the first priority in the parallel computation of the DeCART code is the axial domain decomposition using MPI, then the angular domain decomposition using OpenMP, and finally the angular…
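    A quick back-of-the-envelope check of the parallel efficiency implied by the ray-tracing speedups quoted above (a reader's sanity check, not part of DeCART itself):

```python
# Parallel efficiency = speedup / number_of_CPUs, using the figures
# reported in the abstract for the ray-tracing module.

def efficiency(speedup, n_cpus):
    return speedup / n_cpus

regatta = efficiency(40.1, 48)   # IBM Regatta, 48 CPUs
cluster = efficiency(22.4, 24)   # LINUX cluster, 24 CPUs
# regatta ~ 0.84, cluster ~ 0.93: the smaller run is closer to ideal,
# consistent with communication overhead growing with CPU count.
```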

  18. Drug-domain interaction networks in myocardial infarction.

    Science.gov (United States)

    Wang, Haiying; Zheng, Huiru; Azuaje, Francisco; Zhao, Xing-Ming

    2013-09-01

    It has been well recognized that the pace of the development of new drugs and therapeutic interventions lags far behind biological knowledge discovery. Network-based approaches have emerged as a promising alternative to accelerate the discovery of new safe and effective drugs. Based on the integration of several biological resources, including two recently published datasets, i.e., drug-target interactions in myocardial infarction (My-DTome) and a drug-domain interaction network, this paper reports the association between drugs and protein domains in the context of myocardial infarction (MI). An MI drug-domain interaction network, My-DDome, was first constructed, followed by topological analysis and functional characterization of the network. The results show that My-DDome has a very clear modular structure, where drugs interacting with the same domain(s) within each module tend to have similar therapeutic effects. Moreover, it has been found that drugs acting on blood and blood forming organs (ATC code B) and sensory organs (ATC code S) are significantly enriched in My-DDome (p < 0.05). The analysis further suggests that associations between drugs, their known targets, and seemingly unrelated proteins can be revealed.

  19. THE McELIECE CRYPTOSYSTEM WITH ARRAY CODES

    Directory of Open Access Journals (Sweden)

    Vedat Şiap

    2011-12-01

    Public-key cryptosystems form an important part of cryptography. In these systems, every user has a public and a private key. The public key allows other users to encrypt messages, which can only be decoded using the secret private key. In that way, public-key cryptosystems allow easy and secure communication between all users without the need to actually meet and exchange keys. One such system is the McEliece public-key cryptosystem, sometimes also called the McEliece scheme. In the information age, coding is used to protect and correct messages during transfer and storage, so linear codes play an important role in both. Array codes, which are linear, are also important because of the richness of their structure. Increasing the error-correction capability with array codes allows information to be transferred more securely. In this paper, we combine these two topics: the McEliece cryptosystem and array codes.
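    As a toy illustration of the McEliece scheme discussed above, here is a complete instance over GF(2) with a Hamming(7,4) code standing in for the paper's array codes. This is only a sketch: real parameters are vastly larger, and every matrix below is an arbitrary illustrative choice with no security value.

```python
# Toy McEliece: public key G_pub = S*G*P hides the structured code G
# behind a scrambler S and a permutation P; the legitimate receiver
# undoes P, error-corrects with the secret code, then undoes S.

def mat_mul(A, B):                      # matrix product mod 2
    return [[sum(a * b for a, b in zip(row, col)) % 2
             for col in zip(*B)] for row in A]

def vec_mat(v, M):                      # row vector times matrix mod 2
    return mat_mul([v], M)[0]

# Systematic Hamming(7,4) generator and parity-check matrices.
G = [[1,0,0,0,1,1,0],
     [0,1,0,0,1,0,1],
     [0,0,1,0,0,1,1],
     [0,0,0,1,1,1,1]]
H = [[1,1,0,1,1,0,0],
     [1,0,1,1,0,1,0],
     [0,1,1,1,0,0,1]]

# Secret invertible scrambler S and its precomputed inverse over GF(2).
S     = [[1,1,0,0],[0,1,1,0],[0,0,1,1],[0,0,0,1]]
S_inv = [[1,1,1,1],[0,1,1,1],[0,0,1,1],[0,0,0,1]]

# Secret permutation of the 7 code positions.
perm = [6, 0, 5, 1, 4, 2, 3]
P     = [[1 if perm[i] == j else 0 for j in range(7)] for i in range(7)]
P_inv = [[1 if perm[j] == i else 0 for j in range(7)] for i in range(7)]

G_pub = mat_mul(mat_mul(S, G), P)       # the public key

def encrypt(m, e):                      # c = m*G_pub + e, weight(e) = 1
    return [(a + b) % 2 for a, b in zip(vec_mat(m, G_pub), e)]

def decrypt(c):
    y = vec_mat(c, P_inv)                       # undo the permutation
    cols = [list(col) for col in zip(*H)]       # columns of H
    s = vec_mat(y, cols)                        # syndrome y * H^T
    if any(s):                                  # locate and flip the error
        y[cols.index(s)] ^= 1
    return vec_mat(y[:4], S_inv)                # systematic: first 4 bits = m*S

m = [1, 0, 1, 1]
e = [0, 0, 0, 1, 0, 0, 0]               # single intentional error
recovered = decrypt(encrypt(m, e))      # round-trips back to m
```

The paper's point is that replacing the classical Goppa-code choice of G with an array code changes the error-correction capability and hence the scheme's properties; the surrounding protocol stays as above.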

  20. Towards an information strategy for combating identity fraud in the public domain: Cases from healthcare and criminal justice

    NARCIS (Netherlands)

    Plomp, M.G.A.; Grijpink, J.H.A.M.

    2011-01-01

    Two trends are present in both the private and public domain: increasing interorganisational co-operation and increasing digitisation. More and more processes within and between organisations take place electronically, on a local, national and European scale. The technological and organisational…

  1. Collection of regulatory texts relative to radiation protection. Part 2: orders and decisions taken in application of the Public Health Code and Labour Code concerning the protection of populations, patients and workers against the risks of ionizing radiations

    International Nuclear Information System (INIS)

    2007-05-01

    This collection of texts includes the general measures of population protection, exposure to natural radiations, the general system of authorizations and statements, the protection of persons exposed to ionizing radiations for medical purposes, situations of radiological emergency and long exposure to ionizing radiations, penal dispositions, and the application of the Public Health Code and of the Labour Code. A chronological contents list by date of publication is given. (N.C.)

  2. User Instructions for the CiderF Individual Dose Code and Associated Utility Codes

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, Paul W.; Napier, Bruce A.

    2013-08-30

    Historical activities at facilities producing nuclear materials for weapons released radioactivity into the air and water. Past studies in the United States have evaluated the release, atmospheric transport and environmental accumulation of 131I from the nuclear facilities at Hanford in Washington State and the resulting dose to members of the public (Farris et al. 1994). A multi-year dose reconstruction effort (Mokrov et al. 2004) is also being conducted to produce representative dose estimates for members of the public living near Mayak, Russia, from atmospheric releases of 131I at the facilities of the Mayak Production Association. The approach to calculating individual doses to members of the public from historical releases of airborne 131I has the following general steps:
    • Construct estimates of releases of 131I to the air from production facilities.
    • Model the transport of 131I in the air and subsequent deposition on the ground and vegetation.
    • Model the accumulation of 131I in soil, water and food products (environmental media).
    • Calculate the dose for an individual by matching the appropriate lifestyle and consumption data for the individual to the concentrations of 131I in environmental media at their residence location.
    A number of computer codes were developed to facilitate the study of airborne 131I emissions at Hanford. The RATCHET code modeled movement of 131I in the atmosphere (Ramsdell Jr. et al. 1994). The DECARTES code modeled accumulation of 131I in environmental media (Miley et al. 1994). The CIDER computer code estimated annual doses to individuals (Eslinger et al. 1994) using the equations and parameters specific to Hanford (Snyder et al. 1994). Several of the computer codes developed to model 131I releases from Hanford are general enough to be used for other facilities. This document provides user instructions for computer codes calculating doses to members of the public from atmospheric 131I that have two major differences from the…
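    The last step above, matching an individual's consumption to media concentrations at their location, reduces to a weighted sum. A minimal sketch with invented concentrations, intake rates and an illustrative dose coefficient (none of these values or names are taken from CIDER/CiderF):

```python
# Hypothetical ingestion-dose calculation for one day:
# dose = sum over media of (concentration * intake * dose coefficient).

dose_coeff = 4.8e-7          # Sv per Bq ingested -- illustrative value only
concentration = {"milk": 12.0, "leafy_veg": 3.5}   # Bq per kg or L (invented)
intake = {"milk": 0.5, "leafy_veg": 0.1}           # kg or L per day (invented)

daily_dose = sum(concentration[m] * intake[m] * dose_coeff
                 for m in concentration)           # Sv per day
```

The real codes do this per radionuclide, per time step, and per exposure pathway, with lifestyle data selecting the intake rates.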

  3. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien; Ruehl, Kelley; Roy, Andre; Costello, Ronan; Laporte Weywada, Pauline; Bailey, Helen

    2017-01-01

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of a code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.

  4. Hybrid Video Coding Based on Bidimensional Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Lorenzo Granai

    2004-12-01

    Hybrid video coding combines two stages: first, motion estimation and compensation predict each frame from the neighboring frames; then the prediction error is coded, reducing the correlation in the spatial domain. In this work, we focus on the latter stage, presenting a scheme that profits from some of the features introduced by the standard H.264/AVC for motion estimation and replaces the transform in the spatial domain. The prediction error is then coded using the matching pursuit algorithm, which decomposes the signal over a purpose-designed bidimensional, anisotropic, redundant dictionary. Comparisons are made among the proposed technique, H.264, and a DCT-based coding scheme. Moreover, we introduce fast techniques for atom selection, which exploit the spatial localization of the atoms. An adaptive coding scheme aimed at optimizing the resource allocation is also presented, together with a rate-distortion study for the matching pursuit algorithm. Results show that the proposed scheme outperforms the standard DCT, especially at very low bit rates.
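    The matching pursuit decomposition at the heart of the scheme can be sketched generically in one dimension (the paper's dictionary is bidimensional and anisotropic; this toy uses a two-atom orthonormal basis purely for illustration):

```python
# Generic matching pursuit: greedily pick the dictionary atom most
# correlated with the residual, record its coefficient, subtract its
# contribution, and repeat.

def matching_pursuit(signal, atoms, n_iter):
    residual = list(signal)
    coeffs = []
    for _ in range(n_iter):
        # atoms are assumed unit-norm, so the projection is a dot product
        dots = [sum(r * a for r, a in zip(residual, atom)) for atom in atoms]
        k = max(range(len(atoms)), key=lambda i: abs(dots[i]))
        coeffs.append((k, dots[k]))
        residual = [r - dots[k] * a for r, a in zip(residual, atoms[k])]
    return coeffs, residual

# Toy dictionary: the two unit basis vectors of R^2.
atoms = [[1.0, 0.0], [0.0, 1.0]]
coeffs, residual = matching_pursuit([3.0, -2.0], atoms, 2)
# picks atom 0 with coefficient 3.0, then atom 1 with -2.0;
# the residual is driven to zero
```

With a redundant dictionary the residual generally does not vanish, and the rate-distortion trade-off the paper studies is how many (atom, coefficient) pairs to spend per block.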

  5. Programming Entity Framework Code First

    CERN Document Server

    Lerman, Julia

    2011-01-01

    Take advantage of the Code First data modeling approach in ADO.NET Entity Framework, and learn how to build and configure a model based on existing classes in your business domain. With this concise book, you'll work hands-on with examples to learn how Code First can create an in-memory model and database by default, and how you can exert more control over the model through further configuration. Code First provides an alternative to the database-first and model-first approaches to the Entity Data Model. Learn the benefits of defining your model with code, whether you're working with an existing…

  6. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  7. Parallel and vector implementation of APROS simulator code

    International Nuclear Information System (INIS)

    Niemi, J.; Tommiska, J.

    1990-01-01

    In this paper the vector and parallel processing implementation of a general-purpose simulator code is discussed. In this code the utilization of vector processing is straightforward. In addition to loop-level parallel processing, functional decomposition and domain decomposition have been considered. Results presented for a PWR-plant simulation illustrate the potential speed-up factors of the alternatives. It turns out that loop-level parallelism and domain decomposition are the most promising alternatives for employing parallel processing. (author)

  8. Combining Public Domain and Professional Panoramic Imagery for the Accurate and Dense 3d Reconstruction of the Destroyed Bel Temple in Palmyra

    Science.gov (United States)

    Wahbeh, W.; Nebiker, S.; Fangi, G.

    2016-06-01

    This paper exploits the potential of dense multi-image 3d reconstruction of destroyed cultural heritage monuments by either using public domain touristic imagery only or by combining the public domain imagery with professional panoramic imagery. The focus of our work is placed on the reconstruction of the temple of Bel, one of the Syrian heritage monuments, which was destroyed in September 2015 by the so-called "Islamic State". The great temple of Bel is considered one of the most important religious buildings of the 1st century AD in the East, with a unique design. The investigations and the reconstruction were carried out using two types of imagery. The first are freely available generic touristic photos collected from the web. The second are panoramic images captured in 2010 for documenting those monuments. In the paper we present a 3d reconstruction workflow for both types of imagery using state-of-the-art dense image matching software, addressing the non-trivial challenges of combining uncalibrated public domain imagery with panoramic images with very wide baselines. We subsequently investigate the aspects of accuracy and completeness obtainable from the public domain touristic images alone and from the combination with spherical panoramas. We furthermore discuss the challenges of co-registering the weakly connected 3d point cloud fragments resulting from the limited coverage of the touristic photos. We then describe an approach using spherical photogrammetry as a virtual topographic survey allowing the co-registration of a detailed and accurate single 3d model of the temple interior and exterior.

  9. FISH: A THREE-DIMENSIONAL PARALLEL MAGNETOHYDRODYNAMICS CODE FOR ASTROPHYSICAL APPLICATIONS

    International Nuclear Information System (INIS)

    Kaeppeli, R.; Whitehouse, S. C.; Scheidegger, S.; Liebendoerfer, M.; Pen, U.-L.

    2011-01-01

    FISH is a fast and simple ideal magnetohydrodynamics code that scales to ∼10,000 processes for a Cartesian computational domain of ∼1000³ cells. The simplicity of FISH has been achieved by the rigorous application of the operator splitting technique, while second-order accuracy is maintained by the symmetric ordering of the operators. Between directional sweeps, the three-dimensional data are rotated in memory so that the sweep is always performed in a cache-efficient way along the direction of contiguous memory. Hence, the code only requires a one-dimensional description of the conservation equations to be solved. This approach also enables an elegant novel parallelization of the code that is based on persistent communications with MPI for cubic domain decomposition on machines with distributed memory. This scheme is then combined with an additional OpenMP parallelization of different sweeps that can take advantage of clusters of shared memory. We document the detailed implementation of a second-order total variation diminishing advection scheme based on flux reconstruction. The magnetic fields are evolved by a constrained transport scheme. We show that the subtraction of a simple estimate of the hydrostatic gradient from the total gradients can significantly reduce the dissipation of the advection scheme in simulations of gravitationally bound hydrostatic objects. Through its simplicity and efficiency, FISH is as well suited for hydrodynamics classes as for large-scale astrophysical simulations on high-performance computer clusters. In preparation for the release of a public version, we demonstrate the performance of FISH in a suite of astrophysically orientated test cases.
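    The sweep-then-rotate strategy described above can be sketched with a trivial 2-D periodic advection step done as two 1-D sweeps, transposing between them so each sweep runs along contiguous rows (illustrative only; FISH's actual sweeps are second-order TVD MHD updates):

```python
# Dimensional (operator) splitting: advance the 2-D solution by one
# cell in x, then transpose and do the identical 1-D update for y.

def shift_rows(grid):                    # 1-D periodic shift by one cell
    return [[row[-1]] + row[:-1] for row in grid]

def transpose(grid):                     # stands in for FISH's in-memory rotation
    return [list(col) for col in zip(*grid)]

g = [[1, 0, 0],
     [0, 0, 0],
     [0, 0, 0]]
g = shift_rows(g)                        # x sweep
g = transpose(shift_rows(transpose(g)))  # y sweep, again along contiguous rows
# the unit blob has moved one cell in x and one cell in y
```

The payoff is exactly what the abstract claims: only a one-dimensional update routine is ever needed, and each sweep touches memory contiguously.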

  10. Quantifying the mechanisms of domain gain in animal proteins.

    Science.gov (United States)

    Buljan, Marija; Frankish, Adam; Bateman, Alex

    2010-01-01

    Protein domains are protein regions that are shared among different proteins and are frequently functionally and structurally independent from the rest of the protein. Novel domain combinations have a major role in evolutionary innovation. However, the relative contributions of the different molecular mechanisms that underlie domain gains in animals are still unknown. By using animal gene phylogenies we were able to identify a set of high confidence domain gain events and by looking at their coding DNA investigate the causative mechanisms. Here we show that the major mechanism for gains of new domains in metazoan proteins is likely to be gene fusion through joining of exons from adjacent genes, possibly mediated by non-allelic homologous recombination. Retroposition and insertion of exons into ancestral introns through intronic recombination are, in contrast to previous expectations, only minor contributors to domain gains and have accounted for less than 1% and 10% of high confidence domain gain events, respectively. Additionally, exonization of previously non-coding regions appears to be an important mechanism for addition of disordered segments to proteins. We observe that gene duplication has preceded domain gain in at least 80% of the gain events. The interplay of gene duplication and domain gain demonstrates an important mechanism for fast neofunctionalization of genes.

  11. Assessment of current cybersecurity practices in the public domain : cyber indications and warnings domain.

    Energy Technology Data Exchange (ETDEWEB)

    Hamlet, Jason R.; Keliiaa, Curtis M.

    2010-09-01

    This report assesses current public domain cyber security practices with respect to cyber indications and warnings. It describes cybersecurity industry and government activities, including cybersecurity tools, methods, practices, and international and government-wide initiatives known to be impacting current practice. Of particular note are the U.S. Government's Trusted Internet Connection (TIC) and 'Einstein' programs, which are serving to consolidate the Government's internet access points and to provide some capability to monitor and mitigate cyber attacks. Next, this report catalogs activities undertaken by various industry and government entities. In addition, it assesses the benchmarks of HPC capability and other HPC attributes that may lend themselves to assist in the solution of this problem. This report draws few conclusions, as it is intended to assess current practice in preparation for future work; however, no explicit references to HPC usage for the purpose of analyzing cyber infrastructure in near-real-time were found in current practice. This report and a related SAND2010-4766 National Cyber Defense High Performance Computing and Analysis: Concepts, Planning and Roadmap report are intended to provoke discussion throughout a broad audience about developing a cohesive HPC-centric solution to wide-area cybersecurity problems.

  12. QC-LDPC code-based cryptography

    CERN Document Server

    Baldi, Marco

    2014-01-01

    This book describes the fundamentals of cryptographic primitives based on quasi-cyclic low-density parity-check (QC-LDPC) codes, with a special focus on the use of these codes in public-key cryptosystems derived from the McEliece and Niederreiter schemes. In the first part of the book, the main characteristics of QC-LDPC codes are reviewed, and several techniques for their design are presented, while tools for assessing the error correction performance of these codes are also described. Some families of QC-LDPC codes that are best suited for use in cryptography are also presented. The second part of the book focuses on the McEliece and Niederreiter cryptosystems, both in their original forms and in some subsequent variants. The applicability of QC-LDPC codes in these frameworks is investigated by means of theoretical analyses and numerical tools, in order to assess their benefits and drawbacks in terms of system efficiency and security. Several examples of QC-LDPC code-based public key cryptosystems are presented…

  13. Energy information data base: report number codes

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used. (RWR)

  14. Energy information data base: report number codes

    International Nuclear Information System (INIS)

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used

  15. Numeral series hidden in the distribution of atomic mass of amino acids to codon domains in the genetic code.

    Science.gov (United States)

    Wohlin, Åsa

    2015-03-21

    The distribution of codons in the nearly universal genetic code is a long discussed issue. At the atomic level, the numeral series 2x² (x = 5–0) lies behind electron shells and orbitals. Numeral series appear in formulas for spectral lines of hydrogen. The question here was if some similar scheme could be found in the genetic code. A table of 24 codons was constructed (synonyms counted as one) for 20 amino acids, four of which have two different codons. An atomic mass analysis was performed, built on common isotopes. It was found that a numeral series 5 to 0 with exponent 2/3 times 10² revealed detailed congruency with codon-grouped amino acid side-chains, simultaneously with the division on atom kinds, further with main 3rd-base groups, backbone chains and with codon-grouped amino acids in relation to their origin from glycolysis or the citrate cycle. Hence, it is proposed that this series in a dynamic way may have guided the selection of amino acids into codon domains. Series with simpler exponents also showed noteworthy correlations with the atomic mass distribution on main codon domains; especially the 2x²-series times a factor 16 appeared as a conceivable underlying level, both for the atomic mass and charge distribution. Furthermore, it was found that atomic mass transformations between numeral systems, possibly interpretable as dimension degree steps, connected the atomic mass of codon bases with codon-grouped amino acids and with the exponent-2/3 series in several astonishing ways. Thus, it is suggested that they may be part of a deeper reference system. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.
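    For reference, the 2x² series invoked above is the same series that, for x = n, gives the 2n² electron capacities of atomic shells:

```python
# The 2x^2 series for x = 5 down to 0, as cited in the abstract.
series = [2 * x**2 for x in range(5, -1, -1)]   # [50, 32, 18, 8, 2, 0]
```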

  16. Development of 2-d cfd code

    International Nuclear Information System (INIS)

    Mirza, S.A.

    1999-01-01

    In the present study, a two-dimensional computer code has been developed in FORTRAN using the CFD technique, which is basically a numerical scheme. This computer code solves the Navier-Stokes equations and the continuity equation to find the velocity and pressure fields within a given domain. The analysis has been done for the flow developed within a square cavity driven by its upper wall, which has become a benchmark for testing and comparing newly developed numerical schemes. Before handling this task, different one-dimensional cases were studied by the CFD technique and their FORTRAN programs written. The cases studied are Couette flow and Poiseuille flow, with and without using a symmetric boundary condition. Finally, a comparison between the CFD results and analytical results has also been made. For the cavity flow, results from the developed code have been obtained for different Reynolds numbers, which are presented in the form of velocity vectors. The results of the developed code have been compared with results obtained from the shareware version of a commercially available code for a Reynolds number of 10.0. The disagreement in the results, quantitative and qualitative, at some grid points of the calculation domain has been discussed, and future recommendations in this regard have also been made. (author)
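    One of the 1-D warm-up cases mentioned, steady plane Couette flow, can be reproduced with a few lines of relaxation and checked against the analytic linear profile (a Python sketch of the standard exercise, not the author's FORTRAN code):

```python
# Steady Couette flow: d^2u/dy^2 = 0 with u(0) = 0 and u(H) = U_wall.
# Jacobi relaxation of the discrete Laplacian converges to the analytic
# linear profile u(y) = U_wall * y / H.

N, U_wall = 11, 1.0                    # grid points, moving-wall speed
u = [0.0] * N
u[-1] = U_wall                         # no-slip bottom, driven top wall

for _ in range(2000):                  # iterate to steady state
    u = [u[0]] + [0.5 * (u[i - 1] + u[i + 1]) for i in range(1, N - 1)] + [u[-1]]

analytic = [U_wall * i / (N - 1) for i in range(N)]
max_err = max(abs(a - b) for a, b in zip(u, analytic))
```

The same relaxation loop, applied to the vorticity-streamfunction or pressure-correction equations in 2-D, is the building block of lid-driven-cavity solvers like the one described.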

  17. Efficient convolutional sparse coding

    Science.gov (United States)

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M³N) to O(MN log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
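    The core trick, solving the main linear system in the frequency domain, can be illustrated in the single-filter case: circular convolution becomes pointwise multiplication, so the "solve" is a pointwise division per frequency. A naive O(N²) DFT keeps this sketch dependency-free; the paper's ADMM solver handles full dictionaries and uses the FFT.

```python
# Recover x from y = h (*) x (circular convolution) by dividing in the
# frequency domain: X_k = Y_k / H_k, then inverse-transform.
import cmath

def dft(x, sign=-1):
    N = len(x)
    return [sum(x[n] * cmath.exp(sign * 2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def idft(X):
    N = len(X)
    return [v / N for v in dft(X, sign=+1)]

def circ_conv(h, x):
    N = len(x)
    return [sum(h[m] * x[(n - m) % N] for m in range(N)) for n in range(N)]

h = [1.0, 0.5, 0.0, 0.0]               # filter (chosen so no H_k is zero)
x = [3.0, -1.0, 2.0, 0.5]              # signal to recover
y = circ_conv(h, x)                    # observation

H, Y = dft(h), dft(y)
X = [yk / hk for yk, hk in zip(Y, H)]  # frequency-domain solve
x_rec = [v.real for v in idft(X)]      # matches x up to rounding
```

In the full ADMM algorithm each frequency index carries a small M-dimensional linear system rather than a scalar division, which is where the O(MN log N) total cost comes from.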

  18. Enhancing public access to legal information : A proposal for a new official legal information generic top-level domain

    NARCIS (Netherlands)

    Mitee, Leesi Ebenezer

    2017-01-01

    Abstract: This article examines the use of a new legal information generic Top-Level Domain (gTLD) as a viable tool for easy identification of official legal information websites (OLIWs) and enhancing global public access to their resources. This intervention is necessary because of the existence of

  19. Into the Dark Domain: The UK Web Archive as a Source for the Contemporary History of Public Health

    Science.gov (United States)

    Gorsky, Martin

    2015-01-01

    With the migration of the written record from paper to digital format, archivists and historians must urgently consider how web content should be conserved, retrieved and analysed. The British Library has recently acquired a large number of UK domain websites, captured 1996–2010, a collection colloquially termed the Dark Domain Archive while technical issues surrounding user access are resolved. This article reports the results of an invited pilot project that explores methodological issues surrounding use of this archive. It asks how the relationship between UK public health and local government was represented on the web, drawing on the ‘declinist’ historiography to frame its questions. It points up some difficulties in developing an aggregate picture of web content due to duplication of sites. It also highlights their potential for thematic and discourse analysis, using both text and image, illustrated through an argument about the contradictory rationale for public health policy under New Labour. PMID:26217072

  20. Simulating Coupling Complexity in Space Plasmas: First Results from a new code

    Science.gov (United States)

    Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.

    2005-12-01

    mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions. 3) To develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time supports and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes had a transforming effect on space and astrophysics. We expect that our new generation, open source, public domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.

  1. Noise Residual Learning for Noise Modeling in Distributed Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Forchhammer, Søren

    2012-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The noise model is one of the inherently difficult challenges in DVC. This paper considers Transform Domain Wyner-Ziv (TDWZ) coding and proposes...

  2. Code of Ethics

    Science.gov (United States)

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  3. Conversion of the agent-oriented domain-specific language ALAS into JavaScript

    Science.gov (United States)

    Sredojević, Dejan; Vidaković, Milan; Okanović, Dušan; Mitrović, Dejan; Ivanović, Mirjana

    2016-06-01

    This paper shows the generation of JavaScript code from code written in the agent-oriented domain-specific language ALAS. ALAS is a domain-specific language for writing software agents that are executed within the XJAF middleware. Since the agents can be executed on various platforms, they must be converted into a language of the target platform. We also try to utilize existing tools and technologies to make the whole conversion process as simple, fast, and efficient as possible. We use the Xtext framework, which is compatible with Java, to implement the ALAS infrastructure - the editor and the code generator. Since Xtext supports Java, generation of Java code from ALAS code is straightforward. To generate JavaScript code that will be executed within the target JavaScript XJAF implementation, the Google Web Toolkit (GWT) is used.
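
    As a toy illustration of template-based code generation of the kind described (the real ALAS tooling is built on Xtext and GWT and is far richer; all names below are invented), a generator can walk a small agent description and emit a JavaScript class:

```python
# Toy sketch of DSL-to-JavaScript code generation. A hypothetical "agent"
# is described as a name plus a mapping of message types to handler
# bodies, and a JavaScript class is emitted from a simple template.
def generate_js_agent(name, handlers):
    lines = [f"class {name} {{"]
    for msg, body in handlers.items():
        lines.append(f"  on_{msg}(payload) {{ {body} }}")
    lines.append("}")
    return "\n".join(lines)

js = generate_js_agent("PingAgent", {"ping": "return 'pong';"})
```

    A real generator would of course parse the DSL into a model first and handle types, imports, and the target middleware's runtime API.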

  4. A Domain-Specific Terminology for Retinopathy of Prematurity and Its Applications in Clinical Settings.

    Science.gov (United States)

    Zhang, Yinsheng; Zhang, Guoming

    2018-01-01

    A terminology (or coding system) is a formal set of controlled vocabulary in a specific domain. With a well-defined terminology, each concept in the target domain is assigned a unique code, which can be identified and processed across different medical systems in an unambiguous way. Though there are many well-known biomedical terminologies, there is currently no domain-specific terminology for ROP (retinopathy of prematurity). Based on a collection of historical ROP patients' data in the electronic medical record system, we extracted the most frequent terms in the domain and organized them into a hierarchical coding system, the ROP Minimal Standard Terminology, which contains 62 core concepts in 4 categories. This terminology has been successfully used to provide highly structured and semantic-rich clinical data in several ROP-related applications.
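
    The prefix structure of such a hierarchical coding system can be illustrated with a minimal sketch (the codes and terms below are invented placeholders, not entries of the actual ROP Minimal Standard Terminology):

```python
# Hypothetical hierarchical terminology: each concept has a dotted code
# whose prefix encodes its category, so category membership can be tested
# by code prefix alone. All codes and terms here are invented.
ROP_TERMS = {
    "1":   "diagnosis",
    "1.1": "stage 1 (demarcation line)",
    "1.2": "stage 2 (ridge)",
    "2":   "treatment",
    "2.1": "laser photocoagulation",
}

def in_category(code, category_code):
    # True if `code` denotes the category itself or any descendant of it
    return code == category_code or code.startswith(category_code + ".")
```

    Because each code is unique and its ancestry is recoverable from the code string itself, different systems can exchange and aggregate such records unambiguously.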

  5. A Domain-Specific Terminology for Retinopathy of Prematurity and Its Applications in Clinical Settings

    Directory of Open Access Journals (Sweden)

    Yinsheng Zhang

    2018-01-01

    Full Text Available A terminology (or coding system) is a formal set of controlled vocabulary in a specific domain. With a well-defined terminology, each concept in the target domain is assigned with a unique code, which can be identified and processed across different medical systems in an unambiguous way. Though there are lots of well-known biomedical terminologies, there is currently no domain-specific terminology for ROP (retinopathy of prematurity). Based on a collection of historical ROP patients’ data in the electronic medical record system, we extracted the most frequent terms in the domain and organized them into a hierarchical coding system—ROP Minimal Standard Terminology, which contains 62 core concepts in 4 categories. This terminology has been successfully used to provide highly structured and semantic-rich clinical data in several ROP-related applications.

  6. Ray-tracing 3D dust radiative transfer with DART-Ray: code upgrade and public release

    Science.gov (United States)

    Natale, Giovanni; Popescu, Cristina C.; Tuffs, Richard J.; Clarke, Adam J.; Debattista, Victor P.; Fischera, Jörg; Pasetto, Stefano; Rushton, Mark; Thirlwall, Jordan J.

    2017-11-01

    We present an extensively updated version of the purely ray-tracing 3D dust radiation transfer code DART-Ray. The new version includes five major upgrades: 1) a series of optimizations for the ray-angular density and the scattered radiation source function; 2) the implementation of several data and task parallelizations using hybrid MPI+OpenMP schemes; 3) the inclusion of dust self-heating; 4) the ability to produce surface brightness maps for observers within the models in HEALPix format; 5) the possibility to set the expected numerical accuracy already at the start of the calculation. We tested the updated code with benchmark models where the dust self-heating is not negligible. Furthermore, we performed a study of the extent of the source influence volumes, using galaxy models, which are critical in determining the efficiency of the DART-Ray algorithm. The new code is publicly available, documented for both users and developers, and accompanied by several programmes to create input grids for different model geometries and to import the results of N-body and SPH simulations. These programmes can be easily adapted to different input geometries, and for different dust models or stellar emission libraries.

  7. Patterns, principles, and practices of domain-driven design

    CERN Document Server

    Millett, Scott

    2015-01-01

    Methods for managing complex software construction following the practices, principles and patterns of Domain-Driven Design, with code examples in C#. This book presents the philosophy of Domain-Driven Design (DDD) in a down-to-earth and practical manner for experienced developers building applications for complex domains. A focus is placed on the principles and practices of decomposing a complex problem space as well as the implementation patterns and best practices for shaping a maintainable solution space. You will learn how to build effective domain models through the use of tactical pat

  8. Enhancements to the Combinatorial Geometry Particle Tracker in the Mercury Monte Carlo Transport Code: Embedded Meshes and Domain Decomposition

    International Nuclear Information System (INIS)

    Greenman, G.M.; O'Brien, M.J.; Procassini, R.J.; Joy, K.I.

    2009-01-01

    Two enhancements to the combinatorial geometry (CG) particle tracker in the Mercury Monte Carlo transport code are presented. The first enhancement is a hybrid particle tracker wherein a mesh region is embedded within a CG region. This method permits efficient calculations of problems which contain both large-scale heterogeneous and homogeneous regions. The second enhancement relates to the addition of parallelism within the CG tracker via spatial domain decomposition. This permits calculations of problems with a large degree of geometric complexity, which are not possible through particle parallelism alone. In this method, the cells are decomposed across processors and a particle is communicated to an adjacent processor when it tracks to an interprocessor boundary. Applications that demonstrate the efficacy of these new methods are presented

  9. kspectrum: an open-source code for high-resolution molecular absorption spectra production

    International Nuclear Information System (INIS)

    Eymet, V.; Coustet, C.; Piaud, B.

    2016-01-01

    We present kspectrum, a scientific code that produces high-resolution synthetic absorption spectra from public molecular transition parameter databases. This code was originally required by the atmospheric and astrophysics communities, and its evolution is now driven by new scientific projects among the user community. Since it was designed without any optimization specific to a particular application field, its use can also be extended to other domains. kspectrum produces spectral data that can subsequently be used either for high-resolution radiative transfer simulations, or for producing statistical spectral model parameters using additional tools. It is an open project that aims at providing an up-to-date tool that takes advantage of modern computational hardware and recent parallelization libraries. It is currently provided by Méso-Star (http://www.meso-star.com) under the CeCILL license, and benefits from regular updates and improvements. (paper)
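
    Schematically, a line-by-line code of this kind sums one broadening profile per molecular transition onto a spectral grid. The fragment below is a deliberately simplified illustration with Lorentzian lines and an invented transition list; kspectrum's actual treatment (Voigt profiles, line-wing handling, pressure and temperature dependence) is far more elaborate:

```python
import numpy as np

# Toy line-by-line spectrum: sum Lorentzian profiles, one per transition,
# onto a wavenumber grid. The transition parameters are invented.
def absorption_spectrum(nu_grid, lines):
    # lines: iterable of (center, intensity, half_width) tuples
    k = np.zeros_like(nu_grid)
    for nu0, S, gamma in lines:
        # Area-normalized Lorentzian of integrated intensity S
        k += S * (gamma / np.pi) / ((nu_grid - nu0) ** 2 + gamma**2)
    return k

grid = np.linspace(990.0, 1010.0, 2001)       # cm^-1, illustrative range
k = absorption_spectrum(grid, [(1000.0, 1.0, 0.1)])
```

    A real high-resolution code must also choose the grid finely enough to resolve every line core, which is what makes such computations expensive.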

  10. Locality-preserving logical operators in topological stabilizer codes

    Science.gov (United States)

    Webster, Paul; Bartlett, Stephen D.

    2018-01-01

    Locality-preserving logical operators in topological codes are naturally fault tolerant, since they preserve the correctability of local errors. Using a correspondence between such operators and gapped domain walls, we describe a procedure for finding all locality-preserving logical operators admitted by a large and important class of topological stabilizer codes. In particular, we focus on those equivalent to a stack of a finite number of surface codes of any spatial dimension, where our procedure fully specifies the group of locality-preserving logical operators. We also present examples of how our procedure applies to codes with different boundary conditions, including color codes and toric codes, as well as more general codes such as Abelian quantum double models and codes with fermionic excitations in more than two dimensions.

  11. (Nearly) portable PIC code for parallel computers

    International Nuclear Information System (INIS)

    Decyk, V.K.

    1993-01-01

    As part of the Numerical Tokamak Project, the author has developed a (nearly) portable, one-dimensional version of the GCPIC algorithm for particle-in-cell codes on parallel computers. This algorithm uses a spatial domain decomposition for the fields, and passes particles from one domain to another as the particles move spatially. With only minor changes, the code has been run in parallel on the Intel Delta, the Cray C-90, the IBM ES/9000 and a cluster of workstations. After a line by line translation into cmfortran, the code was also run on the CM-200. Impressive speeds have been achieved, both on the Intel Delta and the Cray C-90, around 30 nanoseconds per particle per time step. In addition, the author was able to isolate the data management modules, so that the physics modules were not changed much from their sequential version, and the data management modules can be used as "black boxes".
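
    The particle-passing step of such a domain-decomposed PIC scheme can be mimicked serially in a few lines (a toy stand-in with invented names, not the GCPIC code): each slab of a periodic 1D domain owns the particles inside it, and after a push every particle is reassigned to the slab that now contains it:

```python
import numpy as np

# Toy serial stand-in for the particle exchange in a domain-decomposed
# PIC code: slabs of equal width partition a periodic domain of length L,
# and particles that left a slab are handed to the slab that now owns them.
def exchange(domains, L, ndom):
    # domains: list of 1D position arrays, one per slab of width L/ndom
    width = L / ndom
    moved = np.concatenate(domains) % L                  # periodic wrap
    owner = (moved // width).astype(int).clip(0, ndom - 1)
    return [moved[owner == d] for d in range(ndom)]
```

    In the parallel code this reassignment becomes interprocessor communication, which is precisely the part that can be isolated into a reusable data management module.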

  12. Validation of a CFD code for Unsteady Flows with cyclic boundary Conditions

    International Nuclear Information System (INIS)

    Kim, Jong-Tae; Kim, Sang-Baik; Lee, Won-Jae

    2006-01-01

    Currently the LILAC code is under development to analyze the thermo-hydraulics of a high-temperature gas-cooled reactor (GCR). Interesting thermo-hydraulic phenomena in a nuclear reactor are usually unsteady and turbulent. The analysis of unsteady flows using a three-dimensional CFD code is time-consuming if the flow domain is very large. Fortunately, flow domains commonly encountered in nuclear thermo-hydraulics are often periodic, so it is better to exploit these geometrical characteristics in order to reduce the computational resources. To obtain the benefits of reduced computational domains, especially for calculations of unsteady flows, cyclic boundary conditions have been implemented in the parallelized CFD code LILAC. In this study, the parallelized cyclic boundary conditions are validated by solving unsteady laminar and turbulent flows past a circular cylinder
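
    On a single process, the effect of a cyclic boundary condition can be sketched with a 1D upwind advection step in which np.roll stands in for the ghost-cell exchange a parallel code performs between the two ends of the domain (illustrative only, not LILAC code):

```python
import numpy as np

# First-order upwind advection of a scalar phi on a periodic 1D grid.
# np.roll(phi, 1) supplies the upstream neighbour across the cyclic
# boundary, playing the role of a ghost-cell exchange.
def advect_periodic(phi, c, dx, dt, steps):
    for _ in range(steps):
        phi = phi - c * dt / dx * (phi - np.roll(phi, 1))   # valid for c > 0
    return phi
```

    On a periodic domain the advected quantity is conserved and, at unit CFL number, a pulse returns exactly to its starting position after traversing the grid once.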

  13. Domain Specific Language Support for Exascale

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2017-10-20

    A multi-institutional project known as D-TEC (short for “Domain-specific Technology for Exascale Computing”) set out to explore technologies to support the construction of Domain Specific Languages (DSLs) to map application programs to exascale architectures. DSLs employ automated code transformation to shift the burden of delivering portable performance from application programmers to compilers. Two chief properties contribute: DSLs permit expression at a high level of abstraction so that a programmer’s intent is clear to a compiler and DSL implementations encapsulate human domain-specific optimization knowledge so that a compiler can be smart enough to achieve good results on specific hardware. Domain specificity is what makes these properties possible in a programming language. If leveraging domain specificity is the key to keep exascale software tractable, a corollary is that many different DSLs will be needed to encompass the full range of exascale computing applications; moreover, a single application may well need to use several different DSLs in conjunction. As a result, developing a general toolkit for building domain-specific languages was a key goal for the D-TEC project. Different aspects of the D-TEC research portfolio were the focus of work at each of the partner institutions in the multi-institutional project. D-TEC research and development work at Rice University focused on three principal topics: understanding how to automate the tuning of code for complex architectures, research and development of the Rosebud DSL engine, and compiler technology to support complex execution platforms. This report provides a summary of the research and development work on the D-TEC project at Rice University.

  14. Improved side information generation for distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2008-01-01

    As a new coding paradigm, distributed video coding (DVC) deals with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. The performance of DVC highly depends on the quality of side information. With a better side information generation method, fewer bits will be requested from the encoder and more reliable decoded frames will be obtained. In this paper, a side information generation method is introduced to further improve the rate-distortion (RD) performance of transform domain distributed video coding. This algorithm...

  15. Collection of regulatory texts related to radiation protection (collection of legal and regulatory measures related to radiation protection). Part 1: laws and decrees (Extracts of the Public Health Code and of the Labour Code dealing with the protection of population, patients and workers against the hazards of ionizing radiations); Part 2: orders, decisions, non codified decrees (Orders and decisions taken in application of the Public Health Code and of the Labour Code dealing with the protection of population, patients and workers against the hazards of ionizing radiations)

    International Nuclear Information System (INIS)

    Rivas, R.; Saad, N.; Niel, X.; Cottin, V.; Lachaume, J.L.; Feries, J.

    2011-01-01

    The first part contains legal and regulatory texts extracted from the Public Health Code related to general health protection and to health products (medical devices), from the Social Security Code, and from the Labour Code related to individual work relationships, to health and safety at work, to work places, to work equipment and means of protection, to the prevention of some exposure risks and of risks related to some activities. The second part gathers texts extracted from the Public Health Code related to ionizing radiations (general measures for the protection of the population, exposure to natural radiations, general regime of authorizations and declarations, purchase, retailing, importation, exportation, transfer and elimination of radioactive sources, protection of persons exposed to ionizing radiations for medical or forensic purposes, situations of radiological emergency and of sustained exposure to ionizing radiations, control), to the safety of waters and food products, to the control of medical devices, and to the protection of patients. It also contains extracts from the Labour Code related to workers' protection

  16. A multiscale numerical algorithm for heat transfer simulation between multidimensional CFD and monodimensional system codes

    Science.gov (United States)

    Chierici, A.; Chirco, L.; Da Vià, R.; Manservisi, S.; Scardovelli, R.

    2017-11-01

    Nowadays the rapidly-increasing computational power allows scientists and engineers to perform numerical simulations of complex systems that can involve many scales and several different physical phenomena. In order to perform such simulations, two main strategies can be adopted: one may develop a new numerical code where all the physical phenomena of interest are modelled or one may couple existing validated codes. With the latter option, the creation of a huge and complex numerical code is avoided but efficient methods for data exchange are required since the performance of the simulation is highly influenced by its coupling techniques. In this work we propose a new algorithm that can be used for volume and/or boundary coupling purposes for both multiscale and multiphysics numerical simulations. The proposed algorithm is used for a multiscale simulation involving several CFD domains and monodimensional loops. We adopt the overlapping domain strategy, so the entire flow domain is simulated with the system code. We correct the system code solution by matching averaged inlet and outlet fields located at the boundaries of the CFD domains that overlap parts of the monodimensional loop. In particular we correct pressure losses and enthalpy values with source-sink terms that are imposed in the system code equations. The 1D-CFD coupling is a defective one since the CFD code requires point-wise values on the coupling interfaces and the system code provides only averaged quantities. In particular we impose, as inlet boundary conditions for the CFD domains, the mass flux and the mean enthalpy that are calculated by the system code. With this method the mass balance is preserved at every time step of the simulation. The coupling between consecutive CFD domains is not a defective one since with the proposed algorithm we can interpolate the field solutions on the boundary interfaces. We use the MED data structure as the base structure where all the field operations are
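
    The defective interface described above can be sketched as a pair of reduction and reconstruction operators (our own illustration, with invented names): averaging the point-wise CFD outlet fields down to the scalars a system code handles, and imposing the system-code scalars uniformly over the CFD inlet face:

```python
import numpy as np

# Sketch of a "defective" 1D-CFD coupling interface: the CFD outlet field
# is reduced to the averaged quantities a monodimensional system code
# uses, and the scalars coming back are imposed uniformly at the inlet.
def cfd_to_system(outlet_velocity, outlet_enthalpy, face_areas):
    mass_flux = np.sum(outlet_velocity * face_areas)          # integrated flux
    mean_h = np.average(outlet_enthalpy, weights=face_areas)  # area-weighted
    return mass_flux, mean_h

def system_to_cfd(mass_flux, mean_h, face_areas):
    # Uniform point-wise inlet values that reproduce the 1D quantities
    velocity = np.full(face_areas.size, mass_flux / face_areas.sum())
    enthalpy = np.full(face_areas.size, mean_h)
    return velocity, enthalpy
```

    The round trip preserves the integrated mass flux and the area-weighted mean enthalpy, which is the balance property the coupling must maintain at every time step.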

  17. MACCS version 1.5.11.1: A maintenance release of the code

    International Nuclear Information System (INIS)

    Chanin, D.; Foster, J.; Rollstin, J.; Miller, L.

    1993-10-01

    A new version of the MACCS code (version 1.5.11.1) has been developed by Sandia National Laboratories under sponsorship of the US Nuclear Regulatory Commission. MACCS was developed to support evaluations of the off-site consequences from hypothetical severe accidents at commercial power plants. MACCS is the only current public domain code in the US that embodies all of the following modeling capabilities: (1) weather sampling using a year of recorded weather data; (2) mitigative actions such as evacuation, sheltering, relocation, decontamination, and interdiction; (3) economic costs of mitigative actions; (4) cloudshine, groundshine, and inhalation pathways as well as food and water ingestion; (5) calculation of both individual and societal doses to various organs; and (6) calculation of both acute (nonstochastic) and latent (stochastic) health effects and risks of health effects. All of the consequence measures may be generated in the form of a complementary cumulative distribution function (CCDF). The current version implements a revised cancer model consistent with recent reports such as BEIR V and ICRP 60. In addition, a number of error corrections and portability enhancements have been implemented. This report describes only the changes made in creating the new version. Users of the code will need to obtain the code's original documentation, NUREG/CR-4691
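
    A CCDF of the kind MACCS reports is straightforward to form from sampled consequences: for each magnitude it gives the fraction of weather trials whose consequence equals or exceeds that magnitude. A minimal sketch (ours, not MACCS code):

```python
import numpy as np

# Complementary cumulative distribution function (CCDF) from samples:
# for each threshold t, the fraction of samples with value >= t.
def ccdf(samples, thresholds):
    samples = np.asarray(samples)
    return np.array([(samples >= t).mean() for t in thresholds])
```

    Plotting such a curve against consequence magnitude is the standard way of presenting "probability of exceeding X" results from weather-sampled consequence calculations.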

  18. Syrthes thermal code and Estet or N3S fluid mechanics codes coupling; Couplage du code de thermique Syrthes et des codes de mecanique des fluides N3S et ou Estet

    Energy Technology Data Exchange (ETDEWEB)

    Peniguel, C [Electricite de France (EDF), 78 - Chatou (France). Direction des Etudes et Recherches; Rupp, I [SIMULOG, 78 - Guyancourt (France)

    1997-06-01

    EDF has developed numerical codes for modeling the conductive, radiative and convective thermal transfers and their couplings in complex industrial configurations: the convection in a fluid is solved by Estet in finite volumes or N3S in finite elements, the conduction is solved by Syrthes and the wall-to-wall thermal radiation is modelled by Syrthes with the help of a radiosity method. Syrthes controls the different heat exchanges which may occur between fluid and solid domains, using an explicit iterative method. An extension of Syrthes has been developed in order to allow the consideration of configurations where several fluid codes operate simultaneously, using "message passing" tools such as PVM (Parallel Virtual Machine) and the Calcium code coupler developed at EDF. Application examples are given

  19. The Regulatory and Kinase Domains but Not the Interdomain Linker Determine Human Double-stranded RNA-activated Kinase (PKR) Sensitivity to Inhibition by Viral Non-coding RNAs.

    Science.gov (United States)

    Sunita, S; Schwartz, Samantha L; Conn, Graeme L

    2015-11-20

    Double-stranded RNA (dsRNA)-activated protein kinase (PKR) is an important component of the innate immune system that presents a crucial first line of defense against viral infection. PKR has a modular architecture comprising a regulatory N-terminal dsRNA binding domain and a C-terminal kinase domain interposed by an unstructured ∼80-residue interdomain linker (IDL). Guided by sequence alignment, we created IDL deletions in human PKR (hPKR) and regulatory/kinase domain swap human-rat chimeric PKRs to assess the contributions of each domain and the IDL to regulation of the kinase activity by RNA. Using circular dichroism spectroscopy, limited proteolysis, kinase assays, and isothermal titration calorimetry, we show that each PKR protein is properly folded with similar domain boundaries and that each exhibits comparable polyinosinic-cytidylic (poly(rI:rC)) dsRNA activation profiles and binding affinities for adenoviral virus-associated RNA I (VA RNAI) and HIV-1 trans-activation response (TAR) RNA. From these results we conclude that the IDL of PKR is not required for RNA binding or mediating changes in protein conformation or domain interactions necessary for PKR regulation by RNA. In contrast, inhibition of rat PKR by VA RNAI and TAR RNA was found to be weaker than for hPKR by 7- and >300-fold, respectively, and each human-rat chimeric domain-swapped protein showed intermediate levels of inhibition. These findings indicate that PKR sequence or structural elements in the kinase domain, present in hPKR but absent in rat PKR, are exploited by viral non-coding RNAs to accomplish efficient inhibition of PKR. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  20. Domain-specific modeling enabling full code generation

    CERN Document Server

    Kelly, Steven

    2007-01-01

    Domain-Specific Modeling (DSM) is the latest approach to software development, promising to greatly increase the speed and ease of software creation. Early adopters of DSM have been enjoying productivity increases of 500–1000% in production for over a decade. This book introduces DSM and offers examples from various fields to illustrate to experienced developers how DSM can improve software development in their teams. Two authorities in the field explain what DSM is, why it works, and how to successfully create and use a DSM solution to improve productivity and quality. Divided into four parts, the book covers: background and motivation; fundamentals; in-depth examples; and creating DSM solutions. There is an emphasis throughout the book on practical guidelines for implementing DSM, including how to identify the necessary language constructs, how to generate full code from models, and how to provide tool support for a new DSM language. The example cases described in the book are available on the book's Website, www.dsmbook....

  1. Gene-Auto: Automatic Software Code Generation for Real-Time Embedded Systems

    Science.gov (United States)

    Rugina, A.-E.; Thomas, D.; Olive, X.; Veran, G.

    2008-08-01

    This paper gives an overview of the Gene-Auto ITEA European project, which aims at building a qualified C code generator from mathematical models under Matlab-Simulink and Scilab-Scicos. The project is driven by major European industry partners, active in the real-time embedded systems domains. The Gene-Auto code generator will significantly improve the current development processes in such domains by shortening the time to market and by guaranteeing the quality of the generated code through the use of formal methods. The first version of the Gene-Auto code generator has already been released and has gone through a validation phase on real-life case studies defined by each project partner. The validation results are taken into account in the implementation of the second version of the code generator. The partners aim at introducing the Gene-Auto results into industrial development by 2010.

  2. Automatic code generation in practice

    DEFF Research Database (Denmark)

    Adam, Marian Sorin; Kuhrmann, Marco; Schultz, Ulrik Pagh

    2016-01-01

    Mobile robots often use a distributed architecture in which software components are deployed to heterogeneous hardware modules. Ensuring the consistency with the designed architecture is a complex task, notably if functional safety requirements have to be fulfilled. We propose to use a domain-specific language to specify those requirements and to allow for generating a safety-enforcing layer of code, which is deployed to the robot. The paper at hand reports experiences in practically applying code generation to mobile robots. For two cases, we discuss how we addressed challenges, e.g., regarding weaving code generation into proprietary development environments and testing of manually written code. We find that a DSL based on the same conceptual model can be used across different kinds of hardware modules, but a significant adaptation effort is required in practical scenarios involving different kinds...

  3. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.
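
    For concreteness, the convolutional-code parameters n, k and m mentioned above can be seen in a toy rate-1/2 encoder (k = 1 input bit, n = 2 output bits per step, memory m = 2) with the common (7, 5) octal generators; this is a standard textbook encoder, shown only to fix the parameters, not the reconstruction method of the project:

```python
# Toy rate-1/2 convolutional encoder: k = 1, n = 2, memory m = 2,
# generator polynomials (7, 5) in octal, i.e. taps (1,1,1) and (1,0,1).
def conv_encode(bits, g=((1, 1, 1), (1, 0, 1))):
    state = [0, 0]                       # m = 2 memory bits
    out = []
    for b in bits:
        window = [b] + state             # current bit plus shift register
        for taps in g:                   # one output bit per generator
            out.append(sum(t * w for t, w in zip(taps, window)) % 2)
        state = [b, state[0]]            # shift the register
    return out
```

    For the input bits 1, 0, 1, 1 this encoder emits the output pairs 11 10 00 01; parameter estimation in the inverse problem amounts to recovering n, k, m and the tap polynomials from such output streams.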

  4. Development of environmental dose assessment system (EDAS) code of PC version

    Energy Technology Data Exchange (ETDEWEB)

    Taki, Mitsumasa; Kikuchi, Masamitsu; Kobayashi, Hideo; Yamaguchi, Takenori [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-05-01

    A computer code (EDAS) was developed to assess the public dose for the safety assessment required to license nuclear reactor operation. The code system is used for analysing doses to the public around a nuclear reactor in normal operation and in severe accidents. The code was revised and packaged for personal computer users according to the Nuclear Safety Guidelines reflecting the ICRP 1990 recommendation. These guidelines, revised by the Nuclear Safety Commission in March 2001, are 'Weather analysis guideline for the safety assessment of nuclear power reactor', 'Public dose around the facility assessment guideline corresponding to the objective value for nuclear power light water reactor' and 'Public dose assessment guideline for safety review of nuclear power light water reactor'. The code has already been opened to public users by JAERI, and an English version of the code and its user manual have also been prepared. The English version is helpful for international cooperation with JAERI on nuclear safety assessment. (author)

  5. Development of environmental dose assessment system (EDAS) code of PC version

    CERN Document Server

    Taki, M; Kobayashi, H; Yamaguchi, T

    2003-01-01

    A computer code (EDAS) was developed to assess the public dose for the safety assessment required to license nuclear reactor operation. The code system is used for analysing doses to the public around a nuclear reactor in normal operation and in severe accidents. The code was revised and packaged for personal computer users according to the Nuclear Safety Guidelines reflecting the ICRP 1990 recommendation. These guidelines, revised by the Nuclear Safety Commission in March 2001, are 'Weather analysis guideline for the safety assessment of nuclear power reactor', 'Public dose around the facility assessment guideline corresponding to the objective value for nuclear power light water reactor' and 'Public dose assessment guideline for safety review of nuclear power light water reactor'. The code has already been opened to public users by JAERI, and an English version of the code and its user manual have also been prepared. The English version is helpful for international cooperation concerning the nuclear safety assessme...

  6. Project of decree relative to the licensing and statement system of nuclear activities and to their control and bearing various modifications of the public health code and working code; Projet de decret relatif au regime d'autorisation et de declaration des activites nucleaires et a leur controle et portant diverses modifications du code de la sante publique et du code du travail

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    This decree concerns the control of high-level sealed radioactive sources and orphan sources. Its objectives are to introduce administrative simplification, especially in the radiation source licensing and statement system, to reinforce the control measures planned by the public health code and by the employment code, and to bring precision and complements to the wording of several already existing arrangements. (N.C.)

  7. DAE emergency response centre (ERC) at Kalpakkam for response to nuclear and radiological emergencies in public domain

    International Nuclear Information System (INIS)

    Meenakshisundaram, V.; Rajagopal, V.; Mathiyarasu, R.; Subramanian, V.; Rajaram, S.; Somayaji, K.M.; Kannan, V.; Rajagopalan, H.

    2008-01-01

    In India, the Department of Atomic Energy (DAE) has been identified as the nodal agency/authority for providing the necessary technical inputs in the event of any radiation emergency that may occur in the public domain. The overall system takes into consideration statutory requirements, executive decisions, as well as national and international obligations. This paper highlights the strengths of the Kalpakkam ERC, the other essential requisites, and their compliance since its formation.

  8. Variable weight Khazani-Syed code using hybrid fixed-dynamic technique for optical code division multiple access system

    Science.gov (United States)

    Anas, Siti Barirah Ahmad; Seyedzadeh, Saleh; Mokhtar, Makhfudzah; Sahbudin, Ratna Kalos Zakiah

    2016-10-01

    Future Internet consists of a wide spectrum of applications with different bit rates and quality of service (QoS) requirements. Prioritizing the services is essential to ensure that the delivery of information is at its best. Existing technologies have demonstrated how service differentiation techniques can be implemented in optical networks using data link and network layer operations. However, a physical layer approach can further improve system performance at a prescribed received signal quality by applying control at the bit level. This paper proposes a coding algorithm to support optical domain service differentiation using spectral amplitude coding techniques within an optical code division multiple access (OCDMA) scenario. A particular user or service has a varying weight applied to obtain the desired signal quality. The properties of the new code are compared with other OCDMA codes proposed for service differentiation. In addition, a mathematical model is developed for performance evaluation of the proposed code using two different detection techniques, namely direct decoding and complementary subtraction.

  9. New MoM code incorporating multiple domain basis functions

    CSIR Research Space (South Africa)

    Lysko, AA

    2011-08-01

    Full Text Available piecewise linear approximation of geometry. This often leads to an unnecessarily great number of unknowns used to model relatively small loop and spiral antennas, coils and other curved structures. This is because the program creates a dense mesh... to accelerate computation of the elements of the impedance matrix and showed acceleration factor exceeding an order of magnitude, subject to a high accuracy requirement. 3. On Code Functionality and Application Results The package of programs was written...

  10. 42 CFR 414.40 - Coding and ancillary policies.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Coding and ancillary policies. 414.40 Section 414.40 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Practitioners § 414.40 Coding and ancillary policies. (a) General rule. CMS establishes uniform national...

  11. The Influence of Gender and Ethnicity on the Choice of Language in the Transaction Domain of Language Use: The Case of Undergraduates

    Directory of Open Access Journals (Sweden)

    Mehdi Granhemat

    2015-09-01

    Full Text Available Multilingual individuals, consciously or unconsciously, are often confronted with having to select one linguistic code over another from within their linguistic repertoires. The choice of a proper linguistic code enables effective communication and can also promote solidarity among interlocutors. The focus of this study was to examine the influence of gender and ethnicity on the language choices of Malaysian youths in the transaction domain of language use. In sociolinguistic studies, domain is a theoretical concept that can be employed to explain how individual factors (in the case of this study, gender and ethnicity) relate to the language choices of multilinguals. Based on a random proportional stratified sampling strategy, a total of 498 local undergraduate students in a Malaysian public university were selected as respondents. The male and female respondents mostly belonged to the three main ethnic groups, i.e. the Malays, Chinese, and Indians; members of some other ethnic minority groups were also included. Data about the demographic profiles of the respondents and their language choices in the transaction domain were collected through a self-administered questionnaire survey. SPSS software was used to run analyses such as determining the respondents' most used languages, and a Chi-Square test was used to find the relationships between variables. According to the results, the linguistic situation in Malaysia is similar to a diglossic situation. Ethnicity was found to be influential in the choice and use of linguistic codes among Malaysian youths, but gender was not found to be a determinant of language choice in the transaction domain of language use.
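    The Chi-Square test mentioned in the record can be sketched in a few lines. The contingency table below is invented purely for illustration and is not the study's actual data:

```python
# Hypothetical 2x3 contingency table of language-choice counts
# (rows: male, female; columns: three language options). Invented data.
observed = [
    [40, 30, 10],
    [50, 45, 15],
]

def chi_square(table):
    """Pearson's chi-square statistic and degrees of freedom for an r x c table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

stat, df = chi_square(observed)
print(round(stat, 3), df)
```

    A small statistic relative to the critical value for the given degrees of freedom indicates no significant association, which is how the study could conclude that gender did not determine language choice.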

  12. Generating Graphical User Interfaces from Precise Domain Specifications

    OpenAIRE

    Kamil Rybiński; Norbert Jarzębowski; Michał Śmiałek; Wiktor Nowakowski; Lucyna Skrzypek; Piotr Łabęcki

    2014-01-01

    Turning requirements into working systems is the essence of software engineering. This paper proposes automation of one of the aspects of this vast problem: generating user interfaces directly from requirements models. It presents syntax and semantics of a comprehensible yet precise domain specification language. For this language, the paper presents the process of generating code for the user interface elements. This includes model transformation procedures to generate window initiation code...

  13. Selected DOE headquarters publications

    International Nuclear Information System (INIS)

    1979-04-01

    This publication provides listings of (mainly policy and programmatic) publications which have been issued by headquarters organizations of the Department of Energy; assigned a DOE/XXX- type report number code, where XXX is the 1- to 4-letter code for the issuing headquarters organization; received by the Energy Library; and made available to the public

  14. Orthogonal Multi-Carrier DS-CDMA with Frequency-Domain Equalization

    Science.gov (United States)

    Tanaka, Ken; Tomeba, Hiromichi; Adachi, Fumiyuki

    Orthogonal multi-carrier direct sequence code division multiple access (orthogonal MC DS-CDMA) is a combination of orthogonal frequency division multiplexing (OFDM) and time-domain spreading, while multi-carrier code division multiple access (MC-CDMA) is a combination of OFDM and frequency-domain spreading. In MC-CDMA, a good bit error rate (BER) performance can be achieved by using frequency-domain equalization (FDE), since the frequency diversity gain is obtained. On the other hand, the conventional orthogonal MC DS-CDMA fails to achieve any frequency diversity gain. In this paper, we propose a new orthogonal MC DS-CDMA that can obtain the frequency diversity gain by applying FDE. The conditional BER analysis is presented. The theoretical average BER performance in a frequency-selective Rayleigh fading channel is evaluated by the Monte-Carlo numerical computation method using the derived conditional BER and is confirmed by computer simulation of the orthogonal MC DS-CDMA signal transmission.
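    The one-tap frequency-domain equalization (FDE) underlying this record can be illustrated with a minimal sketch. The block length, channel taps and SNR below are invented for illustration, and a naive DFT stands in for the FFT a real receiver would use:

```python
import cmath

def dft(x):
    """Naive DFT (adequate for a short illustrative block)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def mmse_fde(received, channel_freq, snr):
    """One-tap MMSE frequency-domain equalization of one received block."""
    R = dft(received)
    W = [H.conjugate() / (abs(H) ** 2 + 1.0 / snr) for H in channel_freq]
    return idft([w * r for w, r in zip(W, R)])

# Toy example: a BPSK block through a 2-tap channel, noiseless, high assumed SNR.
block = [1, -1, 1, 1, -1, 1, -1, -1]
h = [0.9, 0.4]  # assumed channel impulse response
n = len(block)
# Circular convolution models transmission with a cyclic prefix.
rx = [sum(h[m] * block[(t - m) % n] for m in range(len(h))) for t in range(n)]
H = dft(h + [0.0] * (n - len(h)))
eq = mmse_fde(rx, H, snr=1e6)
print([round(s.real) for s in eq])  # recovers the transmitted block
```

    The MMSE weight reduces to 1/H at high SNR, restoring each subcarrier; it is this per-subcarrier restoration that captures the frequency diversity gain discussed in the record.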

  15. arXiv AlterBBN v2: A public code for calculating Big-Bang nucleosynthesis constraints in alternative cosmologies

    CERN Document Server

    Arbey, A.; Hickerson, K.P.; Jenssen, E.S.

    We present the version 2 of AlterBBN, an open public code for the calculation of the abundance of the elements from Big-Bang nucleosynthesis. It does not rely on any closed external library or program, aims at being user-friendly and allowing easy modifications, and provides a fast and reliable calculation of the Big-Bang nucleosynthesis constraints in the standard and alternative cosmologies.

  16. New Inversion and Interpretation of Public-Domain Electromagnetic Survey Data from Selected Areas in Alaska

    Science.gov (United States)

    Smith, B. D.; Kass, A.; Saltus, R. W.; Minsley, B. J.; Deszcz-Pan, M.; Bloss, B. R.; Burns, L. E.

    2013-12-01

    Public-domain airborne geophysical surveys (combined electromagnetics and magnetics), mostly collected for and released by the State of Alaska, Division of Geological and Geophysical Surveys (DGGS), are a unique and valuable resource for both geologic interpretation and geophysical methods development. A new joint effort by the US Geological Survey (USGS) and the DGGS aims to add value to these data through the application of novel advanced inversion methods and through innovative and intuitive display of data: maps, profiles, voxel-based models, and displays of estimated inversion quality and confidence. Our goal is to make these data even more valuable for interpretation of geologic frameworks, geotechnical studies, and cryosphere studies, by producing robust estimates of subsurface resistivity that can be used by non-geophysicists. The available datasets, which are available in the public domain, include 39 frequency-domain electromagnetic datasets collected since 1993, and continue to grow with 5 more data releases pending in 2013. The majority of these datasets were flown for mineral resource purposes, with one survey designed for infrastructure analysis. In addition, several USGS datasets are included in this study. The USGS has recently developed new inversion methodologies for airborne EM data and have begun to apply these and other new techniques to the available datasets. These include a trans-dimensional Markov Chain Monte Carlo technique, laterally-constrained regularized inversions, and deterministic inversions which include calibration factors as a free parameter. Incorporation of the magnetic data as an additional constraining dataset has also improved the inversion results. Processing has been completed in several areas, including Fortymile and the Alaska Highway surveys, and continues in others such as the Styx River and Nome surveys. Utilizing these new techniques, we provide models beyond the apparent resistivity maps supplied by the original

  17. Collection of regulatory texts relative to radiation protection. Part 1: laws and decrees (Extracts of the Public Health Code and of the Labour Code dealing with the protection of population, patients and workers against the hazards of ionizing radiations

    International Nuclear Information System (INIS)

    Rivas, Robert; Feries, Jean; Marzorati, Frank; Chevalier, Celine; Lachaume, Jean-Luc

    2013-01-01

    This first part contains legal and regulatory texts extracted from the Public Health Code and related to health general protection and to health products (medical devices), from the Social Security Code, and from the Labour Code related to individual work relationships, to health and safety at work, to work places, to work equipment and means of protection, to the prevention of some exposure risks and of risks related to some activities. This document is an update of the previous version from January 25, 2011

  18. Sources and Resources Into the Dark Domain: The UK Web Archive as a Source for the Contemporary History of Public Health.

    Science.gov (United States)

    Gorsky, Martin

    2015-08-01

    With the migration of the written record from paper to digital format, archivists and historians must urgently consider how web content should be conserved, retrieved and analysed. The British Library has recently acquired a large number of UK domain websites, captured 1996-2010, which is colloquially termed the Dark Domain Archive while technical issues surrounding user access are resolved. This article reports the results of an invited pilot project that explores methodological issues surrounding use of this archive. It asks how the relationship between UK public health and local government was represented on the web, drawing on the 'declinist' historiography to frame its questions. It points up some difficulties in developing an aggregate picture of web content due to duplication of sites. It also highlights their potential for thematic and discourse analysis, using both text and image, illustrated through an argument about the contradictory rationale for public health policy under New Labour.

  19. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  20. Massively parallel Fokker-Planck code ALLAp

    International Nuclear Information System (INIS)

    Batishcheva, A.A.; Krasheninnikov, S.I.; Craddock, G.G.; Djordjevic, V.

    1996-01-01

    The Fokker-Planck code ALLA, recently developed for workstations, simulates the temporal evolution of 1V, 2V and 1D2V collisional edge plasmas. In this work we present the results of parallelizing the code on the CRI T3D massively parallel platform (the ALLAp version). We also benchmark the 1D2V parallel version against an analytic self-similar solution of the collisional kinetic equation. This test is not trivial, as it demands a very strong spatial temperature and density variation within the simulation domain. (orig.)

  1. OSSMETER D3.2 – Report on Source Code Activity Metrics

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and initial prototypes of the tools that are needed for source code activity analysis. It builds upon Deliverable 3.1, where infrastructure and a domain analysis have been

  2. Spectral phase encoding of ultra-short optical pulse in time domain for OCDMA application.

    Science.gov (United States)

    Wang, Xu; Wada, Naoya

    2007-06-11

    We propose a novel reconfigurable time domain spectral phase encoding (SPE) scheme for coherent optical code-division-multiple-access application. In the proposed scheme, the ultra-short optical pulse is stretched by dispersive device and the SPE is done in time domain using high speed phase modulator. The time domain SPE scheme is robust to wavelength drift of the light source and is very flexible and compatible with the fiber optical system. Proof-of-principle experiments of encoding with 16-chip, 20 GHz/chip binary-phase-shift-keying codes and 1.25 Gbps data transmission have been successfully demonstrated together with an arrayed-wave-guide decoder.

  3. A domain specific language for performance portable molecular dynamics algorithms

    Science.gov (United States)

    Saunders, William Robert; Grant, James; Müller, Eike Hermann

    2018-03-01

    Developers of Molecular Dynamics (MD) codes face significant challenges when adapting existing simulation packages to new hardware. In a continuously diversifying hardware landscape it becomes increasingly difficult for scientists to be experts both in their own domain (physics/chemistry/biology) and specialists in the low level parallelisation and optimisation of their codes. To address this challenge, we describe a "Separation of Concerns" approach for the development of parallel and optimised MD codes: the science specialist writes code at a high abstraction level in a domain specific language (DSL), which is then translated into efficient computer code by a scientific programmer. In a related context, an abstraction for the solution of partial differential equations with grid based methods has recently been implemented in the (Py)OP2 library. Inspired by this approach, we develop a Python code generation system for molecular dynamics simulations on different parallel architectures, including massively parallel distributed memory systems and GPUs. We demonstrate the efficiency of the auto-generated code by studying its performance and scalability on different hardware and compare it to other state-of-the-art simulation packages. With growing data volumes the extraction of physically meaningful information from the simulation becomes increasingly challenging and requires equally efficient implementations. A particular advantage of our approach is the easy expression of such analysis algorithms. We consider two popular methods for deducing the crystalline structure of a material from the local environment of each atom, show how they can be expressed in our abstraction and implement them in the code generation framework.

  4. Block-based wavelet transform coding of mammograms with region-adaptive quantization

    Science.gov (United States)

    Moon, Nam Su; Song, Jun S.; Kwon, Musik; Kim, JongHyo; Lee, ChoongWoong

    1998-06-01

    To achieve both a high compression ratio and information preservation, an efficient approach is to combine segmentation with a lossy compression scheme. Microcalcification in mammograms is one of the most significant signs of early-stage breast cancer. Therefore, in coding, detection and segmentation of microcalcifications enable us to preserve them well by allocating more bits to them than to other regions. Segmentation of microcalcifications is performed both in the spatial domain and in the wavelet transform domain. A peak-error-controllable quantization step, designed off-line, is suitable for medical image compression. For region-adaptive quantization, block-based wavelet transform coding is adopted and different peak-error-constrained quantizers are applied to blocks according to the segmentation result. In terms of preserving microcalcifications, the proposed coding scheme shows better performance than JPEG.
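    The idea of region-adaptive quantization can be sketched as follows: blocks flagged by segmentation get a finer quantization step, which bounds their peak error more tightly. The 2x2 Haar transform, mask and step sizes below are illustrative assumptions, not the paper's actual parameters:

```python
def haar2x2(block):
    """One-level 2x2 Haar transform of a flat [a, b, c, d] block."""
    a, b, c, d = block
    return [(a + b + c + d) / 2, (a - b + c - d) / 2,
            (a + b - c - d) / 2, (a - b - c + d) / 2]

def quantize(coeffs, step):
    """Uniform quantization; peak error per coefficient is bounded by step / 2."""
    return [round(x / step) * step for x in coeffs]

# Hypothetical 2x2 pixel blocks with a segmentation mask: True marks a
# region of interest (e.g. a detected microcalcification).
blocks = [[100, 102, 98, 101], [10, 240, 12, 235]]
roi = [False, True]
fine, coarse = 2, 16  # illustrative quantization steps

coded = [quantize(haar2x2(b), fine if r else coarse)
         for b, r in zip(blocks, roi)]
print(coded)
```

    Because the step size directly bounds the reconstruction error, assigning the fine step to segmented blocks is what "allocating more bits" to microcalcifications amounts to at the quantizer level.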

  5. An Infrastructure for UML-Based Code Generation Tools

    Science.gov (United States)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance as a way to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach and uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects, which have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  6. Code query by example

    Science.gov (United States)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  7. [Harassment in the public sector].

    Science.gov (United States)

    Puech, Paloma; Pitcho, Benjamin

    2013-01-01

    The French Labour Code, which provides full protection against moral and sexual harassment, is not applicable to public sector workers. The public hospital is however not exempt from such behaviour, which could go unpunished. Public sector workers are therefore protected by the French General Civil Service Regulations and the penal code.

  8. QR Codes in the Library: Are They Worth the Effort? Analysis of a QR Code Pilot Project

    OpenAIRE

    Wilson, Andrew M.

    2012-01-01

    The literature is filled with potential uses for Quick Response (QR) codes in the library setting, but few library QR code projects have publicized usage statistics. A pilot project carried out in the Eda Kuhn Loeb Music Library of the Harvard College Library sought to determine whether library patrons actually understand and use QR codes. Results and analysis of the pilot project are provided, attempting to answer the question of whether QR codes are worth the effort for libraries.

  9. Computer codes and methods for simulating accelerator driven systems

    International Nuclear Information System (INIS)

    Sartori, E.; Byung Chan Na

    2003-01-01

    A large set of computer codes and associated data libraries has been developed by nuclear research and industry over the past half century. A large number of them are in the public domain and can be obtained under agreed conditions from different information centres. The areas covered comprise: basic nuclear data and models, reactor spectra and cell calculations, static and dynamic reactor analysis, criticality, radiation shielding, dosimetry and material damage, fuel behaviour, safety and hazard analysis, heat conduction and fluid flow in reactor systems, spent fuel and waste management (handling, transportation, and storage), economics of fuel cycles, impact of nuclear activities on the environment, etc. These codes and models have been developed mostly for critical systems used for research or power generation and other technological applications. Many of them have not been designed for accelerator driven systems (ADS), but with competent use they can be used for studying such systems or can form the basis for adapting existing methods to the specific needs of ADS. The present paper describes the types of methods, codes and associated data available and their role in the applications. It provides Web addresses to facilitate searches for such tools. Some indications are given on the effects of inappropriate or 'blind' use of existing tools on ADS. Reference is made to available experimental data that can be used for validating the methods. Finally, some international activities linked to the different computational aspects are described briefly. (author)

  10. Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes

    Science.gov (United States)

    Harrington, James William

    Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g., factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. I also present

  11. 41 CFR 101-27.205 - Shelf-life codes.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Shelf-life codes. 101-27...-Management of Shelf-Life Materials § 101-27.205 Shelf-life codes. Shelf-life items shall be identified by use of a one-digit code to provide for uniform coding of shelf-life materials by all agencies. (a) The...

  12. [Promoting the code of ethics for nurses].

    Science.gov (United States)

    Chamboredon, Patrick; Lecointre, Brigitte

    2017-09-01

    The publication of the code of ethics for nurses requires the French National Order of Nurses' structures to undertake initiatives with the aim of promoting it as well as implementing the public service missions which have now been attributed to the Order. Each regional and departmental body has its role to play in raising awareness of this code and its application in the field. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  13. Codes of Good Governance

    DEFF Research Database (Denmark)

    Beck Jørgensen, Torben; Sørensen, Ditte-Lene

    2013-01-01

    Good governance is a broad concept used by many international organizations to spell out how states or countries should be governed. Definitions vary, but there is a clear core of common public values, such as transparency, accountability, effectiveness, and the rule of law. It is quite likely......, transparency, neutrality, impartiality, effectiveness, accountability, and legality. The normative context of public administration, as expressed in codes, seems to ignore the New Public Management and Reinventing Government reform movements....

  14. Acoustic, finite-difference, time-domain technique development

    International Nuclear Information System (INIS)

    Kunz, K.

    1994-01-01

    A close analog exists between the behavior of sound waves in an ideal gas and the radiated waves of electromagnetics. This analog has been exploited to obtain an acoustic, finite-difference, time-domain (AFDTD) technique capable of treating small signal vibrations in elastic media, such as air, water, and metal, with the important feature of bending motion included in the behavior of the metal. This bending motion is particularly important when the metal is formed into sheets or plates. Bending motion does not have an analog in electromagnetics, but can be readily appended to the acoustic treatment since it appears as a single additional term in the force equation for plate motion, which is otherwise analogous to the electromagnetic wave equation. The AFDTD technique has been implemented in a code architecture that duplicates the electromagnetic, finite-difference, time-domain technique code. The main difference in the implementation is the form of the first-order coupled differential equations obtained from the wave equation. The gradient of pressure and divergence of velocity appear in these equations in the place of curls of the electric and magnetic fields. Other small changes exist as well, but the codes are essentially interchangeable. The pre- and post-processing for model construction and response-data evaluation of the electromagnetic code, in the form of the TSAR code at Lawrence Livermore National Laboratory, can be used for the acoustic version. A variety of applications is possible, pending validation of the bending phenomenon. The applications include acoustic-radiation-pattern predictions for a submerged object; mine detection analysis; structural noise analysis for cars; acoustic barrier analysis; and symphonic hall/auditorium predictions and speaker enclosure modeling

  15. On fuzzy semantic similarity measure for DNA coding.

    Science.gov (United States)

    Ahmad, Muneer; Jung, Low Tang; Bhuiyan, Md Al-Amin

    2016-02-01

    A coding measure scheme numerically translates the DNA sequence to a time-domain signal for protein coding regions identification. A number of coding measure schemes based on numerology, geometry, fixed mapping, statistical characteristics and chemical attributes of nucleotides have been proposed in recent decades. Such coding measure schemes lack the biologically meaningful aspects of nucleotide data and hence do not significantly discriminate coding regions from non-coding regions. This paper presents a novel fuzzy semantic similarity measure (FSSM) coding scheme centering on FSSM codons' clustering and the genetic code context of nucleotides. Certain natural characteristics of nucleotides, i.e. appearance as a unique combination of triplets, preserving special structure and occurrence, and ability to own and share density distributions in codons, have been exploited in FSSM. The nucleotides' fuzzy behaviors, semantic similarities and defuzzification based on the center of gravity of nucleotides revealed a strong correlation between nucleotides in codons. The proposed FSSM coding scheme attains a significant enhancement in coding regions identification, i.e. 36-133% as compared to other existing coding measure schemes tested over more than 250 benchmarked and randomly taken DNA datasets of different organisms. Copyright © 2015 Elsevier Ltd. All rights reserved.
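
    For context, a classical fixed-mapping coding measure of the kind the FSSM scheme is compared against can be sketched as follows; the EIIP mapping and the period-3 DFT measure are standard in this literature, but this is an illustration, not the FSSM algorithm itself:

```python
import cmath

# EIIP (electron-ion interaction potential) fixed mapping -- one of the
# classical fixed-mapping coding measure schemes referred to above.
EIIP = {'A': 0.1260, 'C': 0.1340, 'G': 0.0806, 'T': 0.1335}

def dna_to_signal(seq):
    """Numerically translate a DNA sequence into a time-domain signal."""
    return [EIIP[base] for base in seq.upper()]

def period3_power(signal):
    """Normalized DFT power at frequency 1/3 cycles per base: protein
    coding regions show a characteristic period-3 peak at this frequency."""
    x = sum(s * cmath.exp(-2j * cmath.pi * k / 3)
            for k, s in enumerate(signal))
    return abs(x) ** 2 / len(signal)
```

    A strongly 3-periodic sequence scores far higher than a homopolymer; the discrimination of coding from non-coding regions rests on this contrast.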

  16. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue aimed at assessing that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, aimed at assessing that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
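
    A minimal illustration of code verification by manufactured solutions, applied to a toy 1D Poisson solver rather than GBS: pick an analytic solution, derive the matching source term, and check that the observed convergence order matches the scheme's formal order:

```python
import math

def solve_poisson(f, a, b, ua, ub, n):
    """Second-order finite-difference solution of u'' = f(x) on [a, b]
    with u(a) = ua, u(b) = ub, on n interior points (Thomas algorithm)."""
    h = (b - a) / (n + 1)
    xs = [a + (i + 1) * h for i in range(n)]
    rhs = [f(x) * h * h for x in xs]
    rhs[0] -= ua
    rhs[-1] -= ub
    c, d = [0.0] * n, [0.0] * n
    c[0], d[0] = -0.5, rhs[0] / -2.0
    for i in range(1, n):
        m = -2.0 - c[i - 1]
        c[i] = 1.0 / m
        d[i] = (rhs[i] - d[i - 1]) / m
    u = [0.0] * n
    u[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        u[i] = d[i] - c[i] * u[i + 1]
    return xs, u

def observed_order(n):
    """Code verification with the manufactured solution u(x) = sin(x):
    substituting u into the PDE gives f(x) = -sin(x); halving the grid
    spacing should divide the error by ~4 for a second-order scheme."""
    def err(m):
        xs, u = solve_poisson(lambda x: -math.sin(x),
                              0.0, math.pi, 0.0, 0.0, m)
        return max(abs(ui - math.sin(x)) for x, ui in zip(xs, u))
    return math.log2(err(n) / err(2 * n + 1))
```

    Solution verification works the same way in reverse: once the order is confirmed, the error ratio between two resolutions gives a Richardson estimate of the numerical error on the finer grid.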

  17. Collection of regulatory texts relative to radiation protection. Part 2: orders and decisions taken in application of the Public Health Code and Labour Code concerning the protection of populations, patients and workers against the risks of ionizing radiations

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-05-15

    This collection of texts includes the general measures of population protection, exposure to natural radiations, the general system of authorizations and declarations, the protection of persons exposed to ionizing radiations for medical purposes, situations of radiological emergency and of long exposure to ionizing radiations, penal dispositions, and the application of the Public Health Code and of the Labour Code. A chronological table of contents by date of publication is given. (N.C.)

  18. Spatial representations are specific to different domains of knowledge.

    Directory of Open Access Journals (Sweden)

    Rowena Beecham

    There is evidence that many abstract concepts are represented cognitively in a spatial format. However, it is unknown whether similar spatial processes are employed in different knowledge domains, or whether individuals exhibit similar spatial profiles within and across domains. This research investigated similarities in spatial representation in two knowledge domains--mathematics and music. Sixty-one adults completed analogous number magnitude and pitch discrimination tasks: the Spatial-Numerical Association of Response Codes and Spatial-Musical Association of Response Codes tasks. Subgroups of individuals with different response patterns were identified through cluster analyses. For both the mathematical and musical tasks, approximately half of the participants showed the expected spatial judgment effect when explicitly cued to focus on the spatial properties of the stimuli. Despite this, performances on the two tasks were largely independent. Consistent with previous research, the study provides evidence for the spatial representation of number and pitch in the majority of individuals. However, there was little evidence to support the claim that the same spatial representation processes underpin mathematical and musical judgments.
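
    The spatial judgment effect in such tasks is typically quantified as the slope of the response-time difference between hands regressed on stimulus magnitude; a negative slope indicates the expected left-small/right-large association. A minimal sketch (the data layout is hypothetical, not the authors' analysis pipeline):

```python
def snarc_slope(rt_left, rt_right):
    """SNARC-style effect: regress dRT = RT(right hand) - RT(left hand)
    on stimulus magnitude. A negative slope means small magnitudes are
    answered faster with the left hand and large ones with the right.
    rt_left/rt_right map each stimulus magnitude to a mean RT (ms)."""
    digits = sorted(rt_left)
    drt = [rt_right[d] - rt_left[d] for d in digits]
    n = len(digits)
    mx = sum(digits) / n
    my = sum(drt) / n
    sxy = sum((d - mx) * (y - my) for d, y in zip(digits, drt))
    sxx = sum((d - mx) ** 2 for d in digits)
    return sxy / sxx          # ordinary least-squares slope
```

    Per-participant slopes of this kind are what a cluster analysis would then group into subgroups with and without the spatial effect.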

  19. Improvement of Electromagnetic Code for Phased Array Antenna Design

    National Research Council Canada - National Science Library

    Holter, Henrik

    2007-01-01

    ... The code, which is named PBFDTD (Periodic Boundary FDTD), now handles magnetic materials (lossy and loss-free). Frequency-domain surface currents and the electromagnetic field in the computational volume can be visualized...

  20. Computer codes used in particle accelerator design: First edition

    International Nuclear Information System (INIS)

    1987-01-01

    This paper contains a listing of more than 150 programs that have been used in the design and analysis of accelerators. Each citation gives the person to contact, the classification of the computer code, publications describing the code, the computer and language it runs on, and a short description of the code. Codes are indexed by subject, person to contact, and code acronym.

  1. Impact of e-publication changes in the International Code of Nomenclature for algae, fungi and plants (Melbourne Code, 2012) - did we need to "run for our lives"?

    Science.gov (United States)

    Nicolson, Nicky; Challis, Katherine; Tucker, Allan; Knapp, Sandra

    2017-05-25

    At the Nomenclature Section of the XVIII International Botanical Congress in Melbourne, Australia (IBC), the botanical community voted to allow electronic publication of nomenclatural acts for algae, fungi and plants, and to abolish the rule requiring Latin descriptions or diagnoses for new taxa. Since 1 January 2012, botanists have been able to publish new names in electronic journals and may use Latin or English as the language of description or diagnosis. Using data on vascular plants from the International Plant Names Index (IPNI) spanning the time period in which these changes occurred, we analysed trajectories in publication trends and assessed the impact of these new rules for descriptions of new species and nomenclatural acts. The data show that the ability to publish electronically has not "opened the floodgates" to an avalanche of sloppy nomenclature, but neither has there been a massive expansion in the number of names published, nor of new authors and titles participating in publication of botanical nomenclature. The e-publication changes introduced in the Melbourne Code have gained acceptance, and botanists are using these new techniques to describe and publish their work. They have not, however, accelerated the rate of plant species description or participation in biodiversity discovery as was hoped.

  2. ADVANCED ELECTRIC AND MAGNETIC MATERIAL MODELS FOR FDTD ELECTROMAGNETIC CODES

    Energy Technology Data Exchange (ETDEWEB)

    Poole, B R; Nelson, S D; Langdon, S

    2005-05-05

    The modeling of dielectric and magnetic materials in the time domain is required for pulse power applications, pulsed induction accelerators, and advanced transmission lines. For example, most induction accelerator modules require the use of magnetic materials to provide adequate volt-seconds during the acceleration pulse. These models require hysteresis and saturation to simulate the saturation wavefront in a multipulse environment. In high-voltage transmission line applications, such as shock or soliton lines, the dielectric operates in a highly nonlinear regime, which requires nonlinear models. Simple 1-D models are developed for fast parameterization of transmission line structures. In the case of nonlinear dielectrics, a simple analytic model describing the permittivity in terms of the electric field is used in a 3-D finite-difference time-domain (FDTD) code. In the case of magnetic materials, both rate-independent and rate-dependent Hodgdon magnetic material models have been implemented into 3-D FDTD codes and 1-D codes.

  3. ADVANCED ELECTRIC AND MAGNETIC MATERIAL MODELS FOR FDTD ELECTROMAGNETIC CODES

    International Nuclear Information System (INIS)

    Poole, B R; Nelson, S D; Langdon, S

    2005-01-01

    The modeling of dielectric and magnetic materials in the time domain is required for pulse power applications, pulsed induction accelerators, and advanced transmission lines. For example, most induction accelerator modules require the use of magnetic materials to provide adequate volt-seconds during the acceleration pulse. These models require hysteresis and saturation to simulate the saturation wavefront in a multipulse environment. In high-voltage transmission line applications, such as shock or soliton lines, the dielectric operates in a highly nonlinear regime, which requires nonlinear models. Simple 1-D models are developed for fast parameterization of transmission line structures. In the case of nonlinear dielectrics, a simple analytic model describing the permittivity in terms of the electric field is used in a 3-D finite-difference time-domain (FDTD) code. In the case of magnetic materials, both rate-independent and rate-dependent Hodgdon magnetic material models have been implemented into 3-D FDTD codes and 1-D codes.

  4. 45 CFR 162.1002 - Medical data code sets.

    Science.gov (United States)

    2010-10-01

    45 Public Welfare 1 2010-10-01 2010-10-01 false Medical data code sets. 162.1002 Section 162.1002... REQUIREMENTS ADMINISTRATIVE REQUIREMENTS Code Sets § 162.1002 Medical data code sets. The Secretary adopts the... Terminology, Fourth Edition (CPT-4), as maintained and distributed by the American Medical Association, for...

  5. Elements of algebraic coding systems

    CERN Document Server

    Cardoso da Rocha, Jr, Valdemar

    2014-01-01

    Elements of Algebraic Coding Systems is an introductory text to algebraic coding theory. In the first chapter, you'll gain inside knowledge of coding fundamentals, which is essential for a deeper understanding of state-of-the-art coding systems. This book is a quick reference for those who are unfamiliar with this topic, as well as for use with specific applications such as cryptography and communication. Linear error-correcting block codes through elementary principles span eleven chapters of the text. Cyclic codes, some finite field algebra, Goppa codes, algebraic decoding algorithms, and applications in public-key cryptography and secret-key cryptography are discussed, including problems and solutions at the end of each chapter. Three appendices cover the Gilbert bound and some related derivations, a derivation of the MacWilliams identities based on the probability of undetected error, and two important tools for algebraic decoding, namely the finite field Fourier transform and the Euclidean algorithm f...
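
    As a taste of the linear error-correcting block codes the text covers, here is the classic Hamming(7,4) single-error-correcting code in systematic form (a textbook sketch, not material from the book itself):

```python
# Hamming(7,4): the classic single-error-correcting linear block code,
# in systematic form G = [I | P], H = [P^T | I].
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def encode(msg):
    """4 message bits -> 7-bit codeword (row vector times G, mod 2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def decode(word):
    """Correct up to one bit error: the syndrome H*word^T equals the
    column of H at the error position (all columns are distinct)."""
    syndrome = [sum(h * c for h, c in zip(row, word)) % 2 for row in H]
    if any(syndrome):
        for j in range(7):
            if [H[i][j] for i in range(3)] == syndrome:
                word = word[:]
                word[j] ^= 1
                break
    return word[:4]  # systematic code: first 4 bits are the message
```

    The syndrome-lookup idea here is the degenerate case of the algebraic decoding algorithms the book develops for larger codes.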

  6. A UML profile for code generation of component based distributed systems

    International Nuclear Information System (INIS)

    Chiozzi, G.; Karban, R.; Andolfato, L.; Tejeda, A.

    2012-01-01

    A consistent and unambiguous implementation of code generation (model-to-text transformation) from UML must rely on a well-defined UML (Unified Modelling Language) profile, customizing UML for a particular application domain. Such a profile must have a solid foundation in a formally correct ontology, formalizing the concepts and their relations in the specific domain, in order to avoid a maze of wildly created stereotypes. The paper describes a generic profile for the code generation of component-based distributed systems for control applications, the process to distill the ontology and define the profile, and the strategy followed to implement the code generator. The main steps, which take place iteratively, include: defining the terms and relations with an ontology, mapping the ontology to the appropriate UML meta-classes, testing the profile by creating modelling examples, and generating the code. This has allowed us to work on the modelling of the E-ELT (European Extremely Large Telescope) control system and instrumentation without knowing which infrastructure will finally be used.

  7. Re-estimation of Motion and Reconstruction for Distributed Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Raket, Lars Lau; Forchhammer, Søren

    2014-01-01

    Transform domain Wyner-Ziv (TDWZ) video coding is an efficient approach to distributed video coding (DVC), which provides low complexity encoding by exploiting the source statistics at the decoder side. The DVC coding efficiency depends mainly on side information and noise modeling. This paper...... proposes a motion re-estimation technique based on optical flow to improve side information and noise residual frames by taking partially decoded information into account. To improve noise modeling, a noise residual motion re-estimation technique is proposed. Residual motion compensation with motion...

  8. Using theory to explore facilitators and barriers to delayed prescribing in Australia: a qualitative study using the Theoretical Domains Framework and the Behaviour Change Wheel.

    Science.gov (United States)

    Sargent, Lucy; McCullough, Amanda; Del Mar, Chris; Lowe, John

    2017-02-13

    Delayed antibiotic prescribing reduces antibiotic use for acute respiratory infections in trials in general practice, but the uptake in clinical practice is low. The aim of the study was to identify facilitators and barriers to general practitioners' (GPs') use of delayed prescribing and to gain pharmacists' and the public's views about delayed prescribing in Australia. This study used the Theoretical Domains Framework and the Behaviour Change Wheel to explore facilitators and barriers to delayed prescribing in Australia. Forty-three semi-structured, face-to-face interviews with general practitioners, pharmacists and patients were conducted. Responses were coded into domains of the Theoretical Domains Framework, and specific criteria from the Behaviour Change Wheel were used to identify which domains were relevant to increasing the use of delayed prescribing by GPs. The interviews revealed nine key domains that influence GPs' use of delayed prescribing: knowledge; cognitive and interpersonal skills; memory, attention and decision-making processes; optimism; beliefs about consequences; intentions; goals; emotion; and social influences: GPs knew about delayed prescribing; however, they did not use it consistently, preferring to bring patients back for review and only using it with patients in a highly selective way. Pharmacists would support GPs and the public in delayed prescribing but would fill the prescription if people insisted. The public said they would delay taking their antibiotics if asked by their GP and given the right information on managing symptoms and when to take antibiotics. Using a theory-driven approach, we identified nine key domains that influence GPs' willingness to provide a delayed prescription to patients with an acute respiratory infection presenting to general practice. These data can be used to develop a structured intervention to change this behaviour and thus reduce antibiotic use for acute respiratory infections in general practice.

  9. NOTICONA--a nonlinear time-domain computer code of two-phase natural circulation instability

    International Nuclear Information System (INIS)

    Su Guanghui; Guo Yujun; Zhang Jinling; Qiu Shuizheng; Jia Dounan; Yu Zhenwan

    1997-10-01

    A microcomputer code, NOTICONA, has been developed for nonlinear analysis of two-phase natural circulation systems. The mathematical model of the code includes a point-source neutron-kinetics model, a reactivity feedback model, single-phase and two-phase flow models, heat transfer models for different conditions, and associated models. NOTICONA has been compared with experiments, and its correctness and accuracy are proved. Using NOTICONA, the density wave oscillation (type I) of the 5 MW Test Heating Reactor is calculated, and the marginal stability boundary is obtained.
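
    The point-kinetics backbone of such a code can be sketched with one delayed-neutron group and a crude power feedback standing in for the thermal-hydraulic coupling; all parameter values below are illustrative, not NOTICONA's:

```python
def point_kinetics(rho0=0.001, beta=0.0065, Lambda=1e-4, lam=0.08,
                   alpha=-0.002, dt=1e-3, nt=5000):
    """One-delayed-group point kinetics with a crude power feedback,
    integrated with explicit Euler steps:
        dn/dt = (rho - beta)/Lambda * n + lam * C
        dC/dt = beta/Lambda * n - lam * C
    with rho = rho0 + alpha*(n - 1): reactivity falls as power rises.
    All parameter values here are illustrative."""
    n = 1.0                              # normalized neutron density
    C = beta / (Lambda * lam) * n        # equilibrium precursor level
    for _ in range(nt):
        rho = rho0 + alpha * (n - 1.0)   # feedback closes the loop
        dn = ((rho - beta) / Lambda * n + lam * C) * dt
        dC = (beta / Lambda * n - lam * C) * dt
        n, C = n + dn, C + dC
    return n
```

    With a small positive reactivity insertion and negative feedback, the power settles toward the level at which the net reactivity vanishes; replacing the algebraic feedback with flow and heat-transfer equations is what turns a sketch like this into a stability code.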

  10. Fractal Image Coding Based on a Fitting Surface

    Directory of Open Access Journals (Sweden)

    Sheng Bi

    2014-01-01

    A no-search fractal image coding method based on a fitting surface is proposed. In our research, an improved gray-level transform with a fitting surface is introduced. One advantage of this method is that the fitting surface is used for both the range and domain blocks and one set of parameters can be saved. Another advantage is that the fitting surface can approximate the range and domain blocks better than the previous fitting planes; this can result in smaller block matching errors and better decoded image quality. Since the no-search and quadtree techniques are adopted, smaller matching errors also imply fewer block matches, which results in a faster encoding process. Moreover, by combining all the fitting surfaces, a fitting surface image (FSI) is also proposed to speed up the fractal decoding. Experiments show that our proposed method can yield superior performance over the other three methods. Relative to the range-averaged image, FSI provides a faster fractal decoding process. Finally, by combining the proposed fractal coding method with JPEG, a hybrid coding method is designed which can provide higher PSNR than JPEG while maintaining the same Bpp.
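
    The gray-level transform idea can be illustrated with the simpler fitting plane that the proposed fitting surface improves upon: fit z = a*x + b*y + c to a block by least squares and measure the residual matching error (an illustrative sketch, not the paper's method):

```python
def fit_plane(block):
    """Least-squares plane z = a*x + b*y + c over a square block.
    With coordinates centered on the block, the normal equations
    decouple and each coefficient has a closed form."""
    B = len(block)
    off = (B - 1) / 2.0
    sxx = sum((i - off) ** 2 for i in range(B)) * B
    a = sum((i - off) * z for i, row in enumerate(block) for z in row) / sxx
    b = sum((j - off) * z for row in block for j, z in enumerate(row)) / sxx
    c = sum(sum(row) for row in block) / (B * B)
    return a, b, c

def residual(block):
    """Energy left after removing the fitted plane -- the block-matching
    error a fractal coder tries to minimize between a range block and a
    contracted domain block."""
    a, b, c = fit_plane(block)
    off = (len(block) - 1) / 2.0
    return sum((z - (a * (i - off) + b * (j - off) + c)) ** 2
               for i, row in enumerate(block) for j, z in enumerate(row))
```

    A higher-order fitting surface follows the same recipe with more basis terms; sharing one fitted surface between range and domain blocks is what lets the method save a parameter set.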

  11. Layered and Laterally Constrained 2D Inversion of Time Domain Induced Polarization Data

    DEFF Research Database (Denmark)

    Fiandaca, Gianluca; Ramm, James; Auken, Esben

    In a sedimentary environment, quasi-layered models often represent the actual geology more accurately than smooth minimum-structure models. We have developed a new layered and laterally constrained inversion algorithm for time domain induced polarization data. The algorithm is based on the time transform of a complex resistivity forward response, and the inversion extracts the spectral information of the time domain measures in terms of the Cole-Cole parameters. The developed forward code and inversion algorithm use the full time decay of the induced polarization response, together with an accurate description of the transmitter waveform and of the receiver transfer function, allowing for a quantitative interpretation of the parameters. The code has been optimized for parallel computation and the inversion time is comparable to codes inverting just for direct current resistivity. The new inversion...
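
    The Cole-Cole parameterization that the inversion extracts can be evaluated directly in the frequency domain; a sketch with illustrative parameter values:

```python
def cole_cole_resistivity(omega, rho0=100.0, m=0.1, tau=0.01, c=0.5):
    """Cole-Cole complex resistivity model used to parameterize
    spectral induced polarization:
        rho(w) = rho0 * (1 - m * (1 - 1 / (1 + (1j*w*tau)**c)))
    rho0: DC resistivity (ohm*m), m: chargeability, tau: time
    constant (s), c: frequency exponent. Values are illustrative."""
    return rho0 * (1 - m * (1 - 1 / (1 + (1j * omega * tau) ** c)))
```

    The two asymptotes are rho0 at low frequency and rho0*(1 - m) at high frequency; the time transform of this response is what gets matched against the measured induced polarization decay.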

  12. Trademarks, consumer protection and domain names on the Internet

    Directory of Open Access Journals (Sweden)

    Hana Kelblová

    2007-01-01

    The article deals with current problems of the conflict between domain names on the Internet and trade marks, in relation to consumer protection. The aim of the article is to describe ways and means of protection against the speculative registration of domain names. In the Czech legal order these means comprise the legal regulation of unfair competition in the Commercial Code and the regulation of liability for damage, together with the Trademarks Act.

  13. Min-Max decoding for non binary LDPC codes

    OpenAIRE

    Savin, Valentin

    2008-01-01

    Iterative decoding of non-binary LDPC codes is currently performed using either the Sum-Product or the Min-Sum algorithms or slightly different versions of them. In this paper, several low-complexity quasi-optimal iterative algorithms are proposed for decoding non-binary codes. The Min-Max algorithm is one of them and it has the benefit of two possible LLR domain implementations: a standard implementation, whose complexity scales as the square of the Galois field's cardinality and a reduced c...

  14. Effective enforcement of the forest practices code

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-31

    The British Columbia Forest Practices Code establishes a scheme to guide and direct forest harvesting and other forest uses in concert with other related acts. The Code is made up of the Forest Practices Code of British Columbia Act, regulations, standards, and guidebooks. This document provides information on Code enforcement. It reviews the roles of the three provincial resource ministries and the Attorney General in enforcing the code, the various activities undertaken to ensure compliance (including inspections, investigations, and responses to noncompliance), and the role of the public in helping to enforce the Code. The appendix contains a list of Ministry of Forests office locations and telephone numbers.

  15. Using context to improve protein domain identification

    Directory of Open Access Journals (Sweden)

    Llinás Manuel

    2011-03-01

    Background: Identifying domains in protein sequences is an important step in protein structural and functional annotation. Existing domain recognition methods typically evaluate each domain prediction independently of the rest. However, the majority of proteins are multidomain, and pairwise domain co-occurrences are highly specific and non-transitive. Results: Here, we demonstrate how to exploit domain co-occurrence to boost weak domain predictions that appear in previously observed combinations, while penalizing higher confidence domains if such combinations have never been observed. Our framework, Domain Prediction Using Context (dPUC), incorporates pairwise "context" scores between domains, along with traditional domain scores and thresholds, and improves domain prediction across a variety of organisms from bacteria to protozoa and metazoa. Among the genomes we tested, dPUC is most successful at improving predictions for the poorly-annotated malaria parasite Plasmodium falciparum, for which over 38% of the genome is currently unannotated. Our approach enables high-confidence annotations in this organism and the identification of orthologs to many core machinery proteins conserved in all eukaryotes, including those involved in ribosomal assembly and other RNA processing events, which surprisingly had not been previously known. Conclusions: Overall, our results demonstrate that this new context-based approach will provide significant improvements in domain and function prediction, especially for poorly understood genomes for which the need for additional annotations is greatest. Source code for the algorithm is available under a GPL open source license at http://compbio.cs.princeton.edu/dpuc/. Pre-computed results for our test organisms and a web server are also available at that location.
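
    The context idea can be caricatured in a few lines: combine each candidate's own score with pairwise co-occurrence scores before thresholding. This greedy sketch is illustrative only; dPUC itself poses the problem as a global optimization over all candidates:

```python
def rescore_with_context(candidates, context, threshold=0.0):
    """Greedy sketch of context-aware domain prediction: a candidate
    domain is kept if its own score plus pairwise context scores with
    already-accepted domains clears the threshold. context[(a, b)] is
    positive for frequently co-occurring families and negative (or
    absent) for never-observed pairs. Names below are hypothetical."""
    accepted = []
    for name, score in sorted(candidates, key=lambda c: -c[1]):
        total = score + sum(context.get((name, other), 0.0) +
                            context.get((other, name), 0.0)
                            for other in accepted)
        if total > threshold:
            accepted.append(name)
    return accepted
```

    A weak hit that co-occurs with a confident neighbor gets boosted above threshold, while an isolated hit of the same raw score is dropped; that asymmetry is the point of using context.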

  16. 47 CFR 52.19 - Area code relief.

    Science.gov (United States)

    2010-10-01

    47 Telecommunication 3 2010-10-01 2010-10-01 false Area code relief. 52.19 Section 52.19 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) NUMBERING... meetings to which the telecommunications industry and the public are invited on area code relief for a...

  17. Extension of the code COCOSYS to a dispersion code for smoke and carbon monoxide

    International Nuclear Information System (INIS)

    Sdouz, Gert; Mayrhofer, Robert

    2009-01-01

    The code COCOSYS (Containment Code SYStem) was developed by GRS in Germany to simulate processes and nuclear plant states during severe accidents in the containments of light water reactors. It contains several physical models, especially a module for aerosol behaviour. The goal of this work was to extend COCOSYS for applications in more general geometries, mainly complex public buildings. For the application in public buildings, models for air-conditioning systems and different boundary conditions according to different environments were developed. The principal application of the extended code COCOSYS is in the area of emergency situations, especially in the simulation of carbon monoxide and smoke dispersion. After developing and implementing the new models, several test calculations were performed to evaluate the functionality of the extended code. The comparison of the results with those of the original COCOSYS code showed no discrepancies. For the first realistic application, several fire emergency scenarios in the Vienna General Hospital (AKH) were selected in agreement with the fire department of the hospital. One of the scenarios addresses the danger of carbon monoxide (CO) and smoke leaking into a fire protection section through a damaged fire protection flap. As a result of the dispersion simulation, the CO-concentration in all of the rooms is obtained. Together with additional results such as deposition and smoke dispersion, the outcome of the simulation can be used for training. Among the next steps are the validation of the new models and the selection of critical scenarios. (author)

  18. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Science.gov (United States)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
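
    The embedded-DSL approach can be illustrated with a toy specification layer; the class names and emitted format below are invented for illustration and do not reproduce the actual OOMMF Python interface:

```python
# Sketch of the embedded-DSL idea: the simulation specification is
# ordinary Python objects, so it can be scripted, inspected and
# composed, and a backend can serialize it for the OOMMF executable.
class Exchange:
    def __init__(self, A):
        self.A = A          # exchange constant

class Zeeman:
    def __init__(self, H):
        self.H = H          # applied field (illustrative values)

class Simulation:
    def __init__(self, name):
        self.name = name
        self.energies = []

    def add(self, term):
        self.energies.append(term)
        return self         # allow chaining

    def to_mif(self):
        """Emit a toy OOMMF-style specification from the object tree."""
        lines = ["# MIF 2.1", f"# problem: {self.name}"]
        for term in self.energies:
            lines.append(f"Specify {type(term).__name__} {vars(term)}")
        return "\n".join(lines)

sim = Simulation("standard_problem_4")
sim.add(Exchange(A=1.3e-11)).add(Zeeman(H=(-24.6e-3, 4.3e-3, 0.0)))
spec = sim.to_mif()
```

    Compared with hand-edited configuration files, a specification built this way can be generated in loops, validated before execution, and kept alongside the analysis code, which is the reproducibility argument the paper makes.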

  19. Applications of the ARGUS code in accelerator physics

    International Nuclear Information System (INIS)

    Petillo, J.J.; Mankofsky, A.; Krueger, W.A.; Kostas, C.; Mondelli, A.A.; Drobot, A.T.

    1993-01-01

    ARGUS is a three-dimensional, electromagnetic, particle-in-cell (PIC) simulation code that is being distributed to U.S. accelerator laboratories in a collaboration between SAIC and the Los Alamos Accelerator Code Group. It uses a modular architecture that allows multiple physics modules to share common utilities for grid and structure input, memory management, disk I/O, and diagnostics. Physics modules are in place for electrostatic and electromagnetic field solutions, frequency-domain (eigenvalue) solutions, time-dependent PIC, and steady-state PIC simulations. All of the modules are implemented with a domain-decomposition architecture that allows large problems to be broken up into pieces that fit in core and that facilitates the adaptation of ARGUS for parallel processing. ARGUS operates on either Cray or workstation platforms, and a MOTIF-based user interface is available for X-windows terminals. Applications of ARGUS in accelerator physics and design are described in this paper

  20. Calculation of fluid-structure interaction for reactor safety with the Cassiopee code

    International Nuclear Information System (INIS)

    Graveleau, J.L.; Louvet, P.D.

    1979-01-01

    The Cassiopee code is an Eulerian-Lagrangian coupled code for computations where the hydrodynamics is coupled with structural domains. It is completely explicit. The fluid zones may be computed either in Lagrangian or in Eulerian coordinates; thin shells can be computed with their flexural behaviour; elastic-plastic zones must be calculated in a Lagrangian way. This code is under development in Cadarache. Its purpose is to compute the hypothetical core disruptive accident of an LMFBR when Lagrangian codes are not sufficient. This paper contains a description of the code and two examples of computations, one of which has been compared with experimental results

  1. Machine-Learning Algorithms to Code Public Health Spending Accounts.

    Science.gov (United States)

    Brady, Eoghan S; Leider, Jonathon P; Resnick, Beth A; Alfonso, Y Natalia; Bishai, David

    Government public health expenditure data sets require time- and labor-intensive manipulation to summarize results that public health policy makers can use. Our objective was to compare the performances of machine-learning algorithms with manual classification of public health expenditures to determine if machines could provide a faster, cheaper alternative to manual classification. We used machine-learning algorithms to replicate the process of manually classifying state public health expenditures, using the standardized public health spending categories from the Foundational Public Health Services model and a large data set from the US Census Bureau. We obtained a data set of 1.9 million individual expenditure items from 2000 to 2013. We collapsed these data into 147 280 summary expenditure records, and we followed a standardized method of manually classifying each expenditure record as public health, maybe public health, or not public health. We then trained 9 machine-learning algorithms to replicate the manual process. We calculated recall, precision, and coverage rates to measure the performance of individual and ensembled algorithms. Compared with manual classification, the machine-learning random forests algorithm produced 84% recall and 91% precision. With algorithm ensembling, we achieved our target criterion of 90% recall by using a consensus ensemble of ≥6 algorithms while still retaining 93% coverage, leaving only 7% of the summary expenditure records unclassified. Machine learning can be a time- and cost-saving tool for estimating public health spending in the United States. It can be used with standardized public health spending categories based on the Foundational Public Health Services model to help parse public health expenditure information from other types of health-related spending, provide data that are more comparable across public health organizations, and evaluate the impact of evidence-based public health resource allocation.
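
    The consensus-ensembling rule described (label a record only when enough algorithms agree, and track the resulting coverage) can be sketched independently of any particular learning library:

```python
from collections import Counter

def consensus_classify(predictions, min_agree=6):
    """Consensus ensembling as described above: a record receives a
    label only if at least `min_agree` of the ensemble's algorithms
    agree on it; otherwise it is left unclassified (None). Coverage
    is the fraction of records that end up labelled."""
    labels = []
    for votes in predictions:            # one list of votes per record
        label, count = Counter(votes).most_common(1)[0]
        labels.append(label if count >= min_agree else None)
    coverage = sum(l is not None for l in labels) / len(labels)
    return labels, coverage
```

    Raising `min_agree` trades coverage for precision, which is how the study tuned the ensemble to hit its 90% recall target while keeping 93% of records classified.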

  2. Time-domain Green's Function Method for three-dimensional nonlinear subsonic flows

    Science.gov (United States)

    Tseng, K.; Morino, L.

    1978-01-01

    The Green's Function Method for linearized 3D unsteady potential flow (embedded in the computer code SOUSSA P) is extended to include the time-domain analysis as well as the nonlinear term retained in the transonic small disturbance equation. The differential-delay equations in time, as obtained by applying the Green's Function Method (in a generalized sense) and the finite-element technique to the transonic equation, are solved directly in the time domain. Comparisons are made with both linearized frequency-domain calculations and existing nonlinear results.

  3. MicroRNA-128 targets myostatin at coding domain sequence to regulate myoblasts in skeletal muscle development.

    Science.gov (United States)

    Shi, Lei; Zhou, Bo; Li, Pinghua; Schinckel, Allan P; Liang, Tingting; Wang, Han; Li, Huizhi; Fu, Lingling; Chu, Qingpo; Huang, Ruihua

    2015-09-01

MicroRNAs (miRNAs or miRs) play a critical role in skeletal muscle development. In a previous study we observed that miR-128 was highly expressed in skeletal muscle. However, its function in regulating skeletal muscle development is not clear. Our hypothesis was that miR-128 is involved in the regulation of the proliferation and differentiation of skeletal myoblasts. In this study, through bioinformatics analyses, we demonstrate that miR-128 specifically targeted mRNA of myostatin (MSTN), a critical inhibitor of skeletal myogenesis, at the coding domain sequence (CDS) region, resulting in post-transcriptional down-regulation of myostatin. Overexpression of miR-128 inhibited proliferation of mouse C2C12 myoblast cells but promoted myotube formation, whereas knockdown of miR-128 had the opposite effects. In addition, ectopic miR-128 regulated the expression of myogenic factor 5 (Myf5), myogenin (MyoG), and paired box (Pax) 3 and 7. Furthermore, an inverse relationship was found between miR-128 expression and MSTN protein expression in vivo and in vitro. Taken together, these results reveal a novel pathway in skeletal muscle development in which miR-128 regulates myostatin at the CDS region to inhibit proliferation but promote differentiation of myoblast cells. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. 41 CFR 101-30.403-2 - Management codes.

    Science.gov (United States)

    2010-07-01

    ....4-Use of the Federal Catalog System § 101-30.403-2 Management codes. For internal use within an... codes shall not be affixed immediately adjacent to or as a part of the national stock number, nor shall... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Management codes. 101-30...

  5. An Overview of Public Domain Tools for Measuring the Sustainability of Environmental Remediation - 12060

    Energy Technology Data Exchange (ETDEWEB)

    Claypool, John E.; Rogers, Scott [AECOM, Denver, Colorado, 80202 (United States)

    2012-07-01

their clients. When it comes to the public domain, Federal government agencies are spearheading the development of software tools to measure and report emissions of air pollutants (e.g., carbon dioxide, other greenhouse gases, criteria air pollutants); consumption of energy, water and natural resources; accident and safety risks; project costs and other economic metrics. Most of the tools developed for the Government are available to environmental practitioners without charge, so they are growing in usage and popularity. The key features and metrics calculated by the available public-domain tools for measuring the sustainability of environmental remediation projects share some commonalities but there are differences amongst the tools. The SiteWise{sup TM} sustainability tool developed for the Navy and US Army will be compared with the Sustainable Remediation Tool (SRT{sup TM}) developed for the US Air Force (USAF). In addition, the USAF's Clean Solar and Wind Energy in Environmental Programs (CleanSWEEP), a soon-to-be-released tool for evaluating the economic feasibility of utilizing renewable energy for powering remediation systems, will be described in the paper. (authors)

  6. What to do with a Dead Research Code

    Science.gov (United States)

    Nemiroff, Robert J.

    2016-01-01

The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states to the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  7. Voluntary codes: private governance, the public interest and innovation

    National Research Council Canada - National Science Library

    Webb, Kernaghan

    2004-01-01

    This volume is a logical extension of the Office of Consumer Affairs' work in the area of voluntary codes that may assist all parties in developing a better understanding of the strengths, weaknesses...

  8. Optical code-division multiple-access networks

    Science.gov (United States)

    Andonovic, Ivan; Huang, Wei

    1999-04-01

This review details the approaches adopted to implement classical code division multiple access (CDMA) principles directly in the optical domain, resulting in all-optical derivatives of electronic systems. There are a number of ways of realizing all-optical CDMA systems, classified as incoherent and coherent based on spreading in the time and frequency dimensions. The review covers the basic principles of optical CDMA (OCDMA), the nature of the codes used in these approaches and the resultant limitations on system performance with respect to the number of stations (code cardinality) and the number of simultaneous users (correlation characteristics of the families of codes), concluding with consideration of network implementation issues. The latest developments will be presented with respect to the integration of conventional time spread codes, used in the bulk of the demonstrations of these networks to date, with wavelength division concepts, commonplace in optical networking. Similarly, implementations based on coherent correlation with the aid of a local oscillator will be detailed and comparisons between approaches will be drawn. Conclusions regarding the viability of these approaches allowing the goal of a large, asynchronous high capacity optical network to be realized will be made.

  9. Determining mode excitations of vacuum electronics devices via three-dimensional simulations using the SOS code

    Science.gov (United States)

    Warren, Gary

    1988-01-01

The SOS code is used to compute the resonance modes (frequency-domain information) of sample devices and, separately, to compute the transient behavior of the same devices. A code, DOT, is created to compute appropriate dot products of the time-domain and frequency-domain results. The transient behavior of individual modes in the device is then plotted. Modes in a coupled-cavity traveling-wave tube (CCTWT) section excited by a beam are analyzed in separate simulations. Mode energy vs. time and mode phase vs. time are computed, and it is determined whether the transient waves are forward or backward waves in each case. Finally, the hot-test mode frequencies of the CCTWT section are computed.
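The dot products a post-processor like DOT computes can be illustrated as a modal projection: each time-domain field snapshot is projected onto a frequency-domain mode shape to recover that mode's transient amplitude. The grid, mode shapes and synthetic field below are invented stand-ins, not SOS/DOT data:

```python
import math

def mode_amplitude(snapshot, mode):
    """Modal amplitude of one time-domain snapshot: the dot product
    <E(t), phi_m> normalised by <phi_m, phi_m>."""
    num = sum(e * p for e, p in zip(snapshot, mode))
    den = sum(p * p for p in mode)
    return num / den

# Two discretely orthogonal stand-in mode shapes on a 1-D grid
# (in SOS/DOT these would come from the frequency-domain run).
n = 64
mode1 = [math.sin(math.pi * (i + 0.5) / n) for i in range(n)]
mode2 = [math.sin(2.0 * math.pi * (i + 0.5) / n) for i in range(n)]

# Synthetic transient field: 2x mode 1 plus 0.5x mode 2.
snapshot = [2.0 * a + 0.5 * b for a, b in zip(mode1, mode2)]

a1 = mode_amplitude(snapshot, mode1)  # recovers the 2.0 coefficient
a2 = mode_amplitude(snapshot, mode2)  # recovers the 0.5 coefficient
print(round(a1, 6), round(a2, 6))
```

Repeating the projection for every snapshot yields mode energy and phase versus time, the quantities the record says are plotted.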

  10. Assessing water availability over peninsular Malaysia using public domain satellite data products

    International Nuclear Information System (INIS)

    Ali, M I; Hashim, M; Zin, H S M

    2014-01-01

Water availability monitoring is an essential task for water resource sustainability and security. In this paper, the assessment of a satellite remote sensing technique for determining water availability is reported. The water-balance analysis is used to compute the spatio-temporal water availability with the main inputs, the precipitation and actual evapotranspiration rate (AET), both fully derived from public-domain satellite products of the Tropical Rainfall Measurement Mission (TRMM) and MODIS, respectively. Both these satellite products were first subjected to calibration to suit corresponding selected local precipitation and AET samples. Multi-temporal data sets acquired 2000-2010 were used in this study. The results of the study indicated strong agreement of monthly water availability with the basin flow rate (r² = 0.5, p < 0.001). Similar agreement was also noted between the estimated annual average water availability and the in-situ measurements. It is therefore concluded that the method devised in this study provides a new alternative for water availability mapping over large areas, offering a timely and cost-effective method apart from providing comprehensive spatio-temporal patterns, crucial in water resource planning to ensure water security

  11. Dynamic load balancing in a concurrent plasma PIC code on the JPL/Caltech Mark III hypercube

    International Nuclear Information System (INIS)

    Liewer, P.C.; Leaver, E.W.; Decyk, V.K.; Dawson, J.M.

    1990-01-01

    Dynamic load balancing has been implemented in a concurrent one-dimensional electromagnetic plasma particle-in-cell (PIC) simulation code using a method which adds very little overhead to the parallel code. In PIC codes, the orbits of many interacting plasma electrons and ions are followed as an initial value problem as the particles move in electromagnetic fields calculated self-consistently from the particle motions. The code was implemented using the GCPIC algorithm in which the particles are divided among processors by partitioning the spatial domain of the simulation. The problem is load-balanced by partitioning the spatial domain so that each partition has approximately the same number of particles. During the simulation, the partitions are dynamically recreated as the spatial distribution of the particles changes in order to maintain processor load balance
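The repartitioning step described above (choose subdomain boundaries so every processor holds about the same number of particles) can be sketched with particle-count quantiles. This is a minimal one-dimensional illustration in the spirit of GCPIC, not the actual code:

```python
import bisect

def balanced_partitions(positions, nproc, xmin, xmax):
    """Place subdomain boundaries on [xmin, xmax] so that each of the
    `nproc` partitions holds (almost) the same number of particles,
    by cutting the sorted positions at particle-count quantiles."""
    xs = sorted(positions)
    n = len(xs)
    bounds = [xmin]
    for k in range(1, nproc):
        bounds.append(xs[k * n // nproc])
    bounds.append(xmax)
    return bounds

# 1000 particles bunched toward x = 0, as after a density build-up.
positions = [(i / 1000.0) ** 2 for i in range(1000)]
bounds = balanced_partitions(positions, 4, 0.0, 1.0)

# Count the particles each processor would own.
counts = [0] * 4
for x in positions:
    j = bisect.bisect_right(bounds, x) - 1
    counts[min(j, 3)] += 1
print(bounds)
print(counts)
```

Calling this periodically as the particle distribution drifts, and migrating particles across the moved boundaries, is the dynamic part; the sort-based quantile cut here is the illustrative piece, chosen for brevity.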

  12. Understanding effects in reviews of implementation interventions using the Theoretical Domains Framework.

    Science.gov (United States)

    Little, Elizabeth A; Presseau, Justin; Eccles, Martin P

    2015-06-17

Behavioural theory can be used to better understand the effects of behaviour change interventions targeting healthcare professional behaviour to improve quality of care. However, the explicit use of theory is rarely reported despite interventions inevitably involving at least an implicit idea of what factors to target to implement change. There is a quality of care gap in the post-fracture investigation (bone mineral density (BMD) scanning) and management (bisphosphonate prescription) of patients at risk of osteoporosis. We aimed to use the Theoretical Domains Framework (TDF) within a systematic review of interventions to improve quality of care in post-fracture investigation. Our objectives were to explore which theoretical factors the interventions in the review may have been targeting and how this might be related to the size of the effect on rates of BMD scanning and osteoporosis treatment with bisphosphonate medication. A behavioural scientist and a clinician independently coded TDF domains in intervention and control groups. Quantitative analyses explored the relationship between intervention effect size and both the total number of domains targeted and the number of different domains targeted. Nine randomised controlled trials (RCTs) (10 interventions) were analysed. The five theoretical domains most frequently coded as being targeted by the interventions in the review were "memory, attention and decision processes", "knowledge", "environmental context and resources", "social influences" and "beliefs about consequences". Each intervention targeted a combination of at least four of these five domains. Analyses identified an inverse relationship between both the number of times domains were coded and the number of different domains coded, and the effect size for BMD scanning, but not for bisphosphonate prescription, suggesting that the more domains the intervention targeted, the lower the observed effect size. When explicit use of theory to inform interventions is absent, it is possible to

  13. JPEG2000 COMPRESSION CODING USING HUMAN VISUAL SYSTEM MODEL

    Institute of Scientific and Technical Information of China (English)

    Xiao Jiang; Wu Chengke

    2005-01-01

In order to apply the Human Visual System (HVS) model to the JPEG2000 standard, several implementation alternatives are discussed and a new scheme of visual optimization is introduced that modifies the slope of the rate-distortion curve. The novelty is that visual weighting is applied not by lifting the coefficients in the wavelet domain but through code-stream organization. It retains all the features of Embedded Block Coding with Optimized Truncation (EBCOT), such as resolution-progressive coding, good robustness to error-bit spread, and compatibility with lossless compression. Performing better than other methods, it yields the shortest standard codestream and decompression time and supports VIsual Progressive (VIP) coding.

  14. RCS modeling with the TSAR FDTD code

    Energy Technology Data Exchange (ETDEWEB)

    Pennock, S.T.; Ray, S.L.

    1992-03-01

The TSAR electromagnetic modeling system consists of a family of related codes that have been designed to work together to provide users with a practical way to set up, run, and interpret the results from complex 3-D finite-difference time-domain (FDTD) electromagnetic simulations. The software has been in development at the Lawrence Livermore National Laboratory (LLNL) and at other sites since 1987. Active internal use of the codes began in 1988, with limited external distribution and use beginning in 1991. TSAR was originally developed to analyze high-power microwave and EMP coupling problems. However, the general-purpose nature of the tools has enabled us to use the codes to solve a broader class of electromagnetic applications and has motivated the addition of new features. In particular, a family of near-to-far-field transformation routines has been added to the codes, enabling TSAR to be used for radar cross-section and antenna analysis problems.
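As a minimal illustration of the leapfrog FDTD update such codes march (in one dimension, normalised units, with an invented grid and source, so a textbook sketch rather than TSAR itself):

```python
import math

# 1-D Yee/FDTD update in normalised units at the Courant limit S = 1.
# Grid size, step count and source parameters are illustrative only.
n_cells, n_steps = 200, 150
ez = [0.0] * (n_cells + 1)   # electric field at cell edges
hy = [0.0] * n_cells         # magnetic field at cell centres

for t in range(n_steps):
    for i in range(n_cells):                 # update H from the curl of E
        hy[i] += ez[i + 1] - ez[i]
    for i in range(1, n_cells):              # update E from the curl of H
        ez[i] += hy[i] - hy[i - 1]
    # soft Gaussian source injected at the grid centre
    ez[n_cells // 2] += math.exp(-((t - 30) ** 2) / 100.0)

peak = max(abs(v) for v in ez)
print("peak |Ez| after", n_steps, "steps:", round(peak, 3))
```

The endpoint fields `ez[0]` and `ez[n_cells]` are never updated, which acts as a perfectly conducting boundary; a production code like TSAR adds absorbing boundaries and the near-to-far-field transforms mentioned above.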

  15. Information report of AREVA Malvesi site - 2015 Edition. This report is written in compliance with article L. 125-15 of the French environment code

    International Nuclear Information System (INIS)

    2016-01-01

Published in compliance with the French code of the environment, this report presents, first, the AREVA Malvesi site with its facilities and activities in the domain of uranium conversion. Then, it reviews the measures implemented for risk limitation and prevention and summarizes the events declared in 2015. Next, it presents the management of the site's effluents and wastes and the environmental monitoring. Finally, the public information actions are presented. The recommendations of the Health and Safety Committee are included in an appendix

  16. Information report of AREVA Melox site - 2015 Edition. This report is written in compliance with article L. 125-15 of the French environment code

    International Nuclear Information System (INIS)

    2016-01-01

Published in compliance with the French code of the environment, this report presents, first, the AREVA Melox site with its facilities and activities in the domain of MOX fuel fabrication. Then, it reviews the measures implemented for risk limitation and prevention and summarizes the events declared in 2015. Next, it presents the management of the site's effluents and wastes and the environmental monitoring. Finally, the public information actions are presented. The recommendations of the Health and Safety Committee are included in an appendix

  17. Information report of AREVA Romans site - 2015 Edition. This report is written in compliance with article L. 125-15 of the French environment code

    International Nuclear Information System (INIS)

    2016-01-01

Published in compliance with the French code of the environment, this report presents, first, the AREVA Romans site with its facilities and activities in the domain of fuel assembly fabrication. Then, it reviews the measures implemented for risk limitation and prevention and summarizes the events declared in 2015. Next, it presents the management of the site's effluents and wastes and the environmental monitoring. Finally, the public information actions are presented. The recommendations of the Health and Safety Committee are included in an appendix

  18. Services for domain specific developments in the Cloud

    Science.gov (United States)

    Schwichtenberg, Horst; Gemuend, André

    2015-04-01

We will discuss and demonstrate the possibilities of new Cloud services in which the complete development cycle of code, from programming to testing, takes place in the Cloud. This can also be combined with dedicated, research-domain-specific services that hide the burden of accessing the available infrastructures. As an example, we will show a service that is intended to complement the services of the VERCE project's infrastructure: a service that utilizes Cloud resources to offer simplified execution of data pre- and post-processing scripts. It offers users access to the ObsPy seismological toolbox for processing data with the Python programming language, executed on virtual Cloud resources in a secured sandbox. The solution encompasses a frontend with a modern graphical user interface, a messaging infrastructure, as well as Python worker nodes for background processing. All components are deployable in the Cloud and have been tested on different environments based on OpenStack and OpenNebula. Deployments on commercial, public Clouds will be tested in the future.

  19. Efficient MPEG-2 to H.264/AVC Transcoding of Intra-Coded Video

    Directory of Open Access Journals (Sweden)

    Vetro Anthony

    2007-01-01

This paper presents an efficient transform-domain architecture and corresponding mode decision algorithms for transcoding intra-coded video from MPEG-2 to H.264/AVC. Low complexity is achieved in several ways. First, our architecture employs direct conversion of the transform coefficients, which eliminates the need for the inverse discrete cosine transform (DCT and forward H.264/AVC transform. Then, within this transform-domain architecture, we perform macroblock-based mode decisions based on H.264/AVC transform coefficients, which is possible using a novel method of calculating distortion in the transform domain. The proposed method for distortion calculation could be used to make rate-distortion optimized mode decisions with lower complexity. Compared to the pixel-domain architecture with rate-distortion optimized mode decision, simulation results show that there is a negligible loss in quality incurred by the direct conversion of transform coefficients and the proposed transform-domain mode decision algorithms, while complexity is significantly reduced. To further reduce the complexity, we also propose two fast mode decision algorithms. The first algorithm ranks modes based on a simple cost function in the transform domain, then computes the rate-distortion optimal mode from a reduced set of ranked modes. The second algorithm exploits temporal correlations in the mode decision between temporally adjacent frames. Simulation results show that these algorithms provide additional computational savings over the proposed transform-domain architecture while maintaining virtually the same coding efficiency.

  20. 1 CFR 8.6 - Forms of publication.

    Science.gov (United States)

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Forms of publication. 8.6 Section 8.6 General... FEDERAL REGULATIONS § 8.6 Forms of publication. (a) Under section 1506 of title 44, United States Code, the Administrative Committee authorizes publication of the Code of Federal Regulations in the...

  1. DOE headquarters publications

    International Nuclear Information System (INIS)

    1978-09-01

    This bibliography provides listings of (mainly policy and programmatic) publications issued from the U.S. Department of Energy, Washington, D.C. The listings are arranged by the ''report code'' assigned to each of the major organizations at DOE Headquarters, followed by the three categories of environmental reports issued from DOE Headquarters. All of the publications listed, except for those shown as still ''in preparation,'' may be seen in the Energy Library. A title index arranged by title keywords follows the listings. Certain publications are omitted. They include such items as pamphlets, ''fact sheets,'' bulletins and weekly/monthly issuances of DOE's Energy Information Administration and Economic Regulatory Administration, and employee bulletins and newsletters. Omitted from the bibliography altogether are headquarters publications assigned other types of report codes--e.g., ''HCP'' (Headquarters Contractor Publication) and ''CONF'' (conference proceedings)

  2. Bring out your codes! Bring out your codes! (Increasing Software Visibility and Re-use)

    Science.gov (United States)

    Allen, A.; Berriman, B.; Brunner, R.; Burger, D.; DuPrie, K.; Hanisch, R. J.; Mann, R.; Mink, J.; Sandin, C.; Shortridge, K.; Teuben, P.

    2013-10-01

    Progress is being made in code discoverability and preservation, but as discussed at ADASS XXI, many codes still remain hidden from public view. With the Astrophysics Source Code Library (ASCL) now indexed by the SAO/NASA Astrophysics Data System (ADS), the introduction of a new journal, Astronomy & Computing, focused on astrophysics software, and the increasing success of education efforts such as Software Carpentry and SciCoder, the community has the opportunity to set a higher standard for its science by encouraging the release of software for examination and possible reuse. We assembled representatives of the community to present issues inhibiting code release and sought suggestions for tackling these factors. The session began with brief statements by panelists; the floor was then opened for discussion and ideas. Comments covered a diverse range of related topics and points of view, with apparent support for the propositions that algorithms should be readily available, code used to produce published scientific results should be made available, and there should be discovery mechanisms to allow these to be found easily. With increased use of resources such as GitHub (for code availability), ASCL (for code discovery), and a stated strong preference from the new journal Astronomy & Computing for code release, we expect to see additional progress over the next few years.

  3. A theory manual for multi-physics code coupling in LIME.

    Energy Technology Data Exchange (ETDEWEB)

    Belcourt, Noel; Bartlett, Roscoe Ainsworth; Pawlowski, Roger Patrick; Schmidt, Rodney Cannon; Hooper, Russell Warren

    2011-03-01

    The Lightweight Integrating Multi-physics Environment (LIME) is a software package for creating multi-physics simulation codes. Its primary application space is when computer codes are currently available to solve different parts of a multi-physics problem and now need to be coupled with other such codes. In this report we define a common domain language for discussing multi-physics coupling and describe the basic theory associated with multiphysics coupling algorithms that are to be supported in LIME. We provide an assessment of coupling techniques for both steady-state and time dependent coupled systems. Example couplings are also demonstrated.
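One of the simplest coupling algorithms of the kind such a theory manual covers is fixed-point (Picard) iteration between two single-physics solvers, each called in turn with the other's latest state. The two scalar "codes" below are invented stand-ins and do not reflect LIME's actual API:

```python
import math

def solve_physics_a(y):
    """Stand-in single-physics code A: returns its state given B's state."""
    return math.cos(y)

def solve_physics_b(x):
    """Stand-in single-physics code B: returns its state given A's state."""
    return 0.5 * x

def picard_couple(x0=0.0, y0=0.0, tol=1e-10, max_iter=100):
    """Gauss-Seidel-style Picard iteration: alternate the two codes until
    the change in the coupled state drops below `tol`."""
    x, y = x0, y0
    for _ in range(max_iter):
        x_new = solve_physics_a(y)
        y_new = solve_physics_b(x_new)
        if abs(x_new - x) + abs(y_new - y) < tol:
            return x_new, y_new
        x, y = x_new, y_new
    raise RuntimeError("coupled iteration did not converge")

x, y = picard_couple()
print(x, y)
```

Convergence here relies on the composed map being a contraction; when it is not, the stronger Newton-based strategies a framework like LIME also supports become necessary.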

  4. Domain decomposition methods for fluid dynamics

    International Nuclear Information System (INIS)

    Clerc, S.

    1995-01-01

A domain decomposition method for steady-state, subsonic fluid dynamics calculations is proposed. The method is derived from the Schwarz alternating method used for elliptic problems, extended to non-linear hyperbolic problems. Particular emphasis is given to the treatment of boundary conditions. Numerical results are shown for a realistic three-dimensional two-phase flow problem with the FLICA-4 code for PWR cores. (from author). 4 figs., 8 refs
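The Schwarz alternating method this record starts from can be shown on the simplest elliptic model problem: two overlapping subdomains repeatedly exchange Dirichlet data until the pieced-together solution converges. The model problem and discretisation below are illustrative choices, not FLICA-4's:

```python
def solve_tridiag(src, ul, ur, h):
    """Solve -u'' = src on the interior points of one subdomain with
    Dirichlet boundary values ul and ur, via the Thomas algorithm for
    the tridiagonal stencil (-1, 2, -1)."""
    m = len(src)
    rhs = [s * h * h for s in src]
    rhs[0] += ul
    rhs[-1] += ur
    c = [0.0] * m            # modified super-diagonal
    d = [0.0] * m            # modified right-hand side
    c[0] = -0.5
    d[0] = rhs[0] / 2.0
    for i in range(1, m):
        beta = 2.0 + c[i - 1]
        c[i] = -1.0 / beta
        d[i] = (rhs[i] + d[i - 1]) / beta
    u = [0.0] * m
    u[-1] = d[-1]
    for i in range(m - 2, -1, -1):
        u[i] = d[i] - c[i] * u[i + 1]
    return u

# Model problem: -u'' = 2 on (0, 1), u(0) = u(1) = 0, exact u = x(1 - x).
n = 100
h = 1.0 / n
l0, r0 = 0, 60               # subdomain 1: grid points 0..60
l1, r1 = 40, n               # subdomain 2: grid points 40..100 (overlap 40..60)
u = [0.0] * (n + 1)

for _ in range(40):          # alternating Schwarz sweeps
    u[l0 + 1:r0] = solve_tridiag([2.0] * (r0 - l0 - 1), u[l0], u[r0], h)
    u[l1 + 1:r1] = solve_tridiag([2.0] * (r1 - l1 - 1), u[l1], u[r1], h)

err = max(abs(u[i] - (i * h) * (1 - i * h)) for i in range(n + 1))
print("max nodal error:", err)
```

Each subdomain reads its interface value from the other's latest iterate; the overlap (points 40..60 here) is what makes the iteration contract, and a wider overlap converges faster.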

  5. 1 CFR 5.4 - Publication not authorized.

    Science.gov (United States)

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Publication not authorized. 5.4 Section 5.4... Publication not authorized. (a) Chapter 15 of title 44, United States Code, does not apply to treaties...) Chapter 15 of title 44, United States Code, prohibits the publication in the Federal Register of comments...

  6. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the capabilities of the parallel processing system. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. Codes in the first category, such as those used for harmonic analysis and mechanistic fuel-performance codes, do not require parallelisation of individual modules. Codes in the second category, such as conventional FEM codes, require parallelisation of individual modules. In this category, parallelisation of the equation-solution module poses the major difficulty. Different solution schemes such as the domain decomposition method (DDM), the parallel active column solver and the substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to each of these categories, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  7. Reducing BER of spectral-amplitude coding optical code-division multiple-access systems by single photodiode detection technique

    Science.gov (United States)

    Al-Khafaji, H. M. R.; Aljunid, S. A.; Amphawan, A.; Fadhil, H. A.; Safar, A. M.

    2013-03-01

In this paper, we present a single-photodiode detection (SPD) technique for spectral-amplitude coding optical code-division multiple-access (SAC-OCDMA) systems. The proposed technique eliminates both phase-induced intensity noise (PIIN) and multiple-access interference (MAI) in the optical domain. Analytical results show that for 35 simultaneous users transmitting at a data rate of 622 Mbps, the bit-error rate (BER) of 1.4×10⁻²⁸ for the SPD technique is much better than 9.3×10⁻⁶ and 9.6×10⁻³ for the modified-AND and the AND detection techniques, respectively. Moreover, we verified the improved performance afforded by the proposed technique using data-transmission simulations.

  8. Radiological emergencies due to postulated events of melted radioactive material mixed in steel reaching public domain

    International Nuclear Information System (INIS)

    Meena, T.R.; Anoj Kumar; Patra, R.P.; Vikas; Patil, S.S.; Chatterjee, M.K.; Sharma, Ranjit; Murali, S.

    2014-01-01

A national-level response mechanism has been developed through the emergency response centres of DAE (DAE-ERCs) at 22 different locations spread all over the country, together with the National Disaster Response Force under the National Disaster Management Authority (NDMA). ERCs are equipped with radiation monitors, radionuclide identifiers, and Personnel Radiation Dosimeters (PRD), with monitoring capabilities of the order of tens of nGy/h (μR/hr) above the radiation background at any suspected location. Even if small amounts of radioactive material are smuggled and brought into the public domain in some other form, ERCs are capable of detecting, identifying and segregating the radioactive material from any inactive scrap. DAE-ERCs have demonstrated this capability in source search, detection, identification and recovery during the radiological emergency at Mayapuri, New Delhi

  9. Radiological emergencies due to postulated events of melted radioactive material mixed in steel reaching public domain

    Energy Technology Data Exchange (ETDEWEB)

    Meena, T. R.; Kumar, Anoj; Patra, R. P.; Vikas,; Patil, S. S.; Chatterjee, M. K.; Sharma, Ranjit; Murali, S., E-mail: tejram@barc.gov.in [Radiation Safety Systems Division, Bhabha Atomic Research Centre, Mumbai (India)

    2014-07-01

A national-level response mechanism has been developed through the emergency response centres of DAE (DAE-ERCs) at 22 different locations spread all over the country, together with the National Disaster Response Force under the National Disaster Management Authority (NDMA). ERCs are equipped with radiation monitors, radionuclide identifiers, and Personnel Radiation Dosimeters (PRD), with monitoring capabilities of the order of tens of nGy/h (μR/hr) above the radiation background at any suspected location. Even if small amounts of radioactive material are smuggled and brought into the public domain in some other form, ERCs are capable of detecting, identifying and segregating the radioactive material from any inactive scrap. DAE-ERCs have demonstrated this capability in source search, detection, identification and recovery during the radiological emergency at Mayapuri, New Delhi.

  10. Performance of synthetic antiferromagnetic racetrack memory: domain wall versus skyrmion

    KAUST Repository

    Tomasello, R; Puliafito, V; Martinez, E; Manchon, Aurelien; Ricci, M; Carpentieri, M; Finocchio, G

    2017-01-01

    A storage scheme based on racetrack memory, where the information can be coded in a domain or a skyrmion, seems to be an alternative to conventional hard disk drive for high density storage. Here, we perform a full micromagnetic study

  11. MHD code using multi graphical processing units: SMAUG+

    Science.gov (United States)

    Gyenge, N.; Griffiths, M. K.; Erdélyi, R.

    2018-01-01

    This paper introduces the Sheffield Magnetohydrodynamics Algorithm Using GPUs (SMAUG+), an advanced numerical code for solving magnetohydrodynamic (MHD) problems using multi-GPU systems. Multi-GPU systems facilitate the development of accelerated codes and enable us to investigate larger model sizes and/or more detailed computational domain resolutions. This is a significant advancement over the parent single-GPU MHD code, SMAUG (Griffiths et al., 2015). Here, we demonstrate the validity of the SMAUG+ code, describe the parallelisation techniques and investigate performance benchmarks. The initial configuration of the Orszag-Tang vortex simulations is distributed among 4, 16, 64 and 100 GPUs. Furthermore, different simulation box resolutions are applied: 1000 × 1000, 2044 × 2044, 4000 × 4000 and 8000 × 8000. We also tested the code with the Brio-Wu shock tube simulations with a model size of 800, employing up to 10 GPUs. Based on the test results, we observed speed-ups and slow-downs, depending on the granularity and the communication overhead of certain parallel tasks. The main aim of the code development is to provide a massively parallel code without the memory limitation of a single GPU. By using our code, the applied model size can be significantly increased. We demonstrate that we are able to successfully compute numerically valid and large 2D MHD problems.
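
    The multi-GPU strategy described in this record rests on standard domain decomposition with ghost-cell (halo) exchange. The sketch below is not SMAUG+'s actual implementation (which runs on GPUs with real inter-device communication); it emulates the idea serially with NumPy: the grid is split into subdomains padded with a one-cell halo, and each halo is filled from the neighbouring subdomain's edge cells before a stencil update.

```python
import numpy as np

def split_domain(field, px, py):
    """Split a 2D field into px*py subdomains, each padded with a 1-cell halo."""
    ny, nx = field.shape
    sy, sx = ny // py, nx // px
    subs = {}
    for j in range(py):
        for i in range(px):
            sub = np.zeros((sy + 2, sx + 2))
            sub[1:-1, 1:-1] = field[j*sy:(j+1)*sy, i*sx:(i+1)*sx]
            subs[(j, i)] = sub
    return subs

def exchange_halos(subs, px, py):
    """Fill each halo from the neighbour's edge cells (periodic boundaries).
    On a real multi-GPU system this step is a device-to-device communication."""
    for (j, i), sub in subs.items():
        left  = subs[(j, (i - 1) % px)]
        right = subs[(j, (i + 1) % px)]
        up    = subs[((j - 1) % py, i)]
        down  = subs[((j + 1) % py, i)]
        sub[1:-1, 0]  = left[1:-1, -2]   # left halo <- neighbour's last interior column
        sub[1:-1, -1] = right[1:-1, 1]   # right halo <- neighbour's first interior column
        sub[0, 1:-1]  = up[-2, 1:-1]     # top halo <- neighbour's last interior row
        sub[-1, 1:-1] = down[1, 1:-1]    # bottom halo <- neighbour's first interior row
```

    After the exchange, each subdomain can apply its stencil independently, which is what makes the granularity-versus-communication trade-off mentioned in the abstract appear.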

  12. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Carriers' procedural terminology and coding systems... Determining Reasonable Charges § 405.512 Carriers' procedural terminology and coding systems. (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  13. Hominoid-specific de novo protein-coding genes originating from long non-coding RNAs.

    Directory of Open Access Journals (Sweden)

    Chen Xie

    2012-09-01

    Full Text Available Tinkering with pre-existing genes has long been known as a major way to create new genes. Recently, however, motherless protein-coding genes have been found to have emerged de novo from ancestral non-coding DNAs. How these genes originated is not well addressed to date. Here we identified 24 hominoid-specific de novo protein-coding genes with precise origination timing in vertebrate phylogeny. Strand-specific RNA-Seq analyses were performed in five rhesus macaque tissues (liver, prefrontal cortex, skeletal muscle, adipose, and testis), which were then integrated with public transcriptome data from human, chimpanzee, and rhesus macaque. On the basis of comparing the RNA expression profiles in the three species, we found that most of the hominoid-specific de novo protein-coding genes encoded polyadenylated non-coding RNAs in rhesus macaque or chimpanzee with a similar transcript structure and correlated tissue expression profile. According to the rule of parsimony, the majority of these hominoid-specific de novo protein-coding genes appear to have acquired a regulated transcript structure and expression profile before acquiring coding potential. Interestingly, although the expression profile was largely correlated, the coding genes in human often showed higher transcriptional abundance than their non-coding counterparts in rhesus macaque. The major findings we report in this manuscript are robust and insensitive to the parameters used in the identification and analysis of de novo genes. Our results suggest that at least a portion of long non-coding RNAs, especially those with active and regulated transcription, may serve as a birth pool for protein-coding genes, which are then further optimized at the transcriptional level.
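
    A first-pass signal for coding potential of a transcript is the length of its longest open reading frame. The toy scan below is not the authors' pipeline (which uses cross-species RNA-Seq and phylogenetic evidence); it only illustrates the kind of ORF screen such analyses start from, on the forward strand.

```python
STOPS = {"TAA", "TAG", "TGA"}

def longest_orf(seq):
    """Return (start, end) of the longest ATG..stop ORF on the forward strand.
    Returns (0, 0) if no complete ORF is found."""
    seq = seq.upper()
    best = (0, 0)
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i                      # first start codon in this frame
            elif codon in STOPS and start is not None:
                if i + 3 - start > best[1] - best[0]:
                    best = (start, i + 3)      # keep the longest ORF seen so far
                start = None
    return best
```
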

  14. Creativity as Action: Findings from Five Creative Domains

    Directory of Open Access Journals (Sweden)

    Vlad Glaveanu

    2013-04-01

    Full Text Available The present paper outlines an action theory of creativity and substantiates this approach by investigating creative expression in five different domains. We propose an action framework for the analysis of creative acts built on the assumption that creativity is a relational, inter-subjective phenomenon. This framework, drawing extensively from the work of Dewey (1934) on art as experience, is used to derive a coding frame for the analysis of interview material. The article reports findings from the analysis of 60 interviews with recognised French creators in five creative domains: art, design, science, scriptwriting, and music. Results point to complex models of action and inter-action specific to each domain and also to interesting patterns of similarity and difference between domains. These findings highlight the fact that creative action takes place not 'inside' individual creators but 'in between' actors and their environment. Implications for the field of educational psychology are discussed.

  15. OpenQ∗D simulation code for QCD+QED

    DEFF Research Database (Denmark)

    Campos, Isabel; Fritzsch, Patrick; Hansen, Martin

    2018-01-01

    The openQ∗D code for the simulation of QCD+QED with C∗ boundary conditions is presented. This code is based on openQCD-1.6, from which it inherits the core features that ensure its efficiency: the locally-deflated SAP-preconditioned GCR solver, the twisted-mass frequency splitting of the fermion... An alpha version of this code is publicly available and can be downloaded from http://rcstar.web.cern.ch/...

  16. Research on the coding and decoding technology of the OCDMA system

    Science.gov (United States)

    Li, Ping; Wang, Yuru; Lan, Zhenping; Wang, Jinpeng; Zou, Nianyu

    2015-12-01

    Optical Code Division Multiple Access (OCDMA) is a new technology that combines wireless CDMA technology with optical fiber communication technology. The address coding technology in the OCDMA system has been researched. In addition, the principle of the codec based on optical fiber delay lines and non-coherent spectral-domain encoding and decoding has been introduced and analysed, and the results were verified by experiment.
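
    The address-coding principle can be illustrated numerically. The sketch below is an illustration, not the paper's experimental codec: each user gets a sparse unipolar address code (optical intensity cannot be negative), the channel superimposes both users' encoded streams, and each receiver correlates the received chips with its own code and thresholds at the code weight. The two codes here are chosen disjoint for clarity; practical optical orthogonal codes allow a cross-correlation of at most 1.

```python
import numpy as np

# Two sparse unipolar address codes of weight 4 (disjoint chip positions).
c1 = np.array([1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0])
c2 = np.array([0, 1, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1])

def encode(bits, code):
    # Each data bit gates one full code word onto the fibre.
    return np.concatenate([b * code for b in bits])

def decode(signal, code):
    w = int(code.sum())                      # code weight = autocorrelation peak
    chips = signal.reshape(-1, code.size)
    corr = chips @ code                      # chip-synchronous correlation
    return (corr >= w).astype(int)

bits1, bits2 = [1, 0, 1, 1], [0, 1, 1, 0]
rx = encode(bits1, c1) + encode(bits2, c2)   # both users share the medium
```

    With low cross-correlation, the other user's light contributes little to the correlator output, so thresholding at the code weight recovers each user's bits.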

  17. Code conversion for system design and safety analysis of NSSS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hae Cho; Kim, Young Tae; Choi, Young Gil; Kim, Hee Kyung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-01-01

    This report describes the overall project work related to the conversion, installation and validation of computer codes which are used in NSSS design and safety analysis of nuclear power plants. Domain/OS computer codes for system safety analysis are installed and validated on the Apollo DN10000, and the Apollo versions are then converted and installed on the HP9000/700 series, again with appropriate validation. Also, COOLII and COAST, which are CYBER-version computer codes, are converted into versions for the Apollo DN10000 and HP9000/700 and installed with validation. This report details the whole process of computer code conversion and installation, as well as the software verification and validation results, which are attached to this report. 12 refs., 8 figs. (author)

  18. The New Civil Process Code and the Mediation Act: The Incentive to Extrajudicial and Consensual Conflicts Resolution in Public Administration

    Directory of Open Access Journals (Sweden)

    Aline Sueli de Salles Santos

    2016-10-01

    Full Text Available The purpose of this paper is to discuss the contextual aspects of the norm, inserted in article 174 of the New Civil Process Code and in the Mediation Act, which determines the creation of chambers of mediation and conciliation aimed at consensual, extrajudicial conflict resolution in public administration. In addition, it also focuses on the perspectives of this legislative innovation, which tends to produce socially relevant results.

  19. Finite difference time domain modelling of particle accelerators

    International Nuclear Information System (INIS)

    Jurgens, T.G.; Harfoush, F.A.

    1989-03-01

    Finite Difference Time Domain (FDTD) modelling has been successfully applied to a wide variety of electromagnetic scattering and interaction problems for many years. Here the method is extended to incorporate the modelling of wake fields in particle accelerators. Algorithmic comparisons are made to existing wake field codes, such as MAFIA T3. 9 refs., 7 figs
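
    The FDTD method referred to above can be shown in its simplest 1D form. The sketch below implements the standard Yee leapfrog update in normalised units (c·Δt/Δx = 1) with a soft Gaussian source; it is a generic textbook scheme, not the accelerator wake-field code, but wake-field modelling builds moving-charge sources on top of exactly this kind of update loop.

```python
import numpy as np

def fdtd_1d(nx=200, nt=80, src=100):
    """Bare-bones 1D FDTD (Yee leapfrog) in normalised units, PEC ends."""
    ez = np.zeros(nx)       # electric field on integer grid points
    hy = np.zeros(nx - 1)   # magnetic field, staggered half a cell
    for n in range(nt):
        hy += np.diff(ez)                           # update H from curl of E
        ez[1:-1] += np.diff(hy)                     # update E from curl of H
        ez[src] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source
    return ez
```

    At this "magic" time step the injected pulse splits into two counter-propagating, dispersion-free pulses, which makes the scheme easy to validate.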

  20. SERPENT Monte Carlo reactor physics code

    International Nuclear Information System (INIS)

    Leppaenen, J.

    2010-01-01

    SERPENT is a three-dimensional continuous-energy Monte Carlo reactor physics burnup calculation code, developed at VTT Technical Research Centre of Finland since 2004. The code is specialized in lattice physics applications, but the universe-based geometry description allows transport simulation to be carried out in complicated three-dimensional geometries as well. The suggested applications of SERPENT include generation of homogenized multi-group constants for deterministic reactor simulator calculations, fuel cycle studies involving detailed assembly-level burnup calculations, validation of deterministic lattice transport codes, research reactor applications, educational purposes and demonstration of reactor physics phenomena. The Serpent code has been publicly distributed by the OECD/NEA Data Bank since May 2009 and by RSICC in the U.S. since March 2010. The code is being used in some 35 organizations in 20 countries around the world. This paper presents an overview of the methods and capabilities of the Serpent code, with examples in the modelling of WWER-440 reactor physics. (Author)

  1. Web Syndication Approaches for Sharing Primary Data in "Small Science" Domains

    Directory of Open Access Journals (Sweden)

    Eric C Kansa

    2010-06-01

    Full Text Available In some areas of science, sophisticated web services and semantics underlie "cyberinfrastructure". However, in "small science" domains, especially in field sciences such as archaeology, conservation, and public health, datasets often resist standardization. Publishing data in the small sciences should embrace this diversity rather than attempt to corral research into "universal" (domain) standards. A growing ecosystem of increasingly powerful Web-syndication-based approaches for sharing data on the public Web can offer a viable approach. Atom-feed-based services can be used with scientific collections to identify and create linkages across different datasets, even across disciplinary boundaries, without shared domain standards.
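
    A minimal Atom feed for dataset records can be produced with nothing more than the Python standard library. The record fields below (`title`, `id`, `updated`, `href`) are the mandatory Atom elements; the concrete values and the helper name are hypothetical, for illustration only.

```python
from xml.etree import ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"

def dataset_feed(title, feed_id, records):
    """Build a minimal Atom feed describing dataset records."""
    ET.register_namespace("", ATOM)  # serialize with Atom as default namespace
    feed = ET.Element("{%s}feed" % ATOM)
    ET.SubElement(feed, "{%s}title" % ATOM).text = title
    ET.SubElement(feed, "{%s}id" % ATOM).text = feed_id
    ET.SubElement(feed, "{%s}updated" % ATOM).text = "2010-06-01T00:00:00Z"
    for rec in records:
        entry = ET.SubElement(feed, "{%s}entry" % ATOM)
        ET.SubElement(entry, "{%s}title" % ATOM).text = rec["title"]
        ET.SubElement(entry, "{%s}id" % ATOM).text = rec["id"]
        ET.SubElement(entry, "{%s}updated" % ATOM).text = rec["updated"]
        ET.SubElement(entry, "{%s}link" % ATOM).set("href", rec["href"])
    return ET.tostring(feed, encoding="unicode")
```

    Because the feed is plain XML over HTTP, any aggregator can follow such links across collections without agreeing on a domain schema first, which is the point the abstract makes.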

  2. Lower bounds for the minimum distance of algebraic geometry codes

    DEFF Research Database (Denmark)

    Beelen, Peter

    ... description of these codes in terms of order domains has been found. In my talk I will indicate how one can use the ideas behind the order bound to obtain a lower bound for the minimum distance of any AG-code. After this I will compare this generalized order bound with other known lower bounds, such as the Goppa bound, the Feng-Rao bound and the Kirfel-Pellikaan bound. I will finish my talk by giving several examples. Especially for two-point codes, the generalized order bound is fairly easy to compute. As an illustration, I will indicate how a lower bound can be obtained for the minimum distance of some...

  3. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Directory of Open Access Journals (Sweden)

    Marijan Beg

    2017-05-01

    Full Text Available Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
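
    The embedded-DSL approach, option (iv), can be sketched in a few lines. The following is a hypothetical mini-DSL, not the actual interface described in the paper: energy terms become Python objects, and a simulation definition reads as declarative code that a real backend would translate into an OOMMF (MIF) script.

```python
# Hypothetical mini-DSL sketch; class and parameter names are illustrative.
class Term:
    def __init__(self, **params):
        self.params = params
    def __repr__(self):
        args = ", ".join(f"{k}={v}" for k, v in self.params.items())
        return f"{type(self).__name__}({args})"

class Exchange(Term): pass
class Demag(Term): pass
class Zeeman(Term): pass

class System:
    def __init__(self, name):
        self.name = name
        self.energy = []
    def add(self, term):
        self.energy.append(term)
        return self          # chaining keeps the definition declarative
    def script(self):
        """Emit a plain-text problem description (a real backend would emit MIF)."""
        lines = [f"system {self.name}"] + [f"  + {t!r}" for t in self.energy]
        return "\n".join(lines)

# A standard-problem-4-like definition reads almost like the physics itself:
sp4 = (System("stdprob4")
       .add(Exchange(A=1.3e-11))
       .add(Demag())
       .add(Zeeman(H=(-24.6e-3, 4.3e-3, 0))))
```

    The gain over configuration files is that the full host language (loops, functions, libraries) is available for parameter sweeps and post-processing around the same definition.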

  4. Shared acoustic codes underlie emotional communication in music and speech-Evidence from deep transfer learning.

    Science.gov (United States)

    Coutinho, Eduardo; Schuller, Björn

    2017-01-01

    Music and speech exhibit striking similarities in the communication of emotions in the acoustic domain, in such a way that the communication of specific emotions is achieved, at least to a certain extent, by means of shared acoustic patterns. From an Affective Sciences point of view, determining the degree of overlap between the two domains is fundamental to understanding the shared mechanisms underlying this phenomenon. From a Machine Learning perspective, the overlap between acoustic codes for emotional expression in music and speech opens new possibilities to enlarge the amount of data available to develop music and speech emotion recognition systems. In this article, we investigate time-continuous predictions of emotion (Arousal and Valence) in music and speech, and Transfer Learning between these domains. We establish a comparative framework including intra-domain (i.e., models trained and tested on the same modality, either music or speech) and cross-domain experiments (i.e., models trained in one modality and tested on the other). In the cross-domain context, we evaluated two strategies: the direct transfer between domains, and the contribution of Transfer Learning techniques (feature-representation transfer based on Denoising Auto-Encoders) for reducing the gap in the feature-space distributions. Our results demonstrate an excellent cross-domain generalisation performance with and without feature representation transfer in both directions. In the case of music, cross-domain approaches outperformed intra-domain models for Valence estimation, whereas for speech intra-domain models achieved the best performance. This is the first demonstration of shared acoustic codes for emotional expression in music and speech in the time-continuous domain.
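
    The feature-representation-transfer strategy relies on a denoising autoencoder. As a toy illustration of the underlying mechanism only (unrelated to the authors' actual architecture, features, or data), the NumPy sketch below trains a tied-weight DAE with masking noise and plain SGD: the network is forced to reconstruct a clean input from a corrupted one, which is what lets the learned representation bridge distribution gaps between domains.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class DenoisingAutoencoder:
    """Tied-weights denoising autoencoder trained with plain SGD (toy scale)."""
    def __init__(self, n_in, n_hid, lr=0.5):
        self.W = rng.normal(0, 0.1, (n_in, n_hid))
        self.b = np.zeros(n_hid)   # hidden bias
        self.c = np.zeros(n_in)    # output bias
        self.lr = lr

    def step(self, x, noise=0.3):
        x_noisy = x * (rng.random(x.shape) > noise)   # masking noise
        h = sigmoid(x_noisy @ self.W + self.b)        # encode corrupted input
        y = sigmoid(h @ self.W.T + self.c)            # decode with tied weights
        # squared-error backprop; W gets gradients from both encode and decode
        dy = (y - x) * y * (1 - y)
        dh = (dy @ self.W) * h * (1 - h)
        self.W -= self.lr * (np.outer(x_noisy, dh) + np.outer(dy, h))
        self.b -= self.lr * dh
        self.c -= self.lr * dy
        return np.mean((y - x) ** 2)
```
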

  5. DOE headquarters publications

    Energy Technology Data Exchange (ETDEWEB)

    None

    1978-09-01

    This bibliography provides listings of (mainly policy and programmatic) publications issued from the U.S. Department of Energy, Washington, D.C. The listings are arranged by the "report code" assigned to each of the major organizations at DOE Headquarters, followed by the three categories of environmental reports issued from DOE Headquarters. All of the publications listed, except for those shown as still "in preparation," may be seen in the Energy Library. A title index arranged by title keywords follows the listings. Certain publications are omitted. They include such items as pamphlets, "fact sheets," bulletins and weekly/monthly issuances of DOE's Energy Information Administration and Economic Regulatory Administration, and employee bulletins and newsletters. Omitted from the bibliography altogether are headquarters publications assigned other types of report codes--e.g., "HCP" (Headquarters Contractor Publication) and "CONF" (conference proceedings). (RWR)

  6. [Reflection around the code of ethics for nurses].

    Science.gov (United States)

    Depoire, Nathalie

    2017-09-01

    The code of ethics for nurses highlights the values, principles and obligations which characterise our profession. It also emphasises the conditions required to enable nurses to perform their professional practice with the autonomy granted to them by the Public Health Code. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  7. Waggawagga-CLI: A command-line tool for predicting stable single α-helices (SAH-domains), and the SAH-domain distribution across eukaryotes.

    Directory of Open Access Journals (Sweden)

    Dominic Simm

    Full Text Available Stable single-alpha helices (SAH-domains) function as rigid connectors and constant-force springs between structural domains, and can provide contact surfaces for protein-protein and protein-RNA interactions. SAH-domains mainly consist of charged amino acids and are monomeric and stable in polar solutions, characteristics which distinguish them from coiled-coil domains and intrinsically disordered regions. Although the number of reported SAH-domains is steadily increasing, genome-wide analyses of SAH-domains in eukaryotic genomes are still missing. Here, we present Waggawagga-CLI, a command-line tool for predicting and analysing SAH-domains in protein sequence datasets. Using Waggawagga-CLI we predicted SAH-domains in 24 datasets from eukaryotes across the tree of life. SAH-domains were predicted in 0.5 to 3.5% of the protein-coding content per species. SAH-domains are particularly present in longer proteins, supporting their function as structural building blocks in multi-domain proteins. In human, SAH-domains are mainly used as alternative building blocks not being present in all transcripts of a gene. Gene ontology analysis showed that yeast proteins with SAH-domains are particularly enriched in macromolecular complex subunit organization, cellular component biogenesis and RNA metabolic processes, and that they have a strong nuclear and ribonucleoprotein complex localization and function in ribosome and nucleic acid binding. Human proteins with SAH-domains have roles in all types of RNA processing and cytoskeleton organization, and are predicted to function in RNA binding, protein binding involved in cell and cell-cell adhesion, and cytoskeletal protein binding. Waggawagga-CLI allows the user to adjust the stabilizing and destabilizing contribution of amino acid interactions in i,i+3 and i,i+4 spacings, and provides extensive flexibility for user-designed analyses.
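
    Waggawagga's actual i,i+3/i,i+4 interaction scoring is not reproduced here, but the abstract's observation that SAH-domains consist mainly of charged residues suggests a crude first-pass screen. The sketch below uses an illustrative window length and threshold; a real predictor would score specific stabilizing salt-bridge spacings rather than bulk charge content.

```python
CHARGED = set("DEKR")   # Asp, Glu, Lys, Arg (His omitted in this toy version)

def charged_fraction_windows(seq, window=28):
    """Fraction of charged residues in each sliding window: a crude screen
    inspired by, but much simpler than, Waggawagga's i,i+3 / i,i+4 scoring."""
    seq = seq.upper()
    return [sum(aa in CHARGED for aa in seq[i:i + window]) / window
            for i in range(len(seq) - window + 1)]

def has_sah_like_region(seq, window=28, threshold=0.75):
    """Flag sequences with at least one highly charged window (illustrative)."""
    return any(f >= threshold for f in charged_fraction_windows(seq, window))
```
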

  8. Revised SRAC code system

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Ishiguro, Yukio; Kaneko, Kunio; Ido, Masaru.

    1986-09-01

    Since the publication of JAERI-1285 in 1983 for the preliminary version of the SRAC code system, a number of additions and modifications to the functions have been made to establish an overall neutronics code system. Major points are (1) addition of JENDL-2 version of data library, (2) a direct treatment of doubly heterogeneous effect on resonance absorption, (3) a generalized Dancoff factor, (4) a cell calculation based on the fixed boundary source problem, (5) the corresponding edit required for experimental analysis and reactor design, (6) a perturbation theory calculation for reactivity change, (7) an auxiliary code for core burnup and fuel management, etc. This report is a revision of the users manual which consists of the general description, input data requirements and their explanation, detailed information on usage, mathematics, contents of libraries and sample I/O. (author)

  9. Rulemaking efforts on codes and standards

    International Nuclear Information System (INIS)

    Millman, G.C.

    1992-01-01

    Section 50.55a of the NRC regulations provides a mechanism for incorporating national codes and standards into the regulatory process. It incorporates by reference ASME Boiler and Pressure Vessel Code (ASME B and PV Code) Section III rules for construction and Section XI rules for inservice inspection and inservice testing. The regulation is periodically amended to update these references. The rulemaking process, as applied to Section 50.55a amendments, is overviewed to familiarize users with associated internal activities of the NRC staff and the manner in which public comments are integrated into the process. The four ongoing rulemaking actions that would individually amend Section 50.55a are summarized. Two of the actions would directly impact requirements for inservice testing. Benefits accrued with NRC endorsement of the ASME B and PV Code, and possible future endorsement of the ASME Operations and Maintenance Code (ASME OM Code), are identified. Emphasis is placed on the need for code writing committees to be especially sensitive to user feedback on code rules incorporated into the regulatory process to ensure that the rules are complete, technically accurate, clear, practical, and enforceable

  10. Performance Analysis of Wavelength Multiplexed SAC OCDMA Codes in Beat Noise Mitigation in SAC OCDMA Systems

    Science.gov (United States)

    Alhassan, A. M.; Badruddin, N.; Saad, N. M.; Aljunid, S. A.

    2013-07-01

    In this paper we investigate the use of wavelength multiplexed spectral amplitude coding (WM SAC) codes for beat noise mitigation in coherent-source SAC OCDMA systems. A WM SAC code is a low-weight SAC code in which the whole code structure is repeated diagonally (once or more) in the wavelength domain to achieve the same cardinality as a higher-weight SAC code. Results show that for highly populated networks, the WM SAC codes provide better performance than SAC codes. However, for a small number of active users the situation is reversed. Apart from their promising improvement in performance, these codes are more flexible and impose less complexity on the system design than their SAC counterparts.
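
    The diagonal repetition that defines a WM SAC code is easy to state in matrix form: the base code set is placed block-diagonally across wavelength bands, so cardinality grows while the per-user weight stays low. The base code below is a small illustrative weight-2 set, not one taken from the paper.

```python
import numpy as np

def wavelength_multiplex(codes, times):
    """Repeat a SAC code set block-diagonally across `times` wavelength bands,
    multiplying the cardinality while keeping the per-user code weight low."""
    k, n = codes.shape
    wm = np.zeros((k * times, n * times), dtype=int)
    for t in range(times):
        wm[t*k:(t+1)*k, t*n:(t+1)*n] = codes
    return wm

# A tiny weight-2 SAC code set with in-phase cross-correlation 1 (illustrative).
base = np.array([[1, 1, 0],
                 [0, 1, 1],
                 [1, 0, 1]])
wm = wavelength_multiplex(base, 2)   # 6 users, weight still 2
```

    Users assigned to different wavelength bands have zero spectral overlap, while users within a band keep the base code's correlation properties.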

  11. Abiding by codes of ethics and codes of conduct imposed on members of learned and professional geoscience institutions and - a tiresome formality or a win-win for scientific and professional integrity and protection of the public?

    Science.gov (United States)

    Allington, Ruth; Fernandez, Isabel

    2015-04-01

    In 2012, the International Union of Geological Sciences (IUGS) formed the Task Group on Global Geoscience Professionalism ("TG-GGP") to bring together the expanding network of organizations around the world whose primary purpose is self-regulation of geoscience practice. An important part of TG-GGP's mission is to foster a shared understanding of aspects of professionalism relevant to individual scientists and applied practitioners working in one or more sectors of the wider geoscience profession (e.g. research, teaching, industry, geoscience communication and government service). These may be summarised as competence, ethical practice, and professional, technical and scientific accountability. Legal regimes for the oversight of registered or licensed professionals differ around the world and in many jurisdictions there is no registration or licensure with the force of law. However, principles of peer-based self-regulation universally apply. This makes professional geoscience organisations ideal settings within which geoscientists can debate and agree what society should expect of us in the range of roles we fulfil. They can provide the structures needed to best determine what expectations, in the public interest, are appropriate for us collectively to impose on each other. They can also provide the structures for the development of associated procedures necessary to identify and discipline those who do not live up to the expected standards of behaviour established by consensus between peers. Codes of Ethics (sometimes referred to as Codes of Conduct), to which all members of all major professional and/or scientific geoscience organizations are bound (whether or not they are registered or hold professional qualifications awarded by those organisations), incorporate such traditional tenets as: safeguarding the health and safety of the public, scientific integrity, and fairness. Codes also increasingly include obligations concerning welfare of the environment and

  12. Parallel iterative decoding of transform domain Wyner-Ziv video using cross bitplane correlation

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Huang, Xin; Forchhammer, Søren

    2011-01-01

    In recent years, Transform Domain Wyner-Ziv (TDWZ) video coding has been proposed as an efficient Distributed Video Coding (DVC) solution, which fully or partly exploits the source statistics at the decoder to reduce the computational burden at the encoder. In this paper, a parallel iterative LDPC decoding scheme is proposed to improve the coding efficiency of TDWZ video codecs. The proposed parallel iterative LDPC decoding scheme is able to utilize cross bitplane correlation during decoding, by iteratively refining the soft-input, updating a modeled noise distribution and thereafter enhancing...

  13. 77 FR 17460 - Multistakeholder Process To Develop Consumer Data Privacy Codes of Conduct

    Science.gov (United States)

    2012-03-26

    ..., 2012, NTIA requested public comments on (1) which consumer data privacy issues should be the focus of.... 120214135-2203-02] RIN 0660-XA27 Multistakeholder Process To Develop Consumer Data Privacy Codes of Conduct... request for public comments on the multistakeholder process to develop consumer data privacy codes of...

  14. Fourier spectral of PalmCode as descriptor for palmprint recognition

    NARCIS (Netherlands)

    Ruan, Qiuqi; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.; Mu, Meiru

    Automatic person recognition by palmprint is currently a hot research topic. In this paper, we propose a novel palmprint recognition method that transforms the typical palmprint phase-code feature into the Fourier frequency domain. The resulting real-valued Fourier spectral features are further
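
    A key property behind frequency-domain descriptors of this kind is that translating a code map changes only the phase of its Fourier transform, so the magnitude spectrum is shift-invariant. The sketch below is illustrative, not the paper's exact descriptor.

```python
import numpy as np

def fourier_spectral_descriptor(phase_code):
    """Magnitude spectrum of a binary phase-code map: a circular shift of the
    code changes only the Fourier phase, so the magnitude is shift-invariant."""
    spectrum = np.fft.fft2(phase_code.astype(float))
    mag = np.abs(np.fft.fftshift(spectrum))
    return mag / (np.linalg.norm(mag) + 1e-12)   # unit-norm for matching

def match_score(desc_a, desc_b):
    """Cosine similarity between two unit-norm descriptors (1.0 = identical)."""
    return float(np.sum(desc_a * desc_b))
```
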

  15. Experimental benchmark of non-local-thermodynamic-equilibrium plasma atomic physics codes; Validation experimentale des codes de physique atomique des plasmas hors equilibre thermodynamique local

    Energy Technology Data Exchange (ETDEWEB)

    Nagels-Silvert, V

    2004-09-15

    The main purpose of this thesis is to obtain experimental data for the testing and validation of atomic physics codes dealing with non-local-thermodynamic-equilibrium plasmas. The first part is dedicated to the spectroscopic study of xenon and krypton plasmas produced by a nanosecond laser pulse interacting with a gas jet. A Thomson scattering diagnostic has allowed us to measure independently plasma parameters such as the electron temperature, the electron density and the average ionisation state. We have obtained time-integrated spectra in the range between 5 and 10 angstroms. We have identified about one hundred xenon lines between 8.6 and 9.6 angstroms via the use of the Relac code. We have discovered previously unreported krypton lines between 5.2 and 7.5 angstroms. In a second experiment we have extended the wavelength range to the XUV domain. The Averroes/Transpec code has been tested in the ranges from 9 to 15 angstroms and from 10 to 130 angstroms; the first range has been well reproduced, while the second requires a more complex data analysis. The second part is dedicated to the spectroscopic study of aluminium, selenium and samarium plasmas in the femtosecond regime. We have designed a frequency-domain interferometry diagnostic that has allowed us to measure the expansion velocity of the target's rear side. Via the use of an adequate isothermal model, this parameter has given us the plasma electron temperature. Spectra and emission times of various lines from the aluminium and selenium plasmas have been computed satisfactorily with the Averroes/Transpec code coupled with the Film and Multif hydrodynamical codes. (A.C.)

  17. Ethics of the public relations profession--does public relations affect journalism in Croatia?

    Science.gov (United States)

    Tanta, Ivan; Lesinger, Gordana

    2013-09-01

    The UK's leading professional body for public relations, the Chartered Institute of Public Relations (CIPR), states that public relations is about reputation: the result of what you do, what you say and what others say about you. Furthermore, the CIPR describes public relations as a discipline whose objectives are safeguarding reputation, establishing understanding and support, and influencing the thinking and behaviour of the public. Although the primary goal of public relations is to preserve and build a reputation and to tell the truth on behalf of the client who has hired its practitioners, it seems that, along the way, public relations practitioners have stopped worrying about their own reputation and the perception of the discipline among the publics they address. All relevant professional bodies for public relations, including the Croatian Association for Public Relations (HUOJ), have set up codes of ethics and high standards against which members and practitioners should be evaluated. Among other things, these state that a public relations practitioner is required to check the reliability and accuracy of data prior to their distribution and to nurture honesty and accountability in the public interest. It seems that precisely this provision of the code of ethics has often been violated. In public speech in Croatia, and therefore in the media, manipulation, propaganda and all the techniques of spin are present, and public relations practitioners use them skillfully in the daily transfer of information to users and target groups. The aim of this paper is to determine how the profession is perceived by the public. As public relations is increasingly present in today's journalism, we wish to examine where journalism ends and PR begins, and vice versa.
In this paper, we analyze and compare the codes of ethics of public relations associations, as well as the codes of ethics of journalists' associations, in order to answer the question

  18. Shared acoustic codes underlie emotional communication in music and speech-Evidence from deep transfer learning.

    Directory of Open Access Journals (Sweden)

    Eduardo Coutinho

    Full Text Available Music and speech exhibit striking similarities in the communication of emotions in the acoustic domain, such that the communication of specific emotions is achieved, at least to a certain extent, by means of shared acoustic patterns. From an Affective Sciences point of view, determining the degree of overlap between both domains is fundamental to understanding the shared mechanisms underlying this phenomenon. From a Machine Learning perspective, the overlap between acoustic codes for emotional expression in music and speech opens new possibilities for enlarging the amount of data available to develop music and speech emotion recognition systems. In this article, we investigate time-continuous predictions of emotion (Arousal and Valence) in music and speech, and Transfer Learning between these domains. We establish a comparative framework including intra-domain (i.e., models trained and tested on the same modality, either music or speech) and cross-domain experiments (i.e., models trained in one modality and tested on the other). In the cross-domain context, we evaluated two strategies: the direct transfer between domains, and the contribution of Transfer Learning techniques (feature-representation-transfer based on Denoising Auto Encoders) for reducing the gap in the feature space distributions. Our results demonstrate excellent cross-domain generalisation performance with and without feature representation transfer in both directions. In the case of music, cross-domain approaches outperformed intra-domain models for Valence estimation, whereas for speech intra-domain models achieved the best performance. This is the first demonstration of shared acoustic codes for emotional expression in music and speech in the time-continuous domain.
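
The cross-domain protocol described above can be sketched with synthetic data: fit a model on one modality ("music") and evaluate it on the other ("speech"). The linear model, three-dimensional features, and noise levels below are invented stand-ins, not the paper's actual acoustic features or emotion models; the sketch only illustrates why a shared underlying code permits transfer.

```python
import numpy as np

# Both synthetic "domains" share one acoustic-to-valence mapping (w_true)
# but differ in noise level, mimicking a shared acoustic code with
# domain-specific variation. All names and values here are hypothetical.
rng = np.random.default_rng(0)
w_true = np.array([0.8, -0.5, 0.3])          # shared "acoustic code"

def make_domain(n, noise):
    X = rng.normal(size=(n, 3))
    y = X @ w_true + noise * rng.normal(size=n)
    return X, y

X_music, y_music = make_domain(500, 0.1)
X_speech, y_speech = make_domain(500, 0.3)

w = np.linalg.lstsq(X_music, y_music, rcond=None)[0]   # train on music
pred = X_speech @ w                                    # test on speech
r = np.corrcoef(pred, y_speech)[0, 1]
print(round(r, 3))
```

Because the two domains share the underlying mapping, the cross-domain correlation stays high despite the domain shift, which is the intuition behind the transfer results above.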

  19. Shared acoustic codes underlie emotional communication in music and speech—Evidence from deep transfer learning

    Science.gov (United States)

    Schuller, Björn

    2017-01-01

    Music and speech exhibit striking similarities in the communication of emotions in the acoustic domain, such that the communication of specific emotions is achieved, at least to a certain extent, by means of shared acoustic patterns. From an Affective Sciences point of view, determining the degree of overlap between both domains is fundamental to understanding the shared mechanisms underlying this phenomenon. From a Machine Learning perspective, the overlap between acoustic codes for emotional expression in music and speech opens new possibilities for enlarging the amount of data available to develop music and speech emotion recognition systems. In this article, we investigate time-continuous predictions of emotion (Arousal and Valence) in music and speech, and Transfer Learning between these domains. We establish a comparative framework including intra-domain (i.e., models trained and tested on the same modality, either music or speech) and cross-domain experiments (i.e., models trained in one modality and tested on the other). In the cross-domain context, we evaluated two strategies: the direct transfer between domains, and the contribution of Transfer Learning techniques (feature-representation-transfer based on Denoising Auto Encoders) for reducing the gap in the feature space distributions. Our results demonstrate excellent cross-domain generalisation performance with and without feature representation transfer in both directions. In the case of music, cross-domain approaches outperformed intra-domain models for Valence estimation, whereas for speech intra-domain models achieved the best performance. This is the first demonstration of shared acoustic codes for emotional expression in music and speech in the time-continuous domain. PMID:28658285

  20. Implementation of the critical points model in a SFM-FDTD code working in oblique incidence

    Energy Technology Data Exchange (ETDEWEB)

    Hamidi, M; Belkhir, A; Lamrous, O [Laboratoire de Physique et Chimie Quantique, Universite Mouloud Mammeri, Tizi-Ouzou (Algeria); Baida, F I, E-mail: omarlamrous@mail.ummto.dz [Departement d' Optique P.M. Duffieux, Institut FEMTO-ST UMR 6174 CNRS Universite de Franche-Comte, 25030 Besancon Cedex (France)

    2011-06-22

    We describe the implementation of the critical points model in a finite-difference-time-domain code working in oblique incidence and dealing with dispersive media through the split field method. Some tests are presented to validate our code in addition to an application devoted to plasmon resonance of a gold nanoparticles grating.
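
As a rough illustration of how a dispersive medium enters an FDTD update, the sketch below handles a single Drude-type pole through an auxiliary differential equation for the polarization current. The critical points model of the abstract adds further Lorentz-like pole pairs, and the split field method for oblique incidence is omitted entirely; all material parameters here are hypothetical, chosen only to keep the update stable.

```python
import numpy as np

c0 = 299_792_458.0
eps0, mu0 = 8.8541878128e-12, 4e-7 * np.pi
nz, nsteps = 200, 300
dz = 10e-9
dt = 0.5 * dz / c0            # satisfies the 1D Courant condition
wp, gamma = 1.0e15, 1.0e13    # hypothetical plasma / collision frequencies

ez = np.zeros(nz)   # electric field
hy = np.zeros(nz)   # magnetic field
jz = np.zeros(nz)   # polarization current (auxiliary variable)

for n in range(nsteps):
    # standard Yee curl update for H
    hy[:-1] += dt / (mu0 * dz) * (ez[1:] - ez[:-1])
    # auxiliary differential equation: dJ/dt + gamma*J = eps0*wp^2*E
    jz = ((1 - 0.5 * gamma * dt) * jz + eps0 * wp**2 * dt * ez) \
         / (1 + 0.5 * gamma * dt)
    # E update includes the dispersive current term
    ez[1:] += dt / (eps0 * dz) * (hy[1:] - hy[:-1]) - dt / eps0 * jz[1:]
    ez[100] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source

print(float(np.max(np.abs(ez))))
```

The semi-implicit form of the auxiliary update keeps the scheme stable for the chosen time step; a production code would add the remaining poles and the oblique-incidence field splitting.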

  1. Code of ethics for dental researchers.

    Science.gov (United States)

    2014-01-01

    The International Association for Dental Research, in 2009, adopted a code of ethics. The code applies to members of the association and is enforceable by sanction, with the stated requirement that members are expected to inform the association in cases where they believe misconduct has occurred. The IADR code goes beyond the Belmont and Helsinki statements by virtue of covering animal research. It also addresses issues of sponsorship of research and conflicts of interest, international collaborative research, duty of researchers to be informed about applicable norms, standards of publication (including plagiarism), and the obligation of "whistleblowing" for the sake of maintaining the integrity of the dental research enterprise as a whole. The code is organized, like the ADA code, into two sections. The IADR principles are stated, but not defined, and number 12, instead of the ADA's five. The second section consists of "best practices," which are specific statements of expected or interdicted activities. The short list of definitions is useful.

  2. Opening of energy markets: consequences on the missions of public utility and of security of supplies in the domain of electric power and gas

    International Nuclear Information System (INIS)

    2001-01-01

    This conference was jointly organized by the International Energy Agency (IEA) and the French ministry of economy, finances, and industry (general directorate of energy and raw materials, DGEMP). It was organized in 6 sessions dealing with: 1 - public utility in the domain of energy: definition of public utility missions, experience feedback from liberalized markets, public utility obligations and pricing regulation; 2 - the new US energy policy and the lessons learnt from the California crisis; 3 - the security of electric power supplies: concepts of security of supply, opinions of operators, security of power supplies versus liberalization and investments; 4 - the security of gas supplies: market liberalization and investments, long-term contracts and security of supplies; 5 - debate: how to integrate the objectives of public utility and security of supply in a competitive market; 6 - conclusions. This document brings together the available talks and transparencies presented at the conference. (J.S.)

  3. Development of 3D CFD code based on structured non-orthogonal grids

    International Nuclear Information System (INIS)

    Vaidya, Abhijeet Mohan; Maheshwari, Naresh Kumar; Rama Rao, A.

    2016-01-01

    Most nuclear industry problems involve complex geometries, and solving flow and heat transfer over such geometries is an important requirement for designing new reactor systems. Hence, the development of a general-purpose three-dimensional (3D) CFD code has been undertaken. To handle computational domains of complex shape, the code is implemented on structured non-orthogonal coordinates. The code is validated by comparing its results for a 3D inclined lid-driven cavity, at different inclination angles and Reynolds numbers, with OpenFOAM results. This paper contains the formulation and validation of the newly developed code. (author)

  4. Advanced Electric and Magnetic Material Models for FDTD Electromagnetic Codes

    CERN Document Server

    Poole, Brian R; Nelson, Scott D

    2005-01-01

    The modeling of dielectric and magnetic materials in the time domain is required for pulse power applications, pulsed induction accelerators, and advanced transmission lines. For example, most induction accelerator modules require the use of magnetic materials to provide adequate Volt-seconds during the acceleration pulse. These models require hysteresis and saturation to simulate the saturation wavefront in a multipulse environment. In high-voltage transmission line applications such as shock or soliton lines, the dielectric operates in a highly nonlinear regime, which requires nonlinear models. Simple 1-D models are developed for fast parameterization of transmission line structures. In the case of nonlinear dielectrics, a simple analytic model describing the permittivity in terms of the electric field is used in a 3-D finite-difference time-domain (FDTD) code. In the case of magnetic materials, both rate-independent and rate-dependent Hodgdon magnetic material models have been implemented into 3-D FDTD codes an...

  5. Selected DOE Headquarters publications, October 1979

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-07-01

    This publication provides cumulative listings of and an index to DOE headquarters publications issued since October 1979. Three types of headquarters publications are included: publications dealing mainly with program and policy that are attributed to and issued by headquarters organizations, reports prepared by contractors to describe research and development work they have performed for the Department, and environmental development plans and impact statements. Such items as pamphlets, fact sheets, bulletins, newsletters, telephone directories, headquarters publications issued under the DOE-tr and CONF codes, technical reports from the Jet Propulsion Laboratory and NASA issued under DOE/JPL and DOE/NASA codes, and weekly/monthly reports of the Energy Information Administration are not included. (RWR)

  6. Code of practice for ionizing radiation

    International Nuclear Information System (INIS)

    Khoo Boo Huat

    1995-01-01

    Prior to 1984, the use of ionizing radiation in Malaysia was governed by the Radioactive Substances Act of 1968. After 1984, its use came under the control of Act 304, the Atomic Energy Licensing Act 1984. Under powers vested by the Act, the Radiation Protection (Basic Safety Standards) Regulations 1988 were formulated to regulate its use. These Acts do not provide information on proper working procedures. With the publication of the Codes of Practice by the Standards and Industrial Research Institute of Malaysia (SIRIM), users are now able to follow proper guidelines and use ionizing radiation safely and beneficially. This paper discusses the relevant sections in the following codes: 1. Code of Practice for Radiation Protection (Medical X-ray Diagnosis) MS 838:1983. 2. Code of Practice for Safety in Laboratories Part 4: Ionizing radiation MS 1042: Part 4: 1992. (author)

  7. 78 FR 24725 - National Fire Codes: Request for Public Input for Revision of Codes and Standards

    Science.gov (United States)

    2013-04-26

    ... Production, Storage, and Handling of Liquefied Natural Gas (LNG). NFPA 61--2013 Standard for the 7/6/2015... Nitrate Film. NFPA 51--2013 Standard for the Design 7/6/2015 and Installation of Oxygen-Fuel Gas Systems... Charging Plants. NFPA 52--2013 Vehicular Gaseous Fuel 1/3/2014 Systems Code. NFPA 53--2011 Recommended...

  8. Information report of AREVA Tricastin site - 2015 Edition. This report is written in compliance with article L. 125-15 of the French environment code

    International Nuclear Information System (INIS)

    2016-01-01

    Published in compliance with the French code of the environment, this report first presents the AREVA Tricastin site, with its facilities (AREVA NC, EURODIF Production, SET, SOCATRI and AREVA NP Pierrelatte) and its activities in the domain of uranium conversion and enrichment. It then reviews the measures implemented for the limitation and prevention of risks and summarizes the events declared in 2015. Next, it presents the management of the site's effluents and wastes and the environmental monitoring. Finally, the actions of public information are presented. The recommendations of the Health and Safety Committee are included in an appendix

  9. Code cases for implementing risk-based inservice testing in the ASME OM code

    International Nuclear Information System (INIS)

    Rowley, C.W.

    1996-01-01

    Historically inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly-conservative scope for IST components is to use the PRA and plant expert panels to create a two tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for the type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases are being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategy for type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices

  10. Code cases for implementing risk-based inservice testing in the ASME OM code

    Energy Technology Data Exchange (ETDEWEB)

    Rowley, C.W.

    1996-12-01

    Historically inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly-conservative scope for IST components is to use the PRA and plant expert panels to create a two tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for the type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases are being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategy for type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices.

  11. Finite difference time domain solution of electromagnetic scattering on the hypercube

    International Nuclear Information System (INIS)

    Calalo, R.H.; Lyons, J.R.; Imbriale, W.A.

    1988-01-01

    Electromagnetic fields interacting with a dielectric or conducting structure produce scattered electromagnetic fields. To model the fields produced by complicated, volumetric structures, the finite difference time domain (FDTD) method employs an iterative solution of Maxwell's time-dependent curl equations. Implementations of the FDTD method use memory intensively and perform numerous calculations per time-step iteration. The authors have implemented an FDTD code on the California Institute of Technology/Jet Propulsion Laboratory Mark III Hypercube. This code can solve problems requiring as many as 2,048,000 unit cells on a 32-node Hypercube. For smaller problems, the code produces solutions in a fraction of the time needed to solve the same problems on sequential computers
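
The idea of mapping an FDTD grid onto hypercube nodes can be illustrated with a toy decomposition: the grid is split into equal slabs, each "node" updates only its interior, and one-cell halos are exchanged every step. The node count, grid size, and the stand-in update rule below are all invented for the sketch; a real Mark III implementation would exchange halos with message passing rather than array copies and would perform the full Yee curl updates.

```python
import numpy as np

nodes, cells_per_node, steps = 4, 50, 100
c = 0.5  # Courant number for the upwind-advection stand-in update

# each slab carries one ghost (halo) cell on each side
slabs = [np.zeros(cells_per_node + 2) for _ in range(nodes)]
slabs[0][cells_per_node // 2] = 1.0  # initial pulse in the first slab

for _ in range(steps):
    # halo exchange between neighboring "nodes"
    for i in range(nodes - 1):
        slabs[i][-1] = slabs[i + 1][1]      # right ghost <- neighbor's first cell
        slabs[i + 1][0] = slabs[i][-2]      # left ghost  <- neighbor's last cell
    # local update on each node's interior (upwind advection as a stand-in
    # for the per-cell Yee updates)
    for s in slabs:
        s[1:-1] -= c * (s[1:-1] - s[:-2])

# the pulse crosses slab boundaries purely via the halo cells, conserving mass
total = sum(float(s[1:-1].sum()) for s in slabs)
print(total)
```

Because each step needs only one ghost cell per boundary, the communication volume grows with the slab surface while the work grows with its volume, which is why larger per-node subdomains scale better.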

  12. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  13. Parallel Computing Characteristics of CUPID code under MPI and Hybrid environment

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Ryong; Yoon, Han Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeon, Byoung Jin; Choi, Hyoung Gwon [Seoul National Univ. of Science and Technology, Seoul (Korea, Republic of)

    2014-05-15

    In this paper, a characteristic of the parallel algorithm used to solve an elliptic-type equation in CUPID via a domain decomposition method with MPI is presented, and the parallel performance is estimated in terms of scalability, i.e., the speedup ratio. In addition, the time-consuming pattern of the major subroutines is studied. Two grid systems are taken into account: 40,000 meshes for the coarse system and 320,000 meshes for the fine system. Since the matrix of the CUPID code differs according to whether the flow is single-phase or two-phase, the effect of matrix shape is evaluated. The effect of the preconditioner on the matrix solver is also investigated. Finally, a hybrid (OpenMP+MPI) parallel algorithm for the pressure solver is introduced and discussed in detail. The component-scale thermal-hydraulics code CUPID has been developed for two-phase flow analysis; it adopts a three-dimensional, transient, three-field model and has been parallelized to meet a recent demand for long-transient, highly resolved multi-phase flow simulation. In this study, the parallel performance of the CUPID code was investigated in terms of scalability. The CUPID code was parallelized with a domain decomposition method, using the MPI library to communicate information between neighboring domains. To manage the sparse matrix effectively, the CSR storage format is used. To account for the pressure matrix becoming asymmetric for two-phase flow, both single-phase and two-phase calculations were run. The effects of matrix size and preconditioning were also investigated. The fine-mesh calculation shows better scalability than the coarse mesh, because the coarse mesh cannot be decomposed into many subdomains without communication costs dominating. A fine mesh can exhibit good scalability when the geometry is divided with the ratio of computation to communication time taken into account. For a given mesh, single-phase flow
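
The CSR (compressed sparse row) storage format mentioned above can be shown in a few lines: a matrix is compressed into `data`, `indices`, and `indptr` arrays, and a matrix-vector product then walks each row's nonzeros. The 4x4 tridiagonal matrix below is an arbitrary illustration, not CUPID's actual pressure system.

```python
import numpy as np

dense = np.array([[ 4., -1.,  0.,  0.],
                  [-1.,  4., -1.,  0.],
                  [ 0., -1.,  4., -1.],
                  [ 0.,  0., -1.,  4.]])

# build the three CSR arrays: nonzero values, their column indices,
# and row pointers marking where each row starts in `data`
data, indices, indptr = [], [], [0]
for row in dense:
    for j, v in enumerate(row):
        if v != 0.0:
            data.append(v)
            indices.append(j)
    indptr.append(len(data))

def csr_matvec(data, indices, indptr, x):
    """y = A @ x using the CSR arrays only."""
    y = np.zeros(len(indptr) - 1)
    for i in range(len(y)):
        for k in range(indptr[i], indptr[i + 1]):
            y[i] += data[k] * x[indices[k]]
    return y

x = np.ones(4)
y = csr_matvec(data, indices, indptr, x)
print(y)  # matches dense @ x
```

Storing only nonzeros keeps both memory use and matrix-vector cost proportional to the number of nonzero entries, which matters when the pressure matrix has a few nonzeros per row on hundreds of thousands of meshes.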

  14. 45 CFR Appendix B to Part 73 - Code of Ethics for Government Service

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Code of Ethics for Government Service B Appendix B to Part 73 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION STANDARDS OF CONDUCT Pt. 73, App. B Appendix B to Part 73—Code of Ethics for Government Service Any person in...

  15. HCV IRES domain IIb affects the configuration of coding RNA in the 40S subunit's decoding groove.

    Science.gov (United States)

    Filbin, Megan E; Kieft, Jeffrey S

    2011-07-01

    Hepatitis C virus (HCV) uses a structured internal ribosome entry site (IRES) RNA to recruit the translation machinery to the viral RNA and begin protein synthesis without the ribosomal scanning process required for canonical translation initiation. Different IRES structural domains are used in this process, which begins with direct binding of the 40S ribosomal subunit to the IRES RNA and involves specific manipulation of the translational machinery. We have found that upon initial 40S subunit binding, the stem-loop domain of the IRES that contains the start codon unwinds and adopts a stable configuration within the subunit's decoding groove. This configuration depends on the sequence and structure of a different stem-loop domain (domain IIb) located far from the start codon in sequence, but spatially proximal in the IRES•40S complex. Mutation of domain IIb results in misconfiguration of the HCV RNA in the decoding groove that includes changes in the placement of the AUG start codon, and a substantial decrease in the ability of the IRES to initiate translation. Our results show that two distal regions of the IRES are structurally communicating at the initial step of 40S subunit binding and suggest that this is an important step in driving protein synthesis.

  16. Selected DOE Headquarters publications, October 1977-September 1979

    International Nuclear Information System (INIS)

    1979-11-01

    This sixth issue of cumulative listings of DOE Headquarters publications covers the first two years of the Department's operation (October 1, 1977 - September 30, 1979). It lists two groups of publications issued by then-existing Headquarters organizations and provides an index to their title keywords. The two groups of publications are publications assigned a DOE/XXX-type report number code and Headquarters contractor reports prepared by contractors (and published by DOE) to describe research and development work they have performed for the Department. Certain publications are omitted. They include such items as pamphlets, fact sheets, bulletins, newsletters, and telephone directories, as well as headquarters publications issued under the DOE-tr (DOE translation) and CONF (conference proceedings) codes, and technical reports from the Jet Propulsion Laboratory and NASA issued under DOE/JPL and DOE/NASA codes. The contents of this issue will not be repeated in subsequent issues of DOE/AD-0010

  17. Performance analysis of super-orthogonal space-frequency trellis coded OFDM system

    CSIR Research Space (South Africa)

    Sokoya, O

    2009-08-01

    Full Text Available This paper considers the super-orthogonal space-frequency trellis code (SOSFTC) used with OFDM. SOSFTC-OFDM utilizes diversity in the frequency and space domains by coding along adjacent subcarriers in an OFDM environment. This paper evaluates the exact pairwise error probability (PEP) of the SOSFTC...

  18. Climate Resilience Screening Index and Domain Scores

    Data.gov (United States)

    U.S. Environmental Protection Agency — CRSI and related-domain scores for all 50 states and 3135 counties in the U.S. This dataset is not publicly accessible because: They are already available within the...

  19. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Full Text Available Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.
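
One simple way to quantify the modularity idea discussed above is to measure, on a file-dependency graph, what fraction of dependencies stay inside a module. The graph, module assignment, and metric below are a deliberately simple proxy for illustration, not the specific metric or toolkit developed at Carleton University.

```python
# Hypothetical (from_file, to_file) dependency edges of a tiny code base
deps = [
    ("a.py", "b.py"), ("b.py", "a.py"),   # inside module "core"
    ("c.py", "d.py"),                     # inside module "util"
    ("a.py", "c.py"), ("b.py", "d.py"),   # cross-module dependencies
]
# a grouping of files into modules (also hypothetical)
module_of = {"a.py": "core", "b.py": "core", "c.py": "util", "d.py": "util"}

# fraction of dependencies that do not cross a module boundary
internal = sum(module_of[u] == module_of[v] for u, v in deps)
modularity_proxy = internal / len(deps)
print(modularity_proxy)  # 3 of 5 edges are internal -> 0.6
```

Tracking such a ratio across releases gives a crude picture of whether a code base is becoming more or less modular over time, which is the kind of longitudinal question the Tomcat analysis addresses with a more refined metric.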

  20. Discovery of rare protein-coding genes in model methylotroph Methylobacterium extorquens AM1.

    Science.gov (United States)

    Kumar, Dhirendra; Mondal, Anupam Kumar; Yadav, Amit Kumar; Dash, Debasis

    2014-12-01

    Proteogenomics involves the use of MS to refine the annotation of protein-coding genes and to discover genes in a genome. We carried out a comprehensive proteogenomic analysis of Methylobacterium extorquens AM1 (ME-AM1) from publicly available proteomics data, with the aim of improving annotation for methylotrophs: organisms capable of surviving on reduced carbon compounds such as methanol. Besides identifying 2482 (50%) proteins, 29 new genes were discovered and 66 annotated gene models were revised in the ME-AM1 genome. One such novel gene, identified with 75 peptides, lacks homologs in other methylobacteria but has glycosyl transferase and lipopolysaccharide biosynthesis protein domains, indicating a potential role in outer membrane synthesis. Many novel genes are present only in ME-AM1 among methylobacteria. Distant homologs of these genes in unrelated taxonomic classes and the low GC-content of a few genes suggest lateral gene transfer as a potential mode of their origin. Annotations of methylotrophy-related genes were also improved by the discovery of a short gene in the methylotrophy gene island and by redefining a gene important for pyrroloquinoline quinone synthesis, which is essential for methylotrophy. The combined use of proteogenomics and rigorous bioinformatics analysis greatly enhanced the annotation of protein-coding genes in the model methylotroph ME-AM1 genome. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. ODYSSEY: A PUBLIC GPU-BASED CODE FOR GENERAL RELATIVISTIC RADIATIVE TRANSFER IN KERR SPACETIME

    Energy Technology Data Exchange (ETDEWEB)

    Pu, Hung-Yi [Institute of Astronomy and Astrophysics, Academia Sinica, 11F of Astronomy-Mathematics Building, AS/NTU No. 1, Taipei 10617, Taiwan (China); Yun, Kiyun; Yoon, Suk-Jin [Department of Astronomy and Center for Galaxy Evolution Research, Yonsei University, Seoul 120-749 (Korea, Republic of); Younsi, Ziri [Institut für Theoretische Physik, Max-von-Laue-Straße 1, D-60438 Frankfurt am Main (Germany)

    2016-04-01

    General relativistic radiative transfer calculations coupled with the calculation of geodesics in the Kerr spacetime are an essential tool for determining the images, spectra, and light curves from matter in the vicinity of black holes. Such studies are especially important for ongoing and upcoming millimeter/submillimeter very long baseline interferometry observations of the supermassive black holes at the centers of Sgr A* and M87. To this end we introduce Odyssey, a graphics processing unit (GPU) based code for ray tracing and radiative transfer in the Kerr spacetime. On a single GPU, the performance of Odyssey can exceed 1 ns per photon, per Runge–Kutta integration step. Odyssey is publicly available, fast, accurate, and flexible enough to be modified to suit the specific needs of new users. Along with a Graphical User Interface powered by a video-accelerated display architecture, we also present an educational software tool, Odyssey-Edu, for showing in real time how null geodesics around a Kerr black hole vary as a function of black hole spin and angle of incidence onto the black hole.
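
The per-photon, per-step Runge-Kutta integration that Odyssey's performance figure refers to has the generic structure sketched below. The Kerr geodesic equations are replaced here by a simple harmonic oscillator so the sketch stays self-contained; only the fourth-order stepping structure carries over, and the step size and step count are arbitrary.

```python
import numpy as np

def rk4_step(f, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(y)."""
    k1 = f(y)
    k2 = f(y + 0.5 * h * k1)
    k3 = f(y + 0.5 * h * k2)
    k4 = f(y + h * k3)
    return y + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def oscillator(y):
    # stand-in right-hand side: y = (x, v), with x'' = -x
    return np.array([y[1], -y[0]])

y = np.array([1.0, 0.0])
h = 0.01
for _ in range(int(2 * np.pi / h)):   # integrate roughly one period
    y = rk4_step(oscillator, y, h)
print(y)   # close to the initial state (1, 0)
```

A GPU code evaluates a step like this for millions of photons in parallel, which is how a throughput on the order of a nanosecond per photon per step becomes possible.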

  2. Transcript structure and domain display: a customizable transcript visualization tool.

    Science.gov (United States)

    Watanabe, Kenneth A; Ma, Kaiwang; Homayouni, Arielle; Rushton, Paul J; Shen, Qingxi J

    2016-07-01

    Transcript Structure and Domain Display (TSDD) is a publicly available, web-based program that provides publication-quality images of transcript structures and domains. TSDD is capable of producing transcript structures from GFF/GFF3 and BED files. Alternatively, the GFF files of several model organisms have been pre-loaded so that users only need to enter the locus IDs of the transcripts to be displayed. Visualization of transcripts provides many benefits to researchers, ranging from evolutionary analysis of DNA-binding domains to predictive function modeling. TSDD is freely available for non-commercial users at http://shenlab.sols.unlv.edu/shenlab/software/TSD/transcript_display.html. Contact: jeffery.shen@unlv.nevada.edu. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  3. Quality of coding diagnoses in emergency departments: effects on mapping the public's health.

    Science.gov (United States)

    Aharonson-Daniel, Limor; Schwartz, Dagan; Hornik-Lurie, Tzipi; Halpern, Pinchas

    2014-01-01

    Emergency department (ED) attendees reflect the health of the population served by the hospital and the availability of health care services in the community. The aim of this study was to examine the quality and accuracy of diagnoses recorded in the ED in order to appraise their potential utility as a gauge of the population's medical needs. Using the Delphi process, a preliminary list of health indicators generated by an expert focus group was converted into a query to the Ministry of Health's database. In parallel, medical charts were reviewed in four hospitals to compare the handwritten diagnosis in the medical record with that recorded on the standard diagnosis "pick list" coding sheet. The quantity and quality of coding were assessed using explicit criteria. During 2010 a total of 17,761 charts were reviewed; diagnoses were not coded in 42%. The accuracy of existing coding was excellent (mismatch 1%-5%). A database query (2,670,300 visits to 28 hospitals in 2009) demonstrated the potential benefits of these data as indicators of regional health needs. The findings suggest that an increase in the provision of community care may reduce ED attendance. Information on ED visits can be used to support health care planning. A "pick list" form with common diagnoses can facilitate quality recording of diagnoses in a busy ED, profiling the population's health needs in order to optimize care. Better compliance with the directive to code diagnoses is desired.

  4. Hidden Structural Codes in Protein Intrinsic Disorder.

    Science.gov (United States)

    Borkosky, Silvia S; Camporeale, Gabriela; Chemes, Lucía B; Risso, Marikena; Noval, María Gabriela; Sánchez, Ignacio E; Alonso, Leonardo G; de Prat Gay, Gonzalo

    2017-10-17

    Intrinsic disorder is a major structural category in biology, accounting for more than 30% of coding regions across the domains of life, yet consists of conformational ensembles in equilibrium, a major challenge in protein chemistry. Anciently evolved papillomavirus genomes constitute an unparalleled case for sequence to structure-function correlation in cases in which there are no folded structures. E7, the major transforming oncoprotein of human papillomaviruses, is a paradigmatic example among the intrinsically disordered proteins. Analysis of a large number of sequences of the same viral protein allowed for the identification of a handful of residues with absolute conservation, scattered along the sequence of its N-terminal intrinsically disordered domain, which intriguingly are mostly leucine residues. Mutation of these led to a pronounced increase in both α-helix and β-sheet structural content, reflected by drastic effects on equilibrium propensities and oligomerization kinetics, and uncovers the existence of local structural elements that oppose canonical folding. These folding relays suggest the existence of yet undefined hidden structural codes behind intrinsic disorder in this model protein. Thus, evolution pinpoints conformational hot spots that could have not been identified by direct experimental methods for analyzing or perturbing the equilibrium of an intrinsically disordered protein ensemble.

  5. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    Science.gov (United States)

    Springate, David A; Kontopantelis, Evangelos; Ashcroft, Darren M; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David

    2014-01-01

    Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.

  6. [Student nurses and the code of ethics].

    Science.gov (United States)

    Trolliet, Julie

    2017-09-01

    Student nurses, just like all practising professionals, are expected to be aware of and to respect the code of ethics governing their profession. Since the publication of this code, actions to raise awareness of it and explain it to all the relevant players have been put in place. The French National Federation of Student Nurses decided to survey future professionals regarding this new text. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  7. A 3D spectral anelastic hydrodynamic code for shearing, stratified flows

    Science.gov (United States)

    Barranco, Joseph A.; Marcus, Philip S.

    2006-11-01

    We have developed a three-dimensional (3D) spectral hydrodynamic code to study vortex dynamics in rotating, shearing, stratified systems (e.g., the atmosphere of gas giant planets, protoplanetary disks around newly forming protostars). The time-independent background state is stably stratified in the vertical direction and has a unidirectional linear shear flow aligned with one horizontal axis. Superposed on this background state is an unsteady, subsonic flow that is evolved with the Euler equations subject to the anelastic approximation to filter acoustic phenomena. A Fourier-Fourier basis in a set of quasi-Lagrangian coordinates that advect with the background shear is used for spectral expansions in the two horizontal directions. For the vertical direction, two different sets of basis functions have been implemented: (1) Chebyshev polynomials on a truncated, finite domain, and (2) rational Chebyshev functions on an infinite domain. Use of this latter set is equivalent to transforming the infinite domain to a finite one with a cotangent mapping, and using cosine and sine expansions in the mapped coordinate. The nonlinear advection terms are time-integrated explicitly, the pressure/enthalpy terms are integrated semi-implicitly, and the Coriolis force and buoyancy terms are treated semi-analytically. We show that internal gravity waves can be damped by adding new terms to the Euler equations. The code exhibits excellent parallel performance with the message passing interface (MPI). As a demonstration of the code, we simulate the merger of two 3D vortices in the midplane of a protoplanetary disk.
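
The rational Chebyshev basis on the infinite vertical domain described above can be sketched in a few lines. Assuming Boyd-style functions TB_n(y) = cos(n t) under the cotangent map y = L cot(t), t in (0, pi); the map parameter L and the NumPy formulation are illustrative assumptions, not taken from the actual code:

```python
import numpy as np

def rational_chebyshev(n, y, L=1.0):
    """TB_n(y) = cos(n*t) with the cotangent map y = L*cot(t), t in (0, pi).

    Illustrative sketch; L is a user-chosen map width, not a value from the paper.
    """
    t = np.arctan2(L, y)   # inverse of y = L*cot(t), giving t in (0, pi)
    return np.cos(n * t)

# The map sends t = pi/2 to y = 0 and the endpoints t -> 0, pi to y -> +/-inf,
# so a cosine series in t becomes a smooth basis on the infinite vertical line.
y = np.linspace(-10, 10, 5)
print(rational_chebyshev(2, y))
```

A sine series in t plays the same role for the odd-parity part of the expansion.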

  8. [The legal framework of the code of ethics for nurses].

    Science.gov (United States)

    Doutriaux, Yves

    2017-09-01

    As with all professions with its own order, nurses now have a code of ethics written by the Order's elected leaders. These leaders oversee its application for the benefit of patients and professionals, notably when a complaint is made. The code reconciles the interests of patients and public health care with the competition law. The numerous innovations which it has introduced are beneficial to nurses employed in the public and private sectors as well as freelance nurses. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  9. Selected DOE Headquarters Publications, October 1979

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-09-01

    This publication provides cumulative listings of and an index to DOE headquarters publications issued since October 1979. (Publications issued during October 1977-September 1979 are covered in DOE/AD-0010/6.) Three types of headquarters publications are included: publications dealing mainly with program and policy that are attributed to and issued by headquarters organizations, reports prepared by contractors (and published by DOE) to describe research and development work they have performed for the Department, and environmental development plans and impact statements. Certain publications have been omitted. They include such items as pamphlets, fact sheets, bulletins, newsletters, and telephone directories, headquarters publications issued under the DOE-tr and CONF codes, technical reports from the Jet Propulsion Laboratory and NASA issued under DOE/JPL and DOE/NASA codes, and weekly/monthly reports of the Energy Information Administration. (RWR)

  10. Observations on Polar Coding with CRC-Aided List Decoding

    Science.gov (United States)

    2016-09-01

    TECHNICAL REPORT 3041, September 2016. Observations on Polar Coding with CRC-Aided List Decoding, David Wasserman. Approved for public release. SSC... described in [2, 3]. In FY15 and FY16 we used cyclic redundancy check (CRC)-aided polar list decoding [4]. Section 2 describes the basics of polar coding... and gives details of the encoders and decoders we used. In the course of our research, we performed simulations of polar codes in hundreds of cases
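
The CRC-aided list decoding referred to above lets the CRC arbitrate among the list decoder's candidate codewords: the most likely candidate that passes the CRC check wins. A minimal sketch, with zlib.crc32 standing in for the report's actual CRC polynomial and hypothetical byte-string candidates:

```python
import zlib

def crc_aided_pick(candidates):
    """Given list-decoder candidates ordered most likely first, return the
    first whose trailing 32-bit CRC matches its payload; if none passes,
    fall back to the most likely candidate."""
    for cand in candidates:
        payload, tail = cand[:-4], cand[-4:]
        if zlib.crc32(payload).to_bytes(4, "big") == tail:
            return cand
    return candidates[0]

msg = b"hello"
good = msg + zlib.crc32(msg).to_bytes(4, "big")      # valid CRC appended
bad = b"hellp" + zlib.crc32(msg).to_bytes(4, "big")  # corrupted payload, stale CRC
picked = crc_aided_pick([bad, good])                 # CRC rejects the likelier but wrong candidate
```

Here `bad` sits first (higher likelihood) but fails its CRC, so the decoder selects `good`.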

  11. Public health legal preparedness in Indian country.

    Science.gov (United States)

    Bryan, Ralph T; Schaefer, Rebecca McLaughlin; DeBruyn, Lemyra; Stier, Daniel D

    2009-04-01

    American Indian/Alaska Native tribal governments are sovereign entities with inherent authority to create laws and enact health regulations. Laws are an essential tool for ensuring effective public health responses to emerging threats. To analyze how tribal laws support public health practice in tribal communities, we reviewed tribal legal documentation available through online databases and talked with subject-matter experts in tribal public health law. Of the 70 tribal codes we found, 14 (20%) had no clearly identifiable public health provisions. The public health-related statutes within the remaining codes were rarely well integrated or comprehensive. Our findings provide an evidence base to help tribal leaders strengthen public health legal foundations in tribal communities.

  12. Ocean circulation code on a Connection Machine

    International Nuclear Information System (INIS)

    Vitart, F.

    1993-01-01

    This work is part of the development of a global climate model based on the coupling of an ocean model with an atmosphere model. The objective was to develop this global model on a massively parallel machine (the CM2). The author presents the OPA7 code (equations, boundary conditions, resolution of the equation system) and its parallelization on the CM2. The CM2 data structure is briefly described, and two tests are reported (a flat-bottom basin, and a topography with eight islands). The author then gives an overview of studies aimed at improving the ocean circulation code: use of a new state equation, use of a formulation of surface pressure, and use of a new mesh. He reports on the use of multi-block domains on the CM2 through advection tests and two-block tests.

  13. Coupling of the SYRTHES thermal code with the ESTET or N3S fluid mechanics codes; Couplage du code de thermique SYRTHES et des codes de mecanique des fluides ESTET ou N3S

    Energy Technology Data Exchange (ETDEWEB)

    Peniguel, C [Electricite de France (EDF), 78 - Chatou (France). Direction des Etudes et Recherches; Rupp, I [Simulog, 78 (France)

    1998-12-31

    Thermal aspects arise in several industrial applications with which Electricite de France (EdF) is concerned. In most cases, several physical phenomena such as conduction, radiation and convection are involved in the thermal transfers. The aim of this paper is to present a numerical tool adapted to industrial configurations which couples fluid convection (resolved with ESTET in finite volumes or with N3S in finite elements) and radiant heat transfer between walls (resolved with SYRTHES using a radiosity method). SYRTHES manages the different thermal exchanges that can occur between fluid and solid domains thanks to an explicit iterative method. An extension of SYRTHES has been developed which makes it possible to take into account several fluid codes simultaneously, using 'message passing' computer tools such as the Parallel Virtual Machine (PVM) and the code coupling software CALCIUM developed by the Direction of Studies and Researches (DER) of EdF. Various examples illustrate the interest of such a numerical tool. (J.S.) 12 refs.
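
The explicit iterative exchange that SYRTHES performs between fluid and solid domains can be illustrated on a toy one-dimensional conjugate heat transfer problem: the "fluid" hands a convective flux to the "solid", the solid returns an updated wall temperature, and the exchange is under-relaxed until the interface is consistent. All numbers and the relaxation factor are illustrative assumptions, not values from the paper:

```python
# Toy Dirichlet-Neumann coupling in the spirit of an explicit fluid/solid exchange.
h, k, L = 50.0, 2.0, 0.1          # convection coeff. (W/m2K), conductivity (W/mK), slab width (m)
T_fluid, T_cold = 400.0, 300.0    # fluid bulk and solid back-face temperatures (K)
omega = 0.5                       # under-relaxation factor (stabilizes the exchange)

T_wall = 300.0                    # initial interface guess
for _ in range(200):
    q = h * (T_fluid - T_wall)            # "fluid" solver: convective flux to the wall
    T_new = T_cold + q * L / k            # "solid" solver: steady conduction through the slab
    T_wall += omega * (T_new - T_wall)    # explicit relaxed interface update

# Closed-form interface temperature for this toy problem, for comparison:
exact = (T_cold + (h * L / k) * T_fluid) / (1.0 + h * L / k)
print(T_wall, exact)
```

Without the relaxation (omega = 1) this particular parameter set would diverge, which is why partitioned couplings of this kind typically under-relax the exchanged quantities.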

  14. Coupling of the SYRTHES thermal code with the ESTET or N3S fluid mechanics codes; Couplage du code de thermique SYRTHES et des codes de mecanique des fluides ESTET ou N3S

    Energy Technology Data Exchange (ETDEWEB)

    Peniguel, C. [Electricite de France (EDF), 78 - Chatou (France). Direction des Etudes et Recherches; Rupp, I. [Simulog, 78 (France)

    1997-12-31

    Thermal aspects arise in several industrial applications with which Electricite de France (EdF) is concerned. In most cases, several physical phenomena such as conduction, radiation and convection are involved in the thermal transfers. The aim of this paper is to present a numerical tool adapted to industrial configurations which couples fluid convection (resolved with ESTET in finite volumes or with N3S in finite elements) and radiant heat transfer between walls (resolved with SYRTHES using a radiosity method). SYRTHES manages the different thermal exchanges that can occur between fluid and solid domains thanks to an explicit iterative method. An extension of SYRTHES has been developed which makes it possible to take into account several fluid codes simultaneously, using 'message passing' computer tools such as the Parallel Virtual Machine (PVM) and the code coupling software CALCIUM developed by the Direction of Studies and Researches (DER) of EdF. Various examples illustrate the interest of such a numerical tool. (J.S.) 12 refs.

  15. Code Generation by Model Transformation : A Case Study in Transformation Modularity

    NARCIS (Netherlands)

    Hemel, Z.; Kats, L.C.L.; Visser, E.

    2008-01-01

    Preprint of paper published in: Theory and Practice of Model Transformations (ICMT 2008), Lecture Notes in Computer Science 5063; doi:10.1007/978-3-540-69927-9_13 The realization of model-driven software development requires effective techniques for implementing code generators for domain-specific

  16. Coding potential of the products of alternative splicing in human.

    KAUST Repository

    Leoni, Guido

    2011-01-20

    BACKGROUND: Analysis of the human genome has revealed that as much as an order of magnitude more of the genomic sequence is transcribed than accounted for by the predicted and characterized genes. A number of these transcripts are alternatively spliced forms of known protein coding genes; however, it is becoming clear that many of them do not necessarily correspond to a functional protein. RESULTS: In this study we analyze alternative splicing isoforms of human gene products that are unambiguously identified by mass spectrometry and compare their properties with those of isoforms of the same genes for which no peptide was found in publicly available mass spectrometry datasets. We analyze them in detail for the presence of uninterrupted functional domains, active sites as well as the plausibility of their predicted structure. We report how well each of these strategies and their combination can correctly identify translated isoforms and derive a lower limit for their specificity, that is, their ability to correctly identify non-translated products. CONCLUSIONS: The most effective strategy for correctly identifying translated products relies on the conservation of active sites, but it can only be applied to a small fraction of isoforms, while a reasonably high coverage, sensitivity and specificity can be achieved by analyzing the presence of non-truncated functional domains. Combining the latter with an assessment of the plausibility of the modeled structure of the isoform increases both coverage and specificity with a moderate cost in terms of sensitivity.
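
The combination strategy described in the conclusions can be read as a small rule cascade over per-isoform evidence: active-site conservation is decisive where available, otherwise intact functional domains and a plausible modeled structure are required together. The field names below are hypothetical, not from the paper:

```python
def predict_translated(iso):
    """iso: dict of evidence flags -> True if the isoform is predicted translated.

    Illustrative rule cascade: 'active_site_conserved' is None when no active
    site is annotated (the common case), in which case the combined
    domain + structure criterion decides.
    """
    if iso.get("active_site_conserved") is not None:
        return iso["active_site_conserved"]          # strongest evidence, rarely applicable
    return iso["domains_intact"] and iso["structure_plausible"]

print(predict_translated({"active_site_conserved": None,
                          "domains_intact": True,
                          "structure_plausible": True}))
```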

  17. Coding potential of the products of alternative splicing in human.

    KAUST Repository

    Leoni, Guido; Le Pera, Loredana; Ferrè , Fabrizio; Raimondo, Domenico; Tramontano, Anna

    2011-01-01

    BACKGROUND: Analysis of the human genome has revealed that as much as an order of magnitude more of the genomic sequence is transcribed than accounted for by the predicted and characterized genes. A number of these transcripts are alternatively spliced forms of known protein coding genes; however, it is becoming clear that many of them do not necessarily correspond to a functional protein. RESULTS: In this study we analyze alternative splicing isoforms of human gene products that are unambiguously identified by mass spectrometry and compare their properties with those of isoforms of the same genes for which no peptide was found in publicly available mass spectrometry datasets. We analyze them in detail for the presence of uninterrupted functional domains, active sites as well as the plausibility of their predicted structure. We report how well each of these strategies and their combination can correctly identify translated isoforms and derive a lower limit for their specificity, that is, their ability to correctly identify non-translated products. CONCLUSIONS: The most effective strategy for correctly identifying translated products relies on the conservation of active sites, but it can only be applied to a small fraction of isoforms, while a reasonably high coverage, sensitivity and specificity can be achieved by analyzing the presence of non-truncated functional domains. Combining the latter with an assessment of the plausibility of the modeled structure of the isoform increases both coverage and specificity with a moderate cost in terms of sensitivity.

  18. Parametric time-frequency domain spatial audio

    CERN Document Server

    Delikaris-Manias, Symeon; Politis, Archontis

    2018-01-01

    This book provides readers with the principles and best practices in spatial audio signal processing. It describes how sound fields and their perceptual attributes are captured and analyzed within the time-frequency domain, how essential representation parameters are coded, and how such signals are efficiently reproduced for practical applications. The book is split into four parts starting with an overview of the fundamentals. It then goes on to explain the reproduction of spatial sound before offering an examination of signal-dependent spatial filtering. The book finishes with coverage of both current and future applications and the direction that spatial audio research is heading in. Parametric Time-frequency Domain Spatial Audio focuses on applications in entertainment audio, including music, home cinema, and gaming--covering the capturing and reproduction of spatial sound as well as its generation, transduction, representation, transmission, and perception. This book will teach readers the tools needed...

  19. Electrical safety code manual: a plain language guide to the National Electrical Code, OSHA and NFPA 70E

    CERN Document Server

    Keller, Kimberley

    2010-01-01

    Safety in any workplace is extremely important. In the case of the electrical industry, safety is critical and the codes and regulations which determine safe practices are both diverse and complicated. Employers, electricians, electrical system designers, inspectors, engineers and architects must comply with safety standards listed in the National Electrical Code, OSHA and NFPA 70E. Unfortunately, the publications which list these safety requirements are written in very technically advanced terms and the average person has an extremely difficult time understanding exactly what they need to

  20. Machine-Checked Sequencer for Critical Embedded Code Generator

    Science.gov (United States)

    Izerrouken, Nassima; Pantel, Marc; Thirioux, Xavier

    This paper presents the development of a correct-by-construction block sequencer for GeneAuto, a qualifiable (according to the DO178B/ED12B recommendation) automatic code generator. It transforms Simulink models into MISRA C code for safety-critical systems. Our approach, which combines a classical development process with formal specification and verification using proof assistants, led to preliminary fruitful exchanges with certification authorities. We present parts of the classical user and tool requirements and the derived formal specifications, implementation and verification for the correctness and termination of the block sequencer. This sequencer has been successfully applied to real-size industrial use cases from various transportation-domain partners and led to the detection of requirement errors and a correct-by-construction implementation.
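
A block sequencer of this kind must order blocks consistently with their data-flow dependencies. A minimal sketch using Kahn's topological-sort algorithm; the block names and graph encoding are illustrative, not GeneAuto's actual representation:

```python
from collections import deque

def sequence(blocks, edges):
    """Return a valid execution order for blocks, given (producer, consumer)
    edges; raise if a dependency cycle (algebraic loop) prevents scheduling."""
    succ = {b: [] for b in blocks}
    indeg = {b: 0 for b in blocks}
    for src, dst in edges:
        succ[src].append(dst)
        indeg[dst] += 1
    ready = deque(b for b in blocks if indeg[b] == 0)   # blocks with all inputs available
    order = []
    while ready:
        b = ready.popleft()
        order.append(b)
        for nxt in succ[b]:
            indeg[nxt] -= 1
            if indeg[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(blocks):
        raise ValueError("algebraic loop: no static schedule exists")
    return order

print(sequence(["In", "Gain", "Sum", "Out"],
               [("In", "Gain"), ("In", "Sum"), ("Gain", "Sum"), ("Sum", "Out")]))
```

Proving termination and correctness of exactly this kind of worklist loop is the sort of obligation the paper discharges in a proof assistant.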

  1. Selected DOE Headquarters publications, October 1977-September 1979

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-11-01

    This sixth issue of cumulative listings of DOE Headquarters publications covers the first two years of the Department's operation (October 1, 1977 - September 30, 1979). It lists two groups of publications issued by then-existing Headquarters organizations and provides an index to their title keywords. The two groups of publications are publications assigned a DOE/XXX-type report number code and Headquarters contractor reports prepared by contractors (and published by DOE) to describe research and development work they have performed for the Department. Certain publications are omitted. They include such items as pamphlets, fact sheets, bulletins, newsletters, and telephone directories, as well as headquarters publications issued under the DOE-tr (DOE translation) and CONF (conference proceedings) codes, and technical reports from the Jet Propulsion Laboratory and NASA issued under DOE/JPL and DOE/NASA codes. The contents of this issue will not be repeated in subsequent issues of DOE/AD-0010. (RWR)

  2. The finite-difference time-domain method for electromagnetics with Matlab simulations

    CERN Document Server

    Elsherbeni, Atef Z

    2016-01-01

    This book introduces the powerful Finite-Difference Time-Domain method to students and interested researchers and readers. An effective introduction is accomplished using a step-by-step process that builds competence and confidence in developing complete working codes for the design and analysis of various antennas and microwave devices.
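
The step-by-step working codes the book builds up can be illustrated with a bare-bones one-dimensional FDTD loop. This NumPy version (normalized units, Courant number 1, fixed zero-field ends, illustrative parameters) is a sketch in the same spirit, not the book's MATLAB code:

```python
import numpy as np

nx, nt = 200, 250
ez = np.zeros(nx)          # electric field samples at integer grid points
hy = np.zeros(nx - 1)      # magnetic field staggered half a cell (Yee grid)

for n in range(nt):
    hy += np.diff(ez)                           # H update from the spatial curl of E
    ez[1:-1] += np.diff(hy)                     # E update from the spatial curl of H
    ez[50] += np.exp(-((n - 30) / 10) ** 2)     # soft Gaussian source at cell 50
    # ez[0] and ez[-1] stay zero: perfect-electric-conductor ends

print(float(np.max(np.abs(ez))))
```

With the update coefficients normalized to 1 this runs exactly at the 1D Courant limit; real antenna/microwave models add material coefficients and absorbing boundaries on the same skeleton.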

  3. A Domain-Specific Language for Generic Interlocking Models and Their Properties

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2017-01-01

    The aim of this work is to provide a domain-specific language for generic models and an instantiator tool taking not only configuration data but also a generic model as input, instead of using a hard-coded generator for instantiating only one fixed generic model and its properties with configuration data.

  4. Assessing the INTERTRAN code for application in Asian environs

    International Nuclear Information System (INIS)

    Yoshimura, S.

    1986-10-01

    A Japanese study, which was carried out as part of the IAEA Coordinated Research Programme on Radiation Protection Implications of Transport Accidents Involving Radioactive Materials, provided evaluations of the transport conditions of nuclear fuel in Japan. Nuclear fuel is transported in Japan in the form of UO2, UF6, fresh fuel assemblies and spent fuel. Based on these transport conditions, calculations were made using the INTERTRAN code, which was developed as part of the IAEA Coordinated Research Programme on Safe Transport of Radioactive Materials (1980-1985), for assessing doses to workers and to the public due to the transport of nuclear fuel. As part of the study, a new code was developed for evaluating the radiological impacts of the transport of radioactive materials. This code was also used for assessing doses from the transport of nuclear fuel in Japan. The results indicate that doses to workers and to the public due to the incident-free transport of nuclear fuel are low, i.e., of the order of 1-30 man·mSv/100 km. The doses calculated by the Japanese code were in general slightly smaller than those calculated using the INTERTRAN code. The study concerned normal conditions of transport, i.e., no impact from incidents or accidents was evaluated. The study resulted, in addition, in some suggestions for further development of the INTERTRAN code.

  5. Publications, 1977-1979

    International Nuclear Information System (INIS)

    Hilborn, H.S.

    1980-03-01

    This is a compilation of documents that communicate the results of scientific and technical work done at Savannah River. The compilation includes those documents that have been published (research and development reports, journal articles, book chapters, etc.), and documents that have been announced in Energy Research Abstracts. Where applicable, the meeting at which the paper was presented is given. This report updates the information included in DP-929, Rev. 2, Publications, 1951 through 1971 and DP-929-1, Publications, 1972 through 1976, and replaces DP-929-1, Supplements 1 and 2. The bibliographic listing is arranged alphabetically by the first-mentioned author of each document. The listing includes an identifying code number. This identifying code is used to cross reference the author and subject listings with the bibliographic information. The subject listing is arranged alphabetically by key word out of context (KWOC) indexing of the titles. The listing includes the identifying code number and the complete title. The author listing includes in alphabetical order all authors, coauthors, editors, and compilers with identifying code numbers, and the titles of all documents on which their names appear
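
The KWOC (key word out of context) subject listing described above is mechanical to produce: every significant title word becomes an index entry carrying the document's identifying code and complete title. A minimal sketch with an illustrative stopword list and hypothetical document codes:

```python
STOP = {"the", "of", "a", "an", "and", "in", "for", "on", "to"}   # illustrative stoplist

def kwoc_index(documents):
    """documents: iterable of (code, title) pairs -> {keyword: [(code, title), ...]}"""
    index = {}
    for code, title in documents:
        for word in title.lower().split():
            word = word.strip(".,:;()")
            if word and word not in STOP:
                index.setdefault(word, []).append((code, title))
    return dict(sorted(index.items()))           # alphabetical subject listing

docs = [("DP-101", "Corrosion of Reactor Materials"),
        ("DP-102", "Reactor Fuel Reprocessing")]
idx = kwoc_index(docs)
print(idx["reactor"])
```

Each keyword entry retains the identifying code, which is what lets the subject and author listings cross-reference the bibliographic listing.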

  6. Performance testing of thermal analysis codes for nuclear fuel casks

    International Nuclear Information System (INIS)

    Sanchez, L.C.

    1987-01-01

    In 1982 Sandia National Laboratories held the First Industry/Government Joint Thermal and Structural Codes Information Exchange and presented the initial stages of an investigation of thermal analysis computer codes for use in the design of nuclear fuel shipping casks. The objectives of the investigation were to (1) document publicly available computer codes, (2) assess code capabilities as determined from their users' manuals, and (3) assess code performance on cask-like model problems. Computer codes are required to handle the thermal phenomena of conduction, convection and radiation. Several of the available thermal computer codes were tested on a set of model problems to assess performance on cask-like problems. Solutions obtained with the computer codes for steady-state thermal analysis were in good agreement, while the solutions for transient thermal analysis differed slightly among the computer codes due to modeling differences.
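
The benchmark observation, close agreement at steady state with small transient differences, can be illustrated on a toy conduction problem: an explicit transient march relaxes to the exact linear steady profile regardless of the time-stepping details. All values below are illustrative, not from the benchmark set:

```python
import numpy as np

nx = 21
r = 0.4                              # alpha*dt/dx**2; explicit stability needs r <= 0.5
T = np.zeros(nx)
T[0], T[-1] = 100.0, 0.0             # fixed boundary temperatures (cask-wall style)

for _ in range(2000):                # march the transient to (near) steady state
    T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])

steady = np.linspace(100.0, 0.0, nx)   # exact steady 1D conduction profile
print(float(np.max(np.abs(T - steady))))
```

Any consistent scheme converges to the same linear profile, which is why steady-state results are the natural point of agreement between otherwise different codes.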

  7. SRAC2006: A comprehensive neutronics calculation code system

    International Nuclear Information System (INIS)

    Okumura, Keisuke; Kugo, Teruhiko; Kaneko, Kunio; Tsuchihashi, Keichiro

    2007-02-01

    The SRAC is a code system applicable to neutronics analysis of a variety of reactor types. Since the publication of the second version of the users manual (JAERI-1302) in 1986 for the SRAC system, a number of additions and modifications to the functions and the library data have been made to establish a comprehensive neutronics code system. The current system includes major neutron data libraries (JENDL-3.3, JENDL-3.2, ENDF/B-VII, ENDF/B-VI.8, JEFF-3.1, JEF-2.2, etc.), and integrates five elementary codes for neutron transport and diffusion calculation: PIJ, based on the collision probability method and applicable to 16 kinds of lattice models; the SN transport codes ANISN (1D) and TWOTRAN (2D); and the diffusion codes TUD (1D) and CITATION (multi-D). The system also includes an auxiliary code, COREBN, for multi-dimensional core burn-up calculation. (author)
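
As a flavor of the elementary calculation such diffusion modules generalize to multigroup and multi-dimension, here is a one-group, one-dimensional fixed-source diffusion solve with zero-flux boundaries. The cross sections and mesh are illustrative assumptions; this is not SRAC code:

```python
import numpy as np

nx, h = 50, 0.5                 # mesh cells, mesh width (cm)
D, siga, src = 1.2, 0.12, 1.0   # diffusion coeff. (cm), absorption (1/cm), uniform source

# Finite-difference form of  -D*phi'' + siga*phi = src  with zero flux outside.
A = np.zeros((nx, nx))
for i in range(nx):
    A[i, i] = 2 * D / h**2 + siga
    if i > 0:
        A[i, i - 1] = -D / h**2
    if i < nx - 1:
        A[i, i + 1] = -D / h**2
phi = np.linalg.solve(A, np.full(nx, src))

# Far from the boundaries the flux approaches the infinite-medium value src/siga.
print(float(phi[nx // 2]), src / siga)
```

The multigroup, multi-dimensional versions in codes like TUD and CITATION solve the same kind of sparse linear system, coupled across energy groups.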

  8. International Accreditation of ASME Codes and Standards

    International Nuclear Information System (INIS)

    Green, Mervin R.

    1989-01-01

    ASME established a Boiler Code Committee to develop rules for the design, fabrication and inspection of boilers. This year we recognize 75 years of that Code and will publish a history of that 75 years. The first Code and subsequent editions provided for a Code Symbol Stamp or mark which could be affixed by a manufacturer to a newly constructed product to certify that the manufacturer had designed, fabricated and had inspected it in accordance with Code requirements. The purpose of the ASME Mark is to identify those boilers that meet ASME Boiler and Pressure Vessel Code requirements. Through thousands of updates over the years, the Code has been revised to reflect technological advances and changing safety needs. Its scope has been broadened from boilers to include pressure vessels, nuclear components and systems. Proposed revisions to the Code are published for public review and comment four times per year and revisions and interpretations are published annually; it's a living and constantly evolving Code. You and your organizations are a vital part of the feedback system that keeps the Code alive. Because of this dynamic Code, we no longer have columns in newspapers listing boiler explosions. Nevertheless, it has been argued recently that ASME should go further in internationalizing its Code. Specifically, representatives of several countries, have suggested that ASME delegate to them responsibility for Code implementation within their national boundaries. The question is, thus, posed: Has the time come to franchise responsibility for administration of ASME's Code accreditation programs to foreign entities or, perhaps, 'institutes.' And if so, how should this be accomplished?

  9. Parallelization of a three-dimensional whole core transport code DeCART

    Energy Technology Data Exchange (ETDEWEB)

    Jin Young, Cho; Han Gyu, Joo; Ha Yong, Kim; Moon-Hee, Chang [Korea Atomic Energy Research Institute, Yuseong-gu, Daejon (Korea, Republic of)

    2003-07-01

    Parallelization of the DeCART (deterministic core analysis based on ray tracing) code is presented that reduces the tremendous computing time and memory required in three-dimensional whole-core transport calculations. The parallelization employs the concept of MPI grouping as well as a mixed MPI/OpenMP scheme. Since most of the computing time and memory are used in the MOC (method of characteristics) and multi-group CMFD (coarse mesh finite difference) calculations in DeCART, variables and subroutines related to these two modules are the primary targets for parallelization. Specifically, the ray tracing module was parallelized using a planar domain decomposition scheme and an angular domain decomposition scheme. The parallel performance of the DeCART code is evaluated by solving a rodded variation of the C5G7MOX three-dimensional benchmark problem and a simplified three-dimensional SMART PWR core problem. In the C5G7MOX problem with 24 CPUs, a speedup of at most 21 is obtained on an IBM Regatta machine and 22 on a LINUX cluster in the MOC kernel, which indicates good parallel performance of the DeCART code. In the simplified SMART problem, the memory requirement of about 11 GBytes in the single-processor case reduces to 940 MBytes with 24 processors, which means that the DeCART code can now solve large core problems with affordable LINUX clusters. (authors)
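
At their simplest, the planar and angular decompositions described above reduce to handing each MPI rank a contiguous slab of axial planes and a subset of discrete angles. A sketch of the index bookkeeping; the 4 x 2 rank layout and sizes are illustrative, not DeCART's actual scheme:

```python
def partition(n_items, n_parts, part):
    """Contiguous block partition of range(n_items): which indices part owns."""
    base, extra = divmod(n_items, n_parts)
    start = part * base + min(part, extra)
    return range(start, start + base + (1 if part < extra else 0))

n_planes, n_angles = 20, 16
plane_groups, angle_groups = 4, 2        # 8 ranks arranged as a 4 x 2 grouping

for rank in range(plane_groups * angle_groups):
    p, a = divmod(rank, angle_groups)    # (plane-group, angle-group) coordinates
    planes = partition(n_planes, plane_groups, p)
    angles = partition(n_angles, angle_groups, a)
    print(rank, list(planes), list(angles))
```

Memory scales down with the plane count per rank, which is the effect behind the 11 GByte to 940 MByte reduction quoted in the abstract.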

  10. A comparison of two fully coupled codes for integrated dynamic analysis of floating vertical axis wind turbines

    NARCIS (Netherlands)

    Koppenol, Boy; Cheng, Zhengshun; Gao, Zhen; Simao Ferreira, C.; Moan, T; Tande, John Olav Giæver; Kvamsdal, Trond; Muskulus, Michael

    2017-01-01

    This paper presents a comparison of two state-of-the-art codes that are capable of modelling floating vertical axis wind turbines (VAWTs) in fully coupled time-domain simulations, being the HAWC2 by DTU and the SIMO-RIFLEX-AC code by NTNU/MARINTEK. The comparative study focusses on the way

  11. Operational Cybersecurity Risks and Their Effect on Adoption of Additive Manufacturing in the Naval Domain

    Science.gov (United States)

    2017-12-01

    Operational Cybersecurity Risks and Their Effect on Adoption of Additive Manufacturing in the Naval Domain, by Michael D. Grimshaw, December 2017 (thesis). Abstract (maximum 200 words): Additive manufacturing (AM) has been proven to provide multiple benefits over traditional

  12. Code of Conduct for wind-power projects - Feasibility study; Code of Conduct fuer windkraftprojekte. Machbarkeitsstudie - Schlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Strub, P. [Pierre Strub, freischaffender Berater, Binningen (Switzerland); Ziegler, Ch. [Inter Act, Basel (Switzerland)

    2009-02-15

    This final report deals with the results of a feasibility study concerning the development of a Code of Conduct for wind-power projects. The aim is to strengthen the acceptance of wind-power by the general public. The necessity of new, voluntary market instruments is discussed. The urgency of development in this area is quoted as being high, and the authors consider the feasibility of the definition of a code of conduct as being proven. The code of conduct can, according to the authors, be of use at various levels but primarily in project development. Further free-enterprise instruments are also suggested that should help support socially compatible and successful market development. It is noted that the predominant portion of those questioned are prepared to co-operate in further work on the subject

  13. On the Comparative Performance Analysis of Turbo-Coded Non-Ideal Single-Carrier and Multi-Carrier Waveforms over Wideband Vogler-Hoffmeyer HF Channels

    Directory of Open Access Journals (Sweden)

    F. Genc

    2014-09-01

    The purpose of this paper is to compare turbo-coded Orthogonal Frequency Division Multiplexing (OFDM) and turbo-coded Single Carrier Frequency Domain Equalization (SC-FDE) systems under the effects of Carrier Frequency Offset (CFO), Symbol Timing Offset (STO) and phase noise in the wideband Vogler-Hoffmeyer HF channel model. In mobile communication systems, multipath propagation occurs, so channel estimation and equalization are additionally necessary. Furthermore, a non-ideal local oscillator is generally misaligned with the operating frequency at the receiver, which causes carrier frequency offset. Hence, a very efficient, low-complexity frequency-domain channel estimation and equalization is implemented in this paper for the coded SC-FDE and coded OFDM systems. In addition, cyclic prefix (CP) based synchronization corrects the clock and carrier frequency offsets. The simulations show that the non-ideal turbo-coded OFDM system has better performance, with greater diversity, than the non-ideal turbo-coded SC-FDE system in the HF channel.
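
The CP-based synchronization mentioned above exploits the fact that the cyclic prefix repeats the tail of each OFDM symbol, so a carrier frequency offset appears as a fixed phase rotation between the two copies, recoverable from their correlation. A noiseless, channel-free sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
N, Ncp, eps = 64, 16, 0.21          # FFT size, CP length, CFO in subcarrier spacings

sym = rng.standard_normal(N) + 1j * rng.standard_normal(N)   # one time-domain OFDM symbol
tx = np.concatenate([sym[-Ncp:], sym])                       # prepend the cyclic prefix
n = np.arange(N + Ncp)
rx = tx * np.exp(2j * np.pi * eps * n / N)                   # apply the frequency offset

# Each CP sample and its copy N samples later differ by exactly 2*pi*eps in
# phase, so the angle of their correlation recovers eps (unambiguous for |eps| < 0.5).
corr = np.vdot(rx[:Ncp], rx[N:N + Ncp])
eps_hat = np.angle(corr) / (2 * np.pi)
print(eps_hat)
```

With noise and multipath the same correlation is averaged over many symbols; the estimator's peak location also yields symbol timing, which is the "clock" part of the synchronization.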

  14. Time domain spectral phase encoding/DPSK data modulation using single phase modulator for OCDMA application.

    Science.gov (United States)

    Wang, Xu; Gao, Zhensen; Kataoka, Nobuyuki; Wada, Naoya

    2010-05-10

    A novel scheme using a single phase modulator for simultaneous time-domain spectral phase encoding (SPE) signal generation and DPSK data modulation is proposed and experimentally demonstrated. Array-Waveguide-Grating and Variable-Bandwidth-Spectrum-Shaper based devices can be used for decoding the signal directly in the spectral domain. The effects of fiber dispersion, light pulse width and timing error on the coding performance have been investigated by simulation and verified in experiment. In the experiment, an SPE signal with 8-chip, 20 GHz/chip optical code patterns has been generated and modulated with 2.5 Gbps DPSK data using a single modulator. Transmission of the 2.5 Gbps data over 34 km of fiber has been demonstrated, indicating the scheme's potential for optical code division multiple access (OCDMA) and secure optical communication applications. (c) 2010 Optical Society of America.

  15. Experimental benchmark of non-local-thermodynamic-equilibrium plasma atomic physics codes

    International Nuclear Information System (INIS)

    Nagels-Silvert, V.

    2004-09-01

    The main purpose of this thesis is to obtain experimental data for the testing and validation of atomic physics codes dealing with non-local-thermodynamic-equilibrium plasmas. The first part is dedicated to the spectroscopic study of xenon and krypton plasmas produced by a nanosecond laser pulse interacting with a gas jet. A Thomson scattering diagnostic has allowed us to measure independently plasma parameters such as the electron temperature, the electron density and the average ionisation state. We have obtained time-integrated spectra in the range between 5 and 10 angstroms. We have identified about one hundred xenon lines between 8.6 and 9.6 angstroms using the Relac code. We have found previously unreported krypton lines between 5.2 and 7.5 angstroms. In a second experiment we have extended the wavelength range to the XUV domain. The Averroes/Transpec code has been tested in the ranges from 9 to 15 angstroms and from 10 to 130 angstroms; the first range has been well reproduced, while the second requires a more complex data analysis. The second part is dedicated to the spectroscopic study of aluminium, selenium and samarium plasmas in the femtosecond regime. We have designed a frequency-domain interferometry diagnostic that has allowed us to measure the expansion speed of the target's back side. Using an adequate isothermal model, this parameter has given us the plasma electron temperature. Spectra and emission times of various lines from the aluminium and selenium plasmas have been reproduced satisfactorily with the Averroes/Transpec code coupled with the Film and Multif hydrodynamical codes. (A.C.)

  16. Code manual for MACCS2: Volume 1, user's guide

    International Nuclear Information System (INIS)

    Chanin, D.I.; Young, M.L.

    1997-03-01

    This report describes the use of the MACCS2 code. The document is primarily a user's guide, though some model description information is included. MACCS2 represents a major enhancement of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, distributed by government code centers since 1990, was developed to evaluate the impacts of severe accidents at nuclear power plants on the surrounding public. The principal phenomena considered are atmospheric transport and deposition under time-variant meteorology, short- and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. No other U.S. code that is publicly available at present offers all these capabilities. MACCS2 was developed as a general-purpose tool applicable to diverse reactor and nonreactor facilities licensed by the Nuclear Regulatory Commission or operated by the Department of Energy or the Department of Defense. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency-response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. Other improvements are in the areas of phenomenological modeling and new output options. Initial installation of the code, written in FORTRAN 77, requires a 486 or higher IBM-compatible PC with 8 MB of RAM.

  17. 76 FR 9636 - Public Meeting

    Science.gov (United States)

    2011-02-18

    ... DEPARTMENT OF THE TREASURY United States Mint Public Meeting ACTION: Notification of Citizens Coinage Advisory Committee March 1, 2011, Public Meeting. SUMMARY: Pursuant to United States Code, Title... (CCAC) public meeting scheduled for March 1, 2011. Date: March 1, 2011. Time: 10 a.m. to 1 p.m. Location...

  18. Psychometric characteristics of a public-domain self-report measure of vocational interests: the Oregon Vocational Interest Scales.

    Science.gov (United States)

    Pozzebon, Julie A; Visser, Beth A; Ashton, Michael C; Lee, Kibeom; Goldberg, Lewis R

    2010-03-01

    We investigated the psychometric properties of the Oregon Vocational Interest Scales (ORVIS), a brief public-domain alternative to commercial inventories, in a large community sample and in a college sample. In both samples, we examined the factor structure, scale intercorrelations, and personality correlates of the ORVIS, and in the community sample, we also examined the correlations of the ORVIS scales with cognitive abilities and with the scales of a longer, proprietary interest survey. In both samples, all 8 scales (Leadership, Organization, Altruism, Creativity, Analysis, Producing, Adventuring, and Erudition) showed wide variation in scores, high internal-consistency reliabilities, and a pattern of high convergent and low discriminant correlations with the scales of the proprietary interest survey. Overall, the results support the construct validity of the scales, which are recommended for use in research on vocational interests and other individual differences.

  19. UNIPIC code for simulations of high power microwave devices

    International Nuclear Information System (INIS)

    Wang Jianguo; Zhang Dianhui; Wang Yue; Qiao Hailiang; Li Xiaoze; Liu Chunliang; Li Yongdong; Wang Hongguang

    2009-01-01

    In this paper, UNIPIC code, a new member in the family of fully electromagnetic particle-in-cell (PIC) codes for simulations of high power microwave (HPM) generation, is introduced. In the UNIPIC code, the electromagnetic fields are updated using the second-order, finite-difference time-domain (FDTD) method, and the particles are moved using the relativistic Newton-Lorentz force equation. The convolutional perfectly matched layer method is used to truncate the open boundaries of HPM devices. To model curved surfaces and avoid the time-step reduction of the conformal-path (CP) FDTD method, a CP weakly conditionally stable FDTD (CP WCS FDTD) method, which combines the WCS FDTD and CP-FDTD methods, is implemented. UNIPIC is two-and-a-half dimensional, is written in the object-oriented C++ language, and can be run on a variety of platforms including WINDOWS, LINUX, and UNIX. Users can use the graphical user interface to create the geometric structures of the simulated HPM devices, or load previously created structures. Numerical experiments on some typical HPM devices by using the UNIPIC code are given. The results are compared to those obtained from some well-known PIC codes, which agree well with each other.
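
    The field-update half of such a PIC loop can be illustrated with a minimal one-dimensional leapfrog FDTD scheme. This sketch is not UNIPIC code (UNIPIC is a 2.5-D C++ code with CPML boundaries and a relativistic particle push); the grid size, Courant number, and source parameters here are arbitrary:

```python
import math

def fdtd_1d(steps=200, nz=200, courant=0.5):
    """Leapfrog Yee update for Ez/Hy on a 1-D grid, normalized units."""
    ez = [0.0] * nz          # E nodes at integer grid points
    hy = [0.0] * (nz - 1)    # H nodes staggered half a cell between them
    for t in range(steps):
        # H update from the spatial difference of E (second-order centered)
        for i in range(nz - 1):
            hy[i] += courant * (ez[i + 1] - ez[i])
        # E update from the spatial difference of H; ez[0], ez[-1] stay 0 (PEC walls)
        for i in range(1, nz - 1):
            ez[i] += courant * (hy[i] - hy[i - 1])
        # Soft Gaussian source injected near the left boundary
        ez[10] += math.exp(-((t - 40) / 12.0) ** 2)
    return ez

field = fdtd_1d()
```

    With the Courant number at or below 1 the explicit leapfrog update is stable; the weakly-conditionally-stable variants mentioned in the abstract exist precisely to relax this time-step constraint near fine conformal cells.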

  20. UNIPIC code for simulations of high power microwave devices

    Science.gov (United States)

    Wang, Jianguo; Zhang, Dianhui; Liu, Chunliang; Li, Yongdong; Wang, Yue; Wang, Hongguang; Qiao, Hailiang; Li, Xiaoze

    2009-03-01

    In this paper, UNIPIC code, a new member in the family of fully electromagnetic particle-in-cell (PIC) codes for simulations of high power microwave (HPM) generation, is introduced. In the UNIPIC code, the electromagnetic fields are updated using the second-order, finite-difference time-domain (FDTD) method, and the particles are moved using the relativistic Newton-Lorentz force equation. The convolutional perfectly matched layer method is used to truncate the open boundaries of HPM devices. To model curved surfaces and avoid the time-step reduction of the conformal-path (CP) FDTD method, a CP weakly conditionally stable FDTD (CP WCS FDTD) method, which combines the WCS FDTD and CP-FDTD methods, is implemented. UNIPIC is two-and-a-half dimensional, is written in the object-oriented C++ language, and can be run on a variety of platforms including WINDOWS, LINUX, and UNIX. Users can use the graphical user interface to create the geometric structures of the simulated HPM devices, or load previously created structures. Numerical experiments on some typical HPM devices by using the UNIPIC code are given. The results are compared to those obtained from some well-known PIC codes, which agree well with each other.

  1. La apropiación del dominio público y las posibilidades de acceso a los bienes culturales | The appropriation of the public domain and the possibilities of access to cultural goods

    Directory of Open Access Journals (Sweden)

    Joan Ramos Toledano

    2017-06-01

    Full Text Available Intellectual property and copyright legislation provides for a period of protection granting exclusive and temporary economic rights. After a certain term, protected works enter what is called the public domain. This is often regarded as the moment at which cultural goods come under the control and domain of society as a whole. This paper argues that, given our current economic system, the public domain in fact functions more as a business opportunity for certain companies than as a real option for the public to access artistic and intellectual works.

  2. An introduction to using QR codes in scholarly journals

    Directory of Open Access Journals (Sweden)

    Jae Hwa Chang

    2014-08-01

    Full Text Available The Quick Response (QR) code was first developed in 1994 by Denso Wave Incorporated, Japan. From that point on, it came into general use as an identification mark for all kinds of commercial products, advertisements, and other public announcements. In scholarly journals, the QR code is used to provide immediate direction to the journal homepage or to specific content such as figures or videos. Producing a QR code and printing it in the print version or uploading it to the web is very simple: using a QR code generator, an editor enters the target information, such as a web address, and the QR code is produced. A QR code is very stable, such that it can be used for a long time without loss of quality. Producing and adding QR codes to a journal costs nothing; therefore, to increase the visibility of their journals, it is time for editors to add QR codes to their journals.

  3. RAYIC - a numerical code for the study of ion cyclotron heating of large Tokamak plasmas

    International Nuclear Information System (INIS)

    Brambilla, M.

    1984-02-01

    The code RAYIC models the coupling, propagation and absorption of e.m. waves in large axisymmetric plasmas in the ion cyclotron frequency domain. It can be used both to investigate the wave behaviour and as a source of power deposition profiles for use in transport codes. The present user manual, after a brief summary of the physical model, presents the structure of RAYIC, the complete list of input-output variables (calling sequence), and some examples of the output which can be obtained from the code. (orig.)

  4. Development of statistical analysis code for meteorological data (W-View)

    International Nuclear Information System (INIS)

    Tachibana, Haruo; Sekita, Tsutomu; Yamaguchi, Takenori

    2003-03-01

    A computer code (W-View: Weather View) was developed to analyze meteorological data statistically based on 'the guideline of meteorological statistics for the safety analysis of nuclear power reactor' (Nuclear Safety Commission, January 28, 1982; revised March 29, 2001). The code provides the statistical meteorological data needed to assess the public dose in the case of normal operation and severe accidents for the licensing of nuclear reactor operation. It was rewritten from the original version, which ran on a large office computer, so that a personal-computer user can analyze meteorological data simply and conveniently and produce statistical tables and figures of meteorology. (author)

  5. Selected ICAR Data from the SAPA-Project: Development and Initial Validation of a Public-Domain Measure

    Directory of Open Access Journals (Sweden)

    David M. Condon

    2016-01-01

    Full Text Available These data were collected during the initial evaluation of the International Cognitive Ability Resource (ICAR) project. ICAR is an international collaborative effort to develop open-source public-domain tools for cognitive ability assessment, including tools that can be administered in non-proctored environments (e.g., online administration) and tools based on automatic item generation algorithms. These data provide initial validation of the first four ICAR item types as reported in Condon & Revelle [1]. The 4 item types contain a total of 60 items: 9 Letter and Number Series items, 11 Matrix Reasoning items, 16 Verbal Reasoning items and 24 Three-dimensional Rotation items. Approximately 97,000 individuals were administered random subsets of these 60 items using the Synthetic Aperture Personality Assessment method between August 18, 2010 and May 20, 2013. The data are available in RData and CSV formats and are accompanied by documentation stored as a text file. Re-use potential includes a wide range of structural and item-level analyses.

  6. Domain Specific Language Support for Exascale. Final Project Report

    Energy Technology Data Exchange (ETDEWEB)

    Baden, Scott [Univ. of California, San Diego, CA (United States)

    2017-07-11

    The project developed a domain-specific translator to enable legacy MPI source code to tolerate communication delays, which are increasing over time due to technological factors. The translator performs source-to-source translation that incorporates semantic information into the translation process. The output of the translator is a C program that runs as a data-driven program and uses an existing runtime to overlap communication automatically.

  7. Ubiquitin domain proteins in disease

    DEFF Research Database (Denmark)

    Klausen, Louise Kjær; Schulze, Andrea; Seeger, Michael

    2007-01-01

    The human genome encodes several ubiquitin-like (UBL) domain proteins (UDPs). Members of this protein family are involved in a variety of cellular functions and many are connected to the ubiquitin proteasome system, an essential pathway for protein degradation in eukaryotic cells. Despite...... and cancer. Publication history: Republished from Current BioData's Targeted Proteins database (TPdb; http://www.targetedproteinsdb.com)....

  8. Coding as literacy metalithikum IV

    CERN Document Server

    Bühlmann, Vera; Moosavi, Vahid

    2015-01-01

    Recent developments in computer science, particularly "data-driven procedures" have opened a new level of design and engineering. This has also affected architecture. The publication collects contributions on Coding as Literacy by computer scientists, mathematicians, philosophers, cultural theorists, and architects. "Self-Organizing Maps" (SOM) will serve as the concrete reference point for all further discussions.

  9. Data compression and genomes: a two-dimensional life domain map.

    Science.gov (United States)

    Menconi, Giulia; Benci, Vieri; Buiatti, Marcello

    2008-07-21

    We define the complexity of DNA sequences as the information content per nucleotide, calculated by means of some Lempel-Ziv data compression algorithm. It is possible to use the statistics of the complexity values of the functional regions of different complete genomes to distinguish among genomes of different domains of life (Archaea, Bacteria and Eukarya). We shall focus on the distribution function of the complexity of non-coding regions. We show that the three domains may be plotted in separate regions within the two-dimensional space where the axes are the skewness coefficient and the kurtosis coefficient of the aforementioned distribution. Preliminary results on 15 genomes are introduced.
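
    The measure described above, compression-based information content per nucleotide, can be approximated with a simple dictionary parse. This sketch uses an LZ78-style phrase count rather than the specific Lempel-Ziv variant of the paper, and the sequences and the moment formulas are generic illustrations:

```python
import random
import statistics

def lz78_complexity(seq):
    """Number of distinct phrases in an LZ78-style incremental parse."""
    phrases, current, count = set(), "", 0
    for ch in seq:
        current += ch
        if current not in phrases:
            phrases.add(current)
            count += 1
            current = ""
    if current:          # trailing partial phrase
        count += 1
    return count

def complexity_per_nucleotide(seq):
    # Phrase count per symbol: a rough proxy for information content per nucleotide
    return lz78_complexity(seq) / len(seq)

def moments(xs):
    """Skewness and kurtosis of a list of complexity values (population form)."""
    mu, sd = statistics.fmean(xs), statistics.pstdev(xs)
    skew = sum((x - mu) ** 3 for x in xs) / (len(xs) * sd ** 3)
    kurt = sum((x - mu) ** 4 for x in xs) / (len(xs) * sd ** 4)
    return skew, kurt

random.seed(7)
random_seq = "".join(random.choice("ACGT") for _ in range(200))

print(complexity_per_nucleotide("AC" * 100))   # repetitive: low complexity
print(complexity_per_nucleotide(random_seq))   # random: higher complexity
```

    A highly repetitive sequence compresses into few phrases while a random one does not, which is the contrast the genome-wide distributions of such values exploit.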

  10. Iterative Decoding of Concatenated Codes: A Tutorial

    Directory of Open Access Journals (Sweden)

    Phillip A. Regalia

    2005-05-01

    Full Text Available The turbo decoding algorithm of a decade ago constituted a milestone in error-correction coding for digital communications, and has inspired extensions to generalized receiver topologies, including turbo equalization, turbo synchronization, and turbo CDMA, among others. Despite an accrued understanding of iterative decoding over the years, the “turbo principle” remains elusive to master analytically, thereby inciting interest from researchers outside the communications domain. In this spirit, we develop a tutorial presentation of iterative decoding for parallel and serial concatenated codes, in terms hopefully accessible to a broader audience. We motivate iterative decoding as a computationally tractable attempt to approach maximum-likelihood decoding, and characterize fixed points in terms of a “consensus” property between constituent decoders. We review how the decoding algorithm for both parallel and serial concatenated codes coincides with an alternating projection algorithm, which allows one to identify conditions under which the algorithm indeed converges to a maximum-likelihood solution, in terms of particular likelihood functions factoring into the product of their marginals. The presentation emphasizes a common framework applicable to both parallel and serial concatenated codes.

  11. Methods of evaluating the effects of coding on SAR data

    Science.gov (United States)

    Dutkiewicz, Melanie; Cumming, Ian

    1993-01-01

    It is recognized that mean square error (MSE) is not a sufficient criterion for determining the acceptability of an image reconstructed from data that has been compressed and decompressed using an encoding algorithm. In the case of Synthetic Aperture Radar (SAR) data, it is also deemed insufficient to display the reconstructed image (and perhaps the error image) alongside the original and make a (subjective) judgment as to the quality of the reconstructed data. In this paper we suggest a number of additional evaluation criteria which we feel should be included as evaluation metrics in SAR data encoding experiments. These criteria have been specifically chosen to provide a means of ensuring that the important information in the SAR data is preserved. The paper also presents the results of an investigation into the effects of coding on SAR data fidelity when the coding is applied in (1) the signal data domain, and (2) the image domain. An analysis of the results highlights the shortcomings of the MSE criterion, and shows which of the suggested additional criteria are most important.
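
    The inadequacy of MSE as a sole criterion is easy to demonstrate: two reconstructions can have identical MSE while preserving very different amounts of structure. The following toy example is not from the paper; the data and the choice of correlation as a second metric are arbitrary illustrations:

```python
import statistics

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def pearson(a, b):
    sa, sb = statistics.pstdev(a), statistics.pstdev(b)
    if sa == 0 or sb == 0:
        return 0.0   # a constant signal carries no structure to correlate
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) * sa * sb)

# A toy "image" row
x = [1.0, 4.0, 2.0, 8.0, 5.0, 7.0, 3.0, 6.0]
c = statistics.pstdev(x)

recon_offset = [v + c for v in x]              # constant bias: structure preserved
recon_flat = [statistics.fmean(x)] * len(x)    # flattened: structure destroyed

# Both reconstructions have the same MSE (equal to the variance of x),
# yet one keeps all the spatial structure and the other keeps none.
print(mse(x, recon_offset), pearson(x, recon_offset))
print(mse(x, recon_flat), pearson(x, recon_flat))
```

    Metrics sensitive to structure (correlation here, or the SAR-specific criteria the paper proposes) separate the two cases that MSE cannot distinguish.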

  12. [The impact of the code of ethics on training].

    Science.gov (United States)

    Tirand-Martin, Catherine

    2017-09-01

    The publication of the French code of ethics for nurses, in November 2016, has had an impact on the training of student nurses. In this context, appropriation of the code is clearly facilitated by the form and definition of the text which immediately reinforces students' commitment on the path to professionalization. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  13. Advanced codes and methods supporting improved fuel cycle economics - 5493

    International Nuclear Information System (INIS)

    Curca-Tivig, F.; Maupin, K.; Thareau, S.

    2015-01-01

    AREVA's code development program was practically completed in 2014. The basic codes supporting a new generation of advanced methods are the following. GALILEO is a state-of-the-art fuel rod performance code for PWR and BWR applications. Development is completed; implementation started in France and the U.S.A. ARCADIA-1 is a state-of-the-art neutronics/ thermal-hydraulics/ thermal-mechanics code system for PWR applications. Development is completed; implementation started in Europe and in the U.S.A. The system thermal-hydraulic codes S-RELAP5 and CATHARE-2 are not really new but still state-of-the-art in the domain. S-RELAP5 was completely restructured and re-coded, extending its life cycle by further decades. CATHARE-2 will be replaced in the future by the new CATHARE-3. The new AREVA codes and methods are largely based on first-principles modeling with an extremely broad international verification and validation data base. This enables AREVA and its customers to access more predictable licensing processes in a fast-evolving regulatory environment (new safety criteria, requests for enlarged qualification databases, statistical applications, uncertainty propagation...). In this context, the advanced codes and methods and the associated verification and validation represent the key to avoiding penalties on products, on operational limits, or on methodologies themselves

  14. A Parallel Non-Overlapping Domain-Decomposition Algorithm for Compressible Fluid Flow Problems on Triangulated Domains

    Science.gov (United States)

    Barth, Timothy J.; Chan, Tony F.; Tang, Wei-Pai

    1998-01-01

    This paper considers an algebraic preconditioning algorithm for hyperbolic-elliptic fluid flow problems. The algorithm is based on a parallel non-overlapping Schur complement domain-decomposition technique for triangulated domains. In the Schur complement technique, the triangulation is first partitioned into a number of non-overlapping subdomains and interfaces. This suggests a reordering of triangulation vertices which separates subdomain and interface solution unknowns. The reordering induces a natural 2 x 2 block partitioning of the discretization matrix. Exact LU factorization of this block system yields a Schur complement matrix which couples subdomains and the interface together. The remaining sections of this paper present a family of approximate techniques for both constructing and applying the Schur complement as a domain-decomposition preconditioner. The approximate Schur complement serves as an algebraic coarse space operator, thus avoiding the known difficulties associated with the direct formation of a coarse space discretization. In developing Schur complement approximations, particular attention has been given to improving sequential and parallel efficiency of implementations without significantly degrading the quality of the preconditioner. A computer code based on these developments has been tested on the IBM SP2 using MPI message passing protocol. A number of 2-D calculations are presented for both scalar advection-diffusion equations as well as the Euler equations governing compressible fluid flow to demonstrate performance of the preconditioning algorithm.
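
    The block elimination described above can be sketched for a tiny 2 x 2 block-partitioned system. This is a generic pure-Python illustration of the Schur complement identity, not the paper's preconditioner; the matrix values are made up:

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def matsub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def solve(A, B):
    """Gauss-Jordan solve of A X = B with partial pivoting (B is n x m)."""
    n, m = len(A), len(B[0])
    M = [ra[:] + rb[:] for ra, rb in zip(A, B)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [[M[r][n + j] / M[r][r] for j in range(m)] for r in range(n)]

# Block-partitioned system: subdomain unknowns (block 1), interface unknowns (block 2)
A11 = [[4.0, 1.0], [1.0, 3.0]]
A12 = [[0.0, 1.0], [1.0, 0.0]]
A21 = [[0.0, 1.0], [1.0, 0.0]]
A22 = [[5.0, 2.0], [2.0, 6.0]]
f1 = [[1.0], [2.0]]
f2 = [[3.0], [4.0]]

# Schur complement S = A22 - A21 A11^{-1} A12 couples the interface unknowns
S = matsub(A22, matmul(A21, solve(A11, A12)))
g2 = matsub(f2, matmul(A21, solve(A11, f1)))
u2_schur = [row[0] for row in solve(S, g2)]

# Reference: solve the assembled 4 x 4 system directly
A = [A11[0] + A12[0], A11[1] + A12[1], A21[0] + A22[0], A21[1] + A22[1]]
u_full = [row[0] for row in solve(A, f1 + f2)]
```

    Eliminating the subdomain block exactly and solving the reduced interface system reproduces the interface part of the full solution; the paper's contribution is to apply this reduction only approximately, as a preconditioner.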

  15. Verification of gyrokinetic microstability codes with an LHD configuration

    Energy Technology Data Exchange (ETDEWEB)

    Mikkelsen, D. R. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Nunami, M. [National Inst. for Fusion Science (Japan); Watanabe, T. -H. [Nagoya Univ. (Japan); Sugama, H. [National Inst. for Fusion Science (Japan); Tanaka, K. [National Inst. for Fusion Science (Japan)

    2014-11-01

    We extend previous benchmarks of the GS2 and GKV-X codes to verify their algorithms for solving the gyrokinetic Vlasov-Poisson equations for plasma microturbulence. Code benchmarks are the most complete way of verifying the correctness of implementations for the solution of mathematical models for complex physical processes such as those studied here. The linear stability calculations reported here are based on the plasma conditions of an ion-ITB plasma in the LHD configuration. The plasma parameters and the magnetic geometry differ from previous benchmarks involving these codes. We find excellent agreement between the independently written pre-processors that calculate the geometrical coefficients used in the gyrokinetic equations. Grid convergence tests are used to establish the resolution and domain size needed to obtain converged linear stability results. The agreement of the frequencies, growth rates and eigenfunctions in the benchmarks reported here provides additional verification that the algorithms used by the GS2 and GKV-X codes are correctly finding the linear eigenvalues and eigenfunctions of the gyrokinetic Vlasov-Poisson equations.
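
    The grid convergence tests mentioned above follow a standard pattern: compute on two resolutions and estimate the observed order of accuracy from the error ratio. A generic sketch (the scheme and test function here are illustrative, not taken from the benchmark):

```python
import math

def centered_diff(f, x, h):
    # Second-order centered approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2.0 * h)

def observed_order(f, dfdx, x, h):
    """Estimate the convergence order from errors on spacings h and h/2."""
    e_coarse = abs(centered_diff(f, x, h) - dfdx(x))
    e_fine = abs(centered_diff(f, x, h / 2.0) - dfdx(x))
    # Halving h should divide the error by 2**p for a p-th order scheme
    return math.log(e_coarse / e_fine, 2)

order = observed_order(math.sin, math.cos, 1.0, 1e-2)
print(order)   # close to 2 for this second-order scheme
```

    In the benchmark the same idea is applied to growth rates and frequencies: resolution and domain size are increased until the answers stop changing within tolerance.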

  16. The amendment of the Labour Code

    Directory of Open Access Journals (Sweden)

    Jana Mervartová

    2012-01-01

    Full Text Available The amendment of the Labour Code, No. 365/2011 Coll., effective as of 1st January 2012, brings some fundamental changes to labour law. The amendment regulates the relation between the Labour Code and the Civil Code, and it also newly formulates the principles of labour-law relations. The basic period of a fixed-term contract of employment is extended, and the frequency of its conclusion is limited. The length of the trial period and the amount of redundancy payment are graduated. An earlier legislative arrangement under which an employee can be temporarily assigned to work for a different employer has been reinstated. The number of hours under an agreement to perform work is increased. The monetary compensation under a competitive clause is reduced. Other changes are made in the area of collective labour law. The authoress of the article draws attention to the most important changes. She compares the new provisions of the Labour Code with the former legal framework and evaluates their advantages and disadvantages. The main objective of the changes is to make labour-law relations more flexible and to motivate employers to create new jobs. The amended provisions are aimed at reducing employers' expenses under the reform of public finances. Further changes to the Labour Code are expected in connection with the new Civil Code.

  17. Performance, Accuracy and Efficiency Evaluation of a Three-Dimensional Whole-Core Neutron Transport Code AGENT

    International Nuclear Information System (INIS)

    Jevremovic, Tatjana; Hursin, Mathieu; Satvat, Nader; Hopkins, John; Xiao, Shanjie; Gert, Godfree

    2006-01-01

    The AGENT (Arbitrary Geometry Neutron Transport) open-architecture reactor modeling tool is a deterministic neutron transport code for two- or three-dimensional heterogeneous neutronic design and analysis of whole reactor cores regardless of geometry types and material configurations. The AGENT neutron transport methodology is applicable to all generations of nuclear power and research reactors. It combines three theories: (1) the theory of R-functions, used to generate real three-dimensional whole-cores of square, hexagonal or triangular cross sections, (2) the planar method of characteristics, used to solve isotropic neutron transport in non-homogenized two-dimensional (2D) reactor slices, and (3) one-dimensional diffusion theory, used to couple the planar and axial neutron tracks through the transverse leakage and angular mesh-wise flux values. The R-function geometrical module allows a sequential building of the layers of geometry and automatic sub-meshing based on the network of domain functions. The simplicity of geometry description and selection of parameters for accurate treatment of neutron propagation is achieved through Boolean algebra that hierarchically organizes simple primitives into complex domains (both being represented by corresponding domain functions). The accuracy is comparable to Monte Carlo codes and is obtained by following neutron propagation through real geometrical domains, without requiring homogenization or simplifications. The efficiency is maintained through a set of acceleration techniques introduced at all important calculation levels. The flux solution incorporates power iteration with two different acceleration techniques: Coarse Mesh Re-balancing (CMR) and Coarse Mesh Finite Difference (CMFD). The stand-alone, originally developed graphical user interface of the AGENT code design environment allows the user to view and verify input data by displaying the geometry and material distribution.
The user can also view the output data such
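
    The R-function machinery described above composes implicit domain functions with smoothed Boolean operations whose sign still encodes set membership. A minimal sketch follows; the R0-system formulas are standard, but the primitive shapes are arbitrary and this is not AGENT code:

```python
import math

# R-function Boolean operations on implicit domain functions (f >= 0 inside):
# the combined function is positive exactly where the Boolean combination holds.
def r_and(f, g):
    return f + g - math.sqrt(f * f + g * g)

def r_or(f, g):
    return f + g + math.sqrt(f * f + g * g)

# Primitive domains: a unit disk and the half-plane x <= 0.5
def disk(x, y):
    return 1.0 - (x * x + y * y)

def halfplane(x, y):
    return 0.5 - x

# Composite domain: the part of the disk with x <= 0.5
def inside(x, y):
    return r_and(disk(x, y), halfplane(x, y))

print(inside(0.0, 0.0) > 0)   # center: in both primitives, so inside
print(inside(0.9, 0.0) > 0)   # in the disk but past x = 0.5, so outside
```

    Because the composite remains a single smooth function rather than a case split, complex core geometries can be built hierarchically from a few primitives and meshed directly from the sign of the domain function.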

  18. Public health nursing, ethics and human rights.

    Science.gov (United States)

    Ivanov, Luba L; Oden, Tami L

    2013-05-01

    Public health nursing has a code of ethics that guides practice. This includes the American Nurses Association Code of Ethics for Nurses, the Principles of the Ethical Practice of Public Health, and the Scope and Standards of Public Health Nursing. Human rights and rights-based care in public health nursing practice are relatively new. They reflect human rights principles as outlined in the Universal Declaration of Human Rights and applied to public health practice. As our health care system is restructured and there are new advances in technology and genetics, a focus on providing care that is ethical and respects human rights is needed. Public health nurses can be in the forefront of providing care that reflects an ethical base and a rights-based approach to practice with populations. © 2013 Wiley Periodicals, Inc.

  19. Public privacy: Reciprocity and Silence

    Directory of Open Access Journals (Sweden)

    Jenny Kennedy

    2014-10-01

    Full Text Available In his 1958 poem 'Dedication to my Wife' TS Eliot proclaims "these are private words addressed to you in public". Simultaneously written for his wife, Valerie Fletcher, and to the implied you of a discourse network, Eliot's poem helps to illustrate the narrative voices and silences that are constitutive of an intimate public sphere. This paper situates reciprocity as a condition of possibility for public privacy. It shows how reciprocity is enabled by systems of code operating through material and symbolic registers. Code promises to control communication, to produce neutral, systemic forms of meaning. Yet such automation is challenged by uneven and fragmented patterns of reciprocity. Moreover, examining the media of public privacy reveals historical trajectories important for understanding contemporary socio-technical platforms of reciprocity. To explore the implicit requirement of reciprocity in publicly private practices, three sites of communication are investigated framed by a media archaeology perspective: postal networks, the mail-art project PostSecret and the anonymous zine 'You'.

  20. Two-dimensional full-wave code for reflectometry simulations in TJ-II

    International Nuclear Information System (INIS)

    Blanco, E.; Heuraux, S.; Estrada, T.; Sanchez, J.; Cupido, L.

    2004-01-01

    A two-dimensional full-wave code in the extraordinary mode has been developed to simulate reflectometry in TJ-II. The code allows us to study the measurement capabilities of the future correlation reflectometer that is being installed in TJ-II. The code uses the finite-difference time-domain technique to solve Maxwell's equations in the presence of density fluctuations. Boundary conditions are implemented by a perfectly matched layer to simulate free propagation. To assure the stability of the code, the current equations are solved by a fourth-order Runge-Kutta method. Density fluctuation parameters such as fluctuation level, wave numbers, and correlation lengths are extrapolated from those measured at the plasma edge using Langmuir probes. In addition, the realistic plasma shape, density profile, magnetic configuration, and experimental setup of TJ-II are included to determine the plasma regimes in which accurate information may be obtained

  1. Code of practice for radiation protection in veterinary medicine

    International Nuclear Information System (INIS)

    Duffy, J.; Fenton, D.; McGarry, A.; McAllister, H.; Skelly, C

    2002-11-01

    This Code of Practice updates the Code of Practice on Radiation Protection in Veterinary Radiology prepared by the Nuclear Energy Board in June 1989. The Code is designed to give guidance to veterinary surgeons to ensure that they, their employees and members of the public are adequately protected from the hazards of ionising radiation arising from the use of X-ray equipment and radioactive substances in the practice of veterinary medicine. It reflects the regulations as specified in the Radiological Protection Act, 1991, (Ionising Radiation) Order, 2000 (S.I. No. 125 of 2000).

  2. Aeroelastic code development activities in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Wright, A.D. [National Renewable Energy Lab., Golden, Colorado (United States)

    1996-09-01

    Designing wind turbines to be fatigue resistant and to have long lifetimes at minimal cost is a major goal of the federal wind program and the wind industry in the United States. To achieve this goal, we must be able to predict critical loads for a wide variety of different wind turbines operating under extreme conditions. The codes used for wind turbine dynamic analysis must be able to analyze a wide range of different wind turbine configurations as well as rapidly predict the loads due to turbulent wind inflow with a minimal set of degrees of freedom. Code development activities in the US have taken a two-pronged approach in order to satisfy both of these criteria: (1) development of a multi-purpose code which can be used to analyze a wide variety of wind turbine configurations without having to develop new equations of motion with each configuration change, and (2) development of specialized codes with minimal sets of specific degrees of freedom for analysis of two- and three-bladed horizontal axis wind turbines and calculation of machine loads due to turbulent inflow. In the first method we have adapted a commercial multi-body dynamics simulation package for wind turbine analysis. In the second approach we are developing specialized codes with limited degrees of freedom, usually specified in the modal domain. This paper will summarize progress to date in the development, validation, and application of these codes. (au) 13 refs.

  3. Public Health and Politics: Using the Tax Code to Expand Advocacy.

    Science.gov (United States)

    Gorovitz, Eric

    2017-03-01

    Protecting the public's health has always been an inherently political endeavor. The field of public health, however, is conspicuously and persistently absent from sustained, sophisticated engagement in political processes, particularly elections, that determine policy outcomes. This results, in large part, from widespread misunderstanding of rules governing how, and how much, public advocates working in tax-exempt organizations can participate in public policy development. This article briefly summarizes the rules governing public policy engagement by exempt organizations. It then describes different types of exempt organizations, and how they can work together to expand engagement. Next, it identifies several key mechanisms of policy development that public health advocates could influence. Finally, it suggests some methods of applying the tax rules to increase participation in these arenas.

  4. Adaptive Noise Model for Transform Domain Wyner-Ziv Video using Clustering of DCT Blocks

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Huang, Xin; Forchhammer, Søren

    2011-01-01

    The noise model is one of the most important aspects influencing the coding performance of Distributed Video Coding. This paper proposes a novel noise model for Transform Domain Wyner-Ziv (TDWZ) video coding by using clustering of DCT blocks. The clustering algorithm takes advantage of the residual information of all frequency bands, iteratively classifies blocks into different categories, and estimates the noise parameter in each category. The experimental results show that the coding performance of the proposed cluster level noise model is competitive with state-of-the-art coefficient level noise modelling. Furthermore, the proposed cluster level noise model is adaptively combined with a coefficient level noise model to robustly improve the coding performance of the TDWZ video codec by up to 1.24 dB (by the Bjøntegaard metric) compared to the DISCOVER TDWZ video codec.
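
    The cluster-level idea can be sketched as follows. This is an illustrative stand-in, not the authors' algorithm: residual DCT blocks are grouped by their energy, and a Laplacian scale parameter (the distribution commonly used for TDWZ correlation noise, with alpha = sqrt(2 / variance)) is fitted per group. The block counts and scales are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    blocks = rng.laplace(scale=1.0, size=(100, 16))      # 100 residual DCT blocks
    blocks[50:] = rng.laplace(scale=4.0, size=(50, 16))  # the noisier half

    energy = (blocks ** 2).mean(axis=1)
    labels = (energy > np.median(energy)).astype(int)    # two clusters by block energy

    alphas = []
    for k in (0, 1):
        var = blocks[labels == k].var()
        alphas.append(np.sqrt(2.0 / var))                # Laplacian alpha per cluster
    print(alphas)  # the low-energy cluster gets the larger alpha
    ```

    A coefficient-level model would instead fit one alpha per DCT frequency band; the paper combines the two granularities adaptively.
    
    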

  5. DOGMA: domain-based transcriptome and proteome quality assessment.

    Science.gov (United States)

    Dohmen, Elias; Kremer, Lukas P M; Bornberg-Bauer, Erich; Kemena, Carsten

    2016-09-01

    Genome studies have become cheaper and easier than ever before, due to the decreased costs of high-throughput sequencing and the free availability of analysis software. However, the quality of genome or transcriptome assemblies can vary considerably. Therefore, quality assessment of assemblies and annotations is a crucial aspect of genome analysis pipelines. We developed DOGMA, a program for fast and easy quality assessment of transcriptome and proteome data based on conserved protein domains. DOGMA measures the completeness of a given transcriptome or proteome and provides information about the domain content for further analysis, producing its quality assessment within seconds. DOGMA is implemented in Python and published under the GNU GPL v.3 license. The source code is available at https://ebbgit.uni-muenster.de/domainWorld/DOGMA/. Contacts: e.dohmen@wwu.de or c.kemena@wwu.de. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
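
    The completeness measure can be illustrated by a toy computation. The domain identifiers below are hypothetical; the real tool derives its conserved-domain sets from annotated reference proteomes rather than a hand-written list.

    ```python
    # Hypothetical set of domains conserved across reference proteomes
    core_domains = {"PF00004", "PF00012", "PF00076", "PF00106", "PF00270"}
    # Hypothetical domains detected in the proteome under assessment
    detected = {"PF00004", "PF00076", "PF00270", "PF09999"}

    found = core_domains & detected
    completeness = 100.0 * len(found) / len(core_domains)
    print(f"completeness: {completeness:.1f}%")  # 3 of 5 core domains found
    ```
    
    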

  6. Development of statistical analysis code for meteorological data (W-View)

    Energy Technology Data Exchange (ETDEWEB)

    Tachibana, Haruo; Sekita, Tsutomu; Yamaguchi, Takenori [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-03-01

    A computer code (W-View: Weather View) was developed to analyze meteorological data statistically based on 'the guideline of meteorological statistics for the safety analysis of nuclear power reactor' (Nuclear Safety Commission on January 28, 1982; revised on March 29, 2001). The code provides the statistical meteorological data needed to assess the public dose in case of normal operation and severe accident, as required to obtain a nuclear reactor operating license. The code was revised from the original version, which ran on a large office computer, so that a personal computer user can analyze meteorological data simply and conveniently and produce the statistical tables and figures of meteorology. (author)

  7. Comodore V2007: assessment of doses to the public from atmospheric and liquid discharges

    International Nuclear Information System (INIS)

    Devin, Patrick; Kerouanton, David; Delgove, Laure; Giordanetto, Anne; Petit, Jany; Perrier, Gilles; Brun, Frederic; Garnier, Francois; Bernard, Christophe

    2008-01-01

    Protecting the environment and the public from radioactive hazards is a top priority for all companies operating in the nuclear domain. In order to quantify the dose impact on members of the public due to the annual discharges of its nuclear installations, AREVA developed the COMODORE code in collaboration with the Institute of Radiation Protection and Nuclear Safety (IRSN). COMODORE is a synthesis of three codes validated by IRSN (ACADIE, COTRAM and AQUAREJ). ACADIE is a code elaborated by IRSN and the Treatment Business Unit of AREVA, synthesizing the work of the GRNC (Nord-Cotentin Radioecology Group), created by the French government to estimate exposure levels to ionizing radiation and the associated risks of leukemia for populations in the Nord-Cotentin. COMODORE is a version of ACADIE designed to be adaptable to any other AREVA site. Thus, the operators of the south east of France (Pierrelatte, Marcoule and Romans sites) carried out the adaptation of COMODORE to their site-specific features (for instance, uranium in the terrestrial model). At the moment, COMODORE is used routinely by the AREVA operators to assess the annual dosimetric impact, and it is also being adapted with SGN to model the radiological impact of uranium ore treatment residue repositories. This tool contributes to transparency by giving stakeholders the environmental data they need. (author)

  8. Radiological impact assessment in Malaysia using RESRAD computer code

    International Nuclear Information System (INIS)

    Syed Hakimi Sakuma Syed Ahmad; Khairuddin Mohamad Kontol; Razali Hamzah

    1999-01-01

    Radiological Impact Assessment (RIA) can be conducted in Malaysia using the RESRAD computer code developed by Argonne National Laboratory, U.S.A. The code can derive site-specific guidelines for allowable residual concentrations of radionuclides in soil. The paper presents the concepts of RIA in the context of waste management concerns in Malaysia, some regulatory information, and the status of data collection. Realistic use scenarios and site-specific parameters are used as much as possible to reasonably ensure that individual dose limits and/or constraints will be met. A case study has been conducted to fulfil Atomic Energy Licensing Board (AELB) requirements, under which the operator is required to carry out a radiological impact assessment for all proposed disposals. This is to demonstrate that no member of the public will be exposed to more than 1 mSv/year from all activities. Results obtained from the analyses show that the RESRAD computer code is able to calculate doses, risks, and guideline values. Sensitivity analysis by the computer code shows that the parameters used as input are justified, which improves the confidence of the public and the AELB in the results of the analysis. The computer code can also be used for an initial screening assessment in order to select a suitable disposal site. (Author)

  9. A massively parallel algorithm for the collision probability calculations in the Apollo-II code using the PVM library

    International Nuclear Information System (INIS)

    Stankovski, Z.

    1995-01-01

    The collision probability method in neutron transport, as applied to 2D geometries, consumes a great amount of computer time; for a typical 2D assembly calculation, about 90% of the computing time is spent in the collision probability evaluations. Consequently, RZ or 3D calculations become prohibitive. In this paper we present a simple but efficient parallel algorithm based on the message passing host/node programming model. Parallelization was applied to the energy group treatment. Such an approach permits parallelization of the existing code, requiring only limited modifications. Sequential/parallel computer portability is preserved, which is a necessary condition for an industrial code. Sequential performance is also preserved. The algorithm is implemented on a CRAY 90 coupled to a 128-processor T3D computer, a 16-processor IBM SP1 and a network of workstations, using the Public Domain PVM library. The tests were executed for a 2D geometry with the standard 99-group library. All results were very satisfactory, the best ones with the IBM SP1. Because of the heterogeneity of the workstation network, we did not expect high performance for this architecture. The same source code was used for all computers. A more impressive advantage of this algorithm will appear in the calculations of the SAPHYR project (with the future fine multigroup library of about 8000 groups) on a massively parallel computer using several hundreds of processors. (author). 5 refs., 6 figs., 2 tabs
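
    The energy-group parallelization can be sketched as follows. PVM itself is not shown; a Python thread pool stands in for the host/node message passing purely for illustration, and the per-group workload is a placeholder.

    ```python
    from multiprocessing.pool import ThreadPool

    def collision_probabilities(group):
        """Placeholder for the expensive per-group collision-probability sweep."""
        return group, float(group) * 0.5   # dummy result for group `group`

    groups = list(range(99))               # the standard 99-group library
    with ThreadPool(processes=4) as pool:  # 4 "node" workers managed by the host
        results = dict(pool.map(collision_probabilities, groups))
    print(len(results))                    # host has gathered all 99 group results
    ```

    Because each energy group is independent at this stage, the decomposition needs no changes to the per-group kernel, which matches the paper's point that only limited modifications to the existing code are required.
    
    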

  10. A massively parallel algorithm for the collision probability calculations in the Apollo-II code using the PVM library

    International Nuclear Information System (INIS)

    Stankovski, Z.

    1995-01-01

    The collision probability method in neutron transport, as applied to 2D geometries, consumes a great amount of computer time; for a typical 2D assembly calculation about 90% of the computing time is consumed in the collision probability evaluations. Consequently RZ or 3D calculations became prohibitive. In this paper the author presents a simple but efficient parallel algorithm based on the message passing host/node programming model. Parallelization was applied to the energy group treatment. Such an approach permits parallelization of the existing code, requiring only limited modifications. Sequential/parallel computer portability is preserved, which is a necessary condition for an industrial code. Sequential performance is also preserved. The algorithm is implemented on a CRAY 90 coupled to a 128-processor T3D computer, a 16-processor IBM SP1 and a network of workstations, using the Public Domain PVM library. The tests were executed for a 2D geometry with the standard 99-group library. All results were very satisfactory, the best ones with the IBM SP1. Because of the heterogeneity of the workstation network, the author did not expect high performance for this architecture. The same source code was used for all computers. A more impressive advantage of this algorithm will appear in the calculations of the SAPHYR project (with the future fine multigroup library of about 8000 groups) with a massively parallel computer, using several hundreds of processors

  11. SEJITS: embedded specializers to turn patterns-based designs into optimized parallel code

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    All software should be parallel software. This is a natural result of the transition to a many-core world. For a small fraction of the world's programmers (efficiency programmers), this is not a problem. They enjoy mapping algorithms onto the details of a particular system and are well served by low-level languages and OpenMP, MPI, or OpenCL. Most programmers, however, are "domain specialists" who write code. They are too busy working in their domain of choice (such as physics) to master the intricacies of each computer they use. How do we make these programmers productive without giving up performance? We have been working with a team at UC Berkeley's ParLab to address this problem. The key is a clear software architecture expressed in terms of design patterns that exposes the concurrency in a problem. The resulting code is written using a patterns-based framework within a high-level, productivity language (such as Python). Then a separate system is used by a small group o...

  12. Improving pairwise comparison of protein sequences with domain co-occurrence

    Science.gov (United States)

    Gascuel, Olivier

    2018-01-01

    Comparing and aligning protein sequences is an essential task in bioinformatics. More specifically, local alignment tools like BLAST are widely used for identifying conserved protein sub-sequences, which likely correspond to protein domains or functional motifs. However, to limit the number of false positives, these tools are used with stringent sequence-similarity thresholds and hence can miss several hits, especially for species that are phylogenetically distant from reference organisms. A solution to this problem is then to integrate additional contextual information to the procedure. Here, we propose to use domain co-occurrence to increase the sensitivity of pairwise sequence comparisons. Domain co-occurrence is a strong feature of proteins, since most protein domains tend to appear with a limited number of other domains on the same protein. We propose a method to take this information into account in a typical BLAST analysis and to construct new domain families on the basis of these results. We used Plasmodium falciparum as a case study to evaluate our method. The experimental findings showed an increase of 14% of the number of significant BLAST hits and an increase of 25% of the proteome area that can be covered with a domain. Our method identified 2240 new domains for which, in most cases, no model of the Pfam database could be linked. Moreover, our study of the quality of the new domains in terms of alignment and physicochemical properties show that they are close to that of standard Pfam domains. Source code of the proposed approach and supplementary data are available at: https://gite.lirmm.fr/menichelli/pairwise-comparison-with-cooccurrence PMID:29293498
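
    Domain co-occurrence itself is a simple counting exercise: how often do two domains appear on the same protein? A toy sketch with hypothetical protein annotations:

    ```python
    from collections import Counter
    from itertools import combinations

    # Hypothetical protein -> Pfam-domain annotations
    proteins = {
        "P1": ["PF00069", "PF07714"],
        "P2": ["PF00069", "PF07714"],
        "P3": ["PF00069"],
        "P4": ["PF00400", "PF00069"],
    }

    cooc = Counter()
    for domains in proteins.values():
        # Count each unordered domain pair once per protein
        for a, b in combinations(sorted(set(domains)), 2):
            cooc[(a, b)] += 1

    print(cooc.most_common(1))  # ('PF00069', 'PF07714') co-occur twice
    ```

    Such counts, computed genome-wide, are the contextual signal the paper uses to rescue weak BLAST hits: a borderline match to a domain becomes more credible if its frequent co-occurring partner is found on the same protein.
    
    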

  13. MOMCON: A spectral code for obtaining three-dimensional magnetohydrodynamic equilibria

    International Nuclear Information System (INIS)

    Hirshman, S.P.; Lee, D.K.

    1986-01-01

    A new code, MOMCON (spectral moments code with constraints), is described that computes three-dimensional ideal magnetohydrodynamic (MHD) equilibria in a fixed toroidal domain using a Fourier expansion for the inverse coordinates (R, Z) representing nested magnetic surfaces. A set of nonlinear coupled ordinary differential equations for the spectral coefficients of (R, Z) is solved using an accelerated steepest descent method. A stream function, lambda, is introduced to improve the mode convergence properties of the Fourier series for R and Z. The convergence rate of the R-Z spectra is optimized on each flux surface by solving nonlinear constraint equations relating the m>=2 spectral coefficients of R and Z. (orig.)
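
    The inverse-coordinate Fourier representation used by spectral equilibrium codes of this kind can be sketched directly. The coefficients below are made up for illustration and simply describe a circular cross-section, not a computed equilibrium:

    ```python
    import numpy as np

    # R(theta, zeta) = sum_mn RC_mn * cos(m*theta - n*zeta)
    # Z(theta, zeta) = sum_mn ZS_mn * sin(m*theta - n*zeta)
    rc = {(0, 0): 3.0, (1, 0): 0.5}   # hypothetical R cosine coefficients
    zs = {(1, 0): 0.5}                # hypothetical Z sine coefficients

    theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
    zeta = 0.0                        # one toroidal cross-section

    R = sum(c * np.cos(m * theta - n * zeta) for (m, n), c in rc.items())
    Z = sum(c * np.sin(m * theta - n * zeta) for (m, n), c in zs.items())
    # With these coefficients the surface is a circle of radius 0.5 at R = 3
    ```

    MOMCON evolves such coefficients for nested surfaces by accelerated steepest descent, with the lambda stream function and the m>=2 constraints controlling how the spectral content is distributed between R and Z.
    
    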

  14. Progress in nuclear well logging modeling using deterministic transport codes

    International Nuclear Information System (INIS)

    Kodeli, I.; Aldama, D.L.; Maucec, M.; Trkov, A.

    2002-01-01

    Further studies, in continuation of the work presented in 2001 in Portoroz, were performed in order to study and improve the performance, precision and domain of application of the deterministic transport codes with respect to oil well logging analysis. These codes are in particular expected to complement the Monte Carlo solutions, since they can provide a detailed particle flux distribution in the whole geometry in a very reasonable CPU time; real-time calculation can be envisaged. The performance of deterministic transport methods was compared to that of the Monte Carlo method. The IRTMBA generic benchmark was analysed using the codes MCNP-4C and DORT/TORT. Centric as well as eccentric casings were considered, using a 14 MeV point neutron source and NaI scintillation detectors. Neutron and gamma spectra were compared at two detector positions. (author)

  15. Internal Corrosion Control of Water Supply Systems Code of Practice

    Science.gov (United States)

    This Code of Practice is part of a series of publications by the IWA Specialist Group on Metals and Related Substances in Drinking Water. It complements the following IWA Specialist Group publications: 1. Best Practice Guide on the Control of Lead in Drinking Water 2. Best Prac...

  16. Code manual for MACCS2: Volume 1, user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Chanin, D.I.; Young, M.L.

    1997-03-01

    This report describes the use of the MACCS2 code. The document is primarily a user's guide, though some model description information is included. MACCS2 represents a major enhancement of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, distributed by government code centers since 1990, was developed to evaluate the impacts of severe accidents at nuclear power plants on the surrounding public. The principal phenomena considered are atmospheric transport and deposition under time-variant meteorology, short- and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. No other U.S. code that is publicly available at present offers all these capabilities. MACCS2 was developed as a general-purpose tool applicable to diverse reactor and nonreactor facilities licensed by the Nuclear Regulatory Commission or operated by the Department of Energy or the Department of Defense. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency-response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. Other improvements are in the areas of phenomenological modeling and new output options. Initial installation of the code, written in FORTRAN 77, requires a 486 or higher IBM-compatible PC with 8 MB of RAM.

  17. DOE headquarters publications

    International Nuclear Information System (INIS)

    1978-12-01

    This bibliography provides listings of (mainly policy and programmatic) publications issued from the U.S. Department of Energy, Washington, D.C. The listings are arranged by the report number assigned to each publication. All of the publications listed, except for those shown as still in preparation, may be seen in the Energy Library. A title index arranged by title keywords follows the listings. Certain publications have been omitted. They include such items as pamphlets, fact sheets, bulletins and weekly/monthly issuances of DOE's Energy Information Administration and Economic Regulatory Administration, and employee bulletins and newsletters. Omitted from the bibliography altogether are headquarters publications assigned other types of report codes--e.g., HCP (Headquarters Contractor Publication) and CONF

  18. Conformational landscape of an amyloid intra-cellular domain and Landau-Ginzburg-Wilson paradigm in protein dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Jin; He, Jianfeng, E-mail: Antti.Niemi@physics.uu.se, E-mail: hjf@bit.edu.cn [School of Physics, Beijing Institute of Technology, Beijing 100081 (China); Niemi, Antti J., E-mail: Antti.Niemi@physics.uu.se, E-mail: hjf@bit.edu.cn [School of Physics, Beijing Institute of Technology, Beijing 100081 (China); Department of Physics and Astronomy, Uppsala University, P.O. Box 803, S-75108 Uppsala (Sweden); Laboratoire de Mathematiques et Physique Theorique CNRS UMR 6083, Fédération Denis Poisson, Université de Tours, Parc de Grandmont, F37200 Tours (France)

    2016-07-28

    The Landau-Ginzburg-Wilson paradigm is proposed as a framework to investigate the conformational landscape of intrinsically unstructured proteins. A universal Cα-trace Landau free energy is deduced from general symmetry considerations, with the ensuing all-atom structure modeled using the publicly available reconstruction programs Pulchra and Scwrl. As an example, the conformational stability of an amyloid precursor protein intra-cellular domain (AICD) is inspected; the reference conformation is the crystallographic structure with code 3DXC in the Protein Data Bank (PDB), which describes a heterodimer of AICD and a nuclear multi-domain adaptor protein Fe65. Those conformations of AICD that correspond to local or near-local minima of the Landau free energy are identified. For this, the response of the original 3DXC conformation to variations in the ambient temperature is investigated, using the Glauber algorithm. The conclusion is that in isolation the AICD conformation in 3DXC must be unstable. A family of degenerate conformations that minimise the Landau free energy is identified, and it is proposed that the native state of an isolated AICD is a superposition of these conformations. The results are fully in line with the presumed intrinsically unstructured character of isolated AICD and should provide a basis for a systematic analysis of AICD structure in future NMR experiments.

  19. Conformational landscape of an amyloid intra-cellular domain and Landau-Ginzburg-Wilson paradigm in protein dynamics

    International Nuclear Information System (INIS)

    Dai, Jin; He, Jianfeng; Niemi, Antti J.

    2016-01-01

    The Landau-Ginzburg-Wilson paradigm is proposed as a framework to investigate the conformational landscape of intrinsically unstructured proteins. A universal Cα-trace Landau free energy is deduced from general symmetry considerations, with the ensuing all-atom structure modeled using the publicly available reconstruction programs Pulchra and Scwrl. As an example, the conformational stability of an amyloid precursor protein intra-cellular domain (AICD) is inspected; the reference conformation is the crystallographic structure with code 3DXC in the Protein Data Bank (PDB), which describes a heterodimer of AICD and a nuclear multi-domain adaptor protein Fe65. Those conformations of AICD that correspond to local or near-local minima of the Landau free energy are identified. For this, the response of the original 3DXC conformation to variations in the ambient temperature is investigated, using the Glauber algorithm. The conclusion is that in isolation the AICD conformation in 3DXC must be unstable. A family of degenerate conformations that minimise the Landau free energy is identified, and it is proposed that the native state of an isolated AICD is a superposition of these conformations. The results are fully in line with the presumed intrinsically unstructured character of isolated AICD and should provide a basis for a systematic analysis of AICD structure in future NMR experiments.
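
    The Glauber algorithm referred to above can be illustrated with a minimal sketch. This is a generic Glauber-dynamics update on a 1-D Ising chain, not the authors' Landau free energy over backbone angles; only the acceptance rule 1/(1 + exp(ΔE/T)) carries over.

    ```python
    import math
    import random

    random.seed(1)
    N, T = 50, 0.5                             # number of spins, temperature
    spins = [random.choice((-1, 1)) for _ in range(N)]

    def energy_change(s, i):
        """Energy cost of flipping spin i (ferromagnetic coupling J = 1)."""
        left, right = s[(i - 1) % N], s[(i + 1) % N]
        return 2.0 * s[i] * (left + right)

    for _ in range(20000):
        i = random.randrange(N)
        dE = energy_change(spins, i)
        # Glauber acceptance probability: 1 / (1 + exp(dE / T))
        if random.random() < 1.0 / (1.0 + math.exp(dE / T)):
            spins[i] = -spins[i]

    magnetisation = abs(sum(spins)) / N        # ordering favoured at low T
    print(f"|m| = {magnetisation:.2f}")
    ```

    Heating and cooling the system by varying T, as the paper does with the 3DXC conformation, probes which configurations are stable minima of the underlying free energy.
    
    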

  20. Minimizing embedding impact in steganography using trellis-coded quantization

    Science.gov (United States)

    Filler, Tomáš; Judas, Jan; Fridrich, Jessica

    2010-01-01

    In this paper, we propose a practical approach to minimizing embedding impact in steganography based on syndrome coding and trellis-coded quantization, and contrast its performance with appropriate rate-distortion bounds. We assume that each cover element can be assigned a positive scalar expressing the impact of making an embedding change at that element (single-letter distortion). The problem is to embed a given payload with the minimal possible average embedding impact. This task, which can be viewed as a generalization of matrix embedding or writing on wet paper, has been approached using heuristic and suboptimal tools in the past. Here, we propose a fast and very versatile solution to this problem that can theoretically achieve performance arbitrarily close to the bound. It is based on syndrome coding using linear convolutional codes with the optimal binary quantizer implemented using the Viterbi algorithm run in the dual domain. The complexity and memory requirements of the embedding algorithm are linear w.r.t. the number of cover elements. For practitioners, we include detailed algorithms for finding good codes and their implementation. Finally, we report extensive experimental results for a large set of relative payloads and for different distortion profiles, including the wet paper channel.
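
    Syndrome coding can be illustrated by its simplest special case, matrix embedding with the [7,4] Hamming parity-check matrix: 3 payload bits are embedded into 7 cover bits by changing at most one of them. The paper's contribution replaces this block code with convolutional codes and a Viterbi-based quantizer, which this sketch does not attempt.

    ```python
    import numpy as np

    # Columns of H are the binary representations of 1..7 (LSB in row 0)
    H = np.array([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])

    def embed(cover, message):
        """Flip at most one cover bit so that H @ stego = message (mod 2)."""
        syndrome = (H @ cover - message) % 2
        stego = cover.copy()
        if syndrome.any():
            # The syndrome, read as a binary number, indexes the bit to flip
            col = int(syndrome[0] + 2 * syndrome[1] + 4 * syndrome[2]) - 1
            stego[col] ^= 1
        return stego

    cover = np.array([1, 0, 1, 1, 0, 0, 1])
    message = np.array([0, 1, 1])
    stego = embed(cover, message)
    # The receiver extracts the message as (H @ stego) mod 2
    ```

    Here every cover bit is equally cheap to change; the paper's setting attaches a per-element distortion and seeks the stego object minimizing total distortion, which is exactly the quantization step the Viterbi algorithm solves for convolutional codes.
    
    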

  1. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  2. How American Nurses Association Code of Ethics informs genetic/genomic nursing.

    Science.gov (United States)

    Tluczek, Audrey; Twal, Marie E; Beamer, Laura Curr; Burton, Candace W; Darmofal, Leslie; Kracun, Mary; Zanni, Karen L; Turner, Martha

    2018-01-01

    Members of the Ethics and Public Policy Committee of the International Society of Nurses in Genetics prepared this article to assist nurses in interpreting the American Nurses Association (2015) Code of Ethics for Nurses with Interpretive Statements (Code) within the context of genetics/genomics. The Code explicates the nursing profession's norms and responsibilities in managing ethical issues. The nearly ubiquitous application of genetic/genomic technologies in healthcare poses unique ethical challenges for nursing. Therefore, authors conducted literature searches that drew from various professional resources to elucidate implications of the code in genetic/genomic nursing practice, education, research, and public policy. We contend that the revised Code coupled with the application of genomic technologies to healthcare creates moral obligations for nurses to continually refresh their knowledge and capacities to translate genetic/genomic research into evidence-based practice, assure the ethical conduct of scientific inquiry, and continually develop or revise national/international guidelines that protect the rights of individuals and populations within the context of genetics/genomics. Thus, nurses have an ethical responsibility to remain knowledgeable about advances in genetics/genomics and incorporate emergent evidence into their work.

  3. Domain decomposition parallel computing for transient two-phase flow of nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Ryong; Yoon, Han Young [KAERI, Daejeon (Korea, Republic of); Choi, Hyoung Gwon [Seoul National University, Seoul (Korea, Republic of)

    2016-05-15

    KAERI (Korea Atomic Energy Research Institute) has been developing a multi-dimensional two-phase flow code named CUPID for multi-physics and multi-scale thermal hydraulics analysis of Light water reactors (LWRs). The CUPID code has been validated against a set of conceptual problems and experimental data. In this work, the CUPID code has been parallelized based on the domain decomposition method with the Message passing interface (MPI) library. For domain decomposition, the CUPID code provides both manual and automatic methods with the METIS library. For effective memory management, the Compressed sparse row (CSR) format is adopted, which is one of the standard ways to represent a sparse asymmetric matrix. The CSR format stores only the non-zero values and their positions (row and column). By performing verification against the fundamental problem set, the parallelization of CUPID has been successfully confirmed. Since the scalability of a parallel simulation is generally known to be better for a fine mesh system, three different scales of mesh system are considered: 40000 meshes for the coarse mesh system, 320000 meshes for the mid-size mesh system, and 2560000 meshes for the fine mesh system. In the given geometry, both single- and two-phase calculations were conducted. In addition, two types of preconditioners for the matrix solver were compared: diagonal and incomplete LU preconditioners. To enhance the parallel performance further, hybrid OpenMP/MPI parallel computing for the pressure solver was examined. It is revealed that the scalability of the hybrid calculation was enhanced for multi-core parallel computation.
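
    The CSR layout described above can be sketched in a few lines (illustrative only, not the CUPID implementation): only the non-zero values are stored, together with their column indices and a row-pointer array marking where each row starts.

    ```python
    dense = [[4.0, 0.0, 0.0],
             [0.0, 5.0, 1.0],
             [2.0, 0.0, 3.0]]

    values, col_index, row_ptr = [], [], [0]
    for row in dense:
        for j, v in enumerate(row):
            if v != 0.0:
                values.append(v)     # non-zero value
                col_index.append(j)  # its column
        row_ptr.append(len(values))  # end of this row in `values`

    # Sparse matrix-vector product y = A @ x using only the CSR arrays
    x = [1.0, 2.0, 3.0]
    y = [sum(values[k] * x[col_index[k]]
             for k in range(row_ptr[i], row_ptr[i + 1]))
         for i in range(len(dense))]
    print(values, col_index, row_ptr, y)
    ```

    The matrix-vector product touches only stored entries, which is why CSR is the usual choice for the sparse asymmetric systems arising from mesh discretizations.
    
    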

  4. Code of practice for radiological protection in dentistry

    International Nuclear Information System (INIS)

    1988-01-01

    This Code of Practice applies to all those involved in the practice of dentistry and is designed to minimise radiation doses to patients, dental staff and the public from the use of dental radiographic equipment.

  5. CHANGES IN THE FISCAL CODE AND THEIR INFLUENCE ON THE ACTIVITY OF ROMANIAN COMPANIES

    Directory of Open Access Journals (Sweden)

    Gheorghe MOROŞAN

    2016-09-01

    Businesses all over the world want stable legislation. In the economic domain, all companies need a fiscal code that remains clear over a long period of time. Unfortunately, in the last ten years the Romanian Fiscal Code has been amended several times. The old fiscal code had been in force since 2003 and suffered no fewer than 150 amendments throughout that period. The unanimous opinion of the experts was that there was a clear need for a new code. The paper analyzes the changes brought by the Fiscal Code starting with 2016 and their implications for the activity of business operators and for the state budget in the coming period. It seems that some of the changes will not have the desired effect on the state budget and, more generally, on the economy.

  6. Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.

    Science.gov (United States)

    Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin

    2012-01-01

    Semantic interoperability is an unsolved problem which occurs while working with medical forms from different information systems or institutions. Standards like ODM or CDA assure structural homogenization, but in order to compare elements from different data models it is necessary to use semantic concepts and codes at the item level of those structures. We developed and implemented a web-based tool which enables a domain expert to perform semi-automated coding of ODM files. For each item it is possible to query web services which return unique concept codes without leaving the context of the document. Although it was not feasible to perform totally automated coding, we have implemented a dialog-based method to code all data elements efficiently in the context of the whole document. The proportion of codable items was comparable to results from previous studies.
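    The semi-automated pattern described here can be sketched generically: an automatic lookup proposes a concept code for each form item, and anything without a confident match is queued for the expert dialog. A minimal sketch, in which a local table stands in for the terminology web service and both the item labels and the concept IDs are made-up placeholders (not real ODM content, UMLS codes, or the authors' tool):

```python
# Placeholder concept IDs; a real system would query a terminology web service.
KNOWN_CODES = {
    "body weight": "C0000001",  # hypothetical concept ID
    "heart rate": "C0000002",   # hypothetical concept ID
}

def code_items(item_labels):
    """Split items into automatically coded ones and those needing expert review."""
    coded, needs_review = {}, []
    for label in item_labels:
        code = KNOWN_CODES.get(label.strip().lower())
        if code is not None:
            coded[label] = code        # confident match: code automatically
        else:
            needs_review.append(label)  # no match: route to the expert dialog
    return coded, needs_review

coded, pending = code_items(["Body Weight", "Heart Rate", "Custom Score"])
# "Custom Score" ends up in the review queue for manual coding.
```

    The design point the abstract makes is precisely this split: full automation was not feasible, so the tool automates the easy matches and keeps the expert in the loop for the rest.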

  7. Object-Oriented Parallel Particle-in-Cell Code for Beam Dynamics Simulation in Linear Accelerators

    International Nuclear Information System (INIS)

    Qiang, J.; Ryne, R.D.; Habib, S.; Decky, V.

    1999-01-01

    In this paper, we present an object-oriented three-dimensional parallel particle-in-cell code for beam dynamics simulation in linear accelerators. A two-dimensional parallel domain decomposition approach is employed within a message passing programming paradigm, along with dynamic load balancing. Implementing object-oriented software design provides the code with better maintainability, reusability, and extensibility compared with conventional structure-based code. It also helps to encapsulate the details of communication syntax. Performance tests on SGI/Cray T3E-900 and SGI Origin 2000 machines show good scalability of the object-oriented code. Other important features of this code include symplectic integration with linear maps of external focusing elements and the use of z as the independent variable, as is typical in accelerators. A successful application was the simulation of beam transport through three superconducting sections in the APT linac design.
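    The note about linear maps of focusing elements with z as the independent variable can be illustrated with a minimal sketch (plain Python thin-lens matrices; the cell layout and focal lengths are arbitrary illustrations, not the APT linac lattice or this code's implementation):

```python
def matmul2(a, b):
    """2x2 matrix product, for composing transfer maps acting on (x, x')."""
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0],
             a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0],
             a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def drift(L):
    """Field-free drift of length L; z is the independent variable."""
    return [[1.0, L], [0.0, 1.0]]

def thin_quad(f):
    """Thin-lens quadrupole with focal length f (f < 0: defocusing)."""
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

# Illustrative FODO-like cell: focusing quad, drift, defocusing quad, drift.
# Maps compose right-to-left (first element applied is rightmost).
cell = matmul2(drift(1.0),
       matmul2(thin_quad(-2.0),
       matmul2(drift(1.0), thin_quad(2.0))))

# Linear transfer maps are symplectic; for a 2x2 map this means det(M) = 1,
# so phase-space area is preserved exactly, step after step.
det = cell[0][0] * cell[1][1] - cell[0][1] * cell[1][0]
```

    Advancing a particle one cell is then just one matrix-vector product, which is why such maps are cheap compared with integrating the fields directly.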

  8. Signal to Noise Ratio (SNR) Enhancement Comparison of Impulse-, Coding- and Novel Linear-Frequency-Chirp-Based Optical Time Domain Reflectometry (OTDR) for Passive Optical Network (PON) Monitoring Based on Unique Combinations of Wavelength Selective Mirrors

    Directory of Open Access Journals (Sweden)

    Christopher M. Bentz

    2014-03-01

    Full Text Available We compare optical time domain reflectometry (OTDR) techniques based on conventional single impulse, coding and linear frequency chirps concerning their signal to noise ratio (SNR) enhancements, by measurements in a passive optical network (PON) with a maximum one-way attenuation of 36.6 dB. A total of six subscribers, each represented by a unique mirror pair with narrow reflection bandwidths, are installed within a distance of 14 m. The spatial resolution of the OTDR set-up is 3.0 m.

  9. Codes of conduct: An extra suave instrument of EU governance?

    DEFF Research Database (Denmark)

    Borras, Susana

    able to coordinate actors successfully (effectiveness)? and secondly, under what conditions are codes of conduct able to generate democratically legitimate political processes? The paper examines carefully a recent case study, the “Code of Conduct for the Recruitment of Researchers” (CCRR). The code...... establishes a specific set of voluntary norms and principles that shall guide the recruiting process of researchers by European research organizations (universities, public research organizations and firms) in the 33 countries of the single market minded initiative of the European Research Area. A series...

  10. Code of ethics and conduct for European nursing.

    Science.gov (United States)

    Sasso, Loredana; Stievano, Alessandro; González Jurado, Máximo; Rocco, Gennaro

    2008-11-01

    A main identifying factor of professions is professionals' willingness to comply with ethical and professional standards, often defined in a code of ethics and conduct. In a period of intense nursing mobility, if the public are aware that health professionals have committed themselves to the drawing up of a code of ethics and conduct, they will have more trust in the health professional they choose, especially if this person comes from another European Member State. The Code of Ethics and Conduct for European Nursing is a programmatic document for the nursing profession constructed by the FEPI (European Federation of Nursing Regulators) according to Directive 2005/36/EC on the recognition of professional qualifications, and Directive 2006/123/EC on services in the internal market, set out by the European Commission. This article describes the construction of the Code and gives an overview of some specific areas of importance. The main text of the Code is reproduced in Appendix 1.

  11. Characteristics of scientific web publications

    DEFF Research Database (Denmark)

    Thorlund Jepsen, Erik; Seiden, Piet; Ingwersen, Peter Emil Rerup

    2004-01-01

    were generated based on specifically selected domain topics that are searched for in three publicly accessible search engines (Google, AllTheWeb, and AltaVista). A sample of the retrieved hits was analyzed with regard to how various publication attributes correlated with the scientific quality...... of the content and whether this information could be employed to harvest, filter, and rank Web publications. The attributes analyzed were inlinks, outlinks, bibliographic references, file format, language, search engine overlap, structural position (according to site structure), and the occurrence of various...... types of metadata. As could be expected, the ranked output differs between the three search engines. Apparently, this is caused by differences in ranking algorithms rather than the databases themselves. In fact, because scientific Web content in this subject domain receives few inlinks, both Alta...

  12. Oil and gas field code master list 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-02-01

    The Oil and Gas Field Code Master List 1997 is the sixteenth annual listing of all identified oil and gas fields in the US. It is updated with field information collected through October 1997. The purpose of this publication is to provide unique, standardized codes for identification of domestic fields. Use of these field codes fosters consistency of field identification by government and industry. As a result of their widespread adoption they have in effect become a national standard. The use of field names and codes listed in this publication is required on survey forms and other reports regarding field-specific data collected by EIA. There are 58,366 field records in this year's FCML, 437 more than last year. The FCML includes: field records for each State and county in which a field resides; field records for each offshore area block in the Gulf of Mexico in which a field resides; field records for each alias field name (definition of alias is listed); fields crossing State boundaries that may be assigned different names by the respective State naming authorities. This report also contains an Invalid Field Record List of 4 records that have been removed from the FCML since last year's report. These records were found to be either technically incorrect or to represent field names which were never recognized by State naming authorities.

  13. Quality assurance aspects of the computer code CODAR2

    International Nuclear Information System (INIS)

    Maul, P.R.

    1986-03-01

    The computer code CODAR2 was developed originally for use in connection with the Sizewell Public Inquiry to evaluate the radiological impact of routine discharges to the sea from the proposed PWR. It has subsequently been used to evaluate discharges from Heysham 2. The code was frozen in September 1983, and this note gives details of its verification, validation and evaluation. Areas where either improved modelling methods or more up-to-date information relevant to CODAR2 databases have subsequently become available are indicated; these will be incorporated in any future versions of the code. (author)

  14. Publications issued in 1996. Priced and unpriced

    International Nuclear Information System (INIS)

    1997-01-01

    The publications issued by the IAEA's Division of Publications in 1996 are grouped in four categories: priced publications and priced miscellaneous publications, classified by division and by series; and unpriced publications and unpriced miscellaneous publications, classified by division and by series. The information provided about each publication includes the symbol, language, title, centre and project code, date of issue and number of pages.

  15. Response to a widespread, unauthorized dispersal of radioactive waste in the public domain

    International Nuclear Information System (INIS)

    Wenslawski, F.A.; North, H.S.

    1979-01-01

    In March 1976 State of Nevada radiological health officials became aware that radioactive items destined for disposal at a radioactive waste burial facility near Beatty, Nevada had instead been distributed to wide segments of the public domain. Because the facility was jointly licensed by the State of Nevada and the Federal Nuclear Regulatory Commission, both agencies quickly responded. It was learned that over a period of several years a practice existed at the disposal facility of opening containers, removing contents and allowing employees to take items of worth or fancy. Numerous items such as hand tools, electric motors, laboratory instruments, shipping containers, etc., had received widespread and uncontrolled distribution in the town of Beatty as well as lesser distributions to other locations. Because the situation might have had the potential for a significant health and safety impact, a comprehensive recovery operation was conducted. During the course of seven days of intense effort, thirty-five individuals became involved in a comprehensive door-by-door survey and search of the town. Aerial surveys were performed using a helicopter equipped with sensitive radiation detectors, while ground level scans were conducted using a van containing similar instrumentation. Aerial reconnaissance photographs were taken, a special town meeting was held and numerous persons were interviewed. The recovery effort resulted in the retrieval of an estimated 20 to 25 pickup truck loads of radioactively contaminated equipment as well as several loads of large items returned on a 40-foot flatbed trailer.

  16. Use of media and public-domain Internet sources for detection and assessment of plant health threats.

    Science.gov (United States)

    Thomas, Carla S; Nelson, Noele P; Jahn, Gary C; Niu, Tianchan; Hartley, David M

    2011-09-05

    Event-based biosurveillance is a recognized approach to early warning and situational awareness of emerging health threats. In this study, we build upon previous human and animal health work to develop a new approach to plant pest and pathogen surveillance. We show that monitoring public domain electronic media for indications and warning of epidemics and associated social disruption can provide information about the emergence and progression of plant pest infestation or disease outbreak. The approach is illustrated using a case study, which describes a plant pest and pathogen epidemic in China and Vietnam from February 2006 to December 2007, and the role of ducks in contributing to zoonotic virus spread in birds and humans. This approach could be used as a complementary method to traditional plant pest and pathogen surveillance to aid global and national plant protection officials and political leaders in early detection and timely response to significant biological threats to plant health, economic vitality, and social stability. This study documents the inter-relatedness of health in human, animal, and plant populations and emphasizes the importance of plant health surveillance.

  17. Past, Current, and Future Challenges in Linking Data to Publications

    Science.gov (United States)

    Hanson, B.

    2015-12-01

    Data are the currency of science and assure the integrity of published research. As the ability to collect, analyze, and visualize data has grown beyond what could be included in a publication, and as the value of the data became clearer (or the lack of availability of data was criticized), publishers and the scientific community developed several solutions to enhance access to underlying data. Most leading journals now require authors to agree as a condition of submission that underlying data will be included or made available; indeed, publication is the key leverage point in exposing much scholarly data. Most journals allow PDF or other supplements and links to data sets hosted by authors or labs, or better, data repositories such as Dryad, and some have banned "data not shown" or any reference to unpublished work. Many of these solutions have proven problematic, and recent studies have found that a large fraction of data is undiscoverable even a few years after publication. The best solution has been dedicated domain repositories collectively supported by publishers, funders, and the scientific community, where deposition is required before or at the time of publication. These provide quality control and curation and facilitate reuse. However, expanding these beyond a few key repositories and developing standardized workflows and functionality among repositories and between them and publishers has been problematic. Addressing these and other data challenges requires collaborative efforts among funders, publishers, repositories, societies, and researchers. One example is the Coalition on Publishing Data in the Earth and space sciences, where most major publishers and repositories have signed a joint statement of commitment (COPDESS.org), and are starting work to direct and link published data to domain repositories. Much work remains to be done.
Major challenges include establishing data curation practices into the workflow of science from data collection

  18. Gender, Ethnicity, Ethnic Identity, and Language Choices of Malaysian Youths: the Case of the Family Domain

    Directory of Open Access Journals (Sweden)

    Mehdi Granhemat

    2017-04-01

    Full Text Available This study examined the relationships between gender, ethnicity, ethnic identity, and the language choices of Malaysian multilingual youths in the family domain of language use. Five hundred undergraduate students belonging to different Malaysian ethnic groups were selected as participants, using a random proportional stratified sampling strategy. The participants were aged between 17 and 25 years. A self-administered questionnaire survey comprising three sections was used to gather information about the participants' demographic profiles, their language choices in the family domain, and their concepts of ethnic identity. SPSS software was used to identify the languages most used by the participants and to analyse the relationships between variables: descriptive statistics described the participants' profiles and the languages they used in the family domain, while inferential statistics examined relationships between variables. According to the results, five codes were most used by the participants in the family domain: the Malay language, mixed use of Malay and English, Chinese, mixed use of Chinese and English, and English. Furthermore, in the family domain, gender did not exert any influence on the language choice of the multilingual participants, but ethnicity was found to be a determinant of language choice. Ethnic identity was found to influence the language choices of the Malays as well, but it did not affect the Chinese and Indian participants' language choices in this domain of language use.

  19. Anthropomorphic Coding of Speech and Audio: A Model Inversion Approach

    Directory of Open Access Journals (Sweden)

    W. Bastiaan Kleijn

    2005-06-01

    Full Text Available Auditory modeling is a well-established methodology that provides insight into human perception and that facilitates the extraction of signal features that are most relevant to the listener. The aim of this paper is to provide a tutorial on perceptual speech and audio coding using an invertible auditory model. In this approach, the audio signal is converted into an auditory representation using an invertible auditory model. The auditory representation is quantized and coded. Upon decoding, it is then transformed back into the acoustic domain. This transformation converts a complex distortion criterion into a simple one, thus facilitating quantization with low complexity. We briefly review past work on auditory models and describe in more detail the components of our invertible model and its inversion procedure, that is, the method to reconstruct the signal from the output of the auditory model. We summarize attempts to use the auditory representation for low-bit-rate coding. Our approach also allows the exploitation of the inherent redundancy of the human auditory system for the purpose of multiple description (joint source-channel) coding.
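    The analyze-quantize-resynthesize pattern the tutorial describes can be sketched generically. In this minimal sketch a naive DFT stands in for the invertible auditory model (the real model is far more elaborate and perceptually motivated); the test signal and quantizer step are arbitrary illustrations:

```python
import cmath

def dft(x):
    """Naive DFT: the stand-in invertible analysis transform."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT: transforms the coded representation back to the signal domain."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

def quantize(X, step):
    """Uniform quantization of each coefficient; in the paper's scheme this step
    happens in the auditory representation, where the distortion criterion is simple."""
    return [complex(round(c.real / step) * step,
                    round(c.imag / step) * step) for c in X]

signal = [0.0, 1.0, 0.0, -1.0, 0.5, 0.0, -0.5, 0.0]
coded = quantize(dft(signal), step=0.5)   # quantize in the transform domain
decoded = idft(coded)                     # invert back to the acoustic domain
```

    The point of using an invertible representation is visible even in this toy: all the coding loss is introduced by the quantizer, and the analysis/synthesis pair itself reconstructs the signal exactly.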

  20. Barriers and facilitators to the implementation of a school-based physical activity policy in Canada: application of the theoretical domains framework.

    Science.gov (United States)

    Weatherson, Katie A; McKay, Rhyann; Gainforth, Heather L; Jung, Mary E

    2017-10-23

    In British Columbia Canada, a Daily Physical Activity (DPA) policy was mandated that requires elementary school teachers to provide students with opportunities to achieve 30 min of physical activity during the school day. However, the implementation of school-based physical activity policies is influenced by many factors. A theoretical examination of the factors that impede and enhance teachers' implementation of physical activity policies is necessary in order to develop strategies to improve policy practice and achieve desired outcomes. This study used the Theoretical Domains Framework (TDF) to understand teachers' barriers and facilitators to the implementation of the DPA policy in one school district. Additionally, barriers and facilitators were examined and compared according to how the teacher implemented the DPA policy during the instructional school day. Interviews were conducted with thirteen teachers and transcribed verbatim. One researcher performed barrier and facilitator extraction, with double extraction occurring across a third of the interview transcripts by a second researcher. A deductive and inductive analytical approach in a two-stage process was employed whereby barriers and facilitators were deductively coded using TDF domains (content analysis) and analyzed for sub-themes within each domain. Two researchers performed coding. A total of 832 items were extracted from the interview transcripts. Some items were coded into multiple TDF domains, resulting in a total of 1422 observations. The most commonly coded TDF domains accounting for 75% of the total were Environmental context and resources (ECR; n = 250), Beliefs about consequences (n = 225), Social influences (n = 193), Knowledge (n = 100), and Intentions (n = 88). Teachers who implemented DPA during instructional time differed from those who relied on non-instructional time in relation to Goals, Behavioural regulation, Social/professional role and identity, Beliefs about

  1. Barriers and facilitators to the implementation of a school-based physical activity policy in Canada: application of the theoretical domains framework

    Directory of Open Access Journals (Sweden)

    Katie A. Weatherson

    2017-10-01

    Full Text Available Abstract Background In British Columbia Canada, a Daily Physical Activity (DPA) policy was mandated that requires elementary school teachers to provide students with opportunities to achieve 30 min of physical activity during the school day. However, the implementation of school-based physical activity policies is influenced by many factors. A theoretical examination of the factors that impede and enhance teachers’ implementation of physical activity policies is necessary in order to develop strategies to improve policy practice and achieve desired outcomes. This study used the Theoretical Domains Framework (TDF) to understand teachers’ barriers and facilitators to the implementation of the DPA policy in one school district. Additionally, barriers and facilitators were examined and compared according to how the teacher implemented the DPA policy during the instructional school day. Methods Interviews were conducted with thirteen teachers and transcribed verbatim. One researcher performed barrier and facilitator extraction, with double extraction occurring across a third of the interview transcripts by a second researcher. A deductive and inductive analytical approach in a two-stage process was employed whereby barriers and facilitators were deductively coded using TDF domains (content analysis) and analyzed for sub-themes within each domain. Two researchers performed coding. Results A total of 832 items were extracted from the interview transcripts. Some items were coded into multiple TDF domains, resulting in a total of 1422 observations. The most commonly coded TDF domains accounting for 75% of the total were Environmental context and resources (ECR; n = 250), Beliefs about consequences (n = 225), Social influences (n = 193), Knowledge (n = 100), and Intentions (n = 88). Teachers who implemented DPA during instructional time differed from those who relied on non-instructional time in relation to Goals, Behavioural regulation, Social

  2. Presence of an SH2 domain in the actin-binding protein tensin.

    Science.gov (United States)

    Davis, S; Lu, M L; Lo, S H; Lin, S; Butler, J A; Druker, B J; Roberts, T M; An, Q; Chen, L B

    1991-05-03

    The molecular cloning of the complementary DNA coding for a 90-kilodalton fragment of tensin, an actin-binding component of focal contacts and other submembraneous cytoskeletal structures, is reported. The derived amino acid sequence revealed the presence of a Src homology 2 (SH2) domain. This domain is shared by a number of signal transduction proteins including nonreceptor tyrosine kinases such as Abl, Fps, Src, and Src family members, the transforming protein Crk, phospholipase C-gamma 1, PI-3 (phosphatidylinositol) kinase, and guanosine triphosphatase-activating protein (GAP). Like the SH2 domain found in Src, Crk, and Abl, the SH2 domain of tensin bound specifically to a number of phosphotyrosine-containing proteins from v-src-transformed cells. Tensin was also found to be phosphorylated on tyrosine residues. These findings suggest that by possessing both actin-binding and phosphotyrosine-binding activities and being itself a target for tyrosine kinases, tensin may link signal transduction pathways with the cytoskeleton.

  3. MEMOPS: data modelling and automatic code generation.

    Science.gov (United States)

    Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D

    2010-03-25

    In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources, and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability whereas science tends to constantly move, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces--APIs--currently in Python, C and Java) and data storage (in XML files or databases). For the individual project these libraries obviate the need for writing code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code will be presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology.
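    The Memops idea of generating access code from an abstract model can be sketched in miniature. A toy sketch, assuming a made-up dict-based model format (the real framework consumes UML descriptions and emits full Python/C/Java APIs): from an attribute description we build a class whose accessors do validity checking automatically, so no per-project parsing or checking code needs to be written by hand.

```python
# Hypothetical model description; a stand-in for the UML metadata definition.
MODEL = {
    "name": "Spectrum",
    "attributes": [
        {"name": "dimension", "type": int},
        {"name": "label", "type": str},
    ],
}

def generate_class(model):
    """Build a class with type-checked properties from the abstract model."""
    def make_property(attr):
        key, typ = attr["name"], attr["type"]
        def getter(self):
            return self.__dict__.get("_" + key)
        def setter(self, value):
            if not isinstance(value, typ):  # validity checking comes for free
                raise TypeError(f"{key} must be {typ.__name__}")
            self.__dict__["_" + key] = value
        return property(getter, setter)
    namespace = {a["name"]: make_property(a) for a in model["attributes"]}
    return type(model["name"], (), namespace)

Spectrum = generate_class(MODEL)
s = Spectrum()
s.dimension = 2      # accepted: matches the declared type
s.label = "HSQC"     # accepted
# s.dimension = "x"  # would raise TypeError, enforced by the generated code
```

    Because every class is derived from the one model description, the generated code stays internally consistent by construction, which is the property the abstract emphasises.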

  4. Ultra-high resolution coded wavefront sensor

    KAUST Repository

    Wang, Congli

    2017-06-08

    Wavefront sensors and more general phase retrieval methods have recently attracted a lot of attention in a host of application domains, ranging from astronomy to scientific imaging and microscopy. In this paper, we introduce a new class of sensor, the Coded Wavefront Sensor, which provides high spatio-temporal resolution using a simple masked sensor under white light illumination. Specifically, we demonstrate megapixel spatial resolution and phase accuracy better than 0.1 wavelengths at reconstruction rates of 50 Hz or more, thus opening up many new applications from high-resolution adaptive optics to real-time phase retrieval in microscopy.

  5. Building codes: An often overlooked determinant of health.

    Science.gov (United States)

    Chauvin, James; Pauls, Jake; Strobl, Linda

    2016-05-01

    Although the vast majority of the world's population spends most of their time in buildings, building codes are not often thought of as 'determinants of health'. The standards that govern the design, construction, and use of buildings affect our health, security, safety, and well-being. This is true for dwellings, schools, and universities, shopping centers, places of recreation, places of worship, health-care facilities, and workplaces. We urge proactive engagement by the global public health community in developing these codes, and in the design and implementation of health protection and health promotion activities intended to reduce the risk of injury, disability, and death, particularly when due to poor building code adoption/adaptation, application, and enforcement.

  6. RNA-Seq analysis of D. radiodurans finds non-coding RNAs expressed in response to radiation stress

    International Nuclear Information System (INIS)

    Gadewal, Nikhil; Mukhopadhyaya, Rita

    2015-01-01

    In bacteria, the discovery of functional RNA molecules that are not translated into protein, noncoding RNAs, became possible with the advent of Next Generation Sequencing technology. Bacterial non-coding RNAs are typically 50-300 nucleotides long and work as internal signals controlling various levels of gene expression. Deep sequencing of total cellular RNA captures all coding and noncoding transcripts with their differential levels of expression in the transcriptome. It provides a powerful approach to studying bacterial gene expression and mechanisms of gene regulation. We subjected the 3 h transcriptome of Deinococcus radiodurans R1 cells, post exposure to 6 kGy gamma radiation, to 100 x 2 cycles of deep sequencing on the Illumina HiSeq 2000 to look for ncRNA transcripts. The bioinformatics pipeline for analysis and interpretation of the RNA-Seq data was built in house using software available in the public domain. Our sequence data aligned with 21 putative ncRNAs expressed in the intergenic regions of the annotated genome of D. radiodurans. Verification of 2 ncRNA candidates and 3 transcription factor genes by Real Time PCR confirmed the presence of these transcripts in the 3 h transcriptome sequenced by us. Any relationship between ncRNAs and control of radiation-induced gene expression in D. radiodurans can be proved only after specific gene knock-outs in the future. (author)

  7. A role for chromatin topology in imprinted domain regulation.

    Science.gov (United States)

    MacDonald, William A; Sachani, Saqib S; White, Carlee R; Mann, Mellissa R W

    2016-02-01

    Recently, many advancements in genome-wide chromatin topology and nuclear architecture have unveiled the complex and hidden world of the nucleus, where chromatin is organized into discrete neighbourhoods with coordinated gene expression. This includes the active and inactive X chromosomes. Using X chromosome inactivation as a working model, we utilized publicly available datasets together with a literature review to gain insight into topologically associated domains, lamin-associated domains, nucleolar-associating domains, scaffold/matrix attachment regions, and nucleoporin-associated chromatin and their role in regulating monoallelic expression. Furthermore, we comprehensively review for the first time the role of chromatin topology and nuclear architecture in the regulation of genomic imprinting. We propose that chromatin topology and nuclear architecture are important regulatory mechanisms for directing gene expression within imprinted domains. Furthermore, we predict that dynamic changes in chromatin topology and nuclear architecture play roles in tissue-specific imprint domain regulation during early development and differentiation.

  8. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes....
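    As a reminder, the CSS construction invoked here has a compact parameter statement (the standard result, stated generically rather than for the toric case): a classical linear code containing its dual yields a quantum stabilizer code.

```latex
% CSS construction: a classical [n, k, d]_q code C with C^\perp \subseteq C
% yields a quantum stabilizer code with the parameters shown.
\[
  C^{\perp} \subseteq C \subseteq \mathbb{F}_q^{\,n}
  \quad\Longrightarrow\quad
  [[\,n,\; 2k - n,\; \geq d\,]]_q
\]
```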

  9. Performance of synthetic antiferromagnetic racetrack memory: domain wall versus skyrmion

    International Nuclear Information System (INIS)

    Tomasello, R; Puliafito, V; Martinez, E; Manchon, A; Ricci, M; Carpentieri, M; Finocchio, G

    2017-01-01

    A storage scheme based on racetrack memory, where the information can be coded in a domain or a skyrmion, seems to be an alternative to conventional hard disk drive for high density storage. Here, we perform a full micromagnetic study of the performance of synthetic antiferromagnetic (SAF) racetrack memory in terms of velocity and sensitivity to defects by using experimental parameters. We find that, to stabilize a SAF skyrmion, the Dzyaloshinskii–Moriya interaction in the top and the bottom ferromagnet should have an opposite sign. The velocity of SAF skyrmions and SAF Néel domain walls are of the same order and can reach values larger than 1200 m s −1 if a spin–orbit torque from the spin-Hall effect with opposite sign is applied to both ferromagnets. The presence of disordered anisotropy in the form of randomly distributed grains introduces a threshold current for both SAF skyrmion and SAF domain wall motion. (paper)

  10. Performance of synthetic antiferromagnetic racetrack memory: domain wall versus skyrmion

    KAUST Repository

    Tomasello, R

    2017-06-20

    A storage scheme based on racetrack memory, where the information can be coded in a domain or a skyrmion, seems to be an alternative to conventional hard disk drive for high density storage. Here, we perform a full micromagnetic study of the performance of synthetic antiferromagnetic (SAF) racetrack memory in terms of velocity and sensitivity to defects by using experimental parameters. We find that, to stabilize a SAF skyrmion, the Dzyaloshinskii–Moriya interaction in the top and the bottom ferromagnet should have an opposite sign. The velocity of SAF skyrmions and SAF Néel domain walls are of the same order and can reach values larger than 1200 m s−1 if a spin–orbit torque from the spin-Hall effect with opposite sign is applied to both ferromagnets. The presence of disordered anisotropy in the form of randomly distributed grains introduces a threshold current for both SAF skyrmion and SAF domain wall motion.

  11. Is Self-Regulation Sufficient? Case of the German Transparency Code

    Directory of Open Access Journals (Sweden)

    Kristin Buske

    2016-02-01

    Full Text Available The German pharmaceutical industry is stepping ahead with its implementation of a new transparency disclosure code for cooperation between pharmaceutical companies and health care professionals (HCPs) and health care organisations (HCOs). In Germany, this transparency code (“Transparenzkodex”) has been applicable since January 2015, and data will be publicly available around mid-2016. No empirical work has been done that addresses the impact of the transparency code on cooperation between HCPs, HCOs and the pharmaceutical companies, including the possibilities of competitive analysis of the available data. In this paper, we interviewed experts from 11 pharmaceutical companies, ranging from small and medium-sized firms to multinational corporations, which together represent 80% of the German pharmaceutical market. Besides the interviews, we designed a game to evaluate possible financial investments in key opinion leaders. The market can be regarded as a zero-sum game. Because the code allows public identification of key HCPs and HCOs, the amount spent on them might increase rather than decrease. In this way, the transparency code may foster more, not less, spending; in our simulation game, the financial investment in marketing to key HCPs and HCOs exceeded sustainable limits.

  12. Joint ICTP-IAEA advanced workshop on model codes for spallation reactions

    International Nuclear Information System (INIS)

    Filges, D.; Leray, S.; Yariv, Y.; Mengoni, A.; Stanculescu, A.; Mank, G.

    2008-08-01

    The International Atomic Energy Agency (IAEA) and the Abdus Salam International Centre for Theoretical Physics (ICTP) organised an expert meeting at the ICTP from 4 to 8 February 2008 to discuss model codes for spallation reactions. These nuclear reactions play an important role in a wide domain of applications ranging from neutron sources for condensed matter and material studies, transmutation of nuclear waste and rare isotope production to astrophysics, simulation of detector set-ups in nuclear and particle physics experiments, and radiation protection near accelerators or in space. The simulation tools developed for these domains use nuclear model codes to compute the production yields and characteristics of all the particles and nuclei generated in these reactions. These codes are generally Monte-Carlo implementations of Intra-Nuclear Cascade (INC) or Quantum Molecular Dynamics (QMD) models, followed by de-excitation (principally evaporation/fission) models. Experts have discussed in depth the physics contained within the different models in order to understand their strengths and weaknesses. Such codes need to be validated against experimental data in order to determine their accuracy and reliability with respect to all forms of application. Agreement was reached during the course of the workshop to organise an international benchmark of the different models developed by different groups around the world. The specifications of the benchmark, including the set of selected experimental data to be compared to the models, were also defined during the workshop. The benchmark will be organised under the auspices of the IAEA in 2008, and the first results will be discussed at the next Accelerator Applications Conference (AccApp'09) to be held in Vienna in May 2009. (author)

  13. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary consisted of 11 files, one for the organ code and the others for the pathology code. The organ code was obtained by typing the organ name or the code number itself; the upper- and lower-level codes of the selected entry were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. The proper pathology code was then obtained in the same fashion as the organ code selection. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into other data-processing programs was possible. This program has the merits of simple operation, accurate and detailed coding, and easy adjustment for other programs. Therefore, it can be used for automation of routine work in the department of radiology
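
    The two-level lookup described above (organ code first, then a pathology file selected by the organ code's first digit) can be sketched as follows; the dictionary entries and function name are invented for illustration and are not the authors' FoxBASE implementation.

```python
# Hypothetical sketch of the two-level ACR lookup: the organ code is chosen
# first, the pathology file is then selected by the organ code's first digit,
# and the two parts are joined with a dot, e.g. '131.3661'.
# All dictionary entries below are invented for illustration.

ORGAN_CODES = {"chest": "131"}            # hypothetical organ dictionary file
PATHOLOGY_FILES = {                       # one pathology file per leading digit
    "1": {"tuberculoma": "3661"},         # hypothetical pathology entries
}

def acr_code(organ_name: str, pathology_name: str) -> str:
    """Return a combined ACR-style code 'organ.pathology'."""
    organ = ORGAN_CODES[organ_name]
    pathology_file = PATHOLOGY_FILES[organ[0]]   # file chosen by first digit
    return f"{organ}.{pathology_file[pathology_name]}"

print(acr_code("chest", "tuberculoma"))   # -> 131.3661
```

Because the mapping is table-driven, decoding is the same lookup run in reverse, which matches the abstract's point that a single program can both code and decode.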

  14. A Parallel Numerical Micromagnetic Code Using FEniCS

    Science.gov (United States)

    Nagy, L.; Williams, W.; Mitchell, L.

    2013-12-01

    Many problems in the geosciences depend on understanding the ability of magnetic minerals to provide stable paleomagnetic recordings. Numerical micromagnetic modelling allows us to calculate the domain structures found in naturally occurring magnetic materials. However the computational cost rises exceedingly quickly with respect to the size and complexity of the geometries that we wish to model. This problem is compounded by the fact that the modern processor design no longer focuses on the speed at which calculations are performed, but rather on the number of computational units amongst which we may distribute our calculations. Consequently to better exploit modern computational resources our micromagnetic simulations must "go parallel". We present a parallel and scalable micromagnetics code written using FEniCS. FEniCS is a multinational collaboration involving several institutions (University of Cambridge, University of Chicago, The Simula Research Laboratory, etc.) that aims to provide a set of tools for writing scientific software; in particular software that employs the finite element method. The advantages of this approach are the leveraging of pre-existing projects from the world of scientific computing (PETSc, Trilinos, Metis/Parmetis, etc.) and exposing these so that researchers may pose problems in a manner closer to the mathematical language of their domain. Our code provides a scriptable interface (in Python) that allows users to not only run micromagnetic models in parallel, but also to perform pre/post processing of data.

  15. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system. However, subsequent versions of the code were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of a posteriori V and V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: Reason(s) why a posteriori verification is to be performed; Scope and objectives for the level of verification selected; Development products to be used for the review; Availability and use of user experience; and Actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. 
The development products that were used

  16. Is Ontario Moving to Provincial Negotiation of Teaching Contracts?

    Science.gov (United States)

    Jefferson, Anne L.

    2008-01-01

    In Canada, the statutes governing public school teachers' collective bargaining are a combination of the provincial Labour Relations Act or Code and the respective provincial Education/School/Public Schools Act. As education is within the provincial, not federal, domain of legal responsibility, the specifics of each act or code can vary…

  17. Fractal image coding by an approximation of the collage error

    Science.gov (United States)

    Salih, Ismail; Smith, Stanley H.

    1998-12-01

    In fractal image compression an image is coded as a set of contractive transformations which, when iteratively applied to any initial image, are guaranteed to generate an approximation to the original image. In this paper we present a method for mapping similar regions within an image by an approximation of the collage error; that is, range blocks are approximated by a linear combination of domain blocks.
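
    The collage-error idea can be made concrete with a small numerical sketch (our illustration, not the authors' algorithm): a range block is fitted by a least-squares linear combination of candidate domain blocks, and the residual norm approximates the collage error.

```python
import numpy as np

# Sketch: approximate a range block by a least-squares linear combination
# of a few candidate domain blocks; the residual norm plays the role of
# the approximated collage error. All block data here is synthetic.

rng = np.random.default_rng(0)
n = 64                                   # pixels per flattened 8x8 block
domain = rng.standard_normal((n, 4))     # 4 candidate domain blocks (columns)

# Build a range block that is nearly a combination of the domain blocks.
true_w = np.array([0.5, -1.0, 0.25, 2.0])
range_block = domain @ true_w + 0.01 * rng.standard_normal(n)

# Least-squares fit of the combination weights.
w, *_ = np.linalg.lstsq(domain, range_block, rcond=None)
collage_error = np.linalg.norm(domain @ w - range_block)

print(collage_error)  # small: the range block is well covered by the domain
```

In an encoder, such residuals would be compared across candidate domain blocks and the smallest one kept as the mapping for that range block.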

  18. DCOMP Award Lecture (Metropolis): A 3D Spectral Anelastic Hydrodynamic Code for Shearing, Stratified Flows

    Science.gov (United States)

    Barranco, Joseph

    2006-03-01

    We have developed a three-dimensional (3D) spectral hydrodynamic code to study vortex dynamics in rotating, shearing, stratified systems (e.g., the atmospheres of gas giant planets, protoplanetary disks around newly forming protostars). The time-independent background state is stably stratified in the vertical direction and has a unidirectional linear shear flow aligned with one horizontal axis. Superposed on this background state is an unsteady, subsonic flow that is evolved with the Euler equations subject to the anelastic approximation to filter acoustic phenomena. A Fourier-Fourier basis in a set of quasi-Lagrangian coordinates that advect with the background shear is used for spectral expansions in the two horizontal directions. For the vertical direction, two different sets of basis functions have been implemented: (1) Chebyshev polynomials on a truncated, finite domain, and (2) rational Chebyshev functions on an infinite domain. Use of this latter set is equivalent to transforming the infinite domain to a finite one with a cotangent mapping, and using cosine and sine expansions in the mapped coordinate. The nonlinear advection terms are time integrated explicitly, whereas the Coriolis force, buoyancy terms, and pressure/enthalpy gradient are integrated semi-implicitly. We show that internal gravity waves can be damped by adding new terms to the Euler equations. The code exhibits excellent parallel performance with the Message Passing Interface (MPI). As a demonstration of the code, we simulate vortex dynamics in protoplanetary disks and the Kelvin-Helmholtz instability in the dusty midplanes of protoplanetary disks.
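
    The cotangent mapping mentioned above can be illustrated with a short sketch (our own, with an arbitrary map parameter L and test function): the infinite vertical coordinate y is mapped to t in (0, pi) via y = L cot t, so a smooth function that decays at infinity becomes a smooth function of t that a truncated cosine series captures to spectral accuracy.

```python
import numpy as np

# Sketch of the cotangent mapping: y = L*cot(t) sends the infinite line to
# t in (0, pi). A decaying function of y becomes smooth in t with vanishing
# derivatives at the endpoints, so a cosine series converges spectrally.
# L and the test function are arbitrary choices for this illustration.

L = 2.0
f = lambda y: np.exp(-y**2)             # decays as |y| -> infinity

M = 256
t = (np.arange(M) + 0.5) * np.pi / M    # midpoint grid on (0, pi)
g = f(L / np.tan(t))                    # f expressed in the mapped coordinate

# Cosine-series coefficients by midpoint quadrature.
N = 64
a = np.array([(2.0 / M) * np.sum(g * np.cos(k * t)) for k in range(N)])
a[0] /= 2.0                             # a0/2 convention for the mean term

# Reconstruct on the same grid and measure the truncation error.
g_rec = sum(a[k] * np.cos(k * t) for k in range(N))
print(np.max(np.abs(g - g_rec)))        # tiny for smooth, decaying f
```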

  19. Side Information and Noise Learning for Distributed Video Coding using Optical Flow and Clustering

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Rakêt, Lars Lau; Huang, Xin

    2012-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The coding efficiency of DVC critically depends on the quality of side information generation and the accuracy of noise modeling. This paper considers Transform Domain Wyner-Ziv (TDWZ) coding and proposes using optical flow to improve side information generation and clustering to improve noise modeling. The optical flow technique is exploited at the decoder side to compensate for weaknesses of block based methods, when using motion-compensation to generate side information frames. Clustering is introduced to capture cross band correlation and increase local adaptivity in the noise modeling. This paper also proposes techniques to learn from previously decoded (WZ) frames. Different techniques are combined by calculating a number of candidate soft side…

  20. Regular periodical public disclosure obligations of public companies

    Directory of Open Access Journals (Sweden)

    Marjanski Vladimir

    2011-01-01

    Full Text Available Public companies, in their capacity as capital market participants, have the obligation to inform the public about their legal and financial status, their general business operations, and the issuance of securities and other financial instruments. Such obligations may be divided into two groups: the first group consists of regular periodical public disclosures, such as the publication of financial reports (annual, semi-annual and quarterly) and the management's reports on the public company's business operations. The second group comprises the obligation of occasional (ad hoc) public disclosure. The thesis analyses the obligation of public companies to inform the public in the course of their regular reporting. The new Capital Market Law, based on two EU Directives (the Transparency Directive and the Directive on Public Disclosure of Inside Information and the Definition of Market Manipulation), regulates this obligation of public companies in substantially more detail than the prior Law on the Market of Securities and Other Financial Instruments (hereinafter: ZTHV). For this reason, the ZTHV's provisions are compared to the new solutions of the Capital Market Law within the domain of regular periodical disclosure.

  1. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2 s, 6 s and 18 s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, multiple-coding and single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5 s, the geometric mean of 2 s and 6 s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.
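
    The two hypotheses can be written down as tiny rule tables (our illustrative encoding of the logic, not the authors' model): multiple-coding stores one response rule per trained sample duration, while single-code/default stores a 2-s rule plus a default that captures everything else.

```python
# Illustrative rule tables for the two coding hypotheses.

def multiple_coding(sample_s):
    """One response rule per trained sample; untrained input -> no rule."""
    rules = {2: "short_key", 6: "long_key", 18: "long_key"}
    return rules.get(sample_s)                # None: no defined preference

def single_code_default(sample_s):
    """A rule for the 2-s sample plus a default rule for anything else."""
    return "short_key" if sample_s == 2 else "long_key"

# Generalization test at 3.5 s (the geometric mean of 2 s and 6 s): only
# the single-code/default hypothesis yields a definite "long key" choice,
# matching the observed preference reported above.
print(multiple_coding(3.5), single_code_default(3.5))
```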

  2. Development of the three dimensional flow model in the SPACE code

    International Nuclear Information System (INIS)

    Oh, Myung Taek; Park, Chan Eok; Kim, Shin Whan

    2014-01-01

    SPACE (Safety and Performance Analysis CodE) is a nuclear plant safety analysis code, which has been developed in the Republic of Korea through a joint research between the Korean nuclear industry and research institutes. The SPACE code has been developed with multi-dimensional capabilities as a requirement of the next generation safety code. It allows users to more accurately model the multi-dimensional flow behavior that can be exhibited in components such as the core, lower plenum, upper plenum and downcomer region. Based on generalized models, the code can model any configuration or type of fluid system. All the geometric quantities of mesh are described in terms of cell volume, centroid, face area, and face center, so that it can naturally represent not only the one dimensional (1D) or three dimensional (3D) Cartesian system, but also the cylindrical mesh system. It is possible to simulate large and complex domains by modelling the complex parts with a 3D approach and the rest of the system with a 1D approach. By 1D/3D co-simulation, more realistic conditions and component models can be obtained, providing a deeper understanding of complex systems, and it is expected to overcome the shortcomings of 1D system codes. (author)
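
    The generalized geometric description above (cell volume, centroid, face area, face center) can be sketched as a minimal data structure; the field names and the example pipe cell below are our own illustration, not the SPACE code's internals.

```python
from dataclasses import dataclass, field

# Minimal sketch of a mesh description in which every cell carries only
# volume, centroid and a list of faces (area + center), so the same
# structure can represent 1D pipe cells, 3D Cartesian or cylindrical cells.

@dataclass
class Face:
    area: float          # face area, m^2
    center: tuple        # (x, y, z) of the face center

@dataclass
class Cell:
    volume: float        # cell volume, m^3
    centroid: tuple      # (x, y, z) of the cell centroid
    faces: list = field(default_factory=list)

# A 1D pipe segment, 0.1 m long with 0.01 m^2 flow area: two axial faces.
pipe = Cell(volume=0.1 * 0.01, centroid=(0.0, 0.0, 0.05),
            faces=[Face(0.01, (0.0, 0.0, 0.0)), Face(0.01, (0.0, 0.0, 0.1))])
print(len(pipe.faces), pipe.volume)
```

Because a solver that only reads these quantities never needs to know the mesh topology that produced them, 1D and 3D regions can coexist in one model, which is the essence of the 1D/3D co-simulation described above.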

  3. Ethical Principles of Psychologists and Code of Conduct.

    Science.gov (United States)

    American Psychologist, 2002

    2002-01-01

    Describes the American Psychological Association's Ethical Principles of Psychologists and Code of Conduct, focusing on introduction and applicability; preamble; general principles; and ethical standards (resolving ethical issues, competence, human relations, privacy and confidentiality, advertising and other public statements, record keeping and…

  4. Code Cactus; Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behaviour of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flowrate distribution among parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flowrate conditions, constant or variable in time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behaviour, is a one-dimensional, multi-channel code, and has as its complement FLID, a one-channel, two-dimensional code. (authors)

  5. A Psychoacoustic-Based Multiple Audio Object Coding Approach via Intra-Object Sparsity

    Directory of Open Access Journals (Sweden)

    Maoshen Jia

    2017-12-01

    Full Text Available Rendering spatial sound scenes via audio objects has become popular in recent years, since it can provide more flexibility for different auditory scenarios, such as 3D movies, spatial audio communication and virtual classrooms. To facilitate high-quality bitrate-efficient distribution of spatial audio objects, an encoding scheme based on intra-object sparsity (approximate k-sparsity of the audio object itself) is proposed in this paper. A statistical analysis is presented to validate the notion that an audio object has stronger sparseness in the Modified Discrete Cosine Transform (MDCT) domain than in the Short Time Fourier Transform (STFT) domain. By exploiting intra-object sparsity in the MDCT domain, multiple simultaneously occurring audio objects are compressed into a mono downmix signal with side information. To ensure a balanced perceptual quality of audio objects, a psychoacoustic-based time-frequency instants sorting algorithm and an energy-equalized Number of Preserved Time-Frequency Bins (NPTF) allocation strategy are proposed, which are employed in the underlying compression framework. The downmix signal can be further encoded via the Scalar Quantized Vector Huffman Coding (SQVH) technique at a desirable bitrate, and the side information is transmitted in a lossless manner. Both objective and subjective evaluations show that the proposed encoding scheme outperforms the Sparsity Analysis (SPA) approach and Spatial Audio Object Coding (SAOC) in cases where eight objects were jointly encoded.
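
    The energy-compaction notion behind intra-object sparsity can be demonstrated with a small sketch (ours, using a plain DCT-II as a simple stand-in for the MDCT and a synthetic two-tone signal): far fewer transform coefficients than time samples are needed to retain 95% of the energy.

```python
import numpy as np

# Sketch: compare how many coefficients carry 95% of the energy of a
# two-tone signal in the time domain versus a DCT-II transform domain
# (a simple stand-in for the MDCT here). The signal is synthetic.

fs = 8000
t = np.arange(512) / fs
x = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)

N = x.size
kk = np.arange(N)[:, None]
nn = np.arange(N)[None, :]
X = np.cos(np.pi * kk * (2 * nn + 1) / (2 * N)) @ x   # DCT-II (unnormalized)

def coeffs_for_energy(c, frac=0.95):
    """Smallest count of largest-magnitude coefficients holding `frac` energy."""
    e = np.sort(np.asarray(c, dtype=float) ** 2)[::-1]
    return int(np.searchsorted(np.cumsum(e), frac * e.sum()) + 1)

k_time = coeffs_for_energy(x)
k_dct = coeffs_for_energy(X)
print(k_time, k_dct)   # the DCT domain needs far fewer coefficients
```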

  6. BEAVRS full core burnup calculation in hot full power condition by RMC code

    International Nuclear Information System (INIS)

    Liu, Shichang; Liang, Jingang; Wu, Qu; Guo, JuanJuan; Huang, Shanfang; Tang, Xiao; Li, Zeguang; Wang, Kan

    2017-01-01

    Highlights: • TMS and thermal scattering interpolation were developed to treat cross sections OTF. • Hybrid coupling system was developed for HFP burnup calculation of BEAVRS benchmark. • Domain decomposition was applied to handle memory problem of full core burnup. • Critical boron concentration with burnup by RMC agrees with the benchmark results. • RMC is capable of multi-physics coupling for simulations of nuclear reactors in HFP. - Abstract: The Monte Carlo method can provide high fidelity neutronics analysis of different types of nuclear reactors, owing to its advantages of flexible geometry modeling and the use of continuous-energy nuclear cross sections. However, nuclear reactors are complex systems with multiple interacting and coupled physics. MC codes can be coupled with depletion solvers and thermal-hydraulics (T/H) codes for “transport-burnup-thermal-hydraulics” coupling calculations. MIT BEAVRS is a typical “transport-burnup-thermal-hydraulics” coupling benchmark. In this paper, RMC was coupled with the sub-channel code COBRA, equipped with on-the-fly temperature-dependent cross section treatment and large-scale detailed burnup calculation based on domain decomposition. RMC was then applied to the full core burnup calculations of the BEAVRS benchmark in hot full power (HFP) condition. The numerical tests show that the domain decomposition method achieves results consistent with the original version of RMC while enlarging the tractable number of burnup regions. The HFP results of RMC agree well with the reference values of the BEAVRS benchmark and also with those of MC21. This work proves the feasibility and accuracy of RMC in multi-physics coupling and lifecycle simulations of nuclear reactors.

  7. Domain-Specific Acceleration and Auto-Parallelization of Legacy Scientific Code in FORTRAN 77 using Source-to-Source Compilation

    OpenAIRE

    Vanderbauwhede, Wim; Davidson, Gavin

    2017-01-01

    Massively parallel accelerators such as GPGPUs, manycores and FPGAs represent a powerful and affordable tool for scientists who look to speed up simulations of complex systems. However, porting code to such devices requires a detailed understanding of heterogeneous programming tools and effective strategies for parallelization. In this paper we present a source to source compilation approach with whole-program analysis to automatically transform single-threaded FORTRAN 77 legacy code into Ope...

  8. Numerical modeling of time domain 3-D problems in accelerator physics

    International Nuclear Information System (INIS)

    Harfoush, F.A.; Jurgens, T.G.

    1990-06-01

    Time domain analysis is relevant in particle accelerators to study the electromagnetic field interaction of a moving source particle on a lagging test particle as the particles pass an accelerating cavity or some other structure. These fields are called wake fields. The travelling beam inside a beam pipe may undergo more complicated interactions with its environment due to the presence of other irregularities like wires, thin slots, joints and other types of obstacles. Analytical solution of such problems is impossible and one has to resort to numerical methods. In this paper we present results of our first attempt to model these problems in 3-D using our finite difference time domain (FDTD) code. 10 refs., 9 figs
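
    A minimal 1D FDTD sketch (our illustration of the method's mechanics, not the authors' 3-D code) shows the leapfrog update on staggered E and H fields with a soft Gaussian source, in normalized free-space units with the "magic" time step c·Δt = Δz.

```python
import numpy as np

# 1D FDTD (Yee) sketch: E and H live on staggered grids and are updated
# in leapfrog fashion; a soft Gaussian source launches a pulse that then
# propagates along the grid. Units are normalized so the update
# coefficients are 1 (magic time step c*dt = dz).

nz, nt = 200, 150
ez = np.zeros(nz)          # electric field at integer grid points
hy = np.zeros(nz - 1)      # magnetic field at half grid points

for step in range(nt):
    hy += np.diff(ez)                          # dH/dt from spatial curl of E
    ez[1:-1] += np.diff(hy)                    # dE/dt from spatial curl of H
    ez[20] += np.exp(-((step - 30) / 10) ** 2)  # soft Gaussian source at z=20

# After nt steps the pulse has propagated well away from the source cell.
print(np.argmax(np.abs(ez)))
```

The fixed ez[0] and ez[-1] values act as perfectly conducting walls; in a wake-field code these would be replaced by absorbing or structure-conforming boundaries.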

  9. Code of practice in industrial radiography

    International Nuclear Information System (INIS)

    Karma, S. E. M.

    2010-12-01

    The aim of this research is to develop a draft for a new radiation protection code of practice in industrial radiography that builds on the one issued in 1998 and meets current international recommendations. Another aim of this study was to assess the current situation of radiation protection in some of the industrial radiography departments in Sudan. To achieve these aims, a draft code of practice was developed based on relevant international and local recommendations. The developed code includes the following main issues: regulatory responsibilities, the radiation protection programme, and the design of radiation installations. The practical part of this study included scientific visits to two industrial radiography departments in Sudan to assess their degree of compliance with what is stated in the developed code. The results of the scientific visits revealed that most of the departments do not have an effective radiation protection programme, which could lead to workers and the public being exposed to unnecessary doses. Some recommendations were stated which, if implemented, could improve the status of radiation protection in industrial radiography departments. (Author)

  10. The PP1 binding code: a molecular-lego strategy that governs specificity.

    Science.gov (United States)

    Heroes, Ewald; Lesage, Bart; Görnemann, Janina; Beullens, Monique; Van Meervelt, Luc; Bollen, Mathieu

    2013-01-01

    Ser/Thr protein phosphatase 1 (PP1) is a single-domain hub protein with nearly 200 validated interactors in vertebrates. PP1-interacting proteins (PIPs) are ubiquitously expressed but show an exceptional diversity in brain, testis and white blood cells. The binding of PIPs is mainly mediated by short motifs that dock to surface grooves of PP1. Although PIPs often contain variants of the same PP1 binding motifs, they differ in the number and combination of docking sites. This molecular-lego strategy for binding to PP1 creates holoenzymes with unique properties. The PP1 binding code can be described as specific, universal, degenerate, nonexclusive and dynamic. PIPs control associated PP1 by interference with substrate recruitment or access to the active site. In addition, some PIPs have a subcellular targeting domain that promotes dephosphorylation by increasing the local concentration of PP1. The diversity of the PP1 interactome and the properties of the PP1 binding code account for the exquisite specificity of PP1 in vivo. © 2012 The Authors Journal compilation © 2012 FEBS.

  11. Assessment of damage domains of the High-Temperature Engineering Test Reactor (HTTR)

    International Nuclear Information System (INIS)

    Flores, Alain; Izquierdo, José María; Tuček, Kamil; Gallego, Eduardo

    2014-01-01

    Highlights: • We developed an adequate model for the identification of damage domains of the HTTR. • We analysed an anticipated operational transient, using the HTTR5+/GASTEMP code. • We simulated several transients of the same sequence. • We identified the corresponding damage domains using two methods. • We calculated exceedance frequency using the two methods. - Abstract: This paper presents an assessment analysis of damage domains of the 30 MW(th) prototype High-Temperature Engineering Test Reactor (HTTR) operated by the Japan Atomic Energy Agency (JAEA). For this purpose, an in-house deterministic risk assessment computational tool was developed based on the Theory of Stimulated Dynamics (TSD). To illustrate the methodology and applicability of the developed modelling approach, assessment results of a control rod (CR) withdrawal accident during subcritical conditions are presented and compared with those obtained by the JAEA.

  12. Health domains for sale: the need for global health Internet governance.

    Science.gov (United States)

    Mackey, Tim Ken; Liang, Bryan A; Kohler, Jillian C; Attaran, Amir

    2014-03-05

    A debate on Internet governance for health, or "eHealth governance", is emerging with the impending award of a new dot-health (.health) generic top-level domain name (gTLD) along with a host of other health-related domains. This development is critical as it will shape the future of the health Internet, allowing largely unrestricted use of .health second-level domain names by future registrants, raising concerns about the potential for privacy, use and marketing of health-related information, credibility of online health content, and potential for Internet fraud and abuse. Yet, prospective .health gTLD applicants do not provide adequate safeguards for use of .health or related domains and have few or no ties to the global health community. If approved, one of these for-profit corporate applicants would effectively control the future of the .health address on the Internet with arguably no active oversight from important international public health stakeholders. This would represent a lost opportunity for the public health, medical, and broader health community in establishing a trusted, transparent and reliable source for health on the Internet. Countries, medical associations, civil society, and consumer advocates have objected to these applications on grounds that they do not meet the public interest. We argue that there is an immediate need for action to postpone awarding of the .health gTLD and other health-related gTLDs to address these concerns and ensure the appropriate development of sound eHealth governance rules, principles, and use. This would support the crucial need of ensuring access to quality and evidence-based sources of health information online, as well as establishing a safe and reliable space on the Internet for health. We believe, if properly governed, .health and other domains could represent such a promise in the future.

  13. Combined spatial/angular domain decomposition SN algorithms for shared memory parallel machines

    International Nuclear Information System (INIS)

    Hunter, M.A.; Haghighat, A.

    1993-01-01

    Several parallel processing algorithms based on spatial and angular domain decomposition methods are developed and incorporated into a two-dimensional discrete ordinates transport theory code. These algorithms divide the spatial and angular domains into independent subdomains so that the flux calculations within each subdomain can be processed simultaneously. Two spatial parallel algorithms (Block-Jacobi, red-black), one angular parallel algorithm (η-level), and their combinations are implemented on an eight processor CRAY Y-MP. Parallel performances of the algorithms are measured using a series of fixed source RZ geometry problems. Some of the results are also compared with those executed on an IBM 3090/600J machine. (orig.)
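
    The red-black idea can be sketched in a few lines (an illustrative serial emulation, not the CRAY implementation): checkerboard coloring makes all same-color updates mutually independent, so each half of a Gauss-Seidel sweep for a Laplace model problem can be done as one vectorized (parallelizable) operation.

```python
import numpy as np

# Red-black Gauss-Seidel sketch for the 2D Laplace equation: points are
# colored like a checkerboard, so all "red" updates touch only "black"
# neighbors and are independent of each other (and vice versa). Each
# half-sweep below is a single vectorized update standing in for a
# parallel update of one color class.

n = 32
u = np.zeros((n, n))
u[0, :] = 1.0                                   # fixed boundary values
red = (np.add.outer(np.arange(n), np.arange(n)) % 2) == 0

def half_sweep(u, mask):
    avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                  + np.roll(u, 1, 1) + np.roll(u, -1, 1))
    interior = np.zeros_like(mask)
    interior[1:-1, 1:-1] = mask[1:-1, 1:-1]     # never touch the boundary
    u[interior] = avg[interior]

for _ in range(500):
    half_sweep(u, red)      # all red points update independently
    half_sweep(u, ~red)     # then all black points

print(u[n // 2, n // 2])    # interior relaxes toward the boundary data
```

For the unit-square problem with one hot side, the center value of the converged solution is 1/4 by symmetry, which the iteration approaches.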

  14. Td4IN2: A drought-responsive durum wheat (Triticum durum Desf.) gene coding for a resistance like protein with serine/threonine protein kinase, nucleotide binding site and leucine rich domains.

    Science.gov (United States)

    Rampino, Patrizia; De Pascali, Mariarosaria; De Caroli, Monica; Luvisi, Andrea; De Bellis, Luigi; Piro, Gabriella; Perrotta, Carla

    2017-11-01

    Wheat, the main food source for a third of world population, appears strongly under threat because of predicted increasing temperatures coupled to drought. Plant complex molecular response to drought stress relies on the gene network controlling cell reactions to abiotic stress. In the natural environment, plants are subjected to the combination of abiotic and biotic stresses. Also the response of plants to biotic stress, to cope with pathogens, involves the activation of a molecular network. Investigations on combination of abiotic and biotic stresses indicate the existence of cross-talk between the two networks and a kind of overlapping can be hypothesized. In this work we describe the isolation and characterization of a drought-related durum wheat (Triticum durum Desf.) gene, identified in a previous study, coding for a protein combining features of NBS-LRR type resistance protein with a S/TPK domain, involved in drought stress response. This is one of the few examples reported where all three domains are present in a single protein and, to our knowledge, it is the first report on a gene specifically induced by drought stress and drought-related conditions, with this particular structure. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  15. openQ*D simulation code for QCD+QED

    Science.gov (United States)

    Campos, Isabel; Fritzsch, Patrick; Hansen, Martin; Krstić Marinković, Marina; Patella, Agostino; Ramos, Alberto; Tantalo, Nazario

    2018-03-01

    The openQ*D code for the simulation of QCD+QED with C* boundary conditions is presented. This code is based on openQCD-1.6, from which it inherits the core features that ensure its efficiency: the locally-deflated SAP-preconditioned GCR solver, the twisted-mass frequency splitting of the fermion action, the multilevel integrator, the 4th order OMF integrator, the SSE/AVX intrinsics, etc. The photon field is treated as fully dynamical and C* boundary conditions can be chosen in the spatial directions. We discuss the main features of openQ*D, and we show basic test results and performance analysis. An alpha version of this code is publicly available and can be downloaded from http://rcstar.web.cern.ch/.

  16. Non-Coding Transcript Heterogeneity in Mesothelioma: Insights from Asbestos-Exposed Mice.

    Science.gov (United States)

    Felley-Bosco, Emanuela; Rehrauer, Hubert

    2018-04-11

    Mesothelioma is an aggressive, rapidly fatal cancer, and a better understanding of its molecular heterogeneity may help in devising more efficient therapeutic strategies. Non-coding RNAs represent a large part of the transcriptome, but their contribution to disease is not yet fully understood. We used recently obtained RNA-seq data from asbestos-exposed mice and performed data mining of publicly available datasets in order to evaluate how non-coding RNAs contribute to mesothelioma heterogeneity. Nine non-coding RNAs are specifically elevated in mesothelioma tumors and contribute to human mesothelioma heterogeneity. Because some of them have known oncogenic properties, this study supports the concept of non-coding RNAs as cancer progenitor genes.

  17. 28 CFR 36.608 - Guidance concerning model codes.

    Science.gov (United States)

    2010-07-01

    ... Section 36.608 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION ON THE BASIS OF DISABILITY BY PUBLIC ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building... private entity responsible for developing a model code, the Assistant Attorney General may review the...

  18. Codes of practice and related issues in biomedical waste management

    Energy Technology Data Exchange (ETDEWEB)

    Moy, D.; Watt, C. [Griffith Univ. (Australia)]

    1996-12-31

    This paper outlines the development of a National Code of Practice for biomedical waste management in Australia. The 10 key areas addressed by the code are industry mission statement; uniform terms and definitions; community relations - public perceptions and right to know; generation, source separation, and handling; storage requirements; transportation; treatment and disposal; disposal of solid and liquid residues and air emissions; occupational health and safety; staff awareness and education. A comparison with other industry codes in Australia is made. A list of outstanding issues is also provided; these include the development of standard containers, treatment effectiveness, and reusable sharps containers.

  19. Domains and domain loss

    DEFF Research Database (Denmark)

    Haberland, Hartmut

    2005-01-01

    politicians and in the media, especially in the discussion whether some languages undergo ‘domain loss’ vis-à-vis powerful international languages like English. An objection that has been raised here is that domains, as originally conceived, are parameters of language choice and not properties of languages...

  20. A Fast, Efficient Domain Adaptation Technique for Cross-Domain Electroencephalography (EEG)-Based Emotion Recognition

    Directory of Open Access Journals (Sweden)

    Xin Chai

    2017-05-01

    Electroencephalography (EEG)-based emotion recognition is an important element in psychiatric health diagnosis for patients. However, the underlying EEG sensor signals are always non-stationary if they are sampled from different experimental sessions or subjects. This results in deterioration of the classification performance. Domain adaptation methods offer an effective way to reduce the discrepancy of marginal distributions. However, for EEG sensor signals, both marginal and conditional distributions may be mismatched. In addition, existing domain adaptation strategies always require a high level of additional computation. To address this problem, a novel strategy named adaptive subspace feature matching (ASFM) is proposed in this paper in order to integrate both the marginal and conditional distributions within a unified framework (without any labeled samples from target subjects). Specifically, we develop a linear transformation function which matches the marginal distributions of the source and target subspaces without a regularization term. This significantly decreases the time complexity of our domain adaptation procedure. As a result, both marginal and conditional distribution discrepancies between the source domain and unlabeled target domain can be reduced, and logistic regression (LR) can be applied to the new source domain in order to train a classifier for use in the target domain, since the aligned source domain follows a distribution which is similar to that of the target domain. We compare our ASFM method with six typical approaches using a public EEG dataset with three affective states: positive, neutral, and negative. Both offline and online evaluations were performed. The subject-to-subject offline experimental results demonstrate a mean accuracy and standard deviation of 80.46% and 6.84%, respectively, compared with a state-of-the-art method, the subspace alignment auto-encoder (SAAE).
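
    ASFM itself is not reproduced here; the following sketch shows the classical subspace-alignment idea it builds on — PCA subspaces for source and target, an unregularized alignment matrix, and logistic regression trained on the aligned source. The synthetic "subjects" and all parameter values are illustrative assumptions, not the paper's data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# synthetic "source subject": the first two (high-variance) dims carry the label
Xs = rng.normal(size=(300, 6))
Xs[:, :2] *= 3.0
ys = (Xs[:, 0] > 0).astype(int)
# synthetic "target subject": the same data under an affine distribution shift
Xt = Xs * 1.1 + 0.2
yt = ys

d = 2
Ps = PCA(d).fit(Xs).components_.T    # source subspace basis (6 x d)
Pt = PCA(d).fit(Xt).components_.T    # target subspace basis (6 x d)
M = Ps.T @ Pt                        # alignment matrix, no regularization term
Zs = Xs @ Ps @ M                     # source samples mapped into aligned subspace
Zt = Xt @ Pt                         # target samples in their own subspace

clf = LogisticRegression().fit(Zs, ys)   # train on aligned source only
acc = clf.score(Zt, yt)                  # evaluate on unlabeled-at-train target
```

    No target labels are used for the alignment itself, mirroring the unsupervised setting described in the abstract.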

  1. Improving the performance of DomainDiscovery of protein domain boundary assignment using inter-domain linker index

    Directory of Open Access Journals (Sweden)

    Zomaya Albert Y

    2006-12-01

    Abstract Background Knowledge of protein domain boundaries is critical for the characterisation and understanding of protein function. The ability to identify domains without knowledge of the structure – by using sequence information only – is an essential step in many types of protein analyses. In this present study, we demonstrate that the performance of DomainDiscovery is improved significantly by including the inter-domain linker index value for domain identification from sequence-based information. Improved DomainDiscovery uses a Support Vector Machine (SVM) approach and a unique training dataset built on the principle of consensus among experts in defining domains in protein structure. The SVM was trained using a PSSM (Position-Specific Scoring Matrix), secondary structure, solvent accessibility information and the inter-domain linker index to detect possible domain boundaries for a target sequence. Results Improved DomainDiscovery is compared with other methods by benchmarking against a structurally non-redundant dataset and also CASP5 targets. Improved DomainDiscovery achieves 70% accuracy for domain boundary identification in multi-domain proteins. Conclusion Improved DomainDiscovery compares favourably to the performance of other methods and excels in the identification of domain boundaries for multi-domain proteins as a result of introducing the support vector machine with the benchmark_2 dataset.
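
    The windowed-feature idea can be sketched as follows. The per-residue "linker index" profile, window size, and labels below are toy assumptions for illustration, not the paper's benchmark data or feature encoding:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
L, boundary, w = 200, 100, 7        # toy sequence length, true boundary, window
# hypothetical per-residue inter-domain linker index: elevated near the linker
linker = rng.normal(0.0, 0.3, L)
linker[boundary - 5:boundary + 5] += 2.0

half = w // 2
# one feature vector per residue: the window of linker-index values around it
X = np.array([linker[i - half:i + half + 1] for i in range(half, L - half)])
y = np.array([abs(i - boundary) <= 3 for i in range(half, L - half)], dtype=int)

clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
train_acc = (clf.predict(X) == y).mean()
```

    The real method concatenates PSSM, secondary structure, and solvent accessibility features into the same kind of per-position window before the SVM.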

  2. Two-phase flow steam generator simulations on parallel computers using domain decomposition method

    International Nuclear Information System (INIS)

    Belliard, M.

    2003-01-01

    Within the framework of the Domain Decomposition Method (DDM), we present industrial steady state two-phase flow simulations of PWR Steam Generators (SG) using iteration-by-sub-domain methods: standard and Adaptive Dirichlet/Neumann methods (ADN). The averaged mixture balance equations are solved by a Fractional-Step algorithm, jointly with the Crank-Nicholson scheme and the Finite Element Method. The algorithm works with overlapping or non-overlapping sub-domains and with conforming or nonconforming meshing. Computations are run on PC networks or on massively parallel mainframe computers. A CEA code-linker and the PVM package are used (master-slave context). SG mock-up simulations, involving up to 32 sub-domains, highlight the efficiency (speed-up, scalability) and the robustness of the chosen approach. With the DDM, the computational problem size is easily increased to about 1,000,000 cells and the CPU time is significantly reduced. The difficulties related to industrial use are also discussed. (author)

  3. Supplementing Public Health Inspection via Social Media

    Science.gov (United States)

    Schomberg, John P.; Haimson, Oliver L.; Hayes, Gillian R.; Anton-Culver, Hoda

    2016-01-01

    Foodborne illness is prevented by inspection and surveillance conducted by health departments across America. Appropriate restaurant behavior is enforced and monitored via public health inspections. However, surveillance coverage provided by state and local health departments is insufficient to prevent the rising number of foodborne illness outbreaks. To address this need for improved surveillance coverage we conducted a supplementary form of public health surveillance using social media data: Yelp.com restaurant reviews in the city of San Francisco. Yelp is a social media site where users post reviews and rate restaurants they have personally visited. The presence of keywords related to health code regulations and foodborne illness symptoms, the number of restaurant reviews, the number of Yelp stars, and restaurant price range were included in a model predicting a restaurant's likelihood of health code violation, measured by the assigned San Francisco public health code rating (for a list of major health code violations, see S1 Table). We built the predictive model using 71,360 Yelp reviews of restaurants in the San Francisco Bay Area. The predictive model was able to predict health code violations in 78% of the restaurants receiving serious citations in our pilot study of 440 restaurants. Training and validation data sets each pulled data from 220 restaurants in San Francisco. Keyword analysis of free text within Yelp not only improved detection of high-risk restaurants, but also served to identify specific risk factors related to health code violation. To further validate our model we applied the model generated in our pilot study to Yelp data from 1,542 restaurants in San Francisco. The model achieved 91% sensitivity, 74% specificity, area under the receiver operating characteristic curve of 98%, and positive predictive value of 29% (given a substandard health code rating prevalence of 10%). When our model was applied to restaurant reviews in New York City we achieved 74
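
    A hedged miniature of the approach — keyword features extracted from review text feeding a classifier — might look like this. The reviews and labels are invented, and the real model additionally used star ratings, review counts, and price range:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# tiny invented corpus; the actual study used 71,360 Yelp reviews
reviews = [
    "got food poisoning here, dirty kitchen",
    "saw a mouse, felt sick after eating",
    "vomiting all night, gross place",
    "amazing pasta, spotless and friendly",
    "clean dining room, great service",
    "delicious food, lovely staff",
]
violation = [1, 1, 1, 0, 0, 0]     # 1 = restaurant received a serious citation

vec = CountVectorizer()            # bag-of-words keyword features
X = vec.fit_transform(reviews)
clf = LogisticRegression().fit(X, violation)

# predicted violation probability for an unseen review
risk = clf.predict_proba(vec.transform(["dirty place, felt sick"]))[0, 1]
```

    In the study, the learned keyword weights double as interpretable risk factors, which is why the keyword analysis also identified *which* complaints track citations.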

  4. Code and papers: computing publication patterns in the LHC era

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Publications in scholarly journals establish the body of knowledge deriving from scientific research; they also play a fundamental role in the career path of scientists and in the evaluation criteria of funding agencies. This presentation reviews the evolution of computing-oriented publications in HEP following the start of operation of LHC. Quantitative analyses are illustrated, which document the production of scholarly papers on computing-related topics by HEP experiments and core tools projects (including distributed computing R&D), and the citations they receive. Several scientometric indicators are analyzed to characterize the role of computing in HEP literature. Distinctive features of scholarly publication production in the software-oriented and hardware-oriented experimental HEP communities are highlighted. Current patterns and trends are compared to the situation in previous generations' HEP experiments at LEP, Tevatron and B-factories. The results of this scientometric analysis document objec...

  5. Extracellular vesicle associated long non-coding RNAs functionally enhance cell viability

    Directory of Open Access Journals (Sweden)

    Chris Hewson

    2016-10-01

    Cells communicate with one another to create microenvironments and share resources. One avenue by which cells communicate is through the action of exosomes. Exosomes are extracellular vesicles that are released by one cell and taken up by neighbouring cells. But how exosomes instigate communication between cells has remained largely unknown. We present evidence here that particular long non-coding RNA molecules are preferentially packaged into exosomes. We also find that a specific class of these exosome-associated non-coding RNAs functionally modulates cell viability by direct interactions with l-lactate dehydrogenase B (LDHB), high-mobility group protein 17 (HMG-17), and CSF2RB, proteins involved in metabolism, nucleosomal architecture and cell signalling respectively. Knowledge of this endogenous cell-to-cell pathway, those proteins interacting with exosome-associated non-coding transcripts and their interacting domains, could lead to a better understanding of not only cell-to-cell interactions but also the development of exosome-targeted approaches in patient-specific cell-based therapies. Keywords: Non-coding RNA, Extracellular RNA, Exosomes, Retroelement, Pseudogene

  6. Columbia Public Health Core Curriculum: Short-Term Impact.

    Science.gov (United States)

    Begg, Melissa D; Fried, Linda P; Glover, Jim W; Delva, Marlyn; Wiggin, Maggie; Hooper, Leah; Saxena, Roheeni; de Pinho, Helen; Slomin, Emily; Walker, Julia R; Galea, Sandro

    2015-12-01

    We evaluated a transformed core curriculum for the Columbia University, Mailman School of Public Health (New York, New York) master of public health (MPH) degree. The curriculum, launched in 2012, aims to teach public health as it is practiced: in interdisciplinary teams, drawing on expertise from multiple domains to address complex health challenges. We collected evaluation data starting when the first class of students entered the program and ending with their graduation in May 2014. Students reported being very satisfied with and challenged by the rigorous curriculum and felt prepared to integrate concepts across varied domains and disciplines to solve public health problems. This novel interdisciplinary program could serve as a prototype for other schools that wish to reinvigorate MPH training.

  7. Time Domain Induced Polarization

    DEFF Research Database (Denmark)

    Fiandaca, Gianluca; Auken, Esben; Christiansen, Anders Vest

    2012-01-01

    Time-domain-induced polarization has significantly broadened its field of reference during the last decade, from mineral exploration to environmental geophysics, e.g., for clay and peat identification and landfill characterization. However, insufficient modeling tools have hitherto limited the use of time-domain-induced polarization for wider purposes. For these reasons, a new forward code and inversion algorithm have been developed using the full time decay of the induced polarization response, together with an accurate description of the transmitter waveform and of the receiver transfer function, to reconstruct the distribution of the Cole-Cole parameters of the earth. The accurate modeling of the transmitter waveform had a strong influence on the forward response, and we showed that the difference between a solution using a step response and a solution using the accurate modeling often is above 100...
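
    The Cole-Cole parameterization being inverted for is the standard Pelton form; a small sketch of evaluating it in the frequency domain, with illustrative parameter values (the paper itself works with full time-domain decays, so this shows only the underlying dispersion model):

```python
import numpy as np

def cole_cole(omega, rho0, m, tau, c):
    """Pelton Cole-Cole complex resistivity: rho0 = DC resistivity,
    m = chargeability, tau = time constant, c = frequency exponent."""
    return rho0 * (1 - m * (1 - 1 / (1 + (1j * omega * tau) ** c)))

# illustrative spectrum over angular frequency
omega = np.logspace(-2, 4, 50)                       # rad/s
rho = cole_cole(omega, rho0=100.0, m=0.2, tau=0.01, c=0.5)
```

    The two limits are a useful sanity check: at low frequency the model returns rho0, and at high frequency rho0*(1 - m).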

  8. Linking CATHENA with other computer codes through a remote process

    Energy Technology Data Exchange (ETDEWEB)

    Vasic, A.; Hanna, B.N.; Waddington, G.M. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Sabourin, G. [Atomic Energy of Canada Limited, Montreal, Quebec (Canada); Girard, R. [Hydro-Quebec, Montreal, Quebec (Canada)

    2005-07-01

    'Full text:' CATHENA (Canadian Algorithm for THErmalhydraulic Network Analysis) is a computer code developed by Atomic Energy of Canada Limited (AECL). The code uses a transient, one-dimensional, two-fluid representation of two-phase flow in piping networks. CATHENA is used primarily for the analysis of postulated upset conditions in CANDU reactors; however, the code has found a wider range of applications. In the past, the CATHENA thermalhydraulics code included other specialized codes, i.e. ELOCA and the Point LEPreau CONtrol system (LEPCON) as callable subroutine libraries. The combined program was compiled and linked as a separately named code. This code organizational process is not suitable for independent development, maintenance, validation and version tracking of separate computer codes. The alternative solution to provide code development independence is to link CATHENA to other computer codes through a Parallel Virtual Machine (PVM) interface process. PVM is a public domain software package, developed by Oak Ridge National Laboratory and enables a heterogeneous collection of computers connected by a network to be used as a single large parallel machine. The PVM approach has been well accepted by the global computing community and has been used successfully for solving large-scale problems in science, industry, and business. Once development of the appropriate interface for linking independent codes through PVM is completed, future versions of component codes can be developed, distributed separately and coupled as needed by the user. This paper describes the coupling of CATHENA to the ELOCA-IST and the TROLG2 codes through a PVM remote process as an illustration of possible code connections. ELOCA (Element Loss Of Cooling Analysis) is the Industry Standard Toolset (IST) code developed by AECL to simulate the thermo-mechanical response of CANDU fuel elements to transient thermalhydraulics boundary conditions. A separate ELOCA driver program

  9. Linking CATHENA with other computer codes through a remote process

    International Nuclear Information System (INIS)

    Vasic, A.; Hanna, B.N.; Waddington, G.M.; Sabourin, G.; Girard, R.

    2005-01-01

    'Full text:' CATHENA (Canadian Algorithm for THErmalhydraulic Network Analysis) is a computer code developed by Atomic Energy of Canada Limited (AECL). The code uses a transient, one-dimensional, two-fluid representation of two-phase flow in piping networks. CATHENA is used primarily for the analysis of postulated upset conditions in CANDU reactors; however, the code has found a wider range of applications. In the past, the CATHENA thermalhydraulics code included other specialized codes, i.e. ELOCA and the Point LEPreau CONtrol system (LEPCON) as callable subroutine libraries. The combined program was compiled and linked as a separately named code. This code organizational process is not suitable for independent development, maintenance, validation and version tracking of separate computer codes. The alternative solution to provide code development independence is to link CATHENA to other computer codes through a Parallel Virtual Machine (PVM) interface process. PVM is a public domain software package, developed by Oak Ridge National Laboratory and enables a heterogeneous collection of computers connected by a network to be used as a single large parallel machine. The PVM approach has been well accepted by the global computing community and has been used successfully for solving large-scale problems in science, industry, and business. Once development of the appropriate interface for linking independent codes through PVM is completed, future versions of component codes can be developed, distributed separately and coupled as needed by the user. This paper describes the coupling of CATHENA to the ELOCA-IST and the TROLG2 codes through a PVM remote process as an illustration of possible code connections. ELOCA (Element Loss Of Cooling Analysis) is the Industry Standard Toolset (IST) code developed by AECL to simulate the thermo-mechanical response of CANDU fuel elements to transient thermalhydraulics boundary conditions. A separate ELOCA driver program starts, ends
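
    PVM itself is a C/Fortran-era library, so as a loose analogue, the master-slave message exchange between a thermalhydraulics driver and a separately developed response code can be sketched with Python queues standing in for pvm_send/pvm_recv. The "+50" fuel response is a placeholder, not ELOCA physics:

```python
import queue
import threading

def fuel_code(inbox, outbox):
    """Stand-in for an independently developed code (e.g. a fuel-response
    model) running as a coupled process: it receives boundary conditions
    and sends back a computed response until told to shut down."""
    while True:
        t_coolant = inbox.get()
        if t_coolant is None:            # shutdown message from the master
            break
        outbox.put(t_coolant + 50.0)     # placeholder "fuel temperature"

# master side: the thermalhydraulics driver
inbox, outbox = queue.Queue(), queue.Queue()
worker = threading.Thread(target=fuel_code, args=(inbox, outbox))
worker.start()

temps = []
for t in (500.0, 520.0, 540.0):          # boundary conditions per time step
    inbox.put(t)                         # analogous to pvm_send
    temps.append(outbox.get())           # analogous to pvm_recv
inbox.put(None)
worker.join()
```

    The point of the pattern, as in the paper, is that each code keeps its own build and release cycle; only the message interface is shared.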

  10. Multi-domain/multi-method numerical approach for neutron transport equation; Couplage de methodes et decomposition de domaine pour la resolution de l'equation du transport des neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Girardi, E

    2004-12-15

    A new methodology for the solution of the neutron transport equation, based on domain decomposition, has been developed. This approach allows us to employ different numerical methods together for a whole-core calculation: a variational nodal method, a discrete ordinate nodal method and a method of characteristics. These new developments allow the use of independent spatial and angular expansions and of non-conformal Cartesian and unstructured meshes for each sub-domain, introducing a flexibility of modeling not offered by today's available codes. The effectiveness of our multi-domain/multi-method approach has been tested on several configurations. Among them, one particular application, the benchmark model of the Phebus experimental facility at CEA Cadarache, shows why this new methodology is relevant to problems with strong local heterogeneities. This comparison showed that the decomposition method brings more accuracy along with an important reduction of the computing time.

  11. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of the network coding approach, which focuses on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...
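
    For intuition, the simplest index coding instance has a linear (XOR) solution: when every receiver already knows all messages except its own, a single coded broadcast replaces one transmission per receiver. A toy sketch:

```python
from functools import reduce

# toy instance: three 4-bit messages; each receiver wants its own message
# and already holds the other two as side information
messages = {"A": 0b1010, "B": 0b0111, "C": 0b1100}
side_info = {"A": ("B", "C"), "B": ("A", "C"), "C": ("A", "B")}

# one linear (XOR) broadcast suffices for all three receivers
broadcast = reduce(lambda x, y: x ^ y, messages.values())

def decode(receiver):
    # XOR-ing the known messages back out cancels them from the broadcast
    known = reduce(lambda x, y: x ^ y, (messages[k] for k in side_info[receiver]))
    return broadcast ^ known

decoded = {r: decode(r) for r in messages}
```

    The general linear program in the paper covers far richer side-information patterns, but the cancellation mechanism is the same.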

  12. Adaptive bit plane quadtree-based block truncation coding for image compression

    Science.gov (United States)

    Li, Shenda; Wang, Jin; Zhu, Qing

    2018-04-01

    Block truncation coding (BTC) is a fast image compression technique applied in the spatial domain. Traditional BTC and its variants mainly focus on reducing computational complexity for low bit rate compression, at the cost of lower quality of decoded images, especially for images with rich texture. To solve this problem, in this paper, a quadtree-based block truncation coding algorithm combined with adaptive bit plane transmission is proposed. First, the direction of the edge in each block is detected using the Sobel operator. For the block with minimal size, an adaptive bit plane is utilized to optimize the BTC, depending on its MSE loss when encoded by absolute moment block truncation coding (AMBTC). Extensive experimental results show that our method gains 0.85 dB PSNR on average compared to some other state-of-the-art BTC variants, so it is desirable for real-time image compression applications.
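
    AMBTC, the baseline against which the adaptive bit-plane step is optimized, is compact enough to sketch directly (a 4x4 toy block; the quadtree partition and Sobel edge-direction steps are omitted):

```python
import numpy as np

def ambtc_block(block):
    """Absolute moment block truncation coding of one block: a bitmap
    plus the means of the high and low pixel groups."""
    mean = block.mean()
    bitmap = block >= mean
    hi = block[bitmap].mean() if bitmap.any() else mean
    lo = block[~bitmap].mean() if (~bitmap).any() else mean
    return bitmap, hi, lo

def ambtc_decode(bitmap, hi, lo):
    """Reconstruct: each pixel becomes its group's mean."""
    return np.where(bitmap, hi, lo)

block = np.array([[10, 12, 200, 210],
                  [11, 13, 205, 208],
                  [ 9, 14, 199, 202],
                  [12, 10, 201, 207]], dtype=float)
rec = ambtc_decode(*ambtc_block(block))
```

    Each block is thus stored as one bitmap plus two scalars, which is why BTC variants are so cheap; the paper's contribution is choosing block sizes and bit planes adaptively where this two-level quantization loses too much.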

  13. Parallel Computing Characteristics of Two-Phase Thermal-Hydraulics code, CUPID

    International Nuclear Information System (INIS)

    Lee, Jae Ryong; Yoon, Han Young

    2013-01-01

    The parallelized CUPID code has been shown to reproduce multi-dimensional thermal-hydraulic behavior through validation against various conceptual problems and experimental data. In this paper, the characteristics of the parallelized CUPID code were investigated. Both single- and two-phase simulations are taken into account. Since the scalability of a parallel simulation is known to be better for fine mesh systems, two types of mesh system are considered. In addition, the dependence on the preconditioner for the matrix solver was also compared. The scalability for single-phase flow is better than that for two-phase flow due to the smaller number of iterations needed to solve the pressure matrix. The parallel performance of the CUPID code was investigated in terms of scalability. The CUPID code was parallelized with a domain decomposition method, and the MPI library was adopted to communicate the information at the interface cells. As the number of mesh cells increases, the scalability improves. For a given mesh, single-phase flow simulation with a diagonal preconditioner shows the best speedup. However, for two-phase flow simulation, the ILU preconditioner is recommended since it reduces the overall simulation time.

  14. Stego Keys Performance on Feature Based Coding Method in Text Domain

    Directory of Open Access Journals (Sweden)

    Din Roshidi

    2017-01-01

    A main critical factor in the embedding process of any text steganography method is the key used, known as the stego key. This factor influences the success of the embedding process by which a text steganography method hides a message from a third party or any adversary. One of the important aspects of the embedding process in a text steganography method is the fitness performance of the stego key. Three parameters of the fitness performance of the stego key have been identified: capacity ratio, embedded fitness ratio and saving space ratio. The better the capacity ratio, embedded fitness ratio and saving space ratio a stego key offers, the more of a message can be hidden. Therefore, the main objective of this paper is to analyze three feature-based codings, namely CALP, VERT and QUAD, of stego keys in text steganography with respect to their capacity ratio, embedded fitness ratio and saving space ratio. It is found that the CALP method gives good performance compared to the VERT and QUAD methods.
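
    The abstract does not give closed forms for the three ratios, so the sketch below uses assumed, purely illustrative definitions to show how such fitness parameters might be computed and compared across stego keys; none of these formulas should be read as the paper's actual metrics:

```python
def stego_key_fitness(cover_chars, embeddable_positions, message_bits):
    """Hypothetical fitness parameters for a stego key (assumed definitions):
    capacity ratio       - fraction of cover positions the key can embed into
    embedded fitness     - how fully the message fits the available positions
    saving space ratio   - fraction of cover bit budget left untouched."""
    capacity_ratio = embeddable_positions / cover_chars
    embedded_fitness_ratio = min(1.0, embeddable_positions / message_bits)
    saving_space_ratio = 1.0 - message_bits / (cover_chars * 8)
    return capacity_ratio, embedded_fitness_ratio, saving_space_ratio

# illustrative comparison for one hypothetical key on one cover text
cap, fit, save = stego_key_fitness(cover_chars=1000,
                                   embeddable_positions=250,
                                   message_bits=200)
```

    Under these assumed definitions, a key would be preferred (as CALP is in the paper) when all three ratios are simultaneously high.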

  15. Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.

    Science.gov (United States)

    Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen

    2014-02-01

    The exponential growth of surveillance videos presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards that were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., relatively static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are first classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely, the background reference prediction (BRP), which uses the background modeled from the original input frames as the long-term reference, and the background difference prediction (BDP), which predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency using the higher-quality background as the reference, whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio on surveillance videos as AVC (MPEG-4 Advanced Video Coding) high profile, yet with only slightly higher encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
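
    The block classification at the heart of BMAP can be caricatured as follows: model the background from the input frames, then label each block by its background-difference energy. The frames, threshold, and two-way labels are toy assumptions, not the codec's actual three-category decision rule:

```python
import numpy as np

# toy sequence: a static 16x16 scene where an object appears in the last frame
frames = np.full((10, 16, 16), 100.0)
frames[-1, 4:8, 4:8] = 200.0

background = frames.mean(axis=0)     # background modeled from input frames
current = frames[-1]

def classify_blocks(frame, bg, bs=4, thresh=10.0):
    """Label each bs x bs block: low background-difference blocks would use
    the modeled background as long-term reference (BRP-like), high-difference
    blocks would be coded in the background-difference domain (BDP-like)."""
    labels = {}
    for i in range(0, frame.shape[0], bs):
        for j in range(0, frame.shape[1], bs):
            diff = np.abs(frame[i:i+bs, j:j+bs] - bg[i:i+bs, j:j+bs]).mean()
            labels[(i, j)] = "BDP" if diff > thresh else "BRP"
    return labels

labels = classify_blocks(current, background)
```

    In the real codec the modeled background is itself encoded once and reused as a long-term reference picture, which is where the compression gain comes from.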

  16. 2017 Emerging Technology Domains Risk Survey

    Science.gov (United States)

    2017-10-01

    Daniel Klinedinst, Joel Land, Kyle O'Meara. October 2017. Technical Report CMU/SEI-2017-TR-008, Software Engineering Institute. Distribution Statement A: Approved for Public Release; Distribution is Unlimited. The report's tables cover new and emerging technologies, the security impact of new and emerging technologies, and severity classifications and impact scores.

  17. 2016 Emerging Technology Domains Risk Survey

    Science.gov (United States)

    2016-04-05

    The survey details the measures upon which the CERT/CC based its recommendations, how each domain was triaged for importance, and exploitation examples illustrating the relevant concepts. Distribution Statement A: Approved for Public Release; Distribution is Unlimited. On connected vehicles, the report notes that only a few vehicles had access to a cellular Internet connection, and only at 3G speeds; some vehicles already have LTE connections, and many

  18. Development and application of a fully implicit fluid dynamics code for multiphase flow

    International Nuclear Information System (INIS)

    Morii, Tadashi; Ogawa, Yumi

    1996-01-01

    Multiphase flow frequently occurs during the progression of severe core damage accidents in nuclear reactors. The CHAMPAGNE code has been developed to analyze the thermohydraulic behavior of multiphase, multicomponent fluids, whose characterization requires more than one set of velocities, temperatures, masses per unit volume, and so forth at each location in the calculation domain. Calculations of multiphase flow often show physical and numerical instability. The numerical stabilization obtained with upwind differencing and fully implicit techniques yields a convergent solution more easily than other approaches. Several results calculated by the CHAMPAGNE code are explained.
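
    The stabilization argument can be illustrated on a 1D advection model problem: a fully implicit upwind step remains stable at Courant numbers where an explicit scheme would blow up. A sketch, not CHAMPAGNE's actual multiphase discretization:

```python
import numpy as np

def implicit_upwind_step(u, a, dt, dx):
    """One fully implicit first-order upwind step for u_t + a*u_x = 0 (a > 0).
    The scheme is unconditionally stable, so large time steps do not blow up."""
    n = len(u)
    c = a * dt / dx                                   # Courant number
    A = np.eye(n) * (1 + c) + np.diag(np.full(n - 1, -c), k=-1)
    A[0, 0] = 1.0                                     # hold the inflow value fixed
    return np.linalg.solve(A, u)

n = 50
u = np.zeros(n)
u[0] = 1.0                                            # step entering at the left
for _ in range(10):
    u = implicit_upwind_step(u, a=1.0, dt=0.25, dx=0.1)   # Courant number 2.5
```

    The solution stays bounded and monotone at a Courant number of 2.5; the price of the implicit, upwinded step is the linear solve and added numerical diffusion, matching the trade-off described in the abstract.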

  19. Using Sentence-Level Classifiers for Cross-Domain Sentiment Analysis

    Science.gov (United States)

    2014-09-01

    National Defence, 2014. © Her Majesty the Queen in Right of Canada, as represented by the Minister of National Defence, 2014. DRDC-RDDC, Dennis, S., September 2014. References include: cross-domain sentiment classification via spectral feature alignment, in Proceedings of the 19th International Conference on World Wide Web (WWW '10).

  20. Soliton coding for secured optical communication link

    CERN Document Server

    Amiri, Iraj Sadegh; Idrus, Sevia Mahdaliza

    2015-01-01

    Nonlinear behavior of light such as chaos can be observed during propagation of a laser beam inside the microring resonator (MRR) systems. This Brief highlights the design of a system of MRRs to generate a series of logic codes. An optical soliton is used to generate an entangled photon. The ultra-short soliton pulses provide the required communication signals to generate a pair of polarization entangled photons required for quantum keys. In the frequency domain, MRRs can be used to generate optical millimetre-wave solitons with a broadband frequency of 0–100 GHz. The soliton signals are multi

  1. Reclaiming public space: designing for public interaction with private devices

    DEFF Research Database (Denmark)

    Eriksson, Eva; Hansen, Thomas Riisgaard; Lykke-Olesen, Andreas

    2007-01-01

    Public spaces are changing from being ungoverned places for interaction to being more formalized, controlled, less interactive, and designed places aimed at fulfilling a purpose. Simultaneously, new personal mobile technology aims at providing private individual spaces in the public domain. In this paper we explore the implications of interacting in public space and how technology can be rethought to not only act as personal devices, but be the tool to reclaim the right and possibility to interact in public spaces. We introduce information exchange, social support and regulation as three central aspects for reclaiming public space. The PhotoSwapper application is presented as an example of a system designed to integrate pervasive technology in a public setting. The system is strongly inspired by the activities at a traditional market place. Based on the design of the application we discuss four...

  2. User instructions for the DESCARTES environmental accumulation code

    International Nuclear Information System (INIS)

    Miley, T.B.; Eslinger, P.W.; Nichols, W.E.; Lessor, K.S.; Ouderkirk, S.J.

    1994-05-01

    The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions since 1944 from the Hanford Site near Richland, Washington. The HEDR Project work is conducted under several technical and administrative tasks, among which is the Environmental Pathways and Dose Estimates task. The staff on this task have developed a suite of computer codes which are used to estimate doses to individuals in the public. This document contains the user instructions for the DESCARTES (Dynamic estimates of concentrations and Accumulated Radionuclides in Terrestrial Environments) suite of codes. In addition to the DESCARTES code, this includes two air data preprocessors, a database postprocessor, and several utility routines that are used to format input data needed for DESCARTES

  3. AECL's advanced code program

    Energy Technology Data Exchange (ETDEWEB)

    McGee, G.; Ball, J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    2012-07-01

    This paper discusses the advanced code project at AECL. The current suite of Analytical, Scientific and Design (ASD) computer codes in use by the Canadian nuclear power industry was mostly developed 20 or more years ago and is increasingly difficult to develop and maintain. It consists of many independent tools, and integrated analysis is difficult, time-consuming and error-prone. The objectives of this project are to demonstrate that nuclear facility systems, structures and components meet their design objectives in terms of function, cost, and safety; to demonstrate that the nuclear facility meets licensing requirements in terms of the consequences of off-normal events, dose to the public and workers, and impact on the environment; and to demonstrate that the nuclear facility meets operational requirements with respect to on-power fuelling and outage management.

  4. McBits: fast constant-time code-based cryptography

    NARCIS (Netherlands)

    Bernstein, D.J.; Chou, T.; Schwabe, P.

    2015-01-01

    This paper presents extremely fast algorithms for code-based public-key cryptography, including full protection against timing attacks. For example, at a 2^128 security level, this paper achieves a reciprocal decryption throughput of just 60493 cycles (plus cipher cost etc.) on a single Ivy Bridge

  5. Final Technical Report: Hydrogen Codes and Standards Outreach

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Karen I.

    2007-05-12

    This project contributed significantly to the development of new codes and standards, both domestically and internationally. The NHA collaborated with codes and standards development organizations to identify technical areas of expertise that would be required to produce the codes and standards that industry and DOE felt were required to facilitate commercialization of hydrogen and fuel cell technologies and infrastructure. NHA staff participated directly in technical committees and working groups where issues could be discussed with the appropriate industry groups. In other cases, the NHA recommended specific industry experts to serve on technical committees and working groups where the need for this specific industry expertise would be on-going, and where this approach was likely to contribute to timely completion of the effort. The project also facilitated dialog between codes and standards development organizations, hydrogen and fuel cell experts, the government and national labs, researchers, code officials, industry associations, as well as the public regarding the timeframes for needed codes and standards, industry consensus on technical issues, procedures for implementing changes, and general principles of hydrogen safety. The project facilitated hands-on learning, as participants in several NHA workshops and technical meetings were able to experience hydrogen vehicles, witness hydrogen refueling demonstrations, see metal hydride storage cartridges in operation, and view other hydrogen energy products.

  6. Calculus domains modelled using an original bool algebra based on polygons

    Science.gov (United States)

    Oanta, E.; Panait, C.; Raicu, A.; Barhalescu, M.; Axinte, T.

    2016-08-01

    Analytical and numerical computer-based models require analytical definitions of the calculus domains. The paper presents a method to model a calculus domain based on a Boolean algebra that uses solid and hollow polygons. The general calculus relations for the geometrical characteristics widely used in mechanical engineering are tested using several shapes of the calculus domain, in order to draw conclusions regarding the most effective methods to discretize the domain. The paper also tests the results of several commercial CAD software applications that are able to compute the geometrical characteristics, and interesting conclusions are drawn. The tests also targeted the accuracy of the results vs. the number of nodes on the curved boundary of the cross section. The study required the development of an original software package consisting of more than 1700 lines of computer code. In comparison with other calculus methods, discretization using convex polygons is a simpler approach. Moreover, this method does not lead to the very large numbers that the spline approximation did, which in that case required special software packages offering multiple, arbitrary precision. The knowledge resulting from this study may be used to develop complex computer-based models in engineering.
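As a concrete illustration of computing geometrical characteristics over a domain built from solid and hollow polygons, the hypothetical sketch below (not the authors' 1700-line software) evaluates areas with the shoelace formula and subtracts the hollow parts:

```python
def polygon_area(pts):
    """Signed area of a simple polygon via the shoelace formula
    (counter-clockwise vertex order gives a positive result)."""
    total = 0.0
    n = len(pts)
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]
        total += x0 * y1 - x1 * y0
    return 0.5 * total

def section_area(solids, hollows):
    """Area of a calculus domain: solid polygons minus hollow polygons."""
    return (sum(abs(polygon_area(p)) for p in solids)
            - sum(abs(polygon_area(p)) for p in hollows))

# Unit square with a centred 0.5 x 0.5 hole: area 1 - 0.25 = 0.75.
square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
hole = [(0.25, 0.25), (0.75, 0.25), (0.75, 0.75), (0.25, 0.75)]
area = section_area([square], [hole])
```

Higher-order characteristics (static moments, moments of inertia) follow from analogous closed-form edge sums, which is why a polygon-based Boolean description avoids the arbitrary-precision arithmetic a spline approximation may need.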

  7. Depiction of global trends in publications on mobile health

    Directory of Open Access Journals (Sweden)

    Shahla Foozonkhah

    2017-07-01

    Full Text Available Background: A variety of mobile health initiatives at different levels have been undertaken across many countries. Trends in these initiatives can be reflected in the research published in the m-health domain. Aim: This paper aims to depict global trends in published works on the m-health topic. Materials and Methods: The Web of Science database was used to identify all relevant published papers on the mobile health domain worldwide. The search was conducted on documents published from January 1898 to December 2014. The search criteria were “mHealth” or “Mobile health” or “m health” or “m_health” or “m-health” in topics. Results: Findings revealed an increasing trend in citations and publications on m-health research since 2012. English was the predominant language of publication. The US had the highest number of publications with 649 papers; however, the Netherlands ranked first after normalizing publication counts by country population. “Studies in Health Technology and Informatics” was the source title with the highest number of publications on mobile health topics. Conclusion: The trend observed in this study indicates that growth is continuing in the mobile health domain. This may imply that a new model of health-care delivery is emerging. Further research is needed to specify directions for mobile health research. It is necessary to identify and prioritize the research gaps in this domain.

  8. Multiple Description Coding Based on Optimized Redundancy Removal for 3D Depth Map

    Directory of Open Access Journals (Sweden)

    Sen Han

    2016-06-01

    Full Text Available Multiple description (MD) coding is a promising alternative for the robust transmission of information over error-prone channels. In 3D image technology, the depth map represents the distance between the camera and objects in the scene. Using the depth map combined with the existing multiview image, images of any virtual viewpoint position can be synthesized efficiently, allowing more realistic 3D scenes to be displayed. Unlike the conventional 2D texture image, the depth map contains a great deal of spatially redundant information, which is not necessary for view synthesis but may waste compressed bits, especially when MD coding is used for robust transmission. In this paper, we focus on redundancy removal for MD coding in the DCT (discrete cosine transform) domain. In view of the characteristics of DCT coefficients, at the encoder a Lagrange optimization approach is designed to determine the amount of high-frequency coefficients in the DCT domain to be removed. To keep the computational complexity low, entropy is adopted to estimate the bit rate in the optimization. Furthermore, at the decoder, adaptive zero-padding is applied to reconstruct the depth map when some information is lost. The experimental results show that, compared to the corresponding scheme, the proposed method achieves better central and side rate-distortion performance.
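The encoder/decoder idea (drop high-frequency DCT coefficients, then zero-pad before the inverse transform) can be sketched in a few lines. This is a hypothetical 1D illustration, not the paper's Lagrange-optimized scheme; the retained-coefficient count `keep` stands in for the amount the optimization would choose:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix (rows are frequency basis vectors)."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    C[0, :] = np.sqrt(1.0 / n)
    return C

def encode_drop_high(block, keep):
    """Transform to the DCT domain and remove high-frequency coefficients."""
    coeffs = dct_matrix(len(block)) @ block
    coeffs[keep:] = 0.0                 # redundancy removal
    return coeffs

def decode_zero_padded(coeffs):
    """Decoder side: missing slots are already zero-padded; the inverse
    DCT (transpose, since the matrix is orthonormal) reconstructs."""
    return dct_matrix(len(coeffs)).T @ coeffs

block = np.linspace(0.0, 1.0, 8)        # smooth "depth" profile
exact = decode_zero_padded(encode_drop_high(block, 8))
approx = decode_zero_padded(encode_drop_high(block, 4))
```

Because the transform is orthonormal, the reconstruction error is exactly the energy of the discarded coefficients, which is what makes an entropy-based Lagrangian trade-off between bit rate and distortion tractable.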

  9. 77 FR 49818 - Agency Information Collection Activities; Proposed Collection; Comment Request; Bar Code Label...

    Science.gov (United States)

    2012-08-17

    ...] Agency Information Collection Activities; Proposed Collection; Comment Request; Bar Code Label... allow 60 days for public comment in response to the notice. This notice solicits comments on the bar... technology. Bar Code Label Requirement for Human Drug and Biological Products--(OMB Control Number 0910-0537...

  10. Refocusing your domain (until a better title)

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    This forum's subject is from 'Zero to Production'. But what exactly goes on during the 'zero' phase? What are the questions that you and your team ask yourselves before starting to code, and which are the first decisions you make? 'Apache or Nginx? Flask or Django? PostgreSQL or MongoDB? Containers or VMs?' are the usual questions brought forward during the design phase. Albeit cool and exciting, these issues are the wrong things to concern ourselves with during the first stages of software development. What is the domain your application operates on? What are your core entities and how do they interact with each other on a logical level? Which are the business rules that would persist whether you were creating a Web application, a Command Line Interface or simple drawings on a whiteboard? Not only should these issues be the first for a developer to consider, but their importance should be clearly reflected in the application's code. In 2012, Robert C. Martin (Uncle Bob) proposed a layered soft...

  11. Development of a multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3 and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    A multi-dimensional realistic thermal-hydraulic system analysis code, MARS version 1.3, has been developed. The main purpose of MARS 1.3 is to provide realistic analysis capability for transient two-phase thermal-hydraulics of Pressurized Water Reactors (PWRs), especially during Large Break Loss of Coolant Accidents (LBLOCAs), where multi-dimensional phenomena dominate the transients. MARS is a unified version of COBRA-TF, the USNRC-developed three-dimensional (3D) reactor vessel analysis code, and RELAP5/MOD3.2.1.2, a one-dimensional (1D) reactor system analysis code. Developmental requirements for MARS were chosen not only to best utilize the existing capability of the codes but also to enhance code maintenance, user accessibility, user friendliness, code portability, code readability, and code flexibility. To maintain the existing codes' capability and to enhance code maintenance, user accessibility and user friendliness, MARS has been unified into a single code consisting of a 1D module (RELAP5) and a 3D module (COBRA-TF). This is realized by implicitly integrating the system pressure matrix equations of the hydrodynamic models and solving them simultaneously, by modifying the 1D/3D calculation sequence to operate under a single Central Processor Unit (CPU), and by unifying the input structure and the light water property routines of both modules. In addition, the code structure of the 1D module is completely restructured using the modular data structure of standard FORTRAN 90, which greatly improves code maintenance capability, readability and portability. For code flexibility, a dynamic memory management scheme is applied in both modules. MARS 1.3 now runs on PC/Windows and HP/UNIX platforms having a single CPU, and users have the option to select the 3D module to model the 3D thermal-hydraulics in the reactor vessel or other

  12. Error-correcting pairs for a public-key cryptosystem

    International Nuclear Information System (INIS)

    Pellikaan, Ruud; Márquez-Corbella, Irene

    2017-01-01

    Code-based Cryptography (CBC) is a powerful and promising alternative for quantum-resistant cryptography. Indeed, together with lattice-based cryptography, multivariate cryptography and hash-based cryptography, it is one of the principal available techniques for post-quantum cryptography. CBC was first introduced by McEliece, who designed one of the most efficient Public-Key encryption schemes, with exceptionally strong security guarantees and other desirable properties that still resist attacks based on the Quantum Fourier Transform and Amplitude Amplification. The original proposal, which remains unbroken, was based on binary Goppa codes. Later, several families of codes have been proposed in order to reduce the key size; some of these alternatives have already been broken. One of the main requirements of a code-based cryptosystem is having high-performance t-bounded decoding algorithms, which is achieved when the code has a t-error-correcting pair (ECP). Indeed, those McEliece schemes that use GRS, BCH, Goppa and algebraic geometry codes are in fact using an error-correcting pair as a secret key. That is, the security of these Public-Key Cryptosystems is based not only on the inherent intractability of bounded distance decoding but also on the assumption that it is difficult to efficiently retrieve an error-correcting pair. In this paper, the class of codes with a t-ECP is proposed for the McEliece cryptosystem. Moreover, we study the hardness of distinguishing arbitrary codes from those having a t-error-correcting pair. (paper)

  13. HangOut: generating clean PSI-BLAST profiles for domains with long insertions.

    Science.gov (United States)

    Kim, Bong-Hyun; Cong, Qian; Grishin, Nick V

    2010-06-15

    Profile-based similarity search is an essential step in structure-function studies of proteins. However, inclusion of non-homologous sequence segments into a profile causes its corruption and results in false positives. Profile corruption is common in multidomain proteins, and single domains with long insertions are a significant source of errors. We developed a procedure (HangOut) that, for a single domain with specified insertion position, cleans erroneously extended PSI-BLAST alignments to generate better profiles. HangOut is implemented in Python 2.3 and runs on all Unix-compatible platforms. The source code is available under the GNU GPL license at http://prodata.swmed.edu/HangOut/. Supplementary data are available at Bioinformatics online.

  14. Nuclear component design ontology building based on ASME codes

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan

    2005-01-01

    The adoption of ontology analysis in the study of concept knowledge acquisition and representation for the nuclear component design process based on computer-supported cooperative work (CSCW) makes it possible to share and reuse concept knowledge from multiple disciplinary domains. A practical ontology building method is accordingly proposed, based on the Protege knowledge model in combination with both top-down and bottom-up approaches together with Formal Concept Analysis (FCA). FCA helps establish and improve the taxonomic hierarchy of concepts and resolve concept conflicts that occur when modeling multi-disciplinary domains. With Protege-3.0 as the ontology building tool, a nuclear component design ontology based on ASME codes is developed using this method. The ontology serves as the basis for concept knowledge sharing and reuse in nuclear component design. (authors)

  15. Functional significance of SRJ domain mutations in CITED2.

    Directory of Open Access Journals (Sweden)

    Chiann-mun Chen

    Full Text Available CITED2 is a transcriptional co-activator with 3 conserved domains shared with other CITED family members and a unique Serine-Glycine Rich Junction (SRJ that is highly conserved in placental mammals. Loss of Cited2 in mice results in cardiac and aortic arch malformations, adrenal agenesis, neural tube and placental defects, and partially penetrant defects in left-right patterning. By screening 1126 sporadic congenital heart disease (CHD cases and 1227 controls, we identified 19 variants, including 5 unique non-synonymous sequence variations (N62S, R92G, T166N, G180-A187del and A187T in patients. Many of the CHD-specific variants identified in this and previous studies cluster in the SRJ domain. Transient transfection experiments show that T166N mutation impairs TFAP2 co-activation function and ES cell proliferation. We find that CITED2 is phosphorylated by MAPK1 in vitro at T166, and that MAPK1 activation enhances the coactivation function of CITED2 but not of CITED2-T166N. In order to investigate the functional significance in vivo, we generated a T166N mutation of mouse Cited2. We also used PhiC31 integrase-mediated cassette exchange to generate a Cited2 knock-in allele replacing the mouse Cited2 coding sequence with human CITED2 and with a mutant form deleting the entire SRJ domain. Mouse embryos expressing only CITED2-T166N or CITED2-SRJ-deleted alleles surprisingly show no morphological abnormalities, and mice are viable and fertile. These results indicate that the SRJ domain is dispensable for these functions of CITED2 in mice and that mutations clustering in the SRJ region are unlikely to be the sole cause of the malformations observed in patients with sporadic CHD. Our results also suggest that coding sequence mutations observed in case-control studies need validation using in vivo models and that predictions based on structural conservation and in vitro functional assays, or even in vivo global loss of function models, may be

  16. CARF and WYL domains: ligand-binding regulators of prokaryotic defense systems

    Directory of Open Access Journals (Sweden)

    Kira eMakarova

    2014-04-01

    Full Text Available CRISPR-Cas adaptive immunity systems of bacteria and archaea insert fragments of virus or plasmid DNA as spacer sequences into CRISPR repeat loci. Processed transcripts encompassing these spacers guide the cleavage of the cognate foreign DNA or RNA. Most CRISPR-Cas loci, in addition to recognized cas genes, also include genes that are not directly implicated in spacer acquisition, CRISPR transcript processing or interference. Here we comprehensively analyze sequences, structures and genomic neighborhoods of one of the most widespread groups of such genes that encode proteins containing a predicted nucleotide-binding domain with a Rossmann-like fold, which we denote CARF (CRISPR-associated Rossmann fold. Several CARF protein structures have been determined but functional characterization of these proteins is lacking. The CARF domain is most frequently combined with a C-terminal winged helix-turn-helix DNA-binding domain and effector domains most of which are predicted to possess DNase or RNase activity. Divergent CARF domains are also found in RtcR proteins, sigma-54 dependent regulators of the rtc RNA repair operon. CARF genes frequently co-occur with those coding for proteins containing the WYL domain with the Sm-like SH3 β-barrel fold, which is also predicted to bind ligands. CRISPR-Cas and possibly other defense systems are predicted to be transcriptionally regulated by multiple ligand-binding proteins containing WYL and CARF domains which sense modified nucleotides and nucleotide derivatives generated during virus infection. We hypothesize that CARF domains also transmit the signal from the bound ligand to the fused effector domains which attack either alien or self nucleic acids, resulting, respectively, in immunity complementing the CRISPR-Cas action or in dormancy/programmed cell death.

  17. Selected DOE Headquarters publications received by the Energy Library

    International Nuclear Information System (INIS)

    1978-07-01

    This bibliography provides listings of (mainly policy and programmatic) publications issued from the U.S. Department of Energy, Washington, D.C. The listings are arranged by the ''report code'' assigned to the major organizations at DOE Headquarters, followed by the three categories of environmental reports issued from DOE Headquarters. All of the publications listed, except for those shown as still ''in preparation,'' may be seen in the Energy Library. A title index arranged by title keywords follows the listings. Certain publications have been omitted. They include such items as pamphlets, ''fact sheets,'' bulletins and weekly/monthly issuances of DOE's Energy Information Administration and Economic Regulatory Administration, and employee bulletins and newsletters. Omitted from the bibliography altogether are headquarters publications assigned other types of report codes--e.g., ''HCP'' (Headquarters Contractor Publication) and ''CONF''

  18. Hamming Code Based Watermarking Scheme for 3D Model Verification

    Directory of Open Access Journals (Sweden)

    Jen-Tse Wang

    2014-01-01

    Full Text Available Due to the explosive growth of the Internet and the maturing of 3D hardware techniques, protecting 3D objects is becoming a more and more important issue. In this paper, a public Hamming-code-based fragile watermarking technique is proposed for 3D object verification. An adaptive watermark is generated from each cover model by using the Hamming code technique. A simple least significant bit (LSB) substitution technique is employed for watermark embedding. In the extraction stage, the Hamming-code-based watermark can be verified by using the Hamming code check, without embedding any verification information. Experimental results show that 100% of the vertices of the cover model can be watermarked, extracted, and verified. They also show that the proposed method can improve security and achieve low distortion of the stego object.
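A minimal sketch of the verification idea, assuming a standard Hamming(7,4) code (the paper's adaptive watermark generation and the LSB embedding into vertex coordinates are omitted): the extractor recomputes the syndrome, and any nonzero syndrome flags a tampered vertex without needing side information.

```python
import numpy as np

# Generator and parity-check matrices of the Hamming(7,4) code.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def embed_bits(bits4):
    """Encode 4 watermark bits into a 7-bit codeword (mod-2 arithmetic),
    to be placed in vertex LSBs by a separate embedding step."""
    return (np.asarray(bits4) @ G) % 2

def verify_bits(bits7):
    """Recompute the syndrome; an all-zero syndrome verifies the bits."""
    return (H @ np.asarray(bits7)) % 2

codeword = embed_bits([1, 0, 1, 1])
clean = verify_bits(codeword)            # all zeros: verified
tampered = codeword.copy()
tampered[2] ^= 1                         # simulate one modified LSB
flagged = verify_bits(tampered)          # nonzero: tampering detected
```

With a single-bit change the syndrome not only becomes nonzero but also identifies the flipped position, which is what allows verification with no extra embedded information.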

  19. Pilot-Assisted Channel Estimation for Orthogonal Multi-Carrier DS-CDMA with Frequency-Domain Equalization

    Science.gov (United States)

    Shima, Tomoyuki; Tomeba, Hiromichi; Adachi, Fumiyuki

    Orthogonal multi-carrier direct sequence code division multiple access (orthogonal MC DS-CDMA) is a combination of time-domain spreading and orthogonal frequency division multiplexing (OFDM). In orthogonal MC DS-CDMA, the frequency diversity gain can be obtained by applying frequency-domain equalization (FDE) based on minimum mean square error (MMSE) criterion to a block of OFDM symbols and can improve the bit error rate (BER) performance in a severe frequency-selective fading channel. FDE requires an accurate estimate of the channel gain. The channel gain can be estimated by removing the pilot modulation in the frequency domain. In this paper, we propose a pilot-assisted channel estimation suitable for orthogonal MC DS-CDMA with FDE and evaluate, by computer simulation, the BER performance in a frequency-selective Rayleigh fading channel.
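The two steps described above can be sketched numerically under simplifying assumptions (one OFDM block, a known all-ones pilot, noiseless channel; all names are illustrative): the channel gain is estimated by removing the pilot modulation per frequency bin, then MMSE weights combine the estimate with the SNR.

```python
import numpy as np

def estimate_channel(received_pilot, pilot):
    """Per-subcarrier channel estimate: divide out the known pilot symbols."""
    return received_pilot / pilot

def mmse_fde(received, h_est, snr):
    """MMSE frequency-domain equalization: w_k = H_k* / (|H_k|^2 + 1/SNR)."""
    w = np.conj(h_est) / (np.abs(h_est) ** 2 + 1.0 / snr)
    return w * received

n = 16
subc = np.arange(n)
h = np.linspace(0.5, 1.5, n) * np.exp(1j * 0.2 * subc)     # selective fading
pilot = np.ones(n, dtype=complex)
data = np.where(subc % 2 == 0, 1.0, -1.0).astype(complex)  # BPSK symbols

h_est = estimate_channel(h * pilot, pilot)   # pilot block through the channel
equalized = mmse_fde(h * data, h_est, snr=1e8)
```

At high SNR the weight approaches pure channel inversion (zero-forcing); the 1/SNR term keeps deeply faded subcarriers from amplifying noise, which is where the frequency diversity gain of MMSE-based FDE comes from.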

  20. Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications

    Science.gov (United States)

    OKeefe, Matthew (Editor); Kerr, Christopher L. (Editor)

    1998-01-01

    This report contains the abstracts and technical papers from the Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications, held June 15-18, 1998, in Scottsdale, Arizona. The purpose of the workshop is to bring together software developers in meteorology and oceanography to discuss software engineering and code design issues for parallel architectures, including Massively Parallel Processors (MPP's), Parallel Vector Processors (PVP's), Symmetric Multi-Processors (SMP's), Distributed Shared Memory (DSM) multi-processors, and clusters. Issues to be discussed include: (1) code architectures for current parallel models, including basic data structures, storage allocation, variable naming conventions, coding rules and styles, i/o and pre/post-processing of data; (2) designing modular code; (3) load balancing and domain decomposition; (4) techniques that exploit parallelism efficiently yet hide the machine-related details from the programmer; (5) tools for making the programmer more productive; and (6) the proliferation of programming models (F--, OpenMP, MPI, and HPF).

  1. Do 'good governance' codes enhance financial accountability? Evidence on managerial pay in Dutch charities

    NARCIS (Netherlands)

    Perego, P.; Verbeeten, F.H.M.

    2015-01-01

    This paper examines the initial impact of a 'good governance' code for charitable organisations that was promulgated in the Netherlands in 2005. Data are gathered from publicly available annual reports of 138 charities in the postimplementation phase of the code (2005-2008). We first examine whether

  2. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...

  3. Public health program capacity for sustainability: a new framework.

    Science.gov (United States)

    Schell, Sarah F; Luke, Douglas A; Schooley, Michael W; Elliott, Michael B; Herbers, Stephanie H; Mueller, Nancy B; Bunger, Alicia C

    2013-02-01

    Public health programs can only deliver benefits if they are able to sustain activities over time. There is a broad literature on program sustainability in public health, but it is fragmented and there is a lack of consensus on core constructs. The purpose of this paper is to present a new conceptual framework for program sustainability in public health. This developmental study uses a comprehensive literature review, input from an expert panel, and the results of concept-mapping to identify the core domains of a conceptual framework for public health program capacity for sustainability. The concept-mapping process included three types of participants (scientists, funders, and practitioners) from several public health areas (e.g., tobacco control, heart disease and stroke, physical activity and nutrition, and injury prevention). The literature review identified 85 relevant studies focusing on program sustainability in public health. Most of the papers described empirical studies of prevention-oriented programs aimed at the community level. The concept-mapping process identified nine core domains that affect a program's capacity for sustainability: Political Support, Funding Stability, Partnerships, Organizational Capacity, Program Evaluation, Program Adaptation, Communications, Public Health Impacts, and Strategic Planning. Concept-mapping participants further identified 93 items across these domains that have strong face validity: 89% of the individual items composing the framework had specific support in the sustainability literature. The sustainability framework presented here suggests that a number of selected factors may be related to a program's ability to sustain its activities and benefits over time. These factors have been discussed in the literature, but this framework synthesizes and combines them and suggests how they may be interrelated with one another.
The framework presents domains for public health decision makers to consider when developing

  4. The domain theory: patterns for knowledge and software reuse

    National Research Council Canada - National Science Library

    Sutcliffe, Alistair

    2002-01-01

    ..., retrieval system, or any other means, without prior written permission of the publisher. Lawrence Erlbaum Associates, Inc., Publishers 10 Industrial Avenue Mahwah, New Jersey 07430 Library of Congress Cataloging-in-Publication Data Sutcliffe, Alistair, 1951- The domain theory : patterns for knowledge and software reuse / Alistair Sutcl...

  5. Contribution to the development of methods for nuclear reactor core calculations with APOLLO3 code: domain decomposition in transport theory with nonlinear diffusion acceleration for 2D and 3D geometries

    International Nuclear Information System (INIS)

    Lenain, Roland

    2015-01-01

    This thesis is devoted to the implementation of a domain decomposition method applied to the neutron transport equation. The objective of this work is to obtain high-fidelity deterministic solutions that properly handle heterogeneities located in nuclear reactor cores, for problem sizes ranging from color-sets of assemblies to large reactor core configurations in 2D and 3D. The innovative algorithm developed during the thesis intends to optimize the use of parallelism and memory. The approach also aims to minimize the influence of the parallel implementation on the performance. These goals match the needs of the APOLLO3 project, developed at CEA and supported by EDF and AREVA, which must be a portable code (no optimization for a specific architecture) in order to achieve best-estimate modeling with resources ranging from personal computers to the compute clusters available for engineers' analyses. The proposed algorithm is a Parallel Multigroup-Block Jacobi one. Each sub-domain is considered as a multi-group fixed-source problem with volume sources (fission) and surface sources (interface flux between the sub-domains). The multi-group problem is solved in each sub-domain, and a single communication of the interface flux is required at each power iteration. The spectral radius of the resolution algorithm is made similar to that of a classical resolution algorithm with a nonlinear diffusion acceleration method: the well-known Coarse Mesh Finite Difference. In this way, ideal scalability is achievable when the calculation is parallelized. The memory organization, taking advantage of shared-memory parallelism, optimizes the resources by avoiding redundant copies of the data shared between the sub-domains. Distributed-memory architectures are made available by a hybrid parallel method that combines both paradigms of shared-memory parallelism and distributed-memory parallelism. For large problems, these architectures provide a greater number of processors and the amount of
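The "each sub-domain solves a fixed-source problem, with one interface exchange per power iteration" structure can be mimicked by a block Jacobi solver on a generic linear system. The sketch below is a hypothetical algebraic analogue, not APOLLO3 code: diagonal blocks play the role of sub-domains, and the off-block products play the role of interface fluxes taken from the previous iterate.

```python
import numpy as np

def block_jacobi(A, b, blocks, iters=300):
    """Solve A x = b with block Jacobi sweeps: each 'sub-domain' (index
    block) is solved exactly, while coupling to the other sub-domains
    uses the previous iterate -- one interface exchange per sweep."""
    x = np.zeros_like(b)
    all_idx = np.arange(len(b))
    for _ in range(iters):
        x_new = np.empty_like(x)
        for idx in blocks:
            rest = np.setdiff1d(all_idx, idx)
            rhs = b[idx] - A[np.ix_(idx, rest)] @ x[rest]   # interface term
            x_new[idx] = np.linalg.solve(A[np.ix_(idx, idx)], rhs)
        x = x_new
    return x

# 1D diffusion-like tridiagonal system split into two sub-domains.
n = 10
A = 2.1 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = block_jacobi(A, b, [list(range(5)), list(range(5, 10))])
```

Because the diagonal blocks are solved exactly, only the interface coupling limits convergence; this is the behavior that CMFD-style nonlinear diffusion acceleration further improves in the actual transport solver.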

  6. Analysis of communication costs for domain decomposed Monte Carlo methods in nuclear reactor analysis

    International Nuclear Information System (INIS)

    Siegel, A.; Smith, K.; Fischer, P.; Mahadevan, V.

    2012-01-01

    A domain decomposed Monte Carlo communication kernel is used to carry out performance tests to establish the feasibility of using Monte Carlo techniques for practical Light Water Reactor (LWR) core analyses. The results of the prototype code are interpreted in the context of simplified performance models which elucidate key scaling regimes of the parallel algorithm.
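
    A generic latency-bandwidth (alpha-beta) model illustrates the kind of scaling analysis such performance studies rest on. This is a hedged sketch with made-up constants, not the paper's model: communication cost scales with sub-domain surface area, computation with volume, so the relative communication cost falls as sub-domains grow.

```python
def exchange_time(cells_per_face, n_faces=6, alpha=1e-6, beta=8e-10, word=8):
    """Alpha-beta estimate of one boundary exchange: each face costs one
    message, priced as latency alpha plus beta per byte sent."""
    return n_faces * (alpha + beta * word * cells_per_face)

def comm_to_comp_ratio(n, t_cell=1e-7):
    """For a cubic sub-domain of n**3 cells, communication scales with the
    surface (6 n^2) and computation with the volume (n^3), so the ratio
    falls roughly like 1/n as sub-domains grow."""
    return exchange_time(n * n) / (t_cell * n**3)

# Doubling the sub-domain edge roughly halves the relative communication cost.
print(comm_to_comp_ratio(16) > comm_to_comp_ratio(32) > comm_to_comp_ratio(64))
```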

  7. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. Beginning in 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and to inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes preprinted by global manufacturers instead of printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and the absence of bar codes on primary-level packaging, such as single blister packs. The team in Ethiopia developed an open-source smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems
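
    The data capture the pilots describe rests on GS1 element strings, in which each field is introduced by a numeric application identifier (AI). A minimal, simplified parser is sketched below; it handles only AI 01 (GTIN-14) and AI 17 (expiry date) as fixed-length fields and treats anything else as variable-length terminated by the GS separator, so the sample string and its fields are illustrative, not from the article.

```python
# Fixed-length GS1 application identifiers handled here:
#   01 = GTIN-14 (14 digits), 17 = expiry date (YYMMDD).
# Variable-length AIs such as 10 (batch/lot) end at a GS separator
# (ASCII 29) or at the end of the string.
FIXED = {"01": 14, "17": 6}

def parse_gs1(data, gs="\x1d"):
    """Parse a GS1 element string into {AI: value} (simplified sketch)."""
    out, i = {}, 0
    while i < len(data):
        ai = data[i:i + 2]
        i += 2
        if ai in FIXED:
            out[ai] = data[i:i + FIXED[ai]]
            i += FIXED[ai]
        else:                       # variable length, e.g. AI 10 (lot)
            end = data.find(gs, i)
            end = len(data) if end == -1 else end
            out[ai] = data[i:end]
            i = end + 1
    return out

print(parse_gs1("0109506000134352172012311012345AB"))
# -> {'01': '09506000134352', '17': '201231', '10': '12345AB'}
```

A production implementation would validate check digits and cover the full AI table; the point here is only that globally standardized identifiers make the scanned data machine-interpretable end to end.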

  8. PHoToNs–A parallel heterogeneous and threads oriented code for cosmological N-body simulation

    Science.gov (United States)

    Wang, Qiao; Cao, Zong-Yan; Gao, Liang; Chi, Xue-Bin; Meng, Chen; Wang, Jie; Wang, Long

    2018-06-01

    We introduce a new code for cosmological simulations, PHoToNs, which incorporates features for performing massive cosmological simulations on heterogeneous high performance computing (HPC) systems and threads-oriented programming. PHoToNs adopts a hybrid scheme to compute gravitational force, with the conventional Particle-Mesh (PM) algorithm to compute the long-range force, the Tree algorithm to compute the short-range force and the direct-summation Particle-Particle (PP) algorithm to compute gravity from very close particles. A self-similar space-filling Peano-Hilbert curve is used to decompose the computing domain. Threads programming is advantageously used to more flexibly manage the domain communication, PM calculation and synchronization, as well as Dual Tree Traversal on the CPU+MIC platform. PHoToNs scales well, and the PP kernel achieves 68.6% of peak performance on MIC and 74.4% on CPU platforms. We also test the accuracy of the code against the widely used Gadget-2 and find excellent agreement.
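
    The sort-and-split structure of curve-based domain decomposition can be sketched briefly. PHoToNs uses a Peano-Hilbert curve; as a hedged stand-in, the snippet below uses the simpler Morton (Z-order) curve, which has the same overall pattern: map each particle's position to a grid cell, interleave the coordinate bits into a 1D key, sort along the curve, and cut the sorted order into contiguous, load-balanced segments.

```python
import numpy as np

def morton_key(ix, iy, iz, bits=10):
    """Interleave the bits of three integer grid coordinates into one key.
    (A Peano-Hilbert key, as in PHoToNs/Gadget, would preserve locality
    better; Morton/Z-order is used here for brevity.)"""
    key = 0
    for b in range(bits):
        key |= ((ix >> b) & 1) << (3 * b + 2)
        key |= ((iy >> b) & 1) << (3 * b + 1)
        key |= ((iz >> b) & 1) << (3 * b)
    return key

def decompose(pos, n_domains, bits=10):
    """Sort particles along the space-filling curve and cut the sorted
    order into n_domains contiguous slabs with equal particle counts."""
    grid = np.minimum((pos * (1 << bits)).astype(int), (1 << bits) - 1)
    keys = np.array([morton_key(*g) for g in grid])
    order = np.argsort(keys)
    return np.array_split(order, n_domains)   # contiguous curve segments

rng = np.random.default_rng(1)
parts = decompose(rng.random((1000, 3)), n_domains=8)
print([len(p) for p in parts])   # -> [125, 125, 125, 125, 125, 125, 125, 125]
```

Because nearby cells map to nearby keys, each segment is a spatially compact region, which keeps the tree-walk communication between domains small.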

  9. Self-complementary circular codes in coding theory.

    Science.gov (United States)

    Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz

    2018-04-01

    Self-complementary circular codes are involved in pairing genetic processes. A maximal [Formula: see text] self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16, 2017, and J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58, 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph [Formula: see text] associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size [Formula: see text] trinucleotides, in particular for maximal circular codes ([Formula: see text] trinucleotides). For codes of small size [Formula: see text] trinucleotides, some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.
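
    The graph construction of Fimmel et al. (2016) is concrete enough to sketch: each trinucleotide N1N2N3 of a code X contributes the directed edges N1 -> N2N3 and N1N2 -> N3, and X is circular if and only if this graph is acyclic. The snippet below, a minimal sketch using toy codes (not the maximal code X from the paper), checks both that acyclicity condition and self-complementarity (X equals the set of reversed complements of its trinucleotides).

```python
COMP = {"A": "T", "T": "A", "C": "G", "G": "C"}

def rev_comp(t):
    """Reversed complement of a trinucleotide, e.g. ACG -> CGT."""
    return "".join(COMP[c] for c in reversed(t))

def is_self_complementary(X):
    return {rev_comp(t) for t in X} == set(X)

def graph(X):
    """Directed graph G(X): each trinucleotide N1N2N3 contributes the
    edges N1 -> N2N3 and N1N2 -> N3 (Fimmel et al. 2016)."""
    edges = {}
    for t in X:
        edges.setdefault(t[0], set()).add(t[1:])
        edges.setdefault(t[:2], set()).add(t[2])
    return edges

def is_acyclic(edges):
    """DFS cycle check; a code is circular iff G(X) has no cycle."""
    WHITE, GREY, BLACK = 0, 1, 2
    color = {}
    def visit(v):
        color[v] = GREY
        for w in edges.get(v, ()):
            if color.get(w, WHITE) == GREY or \
               (color.get(w, WHITE) == WHITE and not visit(w)):
                return False        # back edge found -> cycle
        color[v] = BLACK
        return True
    return all(visit(v) for v in list(edges) if color.get(v, WHITE) == WHITE)

X = {"ACG", "CGT"}                   # toy self-complementary example
print(is_self_complementary(X), is_acyclic(graph(X)))   # -> True True
```

By contrast, {"AAA"} yields the cycle A -> AA -> A, so it is correctly rejected as non-circular.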

  10. A Review on Block Matching Motion Estimation and Automata Theory based Approaches for Fractal Coding

    Directory of Open Access Journals (Sweden)

    Shailesh Kamble

    2016-12-01

    Fractal compression is a lossy compression technique in the field of gray/color image and video compression. It gives a high compression ratio and good image quality with fast decoding, but improvement of the encoding time remains a challenge. This review presents an analysis of the most significant existing approaches in fractal-based gray/color image and video compression; of different block-matching motion-estimation approaches for finding the motion vectors in a frame, based on inter-frame and intra-frame (individual frame) coding; and of automata-theory-based coding approaches for representing an image or a sequence of images. Although other reviews of fractal coding exist, this paper differs in several respects. One can develop new shape patterns for motion estimation and combine existing block-matching motion estimation with automata coding to explore the fractal compression technique, with a specific focus on reducing the encoding time and achieving better image/video reconstruction quality. This paper is useful for beginners in the domain of video compression.
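
    The block-matching motion estimation surveyed above reduces, in its simplest exhaustive form, to a search for the displacement minimizing the sum of absolute differences (SAD) between a block of the current frame and candidate blocks of the reference frame. A minimal full-search sketch (not any specific scheme from the review):

```python
import numpy as np

def block_match(ref, cur, block=8, search=4):
    """Exhaustive-search block matching: for each block of the current
    frame, find the motion vector (dy, dx) within +/-search that minimises
    the sum of absolute differences (SAD) against the reference frame."""
    H, W = cur.shape
    vectors = {}
    for y in range(0, H - block + 1, block):
        for x in range(0, W - block + 1, block):
            tgt = cur[y:y + block, x:x + block].astype(int)
            best, best_mv = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    ry, rx = y + dy, x + dx
                    if not (0 <= ry <= H - block and 0 <= rx <= W - block):
                        continue    # candidate block leaves the frame
                    cand = ref[ry:ry + block, rx:rx + block].astype(int)
                    sad = np.abs(tgt - cand).sum()
                    if best is None or sad < best:
                        best, best_mv = sad, (dy, dx)
            vectors[(y, x)] = best_mv
    return vectors

# A frame shifted by (2, 3): interior blocks should recover the shift.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (16, 16))
cur = np.roll(ref, (2, 3), axis=(0, 1))
print(block_match(ref, cur, block=8, search=4)[(8, 8)])   # -> (-2, -3)
```

Full search is O(block_count x (2*search+1)^2) SAD evaluations per frame; the faster patterns discussed in the review (three-step, diamond, new shape patterns) exist precisely to cut this cost.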

  11. Average Likelihood Methods of Classification of Code Division Multiple Access (CDMA)

    Science.gov (United States)

    2016-05-01

    subject to code matrices that follow the structure given by (113):

    \[ \begin{bmatrix} \vec{y}_R \\ \vec{y}_I \end{bmatrix} = \sqrt{\frac{E_s}{2L}} \begin{bmatrix} G_{R1} & -G_{I1} \\ G_{I2} & G_{R2} \end{bmatrix} \begin{bmatrix} Q_R & -Q_I \\ Q_I & Q_R \end{bmatrix} \begin{bmatrix} \vec{b}_R \\ \vec{b}_I \end{bmatrix} + \begin{bmatrix} \vec{n}_R \\ \vec{n}_I \end{bmatrix} \quad \ldots \quad \begin{bmatrix} \vec{b}_+ \\ \vec{b}_- \end{bmatrix} + \begin{bmatrix} \vec{n}_+ \\ \vec{n}_- \end{bmatrix} \tag{115} \]

    The average likelihood for type 4 CDMA (116) is a special case of type 1 CDMA with twice the code length and... AVERAGE LIKELIHOOD METHODS OF CLASSIFICATION OF CODE DIVISION MULTIPLE ACCESS (CDMA). MAY 2016. FINAL TECHNICAL REPORT. APPROVED FOR PUBLIC RELEASE

  12. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross-correlation (CC) and a practical code length to support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and to suppress the effect of phase-induced intensity noise (PIIN). In this paper, we propose new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC, based on Jordan block matrices and constructed by simple algebraic means. Four sets of DEU code families, based on the code weight W and number of users N, are constructed for the combinations (even, even), (even, odd), (odd, odd) and (odd, even). These combinations give the DEU code more flexibility in the selection of code weight and number of users, and make it a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms previously reported codes. In addition, simulation results obtained with a commercial optical system simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU performs better and can support long spans at high data rates.
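
    The "ideal in-phase CC" property can be made concrete with a small checker: for a 0/1 code matrix (one row per user), the in-phase cross-correlation of two users is the number of chips where both rows carry a 1, and the ideal value targeted by SAC codes is 1. The toy code below illustrates the property check only; it is not the DEU construction from the paper.

```python
import numpy as np

def in_phase_cc(code):
    """Maximum in-phase cross-correlation between distinct rows of a
    0/1 code matrix (users x chips): lambda = max_{i != j} sum_k c_ik c_jk."""
    code = np.asarray(code)
    gram = code @ code.T            # pairwise chip overlaps
    np.fill_diagonal(gram, 0)       # ignore auto-correlation (the weight)
    return gram.max()

# A toy weight-2 code in which every pair of users overlaps in exactly
# one chip, i.e. lambda = 1 (hypothetical example, not a DEU code).
C = [[1, 1, 0, 0],
     [1, 0, 1, 0],
     [1, 0, 0, 1]]
print(in_phase_cc(C))   # -> 1
```

Keeping lambda fixed at 1 is what lets the SAC receiver subtract interfering users exactly, eliminating MAI.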

  13. Computer codes in nuclear safety, radiation transport and dosimetry; Les codes de calcul en radioprotection, radiophysique et dosimetrie

    Energy Technology Data Exchange (ETDEWEB)

    Bordy, J M; Kodeli, I; Menard, St; Bouchet, J L; Renard, F; Martin, E; Blazy, L; Voros, S; Bochud, F; Laedermann, J P; Beaugelin, K; Makovicka, L; Quiot, A; Vermeersch, F; Roche, H; Perrin, M C; Laye, F; Bardies, M; Struelens, L; Vanhavere, F; Gschwind, R; Fernandez, F; Quesne, B; Fritsch, P; Lamart, St; Crovisier, Ph; Leservot, A; Antoni, R; Huet, Ch; Thiam, Ch; Donadille, L; Monfort, M; Diop, Ch; Ricard, M

    2006-07-01

    The purpose of this conference was to describe the present state of computer codes dedicated to radiation transport, radiation source assessment, or dosimetry. The presentations were divided into two sessions: 1) methodology and 2) uses in industrial, medical, or research domains. It appears that two different calculation strategies prevail, both based on preliminary Monte-Carlo calculations with data storage: first, quick simulations made from a database of particle histories built through a previous Monte-Carlo simulation; and second, a neural approach involving a learning platform trained on a previous Monte-Carlo simulation. This document gathers the slides of the presentations.

  14. WDEC: A Code for Modeling White Dwarf Structure and Pulsations

    Science.gov (United States)

    Bischoff-Kim, Agnès; Montgomery, Michael H.

    2018-05-01

    The White Dwarf Evolution Code (WDEC), written in Fortran, makes models of white dwarf stars. It is fast, versatile, and includes the latest physics. The code evolves hot (∼100,000 K) input models down to a chosen effective temperature by relaxing the models to be solutions of the equations of stellar structure. The code can also be used to obtain g-mode oscillation modes for the models. WDEC has a long history going back to the late 1960s. Over the years, it has been updated and re-packaged for modern computer architectures and has specifically been used in computationally intensive asteroseismic fitting. Generations of white dwarf astronomers and dozens of publications have made use of the WDEC, although the last true instrument paper is the original one, published in 1975. This paper discusses the history of the code, necessary to understand why it works the way it does, details the physics and features in the code today, and points the reader to where to find the code and a user guide.

  15. List Decoding of Matrix-Product Codes from nested codes: an application to Quasi-Cyclic codes

    DEFF Research Database (Denmark)

    Hernando, Fernando; Høholdt, Tom; Ruano, Diego

    2012-01-01

    A list decoding algorithm for matrix-product codes is provided when $C_1,..., C_s$ are nested linear codes and $A$ is a non-singular by columns matrix. We estimate the probability of getting more than one codeword as output when the constituent codes are Reed-Solomon codes. We extend this list decoding algorithm to matrix-product codes with polynomial units, which are quasi-cyclic codes. Furthermore, it allows us to consider unique decoding for matrix-product codes with polynomial units.
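
    The matrix-product construction itself is easy to sketch: given codes C_1, ..., C_s of the same length and an s x l matrix A, the codewords of [C_1 ... C_s]·A are the concatenations of the blocks sum_i A[i][j]·c_i with c_i in C_i. The snippet below enumerates such a code over GF(2) for the classic (u | u+v) case, A = [[1,1],[0,1]]; the constituent codes are illustrative toy choices, not those of the paper.

```python
import numpy as np
from itertools import product

def matrix_product_code(G_list, A):
    """Enumerate the codewords of [C_1 ... C_s]*A over GF(2): pick c_i in
    C_i (all of equal length) and concatenate the column blocks
    sum_i A[i][j] * c_i for j = 1..l."""
    A = np.asarray(A) % 2
    words = set()
    ks = [G.shape[0] for G in G_list]
    for msgs in product(*[product(range(2), repeat=k) for k in ks]):
        cs = [np.asarray(m) @ G % 2 for m, G in zip(msgs, G_list)]
        blocks = [sum(A[i, j] * cs[i] for i in range(len(cs))) % 2
                  for j in range(A.shape[1])]
        words.add(tuple(np.concatenate(blocks)))
    return words

# (u | u+v) construction with A = [[1, 1], [0, 1]], using the length-2
# repetition code and all of F_2^2 as toy constituents.
G1 = np.array([[1, 1]])                  # repetition code, k = 1
G2 = np.array([[1, 0], [0, 1]])          # all of F_2^2, k = 2
code = matrix_product_code([G1, G2], [[1, 1], [0, 1]])
print(len(code))   # -> 8 codewords of length 4
```

Because A here is non-singular by columns, distinct choices of (c_1, c_2) give distinct codewords, so |C| = 2^(k_1 + k_2) = 8, matching the count printed above.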

  16. A new 3D maser code applied to flaring events

    Science.gov (United States)

    Gray, M. D.; Mason, L.; Etoka, S.

    2018-06-01

    We set out the theory and discretization scheme for a new finite-element computer code, written specifically for the simulation of maser sources. The code was used to compute fractional inversions at each node of a 3D domain for a range of optical thicknesses. Saturation behaviour of the nodes with regard to location and optical depth was broadly as expected. We have demonstrated via formal solutions of the radiative transfer equation that the apparent size of the model maser cloud decreases as expected with optical depth as viewed by a distant observer. Simulations of rotation of the cloud allowed the construction of light curves for a number of observable quantities. Rotation of the model cloud may be a reasonable model for quasi-periodic variability, but cannot explain periodic flaring.

  17. MELCOR computer code manuals

    Energy Technology Data Exchange (ETDEWEB)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L. [Sandia National Labs., Albuquerque, NM (United States); Hodge, S.A.; Hyman, C.R.; Sanders, R.L. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  18. MELCOR computer code manuals

    International Nuclear Information System (INIS)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package

  19. Public sentiment and discourse about Zika virus on Instagram.

    Science.gov (United States)

    Seltzer, E K; Horst-Martz, E; Lu, M; Merchant, R M

    2017-09-01

    Social media have strongly influenced the awareness and perceptions of public health emergencies, and a considerable amount of social media content is now shared through images, rather than text alone. This content can impact preparedness and response due to the popularity and real-time nature of social media platforms. We sought to explore how the image-sharing platform Instagram is used for information dissemination and conversation during the current Zika outbreak. This was a retrospective review of publicly posted images about Zika on Instagram. Using the keyword '#zika' we identified 500 images posted on Instagram from May to August 2016. Images were coded by three reviewers, and contextual information was collected for each image about sentiment, image type, content, audience, geography, reliability, and engagement. Of 500 images tagged with #zika, 342 (68%) contained content actually related to Zika. Of the 342 Zika-specific images, 299 were coded as 'health' and 193 were coded 'public interest'. Some images had multiple 'health' and 'public interest' codes. Health images tagged with #zika were primarily related to transmission (43%, 129/299) and prevention (48%, 145/299). Transmission-related posts more often concerned mosquito-human transmission (73%, 94/129) than human-human transmission (27%, 35/129). Mosquito-bite prevention posts outnumbered safe-sex prevention posts (84%, 122/145 vs. 16%, 23/145). Images with a target audience were primarily aimed at women (95%, 36/38). Many posts (60%, 61/101) included misleading, incomplete, or unclear information about the virus. Additionally, many images expressed fear and negative sentiment (51%, 79/156). Instagram can be used to characterize public sentiment and highlight areas of focus for public health, such as correcting misleading or incomplete information or expanding messages to reach diverse audiences. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd. All rights

  20. Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET)

    International Nuclear Information System (INIS)

    Ramsdell, J.V. Jr.; Simonen, C.A.; Burk, K.W.

    1994-02-01

    The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses that individuals may have received from operations at the Hanford Site since 1944. This report deals specifically with the atmospheric transport model, Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET). RATCHET is a major rework of the MESOILT2 model used in the first phase of the HEDR Project; only the bookkeeping framework escaped major changes. Changes to the code include (1) significant changes in the representation of atmospheric processes and (2) incorporation of Monte Carlo methods for representing uncertainty in input data, model parameters, and coefficients. To a large extent, the revisions to the model are based on recommendations of a peer working group that met in March 1991. Technical bases for other portions of the atmospheric transport model are addressed in two other documents. This report has three major sections: a description of the model, a user's guide, and a programmer's guide. These sections discuss RATCHET from three different perspectives. The first provides a technical description of the code with emphasis on details such as the representation of the model domain, the data required by the model, and the equations used to make the model calculations. The technical description is followed by a user's guide to the model with emphasis on running the code. The user's guide contains information about the model input and output. The third section is a programmer's guide to the code. It discusses the hardware and software required to run the code. The programmer's guide also discusses program structure and each of the program elements
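
    The Monte Carlo treatment of input uncertainty described above can be sketched in miniature: sample uncertain inputs from assumed distributions, propagate each sample through the transport calculation, and report a distribution of results rather than a single value. The snippet below is a toy illustration only; it uses a simple ground-level Gaussian plume centre-line formula as a stand-in for RATCHET's transport model, and the distributions and constants are hypothetical.

```python
import numpy as np

def plume_centerline(Q, u, sigma_y, sigma_z):
    """Ground-level centre-line concentration for a ground release,
    C = Q / (pi * u * sigma_y * sigma_z). Toy stand-in, not RATCHET."""
    return Q / (np.pi * u * sigma_y * sigma_z)

rng = np.random.default_rng(42)
n = 10_000
Q = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # uncertain source term
u = rng.uniform(2.0, 6.0, size=n)                # uncertain wind speed (m/s)
C = plume_centerline(Q, u, sigma_y=50.0, sigma_z=30.0)

# Report an uncertainty band instead of a single point estimate.
lo, hi = np.percentile(C, [5, 95])
print(lo < np.median(C) < hi)
```

The HEDR dose estimates follow the same pattern at much larger scale: the spread of the Monte Carlo ensemble, not any single run, carries the uncertainty statement.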